Lecture 7: Eckart-Young: The Closest Rank k Matrix to A

Description

In this lecture, Professor Strang reviews Principal Component Analysis (PCA), which is a major tool in understanding a matrix of data. In particular, he focuses on the Eckart-Young low rank approximation theorem.

Summary

\(A_k = \sigma_1 u_1 v^{\mathtt{T}}_1 + \cdots + \sigma_k u_k v^{\mathtt{T}}_k\) (the \(k\) largest \(\sigma\)'s from \(A = U\Sigma V^{\mathtt{T}}\))
The norm of \(A - A_k\) is at most the norm of \(A - B_k\) for every other rank-\(k\) matrix \(B_k\) (see the sketch after this summary).
Frobenius norm squared: \(\|A\|_F^2\) = sum of squares of all entries
The idea of Principal Component Analysis (PCA)
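
The following is a minimal NumPy sketch, not part of the course materials; the test matrix, the choice of \(k\), and the variable names are illustrative assumptions. It builds \(A_k\) from the \(k\) largest singular values and checks the Eckart-Young error in both the spectral and Frobenius norms.

# Minimal sketch: rank-k truncation of the SVD and the Eckart-Young error.
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((6, 4))   # an arbitrary data matrix (assumption)
k = 2                             # target rank (assumption)

# SVD: A = U @ diag(s) @ Vt, singular values s sorted in decreasing order
U, s, Vt = np.linalg.svd(A, full_matrices=False)

# A_k = sigma_1 u_1 v_1^T + ... + sigma_k u_k v_k^T
A_k = U[:, :k] @ np.diag(s[:k]) @ Vt[:k, :]

# Eckart-Young: among all rank-k matrices B_k, A_k minimizes ||A - B_k||.
# Spectral-norm error equals sigma_{k+1}; Frobenius-norm error equals
# sqrt(sigma_{k+1}^2 + ... + sigma_r^2).
print(np.linalg.norm(A - A_k, 2), s[k])
print(np.linalg.norm(A - A_k, 'fro'), np.sqrt(np.sum(s[k:] ** 2)))
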

Related section in textbook: I.9

Instructor: Prof. Gilbert Strang

Course Features

AV lectures - Video
Assignments - problem sets (no solutions)
AV special element audio - Podcast