1
Lecture 20 SVD and Its Applications
Shang-Hua Teng
2
Spectral Theorem and Spectral Decomposition
Every symmetric matrix A can be written as A = λ1 x1 x1^T + λ2 x2 x2^T + … + λn xn xn^T, where x1, …, xn are the n orthonormal eigenvectors of A and λ1, …, λn are the corresponding eigenvalues. The eigenvectors are the principal axes of A, and xi xi^T is the projection matrix onto xi.
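As a quick numerical check, here is a minimal numpy sketch (the symmetric matrix A below is just a made-up example): np.linalg.eigh returns the orthonormal eigenvectors, A is rebuilt as the sum of λi xi xi^T, and xi xi^T is verified to act as a projection onto xi.

```python
import numpy as np

# A made-up symmetric matrix for illustration.
A = np.array([[4.0, 1.0, 0.0],
              [1.0, 3.0, 1.0],
              [0.0, 1.0, 2.0]])

# Orthonormal eigenvectors (columns of X) and eigenvalues of the symmetric matrix.
lam, X = np.linalg.eigh(A)

# Rebuild A as the sum of lam_i * x_i x_i^T.
A_rebuilt = sum(lam[i] * np.outer(X[:, i], X[:, i]) for i in range(len(lam)))
assert np.allclose(A, A_rebuilt)

# x_i x_i^T is the projection matrix onto x_i: idempotent and fixes x_i.
P0 = np.outer(X[:, 0], X[:, 0])
assert np.allclose(P0 @ P0, P0)
assert np.allclose(P0 @ X[:, 0], X[:, 0])
```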
3
Singular Value Decomposition
Any m by n matrix A may be factored as A = U S V^T, where
U: m by m, orthogonal, columns u1, …, um
V: n by n, orthogonal, columns v1, …, vn
S: m by n, diagonal, with r nonzero singular values
4
The Singular Value Decomposition
A = U S V^T
(m x n) = (m x m) (m x n) (n x n)
r = the rank of A = number of linearly independent columns/rows
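A minimal numpy sketch of the full SVD (the random example matrix and the 1e-10 tolerance are illustrative assumptions): np.linalg.svd returns U (m x m), the singular values, and V^T (n x n), and A is recovered as U S V^T.

```python
import numpy as np

m, n = 5, 3
A = np.random.rand(m, n)                      # arbitrary example matrix

# Full SVD: U is m x m, Vt is n x n, s holds the min(m, n) singular values.
U, s, Vt = np.linalg.svd(A, full_matrices=True)
print(U.shape, s.shape, Vt.shape)             # (5, 5) (3,) (3, 3)

# Rebuild the m x n diagonal S and check A = U S V^T.
S = np.zeros((m, n))
S[:len(s), :len(s)] = np.diag(s)
assert np.allclose(A, U @ S @ Vt)

# r = rank(A) = number of nonzero singular values (1e-10 is an arbitrary tolerance).
r = int(np.sum(s > 1e-10))
```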
5
SVD Properties U, V give us orthonormal bases for the subspaces of A:
1st r columns of U: column space of A
Last m - r columns of U: left nullspace of A
1st r columns of V: row space of A
Last n - r columns of V: nullspace of A
IMPLICATION: Rank(A) = r
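The sketch below illustrates these four subspaces with a small made-up rank-2 matrix; the 1e-10 rank threshold is an arbitrary numerical tolerance.

```python
import numpy as np

# Made-up rank-2 example: the third column is the sum of the first two.
A = np.array([[1.0, 0.0, 1.0],
              [0.0, 1.0, 1.0],
              [1.0, 1.0, 2.0],
              [2.0, 1.0, 3.0]])
U, s, Vt = np.linalg.svd(A)
r = int(np.sum(s > 1e-10))        # numerical rank, here r = 2

col_space  = U[:, :r]             # 1st r columns of U: basis of the column space
left_null  = U[:, r:]             # last m - r columns of U: basis of the left nullspace
row_space  = Vt[:r, :].T          # 1st r columns of V: basis of the row space
null_space = Vt[r:, :].T          # last n - r columns of V: basis of the nullspace

assert np.allclose(A @ null_space, 0)     # A kills its nullspace
assert np.allclose(left_null.T @ A, 0)    # left nullspace is orthogonal to the columns of A
```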
6
The Singular Value Decomposition
Full SVD:    A = U S V^T,  (m x n) = (m x m) (m x n) (n x n)
Reduced SVD: A = U S V^T,  (m x n) = (m x r) (r x r) (r x n)
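For the reduced form, numpy's full_matrices=False option keeps only the first min(m, n) columns of U and rows of V^T; truncating further to the first r columns/rows gives the m x r, r x r, r x n shapes above. A small sketch with an arbitrary example matrix:

```python
import numpy as np

A = np.random.rand(6, 4)                          # arbitrary example matrix

# Reduced ("economy") SVD: only the first min(m, n) columns of U / rows of V^T are kept.
U_r, s_r, Vt_r = np.linalg.svd(A, full_matrices=False)
print(U_r.shape, s_r.shape, Vt_r.shape)           # (6, 4) (4,) (4, 4)

# Same matrix, smaller factors.
assert np.allclose(A, U_r @ np.diag(s_r) @ Vt_r)
```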
7
Singular Value Decomposition
A = σ1 u1 v1^T + σ2 u2 v2^T + … + σr ur vr^T, where u1, …, ur are the r orthonormal vectors that form a basis of C(A) and v1, …, vr are the r orthonormal vectors that form a basis of C(A^T).
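A short numpy sketch of this rank-one expansion, again with an arbitrary example matrix and an arbitrary rank tolerance:

```python
import numpy as np

A = np.random.rand(5, 3)                          # arbitrary example matrix
U, s, Vt = np.linalg.svd(A, full_matrices=False)
r = int(np.sum(s > 1e-10))

# A as a sum of r rank-one pieces sigma_i * u_i v_i^T.
A_sum = sum(s[i] * np.outer(U[:, i], Vt[i, :]) for i in range(r))
assert np.allclose(A, A_sum)
```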
8
SVD Proof
Any m x n matrix A has two symmetric covariance matrices: AA^T (m x m) and A^T A (n x n).
9
Spectral Decomposition of Covariance Matrices
AA^T = U Λ1 U^T (m x m); the columns of U are called the left singular vectors of A.
A^T A = V Λ2 V^T (n x n); the columns of V are called the right singular vectors of A.
Claim: the nonzero eigenvalues in Λ1 and Λ2 are the same.
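The claim can be checked numerically; in the sketch below (random example matrix), the nonzero eigenvalues of AA^T and A^T A coincide and equal the squared singular values of A.

```python
import numpy as np

A = np.random.rand(4, 6)                              # arbitrary example matrix

lam1 = np.sort(np.linalg.eigvalsh(A @ A.T))[::-1]     # eigenvalues of AA^T (m of them)
lam2 = np.sort(np.linalg.eigvalsh(A.T @ A))[::-1]     # eigenvalues of A^T A (n of them)
s = np.linalg.svd(A, compute_uv=False)                # singular values of A

# Nonzero eigenvalues of both covariance matrices agree and equal sigma_i^2.
k = len(s)
assert np.allclose(lam1[:k], lam2[:k])
assert np.allclose(lam1[:k], s**2)
```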
10
Singular Value Decomposition
Proof
Let A^T A = V Λ V^T with orthonormal eigenvectors vi and eigenvalues λi ≥ 0. Set σi = √λi and, for σi > 0, ui = A vi / σi. The ui are orthonormal and AA^T ui = σi^2 ui, so AA^T and A^T A share the same nonzero eigenvalues σi^2, and A = U S V^T.
11
All Singular Values Are Nonnegative
Each σi = √λi, where λi is an eigenvalue of A^T A; since A^T A is positive semidefinite, every λi ≥ 0.
12
Row and Column Space Projection
Suppose A is an m by n matrix of rank r, with r << n and r << m. Then A has r nonzero singular values. Let A = U S V^T be the SVD of A, where S is an r by r diagonal matrix. Examine AV = U S and U^T A = S V^T.
13
The Singular Value Projection
A = U S V^T,  (m x n) = (m x r) (r x r) (r x n)
14
Therefore rows of U S are r-dimensional projections of the rows of A
Columns of S V^T are r-dimensional projections of the columns of A, so we can compute their distances or dot products in a lower-dimensional space.
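A small numpy sketch of this idea, using a made-up low-rank data matrix whose rows are the data points: distances and dot products between rows of A equal those between the corresponding r-dimensional rows of U S.

```python
import numpy as np

# Made-up low-rank data matrix: 100 points in 50 dimensions, but only rank 2.
rng = np.random.default_rng(0)
A = rng.standard_normal((100, 2)) @ rng.standard_normal((2, 50))

U, s, Vt = np.linalg.svd(A, full_matrices=False)
r = int(np.sum(s > 1e-8))                 # numerical rank, r = 2 here
US = U[:, :r] * s[:r]                     # rows of U S: r-dimensional coordinates of the rows of A

# Distances and dot products between rows survive the projection exactly.
i, j = 3, 17
assert np.isclose(np.linalg.norm(A[i] - A[j]), np.linalg.norm(US[i] - US[j]))
assert np.isclose(A[i] @ A[j], US[i] @ US[j])
```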
15
Eigenvalues and Determinants
Product law: det(A) = λ1 λ2 … λn
Summation law: trace(A) = λ1 + λ2 + … + λn
Both can be proved by examining the characteristic polynomial det(A − λI).
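Both laws are easy to verify numerically; the 2 x 2 matrix in this sketch is just an illustrative example.

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 3.0]])                # illustrative example matrix
lam = np.linalg.eigvals(A)

assert np.isclose(np.prod(lam), np.linalg.det(A))   # product law: det(A) = product of eigenvalues
assert np.isclose(np.sum(lam), np.trace(A))         # summation law: trace(A) = sum of eigenvalues
```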
16
Eigenvalues and Pivots
If A is symmetric, the number of positive (negative) eigenvalues equals the number of positive (negative) pivots in A = L D L^T.
Topological proof: scale the off-diagonal entries of L continuously down to 0, i.e., move L continuously to I while keeping D fixed. Any eigenvalue that changed sign along the way would have to cross 0, but L stays invertible and D is fixed, so no eigenvalue of L D L^T ever vanishes; at the end the eigenvalues are exactly the pivots in D.
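A numerical illustration with a made-up symmetric matrix: the pivots are computed here as ratios of leading principal minors, which agree with the L D L^T pivots when all leading minors are nonzero (an assumption of this sketch), and their signs match the signs of the eigenvalues.

```python
import numpy as np

# Made-up symmetric matrix with eigenvalues of both signs.
A = np.array([[ 1.0,  2.0,  0.0],
              [ 2.0, -1.0,  1.0],
              [ 0.0,  1.0,  3.0]])

# Pivots of A = L D L^T as ratios of leading principal minors
# (valid when every leading principal minor is nonzero).
minors = [1.0] + [np.linalg.det(A[:k, :k]) for k in range(1, A.shape[0] + 1)]
pivots = np.array([minors[k] / minors[k - 1] for k in range(1, len(minors))])

eigs = np.linalg.eigvalsh(A)
assert np.sum(pivots > 0) == np.sum(eigs > 0)   # same count of positive signs
assert np.sum(pivots < 0) == np.sum(eigs < 0)   # same count of negative signs
```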
17
Next Lecture: Dimensionality Reduction for Latent Semantic Analysis
Eigenvalue Problems in Web Analysis