1 Lecture 20 SVD and Its Applications Shang-Hua Teng

2 Spectral Theorem and Spectral Decomposition: Every symmetric matrix A can be written as A = λ_1 x_1 x_1^T + λ_2 x_2 x_2^T + … + λ_n x_n x_n^T = QΛQ^T, where x_1, …, x_n are the n orthonormal eigenvectors of A; they are the principal axes of A. Each x_i x_i^T is the projection matrix onto x_i.
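
A minimal NumPy sketch (an added illustration, not part of the original slides) checking the decomposition on a small symmetric matrix:

```python
# Illustrative sketch: verify A = sum_i lambda_i x_i x_i^T for a symmetric A.
import numpy as np

A = np.array([[2.0, 1.0, 0.0],
              [1.0, 3.0, 1.0],
              [0.0, 1.0, 2.0]])            # symmetric

lam, X = np.linalg.eigh(A)                 # eigenvalues, orthonormal eigenvectors
A_rebuilt = sum(lam[i] * np.outer(X[:, i], X[:, i]) for i in range(len(lam)))
print(np.allclose(A, A_rebuilt))           # True
```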

3 Singular Value Decomposition: Any m by n matrix A may be factored as A = UΣV^T, where U is m by m and orthogonal (its columns are the left singular vectors), V is n by n and orthogonal (its columns are the right singular vectors), and Σ is m by n and diagonal, holding the r nonzero singular values.

4 The Singular Value Decomposition: r = the rank of A = the number of linearly independent columns/rows. In block form, A (m x n) = U (m x m) · Σ (m x n) · V^T (n x n), where Σ = [ Σ_r 0 ; 0 0 ] and Σ_r = diag(σ_1, …, σ_r).
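
A NumPy sketch (added here, not from the lecture) showing the full SVD shapes and the block-structured Σ:

```python
# Illustrative sketch: full SVD shapes and reconstruction A = U Sigma V^T.
import numpy as np

A = np.random.default_rng(0).normal(size=(5, 3))   # m = 5, n = 3
U, s, Vt = np.linalg.svd(A, full_matrices=True)
print(U.shape, s.shape, Vt.shape)                  # (5, 5) (3,) (3, 3)

Sigma = np.zeros((5, 3))                           # m-by-n Sigma, zero outside the top block
Sigma[:3, :3] = np.diag(s)
print(np.allclose(A, U @ Sigma @ Vt))              # True
```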

5 SVD Properties: U and V give us orthonormal bases for the subspaces of A:
– first r columns of U: column space of A
– last m − r columns of U: left nullspace of A
– first r columns of V: row space of A
– last n − r columns of V: nullspace of A
IMPLICATION: Rank(A) = r
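
To make the correspondence concrete, here is a small NumPy sketch (an added example, assuming a rank-2 matrix) that reads the four subspace bases off the SVD:

```python
# Illustrative sketch: bases of the four fundamental subspaces from the SVD.
import numpy as np

A = np.array([[1.0, 2.0, 3.0],
              [2.0, 4.0, 6.0],
              [1.0, 0.0, 1.0]])                 # rank 2
U, s, Vt = np.linalg.svd(A)
r = int(np.sum(s > 1e-10))                      # numerical rank

col_space  = U[:, :r]      # basis of C(A)
left_null  = U[:, r:]      # basis of N(A^T)
row_space  = Vt[:r, :].T   # basis of C(A^T)
null_space = Vt[r:, :].T   # basis of N(A)

print(r)                                        # 2 = Rank(A)
print(np.allclose(A @ null_space, 0))           # True
print(np.allclose(left_null.T @ A, 0))          # True
```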

6 The Singular Value Decomposition: since Σ has only r nonzero entries, the full factorization A (m x n) = U (m x m) Σ (m x n) V^T (n x n) can be trimmed to the reduced form A (m x n) = U (m x r) Σ (r x r) V^T (r x n).
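
NumPy exposes both forms; the following sketch (an added illustration) compares the full factorization with the trimmed one:

```python
# Illustrative sketch: full vs. reduced (thin) SVD of a rank-2 matrix.
import numpy as np

rng = np.random.default_rng(1)
A = rng.normal(size=(6, 2)) @ rng.normal(size=(2, 4))    # 6x4 with rank 2
U_full, s, Vt_full = np.linalg.svd(A, full_matrices=True)
print(U_full.shape, Vt_full.shape)                       # (6, 6) (4, 4)

r = int(np.sum(s > 1e-10))
A_r = U_full[:, :r] @ np.diag(s[:r]) @ Vt_full[:r, :]    # keep only r terms
print(np.allclose(A, A_r))                               # True
```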

7 Singular Value Decomposition: A = σ_1 u_1 v_1^T + σ_2 u_2 v_2^T + … + σ_r u_r v_r^T, where u_1, …, u_r are the r orthonormal vectors that form a basis of C(A) and v_1, …, v_r are the r orthonormal vectors that form a basis of C(A^T).
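
The same statement as a rank-1 sum, checked numerically (an added NumPy sketch):

```python
# Illustrative sketch: A as a sum of rank-1 terms sigma_i * u_i v_i^T.
import numpy as np

A = np.array([[3.0, 1.0],
              [1.0, 3.0],
              [0.0, 2.0]])
U, s, Vt = np.linalg.svd(A, full_matrices=False)
A_sum = sum(s[i] * np.outer(U[:, i], Vt[i, :]) for i in range(len(s)))
print(np.allclose(A, A_sum))    # True
```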

8 SVD Proof: Any m x n matrix A has two symmetric covariance matrices: the m x m matrix AA^T and the n x n matrix A^T A.

9 Spectral Decomposition of the Covariance Matrices:
(m x m) AA^T = U ΣΣ^T U^T, where U holds the left singular vectors of A.
(n x n) A^T A = V Σ^T Σ V^T, where V holds the right singular vectors of A.
Claim: the nonzero eigenvalues of AA^T and A^T A are the same, namely the σ_i^2.
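
A quick numerical check of the claim (added sketch, not from the lecture):

```python
# Illustrative sketch: AA^T and A^T A share their nonzero eigenvalues sigma_i^2.
import numpy as np

A = np.random.default_rng(3).normal(size=(4, 3))
s = np.linalg.svd(A, compute_uv=False)                 # singular values, descending

eig_AAt = np.sort(np.linalg.eigvalsh(A @ A.T))[::-1]   # 4 eigenvalues
eig_AtA = np.sort(np.linalg.eigvalsh(A.T @ A))[::-1]   # 3 eigenvalues

print(np.allclose(eig_AtA, s**2))        # True
print(np.allclose(eig_AAt[:3], s**2))    # True: same nonzero spectrum
print(np.isclose(eig_AAt[3], 0.0))       # the leftover eigenvalue is 0
```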

10 Singular Value Decomposition Proof: diagonalize A^T A = VΛV^T with orthonormal v_i and eigenvalues λ_i = σ_i^2 ≥ 0; for each σ_i > 0 set u_i = A v_i / σ_i. The u_i are orthonormal eigenvectors of AA^T, and A v_i = σ_i u_i for all i, which is exactly A = UΣV^T.
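
The construction in the proof can be run directly (a NumPy sketch, added for illustration):

```python
# Illustrative sketch: build U from V via u_i = A v_i / sigma_i.
import numpy as np

A = np.random.default_rng(4).normal(size=(5, 3))   # full column rank (w.p. 1)
lam, V = np.linalg.eigh(A.T @ A)                   # ascending eigenvalues
order = np.argsort(lam)[::-1]                      # sort descending
lam, V = lam[order], V[:, order]
sigma = np.sqrt(lam)

U = (A @ V) / sigma                                # column i divided by sigma_i
print(np.allclose(U.T @ U, np.eye(3)))             # the u_i are orthonormal
print(np.allclose(A, U @ np.diag(sigma) @ V.T))    # A = U Sigma V^T
```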

11 All Singular Values are Nonnegative: each σ_i^2 is an eigenvalue of A^T A, and x^T A^T A x = ||Ax||^2 ≥ 0 for every x, so A^T A is positive semidefinite and each σ_i = √(σ_i^2) ≥ 0.
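
The same reasoning, checked numerically (added sketch):

```python
# Illustrative sketch: A^T A is positive semidefinite, so sigma_i^2 >= 0.
import numpy as np

A = np.random.default_rng(5).normal(size=(4, 4))
lam = np.linalg.eigvalsh(A.T @ A)        # these are the sigma_i^2
print(np.all(lam >= -1e-12))             # True: nonnegative up to roundoff
# Reason: x^T (A^T A) x = ||A x||^2 >= 0 for every x.
```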

12 Row and Column Space Projection: Suppose A is an m by n matrix of rank r, with r << n and r << m.
– Then A has r nonzero singular values.
– Let A = UΣV^T be the reduced SVD of A, where Σ is an r by r diagonal matrix.
– Examine the projections UΣ and ΣV^T (next slide).

13 The Singular Value Projection: A (m x n) = U (m x r) · Σ (r x r) · V^T (r x n), keeping only the r nonzero singular values.

14 Therefore: the rows of UΣ are r-dimensional projections of the rows of A, and the columns of ΣV^T are r-dimensional projections of the columns of A. So we can compute their distances or dot products in a lower-dimensional space.
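
Since (UΣ)(UΣ)^T = UΣ^2 U^T = AA^T for the reduced SVD, row dot products survive the projection exactly. A NumPy sketch (added illustration, assuming an exactly rank-2 matrix):

```python
# Illustrative sketch: dot products of rows of A equal those of rows of U Sigma.
import numpy as np

rng = np.random.default_rng(6)
A = rng.normal(size=(100, 2)) @ rng.normal(size=(2, 50))   # 100x50, rank 2
U, s, Vt = np.linalg.svd(A, full_matrices=False)

r = 2
rows_2d = U[:, :r] * s[:r]          # rows of U Sigma: 100 points in R^2
print(np.allclose(rows_2d @ rows_2d.T, A @ A.T))   # True
```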

15 Eigenvalues and Determinants:
Product law: det(A) = λ_1 λ_2 ⋯ λ_n.
Summation law: trace(A) = λ_1 + λ_2 + ⋯ + λ_n.
Both can be proved by examining the characteristic polynomial det(A − λI).
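
Both laws in NumPy form (added sketch):

```python
# Illustrative sketch: det(A) = product of eigenvalues, trace(A) = their sum.
import numpy as np

A = np.array([[4.0, 1.0, 2.0],
              [0.0, 3.0, 1.0],
              [1.0, 1.0, 5.0]])
lam = np.linalg.eigvals(A)                            # possibly complex
print(np.isclose(np.prod(lam), np.linalg.det(A)))     # product law
print(np.isclose(np.sum(lam), np.trace(A)))           # summation law
```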

16 Eigenvalues and Pivots: If A is symmetric, the number of positive (negative) eigenvalues equals the number of positive (negative) pivots in A = LDL^T. Topological proof: scale the off-diagonal entries of L continuously down to 0, i.e., move L continuously to I. The eigenvalues of L(t)DL(t)^T vary continuously and can never hit 0, since each L(t) is invertible; any sign change in an eigenvalue would have to cross 0. So the eigenvalue signs of A match those of D, whose diagonal entries are the pivots.
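
A small sketch of the sign count (an added example; the helper ldl_pivots is mine and assumes elimination succeeds with no row exchanges):

```python
# Illustrative sketch: positive/negative pivot counts match eigenvalue sign counts.
import numpy as np

def ldl_pivots(A):
    """Pivots of symmetric A via elimination, assuming no zero pivot appears."""
    M = A.astype(float).copy()
    n = M.shape[0]
    pivots = []
    for k in range(n):
        pivots.append(M[k, k])
        for i in range(k + 1, n):
            M[i, k:] -= (M[i, k] / M[k, k]) * M[k, k:]
    return np.array(pivots)

A = np.array([[ 2.0, -1.0, 0.0],
              [-1.0, -2.0, 1.0],
              [ 0.0,  1.0, 3.0]])
pivots = ldl_pivots(A)                         # 2, -2.5, 3.4
lam = np.linalg.eigvalsh(A)
print(np.sum(pivots > 0) == np.sum(lam > 0))   # True
print(np.sum(pivots < 0) == np.sum(lam < 0))   # True
```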

17 Next Lecture: Dimensionality reduction for Latent Semantic Analysis; Eigenvalue Problems in Web Analysis.

