1
Lecture 13: Singular Value Decomposition (SVD)
Junghoo "John" Cho, UCLA
2
Summary: Two Worlds
Once basis vectors are fixed, the two worlds have a 1:1 mapping (= they are isomorphic):
Vector v ↔ 1D array, e.g., [1, 2, 0]
Linear transformation T, with T(a·x̄ + b·ȳ) = a·T(x̄) + b·T(ȳ) ↔ 2D array (matrix)
Orthogonal stretching ↔ symmetric matrix
Stretching factor ↔ eigenvalue
Stretching direction ↔ eigenvector
Rotation ↔ orthonormal matrix
Stretching + rotation ↔ ?
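As a concrete illustration of this correspondence, here is a minimal numpy sketch (the matrix and its values are my own illustration, not from the slides): a symmetric matrix acts as an orthogonal stretching, with eigenvalues as stretching factors and orthonormal eigenvectors as stretching directions.

```python
import numpy as np

# A symmetric matrix: in the "world of vectors" this is an orthogonal stretching.
S = np.array([[2.0, 1.0],
              [1.0, 2.0]])

# eigh is numpy's eigendecomposition for symmetric matrices.
vals, vecs = np.linalg.eigh(S)
print(vals)            # stretching factors (eigenvalues): [1. 3.]
print(vecs)            # stretching directions (eigenvectors), one per column
print(vecs.T @ vecs)   # ~ identity: the directions are orthonormal
```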
3
Singular Value Decomposition (SVD)
Any matrix T can be decomposed as T = T₁ D T₂ᵀ, where D is a diagonal matrix and T₁ and T₂ are orthonormal matrices.
Singular values: the diagonal entries of D.
Example: T = T₁ D T₂ᵀ with T₁ = [1/√2 -1/√2; 1/√2 1/√2], D = [3 0; 0 2], T₂ᵀ = [4/5 3/5; -3/5 4/5]
Q: What is this transformation? What does SVD mean?
4
Singular Value Decomposition (SVD)
Q: What does T₂ᵀ mean?
Change of coordinates! The new basis vectors are (4/5, 3/5) and (-3/5, 4/5).
Q: What does D mean?
Orthogonal stretching! Stretch ×3 along the first basis vector (4/5, 3/5) and ×2 along the second basis vector (-3/5, 4/5).
Q: What does T₁ mean?
Rotation! Rotate the first basis vector (4/5, 3/5) to (1/√2, 1/√2) and the second basis vector (-3/5, 4/5) to (-1/√2, 1/√2).
SVD shows that any matrix (= linear transformation) is essentially an orthogonal stretching followed by a rotation.
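A short numpy sketch of this example, built from the component matrices named on this slide (the code itself is mine; the slide gives only the math):

```python
import numpy as np

# Change of coordinates: rows are the new basis vectors (4/5, 3/5) and (-3/5, 4/5).
T2t = np.array([[ 4/5, 3/5],
                [-3/5, 4/5]])
# Orthogonal stretching: x3 along the first basis vector, x2 along the second.
D = np.diag([3.0, 2.0])
# Rotation by 45 degrees.
T1 = np.array([[1.0, -1.0],
               [1.0,  1.0]]) / np.sqrt(2)

T = T1 @ D @ T2t                    # the full transformation

# The direction (4/5, 3/5) is stretched x3 and rotated to (1/sqrt(2), 1/sqrt(2)):
print(T @ np.array([4/5, 3/5]))     # [2.1213 2.1213] = 3 * (1/sqrt(2), 1/sqrt(2))

# np.linalg.svd recovers the same factors (up to the signs of the vectors):
U, s, Vt = np.linalg.svd(T)
print(s)                            # singular values: [3. 2.]
```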
5
What about a Non-Square Matrix T?
Q: When T is an m×n matrix, what are the dimensions of T₁, D, and T₂ᵀ?
T (m×n) = T₁ (m×m) · D (m×n) · T₂ᵀ (n×n)
For a non-square matrix T, D becomes a non-square diagonal matrix:
When m > n: D = [σ₁ 0; 0 σ₂; 0 0] (a 3×2 example). This is "dimension padding": convert 2D to 3D by adding a third dimension, for example.
When m < n: D = [σ₁ 0 0; 0 σ₂ 0] (a 2×3 example). This is "dimension reduction": convert 3D to 2D by discarding the third dimension, for example.
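A small numpy sketch of these shapes, using a random 3×2 matrix (m > n) as a stand-in:

```python
import numpy as np

T = np.random.rand(3, 2)                # m = 3, n = 2

U, s, Vt = np.linalg.svd(T, full_matrices=True)
print(U.shape, s.shape, Vt.shape)       # (3, 3) (2,) (2, 2)

# numpy returns only the n singular values; the full D is the 3x2 matrix
# diag(s) padded with a row of zeros ("dimension padding").
D = np.zeros((3, 2))
D[:2, :2] = np.diag(s)
print(np.allclose(T, U @ D @ Vt))       # True: T = T1 D T2^T
```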
6
Computing SVD
Q: How can we perform SVD? T = T₁ D T₂ᵀ
TᵀT = (T₁ D T₂ᵀ)ᵀ (T₁ D T₂ᵀ) = T₂ Dᵀ T₁ᵀ T₁ D T₂ᵀ = T₂ (Dᵀ D) T₂ᵀ
Q: What kind of matrix is TᵀT?
TᵀT is a symmetric matrix, i.e., an orthogonal stretching:
The diagonal entries of Dᵀ D (~ D²) are its eigenvalues (the stretching factors).
The columns of T₂ are its eigenvectors (the stretching directions).
So we can compute T₂ of T = T₁ D T₂ᵀ by computing the eigenvectors of TᵀT.
Similarly, T₁ is given by the eigenvectors of T Tᵀ, and D by the square roots of the eigenvalues of Dᵀ D or D Dᵀ.
SVD can be done by computing the eigenvalues and eigenvectors of TᵀT and T Tᵀ.
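A numpy sketch of this procedure (my own code; the final step T₁ = T T₂ D⁻¹ is one standard way to get T₁ with signs that match T₂, and assumes all singular values are nonzero):

```python
import numpy as np

T = np.array([[3.0, 1.0],
              [1.0, 3.0]])

# T^T T is symmetric: its eigenvectors give T2, its eigenvalues give D^2.
vals, T2 = np.linalg.eigh(T.T @ T)
order = np.argsort(vals)[::-1]          # largest singular value first
T2 = T2[:, order]
D = np.diag(np.sqrt(vals[order]))       # singular values = sqrt(eigenvalues)

# T1 could be computed from the eigenvectors of T T^T; solving
# T1 = T T2 D^-1 instead fixes the sign ambiguity directly.
T1 = T @ T2 @ np.linalg.inv(D)

print(np.allclose(T, T1 @ D @ T2.T))    # True: T = T1 D T2^T
```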
7
Example: SVD
Q: What kind of linear transformation is T?
T = [2×2 example matrix] = T₁ D T₂ᵀ (the entries shown on the slide were not recovered)
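Since the entries of the slide's example matrix did not survive extraction, the sketch below answers the question for a stand-in 2×2 matrix of my own choosing:

```python
import numpy as np

T = np.array([[2.0, -2.0],
              [2.0,  2.0]])

U, s, Vt = np.linalg.svd(T)
print(s)            # [2.8284 2.8284]: both singular values are 2*sqrt(2)

# Equal singular values mean T stretches every direction by 2*sqrt(2);
# what remains, U @ Vt, is a pure rotation (here by 45 degrees).
print(U @ Vt)       # [[ 0.7071 -0.7071]
                    #  [ 0.7071  0.7071]]
```

So this T is a uniform stretching by 2√2 followed by a rotation by 45°.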
8
Summary: Two Worlds
Once basis vectors are fixed, the two worlds have a 1:1 mapping (= they are isomorphic):
Vector v ↔ 1D array, e.g., [1, 2, 0]
Linear transformation T, with T(a·x̄ + b·ȳ) = a·T(x̄) + b·T(ȳ) ↔ 2D array (matrix)
Orthogonal stretching ↔ symmetric matrix
Stretching factor ↔ eigenvalue
Stretching direction ↔ eigenvector
Rotation ↔ orthonormal matrix
Stretching + rotation ↔ singular value decomposition
9
SVD: Application
Sometimes we may want to "approximate" a large matrix as the product of two smaller matrices:
T (m×n) ≈ A (m×k) × B (k×n), a rank-k approximation
Q: Why?
10
Rank-k Approximation
Q: How can we "decompose" a matrix into the product of two rank-k matrices in the best possible way?
Minimize the "L2 difference" (= Frobenius norm) between the original matrix A and the approximation Â: ‖A − Â‖F = √(Σᵢⱼ (Aᵢⱼ − Âᵢⱼ)²)
11
SVD as Matrix Approximation
Q: If we want to reduce the rank of T to 2, what would be a good choice?
The best rank-k approximation of any matrix T is to keep the first k singular values of its SVD, together with the corresponding columns of T₁ and T₂.
This minimizes the L2 difference between the original matrix and the rank-k approximation.
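A numpy sketch of this (my own code): keep the first k singular values and vectors, then check the resulting L2 (Frobenius) error, which equals the root of the sum of the squared discarded singular values.

```python
import numpy as np

def rank_k_approx(A, k):
    """Best rank-k approximation of A: truncate its SVD after k terms."""
    U, s, Vt = np.linalg.svd(A, full_matrices=False)
    return U[:, :k] @ np.diag(s[:k]) @ Vt[:k, :]

A = np.random.rand(100, 80)
A2 = rank_k_approx(A, 2)
print(np.linalg.matrix_rank(A2))          # 2

# The L2 (Frobenius) error of the best rank-k approximation:
err = np.linalg.norm(A - A2, 'fro')
s = np.linalg.svd(A, compute_uv=False)
print(err, np.sqrt(np.sum(s[2:] ** 2)))   # the two values agree
```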
12
SVD Approximation Example: a 1000×1000 matrix with values in (0…255)
(The slide shows the raw matrix values; a sample row: 62 60 58 57 55 53 54 61 59 56 …)
13
Image of the original 1000×1000 matrix
14
SVD: Rank-1 Approximation
15
SVD: Rank-10 Approximation
16
SVD: Rank-100 Approximation
17
Original vs. Rank-100 Approximation
Q: How many numbers do we keep for each? (Original: 1000 × 1000 = 1,000,000 numbers. Rank-100: 100 × 1000 + 100 + 100 × 1000 = 200,100 numbers.)
18
Dimensionality Reduction
Data with a large number of dimensions
Example: 1M users and 10M items, i.e., a 1M × 10M user-item matrix
Q: Can we represent each user with far fewer dimensions, say 1000, without losing too much information?
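A numpy sketch of the idea at toy scale (the sizes below are stand-ins for 1M × 10M and k = 1000; at the real scale one would use a sparse solver such as scipy.sparse.linalg.svds rather than a dense SVD):

```python
import numpy as np

n_users, n_items, k = 1000, 5000, 10
A = np.random.rand(n_users, n_items)        # stand-in user-item matrix

U, s, Vt = np.linalg.svd(A, full_matrices=False)
users_k = U[:, :k] * s[:k]                  # each row: one user in k dimensions
print(users_k.shape)                        # (1000, 10): 10 numbers per user

# A user's full 5000-dimensional item vector can be approximately
# reconstructed from the k-dimensional representation when needed:
approx_user0 = users_k[0] @ Vt[:k, :]
```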