Lecture 20: SVD and Its Applications
Shang-Hua Teng

Spectral Theorem and Spectral Decomposition

Every symmetric matrix A can be written as

A = λ_1 x_1 x_1^T + λ_2 x_2 x_2^T + … + λ_n x_n x_n^T

where x_1, …, x_n are the n orthonormal eigenvectors of A and λ_1, …, λ_n are the corresponding eigenvalues; the eigenvectors are the principal axes of A. Each x_i x_i^T is the projection matrix onto x_i.
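A minimal numerical check of this decomposition, assuming NumPy and an arbitrary symmetric example matrix:

```python
import numpy as np

# An arbitrary 3 x 3 symmetric matrix (hypothetical example data).
A = np.array([[2.0, 1.0, 0.0],
              [1.0, 3.0, 1.0],
              [0.0, 1.0, 2.0]])

# eigh returns the eigenvalues and orthonormal eigenvectors of a symmetric matrix.
eigvals, X = np.linalg.eigh(A)

# Rebuild A as the sum of rank-one projections lambda_i * x_i x_i^T.
A_rebuilt = sum(lam * np.outer(x, x) for lam, x in zip(eigvals, X.T))
assert np.allclose(A, A_rebuilt)
```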

Singular Value Decomposition

Any m by n matrix A may be factored as

A = U Σ V^T

U: m by m, orthogonal; its columns are the left singular vectors of A
V: n by n, orthogonal; its columns are the right singular vectors of A
Σ: m by n, diagonal, with the r nonzero singular values σ_1 ≥ σ_2 ≥ … ≥ σ_r > 0 on its diagonal
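A minimal sketch of this factorization with NumPy (the 2 x 3 test matrix is an arbitrary assumption):

```python
import numpy as np

A = np.array([[1.0, 2.0, 3.0],
              [4.0, 5.0, 6.0]])                 # m = 2, n = 3

# full_matrices=True gives U (m x m) and V^T (n x n); s holds the singular values.
U, s, Vt = np.linalg.svd(A, full_matrices=True)

# Embed s into an m x n diagonal Sigma, then verify A = U Sigma V^T.
Sigma = np.zeros(A.shape)
np.fill_diagonal(Sigma, s)
assert np.allclose(A, U @ Sigma @ Vt)
```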

The Singular Value Decomposition

A = U Σ V^T, with shapes (m x n) = (m x m)(m x n)(n x n)

r = the rank of A = the number of linearly independent columns/rows

SVD Properties

U and V give us orthonormal bases for the fundamental subspaces of A:
First r columns of U: column space of A
Last m - r columns of U: left nullspace of A
First r columns of V: row space of A
Last n - r columns of V: nullspace of A

IMPLICATION: rank(A) = r
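A quick numerical check of these subspace bases, assuming NumPy and a rank-one example matrix:

```python
import numpy as np

# A 3 x 3 rank-one matrix (hypothetical example), so r = 1.
A = np.outer([1.0, 2.0, 3.0], [1.0, 0.0, -1.0])

U, s, Vt = np.linalg.svd(A)
r = int(np.sum(s > 1e-10))                # numerical rank

col_space  = U[:, :r]      # first r columns of U: column space of A
left_null  = U[:, r:]      # last m - r columns of U: left nullspace of A
row_space  = Vt[:r, :].T   # first r columns of V: row space of A
null_space = Vt[r:, :].T   # last n - r columns of V: nullspace of A

assert np.allclose(A @ null_space, 0)     # A v = 0 for every nullspace vector
assert np.allclose(left_null.T @ A, 0)    # u^T A = 0 for every left-nullspace vector
```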

The Singular Value Decomposition

Full SVD: A = U Σ V^T, with shapes (m x n) = (m x m)(m x n)(n x n)
Reduced SVD: A = U Σ V^T, with shapes (m x n) = (m x r)(r x r)(r x n)
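NumPy exposes the thin form directly; a sketch, assuming an arbitrary 4 x 3 rank-one test matrix:

```python
import numpy as np

A = np.outer([1.0, 2.0, 3.0, 4.0], [1.0, -1.0, 2.0])   # 4 x 3, rank 1

U, s, Vt = np.linalg.svd(A, full_matrices=False)        # thin SVD: min(m, n) columns
r = int(np.sum(s > 1e-10))

# Keep only the r columns/rows carrying nonzero singular values.
Ur, sr, Vtr = U[:, :r], s[:r], Vt[:r, :]
assert np.allclose(A, Ur @ np.diag(sr) @ Vtr)           # (m x r)(r x r)(r x n)
```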

Singular Value Decomposition

Equivalently, A is a sum of r rank-one matrices:

A = σ_1 u_1 v_1^T + σ_2 u_2 v_2^T + … + σ_r u_r v_r^T

where u_1, …, u_r are the r orthonormal vectors that form a basis of C(A) and v_1, …, v_r are the r orthonormal vectors that form a basis of C(A^T).
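The same identity checked numerically (a sketch with an arbitrary 3 x 2 matrix):

```python
import numpy as np

A = np.array([[3.0, 1.0],
              [1.0, 3.0],
              [0.0, 2.0]])                 # arbitrary 3 x 2 example

U, s, Vt = np.linalg.svd(A, full_matrices=False)
r = int(np.sum(s > 1e-10))

# A equals the sum of r rank-one pieces sigma_i * u_i v_i^T.
A_sum = sum(s[i] * np.outer(U[:, i], Vt[i, :]) for i in range(r))
assert np.allclose(A, A_sum)
```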

SVD Proof

Any m x n matrix A has two symmetric covariance matrices:
AA^T (m x m) and A^T A (n x n)

Spectral Decomposition of the Covariance Matrices

AA^T = U Λ_1 U^T (m x m); the columns of U are called the left singular vectors of A.
A^T A = V Λ_2 V^T (n x n); the columns of V are called the right singular vectors of A.

Claim: the nonzero eigenvalues of Λ_1 and Λ_2 are the same.
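A numerical check of the claim (a sketch; the random 4 x 3 matrix is an arbitrary assumption):

```python
import numpy as np

A = np.random.default_rng(0).standard_normal((4, 3))   # m = 4, n = 3

eig1 = np.linalg.eigvalsh(A @ A.T)    # eigenvalues of AA^T, ascending
eig2 = np.linalg.eigvalsh(A.T @ A)    # eigenvalues of A^T A, ascending

# The nonzero eigenvalues agree, and both equal the squared singular values of A.
s = np.linalg.svd(A, compute_uv=False)
assert np.allclose(eig2, np.sort(s**2))
assert np.allclose(eig1[-3:], eig2)   # AA^T has one extra (zero) eigenvalue
```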

Singular Value Decomposition Proof

From A^T A = V Λ_2 V^T, set σ_i = sqrt(λ_i) and u_i = A v_i / σ_i for each nonzero eigenvalue λ_i. The u_i are orthonormal eigenvectors of AA^T, and stacking these relations column by column gives A V = U Σ, that is, A = U Σ V^T.
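This construction can be carried out directly in NumPy; a sketch with an arbitrary 5 x 3 matrix of full column rank:

```python
import numpy as np

A = np.random.default_rng(3).standard_normal((5, 3))

# Eigendecomposition of A^T A gives V and the squared singular values.
lam, V = np.linalg.eigh(A.T @ A)
lam, V = lam[::-1], V[:, ::-1]          # reorder to descending

sigma = np.sqrt(lam)

# Construct the left singular vectors as u_i = A v_i / sigma_i.
U = A @ V / sigma

assert np.allclose(U.T @ U, np.eye(3))            # the u_i are orthonormal
assert np.allclose(A, U @ np.diag(sigma) @ V.T)   # A = U Sigma V^T
```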

All Singular Values Are Nonnegative

Since A^T A is positive semidefinite, its eigenvalues satisfy λ_i ≥ 0, so σ_i = sqrt(λ_i) ≥ 0.

Row and Column Space Projection

Suppose A is an m by n matrix of rank r, with r << n and r << m. Then A has r nonzero singular values. Let A = U Σ V^T be the SVD of A, where Σ is an r by r diagonal matrix. Examine the two factors U Σ and Σ V^T:

The Singular Value Projection

A = U Σ V^T, with shapes (m x n) = (m x r)(r x r)(r x n)

Therefore:

Rows of U Σ are r-dimensional projections of the rows of A.
Columns of Σ V^T are r-dimensional projections of the columns of A.

So we can compute their distances or dot products in a lower-dimensional space.
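A numerical check that the r-dimensional row projections preserve dot products (a sketch; the synthetic low-rank data is an assumption):

```python
import numpy as np

# Synthetic data: 200 rows of dimension 50, all lying in an r = 2 subspace.
rng = np.random.default_rng(1)
A = rng.standard_normal((200, 2)) @ rng.standard_normal((2, 50))

U, s, Vt = np.linalg.svd(A, full_matrices=False)
r = int(np.sum(s > 1e-10))
rows_r = U[:, :r] * s[:r]          # rows of U Sigma: r-dimensional row projections

# All pairwise dot products (and hence distances) are preserved.
assert np.allclose(A @ A.T, rows_r @ rows_r.T)
```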

Eigenvalues and Determinants

Product law: λ_1 λ_2 … λ_n = det(A)
Summation law: λ_1 + λ_2 + … + λ_n = trace(A)

Both can be proved by examining the characteristic polynomial det(A - λI).
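Both laws are easy to confirm numerically (a sketch with an arbitrary random 4 x 4 matrix):

```python
import numpy as np

A = np.random.default_rng(2).standard_normal((4, 4))
eigvals = np.linalg.eigvals(A)    # possibly complex for a nonsymmetric A

# Product of eigenvalues = det(A); sum of eigenvalues = trace(A).
assert np.allclose(np.prod(eigvals), np.linalg.det(A))
assert np.allclose(np.sum(eigvals), np.trace(A))
```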

Eigenvalues and Pivots

If A is symmetric, the number of positive (negative) eigenvalues equals the number of positive (negative) pivots in A = LDL^T.

Topological proof: scale the off-diagonal entries of L continuously down to 0, i.e., move L continuously to I. The eigenvalues vary continuously from those of A = LDL^T to the pivots in D, and any eigenvalue that changed sign along the way would have to cross 0, which is impossible since every matrix on the path is nonsingular (assuming no zero pivots).
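SciPy's LDL^T routine lets us check this (a sketch; scipy.linalg.ldl may return 2 x 2 blocks in D for an indefinite matrix, so we count the signs of D's eigenvalues):

```python
import numpy as np
from scipy.linalg import ldl

# A symmetric indefinite matrix (arbitrary example: two positive, one negative eigenvalue).
A = np.array([[ 1.0,  2.0,  0.0],
              [ 2.0, -3.0,  1.0],
              [ 0.0,  1.0,  2.0]])

L, D, perm = ldl(A)
pivot_signs = np.sign(np.linalg.eigvalsh(D))   # D is block diagonal and symmetric
eigen_signs = np.sign(np.linalg.eigvalsh(A))

# Same counts of positive and of negative signs (Sylvester's law of inertia).
assert np.sum(pivot_signs > 0) == np.sum(eigen_signs > 0)
assert np.sum(pivot_signs < 0) == np.sum(eigen_signs < 0)
```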

Next Lecture

Dimensionality reduction for Latent Semantic Analysis
Eigenvalue problems in Web analysis