CPSC 491, Xin Liu, November 22, 2010: A Change of Bases (SVD)

CPSC 491 Xin Liu November 22, 2010

A Change of Bases
A = UΣV^T, where A is m×n, U (m×m) and V (n×n) are unitary matrices, and Σ (m×n) is diagonal. The columns of a unitary matrix form an orthonormal basis, so:
Any b in R^m can be expanded in {u_1, u_2, ..., u_m}: b = Ub', b' = U^T b
Any x in R^n can be expanded in {v_1, v_2, ..., v_n}: x = Vx', x' = V^T x
Substituting into b = Ax gives U^T b = U^T Ax = U^T UΣV^T x, i.e., b' = Σx'
A reduces to the diagonal matrix Σ when the range is expressed in the basis of the columns of U and the domain is expressed in the basis of the columns of V
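As a minimal sketch of the identity b' = Σx' above (the small random matrix is an illustrative assumption, not from the slides), NumPy's SVD can verify it numerically:

```python
import numpy as np

rng = np.random.default_rng(0)
m, n = 4, 3
A = rng.standard_normal((m, n))

# full_matrices=True yields U (m x m) and Vt (n x n), as in the slide.
U, s, Vt = np.linalg.svd(A, full_matrices=True)
Sigma = np.zeros((m, n))
Sigma[:n, :n] = np.diag(s)

x = rng.standard_normal(n)
b = A @ x

b_prime = U.T @ b   # coordinates of b in the basis {u_1, ..., u_m}
x_prime = Vt @ x    # coordinates of x in the basis {v_1, ..., v_n}

# In these bases, A acts as the diagonal matrix Sigma: b' = Sigma x'
ok = np.allclose(b_prime, Sigma @ x_prime)
```

The check holds because U^T U = I, so U^T b = Σ V^T x exactly as derived on the slide.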

Matrix Properties via SVD
Theorem 1: The rank of A equals r, the number of nonzero singular values.
Proof: A = UΣV^T; U and V have full rank, so rank(A) = rank(Σ) = r.
Theorem 2: range(A) = span{u_1, ..., u_r} and null(A) = span{v_{r+1}, ..., v_n}
Theorem 3: ||A||_2 = σ_1 and ||A||_F = sqrt(σ_1^2 + σ_2^2 + ... + σ_r^2)
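A short sketch of Theorems 1 and 3 (the rank-2 example matrix is an assumption for illustration): the rank, 2-norm, and Frobenius norm can all be read off the singular values.

```python
import numpy as np

rng = np.random.default_rng(1)
# Build a 5x4 matrix of rank 2 as the product of 5x2 and 2x4 factors.
A = rng.standard_normal((5, 2)) @ rng.standard_normal((2, 4))

s = np.linalg.svd(A, compute_uv=False)   # singular values, descending
tol = 1e-10
r = int(np.sum(s > tol))                 # Theorem 1: rank = # nonzero sigma_i

two_norm = s[0]                          # Theorem 3: ||A||_2 = sigma_1
fro_norm = np.sqrt(np.sum(s[:r] ** 2))   # ||A||_F = sqrt(sigma_1^2 + ... + sigma_r^2)
```

In floating point the "zero" singular values are merely tiny, so a tolerance is needed when counting them; this is exactly how numerical rank is usually computed.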

Matrix Properties via SVD
Theorem 4: The nonzero singular values of A are the square roots of the nonzero eigenvalues of A^T A or AA^T. (Recall: if Ax = λx for a nonzero vector x, then λ is an eigenvalue of A.)
Theorem 5: If A = A^T, then the singular values of A are the absolute values of the eigenvalues of A.
Theorem 6: For a square matrix A (m×m), |det(A)| = Π_{i=1}^m σ_i. This gives a way to compute the determinant.
Proof: |det(A)| = |det(UΣV^T)| = |det(U)| |det(Σ)| |det(V^T)| = |det(Σ)| = Π_{i=1}^m σ_i
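Theorems 4 and 6 can be checked numerically on a small example (the 3×3 random matrix is an assumed illustration):

```python
import numpy as np

rng = np.random.default_rng(2)
A = rng.standard_normal((3, 3))

s = np.linalg.svd(A, compute_uv=False)   # descending

# Theorem 4: eigenvalues of A^T A are the squared singular values.
eig = np.sort(np.linalg.eigvalsh(A.T @ A))[::-1]   # sort descending to match s
match_eigs = np.allclose(np.sqrt(eig), s)

# Theorem 6: |det(A)| equals the product of the singular values.
match_det = np.isclose(abs(np.linalg.det(A)), np.prod(s))
```

`eigvalsh` is used because A^T A is symmetric, which also guarantees the eigenvalues are real and nonnegative before the square root is taken.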

Low-Rank Approximations
Theorem 7: A is the sum of r rank-one matrices: A = Σ_{j=1}^r σ_j u_j v_j^T
Proof: Write Σ = diag(σ_1, 0, ..., 0) + ... + diag(0, ..., 0, σ_r, 0, ..., 0) and expand the matrix product A = UΣV^T term by term.
For any 0 ≤ ν ≤ r, the partial sum A_ν = Σ_{j=1}^ν σ_j u_j v_j^T captures as much of the energy of A as possible among rank-ν matrices, where "energy" is defined by either the 2-norm or the Frobenius norm; in the 2-norm, ||A − A_ν||_2 = σ_{ν+1}.
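A sketch of the truncated sum and its error (the 6×5 random matrix and the choice ν = 2 are assumptions for illustration): the 2-norm error of the rank-ν partial sum equals the next singular value σ_{ν+1}.

```python
import numpy as np

rng = np.random.default_rng(3)
A = rng.standard_normal((6, 5))
U, s, Vt = np.linalg.svd(A, full_matrices=False)

nu = 2
# Partial sum of the first nu rank-one terms sigma_j u_j v_j^T.
A_nu = sum(s[j] * np.outer(U[:, j], Vt[j, :]) for j in range(nu))

err = np.linalg.norm(A - A_nu, 2)
matches_next_sigma = np.isclose(err, s[nu])   # ||A - A_nu||_2 = sigma_{nu+1}
```

This truncation is the usual route to SVD-based compression: keeping the ν largest terms discards the least "energy" possible for a rank-ν matrix.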

Applications
Determine the rank of a matrix
Find an orthonormal basis for the range or nullspace of a matrix
Solve systems of linear equations
Compute ||A||_2
Least-squares fitting
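As a sketch of the last application (the over-determined 8×3 system is an assumed example): the SVD gives the least-squares solution via the pseudoinverse, x = V Σ^{-1} U^T b.

```python
import numpy as np

rng = np.random.default_rng(4)
A = rng.standard_normal((8, 3))   # over-determined: more equations than unknowns
b = rng.standard_normal(8)

U, s, Vt = np.linalg.svd(A, full_matrices=False)
# x = V Sigma^{-1} U^T b minimizes ||Ax - b||_2 (A has full column rank here).
x_svd = Vt.T @ ((U.T @ b) / s)

# Cross-check against NumPy's built-in least-squares solver.
x_lstsq, *_ = np.linalg.lstsq(A, b, rcond=None)
agree = np.allclose(x_svd, x_lstsq)
```

Dividing by the singular values is where rank deficiency would show up in practice: near-zero σ_i are typically dropped (as `rcond` does in `lstsq`) rather than inverted.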