CS 395/495-26: Spring 2004 IBMR: Singular Value Decomposition (SVD Review) Jack Tumblin

Matrix Multiply: A Change-of-Axes

Matrix multiply: Ax = b
– x and b are column vectors
– A has m rows, n columns

    [ a11 … a1n ] [ x1 ]   [ b1 ]
    [  …  …  …  ] [ …  ] = [ …  ]
    [ am1 … amn ] [ xn ]   [ bm ]

[Figure: Ax = b maps an input vector x (N-dim, drawn in 2D) to an output vector b (M-dim, drawn in 3D).]

Matrix Multiply: A Change-of-Axes

Matrix multiply: Ax = b
– x and b are column vectors
– A has m rows, n columns

Matrix multiply is just a set of dot products; it 'changes coordinates' for vector x to make b.
– Rows of A = A1, A2, A3, … = new coordinate axes
– Ax = one dot product per new axis

[Figure: the rows A1, A2, A3 of A act as new coordinate axes; b1, b2, b3 are the dot products of x with each axis.]
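The row-wise dot-product view of Ax = b can be checked concretely in NumPy; the matrix and vector below are arbitrary illustrative examples, not from the slides:

```python
import numpy as np

# A hypothetical 3x2 example: maps 2-D inputs to 3-D outputs.
A = np.array([[1.0, 2.0],
              [0.0, 1.0],
              [3.0, 0.0]])
x = np.array([2.0, 1.0])

# Matrix multiply...
b = A @ x                                  # b = [4, 1, 6]

# ...is the same as one dot product per row of A
# (each row of A acts as a new coordinate axis).
b_rows = np.array([row @ x for row in A])
assert np.allclose(b, b_rows)
```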

How Does a Matrix 'Stretch Space'?

– The sphere of all unit-length x → an ellipsoid of b
– The output ellipsoid's axes are always perpendicular (true for any ellipsoid)
– **THUS** we can make perpendicular, unit-length axes…

[Figure: the unit circle of inputs x maps under Ax = b to an ellipse in output space whose perpendicular axes are u1, u2.]
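The sphere-to-ellipsoid claim can be sanity-checked numerically: mapping many unit-length inputs through an arbitrary example matrix, the longest and shortest images match the largest and smallest singular values (a sketch assuming NumPy; the random matrix is illustrative only):

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((3, 2))   # arbitrary example matrix

# Sample the unit circle of inputs x and map each through A.
theta = np.linspace(0.0, 2.0 * np.pi, 2000)
X = np.stack([np.cos(theta), np.sin(theta)])   # unit-length inputs, 2x2000
B = A @ X                                      # their images, 3x2000

lengths = np.linalg.norm(B, axis=0)
s = np.linalg.svd(A, compute_uv=False)

# The image ellipse's longest/shortest radii are the extreme singular values.
assert np.isclose(lengths.max(), s[0], atol=1e-3)
assert np.isclose(lengths.min(), s[-1], atol=1e-3)
```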

SVD Finds: 'Output' Vectors U, and…

– The sphere of all unit-length x → an ellipsoid of b
– The output ellipsoid's axes form orthonormal basis vectors Ui
– The basis vectors Ui are the columns of the U matrix

SVD(A) = USV^T        columns of U = OUTPUT basis vectors

[Figure: the ellipse of outputs b, with its axes u1, u2 forming the output basis.]

SVD Finds: …'Input' Vectors V, and…

For each Ui, find a matching Vi that:
– transforms to Ui (with scale si):  A Vi = si Ui
– together, the Vi form an orthonormal basis of the input space

SVD(A) = USV^T        columns of V = INPUT basis vectors

[Figure: input basis vectors v1, v2 map to the output ellipsoid's axes u1, u2.]

SVD Finds: …'Scale' Factors S.

– We have unit-length output vectors Ui,
– each from a unit-length input vector Vi, so
– we need 'scale factors' (the singular values) si to link input to output:  A Vi = si Ui

SVD(A) = USV^T

[Figure: the ellipsoid axes have lengths s1, s2: each unit input Vi stretches to si Ui.]
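The linking relation A Vi = si Ui can be verified with NumPy's SVD (the random matrix is just an illustrative example):

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((3, 2))   # arbitrary example matrix

# Thin SVD: U is 3x2, s holds the singular values, Vt is 2x2.
U, s, Vt = np.linalg.svd(A, full_matrices=False)

# Each unit input axis V_i maps to s_i times the unit output axis U_i.
for i in range(len(s)):
    v_i = Vt[i]       # rows of Vt = input basis vectors V_i
    u_i = U[:, i]     # columns of U = output basis vectors U_i
    assert np.allclose(A @ v_i, s[i] * u_i)
```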

SVD Review: What Is It?

Finish: SVD(A) = USV^T
– add any 'missing' Ui or Vi, and define their si = 0
– singular-value matrix S: its diagonal entries are the si, the 'V-to-U scale factors'
– matrices U and V have columns Ui and Vi

A = USV^T: 'find input & output axes, linked by scale'
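A minimal NumPy sketch of the full decomposition, checking that U and V are orthonormal and that A = USV^T (the matrix here is an arbitrary example):

```python
import numpy as np

A = np.array([[2.0, 0.0],
              [1.0, 1.0],
              [0.0, 2.0]])

# Full SVD: U is 3x3, Vt is 2x2; s holds the min(m, n) singular values.
U, s, Vt = np.linalg.svd(A, full_matrices=True)

# Embed the singular values on the diagonal of an m x n matrix S.
S = np.zeros(A.shape)
S[:len(s), :len(s)] = np.diag(s)

# U and V are orthonormal, and A factors exactly as U S V^T.
assert np.allclose(U @ U.T, np.eye(3))
assert np.allclose(Vt @ Vt.T, np.eye(2))
assert np.allclose(U @ S @ Vt, A)
```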

'Let SVDs Explain It All for You'

– Cool! U and V are orthonormal: U^-1 = U^T, V^-1 = V^T
– Rank(A)?  = the number of non-zero singular values si
– 'Ill-conditioned'?  Some si are nearly zero
– 'Invert' a non-square matrix A?  Amazing!  Use the pseudo-inverse: A = USV^T, so A+ = V S^-1 U^T (and A+ A = I when A has full column rank)

A = USV^T: 'find input & output axes, linked by scale'
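A small sketch of the pseudo-inverse built directly from the SVD, assuming an example matrix with full column rank (so every si is nonzero and A+ A = I):

```python
import numpy as np

# Over-determined example: more rows than columns, so no exact inverse exists.
A = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [1.0, 1.0]])

U, s, Vt = np.linalg.svd(A, full_matrices=False)

# Pseudo-inverse A+ = V S^-1 U^T (inverting the nonzero singular values).
A_pinv = Vt.T @ np.diag(1.0 / s) @ U.T

# A+ A = I because A has full column rank; matches NumPy's built-in pinv.
assert np.allclose(A_pinv @ A, np.eye(2))
assert np.allclose(A_pinv, np.linalg.pinv(A))
```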

'Solve for the Null Space': Ax = 0

– Easy! x is any Vi axis that doesn't affect the output (si = 0)
– Are all si nonzero? Then the only answer is x = 0
– It's a null 'space' because Vi spans a dimension: x = a Vi
– More than one zero-valued si? Then the null space has dimension > 1: x = a Vi1 + b Vi2 + …
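Reading the null space off the SVD can be sketched as follows; the singular example matrix is hypothetical (its third column is all zeros, so its null space is the one-dimensional span of [0, 0, 1]):

```python
import numpy as np

# A rank-2 example: the third column is zero, so A @ [0, 0, 1] = 0.
A = np.array([[1.0, 0.0, 0.0],
              [0.0, 1.0, 0.0],
              [1.0, 1.0, 0.0]])

U, s, Vt = np.linalg.svd(A)

# Singular values near zero mark null-space directions; the matching
# rows of Vt (i.e. columns of V) span the null space of A.
tol = 1e-10
null_vecs = Vt[s < tol]

assert len(null_vecs) == 1          # one zero s_i -> 1-D null space
for v in null_vecs:
    assert np.allclose(A @ v, 0.0)  # each basis vector satisfies Ax = 0
```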

More on SVDs: Handout

– Zisserman, Appendix 3, pg. 556 (cryptic)
– Linear algebra books
– Good online introduction (your handout): Leach, S., "Singular Value Decomposition – A Primer," unpublished manuscript, Department of Computer Science, Brown University
– Google 'singular value decomposition'…

END