Eigen Decomposition Based on the slides by Mani Thomas Modified and extended by Longin Jan Latecki

Introduction
- Eigenvalue decomposition
- Physical interpretation of eigenvalues/eigenvectors

A(x) = (Ax) = (x) = (x) What are eigenvalues? Given a matrix, A, x is the eigenvector and  is the corresponding eigenvalue if Ax = x A must be square and the determinant of A -  I must be equal to zero Ax - x = 0 iff (A - I) x = 0 Trivial solution is if x = 0 The non trivial solution occurs when det(A - I) = 0 Are eigenvectors unique? If x is an eigenvector, then x is also an eigenvector and  is an eigenvalue A(x) = (Ax) = (x) = (x)

Calculating the eigenvectors/eigenvalues Expand det(A − λI) = 0. For a 2 x 2 matrix A = [a b; c d], this gives det(A − λI) = (a − λ)(d − λ) − bc = λ² − (a + d)λ + (ad − bc) = 0, a simple quadratic equation with two solutions (maybe complex). This "characteristic equation" is solved for λ; substituting each λ back into (A − λI)x = 0 then gives the corresponding eigenvector x.
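The expansion above can be solved directly with the quadratic formula. A minimal sketch using only the Python standard library (the helper name eigenvalues_2x2 is ours, for illustration):

```python
import cmath

def eigenvalues_2x2(a, b, c, d):
    """Roots of det(A - lam*I) = lam^2 - (a+d)*lam + (a*d - b*c) for A = [a b; c d]."""
    trace = a + d
    det = a * d - b * c
    disc = cmath.sqrt(trace * trace - 4 * det)   # discriminant may be negative
    return (trace + disc) / 2, (trace - disc) / 2

print(eigenvalues_2x2(1, 2, 2, 4))   # ((5+0j), 0j) -> eigenvalues 5 and 0
```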

Eigenvalue example Consider A = [1 2; 2 4]. Then det(A − λI) = (1 − λ)(4 − λ) − 4 = λ² − 5λ = λ(λ − 5), so λ = 0 or λ = 5. The corresponding eigenvectors can be computed as follows: for λ = 0, one possible solution is x = (2, -1); for λ = 5, one possible solution is x = (1, 2). For more information: Demos in Linear Algebra by G. Strang, http://web.mit.edu/18.06/www/
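The example can be checked with NumPy's eigensolver; a minimal sketch (np.linalg.eig does not guarantee eigenvalue order, and it normalizes eigenvectors to unit length):

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [2.0, 4.0]])
vals, vecs = np.linalg.eig(A)
print(vals)                    # eigenvalues 5 and 0 (order may vary)
print(vecs)                    # columns: (1, 2) and (2, -1), each scaled to length 1

# Verify Ax = lam * x for each eigenpair.
for lam, v in zip(vals, vecs.T):
    print(np.allclose(A @ v, lam * v))   # True, True
```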

Let σ(A) be the set of all eigenvalues of A. Then σ(A) = σ(Aᵀ), where Aᵀ is the transpose of A. Proof: The matrix (A − λI)ᵀ is the same as the matrix Aᵀ − λI, since the identity matrix is symmetric. Because a matrix and its transpose have the same determinant, det(Aᵀ − λI) = det((A − λI)ᵀ) = det(A − λI). Hence A and Aᵀ have the same characteristic polynomial, and therefore the same eigenvalues.
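The theorem is a statement about exact arithmetic, but it can be illustrated numerically; a small sketch assuming NumPy, using a random non-symmetric matrix:

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((4, 4))            # random non-symmetric matrix

# Sort both spectra so they can be compared elementwise.
vals_A = np.sort_complex(np.linalg.eigvals(A))
vals_AT = np.sort_complex(np.linalg.eigvals(A.T))
print(np.allclose(vals_A, vals_AT))        # True
```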

Physical interpretation Consider a covariance matrix A, i.e., A = (1/n) S Sᵀ for some zero-mean data matrix S. Its eigen-decomposition describes the error ellipse of the data: the major axis lies along the eigenvector with the larger eigenvalue, and the minor axis along the eigenvector with the smaller eigenvalue (the axis lengths scale with the square roots of the eigenvalues).
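A minimal sketch of this construction, assuming NumPy (the sample covariance and sample size below are arbitrary illustrations):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 500
S = rng.multivariate_normal([0, 0], [[3, 1], [1, 1]], size=n).T  # 2 x n data
S = S - S.mean(axis=1, keepdims=True)     # center the data

A = (S @ S.T) / n                         # covariance matrix A = (1/n) S S^T
vals, vecs = np.linalg.eigh(A)            # eigh: A is symmetric; ascending eigenvalues
# Ellipse axes point along the eigenvectors; their half-lengths scale with
# sqrt(eigenvalue), so the larger eigenvalue gives the major axis.
print("axis directions:\n", vecs)
print("axis half-lengths ~", np.sqrt(vals))
```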

Physical interpretation [Figure: 2-D scatter of data over axes "Original Variable A" and "Original Variable B", with two orthogonal arrows labeled PC 1 and PC 2.] The principal components are the orthogonal directions of greatest variance in the data; projections along PC 1 (the first Principal Component) discriminate the data most along any single axis.

Physical interpretation The first principal component is the direction of greatest variability (variance) in the data. The second is the next orthogonal (uncorrelated) direction of greatest variability: first remove all the variability along the first component, then find the next direction of greatest variability, and so on. Thus each eigenvector gives a direction of variance in the data, and the eigenvalues order these directions by decreasing variance, as sketched below. For more information: see Gram-Schmidt Orthogonalization in G. Strang's lectures.
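Putting the above together, a minimal PCA sketch assuming NumPy: the eigenvectors of the covariance matrix, sorted by decreasing eigenvalue, are the principal components (the pca helper and the synthetic data are ours, for illustration):

```python
import numpy as np

def pca(X):
    """X: n_samples x n_features. Return (eigenvalues, components), sorted by decreasing variance."""
    Xc = X - X.mean(axis=0)                  # center each feature
    C = (Xc.T @ Xc) / len(Xc)                # covariance matrix, 1/n convention
    vals, vecs = np.linalg.eigh(C)           # eigh returns ascending eigenvalues
    order = np.argsort(vals)[::-1]           # re-sort: decreasing variance
    return vals[order], vecs[:, order]

rng = np.random.default_rng(2)
X = rng.multivariate_normal([0, 0, 0], np.diag([5.0, 2.0, 0.1]), size=1000)
vals, comps = pca(X)
print(vals)           # variances in decreasing order, roughly 5, 2, 0.1
print(comps[:, 0])    # first principal component: direction of greatest variance
```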