Eigen Decomposition Based on the slides by Mani Thomas and book by Gilbert Strang. Modified and extended by Longin Jan Latecki.

Introduction: eigenvalue decomposition, and the physical interpretation of eigenvalues/eigenvectors.

A(x) = (Ax) = (x) = (x) What are eigenvalues? Given a matrix, A, x is the eigenvector and  is the corresponding eigenvalue if Ax = x A must be square and the determinant of A -  I must be equal to zero Ax - x = 0 iff (A - I) x = 0 Trivial solution is if x = 0 The non trivial solution occurs when det(A - I) = 0 Are eigenvectors unique? If x is an eigenvector, then x is also an eigenvector and  is an eigenvalue A(x) = (Ax) = (x) = (x)

Calculating the eigenvectors/eigenvalues Expand det(A − λI) = 0. For a 2 × 2 matrix A = [a b; c d], this gives λ² − (a + d)λ + (ad − bc) = 0, a simple quadratic equation with two solutions (possibly complex). This "characteristic equation" is solved for λ; each root is then substituted back into (A − λI)x = 0 to find the corresponding eigenvector x.
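The 2 × 2 case can be checked by solving the quadratic λ² − trace(A)·λ + det(A) = 0 directly and comparing against a library eigenvalue routine. A sketch with assumed example entries:

```python
import numpy as np

# For a 2x2 matrix [[a, b], [c, d]], expanding det(A - lambda*I) = 0 gives
#   lambda^2 - (a + d)*lambda + (a*d - b*c) = 0,
# i.e. lambda^2 - trace(A)*lambda + det(A) = 0.
a, b, c, d = 3.0, 1.0, 1.0, 3.0      # illustrative entries, not from the slides
A = np.array([[a, b], [c, d]])

# Solve the quadratic characteristic equation directly ...
roots = np.roots([1.0, -(a + d), a * d - b * c])

# ... and compare with NumPy's eigenvalue routine.
eigvals = np.linalg.eigvals(A)
assert np.allclose(sorted(roots), sorted(eigvals))
```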

Eigenvalue example Consider A = [1 2; 2 4]. The characteristic equation det(A − λI) = λ² − 5λ = 0 gives λ = 0 and λ = 5. The corresponding eigenvectors can be computed from (A − λI)x = 0: for λ = 0, one possible solution is x = (2, −1); for λ = 5, one possible solution is x = (1, 2). For more information: Demos in Linear Algebra by G. Strang, http://web.mit.edu/18.06/www/
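The eigenpairs above can be verified numerically. The matrix A = [[1, 2], [2, 4]] is the (unique) 2 × 2 matrix consistent with the stated eigenpairs:

```python
import numpy as np

# A is consistent with the eigenpairs stated on the slide:
# lambda = 0 with x = (2, -1), and lambda = 5 with x = (1, 2).
A = np.array([[1.0, 2.0],
              [2.0, 4.0]])

# Check A x = lambda x for both eigenpairs.
assert np.allclose(A @ np.array([2.0, -1.0]), 0.0 * np.array([2.0, -1.0]))
assert np.allclose(A @ np.array([1.0, 2.0]), 5.0 * np.array([1.0, 2.0]))

# NumPy recovers the same eigenvalues.
eigvals = np.linalg.eigvals(A)
assert sorted(np.round(eigvals, 6).tolist()) == [0.0, 5.0]
```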

Eigen/diagonal decomposition Let S be a square matrix with m linearly independent eigenvectors (a "non-defective" matrix). Theorem: there exists an eigen decomposition S = UΛU⁻¹ (cf. the matrix diagonalization theorem). The columns of U are eigenvectors of S, and the diagonal elements of the diagonal matrix Λ are the eigenvalues of S. The decomposition is unique for distinct eigenvalues.

Diagonal decomposition: why/how Let U have the eigenvectors as columns. Applying S to each column scales it by its eigenvalue, so SU can be written SU = UΛ. Thus U⁻¹SU = Λ, and S = UΛU⁻¹.

Diagonal decomposition - example For a 2 × 2 matrix S, the two eigenvectors form the columns of U. Recall UU⁻¹ = I. Inverting U gives U⁻¹, and then S = UΛU⁻¹.
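The slide's concrete matrices were lost in transcription; the following sketch works the same computation on an assumed 2 × 2 symmetric example standing in for them:

```python
import numpy as np

# Assumed example matrix (the slide's own matrices did not survive).
S = np.array([[2.0, 1.0],
              [1.0, 2.0]])

eigvals, U = np.linalg.eig(S)   # columns of U are eigenvectors of S
Lam = np.diag(eigvals)          # Lambda: eigenvalues on the diagonal
U_inv = np.linalg.inv(U)

assert np.allclose(S @ U, U @ Lam)       # S U = U Lambda
assert np.allclose(S, U @ Lam @ U_inv)   # S = U Lambda U^{-1}
```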

Example continued Let's divide the columns of U (and correspondingly scale the rows of U⁻¹) by their lengths, i.e. normalize the eigenvectors. Then S = QΛQ⁻¹, where Q is orthogonal (Q⁻¹ = Qᵀ). Why? Stay tuned …

Symmetric Eigen Decomposition If S is a symmetric matrix: Theorem: there exists a (unique) eigen decomposition S = QΛQᵀ, where Q is orthogonal: Q⁻¹ = Qᵀ. The columns of Q are normalized eigenvectors, the columns are orthogonal, and everything is real.
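For symmetric matrices, NumPy's dedicated routine `np.linalg.eigh` returns real eigenvalues and an orthogonal Q directly, which makes the theorem easy to check (example matrix assumed):

```python
import numpy as np

# np.linalg.eigh is the routine for symmetric (Hermitian) matrices:
# it returns real eigenvalues and orthonormal eigenvectors.
S = np.array([[2.0, 1.0],
              [1.0, 2.0]])       # assumed symmetric example

eigvals, Q = np.linalg.eigh(S)

assert np.allclose(Q.T @ Q, np.eye(2))              # Q orthogonal: Q^{-1} = Q^T
assert np.allclose(S, Q @ np.diag(eigvals) @ Q.T)   # S = Q Lambda Q^T
```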

Physical interpretation Consider a covariance matrix A, i.e., A = (1/n) SSᵀ for some data matrix S. The eigenvectors of A give the axes of the ellipse describing the data's spread: the major axis lies along the eigenvector with the larger eigenvalue, the minor axis along the eigenvector with the smaller eigenvalue.

Physical interpretation [Figure: data plotted against Original Variable A and Original Variable B, with the axes PC 1 and PC 2 overlaid.] PC 1 and PC 2 are orthogonal directions of greatest variance in the data. Projections along PC 1 (the first Principal Component) discriminate the data most along any one axis.

Physical interpretation The first principal component is the direction of greatest variability (variance) in the data. The second is the next orthogonal (uncorrelated) direction of greatest variability: first remove all the variability along the first component, then find the next direction of greatest variability, and so on. Thus the eigenvectors give the directions of data variance, in decreasing order of their eigenvalues. For more information: see Gram-Schmidt orthogonalization in G. Strang's lectures.
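The whole pipeline described above — build the covariance matrix, eigen-decompose it, and read off the principal components in decreasing order of eigenvalue — can be sketched as follows (synthetic data; an illustration, not the slides' example):

```python
import numpy as np

rng = np.random.default_rng(0)
# Synthetic 2-D data with more variance along the first axis (illustrative).
X = rng.normal(size=(500, 2)) @ np.array([[3.0, 0.0],
                                          [0.0, 1.0]])

Xc = X - X.mean(axis=0)               # center the data
C = (Xc.T @ Xc) / len(Xc)             # covariance matrix, the (1/n) S S^T form

eigvals, eigvecs = np.linalg.eigh(C)  # C is symmetric, so use eigh
order = np.argsort(eigvals)[::-1]     # sort by decreasing eigenvalue
eigvals, eigvecs = eigvals[order], eigvecs[:, order]

# The first principal component captures the largest variance, and the
# variance of the projections onto it equals the largest eigenvalue.
assert eigvals[0] >= eigvals[1]
proj = Xc @ eigvecs[:, 0]
assert np.allclose(proj.var(), eigvals[0])
```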