Eigen Decomposition. Based on the slides by Mani Thomas; modified and extended by Longin Jan Latecki.

Introduction
- Eigenvalue decomposition
- Physical interpretation of eigenvalues and eigenvectors

What are eigenvalues?
- Given a square matrix A, a nonzero vector x is an eigenvector and λ is the corresponding eigenvalue if Ax = λx.
- Rearranging, Ax − λx = 0, i.e., (A − λI)x = 0. The trivial solution is x = 0; a nontrivial solution exists exactly when det(A − λI) = 0.
- Are eigenvectors unique? No. If x is an eigenvector with eigenvalue λ, then any scalar multiple cx (c ≠ 0) is also an eigenvector with the same eigenvalue: A(cx) = c(Ax) = c(λx) = λ(cx).
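A minimal NumPy sketch of this definition (the matrix A below is an assumed example, not from the slides): it checks Ax = λx for each eigenpair and confirms that scalar multiples of an eigenvector are still eigenvectors.

```python
import numpy as np

# Assumed example matrix (not from the slides).
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

eigenvalues, eigenvectors = np.linalg.eig(A)  # eigenvectors are the columns

for i, lam in enumerate(eigenvalues):
    x = eigenvectors[:, i]
    assert np.allclose(A @ x, lam * x)              # Ax = lambda * x
    c = 3.7                                         # any nonzero scalar
    assert np.allclose(A @ (c * x), lam * (c * x))  # cx is also an eigenvector

print(eigenvalues)  # [3. 1.] for this symmetric A
```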

Calculating the eigenvectors/eigenvalues
- Expand det(A − λI) = 0. For a 2 × 2 matrix A = [[a, b], [c, d]], this gives det(A − λI) = (a − λ)(d − λ) − bc = λ² − (a + d)λ + (ad − bc) = 0.
- This "characteristic equation" is a simple quadratic in λ with two solutions (possibly complex). Solving it yields the eigenvalues; substituting each λ back into (A − λI)x = 0 then yields the corresponding eigenvectors x (see the sketch below).
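The quadratic can be solved directly; in the sketch below, eig2x2 is a hypothetical helper and the test matrices are assumed examples.

```python
import numpy as np

def eig2x2(a, b, c, d):
    """Eigenvalues of [[a, b], [c, d]] from the characteristic equation
    lambda^2 - (a + d)*lambda + (a*d - b*c) = 0, via the quadratic formula."""
    trace, det = a + d, a * d - b * c
    # The discriminant may be negative, so take the square root over the
    # complex numbers to allow complex-conjugate eigenvalue pairs.
    disc = np.sqrt(complex(trace * trace - 4 * det))
    return (trace + disc) / 2, (trace - disc) / 2

print(eig2x2(1, 2, 2, 4))   # ((5+0j), 0j): a singular symmetric matrix
print(eig2x2(0, -1, 1, 0))  # (1j, -1j): a rotation has complex eigenvalues
```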

Eigenvalue example
- Consider A = [[1, 2], [2, 4]]. Then det(A − λI) = λ² − 5λ = 0, so the eigenvalues are λ = 0 and λ = 5.
- The corresponding eigenvectors can be computed from (A − λI)x = 0:
  - For λ = 0, one possible solution is x = (2, −1).
  - For λ = 5, one possible solution is x = (1, 2).
- For more information: demos in Linear Algebra by G. Strang.
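A quick NumPy check of this example, writing the matrix out explicitly from the eigenpairs stated above:

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [2.0, 4.0]])

print(A @ np.array([2.0, -1.0]))  # [0. 0.]   -> eigenvalue 0
print(A @ np.array([1.0, 2.0]))   # [5. 10.]  -> 5 * (1, 2), eigenvalue 5

# Cross-check; NumPy normalizes eigenvectors, so compare directions only.
vals, vecs = np.linalg.eig(A)
print(vals)  # [0. 5.] (possibly in a different order)
```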

Physical interpretation
- Consider a covariance matrix A, i.e., A = (1/n) S Sᵀ for some centered data matrix S.
- The error ellipse of the data has its major axis along the eigenvector with the larger eigenvalue and its minor axis along the eigenvector with the smaller eigenvalue; the axis lengths scale with the square roots of the eigenvalues (see the sketch below).
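A sketch with assumed synthetic data: build A = (1/n) S Sᵀ from centered samples and read the ellipse axes off the eigendecomposition.

```python
import numpy as np

rng = np.random.default_rng(0)

# Assumed synthetic data: n correlated 2-D samples, one per column of S.
n = 1000
S = rng.multivariate_normal([0, 0], [[3.0, 1.0], [1.0, 1.0]], size=n).T
S = S - S.mean(axis=1, keepdims=True)  # center so A is a covariance matrix

A = (S @ S.T) / n

# eigh handles symmetric matrices; eigenvalues come back in ascending order.
eigenvalues, eigenvectors = np.linalg.eigh(A)

# Ellipse axes: directions are the eigenvectors, semi-axis lengths scale
# with sqrt(eigenvalue).
print("minor axis:", eigenvectors[:, 0], "length ~", np.sqrt(eigenvalues[0]))
print("major axis:", eigenvectors[:, 1], "length ~", np.sqrt(eigenvalues[1]))
```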

Physical interpretation
- The eigenvectors give orthogonal directions of greatest variance in the data.
- Projections along PC 1 (the first principal component) spread the data out more than projections along any other single axis.
[Figure: 2-D scatter of the data over Original Variable A and Original Variable B, with the orthogonal directions PC 1 and PC 2 overlaid.]
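A sketch with assumed synthetic variables standing in for Original Variables A and B, checking that the variance of the projection onto PC 1 exceeds the variance along either original axis:

```python
import numpy as np

rng = np.random.default_rng(1)

# Assumed synthetic data: the two columns play the roles of A and B.
X = rng.multivariate_normal([0, 0], [[3.0, 1.5], [1.5, 2.0]], size=500)
X = X - X.mean(axis=0)

C = np.cov(X, rowvar=False)
eigenvalues, eigenvectors = np.linalg.eigh(C)
pc1 = eigenvectors[:, -1]      # eigenvector of the largest eigenvalue

print((X @ pc1).var(ddof=1))   # variance along PC 1 ...
print(X.var(axis=0, ddof=1))   # ... beats the variance along A or B alone
```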

Physical interpretation
- The first principal component is the direction of greatest variability (variance) in the data.
- The second is the next orthogonal (uncorrelated) direction of greatest variability: remove all the variability along the first component, then find the direction of greatest remaining variability, and so on.
- Thus the eigenvectors give the directions of data variance, ordered by decreasing eigenvalue (see the sketch below).
- For more information: see Gram-Schmidt orthogonalization in G. Strang's lectures.
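A compact PCA sketch along these lines (pca is a hypothetical helper and the data are assumed): eigendecompose the covariance matrix and sort the eigenpairs by decreasing eigenvalue, so successive columns are the orthogonal directions of decreasing variance.

```python
import numpy as np

def pca(X):
    """Principal axes of X (n_samples x n_features) via eigendecomposition,
    returned in decreasing order of eigenvalue (variance)."""
    Xc = X - X.mean(axis=0)                # remove the mean
    C = (Xc.T @ Xc) / len(Xc)              # covariance matrix
    eigenvalues, eigenvectors = np.linalg.eigh(C)
    order = np.argsort(eigenvalues)[::-1]  # largest variance first
    return eigenvalues[order], eigenvectors[:, order]

rng = np.random.default_rng(2)
X = rng.multivariate_normal([0, 0, 0], np.diag([4.0, 2.0, 0.5]), size=300)

vals, vecs = pca(X)
print(vals)           # variances in decreasing order, roughly [4, 2, 0.5]
print(vecs.T @ vecs)  # ~ identity matrix: the directions are orthogonal
```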