PCA is “an orthogonal linear transformation that transforms the data to a new coordinate system such that the greatest variance by any projection of the data comes to lie on the first coordinate (first principal component), the second greatest variance lies on the second coordinate (second principal component), and so on.”

Background for PCA. Suppose the attributes are A1 and A2, and we have n training examples; x’s denote values of A1 and y’s denote values of A2 over the training examples. Variance of an attribute: var(X) = (1/(n − 1)) Σ_i (x_i − x̄)².
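As a minimal sketch (with made-up attribute values, not data from the slides), the sample variance with the n − 1 denominator can be computed in NumPy as follows:

```python
import numpy as np

# Hypothetical values of attribute A1 over n = 5 training examples.
x = np.array([2.0, 1.0, 3.0, 4.0, 5.0])

# Sample variance: squared deviations from the mean, divided by n - 1.
var_x = np.sum((x - x.mean()) ** 2) / (len(x) - 1)

# np.var with ddof=1 uses the same n - 1 denominator.
assert np.isclose(var_x, np.var(x, ddof=1))
print(var_x)
```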

Covariance of two attributes: cov(X, Y) = (1/(n − 1)) Σ_i (x_i − x̄)(y_i − ȳ). If the covariance is positive, the two dimensions tend to increase together; if negative, as one increases the other decreases; if zero, the two dimensions are (linearly) uncorrelated.
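A small sketch of the covariance of two attributes, again with made-up numbers; the sign of the off-diagonal entry of np.cov matches the interpretation above:

```python
import numpy as np

# Hypothetical paired values of attributes A1 (x) and A2 (y).
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 2.9, 4.2, 4.8, 6.1])   # y tends to increase with x

# Sample covariance with the n - 1 denominator.
cov_xy = np.sum((x - x.mean()) * (y - y.mean())) / (len(x) - 1)

# np.cov returns the 2x2 covariance matrix; entry [0, 1] is cov(x, y).
assert np.isclose(cov_xy, np.cov(x, y)[0, 1])
print(cov_xy)   # positive here: the two attributes increase together
```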

Covariance matrix. Suppose we have n attributes, A1, ..., An. The covariance matrix is the n × n matrix C whose entry C_ij = cov(A_i, A_j); the diagonal entries C_ii are the variances var(A_i).
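A sketch of building the full covariance matrix for several attributes, assuming the data are stored as an (examples × attributes) NumPy array:

```python
import numpy as np

# Hypothetical data: 6 training examples (rows), 3 attributes (columns).
data = np.array([
    [2.5, 2.4, 1.1],
    [0.5, 0.7, 0.4],
    [2.2, 2.9, 1.0],
    [1.9, 2.2, 0.9],
    [3.1, 3.0, 1.3],
    [2.3, 2.7, 1.2],
])

# rowvar=False: each column is an attribute, each row is an example.
C = np.cov(data, rowvar=False)   # shape (3, 3); C[i, j] = cov(A_i, A_j)
print(C)
print(np.diag(C))                # diagonal entries are the attribute variances
```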

Eigenvectors: Let M be an n × n matrix. v is an eigenvector of M if M v = λ v; λ is called the eigenvalue associated with v. For any eigenvector v of M and scalar a, M (a v) = λ (a v), so a v is also an eigenvector; thus you can always choose eigenvectors of length 1. If M is symmetric (as a covariance matrix is), it has n such eigenvectors, and they can be chosen orthogonal to one another. Thus the eigenvectors can be used as a new basis for an n-dimensional vector space.
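A minimal sketch of finding unit eigenvectors with NumPy; np.linalg.eigh is intended for symmetric matrices (such as covariance matrices) and returns orthonormal eigenvectors:

```python
import numpy as np

# A small symmetric (covariance-like) matrix; values chosen only for illustration.
M = np.array([[2.0, 1.0],
              [1.0, 2.0]])

eigenvalues, eigenvectors = np.linalg.eigh(M)   # unit eigenvectors in the columns

# Check the defining property M v = lambda v for each eigenpair.
for lam, v in zip(eigenvalues, eigenvectors.T):
    assert np.allclose(M @ v, lam * v)
    assert np.isclose(np.linalg.norm(v), 1.0)   # unit length

# The eigenvectors are orthogonal, so they can serve as a new basis.
assert np.allclose(eigenvectors.T @ eigenvectors, np.eye(2))
```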

PCA. Given the original data set S = {x1, ..., xk}, produce a new set by subtracting the mean of each attribute Ai from every example, so that every attribute has mean 0. In the running two-attribute example, the original data have mean (1.81, 1.91) and the mean-adjusted data have mean (0, 0).
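A sketch of the mean-subtraction step on hypothetical 2-D data (the numbers are placeholders, not the slides’ example):

```python
import numpy as np

# Hypothetical data set S: rows are examples, columns are attributes A1, A2.
S = np.array([[2.5, 2.4],
              [0.5, 0.7],
              [2.2, 2.9],
              [1.9, 2.2],
              [3.1, 3.0]])

mean = S.mean(axis=0)            # per-attribute mean
data_adjust = S - mean           # subtract each attribute's mean

print(mean)                      # original per-attribute means
print(data_adjust.mean(axis=0))  # approximately (0, 0) after adjustment
```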

Calculate the covariance matrix of the mean-adjusted data. Then calculate the (unit) eigenvectors and eigenvalues of that covariance matrix.
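A sketch of this step on hypothetical mean-adjusted data (made-up numbers whose columns already have zero mean):

```python
import numpy as np

# Hypothetical mean-adjusted data (rows = examples, columns = attributes).
data_adjust = np.array([[ 0.7,  0.5],
                        [-1.3, -1.2],
                        [ 0.4,  1.0],
                        [ 0.1,  0.3],
                        [ 1.3,  1.1],
                        [-1.2, -1.7]])

C = np.cov(data_adjust, rowvar=False)          # 2x2 covariance matrix
eigenvalues, eigenvectors = np.linalg.eigh(C)  # unit eigenvectors in the columns

print(C)
print(eigenvalues)    # variance captured along each eigenvector
print(eigenvectors)
```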

The eigenvector with the largest eigenvalue traces the strongest linear pattern in the data.

Order the eigenvectors by eigenvalue, highest to lowest. In general, you get n components. To reduce dimensionality to p, ignore the n − p components at the bottom of the list.
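A sketch of ordering the eigenpairs and keeping the top p components (np.linalg.eigh returns eigenvalues in ascending order, so they must be reversed); the covariance matrix here is a made-up example:

```python
import numpy as np

# Hypothetical covariance matrix of n = 3 attributes.
C = np.array([[1.0, 0.8, 0.2],
              [0.8, 1.2, 0.3],
              [0.2, 0.3, 0.5]])

eigenvalues, eigenvectors = np.linalg.eigh(C)   # ascending eigenvalues

# Order from highest to lowest eigenvalue.
order = np.argsort(eigenvalues)[::-1]
eigenvalues = eigenvalues[order]
eigenvectors = eigenvectors[:, order]

# Keep the top p components; the n - p smallest are discarded.
p = 2
feature_vector = eigenvectors[:, :p]   # (v1, v2, ..., vp) as columns
print(eigenvalues)
print(feature_vector)
```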

Construct the new feature vector from the retained eigenvectors: FeatureVector = (v1, v2, ..., vp).

Derive the new data set: TransformedData = RowFeatureVector × RowDataAdjust, where RowFeatureVector has the chosen eigenvectors as its rows and RowDataAdjust holds the mean-adjusted data items in its columns. This gives the original data in terms of the chosen components (eigenvectors), that is, along these axes.
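A sketch of the transformation step using the RowFeatureVector / RowDataAdjust layout named above (eigenvectors as rows, mean-adjusted items as columns); the arrays are placeholders:

```python
import numpy as np

# Hypothetical mean-adjusted data: one item per column (2 attributes x 5 items).
row_data_adjust = np.array([[ 0.7, -1.3,  0.4,  0.1,  0.1],
                            [ 0.5, -1.2,  1.0,  0.3, -0.6]])

# Hypothetical chosen unit eigenvector(s), one per row (here p = 1 component).
row_feature_vector = np.array([[0.707, 0.707]])

# TransformedData = RowFeatureVector x RowDataAdjust.
transformed_data = row_feature_vector @ row_data_adjust
print(transformed_data)   # coordinates of each item along the chosen component
```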

Reconstructing the original data. We did TransformedData = RowFeatureVector × RowDataAdjust, so we can do RowDataAdjust = RowFeatureVector^(−1) × TransformedData = RowFeatureVector^T × TransformedData (the inverse equals the transpose because the rows are orthonormal unit eigenvectors), and then RowDataOriginal = RowDataAdjust + OriginalMean.
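A sketch of the reconstruction step under the same hypothetical setup; with p smaller than n, the transpose acts as a pseudoinverse and the reconstruction is only approximate:

```python
import numpy as np

# Hypothetical setup: 2 attributes, original per-attribute means, p = 1 kept component.
original_mean = np.array([[1.8], [1.9]])                  # column vector of attribute means
row_feature_vector = np.array([[0.707, 0.707]])           # kept eigenvector as a row
transformed_data = np.array([[0.85, -1.77, 0.99, 0.28]])  # data in the new coordinates

# RowDataAdjust ~= RowFeatureVector^T x TransformedData (exact if all components are kept).
row_data_adjust = row_feature_vector.T @ transformed_data

# RowDataOriginal = RowDataAdjust + OriginalMean.
row_data_original = row_data_adjust + original_mean
print(row_data_original)
```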

Example: linear discrimination using PCA for face recognition. Preprocessing (“normalize” the faces): make the images the same size, line them up with respect to the eyes, and normalize the intensities.

The raw features are the pixel intensity values (2061 features). Each image is encoded as a vector of these features. Compute the “mean” face in the training set: the average of all the training-face vectors (a code sketch of this and the following steps appears below).

Subtract the mean face from each face vector. Compute the covariance matrix C of the mean-subtracted face vectors. Compute the (unit) eigenvectors vi of C. Keep only the first K principal components (the eigenvectors with the K largest eigenvalues).
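A compact sketch of these steps on a stack of hypothetical, randomly generated "face" images. Note one substitution not spelled out on the slides: with many pixels it is common to obtain the eigenvectors of C via an SVD of the mean-subtracted data rather than forming the full pixel-by-pixel covariance matrix.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical training set: 50 "face" images of 32x32 pixels, flattened to vectors.
faces = rng.random((50, 32 * 32))

mean_face = faces.mean(axis=0)          # the "mean" face
centered = faces - mean_face            # subtract the mean face from each face vector

# SVD of the centered data: the rows of vt are unit eigenvectors of the covariance matrix C.
_, _, vt = np.linalg.svd(centered, full_matrices=False)

K = 10
eigenfaces = vt[:K]                     # keep only the first K principal components
print(eigenfaces.shape)                 # (10, 1024): one eigenface per row
```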

The eigenfaces encode the principal sources of variation in the dataset (e.g., absence/presence of facial hair, skin tone, glasses, etc.). We can represent any face as a linear combination of these “basis” faces, and use this representation for face recognition (e.g., Euclidean distance from known faces) and for linear discrimination (e.g., “glasses” versus “no glasses”, or “male” versus “female”).
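A sketch of recognition by Euclidean distance in the eigenface space, continuing the same hypothetical setup (random arrays standing in for real face images):

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical gallery: 50 known faces of 32x32 pixels, plus one probe face.
gallery = rng.random((50, 32 * 32))
probe = gallery[7] + 0.01 * rng.random(32 * 32)   # noisy copy of face #7

mean_face = gallery.mean(axis=0)
_, _, vt = np.linalg.svd(gallery - mean_face, full_matrices=False)
eigenfaces = vt[:10]                              # K = 10 basis faces

# Project every known face (and the probe) onto the eigenface basis.
gallery_coords = (gallery - mean_face) @ eigenfaces.T
probe_coords = (probe - mean_face) @ eigenfaces.T

# Recognize: nearest known face by Euclidean distance in the reduced space.
distances = np.linalg.norm(gallery_coords - probe_coords, axis=1)
print("recognized as face", int(np.argmin(distances)))   # expected: 7
```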