Face Recognition Using Eigenfaces

Presentation transcript:

Face Recognition Using Eigenfaces Kenan Gençol, presented in the course Pattern Recognition, instructed by Asst. Prof. Dr. Kemal Özkan, Department of Electrical and Electronics Engineering, Osmangazi University

Agenda Introduction Principal Component Analysis (PCA) Eigenfaces for Recognition

Introduction A method introduced by Turk and Pentland from MIT in 1991. Uses Principal Component Analysis (PCA) as its mathematical framework.

Principal Component Analysis (PCA) What is it? A powerful tool for analysing data. Patterns can be hard to find in complex, high-dimensional data. PCA reduces a complex data set to a lower dimension, identifies patterns in the data, and highlights their similarities and differences.

Principal Component Analysis (PCA) The goal of PCA is to find the most meaningful basis to re-express a data set. PCA asks: is there another basis, a linear combination of the original basis, that best re-expresses our data set? It uses variance and covariance to answer this question.

PCA - Mathematical Foundations Covariance measures the degree of linear relationship between two variables: if positive, the data are positively correlated; if negative, negatively correlated; if zero, uncorrelated. The absolute magnitude of the covariance measures the degree of redundancy.

PCA - Mathematical Foundations The covariance matrix captures the relationships among all dimensions: for n dimensions it is an n x n matrix. It is square and symmetric. The diagonal terms are the variances, and the off-diagonal terms are the covariances. Off-diagonal terms with large magnitudes correspond to high redundancy.
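To make this concrete, here is a small NumPy sketch (not from the slides; the array names are invented) that builds a covariance matrix for data with one deliberately redundant dimension and then inspects its diagonal and off-diagonal terms:

import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))              # rows = observations, columns = variables
X[:, 1] = 0.8 * X[:, 0] + 0.2 * X[:, 1]    # make dimensions 0 and 1 redundant (correlated)

C = np.cov(X, rowvar=False)                # 3 x 3, square and symmetric
print(np.diag(C))                          # diagonal terms: the variances
print(C[0, 1], C[0, 2])                    # large |C[0, 1]| -> redundancy; C[0, 2] near zero -> uncorrelated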

PCA - Mathematical Foundations Our goals re-stated: (1) minimize redundancy, measured by the magnitude of the covariances; (2) maximize the signal, measured by the variance. Diagonalize the covariance matrix! This means: decorrelate the data!

PCA - Mathematical Foundations Diagonalizing the covariance matrix: all off-diagonal terms should be zero, or said another way, the data are decorrelated. Each successive dimension should be rank-ordered according to variance (large variances indicate important structure).

A little linear algebra... Some theorems from linear algebra crucial for PCA: a matrix is symmetric if and only if it is orthogonally diagonalizable; a symmetric matrix is diagonalized by a matrix of its orthonormal eigenvectors.
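As a quick numerical illustration of the second theorem (assuming NumPy; the matrix here is randomly generated, not from the slides), a symmetric matrix S factors as S = E D E^T, where E holds orthonormal eigenvectors and D the eigenvalues:

import numpy as np

rng = np.random.default_rng(1)
A = rng.normal(size=(4, 4))
S = A + A.T                                # an arbitrary symmetric matrix

eigvals, E = np.linalg.eigh(S)             # eigh is intended for symmetric (Hermitian) matrices
D = np.diag(eigvals)
print(np.allclose(E @ D @ E.T, S))         # True: S = E D E^T
print(np.allclose(E.T @ E, np.eye(4)))     # True: the eigenvectors are orthonormal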

PCA - Mathematical Foundations So, finally: find the eigenvectors of the covariance matrix! Order them by eigenvalue, highest to lowest (this gives the order of significance). The eigenvector with the highest eigenvalue is the first principal component, the next highest gives the second, and so on. Ignore the components of lesser significance.
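A minimal sketch of this recipe in NumPy (the function and variable names are illustrative, not part of the original slides):

import numpy as np

def pca(X, k):
    # X: observations x dimensions; returns coordinates along the top-k principal directions
    X_centered = X - X.mean(axis=0)              # remove the mean of each dimension
    C = np.cov(X_centered, rowvar=False)         # covariance matrix
    eigvals, eigvecs = np.linalg.eigh(C)         # symmetric matrix -> eigh
    order = np.argsort(eigvals)[::-1]            # order by eigenvalue, highest to lowest
    components = eigvecs[:, order[:k]]           # keep the k most significant eigenvectors
    return X_centered @ components               # project the data onto the new basis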

PCA - Conclusion Results: the final data set has fewer dimensions than the original. The data are aligned in a basis whose first axis is the direction of maximal variance, with each further axis the direction of maximal remaining variance. Rank-ordering each basis vector by its corresponding variance shows how 'principal' each direction is.

Discussion of PCA Principal components with larger associated variances show important, interesting structure, while those with lower variances represent noise. This is a strong, but sometimes incorrect, assumption. The goal of the analysis is to decorrelate the data, or in other terms, to remove second-order dependencies in the data. In data sets where higher-order dependencies exist, PCA is insufficient at revealing all the structure in the data.

Eigenfaces for Recognition Simply think of it as a template matching problem.

Computation of the Eigenfaces Let Γ be an N^2 x 1 vector corresponding to the N x N face image I. Step 1: obtain face images I1, I2, ..., IM (training faces). Step 2: represent every image Ii as a vector Γi.

Computation of the Eigenfaces Step 3: compute the average face vector Ψ = (1/M) Σ Γi (sum over the M training faces). Step 4: subtract the mean face: Φi = Γi − Ψ.

Computation of the Eigenfaces Step 5: compute the covariance matrix C = (1/M) Σ Φn Φn^T = A A^T, where A = [Φ1 Φ2 ... ΦM] is the N^2 x M matrix of mean-subtracted face vectors. Step 6: compute the eigenvectors ui of A A^T.

Computation of the Eigenfaces The matrix A A^T is N^2 x N^2, which is very large → impractical! Consider instead the matrix A^T A (M x M) and compute its eigenvectors vi. If A^T A vi = μi vi, then A A^T (A vi) = μi (A vi), so the ui = A vi are the best M eigenvectors of A A^T. They correspond to M EIGENFACES!! Keep only the K eigenvectors corresponding to the K largest eigenvalues.
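A sketch of Steps 1-6 together with the A^T A trick, assuming NumPy and an (M, N, N) array of training images; the function and variable names are illustrative, not from the slides:

import numpy as np

def train_eigenfaces(faces, K):
    # faces: (M, N, N) array of training images; keep K <= M eigenfaces
    M = faces.shape[0]
    Gamma = faces.reshape(M, -1).astype(float)   # Step 2: each image as an N^2 vector
    Psi = Gamma.mean(axis=0)                     # Step 3: average face vector
    A = (Gamma - Psi).T                          # Step 4: N^2 x M matrix of mean-subtracted faces
    L = A.T @ A                                  # small M x M matrix instead of the huge A A^T
    eigvals, V = np.linalg.eigh(L)               # eigenvectors v_i of A^T A
    order = np.argsort(eigvals)[::-1][:K]        # K largest eigenvalues
    U = A @ V[:, order]                          # map back: u_i = A v_i are eigenvectors of A A^T
    U /= np.linalg.norm(U, axis=0)               # normalise each eigenface
    return Psi, U                                # mean face and the N^2 x K eigenface matrix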

Recognition using eigenfaces Given an unknown face image Γ, follow these steps: Step 1: normalize Γ: Φ = Γ − Ψ. Step 2: project Φ onto the eigenspace: wi = ui^T Φ, for i = 1, ..., K.

Recognition using eigenfaces Step 3: represent Φ by its weight vector Ω = [w1, w2, ..., wK]^T. Step 4: find the minimum face distance er = min_l || Ω − Ω_l || over the training set. Recognize Γ as face l from the training set!!
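A sketch of these recognition steps, assuming NumPy and the train_eigenfaces sketch above; train_weights (the stored Ω_l of the training faces) and threshold are assumed inputs, not named in the slides:

import numpy as np

def project(face, Psi, U):
    Phi = face.reshape(-1).astype(float) - Psi   # Step 1: subtract the mean face
    return U.T @ Phi                             # Steps 2-3: weight vector Omega = [w_1 ... w_K]

def recognize(face, Psi, U, train_weights, threshold):
    # train_weights: (M, K) array holding Omega_l for each training face
    Omega = project(face, Psi, U)
    dists = np.linalg.norm(train_weights - Omega, axis=1)   # || Omega - Omega_l || for every l
    l = int(np.argmin(dists))                               # Step 4: closest training face
    return l if dists[l] < threshold else None              # None means the face is not recognized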

Discussion: Eigenfaces Performance is affected by: background, lighting conditions, scale (head size), orientation.

Thank you!