Face Recognition Using Eigenfaces


1 Face Recognition Using Eigenfaces
Kenan Gençol. Presented in the course Pattern Recognition, instructed by Asst. Prof. Dr. Kemal Özkan, Department of Electrical and Electronics Engineering, Osmangazi University

2 Agenda Introduction Principal Component Analysis (PCA)
Eigenfaces for Recognition

3 Introduction A method introduced by Turk and Pentland from MIT in 1991. Uses Principal Component Analysis (PCA) as its mathematical framework.

4 Principal Component Analysis (PCA)
What is it? It is a powerful tool for analysing data. Patterns can be hard to find in complex, high-dimensional data. PCA reduces a complex data set to a lower dimension, identifies patterns in the data, and highlights their similarities and differences.

5 Principal Component Analysis (PCA)
The goal of PCA is to find the most meaningful basis to re-express a data set. PCA asks: Is there another basis, a linear combination of the original basis, that best re-expresses our data set? It uses variance and covariance to achieve this goal.

6 PCA - Mathematical Foundations
The covariance measures the degree of the linear relationship between two variables. If positive, the data are positively correlated; if negative, negatively correlated; if zero, uncorrelated. The absolute magnitude of the covariance measures the degree of redundancy.

7 PCA - Mathematical Foundations
The covariance matrix captures these relationships in higher dimensions. For n dimensions, it is an n x n matrix. It is square and symmetric. The diagonal terms are the variances, and the off-diagonal terms are the covariances. Off-diagonal terms with large magnitudes correspond to high redundancy.
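A minimal NumPy sketch of this idea (the data and variable names are illustrative assumptions, not from the slides): the diagonal of the covariance matrix holds the variances, and a large off-diagonal entry flags redundancy between two dimensions.

```python
import numpy as np

# Illustrative data matrix: 100 observations (rows) of n = 3 dimensions (columns).
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 3))
X[:, 1] = 0.9 * X[:, 0] + 0.1 * rng.normal(size=100)   # make dimensions 0 and 1 redundant

C = np.cov(X, rowvar=False)   # n x n square symmetric covariance matrix
print(np.diag(C))             # diagonal terms: the variances
print(C[0, 1], C[0, 2])       # large |C[0, 1]| signals redundancy; C[0, 2] is near zero
```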

8 PCA - Mathematical Foundations
Our goals re-stated: (1) minimize redundancy, measured by the magnitude of the covariances; (2) maximize the signal, measured by the variances. Diagonalize the covariance matrix! This means: decorrelate the data!

9 PCA - Mathematical Foundations
The diagonalization of the covariance matrix: all off-diagonal terms should be zero, or, said another way, the data are decorrelated. Each successive dimension should be rank-ordered according to variance (large variances indicate important structure).
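A short sketch of this goal on synthetic data (the data here are an assumed example): projecting mean-centred data onto the orthonormal eigenvectors of its covariance matrix gives a covariance matrix that is numerically diagonal, i.e. decorrelated data.

```python
import numpy as np

rng = np.random.default_rng(1)
X = rng.normal(size=(500, 3))
X[:, 1] += 0.8 * X[:, 0]                      # introduce correlation (redundancy)

Xc = X - X.mean(axis=0)                       # mean-centre the data
C = np.cov(Xc, rowvar=False)
_, V = np.linalg.eigh(C)                      # orthonormal eigenvectors of the symmetric C
Y = Xc @ V                                    # re-express the data in the eigenbasis
print(np.round(np.cov(Y, rowvar=False), 3))   # off-diagonal terms are ~0: decorrelated
```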

10 A little linear algebra...
Some crucial theorems from linear algebra make PCA work: A matrix is orthogonally diagonalizable if and only if it is symmetric. A symmetric matrix is diagonalized by a matrix of its orthonormal eigenvectors.

11 PCA - Mathematical Foundations
So, finally: find the eigenvectors of the covariance matrix! Order them by eigenvalue, highest to lowest (this gives the order of significance). The eigenvector with the highest eigenvalue is the first principal component; the second, third, and so on follow. Ignore the components of lesser significance.
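A compact PCA sketch along these lines, assuming a generic data matrix rather than face images (the function name and shapes are illustrative):

```python
import numpy as np

def pca(X, k):
    """Toy PCA: X is an (M, n) data matrix, k is the number of components to keep."""
    Xc = X - X.mean(axis=0)                # centre the data
    C = np.cov(Xc, rowvar=False)           # n x n covariance matrix
    eigvals, eigvecs = np.linalg.eigh(C)   # eigh suits symmetric matrices
    order = np.argsort(eigvals)[::-1]      # order by eigenvalue, highest to lowest
    W = eigvecs[:, order[:k]]              # keep the k most significant components
    return Xc @ W, W                       # projected data and the reduced basis

rng = np.random.default_rng(2)
X = rng.normal(size=(200, 5))
Z, W = pca(X, k=2)
print(Z.shape)                             # (200, 2): lesser components ignored
```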

12 PCA - Conclusion Results:
The final data set has fewer dimensions than the original. The data are aligned in a basis whose axes point along the directions of maximal variance (each new axis is a direction along which the remaining variance is maximized). Rank-ordering each basis vector according to its corresponding variance shows how 'principal' each direction is.

13 Discussion of PCA Principal components with larger associated variances show important, interesting structure, while those with lower variances represent noise. This is a strong but sometimes incorrect assumption. The goal of the analysis is to decorrelate the data, or say in other terms, is to remove second-order dependencies in the data. In the data sets of higher dependencies exist, it is insufficient at revealing all structure in the data.

14 Eigenfaces for Recognition
Simply think of it as a template matching problem:

15 Computation of the Eigenfaces
Let Γ be an N² x 1 vector corresponding to the N x N face image I. Step 1: obtain face images I1, I2, ..., IM (training faces). Step 2: represent every image Ii as a vector Γi.
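A sketch of steps 1-2, using random arrays as stand-ins for real face images (the sizes and names are assumptions):

```python
import numpy as np

# Stand-in training set: M face images of size N x N.
M, N = 10, 64
rng = np.random.default_rng(3)
faces = rng.random((M, N, N))

# Step 2: represent each N x N image Ii as an N^2-dimensional column vector Gamma_i,
# collected as the columns of an (N^2, M) matrix.
Gamma = faces.reshape(M, N * N).T
print(Gamma.shape)   # (4096, 10)
```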

17 Computation of the Eigenfaces
Step 3: compute the average face vector Ψ = (1/M) Σi Γi. Step 4: subtract the mean face: Φi = Γi – Ψ.
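A sketch of steps 3-4, again on placeholder data rather than real faces:

```python
import numpy as np

# Placeholder face vectors: Gamma is an N^2 x M matrix, one face per column.
N2, M = 4096, 10
rng = np.random.default_rng(4)
Gamma = rng.random((N2, M))

Psi = Gamma.mean(axis=1, keepdims=True)   # step 3: average face vector (N^2 x 1)
Phi = Gamma - Psi                         # step 4: mean-subtracted faces
A = Phi                                   # the columns Phi_i form the matrix A used below
print(A.shape)                            # (4096, 10)
```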

20 Computation of the Eigenfaces
Step 5: compute the covariance matrix C = (1/M) Σn Φn Φnᵀ = AAᵀ, where A = [Φ1 Φ2 ... ΦM] is the N² x M matrix of mean-subtracted faces. Step 6: compute the eigenvectors ui of AAᵀ.

21 Computation of the Eigenfaces
The matrix AAᵀ is very large (N² x N²), so computing its eigenvectors directly is impractical! Consider instead the M x M matrix AᵀA and compute its eigenvectors vi. These give the best M eigenvectors ui = Avi of AAᵀ. They correspond to M EIGENFACES !! Keep only the K eigenvectors corresponding to the K largest eigenvalues.
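A sketch of this trick with placeholder data. The key fact is that if vi is an eigenvector of AᵀA with eigenvalue λ, then ui = Avi is an eigenvector of AAᵀ with the same eigenvalue, so only an M x M eigenproblem has to be solved:

```python
import numpy as np

# Placeholder mean-subtracted faces: A is N^2 x M with M << N^2.
N2, M, K = 4096, 10, 5
rng = np.random.default_rng(5)
A = rng.random((N2, M)) - 0.5

L = A.T @ A                               # M x M matrix instead of N^2 x N^2
eigvals, V = np.linalg.eigh(L)            # eigenvectors vi of A^T A
order = np.argsort(eigvals)[::-1][:K]     # keep the K largest eigenvalues
U = A @ V[:, order]                       # ui = A vi are eigenvectors of A A^T
U /= np.linalg.norm(U, axis=0)            # normalise: the columns of U are the K eigenfaces
print(U.shape)                            # (4096, 5)
```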

24 Recognition using eigenfaces
Given an unknown face image Γ, follow these steps: Step 1: normalize Γ: Φ = Γ – Ψ. Step 2: project onto the eigenspace: wi = uiᵀ Φ, for i = 1, ..., K.
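A sketch of steps 1-2, with random stand-ins for the eigenfaces U, the mean face Ψ, and the unknown face Γ:

```python
import numpy as np

N2, K = 4096, 5
rng = np.random.default_rng(6)
U = np.linalg.qr(rng.random((N2, K)))[0]   # stand-in orthonormal eigenfaces (columns)
Psi = rng.random((N2, 1))                  # stand-in mean face
Gamma = rng.random((N2, 1))                # unknown face image as an N^2 x 1 vector

Phi = Gamma - Psi                          # step 1: subtract the mean face
Omega = U.T @ Phi                          # step 2: project onto the eigenspace (K weights)
print(Omega.ravel())
```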

25 Recognition using eigenfaces
Step 3: represent Φ by its weight vector Ω = [w1, w2, ..., wK]ᵀ. Step 4: find the minimum distance er = minl || Ω – Ωl || over the training faces. Recognize Γ as face l from the training set !!
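A sketch of steps 3-4, using random weight vectors as stand-ins for the stored training projections Ωl:

```python
import numpy as np

K, M = 5, 10
rng = np.random.default_rng(7)
Omega_train = rng.random((K, M))                   # Omega_l for each training face l
Omega = Omega_train[:, 3] + 0.01 * rng.random(K)   # unknown face, close to training face 3

dists = np.linalg.norm(Omega_train - Omega[:, None], axis=0)   # ||Omega - Omega_l|| for all l
l = int(np.argmin(dists))                          # face with the minimum distance e_r
print(l)                                           # expected: 3, the closest training face
```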

26 Discussion: Eigenfaces
Performance is affected by: Background Lighting conditions Scale (head size) Orientation

27 Thank you!

