EE4-62 MLCV Lecture 13-14: Face Recognition – Subspace/Manifold Learning
Tae-Kyun Kim
Face Image Tagging and Retrieval
Face tagging at commercial weblogs. Key issues:
– User interaction for face tags
– Representation of data accumulated over a long time
– Online and efficient learning
An active research area in the Face Recognition Test and MPEG-7, for face image retrieval and automatic passport control. Our proposal was promoted to the MPEG-7 ISO/IEC standard.
Principal Component Analysis (PCA)
- Maximum Variance Formulation of PCA
- Minimum-error formulation of PCA
- Probabilistic PCA
Maximum Variance Formulation of PCA
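The derivation slides for this section carried only equations and left no text. As a hedged sketch of the standard maximum-variance objective (textbook PCA, e.g. Bishop's PRML, Ch. 12, whose topics this outline tracks; the symbols S, u1, λ1 are the conventional ones, not recovered from the slides):

```latex
% Project the data x_n (n = 1..N, mean \bar{x}) onto a unit direction u_1
% and maximise the variance of the projections:
\max_{u_1}\; u_1^\top S\, u_1
\quad \text{s.t. } u_1^\top u_1 = 1,
\qquad
S = \frac{1}{N}\sum_{n=1}^{N}(x_n - \bar{x})(x_n - \bar{x})^\top .
% A Lagrange multiplier \lambda_1 gives the stationarity condition
% S u_1 = \lambda_1 u_1 : u_1 is the eigenvector of S with the largest
% eigenvalue, and the projected variance equals \lambda_1.
```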
Minimum-error formulation of PCA
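Again the derivation slides were image-only; a hedged sketch of the standard minimum-error objective, in the same notation as above:

```latex
% Approximate each x_n in an M-dimensional subspace with orthonormal basis
% u_1,...,u_M of R^D and minimise the mean squared reconstruction error:
J = \frac{1}{N}\sum_{n=1}^{N}\bigl\lVert x_n - \tilde{x}_n \bigr\rVert^2,
\qquad
\tilde{x}_n = \bar{x} + \sum_{i=1}^{M}\bigl(u_i^\top (x_n - \bar{x})\bigr)u_i .
% Minimisation yields the same leading eigenvectors of S as the
% maximum-variance view, and the minimum error is the sum of the
% discarded eigenvalues:  J_{\min} = \sum_{i=M+1}^{D}\lambda_i .
```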
Applications of PCA to Face Recognition
(Recap) Geometrical interpretation of PCA
Principal components are the vectors in the directions of maximum variance of the projected samples. Each two-dimensional data point is transformed to a single variable z1, the projection of the data point onto the eigenvector u1; the data points projected onto u1 have the maximum variance. PCA infers the inherent structure of high-dimensional data, whose intrinsic dimensionality is often much smaller. For the given 2D data points, u1 and u2 are found as the principal components.
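A minimal NumPy sketch of this geometric picture (the data, seed, and variable names are illustrative, not from the slides): u1 and u2 are found as eigenvectors of the sample covariance, and projecting onto u1 gives the single variable z1 with maximum variance.

```python
import numpy as np

rng = np.random.default_rng(0)
# Correlated 2D data: most of the variance lies along one direction.
X = rng.multivariate_normal(mean=[0, 0], cov=[[3.0, 1.2], [1.2, 0.8]], size=200)

Xc = X - X.mean(axis=0)                       # centre the data
S = np.cov(Xc, rowvar=False)                  # 2x2 sample covariance
eigvals, eigvecs = np.linalg.eigh(S)          # eigenvalues in ascending order
order = np.argsort(eigvals)[::-1]
u1, u2 = eigvecs[:, order[0]], eigvecs[:, order[1]]  # principal components

z1 = Xc @ u1                                  # each 2D point -> one variable z1
print(np.var(z1, ddof=1), eigvals[order[0]])  # projected variance = largest eigenvalue
```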
Eigenfaces
Collect a set of face images. Normalize for scale and orientation (using the eye locations). Construct the covariance matrix and obtain its eigenvectors. Each w × h face image is vectorised into a point of dimension D = wh.
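A hedged sketch of this construction in NumPy (the function name and the (N, h, w) input convention are assumptions, not from the slides); it uses the common small-matrix trick for the case N ≪ D, and omits the scale/orientation normalisation step:

```python
import numpy as np

def eigenfaces(images, M):
    """Build an eigenface basis from N face images of shape (N, h, w).

    Returns the mean face (D,) and the top-M eigenvectors (D, M), D = w*h.
    """
    N = images.shape[0]
    X = images.reshape(N, -1).astype(float)   # each image -> D = w*h vector
    mean = X.mean(axis=0)
    A = X - mean
    # Trick for N << D: eigenvectors of the small N x N matrix A A^T / N
    # map to eigenvectors of the D x D covariance A^T A / N.
    eigvals, V = np.linalg.eigh(A @ A.T / N)
    order = np.argsort(eigvals)[::-1][:M]
    U = A.T @ V[:, order]                     # D x M eigenface basis
    U /= np.linalg.norm(U, axis=0)            # orthonormal columns
    return mean, U
```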
Eigenfaces (cont.)
Project the data onto the subspace spanned by the leading eigenvectors; the reconstruction is obtained from the mean face plus the weighted eigenvectors. Use the distance to the subspace for face recognition (see the sketch below).
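Continuing the sketch, with the same assumed mean and U as above: projection coefficients, reconstruction, and the residual distance to the subspace that the slide proposes for recognition.

```python
import numpy as np

def project(x, mean, U):
    """Coefficients of a vectorised face x in the eigenface subspace."""
    return U.T @ (x - mean)

def reconstruct(z, mean, U):
    """Reconstruction from the coefficients: mean face + U z."""
    return mean + U @ z

def distance_to_subspace(x, mean, U):
    """Norm of the residual x - reconstruction; small for face-like inputs."""
    return np.linalg.norm(x - reconstruct(project(x, mean, U), mean, U))
```

In a fuller pipeline a probe would typically also be matched to gallery faces by the distance between their projection coefficients; the criterion named on the slide is the distance to the subspace itself.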
Matlab Demos – Face Recognition by PCA
Demo contents:
– Face images
– Eigenvectors and eigenvalue plot
– Face image reconstruction
– Projection coefficients (visualisation of high-dimensional data)
– Face recognition
Probabilistic PCA
A subspace is spanned by the orthonormal basis (the eigenvectors computed from the covariance matrix). Each observation can be interpreted with a generative model: the probability of generating each observation is estimated (approximately) with a Gaussian distribution. PCA places a uniform prior on the subspace; PPCA places a Gaussian distribution over it.
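A hedged sketch of this generative reading (the parameter values are made up for illustration): a latent z is drawn from a unit Gaussian, mapped into the subspace by W, and observed with isotropic Gaussian noise, so each observation has a tractable Gaussian probability.

```python
import numpy as np

rng = np.random.default_rng(0)
D, M, N = 5, 2, 5000

# Hypothetical parameters, for illustration only.
W = rng.normal(size=(D, M))
mu = rng.normal(size=D)
sigma2 = 0.1

# Generative model: z ~ N(0, I_M),  x = W z + mu + eps,  eps ~ N(0, sigma2 I_D).
Z = rng.normal(size=(N, M))
X = Z @ W.T + mu + rng.normal(scale=np.sqrt(sigma2), size=(N, D))

# Marginally x ~ N(mu, C) with C = W W^T + sigma2 I, a single Gaussian under
# which each observation's probability can be evaluated.
C = W @ W.T + sigma2 * np.eye(D)
print(np.abs(np.cov(X, rowvar=False) - C).max())  # small for large N
```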
Continuous Latent Variables
Probabilistic PCA
Maximum likelihood PCA
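The solution slides were not captured as text; as a hedged sketch, the standard closed-form maximum-likelihood fit of probabilistic PCA (Tipping & Bishop, 1999): σ² is the average of the discarded eigenvalues and W is built from the top-M eigenvectors, with the arbitrary rotation R taken as the identity.

```python
import numpy as np

def ppca_ml(X, M):
    """Closed-form ML estimate of PPCA parameters from data X of shape (N, D)."""
    mu = X.mean(axis=0)
    S = np.cov(X - mu, rowvar=False, bias=True)   # ML covariance (1/N)
    eigvals, U = np.linalg.eigh(S)
    order = np.argsort(eigvals)[::-1]
    eigvals, U = eigvals[order], U[:, order]
    sigma2 = eigvals[M:].mean()                   # mean of discarded eigenvalues
    # W_ML = U_M (L_M - sigma2 I)^{1/2}, column i scaled by sqrt(lambda_i - sigma2).
    W = U[:, :M] * np.sqrt(np.maximum(eigvals[:M] - sigma2, 0.0))
    return mu, W, sigma2
```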
Limitations of PCA
PCA vs LDA (Linear Discriminant Analysis)
PCA is unsupervised learning: it finds the directions of maximum variance while ignoring class labels. LDA, in contrast, is supervised and seeks the directions that best separate the classes.
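A minimal two-class Fisher LDA sketch for contrast (the function and variable names are assumed, not from the slides); unlike PCA, it uses the class labels.

```python
import numpy as np

def fisher_lda_direction(X1, X2):
    """Two-class Fisher LDA direction: w ∝ S_W^{-1} (m2 - m1), maximising
    between-class separation relative to within-class scatter."""
    m1, m2 = X1.mean(axis=0), X2.mean(axis=0)
    S_W = (X1 - m1).T @ (X1 - m1) + (X2 - m2).T @ (X2 - m2)  # within-class scatter
    w = np.linalg.solve(S_W, m2 - m1)
    return w / np.linalg.norm(w)
```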
PCA vs Kernel PCA
PCA is a linear model: the manifold it fits is linear, i.e. a subspace. Kernel PCA handles a nonlinear manifold by performing linear PCA in a kernel-induced feature space.
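A hedged NumPy sketch of kernel PCA (the RBF kernel choice and gamma are assumptions): linear PCA applied implicitly in the kernel's feature space yields nonlinear principal coordinates.

```python
import numpy as np

def kernel_pca(X, M, gamma=1.0):
    """Kernel PCA with an RBF kernel; returns the top-M projections of X."""
    # RBF kernel matrix K_ij = exp(-gamma * ||x_i - x_j||^2).
    sq = ((X[:, None, :] - X[None, :, :]) ** 2).sum(axis=-1)
    K = np.exp(-gamma * sq)
    # Centre in feature space: Kc = K - 1N K - K 1N + 1N K 1N, 1N = (1/N) ones.
    N = X.shape[0]
    one = np.full((N, N), 1.0 / N)
    Kc = K - one @ K - K @ one + one @ K @ one
    # Eigenvectors of Kc give the expansion coefficients of the feature-space
    # principal directions; scale them so those directions have unit norm.
    eigvals, A = np.linalg.eigh(Kc)
    order = np.argsort(eigvals)[::-1][:M]
    A = A[:, order] / np.sqrt(np.maximum(eigvals[order], 1e-12))
    return Kc @ A        # nonlinear principal coordinates of the training set
```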
PCA vs ICA (Independent Component Analysis)
PCA rests on a Gaussian distribution assumption and yields orthogonal, merely uncorrelated components (PC1, PC2). ICA drops Gaussianity and seeks statistically independent components (IC1, IC2), which need not be orthogonal.
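A short sketch of the contrast (the sources and mixing matrix are made up; PCA and FastICA are from scikit-learn): PCA returns orthogonal, variance-ranked directions, while ICA recovers the independent non-Gaussian sources up to sign, scale, and order.

```python
import numpy as np
from sklearn.decomposition import PCA, FastICA

rng = np.random.default_rng(0)
# Two independent, non-Gaussian sources, linearly mixed.
t = np.linspace(0, 8, 2000)
S = np.c_[np.sign(np.sin(3 * t)), rng.laplace(size=t.size)]
X = S @ np.array([[1.0, 0.4], [0.6, 1.0]])       # observed mixtures

pcs = PCA(n_components=2).fit_transform(X)        # uncorrelated, orthogonal PCs
ics = FastICA(n_components=2, random_state=0).fit_transform(X)  # independent ICs
```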
(Figure: also by ICA)