
PCA Channel Student: Fangming JI u4082259 Supervisor: Professor Tom Geoden.

Presentation transcript:


2 PCA Channel Student: Fangming JI u4082259 Supervisor: Professor Tom Geoden

3 Organization of the Presentation  PCA and problems  PCA channel idea  Use the channel for automatic classification  Channel  Corrected Channel  Conclusion  Future work

4 Principal Component Analysis  A statistical tool  Maximizes the scatter of all projected samples in the image space  Tries to capture the most important features and reduce the dimensionality at the same time  Each eigenvector is a principal component

5 Algorithm of PCA  Given a training set of M images of the same size, convert each into a one-dimensional vector (I1, I2, …, IM). Find the average image by calculating the mean of the training set: Ψ = (∑ In) / M, n = 1, …, M. Each training image differs from the average by Φn = In − Ψ. The covariance matrix is then C = A Aᵀ, where A = [Φ1, Φ2, …, ΦM]. For images with N pixels, C is an N × N matrix, too big to be used in practice. Fortunately, there are only M − 1 non-zero eigenvalues, and they can be found more efficiently with an M × M computation: we compute the eigenvectors vi of Aᵀ A instead of the eigenvectors ui of A Aᵀ. The M largest eigenvalues of Aᵀ A are equal to the M largest eigenvalues of A Aᵀ, and the corresponding eigenvectors are recovered as ui = A vi. At the end we select a value K and keep only the K eigenvectors with the largest eigenvalues.
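The algorithm on this slide can be sketched in a few lines of NumPy. This is a minimal illustration of the Aᵀ A trick described above, not the software used in the experiments; the function names are my own.

```python
import numpy as np

def pca_eigenfaces(images, k):
    """Top-k eigenfaces from a stack of same-sized images.

    images: array of shape (M, h, w); k: number of components to keep.
    Uses the M x M trick: eigenvectors of A^T A give those of A A^T.
    """
    M = images.shape[0]
    I = images.reshape(M, -1).astype(float)   # flatten each image I_n
    psi = I.mean(axis=0)                      # average image Psi
    Phi = I - psi                             # differences Phi_n = I_n - Psi
    A = Phi.T                                 # (N, M), columns are Phi_n
    L = A.T @ A                               # only M x M, cheap to decompose
    eigvals, V = np.linalg.eigh(L)            # ascending eigenvalue order
    order = np.argsort(eigvals)[::-1][:k]     # keep the k largest
    U = A @ V[:, order]                       # u_i = A v_i
    U /= np.linalg.norm(U, axis=0)            # normalize the eigenfaces
    return psi, U

def project(image, psi, U):
    """Project an image into the eigenface space."""
    return U.T @ (image.ravel().astype(float) - psi)
```

A new face is then recognized by projecting it with `project` and finding the nearest training image in the K-dimensional weight space.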

6 Eigenfaces

7 Problems of PCA-based Methods  Avalanche disaster  Up to a certain limit, these methods are robust over a wide range of parameters  Beyond that point the algorithm breaks down dramatically

8 Constant Features and Inconstant Features  Holistic features = local features + inconstant features  Local features (constant features)  Inconstant features (such as view, illumination and expression)  A small change in the inconstant features means a small change in the holistic features  A large change in the inconstant features may mean a large change in the holistic features

9 Distribution in the Image Space  Images of the same person may sit in totally different regions of the image space  The distance between such images can be beyond the range of being correctly recognized

10 The PCA Channel  Holistic features = local features + inconstant features  Positions are decided by both local features and inconstant features  Incremental changes in the inconstant features should produce incrementally changed holistic features, i.e. positions  This trail of incrementally changed positions looks like a channel, so we call it the “PCA Channel”

11 Experiment Preparation And Tools  Collecting images with incremental changes in the orientations -- Mingtao’s software  45 images from three identities (15 images for each identity which are changed incrementally in orientation)  Dozens of images from another three identities, randomly oriented with some expression images  Face Recognition Practitioner – Software developed by me

12 Existence of the Channel  Take the view (orientation) as an example

13 Automatic Image Classification  Original PCA method: 1) Given an input image; 2) recognize it; 3) recompute the PCA with the newly recognized image; 4) go to step 1  The PCA channel method: 1) Given an input image; 2) recognize it; 3) put it into the training set; 4) go to step 1
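The classification loop above can be sketched as follows. This is my own illustrative sketch, assuming a nearest-neighbour match in eigenface space and a single acceptance threshold; the function names and the threshold value are hypothetical, not taken from the Face Recognition Practitioner software.

```python
import numpy as np

def fit_pca(training, k):
    """Minimal eigenface PCA over a list of images (the M x M trick)."""
    I = np.stack([t.ravel().astype(float) for t in training])
    psi = I.mean(axis=0)
    A = (I - psi).T
    vals, V = np.linalg.eigh(A.T @ A)
    order = np.argsort(vals)[::-1][:k]
    U = A @ V[:, order]
    U /= np.linalg.norm(U, axis=0)
    return psi, U

def channel_classify(training, labels, stream, k=5, threshold=5.0):
    """PCA-channel loop: every recognized image is added to the training
    set, so the subspace is recomputed and the channel grows step by step."""
    training, labels, out = list(training), list(labels), []
    for img in stream:
        psi, U = fit_pca(training, k)                 # recompute PCA
        W = np.stack([U.T @ (t.ravel() - psi) for t in training])
        w = U.T @ (img.ravel().astype(float) - psi)
        d = np.linalg.norm(W - w, axis=1)
        j = int(d.argmin())                           # nearest training image
        if d[j] < threshold:                          # accept the match and
            out.append(labels[j])                     # extend the channel
            training.append(img)
            labels.append(labels[j])
        else:
            out.append(None)                          # leave unrecognized
    return out
```

Because accepted images re-enter the training set, a single wrong match can pull later images toward the wrong identity, which is exactly the contagious problem discussed on the next slides.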

14 Performance Comparison  If the training set is carefully selected, the performance of the PCA channel is better than that of the original method  Problems:  Sensitive to the selection of the training set  Contagious problem

15 Contagious Problem

16 The Corrected PCA Channel  Cut off the root of the mismatching  Improve the robustness

17 Implementation  Set up two thresholds: Low (L) and High (H)  If the distance between the input image and its nearest image in the training set is below L, accept the match; if it is above H, put the image aside for future recognition; if L < distance < H, make it a new group  Recalculate the PCA and cut off the mismatching there  Match again
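The two-threshold rule can be written as a small decision function. This is a sketch of my reading of the slide: that distances below L are accepted as matches is an assumption, since the slide only states the other two cases explicitly.

```python
def corrected_channel_decision(distance, low, high):
    """Two-threshold rule of the corrected PCA channel.

    distance: distance from the input image to its nearest training image.
    low, high: the L and H thresholds from the slide.
    Assumption: distance < L means the match is accepted.
    """
    if distance < low:
        return "match"        # close enough: accept into the channel
    if distance > high:
        return "defer"        # too far: keep for future recognition
    return "new_group"        # ambiguous (L < distance < H): start a new group
```

Routing ambiguous images into a new group instead of forcing a match is what cuts off the root of the mismatching and stops one bad match from contaminating the channel.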

18 Results  Success rate = match to original training set + match to new group  Success rate = 44.15% + 50.65% = 94.80%  Success rate = 44.15% + 51.94% = 96.09%  59.74%

19 New Groups

20 Conclusion  With a properly built image database and a carefully implemented PCA channel, we can get very good performance for face recognition.  From the above experiments we can see that the image database is both the strength and the weakness of the PCA channel.  3D face reconstruction system.  Large computational load, but it can still be appropriate in situations where accuracy matters more than response time.

21 Future Work  Verify our research on a larger data set  Preprocess the images before recognition  Build up a 3D-face morphable model system  Research into hybrid methods

