Comparing Kernel-based Learning Methods for Face Recognition Zhiguo Li

Outline Objective What is a Kernel? Why Kernels? How to Kernelize? Principal Component Analysis (PCA) vs. Fisher Discriminant Analysis (FDA) PCA vs. Kernel PCA (KPCA) Experiments Discussions and Conclusions

Objective Determine whether the kernel versions of Principal Component Analysis (PCA) and Fisher Discriminant Analysis (FDA) outperform their linear counterparts for face recognition.

What is a Kernel? A kernel function computes an inner product in feature space: k(x, y) = <phi(x), phi(y)>. For example, the polynomial kernel: k(x, y) = (<x, y> + c)^d. What's special? Kernels compute the dot product of two mapped feature vectors without ever constructing the mapping phi explicitly.
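The "dot product without the mapping" claim can be checked directly. A minimal sketch (not from the slides): for 2-D inputs and the homogeneous degree-2 polynomial kernel k(x, y) = (x . y)^2, the explicit feature map is phi(x) = (x1^2, sqrt(2) x1 x2, x2^2), and the two computations agree exactly.

```python
import numpy as np

def poly_kernel(x, y, d=2):
    # Homogeneous polynomial kernel: k(x, y) = (x . y)^d
    return np.dot(x, y) ** d

def phi(x):
    # Explicit degree-2 feature map for 2-D input:
    # phi(x) = (x1^2, sqrt(2)*x1*x2, x2^2)
    return np.array([x[0] ** 2, np.sqrt(2) * x[0] * x[1], x[1] ** 2])

x = np.array([1.0, 2.0])
y = np.array([3.0, 0.5])

# The kernel gives the feature-space dot product without building phi:
assert np.isclose(poly_kernel(x, y), np.dot(phi(x), phi(y)))
```

For higher degrees or more input dimensions the explicit map grows combinatorially while the kernel stays a single dot product, which is the computational point of the trick.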

Why Kernels? The original face data may not be linearly separable; after a nonlinear mapping into a higher-dimensional feature space, the data may become linearly separable.
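A toy illustration of this idea (my own sketch, not from the slides): two concentric rings cannot be separated by any line in the plane, but under the degree-2 feature map phi(x) = (x1^2, sqrt(2) x1 x2, x2^2) the radius r^2 = x1^2 + x2^2 becomes a linear function of the features, so a plane separates the classes.

```python
import numpy as np

rng = np.random.default_rng(0)

# Two concentric rings: class 0 at radius 0.5, class 1 at radius 2.0.
# No straight line in the input plane separates them.
angles = rng.uniform(0, 2 * np.pi, 50)
inner = np.c_[0.5 * np.cos(angles), 0.5 * np.sin(angles)]
outer = np.c_[2.0 * np.cos(angles), 2.0 * np.sin(angles)]

def phi(x):
    # Degree-2 feature map; note phi(x)[0] + phi(x)[2] = x1^2 + x2^2 = r^2
    return np.array([x[0] ** 2, np.sqrt(2) * x[0] * x[1], x[1] ** 2])

# In feature space the plane z1 + z3 = 1 is a perfect linear separator:
f = lambda p: phi(p)[0] + phi(p)[2] - 1.0
inner_ok = all(f(p) < 0 for p in inner)   # inner ring: r^2 = 0.25 < 1
outer_ok = all(f(p) > 0 for p in outer)   # outer ring: r^2 = 4.0  > 1
```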

How to Kernelize? Any algorithm that can be expressed solely in terms of dot products, i.e. without explicit use of the input vectors themselves, can be turned into a nonlinear version by replacing every dot product with a kernel evaluation (the "kernel trick").
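A concrete example of rewriting a quantity in dot products only (a sketch I added, not from the slides): squared Euclidean distance in feature space expands to ||phi(x) - phi(y)||^2 = k(x, x) - 2 k(x, y) + k(y, y), so any distance-based algorithm (e.g. nearest neighbor) kernelizes immediately. Verified here against the explicit degree-2 map:

```python
import numpy as np

def k(x, y, d=2):
    # Homogeneous polynomial kernel
    return np.dot(x, y) ** d

def phi(x):
    # Matching explicit feature map for 2-D input
    return np.array([x[0] ** 2, np.sqrt(2) * x[0] * x[1], x[1] ** 2])

x, y = np.array([1.0, -1.0]), np.array([0.5, 2.0])

# Feature-space squared distance, computed two ways:
dist_explicit = np.sum((phi(x) - phi(y)) ** 2)
dist_kernel = k(x, x) - 2 * k(x, y) + k(y, y)
assert np.isclose(dist_explicit, dist_kernel)
```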

PCA vs. FDA PCA seeks the projection that maximizes the total scatter across all classes; FDA seeks the discriminant projection that maximizes the between-class scatter while minimizing the within-class scatter.
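The difference shows up clearly on synthetic data. In this sketch (my own, not from the slides), the two classes differ along x but the shared variance is largest along y: PCA picks the high-variance y direction, which is useless for discrimination, while FDA (here the two-class solution w = S_W^{-1}(m1 - m0)) picks the x direction.

```python
import numpy as np

def pca_direction(X):
    # PCA: leading eigenvector of the total scatter S_T = Xc^T Xc
    Xc = X - X.mean(axis=0)
    vals, vecs = np.linalg.eigh(Xc.T @ Xc)  # ascending eigenvalues
    return vecs[:, -1]                       # direction of maximum total scatter

def fda_direction(X, y):
    # Two-class FDA: maximize w^T S_B w / w^T S_W w  =>  w = S_W^{-1} (m1 - m0)
    m0, m1 = X[y == 0].mean(axis=0), X[y == 1].mean(axis=0)
    S_W = sum((X[y == c] - X[y == c].mean(axis=0)).T
              @ (X[y == c] - X[y == c].mean(axis=0)) for c in (0, 1))
    w = np.linalg.solve(S_W, m1 - m0)
    return w / np.linalg.norm(w)

rng = np.random.default_rng(0)
# Classes separated along x (means 0 and 4), large shared variance along y.
class0 = rng.normal(size=(50, 2)) * np.array([0.3, 3.0])
class1 = rng.normal(size=(50, 2)) * np.array([0.3, 3.0]) + np.array([4.0, 0.0])
X = np.vstack([class0, class1])
y = np.array([0] * 50 + [1] * 50)

w_pca = pca_direction(X)     # dominated by the uninformative y-variance
w_fda = fda_direction(X, y)  # aligned with x, the discriminative axis
```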

PCA vs. FDA

Linear PCA vs. Kernel PCA Kernel PCA: implicitly map the data from input space to feature space via a kernel, then perform PCA in that feature space; in practice this amounts to eigendecomposing the centered kernel matrix rather than computing the mapping explicitly.
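A minimal Kernel PCA sketch in numpy (my own illustration, not the thesis implementation): build the Gram matrix, center it in feature space, eigendecompose, and scale the eigenvectors so each feature-space component has unit norm. A sanity check: with a plain linear kernel the projections must coincide with ordinary PCA.

```python
import numpy as np

def kernel_pca(X, kernel, n_components=2):
    """Kernel PCA via eigendecomposition of the centered kernel matrix."""
    n = X.shape[0]
    K = np.array([[kernel(a, b) for b in X] for a in X])
    # Center in feature space: Kc = K - 1K - K1 + 1K1, with 1 = ones(n, n)/n
    one = np.full((n, n), 1.0 / n)
    Kc = K - one @ K - K @ one + one @ K @ one
    vals, vecs = np.linalg.eigh(Kc)          # ascending order
    vals, vecs = vals[::-1], vecs[:, ::-1]   # sort descending
    # Scale coefficients so feature-space principal directions have unit norm
    alphas = vecs[:, :n_components] / np.sqrt(np.maximum(vals[:n_components], 1e-12))
    return Kc @ alphas                       # projections of the training data

rng = np.random.default_rng(1)
X = rng.normal(size=(20, 3))
Z = kernel_pca(X, np.dot, n_components=2)    # linear kernel => ordinary PCA
```

With a nonlinear kernel (e.g. the polynomial kernel above) the same routine performs PCA in the implicit feature space, which is the whole point: the feature vectors are never formed.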

Experiments Face databases (publicly available): AT&T, FERET, Yale

Experiments AT&T: 40 subjects, 10 images per subject. FERET: 70 subjects, 6 images per subject. Yale: 15 subjects, 11 images per subject.

Discussions and Conclusions Discussions: –The selection of the kernel function lacks a principled theoretical scheme –Using an SVM instead of a nearest-neighbor (NN) classifier may achieve a higher recognition rate Conclusions: –On face data, there is no large difference between the linear methods and their kernel versions –FDA-based methods outperform PCA-based methods for face recognition