Pattern Recognition Topic 1: Principal Component Analysis (Shapiro chap.)

Pattern Recognition Topic 1: Principal Component Analysis (Shapiro chap.; Duda & Hart chap.)
Outline: Features; Principal Components Analysis (PCA); Bayes Decision Theory; Gaussian Normal Density

Pattern Recognition Basics: input image(s) → feature extraction → feature classification → pattern recognition results

Feature Extraction: linear features (e.g. lines, curves); 2D features (e.g. texture); features of features

Feature Extraction: a good feature should be invariant to changes in scale, rotation, translation, and illumination.

Feature Classification: linear classifiers; non-linear classifiers (e.g. neural networks)

Pattern Recognition: Principal Components Analysis (the Karhunen-Loève Transformation), its interpretation, and applications

Suppose we are shown a set of points:

We want to answer any of the following questions:
- What is the trend (direction) of the cloud of points?
- Which is the direction of maximum variance (dispersion)?
- Which is the direction of minimum variance (dispersion)?
- Suppose I am only allowed a 1-D description of this set of 2-D points, i.e. one coordinate instead of the two coordinates (x, y). How should I represent the points so that the overall error is minimized? (This is a data compression problem.)
All of these questions can be answered using Principal Components Analysis (PCA).

Principal Component Analysis (PCA). Suppose we have N feature vectors $\mathbf{x}_1, \mathbf{x}_2, \ldots, \mathbf{x}_N$. We take the mean of the feature vectors, $\boldsymbol{\mu} = \frac{1}{N}\sum_{k=1}^{N}\mathbf{x}_k$, and form the covariance matrix $C = \frac{1}{N}\sum_{k=1}^{N}(\mathbf{x}_k - \boldsymbol{\mu})(\mathbf{x}_k - \boldsymbol{\mu})^T$. We then find the eigenvalues and eigenvectors of the covariance matrix, $C\mathbf{e}_i = \lambda_i \mathbf{e}_i$, where $\mathbf{e}_i$ is a principal component with eigenvalue $\lambda_i$. We did all of this in Lab 4!
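
To make these steps concrete, here is a minimal NumPy sketch on a toy 2-D point cloud. This is not the Lab 4 code; the array names and the use of numpy.linalg.eigh are my own choices.

```python
import numpy as np

X = np.random.rand(100, 2)          # toy 2-D point cloud: N = 100 feature vectors as rows
mu = X.mean(axis=0)                 # mean of the feature vectors
Xc = X - mu                         # shift the cloud so the average position is at the origin
C = (Xc.T @ Xc) / X.shape[0]        # covariance matrix (2 x 2)

# eigh is used because C is symmetric; eigenvalues are returned in ascending order
eigvals, eigvecs = np.linalg.eigh(C)
order = np.argsort(eigvals)[::-1]   # reorder by decreasing variance
eigvals, eigvecs = eigvals[order], eigvecs[:, order]

# eigvecs[:, 0] is the direction of maximum variance (the first principal component);
# eigvecs[:, -1] is the direction of minimum variance.
# 1-D description of the 2-D points: project each point onto the first principal component.
proj = Xc @ eigvecs[:, 0]
```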

Shift cloud of points so that average position is at origin

Principal components of the cloud of points

Another Example

Shift cloud of points so that average position is at origin

Principal components of the cloud of points

One problem with applying PCA directly to the covariance matrix of the images is that the dimension of the covariance matrix is huge. For example, for a 256x256 image, X will be of dimension 65536x1, so the covariance matrix of X will be of size 65536x65536, i.e. about 4.3 billion entries (roughly 34 GB in double precision). Computing the eigenvectors of such a huge matrix directly is clearly infeasible. To solve this problem, Murakami and Kumar proposed a technique in the following paper: Murakami, H. and Vijaya Kumar, B.V.K., "Efficient Calculation of Primary Images from a Set of Images", IEEE Trans. Pattern Analysis and Machine Intelligence, vol. PAMI-4, no. 5, Sep. 1982. We shall now look at this method. In the lab, you will practice implementing this algorithm to obtain principal components from huge feature vectors, using a face recognition example.

To begin, we note that our goal is to find the eigenvectors and eigenvalues of the covariance matrix $C$. We will see that instead of working directly on $C$, which is a huge matrix, we can find the eigenvectors and eigenvalues of an $N \times N$ matrix and then use them to deduce the eigenvectors and eigenvalues of $C$. To see how this can be done, let $\tilde{\mathbf{x}}_k = \mathbf{x}_k - \boldsymbol{\mu}$ denote the mean-subtracted feature vectors, so that $C = \frac{1}{N}\sum_{k=1}^{N}\tilde{\mathbf{x}}_k\tilde{\mathbf{x}}_k^T$. Then for any vector $\mathbf{e}$, $C\mathbf{e} = \frac{1}{N}\sum_{k=1}^{N}\tilde{\mathbf{x}}_k(\tilde{\mathbf{x}}_k^T\mathbf{e})$, where each $\tilde{\mathbf{x}}_k^T\mathbf{e}$ is a scalar.

Therefore, if $\mathbf{e}$ is an eigenvector of $C$ with eigenvalue $\lambda \neq 0$, then $\lambda\mathbf{e} = C\mathbf{e} = \frac{1}{N}\sum_{k=1}^{N}(\tilde{\mathbf{x}}_k^T\mathbf{e})\,\tilde{\mathbf{x}}_k$. Let $a_k = \frac{1}{N\lambda}\tilde{\mathbf{x}}_k^T\mathbf{e}$; then $\mathbf{e} = \sum_{k=1}^{N} a_k\tilde{\mathbf{x}}_k$. So if $\mathbf{e}$ is an eigenvector of $C$, it can be expressed as a linear combination of the vectors $\tilde{\mathbf{x}}_1, \ldots, \tilde{\mathbf{x}}_N$.

Therefore, substituting $\mathbf{e} = \sum_{k=1}^{N} a_k\tilde{\mathbf{x}}_k$ back into $C\mathbf{e} = \lambda\mathbf{e}$ gives $\sum_{i=1}^{N}\tilde{\mathbf{x}}_i\left(\frac{1}{N}\sum_{k=1}^{N}(\tilde{\mathbf{x}}_i^T\tilde{\mathbf{x}}_k)\,a_k\right) = \lambda\sum_{i=1}^{N}a_i\tilde{\mathbf{x}}_i$, and matching the coefficients of each $\tilde{\mathbf{x}}_i$ yields $\frac{1}{N}\sum_{k=1}^{N}(\tilde{\mathbf{x}}_i^T\tilde{\mathbf{x}}_k)\,a_k = \lambda a_i$. Therefore, the vector $\mathbf{a} = (a_1, \ldots, a_N)^T$ is an eigenvector of the $N \times N$ matrix $M$ with $(i,j)$ elements given by $M_{ij} = \frac{1}{N}\tilde{\mathbf{x}}_i^T\tilde{\mathbf{x}}_j$.

So the eigenvector of the covariance matrix $C$, i.e. $\mathbf{e}$, is given by $\mathbf{e} = \sum_{k=1}^{N} a_k\tilde{\mathbf{x}}_k$, where $\mathbf{a}$ is the corresponding eigenvector of $M$. Note that the eigenvalues of the matrix $M$ with elements $M_{ij} = \frac{1}{N}\tilde{\mathbf{x}}_i^T\tilde{\mathbf{x}}_j$ are the same as the nonzero eigenvalues of the covariance matrix $C$. In conclusion, the task of finding the eigenvectors of a huge matrix of size $d \times d$ (where $d$ is the number of pixels in an image) is reduced to the task of finding the eigenvectors of an $N \times N$ matrix.
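
A rough NumPy sketch of this reduction follows. It assumes the mean-subtracted image vectors are stored as the columns of an array X; all names in it (X, M, A, E) are my own illustrative choices, not notation from the slides or from the Murakami-Kumar paper.

```python
import numpy as np

num_pixels, N = 65536, 40                      # e.g. 40 images of 256 x 256 pixels
X = np.random.rand(num_pixels, N)              # columns are the image feature vectors
X = X - X.mean(axis=1, keepdims=True)          # subtract the mean image from every column

M = (X.T @ X) / N                              # N x N inner-product matrix, M_ij = x_i^T x_j / N
lam, A = np.linalg.eigh(M)                     # eigen-decomposition of the small matrix
order = np.argsort(lam)[::-1]
lam, A = lam[order], A[:, order]

E = X @ A                                      # each column is e_i = sum_k a_ik * x_k
E = E / np.linalg.norm(E, axis=0, keepdims=True)   # normalise each eigenvector to unit length

# lam holds the nonzero eigenvalues of the huge covariance matrix C, and the columns of E
# are its principal components. In practice, keep only the components whose eigenvalues
# are non-negligible (after mean subtraction the rank is at most N - 1).
```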

Summary: suppose we have N feature vectors $\mathbf{x}_1, \ldots, \mathbf{x}_N$ and we want to find the principal components. Since a direct computation is not feasible due to the huge dimension of the covariance matrix, we do the following steps instead:
Step 1: Take the mean of the feature vectors, $\boldsymbol{\mu} = \frac{1}{N}\sum_{k=1}^{N}\mathbf{x}_k$.
Step 2: Subtract the mean from each of the feature vectors, $\tilde{\mathbf{x}}_k = \mathbf{x}_k - \boldsymbol{\mu}$.

Step 3: Form the inner product matrix $M$ with elements $M_{ij} = \frac{1}{N}\tilde{\mathbf{x}}_i^T\tilde{\mathbf{x}}_j$.
Step 4: Find the eigenvectors and eigenvalues of $M$.

Step 5: If $\mathbf{a}_i$ and $\lambda_i$ are the eigenvectors and the corresponding eigenvalues respectively of $M$, then the principal components of $C$ are $\mathbf{e}_i = \sum_{k=1}^{N} a_{ik}\tilde{\mathbf{x}}_k$ (normalized to unit length), with the same eigenvalues $\lambda_i$.
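
One possible way to package Steps 1-5 in NumPy, together with the projection used for compression and recognition, is sketched below. The function name principal_components and the toy data are my own illustrative choices, not code from the slides or the lab.

```python
import numpy as np

def principal_components(vectors):
    """vectors: array of shape (d, N) holding N feature vectors as columns."""
    mu = vectors.mean(axis=1, keepdims=True)        # Step 1: mean of the feature vectors
    Xc = vectors - mu                               # Step 2: subtract the mean
    M = (Xc.T @ Xc) / Xc.shape[1]                   # Step 3: N x N inner-product matrix
    lam, A = np.linalg.eigh(M)                      # Step 4: eigenvectors/eigenvalues of M
    order = np.argsort(lam)[::-1]
    lam, A = lam[order], A[:, order]
    E = Xc @ A                                      # Step 5: principal components of C
    E = E / np.linalg.norm(E, axis=0, keepdims=True)
    return mu, E, lam

# Usage: describe each image by its coordinates along the first k principal components.
data = np.random.rand(4096, 20)                     # e.g. 20 images of 64 x 64 pixels
mu, E, lam = principal_components(data)
k = 5
coeffs = E[:, :k].T @ (data - mu)                   # compressed k-dimensional description
approx = mu + E[:, :k] @ coeffs                     # approximate reconstruction of the images
```

Keeping only the k components with the largest eigenvalues gives the compressed description used for recognition, e.g. comparing the coefficient vectors of a probe face against those of the stored faces.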