HCI/ComS 575X: Computational Perception


HCI/ComS 575X: Computational Perception Instructor: Alexander Stoytchev http://www.vrac.iastate.edu/~alexs

Eigenfaces HCI/ComS 575X: Computational Perception Iowa State University Copyright © Alexander Stoytchev

Lecture Plan A Cool Demo Eigenvectors: Math Background Eigenvectors: Matlab demo Eigenfaces: Theory + Notation + Algorithm Eigenfaces: Matlab demo

M. Turk and A. Pentland (1991). "Eigenfaces for Recognition". Journal of Cognitive Neuroscience, 3(1).

Dana H. Ballard (1999). "An Introduction to Natural Computation (Complex Adaptive Systems)", Chapter 4: "Data", pp. 70-94, MIT Press.

Readings for Next Time Henry A. Rowley, Shumeet Baluja and Takeo Kanade (1997). "Rotation Invariant Neural Network-Based Face Detection", Carnegie Mellon Technical Report, CMU-CS-97-201. Paul Viola and Michael Jones (2001). "Robust Real-time Object Detection", Second International Workshop on Statistical and Computational Theories of Vision: Modeling, Learning, Computing, and Sampling, Vancouver, Canada, July 13, 2001.

Review of Eigenvalues and Eigenvectors

Review Questions What is a vector? What is a matrix? What is the result when a vector is multiplied by a matrix? (Ax = ?)

Systems of Linear Equations

Ax = b, where A is an m x n matrix, x is an n-vector (column), and b is an m-vector (column).

Example:

[ 2  1 ] [x1]   [1]
[-3  4 ] [x2] = [4]

i.e., 2 x1 + x2 = 1 and -3 x1 + 4 x2 = 4.
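As a quick sanity check (a small Matlab sketch, not part of the original slide), the example system can be solved with the backslash operator:

>> A = [2 1; -3 4];
>> b = [1; 4];
>> x = A \ b        % solves Ax = b; here x = [0; 1]
>> A * x            % reproduces b = [1; 4]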

Intuitive Definition For a square matrix there are special vectors for which multiplication by the matrix changes only the magnitude of the vector, not its direction. These vectors are called eigenvectors.

Mathematical Formulation

Ax = λx, where x is an eigenvector of A and λ is the corresponding eigenvalue.

Example: A = [0 1; 1 0]
  λ = 1  and x1 = x2    (eigenvector 1)
  λ = -1 and x1 = -x2   (eigenvector 2)

Properties:
  A symmetric              =>  λ is real
  A positive definite      =>  λ > 0
  A positive semi-definite =>  λ >= 0

[Figure: a vector x and its image Ax when x is an eigenvector]
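A minimal Matlab check of the example above (added here for illustration):

>> A  = [0 1; 1 0];
>> x1 = [1; 1];            % a vector with x1 = x2
>> x2 = [1; -1];           % a vector with x1 = -x2
>> A * x1                  % returns [1; 1]  =  1 * x1, so lambda =  1
>> A * x2                  % returns [-1; 1] = -1 * x2, so lambda = -1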

On-line Java Applets
http://www.math.duke.edu/education/webfeatsII/Lite_Applets/contents.html
http://www.math.ucla.edu/~tao/resource/general/115a.3.02f/EigenMap.html

Java Applet Example [http://www.math.duke.edu/education/webfeatsII/Lite_Applets/contents.html]

Java Applet Example [http://www.math.duke.edu/education/webfeatsII/Lite_Applets/contents.html]

Finding the Eigenvectors [From Ballard (1999)]

Finding the Eigenvectors

Finding the Eigenvectors Rewrite Ax = λx as (A - λI)x = 0, where I is the Identity Matrix.

Finding the Eigenvectors

Finding the Eigenvectors [From Ballard (1999)]

Finding the Eigenvectors A nontrivial solution x exists only if det(A - λI) = 0 (the determinant, not the matrix). This is the characteristic equation. [From Ballard (1999)]
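As a concrete illustration (using the 2x2 matrix from the earlier slide rather than Ballard's example): det(A - λI) = det([-λ 1; 1 -λ]) = λ^2 - 1 = 0, so λ = 1 or λ = -1. The same result in Matlab:

>> A = [0 1; 1 0];
>> poly(A)             % characteristic polynomial coefficients: [1 0 -1], i.e. lambda^2 - 1
>> roots(poly(A))      % the eigenvalues: 1 and -1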

Finding the Eigenvectors The solutions to the characteristic equation are the eigenvalues In this case: [From Ballard (1999)]

Finding the Eigenvectors Substituting the eigenvalue back in, we get this system of equations [From Ballard (1999)]

Finding the Eigenvectors Thus, the first eigenvector is: The corresponding eigenvalue is: Verification: [From Ballard (1999)]

Finding the Eigenvectors The second eigenvector is: [-0.5, 1]^T The corresponding eigenvalue is: 1

Eigenvectors and Eigenvalues in Matlab

>> A = [3 1; 2 2]
>> [V, D] = eig(A)

V =
    0.7071   -0.4472
    0.7071    0.8944

D =
     4     0
     0     1

Eigenvectors and Eigenvalues in Matlab

>> L = [ V(:,1), [0; 0], V(:,2) ]'

L =
    0.7071    0.7071
         0         0
   -0.4472    0.8944

>> line(L(:,1), L(:,2))
>> axis([-1, 1, -1, 1])

Eigenvectors vs. Clustering

[http://web.math.umt.edu/bardsley/courses/471/eigenface.pdf]

Simple Example

Three Sample Data Points [From Ballard (1999)]

The mean is equal to … [From Ballard (1999)]

Subtract the Mean From All Data Points [From Ballard (1999)]

Column-Vector Multiplied by a Row-Vector = Matrix!
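A tiny numeric illustration (made-up values, not from the slides):

>> u = [1; 2; 3];          % 3x1 column vector
>> v = [4 5];              % 1x2 row vector
>> u * v                   % 3x2 matrix: [4 5; 8 10; 12 15]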

The Covariance Matrix is Given By C = (1/M) * sum over i of Φ_i Φ_i^T, where the Φ_i are the mean-subtracted data points. [From Ballard (1999)]
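A Matlab sketch of this computation, using hypothetical 2-D points in place of Ballard's three sample points:

>> X   = [1 2 6; 1 3 5];              % one data point per column (made-up values)
>> M   = size(X, 2);                  % number of data points
>> m   = mean(X, 2);                  % the mean data point
>> Phi = X - repmat(m, 1, M);         % subtract the mean from every point
>> C   = (Phi * Phi') / M             % the covariance matrix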

One possible coordinate system for this set of data points [From Ballard (1999)]

A better one… [From Ballard (1999)]

Eigenvectors of the Covariance Matrix The eigenvectors of the covariance matrix are important! They define the coordinate system that best represents these specific data points. The structure of the data is maximally revealed if these coordinates are used instead of any other ones.
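Continuing the hypothetical Matlab sketch from the covariance slide above, the new coordinate system comes straight out of eig:

>> [V, D]   = eig(C);                     % columns of V are the eigenvectors of C
>> [~, idx] = sort(diag(D), 'descend');
>> V        = V(:, idx);                  % order the axes by decreasing eigenvalue
>> Y        = V' * Phi                    % the data expressed in the eigenvector coordinates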

Let…

Also, Let…

Thus the Covariance Matrix can be rewritten as: C = (1/M) A A^T, where the columns of A are the mean-subtracted data points.

Why can we drop 1/M? Multiplying a matrix by a positive constant rescales its eigenvalues but does not change its eigenvectors, and here only the eigenvectors matter.
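A quick Matlab illustration of this point (example matrix chosen arbitrarily):

>> C = [2 1; 1 2];
>> [V1, D1] = eig(C);           % eigenvectors and eigenvalues of C
>> [V2, D2] = eig(5 * C);       % same eigenvector directions (up to sign); eigenvalues are 5x larger
>> V1, V2, D1, D2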

Representing images as vectors
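In Matlab, for example, an image becomes one long column vector by stacking its pixel columns (the file name here is hypothetical):

>> img = imread('face1.pgm');          % a hypothetical face image
>> x   = double(img(:));               % all pixels stacked into a single column vector
>> % doing this for each of the M training faces, subtracting the mean face, and
>> % placing the results side by side gives the matrix A used on the next slides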

A A^T

A^T A

What about that Trick with the Dimensions?
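The idea, sketched here rather than taken verbatim from the slides: A'*A is only M-by-M (M = number of images), which is much smaller than the huge A*A'. If v is an eigenvector of A'*A with eigenvalue λ, then A*v is an eigenvector of A*A' with the same eigenvalue, because (A*A')(A*v) = A(A'*A*v) = A(λ*v) = λ*(A*v). In Matlab:

>> % A holds one mean-subtracted face per column (N^2-by-M)
>> [V, D] = eig(A' * A);                                  % small M-by-M eigenproblem
>> U = A * V;                                             % columns are eigenvectors of A*A' (the eigenfaces)
>> U = U ./ repmat(sqrt(sum(U.^2, 1)), size(U, 1), 1);    % normalize each eigenface to unit length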

Matlab Demo

Visualizing the Eigenvectors in 3D

Recognition Example ? ?

Recognition Example ? ?

How about this one? ?
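A hedged sketch of the recognition step (variable names are assumptions, following the general Turk and Pentland procedure): project the mean-subtracted query image onto the eigenfaces and find the closest stored weight vector.

>> % U: eigenfaces (one per column), m: mean face vector, W: weight vectors of the training faces
>> q  = double(imread('query.pgm'));                           % hypothetical query image
>> wq = U' * (q(:) - m);                                       % weights of the query in face space
>> d  = sqrt(sum((W - repmat(wq, 1, size(W, 2))).^2, 1));      % distance to every training face
>> [dmin, best] = min(d)                                       % nearest face; a small dmin means a likely match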

Eigenfaces

Representing images as vectors

The next set of slides comes from Prof. Ramani Duraiswami's class, CMSC 426: Computer Vision http://www.umiacs.umd.edu/~ramani/cmsc426/

[A sequence of slides from http://www.umiacs.umd.edu/~ramani/cmsc426/]

Matlab Demo

THE END