Eigenfaces (for Face Recognition)

Face Recognition, Eigenfaces, Matrices
We start by looking at a problem such as: 4 apples plus 5 bananas cost $10; 7 apples plus 6 bananas cost $20. What is the price of an apple or a banana? So, first we call the apple price A and the banana price B, and get: 4A + 5B = 10 and 7A + 6B = 20

Linear Equations and Matrices
4A + 5B = 10 and 7A + 6B = 20
These two equations can be written in matrix form:

[ 4 5 ] [ A ]   [ 10 ]
[ 7 6 ] [ B ] = [ 20 ]

which is the matrix notation for this kind of problem.

Going from one notation to another
4A + 5B = 10 and 7A + 6B = 20 becomes

[ 4 5 ] [ A ]   [ 10 ]
[ 7 6 ] [ B ] = [ 20 ]

Note that we have not actually solved anything by moving to the new notation; we have merely got a different way to express the problem. But now the solution will be easier.

Matrices
We said the solution will be easier. Well, what happens in mathematics is that they tell you: name the matrix as, say, P. Then find the inverse of P, called P⁻¹. Then multiply both sides by P⁻¹.

Matrices
So, we name the matrix P. Let

P = [ 4 5 ]
    [ 7 6 ]

so we get

P [ A ]   [ 10 ]
  [ B ] = [ 20 ]

Then find the inverse of P, called P⁻¹, and multiply both sides by P⁻¹ (multiplying by putting it on the left). So we get

P⁻¹ P [ A ]       [ 10 ]
      [ B ] = P⁻¹ [ 20 ]
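The multiply-by-the-inverse step above can be sketched in Python with NumPy. This is an illustrative sketch only; the slides do not prescribe any particular library.

```python
import numpy as np

# The system from the slides: 4A + 5B = 10, 7A + 6B = 20
P = np.array([[4.0, 5.0],
              [7.0, 6.0]])
rhs = np.array([10.0, 20.0])

P_inv = np.linalg.inv(P)    # the inverse, P^-1
solution = P_inv @ rhs      # [A, B] = P^-1 [10, 20]

# Check: multiplying P by the solution recovers the right-hand side
assert np.allclose(P @ solution, rhs)
print(solution)
```

In practice `np.linalg.solve(P, rhs)` is preferred over forming the inverse explicitly (it is faster and numerically safer), but the inverse form mirrors the derivation on the slide.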

Matrices
So, we have P⁻¹ P [A; B] = P⁻¹ [10; 20]. Now, for some matrix facts:
1) For any invertible square matrix P, P⁻¹ × P = Identity
2) For any matrix or vector T, Identity × T = T
3) For any square matrix P, if P is of dimension 2×2, then its inverse, P⁻¹, is also of dimension 2×2.

Matrices
4) For two entities (matrix or vector) to multiply, when the dimensions are written out, the inner numbers must agree, i.e., be common; the resulting answer will have dimensions specified by the two outer numbers.
Example 1 for Fact 4: A (2×3) times B (3×7) = C, and C is 2×7. In the example, "=" means "gives the result".
Example 2: A (2×3) × B (2×3) is not valid. Why? (The inner numbers, 3 and 2, do not agree.)

Matrices
For Fact 4, Example 3: in A (2×3) × B × C (5×3), the dimensions of B MUST be 3×5.
5) The Identity matrix is a square matrix with zeroes in the complete matrix except the main diagonal, which has 1's. The main diagonal is the one that runs from the Northwest to the Southeast, or, said another way, from the top left to the bottom right.
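Facts 4 and 5 can be checked with a small NumPy sketch (illustrative shapes matching Example 1 above):

```python
import numpy as np

A = np.ones((2, 3))   # 2x3
B = np.ones((3, 7))   # 3x7: inner numbers 3 and 3 agree
C = A @ B             # result takes the two outer numbers: 2x7
print(C.shape)        # (2, 7)

# Fact 5: the identity matrix has 1's on the main diagonal
I = np.eye(3)
T = np.arange(9.0).reshape(3, 3)
assert np.allclose(I @ T, T)   # Identity x T = T
```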

Matrices
Some notes on matrices:
1) We have not said anything about "How to compute the inverse?" This topic is covered in a Matrix Algebra class at about the 3-to-4-week mark. Suffice it to say that we rarely have to write the actual computation code for the inverse, because it has become a standard add-on to most libraries, in particular to platforms like MATLAB.
2) Whether we multiply on the left or the right depends on the positioning of the item we are trying to remove.

Redundant Equations for Matrices
Our equations had been: 4A + 5B = 10 and 7A + 6B = 20
Now, consider: 4A + 5B = 10 and 8A + 10B = 20
Our matrix then is

P = [ 4  5 ]
    [ 8 10 ]

Redundant Equations for Matrices
Our matrix is now P = [4 5; 8 10]. Our system of two equations, and our new matrix P, does not have what is called a Unique Solution. The reason is that the second equation is not adding anything new to the first (the 2nd simply doubles the 1st).

Redundant Equations for Matrices our matrix then is P = [ ] The 2nd simply doubling the 1st, is an example of redundant equations. In general systems of equations, one can have Exactly one solution, 2) No solution Or 3) Infinitely many solutions. 4 5 8 10

Redundant Equations for Matrices
In systems of equations, one can have: 1) Exactly one solution, 2) No solution, or 3) Infinitely many solutions. This can be seen geometrically as: 1) two lines crossing (intersecting) at a unique point; 2) two lines never intersecting each other, being parallel to each other; 3) two lines being identical, lying on each other (infinite solutions).

Redundant Equations for Matrices
Cases 2 and 3 (2: two lines never intersecting each other, being parallel; 3: two lines being identical, lying on each other, giving infinite solutions) arise from the following situation: 4A + 5B = 10 and 8A + 10B = ??

Redundant Equations for Matrices
For those cases, consider: 4A + 5B = 10 and 8A + 10B = ??
If ?? is 2 × 10 (i.e., 20), the lines are identical (Case 3); if not, they are parallel (Case 2).

Redundant Equations for Matrices
For equations like 4A + 5B = 10 and 8A + 10B = ??, if ?? is 2 × 10 (i.e., 20), the lines are identical (Case 3); if not, they are parallel (Case 2). (Case 2 is sometimes called Inconsistent Equations.) So, mathematics evolved techniques to automatically tell which case a given system falls into.
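The "which case" test can be automated by comparing matrix ranks. This is one standard technique, sketched here in NumPy; the slide does not name a specific method, so treat the helper below as a hypothetical illustration.

```python
import numpy as np

def classify_system(P, b):
    """Classify the linear system P x = b as unique / none / infinitely many."""
    rank_P = np.linalg.matrix_rank(P)
    rank_aug = np.linalg.matrix_rank(np.column_stack([P, b]))
    if rank_aug > rank_P:
        return "no solution"        # inconsistent: parallel lines (Case 2)
    if rank_P < P.shape[1]:
        return "infinitely many"    # redundant: identical lines (Case 3)
    return "unique solution"        # lines cross at one point (Case 1)

print(classify_system(np.array([[4., 5.], [7., 6.]]), np.array([10., 20.])))   # unique solution
print(classify_system(np.array([[4., 5.], [8., 10.]]), np.array([10., 20.])))  # infinitely many
print(classify_system(np.array([[4., 5.], [8., 10.]]), np.array([10., 25.])))  # no solution
```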

Redundant Equations for Matrices
Not only did mathematics evolve techniques to automatically tell which case a given system falls into, it went further. Consider: 4A + 5.00001B = 10 and 8A + 10B = 20.000001
These lines (equations) are not exactly identical, but almost are.

Redundant Equations for Matrices
For equations like 4A + 5.00001B = 10 and 8A + 10B = 20.000001, mathematics invented techniques to tell when the rows of a matrix are "almost" redundant. In fact, the techniques can tell how strongly non-redundant (called independent) a row is. This black-box technique is called EigenAnalysis. You feed it a square matrix, and it comes back with a set of new columns (called eigenvectors) and a set of scalar numbers (called eigenvalues).
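A small NumPy sketch of that idea, using the nearly redundant numbers from the slide: feeding the matrix to an eigen-solver yields one eigenvalue very close to zero, which flags the "almost" redundant row.

```python
import numpy as np

# Nearly redundant: row 2 is almost exactly 2 x row 1
P = np.array([[4.0, 5.00001],
              [8.0, 10.0]])

eigenvalues, eigenvectors = np.linalg.eig(P)
print(eigenvalues)
# One eigenvalue is large (about 14), the other is tiny (near 0):
# the tiny one signals the near-redundancy between the rows.
assert min(abs(eigenvalues)) < 1e-4
assert max(abs(eigenvalues)) > 10
```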

Redundant Equations for Matrices
In EigenAnalysis, we feed in a square matrix, and it comes back with a set of new columns (called eigenvectors) and a set of scalar numbers (called eigenvalues). The eigenvectors tell us about a new way to represent any original vector of the space. The eigenvalues tell us how much redundancy there is in the data vectors.

Redundant Equations for Matrices
The eigenvalues tell us how much redundancy there is in the data vectors. It turns out that in the world of faces, there is a lot of redundancy. So, face recognition can be sped up by exploiting the redundancy.

Brute Force way to Recognize Faces
Imagine we have a database of M faces. Suppose each face is 256×256 pixels. Then the matching task, i.e., the recognition task, can be done by: simply get a match score between the 256×256 test-face picture and each database picture, then find the minimum score (or the lowest 5 scores, or something like that). Now, since this face recognition topic is a CS topic, we examine the time cost of this approach: the cost to get the match score for matching the test pic to each database pic.

Brute Force way to Recognize Faces
Since this face recognition topic is a CS topic, we examine the time cost of this approach: the cost to get the match score for matching a test pic to each database pic. The match score can be computed by an algorithm that loops thru each pixel position and compares the test pic's pixel and the database pic's pixel (at the same position). The comparison could take the form of a simple subtraction followed by an absolute-value calculation. Then all these absolute values (each for a different position in the image) are added up to give the match score. So, our score is

score = Σ (i = 1 to 256×256) |TestPic pixel i − DatabasePic pixel i|
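The brute-force score can be sketched directly. Note this is a toy: random arrays stand in for face pictures, which a real system would load from image files.

```python
import numpy as np

def match_score(test_pic, db_pic):
    """Sum of absolute pixel differences (lower = better match)."""
    return np.abs(test_pic.astype(float) - db_pic.astype(float)).sum()

rng = np.random.default_rng(0)
test = rng.integers(0, 256, size=(256, 256))            # stand-in test face
db = [rng.integers(0, 256, size=(256, 256)) for _ in range(5)]
db.append(test.copy())                                  # plant an exact match

scores = [match_score(test, pic) for pic in db]
best = int(np.argmin(scores))
print(best)  # 5: the planted exact copy wins with score 0
```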

Brute Force way to Recognize Faces
Now, for a database of M faces, we are matching a test pic to the stored faces. So the total work is M × (number of steps in matching the test pic to a single database face). Each face takes 256×256 loop executions, so the total work is 256×256 loop executions × M. This is the Brute Force Approach: the simplest approach that would come to mind, without much thought. The challenge, then, is to beat the brute-force speed.

Brute Force way to Recognize Faces
The challenge is to beat the brute-force speed. What the eigenface approach does is lower the cost from 256×256 loop executions (per picture match in the brute-force approach) to about 50 loop executions per match to a picture in the database. How is the lowering achieved?

Redundant Equations for Matrices
The eigenvectors are going to give us a new representation of each original face, plus we will know how much redundancy there is in the system, so that the true independent dimensions are much fewer than 256×256. In the brute-force approach, each database pic is represented by its 256×256 pixels, hence the matching had to go thru each of these 256×256 numbers. If we can represent each pic by fewer numbers, say 50, we will have a speedier approach. We next look at how this lowering is done.
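How the lowering works can be sketched with a PCA-style eigen-analysis. This is an illustrative toy under stated assumptions: random vectors stand in for faces, the eigenfaces come from an SVD of the centered data (one common way to get the eigenvectors), and 50 is the slide's ballpark coefficient count.

```python
import numpy as np

rng = np.random.default_rng(1)
M, pixels, k = 60, 256 * 256, 50

faces = rng.random((M, pixels))        # database faces as row vectors
mean_face = faces.mean(axis=0)
centered = faces - mean_face

# Eigenvectors of the face covariance via SVD of the centered data;
# the rows of Vt are the "eigenfaces"
U, S, Vt = np.linalg.svd(centered, full_matrices=False)
eigenfaces = Vt[:k]                    # keep the top k directions

db_coeffs = centered @ eigenfaces.T    # each face -> just k numbers
test = faces[3]                        # pretend face 3 is the test pic
test_coeffs = (test - mean_face) @ eigenfaces.T

# Matching now loops over k = 50 numbers per face, not 256x256
scores = np.abs(db_coeffs - test_coeffs).sum(axis=1)
print(int(np.argmin(scores)))  # 3: the matching database face
```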