PRINCIPAL COMPONENT ANALYSIS
Presented by: Muhammad Wasif Laeeq (BSIT07-1), Muhammad Aatif Aneeq (BSIT07-15), Shah Rukh (BSIT07-22), Mudasir Abbas (BSIT07-34), Ahmad Mushtaq (BSIT07-45)

An example: five students measured on two variables, a 5 x 2 matrix.

             Study Hours   G.P.A.
Student A         6         4
Student B         5         3.2
Student C         4         2.75
Student D         2         2
Student E

THE DEFINITION
Principal component analysis (PCA) is a mathematical procedure that uses an orthogonal transformation to convert a set of observations of possibly correlated variables into a set of values of uncorrelated variables called principal components.

Properties of principal components (illustrated by the sketch below):
- Principal components are always perpendicular (orthogonal) to each other.
- The number of principal components is less than or equal to the number of original variables.
- The first principal component has the highest variance.
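
A minimal numpy sketch of these properties (an added illustration, not from the original slides; the data is synthetic). It eigen-decomposes a covariance matrix and checks that the component directions are mutually perpendicular and ordered by variance:

import numpy as np

rng = np.random.default_rng(0)
# Correlated 3-variable data: 200 observations.
X = rng.normal(size=(200, 3)) @ np.array([[2.0, 0.5, 0.1],
                                          [0.0, 1.0, 0.3],
                                          [0.0, 0.0, 0.2]])
C = np.cov(X, rowvar=False)               # covariance matrix of the 3 variables
eigvals, eigvecs = np.linalg.eigh(C)      # eigen decomposition (ascending order)
eigvals = eigvals[::-1]                   # reorder: largest variance first
eigvecs = eigvecs[:, ::-1]

print(np.round(eigvecs.T @ eigvecs, 10))  # identity: components are perpendicular
print(eigvals)                            # non-increasing: first PC has highest variance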

WHERE TO USE
- To uncover unknown trends in data
- To summarize data
- To visualize high-dimensional data
- For dimensionality reduction (see the sketch below)
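
As an illustration of the dimensionality-reduction and visualization use cases, here is a short sketch using scikit-learn's PCA class (an added example; the original presentation works in Excel, not Python):

import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(1)
X = rng.normal(size=(100, 50))        # 100 observations, 50 variables
pca = PCA(n_components=2)             # keep only the 2 highest-variance components
X2 = pca.fit_transform(X)             # shape (100, 2): ready for a 2-D scatter plot
print(pca.explained_variance_ratio_)  # share of total variance each component keeps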

BACKGROUND MATHEMATICS
Presented by Shah Rukh

STATISTICS
- Mean
- Standard deviation
- Variance
- Covariance
- The covariance matrix

MATHEMATICS: Matrix Algebra
- Eigenvectors
- Eigenvalues

STATISTICS: Mean
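
The formula on the original slide is not recoverable; presumably it was the usual sample mean:

\bar{X} = \frac{1}{n} \sum_{i=1}^{n} X_i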

STATISTICS: Standard deviation
The average distance from the mean of the data set to a point.

STATISTICS (cont.)
Compute the squared distance from each data point to the mean of the set, add them all up, divide by n - 1, and take the positive square root:
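
In symbols (the standard sample standard deviation, which the slide's lost formula presumably showed):

s = \sqrt{ \frac{ \sum_{i=1}^{n} (X_i - \bar{X})^2 }{ n - 1 } }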

STATISTICS: Variance
Variance is another measure of the spread of data in a data set. In fact it is almost identical to the standard deviation: it is simply the standard deviation squared. The formula is:
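
s^2 = \frac{ \sum_{i=1}^{n} (X_i - \bar{X})^2 }{ n - 1 }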

STATISTICS: Covariance
Standard deviation and variance are one-dimensional; when there are two dimensions of data, we use the covariance, which measures how the two dimensions vary together. The formula is:
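
\operatorname{cov}(X, Y) = \frac{ \sum_{i=1}^{n} (X_i - \bar{X})(Y_i - \bar{Y}) }{ n - 1 }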

STATISTICS: The covariance matrix
When there are more than two dimensions of data, we use the covariance matrix. Suppose we have three dimensions of data, x, y, and z:
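
The matrix shown on the slide was presumably the standard one:

C = \begin{pmatrix}
\operatorname{cov}(x,x) & \operatorname{cov}(x,y) & \operatorname{cov}(x,z) \\
\operatorname{cov}(y,x) & \operatorname{cov}(y,y) & \operatorname{cov}(y,z) \\
\operatorname{cov}(z,x) & \operatorname{cov}(z,y) & \operatorname{cov}(z,z)
\end{pmatrix}

Note that cov(x, x) = var(x) on the diagonal, and the matrix is symmetric since cov(x, y) = cov(y, x).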

MATHEMATICS: Matrix Algebra
- Eigenvectors
- Eigenvalues

MATHEMATICS (cont.)
Eigenvectors: the eigenvectors of a square matrix are the non-zero vectors which, after being multiplied by the matrix, remain proportional to the original vector.

MATHEMATICS (cont.)
Eigenvalues: for each eigenvector, the corresponding eigenvalue is the factor by which the eigenvector is scaled when multiplied by the matrix.

MATHEMATICS: Eigenvectors and eigenvalues
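
The worked example on the original slide did not survive the transcript; here is a standard one in its place. A vector v is an eigenvector of a matrix A with eigenvalue \lambda when A v = \lambda v. For example, with

A = \begin{pmatrix} 2 & 1 \\ 1 & 2 \end{pmatrix}

we have

A \begin{pmatrix} 1 \\ 1 \end{pmatrix} = \begin{pmatrix} 3 \\ 3 \end{pmatrix} = 3 \begin{pmatrix} 1 \\ 1 \end{pmatrix}
\qquad
A \begin{pmatrix} 1 \\ -1 \end{pmatrix} = \begin{pmatrix} 1 \\ -1 \end{pmatrix} = 1 \begin{pmatrix} 1 \\ -1 \end{pmatrix}

so (1, 1) and (1, -1) are eigenvectors with eigenvalues 3 and 1; note that the two are perpendicular.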

PCA IN FACE RECOGNITION

So how do we detect a face? Any face can be identified by projecting it, minus the average face, onto the eigenfaces. If the image we "project" back from the face space is close enough to the actual detected image, then we found what we are looking for…
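
A minimal numpy sketch of that projection test (an added illustration; the function name, arguments, and threshold are assumptions, not from the slides). Here eigenfaces is a k x d matrix whose rows are orthonormal eigenvectors of the training faces, and image and mean_face are flattened length-d vectors:

import numpy as np

def looks_like_known_face(image, mean_face, eigenfaces, threshold):
    # Subtract the average face, then project onto the eigenfaces.
    centered = image - mean_face
    weights = eigenfaces @ centered          # coordinates in face space
    reconstructed = eigenfaces.T @ weights   # projection back into image space
    error = np.linalg.norm(centered - reconstructed)
    return error < threshold                 # close enough => a face we recognize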

PCA IN ACTION!

Suppose we have 2-dimensional data:

The first step of PCA is to obtain the covariance matrix, built from the variance of x1, the variance of x2, and the covariance of x1 and x2:

C = \begin{pmatrix}
\operatorname{Var}(1) & \operatorname{Cov}(1,2) \\
\operatorname{Cov}(1,2) & \operatorname{Var}(2)
\end{pmatrix}

The formulas for variance and covariance are as given in the Statistics section above:

\operatorname{Var}(x) = \frac{ \sum_{i=1}^{n} (x_i - \bar{x})^2 }{ n - 1 }
\qquad
\operatorname{Cov}(x_1, x_2) = \frac{ \sum_{i=1}^{n} (x_{1i} - \bar{x}_1)(x_{2i} - \bar{x}_2) }{ n - 1 }

Step 2 is to obtain the eigenvalues by solving the determinant equation

\det(C - \lambda I) = 0

Solving the above equation gives two values of \lambda, and these two values are the eigenvalues.

Step 3 is to obtain the eigenvectors by solving for the vector X such that

(C - \lambda I) X = 0

where C is the covariance matrix from Step 1.

Step 4 is to obtain the coordinates of each data point in the direction of the eigenvectors. We obtain these by multiplying the centered data matrix by the eigenvector matrix, as in the sketch below.
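
A minimal numpy sketch of Steps 1 through 4 (the data below is made up for illustration; the presentation's own numbers live in an Excel workbook that is not reproduced in this transcript):

import numpy as np

# Illustrative 2-dimensional data: one row per observation.
X = np.array([[2.5, 2.4], [0.5, 0.7], [2.2, 2.9], [1.9, 2.2],
              [3.1, 3.0], [2.3, 2.7], [2.0, 1.6], [1.0, 1.1]])

Xc = X - X.mean(axis=0)               # center each variable on its mean
C = np.cov(Xc, rowvar=False)          # Step 1: 2 x 2 covariance matrix
eigvals, eigvecs = np.linalg.eigh(C)  # Steps 2-3: solve det(C - lambda*I) = 0
order = np.argsort(eigvals)[::-1]     # highest-variance component first
eigvals, eigvecs = eigvals[order], eigvecs[:, order]
scores = Xc @ eigvecs                 # Step 4: coordinates along the eigenvectors
print(eigvals)                        # the two eigenvalues
print(scores)                         # each data point in the new coordinates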

Let's have a look at an Excel workbook.

Our covariance matrix is:

Let's find out the eigenvalues by solving the determinant equation \det(C - \lambda I) = 0 for the covariance matrix above; solving it gives two values of \lambda, and these are the two eigenvalues.

Now we will find the eigenvectors by solving the matrix equation below. Subtracting the 1st eigenvalue from the variance terms of the covariance matrix of Step 1, we obtain:

Finding the eigenvectors: for the 1st eigenvector, solve

(C - \lambda_1 I) \begin{pmatrix} a \\ b \end{pmatrix} = \begin{pmatrix} 0 \\ 0 \end{pmatrix}

We get a = 0.6262, b = …. Similarly, for the 2nd eigenvalue we get a = … and b = ….

Now obtain the coordinates of the data points in the direction of the eigenvectors.
