Stanford Synchrotron Radiation Lightsource Principal Component Analysis Apurva Mehta.


1D dataset?

A new Pebble Pattern

2D dataset? Two Eigenvectors

EXAFS dataset… Two components/distinct phases

EXAFS dataset… Is this a new phase, or a linear combination of the other two?
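The slide's question can be answered numerically: fit the candidate spectrum as a linear combination of the known component spectra and inspect the residual. This is a minimal sketch with synthetic stand-ins for measured EXAFS spectra (the sine curves and mixing fractions are assumptions for illustration, not data from the slides):

```python
import numpy as np

# Two known component spectra (synthetic stand-ins for measured EXAFS data)
x = np.linspace(0, 10, 200)
phase1 = np.sin(x)
phase2 = np.sin(2 * x)

# Candidate "new" spectrum: secretly 0.6 * phase1 + 0.4 * phase2
candidate = 0.6 * phase1 + 0.4 * phase2

# Fit the candidate as a linear combination of the known phases
A = np.column_stack([phase1, phase2])
coeffs, residuals, rank, _ = np.linalg.lstsq(A, candidate, rcond=None)

fit = A @ coeffs
misfit = np.linalg.norm(candidate - fit)
print(coeffs)  # ≈ [0.6, 0.4]
print(misfit)  # ≈ 0: no new phase is needed to explain the spectrum
```

A large misfit, by contrast, would indicate that the candidate contains a component not spanned by the known phases.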

The world is certainly 2D. But is it higher-dimensional?

With better data… Maybe 3D. But 11D? We would need better data than Google Earth.

So it is true for other datasets too

OK, now we know the number of components/phases/eigenvectors. So what are they?
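One common way to estimate the number of components before extracting them is to look at the eigenvalue spectrum of the data covariance: significant components carry a non-trivial share of the variance, noise components do not. A minimal sketch on synthetic data (the mixture construction and the 1% threshold are illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic dataset: 50 "spectra" that are mixtures of 2 underlying
# components plus a little noise (stand-in for real measurements)
comp = rng.normal(size=(2, 100))        # 2 true component spectra
weights = rng.uniform(size=(50, 2))     # mixing fractions per sample
data = weights @ comp + 0.01 * rng.normal(size=(50, 100))

# Eigenvalues of the covariance matrix, sorted largest first
eigvals = np.linalg.eigvalsh(np.cov(data, rowvar=False))[::-1]

# Count eigenvalues that carry a non-trivial share of the variance
explained = eigvals / eigvals.sum()
n_components = int(np.sum(explained > 0.01))
print(n_components)  # 2: the noise eigenvalues fall far below the threshold
```

In practice one plots the eigenvalues (a scree plot) and looks for the elbow rather than using a fixed cutoff.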

2D dataset

2D dataset: Eigen 1 and Eigen 2. Why not these?

PCA is just math; it knows nothing about your samples. Therefore, it picks component 1 to take up the largest variation, component 2 to take up the largest of the remainder, and so on.

What about orthogonality?

What about orthogonality? PCA eigenvectors

What about orthogonality? An alternate eigenset: component 1 is negative.

What about orthogonality? Another alternate eigenset: all samples are a positive sum of the components.
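The "all samples = positive sum of components" constraint is the idea behind non-negative matrix factorization (NMF), a technique the slides allude to without naming. A minimal sketch using Lee–Seung multiplicative updates on synthetic non-negative data (the algorithm choice, data, and iteration count are assumptions for illustration):

```python
import numpy as np

rng = np.random.default_rng(2)

# Non-negative data: each sample is a positive mixture of 2 components
true_comp = rng.uniform(size=(2, 40))
true_w = rng.uniform(size=(30, 2))
V = true_w @ true_comp

# Lee-Seung multiplicative updates for V ≈ W @ H with W, H >= 0
k = 2
W = rng.uniform(size=(30, k))
H = rng.uniform(size=(k, 40))
eps = 1e-9  # guard against division by zero
for _ in range(500):
    H *= (W.T @ V) / (W.T @ W @ H + eps)
    W *= (V @ H.T) / (W @ H @ H.T + eps)

rel_err = np.linalg.norm(V - W @ H) / np.linalg.norm(V)
print(rel_err)                            # small: the fit is good
print((W >= 0).all() and (H >= 0).all())  # True: mixtures stay positive
```

Unlike PCA's eigenvectors, the resulting components need not be orthogonal, but they can be interpreted directly as physical phases because every sample is a non-negative combination of them.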

What about orthogonality? Another alternate eigenset. Why not these?

Questions? Comments?

Example: decomposition of a Cu²⁺ compound (ceramic body, reaction layer)

MicroXAS maps (panel labels: Cu 0 EB, Cu M, Cu X, Ca K)