1 Agenda Dimension reduction Principal component analysis (PCA)
Multi-dimensional scaling (MDS) Microarray visualization

2 Why Dimension Reduction
Computation: the complexity grows exponentially with the dimension. Visualization: projection of high-dimensional data to 2D or 3D. Interpretation: the intrinsic dimension may be small.

3 1. Dimension reduction Principal component analysis (PCA) Multi-dimensional Scaling (MDS)

4 Philosophy of PCA PCA is concerned with explaining the variance-covariance structure of a set of variables through a few linear combinations. We typically have a data matrix of n observations on p correlated variables x1, x2, …, xp. PCA looks for a transformation of the xi into p new variables yi that are uncorrelated. We want to represent x1, x2, …, xp with a few yi's without losing much information.

5 PCA Looking for a transformation of the data matrix X (n×p) such that
Y = a^T X = a1 X1 + a2 X2 + … + ap Xp, where a = (a1, a2, …, ap)^T is a column vector of weights with a1^2 + a2^2 + … + ap^2 = 1

6 Maximize the variance of the projection of the observations on the Y variables
Find a so that Var(a^T X) = a^T Var(X) a is maximal, where Var(X) is the covariance matrix of the Xi variables

7 Good Better

8 Eigen Vector and Eigen Value

9 PCA

10 Covariance matrix

11 And so.. We find that the direction of a is given by the eigenvector e1 corresponding to the largest eigenvalue of the matrix Σ. The second vector, the one orthogonal (uncorrelated) to the first with the second-highest variance, is the eigenvector corresponding to the second-largest eigenvalue. And so on …

12 So PCA gives New variables Yi that are linear combinations of the original variables (xi): Yi = ei1 x1 + ei2 x2 + … + eip xp; i = 1..p. The new variables Yi are derived in decreasing order of importance; they are called 'principal components'
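A short sketch (Python/NumPy; names illustrative) of forming the principal components Yi = ei1 x1 + … + eip xp and checking the two properties just stated: the Yi are uncorrelated, and their variances come out in decreasing order.

```python
# Sketch (illustrative names): compute PC scores Y = Xc E and check that the
# components are uncorrelated with variances in decreasing order.
import numpy as np

rng = np.random.default_rng(1)
X = rng.normal(size=(50, 4)) @ rng.normal(size=(4, 4))  # correlated x's
Xc = X - X.mean(axis=0)                   # center each variable
eigvals, E = np.linalg.eigh(np.cov(Xc, rowvar=False))
E = E[:, ::-1]                            # largest eigenvalue first
Y = Xc @ E                                # PC scores: Yi = ei1*x1 + ... + eip*xp

C = np.cov(Y, rowvar=False)
assert np.allclose(C, np.diag(np.diag(C)), atol=1e-8)   # Yi uncorrelated
```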

13

14 Scale before PCA PCA is sensitive to scale
PCA should be applied on data that have approximately the same scale in each variable
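A minimal illustration of this sensitivity (Python/NumPy, illustrative names): when one variable is measured on a much larger scale, it dominates the first principal component; standardizing to z-scores puts the variables on a common footing.

```python
# Sketch (illustrative): without standardization, a variable measured on a
# much larger scale dominates the first principal component.
import numpy as np

rng = np.random.default_rng(2)
X = rng.normal(size=(100, 2))
X[:, 0] *= 1000.0                         # first variable on a huge scale

def first_pc(M):
    w, V = np.linalg.eigh(np.cov(M, rowvar=False))
    return V[:, -1]                       # direction of largest variance

Z = (X - X.mean(axis=0)) / X.std(axis=0, ddof=1)  # z-scores: common scale
pc_raw = first_pc(X)
pc_std = first_pc(Z)
assert abs(pc_raw[0]) > 0.99              # PC1 is almost purely variable 1
```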

15

16 Johnson RA and Wichern DW. Applied Multivariate Statistical Analysis. Pearson Education, 2003

17 How many PCs to keep

18 SVD (singular value decomposition)
Johnson RA and Wichern DW. Applied multivariate Statistical Analysis. Pearson Education, 2003

19 SVD Berrar DP and Dubitzky GM. A Practical Approach to Microarray Data Analysis. Springer 2003.

20 SVD and PCA
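The connection on this slide can be sketched numerically (Python/NumPy; illustrative names): the SVD of the centered data matrix, Xc = U D V^T, yields the same principal directions and variances as the eigendecomposition of the covariance matrix, with eigenvalues d^2/(n-1).

```python
# Sketch (illustrative): SVD of the centered data matrix vs. the
# eigendecomposition of its covariance matrix.
import numpy as np

rng = np.random.default_rng(3)
X = rng.normal(size=(30, 5))
Xc = X - X.mean(axis=0)

U, d, Vt = np.linalg.svd(Xc, full_matrices=False)
var_from_svd = d**2 / (len(Xc) - 1)       # eigenvalues of the covariance

w, V = np.linalg.eigh(np.cov(Xc, rowvar=False))
assert np.allclose(np.sort(var_from_svd), np.sort(w), atol=1e-8)
# first right singular vector = leading eigenvector (up to sign)
assert np.allclose(np.abs(Vt[0]), np.abs(V[:, -1]), atol=1e-6)
```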

21 PCA application: genomic study
Population stratification: allele frequency differences between cases and controls due to systematic ancestry differences—can cause spurious associations in disease studies. PCA could be used to infer underlying population structure.

22 Figure 2 Nature Genetics 38, (2006) Principal components analysis corrects for stratification in genome-wide association studies Alkes L Price, Nick J Patterson, Robert M Plenge, Michael E Weinblatt, Nancy A Shadick & David Reich Figure 2. The top two axes of variation of European American samples. We applied our method to a data set of 488 European Americans genotyped on an Affymetrix platform containing 116,204 SNPs as part of an ongoing disease study (see Methods). We hypothesize that the first axis reflects genetic variation between northwest and southeast Europe, with a fraction of the samples showing southeast European ancestry (first axis < 0; see text). It follows that the second axis separates two southeast European subpopulations.

23 Chao Tian, Peter K. Gregersen and Michael F. Seldin. (2008) Accounting for ancestry: population substructure and genome-wide association studies.

24 1.1 PCA PCA, Case study Transcriptional regulation and function during the human cell cycle, Cho et al. (2001) Nature Genetics Vol 27, 48-54 -- to identify cell-cycle-regulated transcripts in human cells -- Primary fibroblasts prepared from human foreskin were grown to approximately 30% confluence and synchronized in late G1 using a double thymidine-block protocol. Cultures were then released from arrest, and cells were collected every 2 hours for 24 hours, covering nearly 2 complete cell cycles. -- Messenger RNA was isolated, labeled and hybridized to sets of arrays containing probes for approximately 40,000 human genes and non-overlapping ESTs. The entire synchronization experiment was carried out in duplicate under identical conditions for 6,800 genes on an Affymetrix array. The two data sets were averaged and analyzed using both supervised and unsupervised clustering of expression patterns.

25 1.1 PCA PCA, Case study Un-synchronized

26 1.1 PCA

27 1.1 PCA PCA projection: 387 genes in 13-dim space (time points) are projected to 2D space using correlation matrix; Gene phase 1: G1; 4: S; 2: G2; 3: M

28 1.1 PCA Variance in data explained by the first n principal components
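The quantity plotted on this slide can be sketched as follows (Python/NumPy; names illustrative): the cumulative proportion of total variance explained by the first n components is the running sum of eigenvalues over their total.

```python
# Sketch (illustrative): cumulative proportion of variance explained by the
# first n principal components, as shown in a scree-style plot.
import numpy as np

rng = np.random.default_rng(4)
X = rng.normal(size=(60, 5)) @ np.diag([3.0, 2.0, 1.0, 0.5, 0.1])
Xc = X - X.mean(axis=0)
w = np.linalg.eigvalsh(np.cov(Xc, rowvar=False))[::-1]  # decreasing
explained = np.cumsum(w) / w.sum()        # variance explained by first n PCs
```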

29 1.1 PCA The weights of the 13 principal directions

30 1.1 PCA PCA projection: 13 samples (time points) in 387-dim space (genes) are projected to 2D space using correlation matrix; Each sample is labeled by its time point

31 1.1 PCA Potential pitfalls of PCA:
Principal components do not always capture important information needed. PCA projection from 2D to 1D: Cluster information will be lost.
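The 2D-to-1D pitfall can be constructed explicitly (Python/NumPy; illustrative names): if two clusters are separated only along a low-variance axis, PC1 follows the high-variance axis and the 1D projection merges the clusters.

```python
# Illustrative pitfall: two clusters separated along a LOW-variance axis.
# PC1 tracks the high-variance axis, so projecting to 1D loses the clusters.
import numpy as np

rng = np.random.default_rng(5)
x = rng.normal(scale=10.0, size=200)                 # large spread, no clusters
y = np.repeat([-1.0, 1.0], 100) + rng.normal(scale=0.1, size=200)
X = np.column_stack([x, y])                          # cluster structure lives in y

Xc = X - X.mean(axis=0)
w, V = np.linalg.eigh(np.cov(Xc, rowvar=False))
pc1 = V[:, -1]                                       # first principal direction
scores_1d = Xc @ pc1                                 # PCA projection to 1D
assert abs(pc1[0]) > 0.99                            # PC1 ~ x-axis: y is discarded
```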

32 1.2 Multidimensional scaling (MDS)
Suppose we are given the distance structure of the following 10 cities, and we have no knowledge of the city locations/map of the US. Can we map these cities to a 2D space to best represent their distance structure?

33 1.2 Multidimensional scaling (MDS)
MDS deals with the following problem: for a set of observed similarities (or distances) between every pair of N items, find a representation of the items in few dimensions such that the inter-item proximities "nearly match" the original similarities (or distances). The numerical measure of how close the original distances are to the distances in the lower-dimensional representation is called the stress.
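A minimal sketch of classical (metric) MDS (Python/NumPy rather than R's cmdscale mentioned later; all names illustrative): recover low-dimensional coordinates from a pairwise distance matrix by double-centering the squared distances. Because these toy "cities" truly live in 2D, the 2D solution fits exactly and the stress is zero.

```python
# Sketch of classical (metric) MDS (illustrative): embed points in 2D from a
# pairwise distance matrix via double-centering.
import numpy as np

rng = np.random.default_rng(6)
P = rng.normal(size=(10, 2))                         # "true" city locations
D = np.linalg.norm(P[:, None, :] - P[None, :, :], axis=-1)  # distance matrix

n = len(D)
J = np.eye(n) - np.ones((n, n)) / n                  # centering matrix
B = -0.5 * J @ (D**2) @ J                            # double-centered Gram matrix
w, V = np.linalg.eigh(B)
coords = V[:, -2:] * np.sqrt(np.maximum(w[-2:], 0))  # top-2 embedding

D_hat = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
stress = np.sqrt(((D - D_hat)**2).sum() / (D**2).sum())  # ~0: exact 2D fit
```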

34 1.2 MDS

35 1.2 MDS

36 1.2 MDS Mapping to 3D is possible but more difficult to visualize and interpret.

37 1.2 MDS MDS attempts to map objects to a visible 2D or 3D Euclidean space. The goal is to best preserve the distance structure after the mapping. The original data can be high-dimensional or even from a non-metric space; the method only cares about the distance (dissimilarity) structure. The resulting mapping is not unique: any rotation or reflection of a mapping solution is also a solution. It can be shown that the results of PCA are exactly those of classical MDS if the distances calculated from the data matrix are Euclidean.
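The PCA-classical MDS equivalence stated above can be demonstrated directly (Python/NumPy; illustrative names): PCA scores of a data matrix and classical MDS coordinates computed from its Euclidean distance matrix agree up to sign flips of the axes.

```python
# Sketch (illustrative): PCA scores vs. classical MDS coordinates on the same
# data agree up to per-axis sign flips when the distances are Euclidean.
import numpy as np

rng = np.random.default_rng(7)
X = rng.normal(size=(15, 4))
Xc = X - X.mean(axis=0)

# PCA: top-2 principal component scores
w, V = np.linalg.eigh(np.cov(Xc, rowvar=False))
pca_scores = Xc @ V[:, -2:]

# Classical MDS on the Euclidean distance matrix of the same data
D = np.linalg.norm(Xc[:, None, :] - Xc[None, :, :], axis=-1)
n = len(D)
J = np.eye(n) - np.ones((n, n)) / n
B = -0.5 * J @ (D**2) @ J
wb, Vb = np.linalg.eigh(B)
mds_coords = Vb[:, -2:] * np.sqrt(np.maximum(wb[-2:], 0))

assert np.allclose(np.abs(pca_scores), np.abs(mds_coords), atol=1e-6)
```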

38 PCA vs. MDS
Input data -- PCA: data matrix (S subjects in G dimensions). MDS: dissimilarity structure (distance between any pair of subjects).
Method -- PCA: "project" subjects to a low-dimensional space, preserving as much variance as possible. MDS: find a low-dimensional space that best keeps the dissimilarity structure.
Restrictions -- PCA: data have to be in Euclidean space. MDS: flexible to any data structure as long as the dissimilarity structure can be defined.
Pros and cons -- PCA: the PCs can be further used to model in downstream analyses, and a new subject can be similarly projected. MDS: flexibility and visualization, but if a new subject is added, it can't be shown in an existing MDS solution.

39 2. Microarray visualization
Data matrix Data: X = {xij} (n×d), an n (genes) × d (samples) matrix.

40 2. Microarray visualization
Heatmap Log-ratio of the target sample to reference sample. log(target/reference) Gradient color of RED: positive; GREEN: negative; BLACK: 0. LIGHT GREY: missing value.
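The color mapping above is driven by a simple computation; a sketch (Python/NumPy; the slide does not fix a log base, so log2, the common microarray convention, is an assumption here, and all values are made up):

```python
# Sketch (illustrative values; log2 base is an assumption): the log-ratios
# that determine heatmap colors.
import numpy as np

target = np.array([2.0, 0.5, 1.0])       # expression in the target sample
reference = np.array([1.0, 1.0, 1.0])    # expression in the reference sample
ratio = np.log2(target / reference)      # >0 -> RED, <0 -> GREEN, 0 -> BLACK
```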

41 2. Microarray visualization
Treeview software developed by Mike Eisen

42 3. Software for dimension reduction & visualization
PCA in R: prcomp (stats) -- Principal Components Analysis (preferred); princomp (stats) -- Principal Components Analysis; screeplot (stats) -- Screeplot of PCA Results
PCA in IMSL (a commercial C library)
MDS in R: isoMDS (MASS) -- Kruskal's Non-metric Multidimensional Scaling; cmdscale (stats) -- Classical (Metric) Multidimensional Scaling; sammon (MASS) -- Sammon's Non-Linear Mapping
MDS: Various software and resources about MDS
Heatmap visualization: Treeview

