1 Object Orie’d Data Analysis, Last Time: Gene Cell Cycle Data; Microarrays and HDLSS visualization; DWD bias adjustment; NCI 60 Data. Today: More NCI 60 Data & detailed (math’cal) look at PCA

2 Last Time: Checked Data Combo, using DWD Dir’ns

3 DWD Views of NCI 60 Data. Interesting Question: Which clusters are really there? Issues: DWD is great at finding dir’ns of separation, and will do so even if there is no real structure. Is this happening here? Or: which clusters are important? What does “important” mean?

4 Real Clusters in NCI 60 Data. Simple Visual Approach: Randomly relabel data (Cancer Types), recompute DWD dir’ns & visualization, and get a heuristic impression from this (see the sketch below). Deeper Approach: Formal Hypothesis Testing (done later)
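A minimal sketch of the relabelling idea, under assumptions: the data here are hypothetical, and since DWD itself is not in standard Python libraries, a simple mean-difference direction stands in for the DWD dir’n.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical stand-in for the microarray data: n samples, d genes,
# two labeled cancer types, and NO real class structure in the data.
n, d = 60, 500
X = rng.normal(size=(n, d))
labels = np.array([0] * 30 + [1] * 30)

def separation_direction(X, lab):
    """Unit vector from class-0 mean to class-1 mean (stand-in for DWD)."""
    w = X[lab == 1].mean(axis=0) - X[lab == 0].mean(axis=0)
    return w / np.linalg.norm(w)

# Real labels, then several random relabellings: find the separating
# dir'n for each labelling and measure the in-sample projection gap.
for name in ["real labels"] + [f"relabelling #{k}" for k in range(1, 5)]:
    lab = labels if name == "real labels" else rng.permutation(labels)
    proj = X @ separation_direction(X, lab)
    gap = proj[lab == 1].mean() - proj[lab == 0].mean()
    print(f"{name}: projected mean gap = {gap:.2f}")
```

Since this toy data has no structure, every labelling (real or random) shows a similar gap; with the real NCI 60 data, a cluster is credible only where the real-label separation clearly exceeds the relabelled baseline.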

5 Random Relabelling #1

6 Random Relabelling #2

7 Random Relabelling #3

8 Random Relabelling #4

9 Revisit Real Data

10 Revisit Real Data (Cont.) Heuristic Results: Strong Clust’s: Melanoma, Leukemia, Renal. Weak Clust’s: CNS, Ovarian, Colon. Not Clust’s: NSCLC, Breast. Later: will find a way to quantify these ideas, i.e. develop statistical significance

11 NCI 60 Controversy: Can NCI 60 Data be normalized? Negative Indication: Kuo, et al (2002) Bioinformatics, 18, 405-412, based on Gene by Gene Correlations. Resolution: Gene by Gene Data View vs. Multivariate Data View

12 Resolution of Paradox: Toy Data, Gene View

13 Resolution: Correlations suggest “no chance”

14 Resolution: Toy Data, PCA View

15 Resolution: PCA & DWD direct’ns

16 Resolution: DWD Adjusted

17 Resolution: DWD Adjusted, PCA view

18 Resolution: DWD Adjusted, Gene view

19 Resolution: Correlations & PC1 Projection Correl’n

20 Needed: final verification of Cross-platform Normal’n. Is statistical power actually improved? Will study later

21 DWD: Why does it work? Rob Tibshirani’s Query: Do we really need that complicated stuff? (DWD is complex.) Can’t we just use means? Empirical Fact (Joel Parker): DWD is better than simple methods

22 DWD: Why does it work? Xuxin Liu’s Observation: the key is unbalanced sub-sample sizes (e.g. biological subtypes). Mean methods are strongly affected; DWD is much more robust. Toy Example:

23 DWD: Why does it work?

24 Xuxin Liu Example Goals: bring colors together; keep symbols distinct (interesting biology). Study varying sub-sample proportions (see the sketch below): Ratio = 1: both methods great. Ratio = 0.61: Mean degrades, DWD good. Ratio = 0.35: Mean poor, DWD still OK. Ratio = 0.11: DWD degraded, but still better. Later: will find underlying theory
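A hedged toy sketch of why unbalanced subtypes break mean-based adjustment. All numbers are hypothetical, and only the mean method is shown (DWD is not in standard Python libraries):

```python
import numpy as np

rng = np.random.default_rng(1)
d = 50
s = np.zeros(d); s[0] = 5.0          # subtype (symbol) separation dir'n
batch = np.full(d, 2.0)              # platform (color) shift to remove

def platform(n_a, n_b, shift):
    """Hypothetical platform: n_a subtype-A plus n_b subtype-B samples."""
    return np.vstack([rng.normal(size=(n_a, d)) + shift,
                      rng.normal(size=(n_b, d)) + s + shift])

def mean_adjust(X, target):
    """Mean-based adjustment: match overall platform means."""
    return X - X.mean(axis=0) + target.mean(axis=0)

X1 = platform(25, 25, 0.0)           # reference platform, balanced subtypes

for n_a, n_b in [(25, 25), (45, 5)]:          # ratio = 1 vs ratio ~ 0.11
    X2 = mean_adjust(platform(n_a, n_b, batch), X1)
    # After a good adjustment, subtype-A samples from both platforms
    # should line up; measure the residual gap in the subtype dir'n.
    gap = X2[:n_a].mean(axis=0)[0] - X1[:25].mean(axis=0)[0]
    print(f"ratio {n_b / n_a:.2f}: subtype-A residual gap = {gap:.2f}")
```

With balanced subtypes the platform mean is pure batch effect, so subtracting it is harmless; at ratio 0.11 the platform mean also contains biology, so subtracting it pushes matching subtypes apart.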

25 PCA: Rediscovery – Renaming. Statistics: Principal Component Analysis (PCA). Social Sciences: Factor Analysis (PCA is a subset). Probability / Electrical Eng.: Karhunen–Loève expansion. Applied Mathematics: Proper Orthogonal Decomposition (POD). Geo-Sciences: Empirical Orthogonal Functions (EOF)

26 An Interesting Historical Note: the 1st (?) application of PCA to Functional Data Analysis: Rao, C. R. (1958) Some statistical methods for comparison of growth curves, Biometrics, 14, 1-17. The 1st paper with the “Curves as Data” viewpoint

27 Detailed Look at PCA. Three important (and interesting) viewpoints: 1. Mathematics, 2. Numerics, 3. Statistics. 1st: review linear alg. and multivar. prob.

28 Review of Linear Algebra. Vector Space: set of “vectors”, $x$, and “scalars” (coefficients), $a$, “closed” under “linear combination” ($\sum_j a_j x_j$ in space), e.g. $\mathbb{R}^d$ = “$d$ dim Euclid’n space”

29 Review of Linear Algebra (Cont.) Subspace: subset that is again a vector space, i.e. closed under linear combination. E.g. lines through the origin; planes through the origin; the subsp. “generated by” a set of vectors (all linear combos of them = containing hyperplane through origin)

30 Review of Linear Algebra (Cont.) Basis of subspace: set of vectors that: span, i.e. everything is a lin. com. of them; are linearly indep’t, i.e. the lin. com. is unique. E.g. the “unit vector basis” $e_1 = (1, 0, \ldots, 0)^T, \ldots, e_d = (0, \ldots, 0, 1)^T$, since $x = \sum_j x_j e_j$

31 Review of Linear Algebra (Cont.) Basis Matrix, of a subspace of $\mathbb{R}^d$: given a basis $v_1, \ldots, v_n$, create the matrix of columns: $B = [\, v_1 \; v_2 \; \cdots \; v_n \,]$ (size $d \times n$)

32 Review of Linear Algebra (Cont.) Then a “linear combo” is a matrix multiplicat’n: $\sum_j a_j v_j = B a$, where $a = (a_1, \ldots, a_n)^T$. Check sizes: $(d \times 1) = (d \times n)(n \times 1)$
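A minimal numpy sketch of this identity, using a hypothetical basis of a subspace of $\mathbb{R}^3$:

```python
import numpy as np

# Basis matrix B: columns are basis vectors v1, v2 of a subspace of R^3.
B = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [1.0, 1.0]])
a = np.array([2.0, -1.0])               # coefficient vector

# Linear combination two ways: explicit sum vs. matrix multiplication.
combo = a[0] * B[:, 0] + a[1] * B[:, 1]
print(np.allclose(combo, B @ a))        # True: sum_j a_j v_j = B a
print(B.shape, a.shape, (B @ a).shape)  # (3, 2) (2,) (3,): sizes check
```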

33 Review of Linear Algebra (Cont.) Aside on matrix multiplication (composition of linear transformat’ns): for matrices $A$ ($m \times n$) and $B$ ($n \times p$), define the “matrix product” $(AB)_{ik} = \sum_j a_{ij} b_{jk}$ (“inner products” of rows of $A$ with columns of $B$). Often useful to check sizes: $(m \times p) = (m \times n)(n \times p)$

34 Review of Linear Algebra (Cont.) Matrix trace: For a square matrix $A$ ($d \times d$), define $\operatorname{tr}(A) = \sum_j a_{jj}$. Trace commutes with matrix multiplication: $\operatorname{tr}(AB) = \operatorname{tr}(BA)$
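A quick numpy check with random rectangular matrices (note that $AB$ and $BA$ need not even have the same size):

```python
import numpy as np

rng = np.random.default_rng(2)
A = rng.normal(size=(3, 5))
B = rng.normal(size=(5, 3))

# AB is 3x3 and BA is 5x5, yet their traces agree: tr(AB) = tr(BA).
print(np.isclose(np.trace(A @ B), np.trace(B @ A)))   # True
```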

35 Review of Linear Algebra (Cont.) Dimension of subspace (a notion of “size”): number of elements in a basis (unique). For $\mathbb{R}^d$: $\dim(\mathbb{R}^d) = d$ (use the basis above). E.g. dim of a line is 1; dim of a plane is 2. Dimension is “degrees of freedom”

36 Review of Linear Algebra (Cont.) Norm of a vector: in $\mathbb{R}^d$, $\|x\| = \left( \sum_j x_j^2 \right)^{1/2}$. Idea: “length” of the vector. Note: strange properties for high $d$, e.g. “length of diagonal of unit cube” = $\sqrt{d}$

37 Review of Linear Algebra (Cont.) Norm of a vector (cont.): “length normalized vector”: $x / \|x\|$ (has length one, thus on the surf. of the unit sphere & is a direction vector). Get “distance” as: $d(x, y) = \|x - y\|$
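A short numpy illustration of norms, length normalization, distance, and the $\sqrt{d}$ diagonal (all numbers hypothetical):

```python
import numpy as np

x = np.array([3.0, 4.0])
print(np.linalg.norm(x))              # 5.0, the "length" of x
u = x / np.linalg.norm(x)             # length-normalized direction vector
print(np.linalg.norm(u))              # 1.0: on the unit sphere

y = np.array([0.0, 0.0])
print(np.linalg.norm(x - y))          # distance d(x, y) = ||x - y||

# High-dimensional strangeness: the unit cube diagonal has length sqrt(d).
for d in (2, 100, 10_000):
    print(d, np.linalg.norm(np.ones(d)))   # sqrt(2), 10.0, 100.0
```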

38 Review of Linear Algebra (Cont.) Inner (dot, scalar) product: for vectors $x$ and $y$, $\langle x, y \rangle = x^T y = \sum_j x_j y_j$; related to the norm via $\|x\| = \langle x, x \rangle^{1/2}$

39 Review of Linear Algebra (Cont.) Inner (dot, scalar) product (cont.): measures the “angle between $x$ and $y$” as: $\theta = \cos^{-1}\!\left( \frac{\langle x, y \rangle}{\|x\| \, \|y\|} \right)$. Key to “orthogonality”, i.e. “perpendicul’ty”: $x \perp y$ if and only if $\langle x, y \rangle = 0$
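A numpy sketch of the inner product, the angle formula, and orthogonality (toy vectors):

```python
import numpy as np

x = np.array([1.0, 0.0])
y = np.array([1.0, 1.0])

# Inner product, and the norm recovered from it: ||x|| = <x, x>^(1/2).
print(np.dot(x, y))                                # 1.0
print(np.sqrt(np.dot(x, x)) == np.linalg.norm(x))  # True

# Angle between x and y via the inner product.
cos_theta = np.dot(x, y) / (np.linalg.norm(x) * np.linalg.norm(y))
print(np.degrees(np.arccos(cos_theta)))            # 45.0 (up to rounding)

# Orthogonality: perpendicular vectors have inner product 0.
print(np.dot(x, np.array([0.0, 1.0])))             # 0.0
```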

40 Review of Linear Algebra (Cont.) Orthonormal basis $b_1, \ldots, b_n$: all ortho to each other, i.e. $\langle b_i, b_j \rangle = 0$ for $i \ne j$; all have length 1, i.e. $\|b_j\| = 1$ for $j = 1, \ldots, n$

41 Review of Linear Algebra (Cont.) Orthonormal basis (cont.): “Spectral Representation”: $x = \sum_j c_j b_j$, where $c_j = \langle x, b_j \rangle$. Check: $\langle x, b_k \rangle = \sum_j c_j \langle b_j, b_k \rangle = c_k$. Matrix notation: $x = Bc$, where $c = B^T x$; i.e. $c$ is called the “transform (e.g. Fourier, wavelet) of $x$”

42 Review of Linear Algebra (Cont.) Parseval identity, for $x$ in the subsp. gen’d by the o.n. basis $b_1, \ldots, b_n$: $\|x\|^2 = \sum_j \langle x, b_j \rangle^2 = \|c\|^2$. Pythagorean theorem; “Decomposition of Energy”; ANOVA sums of squares. The transform, $c$, has the same length as $x$, i.e. the transform is a “rotation in $\mathbb{R}^n$”
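A numpy sketch of the spectral representation and the Parseval identity, using a hypothetical orthonormal basis of $\mathbb{R}^2$ (the unit vector basis rotated by 45 degrees):

```python
import numpy as np

# Orthonormal basis of R^2: columns b1, b2 are a rotated unit vector basis.
B = np.array([[1.0, -1.0],
              [1.0,  1.0]]) / np.sqrt(2)
x = np.array([3.0, 4.0])

# Transform: coefficients c_j = <x, b_j>, i.e. c = B^T x.
c = B.T @ x
print(np.allclose(B @ c, x))           # True: spectral representation x = Bc

# Parseval identity: the transform preserves length ("energy").
print(np.linalg.norm(c), np.linalg.norm(x))   # both 5.0
```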

43 Review of Linear Algebra (Cont.) Gram-Schmidt Ortho-normalization. Idea: Given a basis, find an orthonormal version, by subtracting the non-ortho part
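A minimal sketch of classical Gram-Schmidt; the function name and toy basis are illustrative, not from the slides:

```python
import numpy as np

def gram_schmidt(V):
    """Orthonormalize the columns of V by subtracting non-ortho parts."""
    B = np.zeros_like(V, dtype=float)
    for j in range(V.shape[1]):
        v = V[:, j].astype(float)
        for k in range(j):                 # subtract proj'ns on earlier b_k
            v = v - np.dot(B[:, k], v) * B[:, k]
        B[:, j] = v / np.linalg.norm(v)    # normalize to length 1
    return B

V = np.array([[1.0, 1.0],
              [0.0, 1.0],
              [1.0, 0.0]])
B = gram_schmidt(V)
print(np.allclose(B.T @ B, np.eye(2)))     # True: columns are orthonormal
```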

44 Review of Linear Algebra (Cont.) Projection of a vector $x$ onto a subspace $V$: Idea: the member of $V$ that is closest to $x$ (i.e. the “approx’n”). Find $P_V x \in V$ that solves: $\min_{v \in V} \|x - v\|$ (“least squares”). For an inner product (Hilbert) space: exists and is unique

45 Review of Linear Algebra (Cont.) Projection of a vector onto a subspace (cont.): General solution in $\mathbb{R}^d$: for basis matrix $B$, $P_V x = B (B^T B)^{-1} B^T x$. So the “proj’n operator” is “matrix mult’n”: $P_V = B (B^T B)^{-1} B^T$ (thus projection is another linear operation) (note the same operation underlies least squares)
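A numpy sketch of the projection operator formula on a toy subspace. (In numerical practice np.linalg.lstsq or a QR factorization is preferred to an explicit inverse; the inverse form is used here only to mirror the formula above.)

```python
import numpy as np

# Basis matrix of a 2-dim subspace of R^3 (columns need not be orthonormal).
B = np.array([[1.0, 1.0],
              [0.0, 1.0],
              [0.0, 0.0]])
x = np.array([1.0, 2.0, 3.0])

# Proj'n operator P_V = B (B^T B)^{-1} B^T, the least-squares solution.
P = B @ np.linalg.inv(B.T @ B) @ B.T
print(P @ x)                           # [1. 2. 0.]: closest point in V to x
print(np.allclose(P @ P, P))           # True: projecting twice changes nothing
```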

46 Review of Linear Algebra (Cont.) Projection using an orthonormal basis $b_1, \ldots, b_n$: the basis matrix is “orthonormal”: $B^T B = I_n$. So $P_V x = B (B^T B)^{-1} B^T x = B B^T x = \sum_j \langle x, b_j \rangle b_j$ = Recon(Coeffs of $x$ “in dir’n $b_j$”)

47 Review of Linear Algebra (Cont.) Projection using an orthonormal basis (cont.): For the “orthogonal complement”, $V^{\perp}$, $x = P_V x + P_{V^{\perp}} x$ and $\|x\|^2 = \|P_V x\|^2 + \|P_{V^{\perp}} x\|^2$. Parseval inequality: $\|P_V x\|^2 = \sum_j \langle x, b_j \rangle^2 \le \|x\|^2$
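A numpy sketch of the simplified orthonormal-basis projection, the orthogonal complement decomposition, and the Parseval inequality (toy numbers):

```python
import numpy as np

# Orthonormal basis of a 1-dim subspace V of R^3.
B = np.array([[1.0],
              [1.0],
              [0.0]]) / np.sqrt(2)
x = np.array([1.0, 2.0, 3.0])

# With B^T B = I, the proj'n simplifies to B B^T x.
pv = B @ (B.T @ x)
pv_perp = x - pv                       # projection onto V-perp
print(np.allclose(pv + pv_perp, x))    # True: x decomposes over V and V-perp

# Pythagoras, hence the Parseval inequality ||P_V x||^2 <= ||x||^2.
print(np.linalg.norm(pv)**2 + np.linalg.norm(pv_perp)**2,
      np.linalg.norm(x)**2)            # both 14.0
```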

48 Review of Linear Algebra (Cont.) (Real) Unitary Matrices: $U$ ($d \times d$) with $U^T U = U U^T = I_d$, i.e. an orthonormal basis matrix (so all of the above applies). Follows that $U^{-1} = U^T$ (since $U$ has full rank, so $U^{-1}$ exists …). The lin. trans. (mult. by $U$) is like a “rotation” of $\mathbb{R}^d$, but also includes “mirror images”

49 Review of Linear Algebra (Cont.) Singular Value Decomposition (SVD): For a matrix $X$ ($d \times n$): find a diagonal matrix $S$ ($d \times n$), with entries $s_1, s_2, \ldots$ called singular values, and unitary (rotation) matrices $U$ ($d \times d$) and $V$ ($n \times n$) (recall $U^T U = I_d$, $V^T V = I_n$), so that $X = U S V^T$

50 Review of Linear Algebra (Cont.) Intuition behind Singular Value Decomposition: For the “linear transf’n” (via matrix multi’n) $x \mapsto Xx = U S V^T x$: first rotate (by $V^T$); second rescale the coordinate axes (by $S$); third rotate again (by $U$). I.e. the SVD has diagonalized the transformation
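A numpy sketch of the SVD (random matrix; np.linalg.svd returns the singular values in decreasing order, matching the labeling on the next slide):

```python
import numpy as np

rng = np.random.default_rng(3)
X = rng.normal(size=(4, 3))

# Full SVD: U is 4x4, s holds the singular values, Vt is V^T (3x3);
# full_matrices=False would give the compact truncated form instead.
U, s, Vt = np.linalg.svd(X, full_matrices=True)

S = np.zeros((4, 3))                  # rebuild the d x n diagonal S
S[:3, :3] = np.diag(s)
print(np.allclose(U @ S @ Vt, X))     # True: X = U S V^T
print(np.allclose(U.T @ U, np.eye(4)),
      np.allclose(Vt @ Vt.T, np.eye(3)))   # True True: rotations
print(s)                              # singular values, decreasing order
```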

51 Review of Linear Algebra (Cont.) SVD Compact Representation: Useful labeling: singular values in decreasing order, $s_1 \ge s_2 \ge \cdots \ge 0$. Note: singular values = 0 can be omitted. Let $r$ = # of positive singular values. Then: $X = \tilde{U} \tilde{S} \tilde{V}^T$, where $\tilde{U}$ ($d \times r$), $\tilde{S}$ ($r \times r$), $\tilde{V}$ ($n \times r$) are truncations of $U$, $S$, $V$

52 Review of Linear Algebra (Cont.) Eigenvalue Decomposition: For a (symmetric) square matrix $A$ ($d \times d$): find a diagonal matrix $D = \operatorname{diag}(\lambda_1, \ldots, \lambda_d)$ and an orthonormal matrix $B$ (i.e. $B^T B = B B^T = I_d$), so that: $AB = BD$, i.e. $A = B D B^T$

53 Review of Linear Algebra (Cont.) Eigenvalue Decomposition (cont.): Relation to the Singular Value Decomposition (looks similar?): The eigenvalue decomposition is “harder”, since it needs $d = n$ (a square matrix). The price is that the eigenvalue decomp’n is generally complex valued; except for square and symmetric $A$: then the eigenvalue decomp. is real valued. Thus, for symmetric nonnegative definite $A$, $A = B D B^T$ is the sing’r value decomp. with: $U = V = B$ and $S = D$
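A numpy sketch of the eigenvalue decomposition and its relation to the SVD, using a hypothetical symmetric nonnegative definite matrix (built as $AA^T$ so the relation holds exactly):

```python
import numpy as np

rng = np.random.default_rng(4)
A = rng.normal(size=(3, 3))
A = A @ A.T                           # symmetric and nonnegative definite

# eigh is for symmetric matrices: real eigenvalues, orthonormal B.
lam, B = np.linalg.eigh(A)
print(np.allclose(B @ np.diag(lam) @ B.T, A))   # True: A = B D B^T

# For this nonnegative definite A, the eigendecomposition is the SVD
# (up to ordering and signs): the singular values equal the eigenvalues.
U, s, Vt = np.linalg.svd(A)
print(np.allclose(np.sort(s), np.sort(lam)))    # True
```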

