An Interesting Question

Presentation on theme: "An Interesting Question" — Presentation transcript:

1 An Interesting Question
How generally applicable is the Backwards approach to PCA?
An Attractive Answer: James Damon, UNC Mathematics
Key Idea: Express Backwards PCA as a Nested Series of Constraints

2 General View of Backwards PCA
Define Nested Spaces via Constraints
E.g. SVD: $S_k = \left\{ x : x = \sum_{j=1}^{k} c_j u_j \right\}$
Now Define: $S_{k-1} = \left\{ x \in S_k : \langle x, u_k \rangle = 0 \right\}$
Constraint Gives Nested Reduction of Dim'n
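The constraint view above can be sketched numerically in the Euclidean SVD case (a toy illustration with random data, not from the slides): starting from a point of $S_k$, imposing the single constraint $\langle x, u_k \rangle = 0$ lands the point exactly in $S_{k-1}$.

```python
import numpy as np

# Toy sketch: nested subspaces S_k from the SVD (Euclidean case).
# S_k = span(u_1, ..., u_k); the single constraint <x, u_k> = 0
# cuts S_k down to S_{k-1}.
rng = np.random.default_rng(0)
X = rng.standard_normal((50, 5))              # 50 data vectors in R^5
X = X - X.mean(axis=0)                        # center
_, _, Vt = np.linalg.svd(X, full_matrices=False)
u = Vt.T                                      # columns u_j: right singular vectors

k = 3
# A point of S_k: any combination of u_1, ..., u_k
x = u[:, :k] @ rng.standard_normal(k)
# Impose <x, u_k> = 0 by removing the u_k component
x_constrained = x - np.dot(x, u[:, k - 1]) * u[:, k - 1]

# x_constrained now lies in S_{k-1} = span(u_1, ..., u_{k-1})
resid = x_constrained - u[:, :k - 1] @ (u[:, :k - 1].T @ x_constrained)
print(np.allclose(resid, 0.0))                # True: one constraint, one less dim
```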

3 Vectors of Angles
Vectors of Angles as Data Objects: $(\theta_1, \ldots, \theta_d)^T \in (S^1)^d$
Slice space $(S^1)^d$ with hyperplanes???? (à la Principal Nested Spheres)

4 Vectors of Angles
E.g. $d = 2$, Data w/ "Single Mode of Var'n"
Best Fitting Planar Slice gives Bimodal Dist'n
Special Thanks to Eduardo García-Portugués

5 Torus Space $S^1 \times S^1$
Try To Fit A Geodesic
Challenge: Can Get Arbitrarily Close

6 Torus Space $S^1 \times S^1$
Fit Nested Sub-Manifold

7 PNS Main Idea
Data Objects: $X_1, \ldots, X_n \in S_d \subseteq \mathbb{R}^{d+1}$, where $S_d$ is a $d$-dimensional manifold
Consider a nested series of sub-manifolds: $\mathcal{S} = \{S_0, \ldots, S_{d-1}\}$, where for $j = 0, \ldots, d-1$: $\dim S_j = j$ and $S_j \subseteq S_{j+1}$
Goal: Fit all of $\mathcal{S}$ simultaneously to $X_1, \ldots, X_n$

8 General Background
Call each $S_j$ a stratum, so $\mathcal{S}$ is a manifold stratification, to be fit to $X_1, \ldots, X_n$
New Approach: Simultaneously fit $\mathcal{S} = \{S_0, \ldots, S_{d-1}\}$
Nested Submanifold (NS)

9 Projection Notation
For $k = 0, \ldots, d-1$ let $P^{(k)}$ denote the telescoping projection onto $S_k$, i.e. for $X \in S_d$: $P^{(k)}(X) = P_k P_{k+1} \cdots P_{d-1} X$
Note: This projection is fundamental to Backwards PCA methods
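In the Euclidean PCA case the telescoping projection can be sketched concretely (a toy illustration, not from the slides): each one-step projection drops one singular direction, and the composition agrees with direct orthogonal projection onto $S_k$.

```python
import numpy as np

# Toy sketch: telescoping projection P^(k) = P_k P_{k+1} ... P_{d-1}
# in the Euclidean PCA case.  With 0-based columns u[:, 0..d-1],
# one step P_step(x, j) removes the component along u[:, j],
# projecting span(u_0..u_j) down to span(u_0..u_{j-1}).
rng = np.random.default_rng(1)
X = rng.standard_normal((40, 4))
X = X - X.mean(axis=0)
_, _, Vt = np.linalg.svd(X, full_matrices=False)
u = Vt.T
d = 4

def P_step(x, j):
    # One-step projection: remove the u_j component
    return x - np.dot(x, u[:, j]) * u[:, j]

def P_telescoping(x, k):
    # Compose the one-step projections from the top stratum down to S_k
    for j in range(d - 1, k - 1, -1):
        x = P_step(x, j)
    return x

x = rng.standard_normal(4)
k = 2
direct = u[:, :k] @ (u[:, :k].T @ x)      # direct orthogonal projection onto S_k
print(np.allclose(P_telescoping(x, k), direct))  # True
```

Because the removed directions are orthonormal, the order of the one-step projections does not matter here; on curved spaces the telescoping order is essential.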

10 PNS Components
For a given $\mathcal{S}$, represent a point $X \in S_d$ by its Nested Submanifold components $c_1(X), \ldots, c_d(X)$, where for $j = 1, \ldots, d$: $c_j(X) = P^{(j)}(X) - P^{(j-1)}(X)$
In the sense that "$A - B$" means the shortest geodesic arc between $A$ & $B$

11 Nested Submanifold Fits
Simultaneous Fit Criteria? Based on Stratum-Wise Sums of Squares
For $j = 1, \ldots, d$ define: $SS_j = \sum_{i=1}^{n} d\left( P^{(j)}(X_i),\, P^{(j-1)}(X_i) \right)^2$
Uses "lengths" of NS Components: $c_j(X) = P^{(j)}(X) - P^{(j-1)}(X)$

12 NS Components in $\mathbb{R}^2$
NS Candidate 2 (Shifted $S_0$ to Sample Mean)
Note: Both $SS_1$ & $SS_2$ Decrease

13 NS Components in $\mathbb{R}^2$
NS based On PC1
Note: $SS_1 \uparrow$, $SS_2 \downarrow$
Yet $SS_1 + SS_2$ is Constant (Pythagorean Thm)

14 NS Components in $\mathbb{R}^2$
NS based On PC2
Note: $SS_1 \downarrow$, $SS_2 \uparrow$
$SS_1 + SS_2$ is Constant (Pythagorean Thm)

15 NS Components in $\mathbb{R}^2$
NS Candidate 1

16 NS Components in $\mathbb{R}^2$
NS Candidate 2: $0.3 \times SS_1 + \cdots \times SS_2 \downarrow$

17 NS Components in $\mathbb{R}^2$
NS based On PC1: $0.3 \times SS_1 + \cdots \times SS_2 \downarrow$

18 NS Components in $\mathbb{R}^2$
NS based On PC2: $0.3 \times SS_1 + \cdots \times SS_2 \uparrow$

19 Nested Submanifold Fits
Simultaneously fit $\mathcal{S} = \{S_0, \ldots, S_{d-1}\}$
Simultaneous Fit Criterion? $w_1 SS_1 + w_2 SS_2 + \cdots + w_d SS_d$
Above Suggests Want: $w_1 < w_2 < \cdots < w_d$
Works for Euclidean PCA (?)

20 Nested Submanifold Fits
Simultaneous Fit Criterion? $w_1 SS_1 + w_2 SS_2 + \cdots + w_d SS_d$
Above Suggests Want: $w_1 < w_2 < \cdots < w_d$
Important Predecessor: Pennec (2016)
AUC Criterion: $w_j \propto (j - 1)$

21 Pennec's Area Under the Curve
Based on Scree Plot
(Figure: scree plot of $SS_1, \ldots, SS_4$ vs. Component Index 1-4; vertical axis up to 100%)

22 Pennec's Area Under the Curve
Based on Cumulative Scree Plot
(Figure: cumulative scree plot of $SS_1, \ldots, SS_4$ vs. Component Index 1-4; vertical axis up to 100%)

23 Pennec's Area Under the Curve
Based on Cumulative Scree Plot
Area $= SS_2 + 2\,SS_3 + 3\,SS_4$
(Figure: cumulative scree plot of $SS_1, \ldots, SS_4$ vs. Component Index 1-4; vertical axis up to 100%)
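The AUC formula above is just a weighted sum with weights $(j-1)$, which can be verified with toy scree values (the numbers below are illustrative, not from the slides):

```python
import numpy as np

# Toy sketch: Pennec's AUC criterion as a weighted sum of the
# stratum-wise sums of squares.  For d = 4 components,
# Area = SS_2 + 2*SS_3 + 3*SS_4, i.e. weights w_j proportional
# to (j - 1) -- component 1 gets weight 0.
ss = np.array([10.0, 5.0, 3.0, 1.0])      # SS_1..SS_4 (made-up values)
weights = np.arange(len(ss))              # (j - 1) = 0, 1, 2, 3
area = np.dot(weights, ss)
print(area)                               # 0*10 + 1*5 + 2*3 + 3*1 = 14.0
```

Because later components carry larger weights, minimizing the area pushes variation into the earliest components, matching the $w_1 < w_2 < \cdots < w_d$ requirement of the previous slide.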

24 Torus Space $S^1 \times S^1$
Fit Nested Sub-Manifold
Choice of $w_1$ & $w_2$ in: $w_1 SS_1 + w_2 SS_2$ ???

25 Torus Space $S^1 \times S^1 \times \cdots \times S^1$
Tiled $(-\pi, \pi]^d$ embedding is complicated (maybe OK for low rank approx.)
Instead Consider Nested Sub-Tori
Work in Progress with García, Wood, Le
Key Factor: Important Modes of Variation

26 OODA Big Picture
New Topic: Curve Registration
Main Reference: Srivastava et al (2011)

27 Collaborators Anuj Srivastava (Florida State U.)
Wei Wu (Florida State U.) Derek Tucker (Florida State U.) Xiaosun Lu (U. N. C.) Inge Koch (U. Adelaide) Peter Hoffmann (U. Adelaide) J. O. Ramsay (McGill U.) Laura Sangalli (Milano Polytech.)

28 Context Functional Data Analysis Curves as Data Objects Toy Example:

29 Context Functional Data Analysis Curves as Data Objects Toy Example:
How Can We Understand Variation?

30 Context Functional Data Analysis Curves as Data Objects Toy Example:
How Can We Understand Variation?

31 Context Functional Data Analysis Curves as Data Objects Toy Example:
How Can We Understand Variation?

32 Functional Data Analysis
Insightful Decomposition

33 Functional Data Analysis
Insightful Decomposition: Horiz'l Var'n

34 Functional Data Analysis
Insightful Decomposition: Vertical Variation, Horiz'l Var'n

35 Challenge
Fairly Large Literature
Many (Diverse) Past Attempts
Limited Success (in General)
Surprisingly Slippery (even mathematical formulation)

36 Challenge (Illustrated)
Thanks to Wei Wu

37 Challenge (Illustrated)
Thanks to Wei Wu

38 Functional Data Analysis
Appropriate Mathematical Framework? Vertical Variation, Horiz'l Var'n

39 Landmark Based Shape Analysis
Approach: Identify objects that are Translations, Rotations, Scalings of each other
Mathematics: Equivalence Relation
Results in: Equivalence Classes, which become the Data Objects

40 Landmark Based Shape Analysis
Equivalence Classes become Data Objects, a.k.a. "Orbits"
Mathematics: Called "Quotient Space"

41 Curve Registration
What are the Data Objects?
Vertical Variation, Horiz'l Var'n

42 Curve Registration
What are the Data Objects? Consider "Time Warpings" $\gamma: [0,1] \to [0,1]$ (smooth)
More Precisely: Diffeomorphisms

43 Curve Registration
Diffeomorphisms $\gamma: [0,1] \to [0,1]$:
$\gamma$ is 1 to 1
$\gamma$ is onto (thus $\gamma$ is invertible)
$\gamma$ is Differentiable
$\gamma^{-1}$ is Differentiable
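A concrete family of such diffeomorphisms can be checked numerically (a hypothetical example for illustration, not one used in the slides): $\gamma_a(x) = x + a\,\sin(2\pi x)/(2\pi)$ with $|a| < 1$.

```python
import numpy as np

# Hypothetical example: gamma(x) = x + a*sin(2*pi*x)/(2*pi).
# For |a| < 1 the derivative 1 + a*cos(2*pi*x) stays strictly
# positive, so gamma is strictly increasing on [0,1], hence
# 1-to-1 and onto, with a differentiable inverse.
a = 0.8
gamma = lambda x: x + a * np.sin(2 * np.pi * x) / (2 * np.pi)
dgamma = lambda x: 1 + a * np.cos(2 * np.pi * x)

x = np.linspace(0.0, 1.0, 1001)
print(np.isclose(gamma(0.0), 0.0), np.isclose(gamma(1.0), 1.0))  # endpoints fixed
print(np.all(dgamma(x) > 0))                                      # strictly increasing
```

Varying $a$ gives a one-parameter family of warps, which is handy for the grid-search illustrations below.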

44 Time Warping Intuition
Elastically Stretch & Compress Axis

45 Time Warping Intuition
Elastically Stretch & Compress Axis: $\gamma(x) = x$ (identity)

46 Time Warping Intuition
Elastically Stretch & Compress Axis: $\gamma(x)$

47 Time Warping Intuition
Elastically Stretch & Compress Axis: $\gamma(x)$

48 Time Warping Intuition
Elastically Stretch & Compress Axis: $\gamma(x)$

49 Curve Registration
Say curves $f_1(x)$ and $f_2(x)$ are equivalent, $f_1 \approx f_2$, when $\exists\, \gamma$ so that $f_1(\gamma(x)) = (f_1 \circ \gamma)(x) = f_2(x)$

50 Curve Registration Toy Example: Starting Curve, $f(x)$

51 Curve Registration Toy Example: Equivalent Curves, $f(x)$

52 Curve Registration Toy Example: Warping Functions

53 Curve Registration Toy Example: Non-Equivalent Curves Cannot Warp Into Each Other

54 Data Objects I Equivalence Classes of Curves (parallel to Kendall shape analysis)

55 Data Objects I
Equivalence Classes of Curves (Set of All Warps of Given Curve)
Notation: $[f] = \{ f \circ \gamma : \gamma \in \Gamma \}$ for a "representor" $f(x)$

56 Data Objects I Equivalence Classes of Curves (Set of All Warps of Given Curve) Next Task: Find Metric on Space of Curves

57 Metrics in Curve Space Find Metric on Equivalence Classes Start with Warp Invariant Metric on Curves & Extend

58 Metrics in Curve Space
Traditional Approach to Curve Registration:
Align curves, say $f_1$ and $f_2$, by finding the optimal time warp $\gamma$, so: $\inf_{\gamma} \| f_1 - f_2 \circ \gamma \|$
Vertical var'n: PCA after alignment
Horizontal var'n: PCA on $\gamma$'s

59 Metrics in Curve Space
Problem: Don't have a proper metric
Since: $d(f_1, f_2) \neq d(f_2, f_1)$
Because: $\inf_{\gamma} \| f_1 - f_2 \circ \gamma \| \neq \inf_{\gamma} \| f_2 - f_1 \circ \gamma \|$
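The asymmetry is easy to reproduce numerically (a toy illustration, not from the slides, using a hypothetical one-parameter warp family and a crude grid search): take a sharp bump against a flat curve; warping the flat curve changes nothing, while pinching the bump shrinks its norm.

```python
import numpy as np

# Toy sketch: the traditional alignment "distance" is asymmetric.
# Warp family (hypothetical): gamma_a(x) = x + a*sin(2*pi*x)/(2*pi).
x = np.linspace(0.0, 1.0, 2001)
f1 = np.exp(-200 * (x - 0.5) ** 2)        # a sharp bump
f2 = np.zeros_like(x)                     # a flat curve

def warp(f, a):
    g = x + a * np.sin(2 * np.pi * x) / (2 * np.pi)
    return np.interp(g, x, f)             # f composed with gamma_a, on the grid

def norm(h):
    return np.sqrt(np.mean(h ** 2))       # discrete L2 norm (up to a constant)

grid = np.linspace(-0.9, 0.9, 37)
d12 = min(norm(f1 - warp(f2, a)) for a in grid)  # = ||f1||: warping 0 stays 0
d21 = min(norm(f2 - warp(f1, a)) for a in grid)  # pinching narrows the bump
print(d12 > d21)                          # True: d(f1, f2) != d(f2, f1)
```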

60 Metrics in Curve Space
$\inf_{\gamma} \| f_1 - f_2 \circ \gamma \| \neq \inf_{\gamma} \| f_2 - f_1 \circ \gamma \|$
Thanks to Xiaosun Lu

61 Metrics in Curve Space
$\inf_{\gamma} \| f_1 - f_2 \circ \gamma \| \neq \inf_{\gamma} \| f_2 - f_1 \circ \gamma \|$
Note: Very Different L2 norms
Thanks to Xiaosun Lu

62 Metrics in Curve Space
Solution: Look for a Warp Invariant Metric $d$, where: $d(f_1, f_2) = d(f_1 \circ \gamma, f_2 \circ \gamma)$

63 Metrics in Curve Space
$d(f_1, f_2) = d(f_1 \circ \gamma, f_2 \circ \gamma)$
I.e. Have "Parallel" Representatives of Equivalence Classes

64 Metrics in Curve Space
Warp Invariant Metric $d$, Developed in context of: Likelihood Geometry
Fisher-Rao Metric: $d_{FR}(f_1, f_2) = d_{FR}(f_1 \circ \gamma, f_2 \circ \gamma)$

65 Metrics in Curve Space
Fisher-Rao Metric: Computation Based on Square Root Velocity Function (SRVF): $q_f(t) = \frac{\dot f(t)}{\sqrt{|\dot f(t)|}}$
Signed Version Of Square Root Derivative, where $\dot f(t) = \frac{\partial}{\partial t} f(t)$

66 Metrics in Curve Space
Square Root Velocity Function (SRVF): $q_f(t) = \frac{\dot f(t)}{\sqrt{|\dot f(t)|}}$
Inverse: $f(t) = f(0) + \int_0^t q_f(s)\, |q_f(s)|\, ds$
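The SRVF and its inverse can be sketched on a grid (a toy numerical illustration, assuming numpy; a small epsilon guards the division where $\dot f = 0$): since $q\,|q| = \dot f$, integrating $q\,|q|$ recovers $f$ up to its starting value.

```python
import numpy as np

# Toy sketch: SRVF q_f(t) = f'(t)/sqrt(|f'(t)|) on a grid, and the
# inverse map f(t) = f(0) + integral_0^t q(s)|q(s)| ds.
t = np.linspace(0.0, 1.0, 1001)
f = np.sin(2 * np.pi * t) + 2 * t         # a toy curve

fdot = np.gradient(f, t)
q = fdot / np.sqrt(np.abs(fdot) + 1e-12)  # SRVF (eps guards f' = 0)

# Invert: q*|q| equals f', so its cumulative (trapezoid) integral
# reconstructs f - f(0).
v = q * np.abs(q)
f_rec = f[0] + np.concatenate([[0.0], np.cumsum((v[1:] + v[:-1]) / 2 * np.diff(t))])
print(np.max(np.abs(f_rec - f)) < 1e-2)   # True: recovered up to grid error
```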

67 Metrics in Curve Space
Fisher-Rao Metric: Computation Based on SRVF: $d_{FR}(f_1, f_2) = \| q_1 - q_2 \|_2$
So work with the SRVF, since it is much easier to compute.

68 Metrics in Curve Space Why square roots? Thanks to Xiaosun Lu

69 Metrics in Curve Space Why square roots?

70 Metrics in Curve Space Why square roots?

71 Metrics in Curve Space Why square roots?

72 Metrics in Curve Space Why square roots?

73 Metrics in Curve Space Why square roots?

74 Metrics in Curve Space Why square roots?

75 Metrics in Curve Space Why square roots?

76 Metrics in Curve Space Why square roots?

77 Metrics in Curve Space
Why square roots?
Dislikes Pinching; Focusses Well On Peaks of Unequal Height

78 Metrics in Curve Space
Note on SRVF representation: $d_{FR}(f_1, f_2) = \| q_1 - q_2 \|_2$
Can show Warp Invariance: $d_{FR}(f_1, f_2) = d_{FR}(f_1 \circ \gamma, f_2 \circ \gamma)$
Follows from a Jacobian calculation
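The Jacobian calculation can be checked numerically (a toy illustration with made-up curves and a hypothetical warp, not from the slides): the SRVF of $f \circ \gamma$ is $(q_f \circ \gamma)\sqrt{\dot\gamma}$, and the change of variables makes the $L^2$ distance of the warped SRVFs match the unwarped one.

```python
import numpy as np

# Toy sketch: warp invariance of d_FR computed via SRVFs.
# ||q_{f1 o g} - q_{f2 o g}||_2 = ||q_1 - q_2||_2, because
# (q o g)^2 * g' integrates to the same value (Jacobian).
t = np.linspace(0.0, 1.0, 4001)

def srvf(f):
    fdot = np.gradient(f, t)
    return fdot / np.sqrt(np.abs(fdot) + 1e-12)   # eps guards f' = 0

def l2(h):
    # Trapezoid-rule L2 norm on the grid
    return np.sqrt(np.sum((h[1:] ** 2 + h[:-1] ** 2) / 2 * np.diff(t)))

g = t + 0.5 * np.sin(2 * np.pi * t) / (2 * np.pi)  # a warp gamma
f1, f2 = np.sin(2 * np.pi * t) + 2 * t, t ** 2
f1g, f2g = np.sin(2 * np.pi * g) + 2 * g, g ** 2   # f1 o gamma, f2 o gamma

d_plain = l2(srvf(f1) - srvf(f2))
d_warped = l2(srvf(f1g) - srvf(f2g))
print(np.isclose(d_plain, d_warped, rtol=1e-2))    # True up to grid error
```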

79 Metrics in Curve Quotient Space
Above was Invariance for Individual Curves Now extend to: Equivalence Classes of Curves I.e. Orbits as Data Objects I.e. Quotient Space

80 Metrics in Curve Quotient Space
Define Metric on Equivalence Classes: For $[f_1]$ & $[f_2]$, i.e. $q_1$ & $q_2$: $d([f_1], [f_2]) = \inf_{\gamma \in \Gamma} \| q_1 - q_2 \circ \gamma \|$
Independent of Choice of $f_1$ & $f_2$, by Warp Invariance

81 Mean in Curve Quotient Space
Benefit of a Metric: Allows Definition of a "Mean": Fréchet Mean, Geodesic Mean, Barycenter, Karcher Mean

82 Mean in Curve Quotient Space
Given Equivalence Class Data Objects: $[f_1], [f_2], \ldots, [f_n]$
The Karcher Mean is: $[\mu] = \underset{q}{\operatorname{argmin}} \sum_{i=1}^{n} d([q], [q_i])^2$

83 Mean in Curve Quotient Space
The Karcher Mean is: $[\mu] = \underset{q}{\operatorname{argmin}} \sum_{i=1}^{n} d([q], [q_i])^2$
Intuition: Recall, for Euclidean Data, the Minimizer = Conventional $\bar{X}$
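The Euclidean intuition can be verified with a small grid search (a toy illustration with made-up data, not from the slides): minimizing the sum of squared distances over candidate means recovers the ordinary sample mean.

```python
import numpy as np

# Toy sketch: Karcher mean intuition in the Euclidean case.
# With d(a, b) = |a - b|, argmin_m sum_i d(m, x_i)^2 is the
# ordinary sample mean X-bar; a brute-force grid search finds it.
rng = np.random.default_rng(3)
x = rng.standard_normal(20) + 1.5

grid = np.linspace(-2.0, 5.0, 7001)       # candidate means, step 0.001
sums = ((grid[:, None] - x[None, :]) ** 2).sum(axis=1)
m_hat = grid[np.argmin(sums)]
print(abs(m_hat - x.mean()) < 1e-3)       # True: minimizer = sample mean
```

On the curve quotient space the same minimization is over equivalence classes with the warp-invariant metric, so it is computed iteratively rather than on a grid.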

84 Mean in Curve Quotient Space
Next Define "Most Representative" Choice of $\mu_n$ as Representer of $[\mu]$

85 Mean in Curve Quotient Space
"Most Representative" $\mu_n$ in $[\mu]$:
Given a candidate $\mu$, consider warps to each $q_i$
Choose $\mu_n$ to make the Karcher mean of the warps = Identity (under the Fisher-Rao metric)

86 Mean in Curve Quotient Space
"Most Representative" $\mu_n$ in $[\mu]$
Thanks to Anuj Srivastava

87 Toy Example - (Details Later)
Estimated Warps (Note: Represented With Karcher Mean At Identity)

88 Mean in Curve Quotient Space
"Most Representative" $\mu_n$ in $[\mu]$
Terminology: The "Template Mean"

89 More Data Objects
Final Curve Warps: Warp Each Data Curve, $f_1, \ldots, f_n$, To Template Mean, $\mu_n$
Denote Warp Functions $\gamma_1, \ldots, \gamma_n$
Gives (Roughly Speaking):
Vertical Components $f_1 \circ \gamma_1, \ldots, f_n \circ \gamma_n$ (Aligned Curves)
Horizontal Components $\gamma_1, \ldots, \gamma_n$
Data Objects I

90 More Data Objects
Data Objects II, Final Curve Warps: Warp Each Data Curve, $f_1, \ldots, f_n$, To Template Mean, $\mu_n$
Denote Warp Functions $\gamma_1, \ldots, \gamma_n$
Gives (Roughly Speaking):
Vertical Components $f_1 \circ \gamma_1, \ldots, f_n \circ \gamma_n$ (Aligned Curves)
Horizontal Components $\gamma_1, \ldots, \gamma_n$ ~ Kendall's Shapes

91 More Data Objects
Final Curve Warps: Warp Each Data Curve, $f_1, \ldots, f_n$, To Template Mean, $\mu_n$
Denote Warp Functions $\gamma_1, \ldots, \gamma_n$
Gives (Roughly Speaking):
Vertical Components $f_1 \circ \gamma_1, \ldots, f_n \circ \gamma_n$ (Aligned Curves)
Horizontal Components $\gamma_1, \ldots, \gamma_n$
Data Objects III ~ Chang's Transfo's

92 Computation Several Variations of Dynamic Programming Done by Eric Klassen, Wei Wu

93 Toy Example Raw Data

94 Toy Example Raw Data Both Horizontal And Vertical Variation

95 Toy Example Conventional PCA Projections

96 Toy Example Conventional PCA Projections Power Spread Across Spectrum

97 Toy Example Conventional PCA Projections Power Spread Across Spectrum

98 Toy Example Conventional PCA Scores

99 Toy Example Conventional PCA Scores Views of 1-d Curve Bending Through 4 Dim'ns

100 Toy Example Conventional PCA Scores Patterns Are "Harmonics" In Scores

101 Toy Example Scores Plot Shows Data Are "1" Dimensional So Need Improved PCA Decomp.

102 Visualization
Vertical Variation: PCA on Aligned Curves, $f_1 \circ \gamma_1, \ldots, f_n \circ \gamma_n$
Projected Curves

103 Toy Example Aligned Curves (Clear 1-d Vertical Var'n)

104 Toy Example Aligned Curve PCA Projections All Var'n In 1st Component

105 Visualization
Horizontal Variation: PCA on Warps, $\gamma_1, \ldots, \gamma_n$
Projected Curves

106 Toy Example Estimated Warps

107 Toy Example Warps, PC Projections

108 Toy Example Warps, PC Projections Mostly 1st PC

109 Toy Example Warps, PC Projections Mostly 1st PC, But 2nd Helps Some

110 Toy Example Warps, PC Projections Rest is Not Important

111 Toy Example
Horizontal Var'n Visualization Challenge: (Complicated) Warps Hard to Interpret
Approach: Apply Warps to Template Mean (PCA components)

112 Toy Example Warp Compon'ts (+ Mean) Applied to Template Mean

113 Participant Presentations
Xi Yang: Multi-View Weighted Network
Hang Yu: Introduction to multiple kernel learning
Zhipeng Ding: Fast Predictive Simple Geodesic Regression

