1
An Interesting Question
How generally applicable is the Backwards approach to PCA? An Attractive Answer: James Damon, UNC Mathematics. Key Idea: Express Backwards PCA as a Nested Series of Constraints
2
General View of Backwards PCA
Define Nested Spaces via Constraints. E.g. SVD: $S_k = \{x : x = \sum_{j=1}^{k} c_j u_j\}$. Now Define: $S_{k-1} = \{x \in S_k : \langle x, u_k \rangle = 0\}$. Constraint Gives Nested Reduction of Dim'n
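This nested-constraint view can be sketched numerically (a minimal Euclidean sketch, assuming centered data in $\mathbb{R}^3$; `project_S` is a hypothetical helper name, not from the slides):

```python
import numpy as np

# Backwards PCA as nested constraints: S_k = {x : x = sum_{j<=k} c_j u_j},
# and S_{k-1} = {x in S_k : <x, u_k> = 0}.  Illustrative data only.
rng = np.random.default_rng(0)
X = rng.normal(size=(50, 3))
X = X - X.mean(axis=0)                 # center

_, _, Vt = np.linalg.svd(X, full_matrices=False)
U = Vt.T                               # columns are the u_j

def project_S(x, k):
    """Project x onto S_k, the span of u_1, ..., u_k."""
    B = U[:, :k]
    return B @ (B.T @ x)

x2 = project_S(X[0], 2)                # a point of S_2
x1 = project_S(X[0], 1)                # drop to S_1 via <x, u_2> = 0
print(abs(np.dot(x1, U[:, 1])))        # the constraint holds (up to fp)
```

Imposing $\langle x, u_k\rangle = 0$ inside $S_k$ is exactly what removes one dimension at each backwards step.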
3
Vectors of Angles
Vectors of Angles as Data Objects: $(\theta_1, \cdots, \theta_d)^\top \in (S^1)^d$
Slice space $(S^1)^d$ with hyperplanes? (à la Principal Nested Spheres)
4
Vectors of Angles E.g. $d = 2$, Data w/ 'Single Mode of Var'n'
Best Fitting Planar Slice gives Bimodal Dist'n. Special Thanks to Eduardo García-Portugués
5
Torus Space $S^1 \times S^1$. Try To Fit A Geodesic. Challenge: Can Get Arbitrarily Close (a geodesic with irrational slope winds densely around the torus, so fitted geodesics can wrap indefinitely)
6
Torus Space $S^1 \times S^1$ Fit Nested Sub-Manifold
7
PNS Main Idea Data Objects: $X_1, \cdots, X_n \in S^d \subset \mathbb{R}^{d+1}$, where $S^d$ is a $d$-dimensional manifold. Consider a nested series of sub-manifolds: $\mathcal{S} = \{S_0, \cdots, S_{d-1}\}$, where for $j = 0, \cdots, d-1$: $\dim(S_j) = j$ and $S_j \subset S_{j+1}$. Goal: Fit all of $\mathcal{S}$ simultaneously to $X_1, \cdots, X_n$
8
General Background Call each $S_j$ a stratum,
so $\mathcal{S}$ is a manifold stratification, to be fit to $X_1, \cdots, X_n$. New Approach: Simultaneously fit $\mathcal{S} = \{S_0, \cdots, S_{d-1}\}$, the Nested Submanifold (NS)
9
Projection Notation For $j = 0, \cdots, d-1$ let $\pi^{(j)}$ denote the telescoping projection onto $S_j$, i.e. for $X \in S_d$: $\pi^{(j)}(X) = \pi_j \circ \pi_{j+1} \circ \cdots \circ \pi_{d-1}(X)$. Note: This projection is fundamental to Backwards PCA methods
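The telescoping composition can be illustrated with nested linear subspaces standing in for the strata (an assumed Euclidean sketch; `pi_step` and `pi_tele` are hypothetical names):

```python
import numpy as np

# Telescoping projection pi^{(j)} = pi_j ∘ ... ∘ pi_{d-1}, realized as
# orthogonal projections onto nested linear subspaces of R^3.
rng = np.random.default_rng(1)
Q, _ = np.linalg.qr(rng.normal(size=(3, 3)))   # orthonormal basis

def pi_step(x, j):
    """One step pi_j: project down onto the j-dimensional stratum."""
    B = Q[:, :j]
    return B @ (B.T @ x)

def pi_tele(x, j, d=3):
    """Telescoping projection pi^{(j)}: apply pi_{d-1}, ..., pi_j in turn."""
    for k in range(d - 1, j - 1, -1):
        x = pi_step(x, k)
    return x

x = rng.normal(size=3)
# For nested linear subspaces, telescoped and direct projections agree:
print(np.allclose(pi_tele(x, 1), pi_step(x, 1)))
```

On a curved manifold the stepwise projections need not collapse like this, which is why the telescoping composition is kept explicit.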
10
PNS Components For a given $\mathcal{S}$, represent a point $X \in S_d$
by its Nested Submanifold components: $\nu_1(X), \cdots, \nu_d(X)$, where for $j = 1, \cdots, d$: $\nu_j(X) = \pi^{(j)}(X) - \pi^{(j-1)}(X)$, in the sense that '$A - B$' means the shortest geodesic arc between $A$ & $B$
11
Nested Submanifold Fits
Simultaneous Fit Criteria? Based on Stratum-Wise Sums of Squares. For $j = 1, \cdots, d$ define $SS_j = \sum_{i=1}^{n} d\big(\pi^{(j)}(X_i), \pi^{(j-1)}(X_i)\big)^2$. Uses 'lengths' of NS Components: $\nu_j(X) = \pi^{(j)}(X) - \pi^{(j-1)}(X)$
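A Euclidean stand-in (assumed example: $S_1$ = PC1 line, $S_0$ = sample mean in $\mathbb{R}^2$) shows how the stratum-wise sums of squares are computed, and why $SS_1 + SS_2$ is constant for mean-centered fits:

```python
import numpy as np

# Stratum-wise sums of squares SS_j = sum_i d(pi^{(j)} X_i, pi^{(j-1)} X_i)^2
# for a line-plus-point stratification of R^2.  Illustrative data only.
rng = np.random.default_rng(2)
X = rng.normal(size=(100, 2)) @ np.array([[2.0, 0.3], [0.3, 0.5]])
X = X - X.mean(axis=0)

_, _, Vt = np.linalg.svd(X)
u1 = Vt[0]                              # direction of S_1 (PC1)
proj1 = np.outer(X @ u1, u1)            # pi^{(1)}(X_i): onto the line
proj0 = np.zeros_like(X)                # pi^{(0)}(X_i): the mean (origin)

SS2 = np.sum((X - proj1) ** 2)          # residuals off the line
SS1 = np.sum((proj1 - proj0) ** 2)      # spread along the line
# Pythagorean theorem: the strata partition the total variation
print(np.isclose(SS1 + SS2, np.sum(X ** 2)))
```

This Pythagorean decomposition is exactly why the weighted criteria on the later slides must trade the strata off against each other.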
12
NS Components in $\mathbb{R}^2$ NS Candidate 2 (Shifted $S_0$ to Sample Mean)
Note: Both $SS_1$ & $SS_2$ Decrease
13
NS Components in $\mathbb{R}^2$ NS based On PC1
Note: $SS_1 \uparrow$, $SS_2 \downarrow$, Yet $SS_1 + SS_2$ is Constant (Pythagorean Thm)
14
NS Components in $\mathbb{R}^2$ NS based On PC2
Note: $SS_1 \downarrow$, $SS_2 \uparrow$; $SS_1 + SS_2$ is Constant (Pythagorean Thm)
15
NS Components in $\mathbb{R}^2$ NS Candidate 1
16
NS Components in $\mathbb{R}^2$ NS Candidate 2: $0.3 \times SS_1 + \cdots \times SS_2$
17
NS Components in $\mathbb{R}^2$ NS based On PC1: $0.3 \times SS_1 + \cdots \times SS_2$
18
NS Components in $\mathbb{R}^2$ NS based On PC2: $0.3 \times SS_1 + \cdots \times SS_2$
19
Nested Submanifold Fits
Simultaneously fit $\mathcal{S} = \{S_0, \cdots, S_{d-1}\}$. Simultaneous Fit Criterion? $w_1 SS_1 + w_2 SS_2 + \cdots + w_d SS_d$. Above Suggests Want: $w_1 < w_2 < \cdots < w_d$. Works for Euclidean PCA (?)
20
Nested Submanifold Fits
Simultaneous Fit Criterion? $w_1 SS_1 + w_2 SS_2 + \cdots + w_d SS_d$. Above Suggests Want: $w_1 < w_2 < \cdots < w_d$. Important Predecessor: Pennec (2016) AUC Criterion: $w_j \propto (j-1)$
21
Pennecโs Area Under the Curve
Based on Scree Plot [figure: scree plot of $SS_1, SS_2, SS_3, SS_4$ vs. component index 1–4, axis to 100%]
22
Pennecโs Area Under the Curve
Based on Scree Plot [figure: cumulative scree plot of $SS_1, \cdots, SS_4$ vs. component index 1–4]
23
Pennecโs Area Under the Curve
Based on Scree Plot: Cumulative Area $= SS_2 + 2\,SS_3 + 3\,SS_4$ [figure: cumulative scree plot, component index 1–4]
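The area formula corresponds to weights $w_j \propto (j-1)$; a tiny sketch with made-up scree values (the numbers are illustrative, not from the slides):

```python
# Pennec's area-under-the-curve criterion read off the scree plot:
# with weights w_j proportional to (j - 1), the area below the
# cumulative curve is SS_2 + 2*SS_3 + 3*SS_4 (for d = 4).
SS = [10.0, 5.0, 2.0, 1.0]             # SS_1, ..., SS_4 (made up)

def pennec_auc(ss):
    # w_j = (j - 1): the first component is free, later ones cost more
    return sum((j - 1) * s for j, s in enumerate(ss, start=1))

print(pennec_auc(SS))                  # 5 + 2*2 + 3*1 = 12.0
```

Minimizing this area pushes variation into the early components, which is the backwards-PCA goal.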
24
Torus Space $S^1 \times S^1$ Fit Nested Sub-Manifold. Choice of $w_1$ & $w_2$ in: $w_1 SS_1 + w_2 SS_2$???
25
Torus Space $S^1 \times S^1 \times \cdots \times S^1$: Tiled $(-\pi, \pi]^d$ embedding is complicated (maybe OK for low rank approx.). Instead Consider Nested Sub-Tori. Work in Progress with García, Wood, Le. Key Factor: Important Modes of Variation
26
OODA Big Picture New Topic: Curve Registration Main Reference: Srivastava et al (2011)
27
Collaborators Anuj Srivastava (Florida State U.)
Wei Wu (Florida State U.) Derek Tucker (Florida State U.) Xiaosun Lu (U. N. C.) Inge Koch (U. Adelaide) Peter Hoffmann (U. Adelaide) J. O. Ramsay (McGill U.) Laura Sangalli (Milano Polytech.)
28
Context Functional Data Analysis Curves as Data Objects Toy Example:
29
Context Functional Data Analysis Curves as Data Objects Toy Example:
How Can We Understand Variation?
30
Context Functional Data Analysis Curves as Data Objects Toy Example:
How Can We Understand Variation?
31
Context Functional Data Analysis Curves as Data Objects Toy Example:
How Can We Understand Variation?
32
Functional Data Analysis
Insightful Decomposition
33
Functional Data Analysis
Insightful Decomposition Horiz'l Var'n
34
Functional Data Analysis
Insightful Decomposition Vertical Variation Horiz'l Var'n
35
Challenge Fairly Large Literature Many (Diverse) Past Attempts Limited Success (in General) Surprisingly Slippery (even mathematical formulation)
36
Challenge (Illustrated)
Thanks to Wei Wu
37
Challenge (Illustrated)
Thanks to Wei Wu
38
Functional Data Analysis
Appropriate Mathematical Framework? Vertical Variation Horiz'l Var'n
39
Landmark Based Shape Analysis
Approach: Identify objects that are: Translations Rotations Scalings of each other Mathematics: Equivalence Relation Results in: Equivalence Classes Which become the Data Objects
40
Landmark Based Shape Analysis
Equivalence Classes become Data Objects a.k.a. 'Orbits'. Mathematics: Called 'Quotient Space'
41
Curve Registration What are the Data Objects? Vertical Variation
Horiz'l Var'n
42
Curve Registration What are the Data Objects? Consider 'Time Warpings' $\gamma: [0,1] \to [0,1]$ (smooth). More Precisely: Diffeomorphisms
43
Curve Registration Diffeomorphisms $\gamma: [0,1] \to [0,1]$:
$\gamma$ is 1 to 1, $\gamma$ is onto (thus $\gamma$ is invertible), $\gamma$ is Differentiable, and $\gamma^{-1}$ is Differentiable
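A one-parameter family makes these properties concrete (an assumed example, $\gamma_a(x) = x^a$ for $a > 0$; not from the slides):

```python
import numpy as np

# gamma_a(x) = x**a on [0,1] is a diffeomorphism for a > 0:
# 1-to-1, onto, differentiable, with differentiable inverse x**(1/a).
def gamma(x, a):
    return x ** a

def gamma_inv(x, a):
    return x ** (1.0 / a)

x = np.linspace(0.0, 1.0, 101)
a = 2.5
y = gamma(x, a)
print(y[0] == 0.0 and y[-1] == 1.0)     # fixes the endpoints
print(np.all(np.diff(y) > 0))           # strictly increasing: 1-to-1, onto
print(np.allclose(gamma_inv(y, a), x))  # invertible, with smooth inverse
```

Such warps reparametrize time without tearing or folding the axis, which is the "elastic stretch & compress" intuition of the next slides.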
44
Time Warping Intuition
Elastically Stretch & Compress Axis
45
Time Warping Intuition
Elastically Stretch & Compress Axis $\gamma(x) = x$ (identity)
46
Time Warping Intuition
Elastically Stretch & Compress Axis $\gamma(x)$
47
Time Warping Intuition
Elastically Stretch & Compress Axis $\gamma(x)$
48
Time Warping Intuition
Elastically Stretch & Compress Axis $\gamma(x)$
49
Curve Registration Say curves $f_1(x)$ and $f_2(x)$ are equivalent, $f_1 \sim f_2$, when $\exists \gamma$ so that $f_1(\gamma(x)) = f_1 \circ \gamma(x) = f_2(x)$
50
Curve Registration Toy Example: Starting Curve, $f(x)$
51
Curve Registration Toy Example: Equivalent Curves, $f(x)$
52
Curve Registration Toy Example: Warping Functions
53
Curve Registration Toy Example: Non-Equivalent Curves Cannot Warp Into Each Other
54
Data Objects I Equivalence Classes of Curves (parallel to Kendall shape analysis)
55
Data Objects I Equivalence Classes of Curves (Set of All Warps of Given Curve) Notation: $[f] = \{f \circ \gamma : \gamma \in \Gamma\}$ for a 'representer' $f(x)$
56
Data Objects I Equivalence Classes of Curves (Set of All Warps of Given Curve) Next Task: Find Metric on Space of Curves
57
Metrics in Curve Space Find Metric on Equivalence Classes Start with Warp Invariant Metric on Curves & Extend
58
Metrics in Curve Space Traditional Approach to Curve Registration:
Align curves, say $f_1$ and $f_2$, by finding the optimal time warp, $\gamma$, so: $\min_\gamma \|f_1 - f_2 \circ \gamma\|$. Vertical var'n: PCA after alignment. Horizontal var'n: PCA on $\gamma$s
59
Metrics in Curve Space Problem: Don't have a proper metric, since: $d(f_1, f_2) \ne d(f_2, f_1)$, because: $\min_\gamma \|f_1 - f_2 \circ \gamma\| \ne \min_\gamma \|f_2 - f_1 \circ \gamma\|$
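This asymmetry is easy to see numerically (an assumed toy setup with two Gaussian bumps and the hypothetical warp family $\gamma_a(x) = x^a$, brute-forced over $a$):

```python
import numpy as np

# The naive alignment 'distance' min_gamma ||f1 - f2∘gamma|| is not
# symmetric: warping changes the L2 norm, with no Jacobian compensation.
x = np.linspace(0.0, 1.0, 200)
f1 = np.exp(-50 * (x - 0.3) ** 2)      # narrow bump
f2 = np.exp(-5 * (x - 0.6) ** 2)       # wide bump

def aligned_dist(f, g):
    """min over a of the discrete L2 mismatch ||f - g∘gamma_a||."""
    costs = []
    for a in np.linspace(0.2, 5.0, 200):
        g_warp = np.interp(x ** a, x, g)    # g composed with the warp
        costs.append(np.sqrt(np.mean((f - g_warp) ** 2)))
    return min(costs)

d12 = aligned_dist(f1, f2)
d21 = aligned_dist(f2, f1)
print(d12, d21)                        # generally not equal
```

The one-parameter family is only a sketch; the same asymmetry holds over the full diffeomorphism group.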
60
Metrics in Curve Space $\min_\gamma \|f_1 - f_2 \circ \gamma\| \ne \min_\gamma \|f_2 - f_1 \circ \gamma\|$ Thanks to Xiaosun Lu
61
Metrics in Curve Space $\min_\gamma \|f_1 - f_2 \circ \gamma\| \ne \min_\gamma \|f_2 - f_1 \circ \gamma\|$ Note: Very Different $L^2$ norms. Thanks to Xiaosun Lu
62
Metrics in Curve Space Solution: Look for a Warp Invariant Metric $d$, where: $d(f_1, f_2) = d(f_1 \circ \gamma, f_2 \circ \gamma)$
63
Metrics in Curve Space $d(f_1, f_2) = d(f_1 \circ \gamma, f_2 \circ \gamma)$ I.e. Have 'Parallel' Representatives Of Equivalence Classes
64
Metrics in Curve Space Warp Invariant Metric $d$, Developed in the context of Likelihood Geometry. Fisher–Rao Metric: $d_{FR}(f_1, f_2) = d_{FR}(f_1 \circ \gamma, f_2 \circ \gamma)$
65
Metrics in Curve Space Fisher–Rao Metric: Computation Based on the Square Root Velocity Function (SRVF) $q_f(t) = \frac{\dot f(t)}{\sqrt{|\dot f(t)|}}$, a Signed Version Of the Square Root Derivative, where $\dot f(t) = \frac{d}{dt} f(t)$
66
Metrics in Curve Space Square Root Velocity Function (SRVF): $q_f(t) = \frac{\dot f(t)}{\sqrt{|\dot f(t)|}}$, inverted by $f(t) = f(0) + \int_0^t q(s)\,|q(s)|\, ds$
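A grid-based sketch (assumed discretization; the small `1e-12` guard is an implementation convenience, not part of the definition) of the SRVF map and its inverse:

```python
import numpy as np

# SRVF q(t) = f'(t)/sqrt(|f'(t)|), inverted by f(t) = f(0) + int_0^t q|q|.
t = np.linspace(0.0, 1.0, 1001)
f = np.sin(2 * np.pi * t) + 2 * t

df = np.gradient(f, t)                        # numerical f'
q = df / np.sqrt(np.abs(df) + 1e-12)          # SRVF (signed square root)

# q|q| recovers f', so trapezoidal integration recovers f
qq = q * np.abs(q)
increments = 0.5 * (qq[1:] + qq[:-1]) * np.diff(t)
f_rec = f[0] + np.concatenate(([0.0], np.cumsum(increments)))
print(np.max(np.abs(f_rec - f)))              # small discretization error
```

Note $q\,|q| = \dot f$ exactly, so nothing is lost (beyond the constant $f(0)$) in passing to the SRVF.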
67
Metrics in Curve Space Fisher–Rao Metric: Computation based on SRVF: $d_{FR}(f_1, f_2) = \|q_1 - q_2\|_2$. So work with the SRVF, since it is much easier to compute.
68
Metrics in Curve Space Why square roots? Thanks to Xiaosun Lu
69
Metrics in Curve Space Why square roots?
70
Metrics in Curve Space Why square roots?
71
Metrics in Curve Space Why square roots?
72
Metrics in Curve Space Why square roots?
73
Metrics in Curve Space Why square roots?
74
Metrics in Curve Space Why square roots?
75
Metrics in Curve Space Why square roots?
76
Metrics in Curve Space Why square roots?
77
Metrics in Curve Space Why square roots? Dislikes Pinching Focusses Well On Peaks of Unequal Height
78
Metrics in Curve Space Note on SRVF representation: $d_{FR}(f_1, f_2) = \|q_1 - q_2\|_2$. Can show Warp Invariance: $d_{FR}(f_1, f_2) = d_{FR}(f_1 \circ \gamma, f_2 \circ \gamma)$, which follows from a Jacobian calculation
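The Jacobian calculation can be checked numerically. In the SRVF framework a warp acts with the Jacobian factor, $(q, \gamma) = (q \circ \gamma)\sqrt{\dot\gamma}$ (stated here as an assumption, since the slide leaves it implicit):

```python
import numpy as np

# Warp invariance: applying a common warp gamma to both SRVFs, with the
# sqrt(gamma') Jacobian factor, leaves the L2 distance unchanged.
t = np.linspace(0.0, 1.0, 4001)

def srvf(f):
    df = np.gradient(f, t)
    return df / np.sqrt(np.abs(df) + 1e-12)

def l2(q):
    # trapezoidal L2 norm on the grid
    return np.sqrt(np.sum(0.5 * (q[1:] ** 2 + q[:-1] ** 2) * np.diff(t)))

f1 = 2 * t + 0.2 * np.sin(2 * np.pi * t)      # strictly increasing
f2 = t ** 2 + t

q1, q2 = srvf(f1), srvf(f2)

gamma = t ** 1.5                              # a common warp
dgamma = 1.5 * t ** 0.5
q1_g = np.interp(gamma, t, q1) * np.sqrt(dgamma)
q2_g = np.interp(gamma, t, q2) * np.sqrt(dgamma)

d_before = l2(q1 - q2)
d_after = l2(q1_g - q2_g)
print(abs(d_before - d_after))                # ~0 up to grid error
```

The change of variables $u = \gamma(t)$, $du = \dot\gamma\,dt$ is exactly the Jacobian calculation the slide refers to.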
79
Metrics in Curve Quotient Space
Above was Invariance for Individual Curves Now extend to: Equivalence Classes of Curves I.e. Orbits as Data Objects I.e. Quotient Space
80
Metrics in Curve Quotient Space
Define Metric on Equivalence Classes: For $[f_1]$ & $[f_2]$, i.e. $[q_1]$ & $[q_2]$: $d\big([q_1], [q_2]\big) = \inf_{\gamma \in \Gamma} \|q_1 - q_2 \circ \gamma\|$, Independent of the Choice of $q_1$ & $q_2$, By Warp Invariance
81
Mean in Curve Quotient Space
Benefit of a Metric: Allows Definition of a 'Mean': Fréchet Mean, Geodesic Mean, Barycenter, Karcher Mean
82
Mean in Curve Quotient Space
Given Equivalence Class Data Objects: $[q_1], [q_2], \cdots, [q_n]$, the Karcher Mean is: $[\mu] = \operatorname{argmin}_{[q]} \sum_{i=1}^{n} d\big([q], [q_i]\big)^2$
83
Mean in Curve Quotient Space
The Karcher Mean is: $[\mu] = \operatorname{argmin}_{[q]} \sum_{i=1}^{n} d\big([q], [q_i]\big)^2$. Intuition: Recall, for Euclidean Data, the Minimizer = the Conventional $\bar X$
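A quick check of that intuition (an assumed 1-d Euclidean example, minimizing the Karcher objective over a grid):

```python
import numpy as np

# With Euclidean distance, argmin_m sum_i d(m, x_i)^2 is the sample mean.
rng = np.random.default_rng(3)
x = rng.normal(size=20)

grid = np.linspace(x.min(), x.max(), 20001)
objective = ((grid[:, None] - x[None, :]) ** 2).sum(axis=1)
m_hat = grid[np.argmin(objective)]
print(abs(m_hat - x.mean()))                  # ~grid resolution
```

On the quotient space the same objective is used, but with the warp-invariant distance in place of Euclidean distance, so no closed form is available.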
84
Mean in Curve Quotient Space
Next Define the 'Most Representative' Choice of $q_\mu$ As Representer of $[\mu]$
85
Mean in Curve Quotient Space
'Most Representative' $q_\mu$ in $[\mu]$: Given a candidate $q$, Consider warps to each $q_i$; Choose $q_\mu$ to make the Karcher mean of the warps = Identity (under the Fisher–Rao metric)
86
Mean in Curve Quotient Space
'Most Representative' $q_\mu$ in $[\mu]$ Thanks to Anuj Srivastava
87
Toy Example โ (Details Later)
Estimated Warps (Note: Represented With Karcher Mean At Identity)
88
Mean in Curve Quotient Space
'Most Representative' $q_\mu$ in $[\mu]$ Terminology: The 'Template Mean'
89
More Data Objects Final Curve Warps:
Warp Each Data Curve, $f_1, \cdots, f_n$, To the Template Mean, $f_\mu$. Denote Warp Functions $\gamma_1, \cdots, \gamma_n$. Gives (Roughly Speaking): Vertical Components $f_1 \circ \gamma_1, \cdots, f_n \circ \gamma_n$ (Aligned Curves); Horizontal Components $\gamma_1, \cdots, \gamma_n$. Data Objects I
90
More Data Objects Final Curve Warps:
Data Objects II Final Curve Warps: Warp Each Data Curve, $f_1, \cdots, f_n$, To the Template Mean, $f_\mu$. Denote Warp Functions $\gamma_1, \cdots, \gamma_n$. Gives (Roughly Speaking): Vertical Components $f_1 \circ \gamma_1, \cdots, f_n \circ \gamma_n$ (Aligned Curves); Horizontal Components $\gamma_1, \cdots, \gamma_n$ ~ Kendall's Shapes
91
More Data Objects Final Curve Warps:
Warp Each Data Curve, $f_1, \cdots, f_n$, To the Template Mean, $f_\mu$. Denote Warp Functions $\gamma_1, \cdots, \gamma_n$. Gives (Roughly Speaking): Vertical Components $f_1 \circ \gamma_1, \cdots, f_n \circ \gamma_n$ (Aligned Curves); Horizontal Components $\gamma_1, \cdots, \gamma_n$: Data Objects III ~ Chang's Transfo's
92
Computation Several Variations of Dynamic Programming Done by Eric Klassen, Wei Wu
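The flavor of the dynamic program can be sketched with a plain DTW-style recursion (an illustrative stand-in, not the Klassen/Wu implementation):

```python
import numpy as np

# Dynamic programming over monotone index paths: find a nondecreasing
# pairing of grid points minimizing the summed squared mismatch.
def dp_align(f, g):
    n, m = len(f), len(g)
    D = np.full((n, m), np.inf)
    D[0, 0] = (f[0] - g[0]) ** 2
    for i in range(n):
        for j in range(m):
            if i == j == 0:
                continue
            best = min(D[i - 1, j] if i else np.inf,
                       D[i, j - 1] if j else np.inf,
                       D[i - 1, j - 1] if i and j else np.inf)
            D[i, j] = (f[i] - g[j]) ** 2 + best
    return D[-1, -1]

t = np.linspace(0, 1, 50)
f = np.sin(2 * np.pi * t)
g = np.sin(2 * np.pi * t ** 1.5)        # a warped copy of f
print(dp_align(f, g) < dp_align(f, -f)) # warping alignable, sign flip not
```

The SRVF-based algorithms optimize over discretized warps with the Jacobian factor rather than raw index pairings, but the monotone-path recursion is the shared computational core.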
93
Toy Example Raw Data
94
Toy Example Raw Data Both Horizontal And Vertical Variation
95
Toy Example Conventional PCA Projections
96
Toy Example Conventional PCA Projections Power Spread Across Spectrum
97
Toy Example Conventional PCA Projections Power Spread Across Spectrum
98
Toy Example Conventional PCA Scores
99
Toy Example Conventional PCA Scores Views of 1-d Curve Bending Through 4 Dim'ns
100
Toy Example Conventional PCA Scores Patterns Are โHarmonicsโ In Scores
101
Toy Example Scores Plot Shows Data Are '1'-Dimensional, So Need Improved PCA Decomp.
102
Visualization Vertical Variation:
PCA on Aligned Curves, $f_1 \circ \gamma_1, \cdots, f_n \circ \gamma_n$: Projected Curves
103
Toy Example Aligned Curves (Clear 1-d Vertical Var'n)
104
Toy Example Aligned Curve PCA Projections All Var'n In 1st Component
105
Visualization Horizontal Variation: PCA on Warps, $\gamma_1, \cdots, \gamma_n$
Projected Curves
106
Toy Example Estimated Warps
107
Toy Example Warps, PC Projections
108
Toy Example Warps, PC Projections Mostly 1st PC
109
Toy Example Warps, PC Projections Mostly 1st PC, But 2nd Helps Some
110
Toy Example Warps, PC Projections Rest is Not Important
111
Toy Example Horizontal Var'n Visualization Challenge: (Complicated) Warps Hard to Interpret. Approach: Apply Warps to Template Mean (PCA components)
112
Toy Example Warp Compon'ts (+ Mean) Applied to Template Mean
113
Participant Presentations
Xi Yang Multi-View Weighted Network Hang Yu Introduction to multiple kernel learning Zhipeng Ding Fast Predictive Simple Geodesic Regression