PCA Extension
By Jonash
Outline
- Robust PCA
- Generalized PCA
  - Clustering points on a line
  - Clustering lines on a plane
  - Clustering hyperplanes in a space
Robust PCA
"Robust Principal Component Analysis for Computer Vision"
Fernando De la Torre and Michael J. Black, CS, Brown University
PCA is a Least-Squares Fit
PCA finds the subspace that minimizes the squared reconstruction error of the data, so even a single gross outlier can pull the fit away from the majority of the points, as the sketch below illustrates.
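As a concrete illustration (my addition, not from the slides; the data and dimensions are made up), a minimal NumPy sketch of PCA as a least-squares fit: the SVD basis $B$ minimizes $\sum_i \| d_i - B B^T d_i \|^2$.

```python
import numpy as np

# Minimal sketch: PCA as a least-squares fit.
# Rows of D are mean-centered data points d_i; B spans the principal subspace.
rng = np.random.default_rng(0)
D = rng.normal(size=(100, 5))
D -= D.mean(axis=0)                      # center the data

k = 2                                    # target subspace dimension
U, S, Vt = np.linalg.svd(D, full_matrices=False)
B = Vt[:k].T                             # (5, k) orthonormal basis

C = D @ B                                # coefficients c_i = B^T d_i
residual = np.sum((D - C @ B.T) ** 2)    # sum_i ||d_i - B B^T d_i||^2
print(f"least-squares reconstruction error: {residual:.3f}")
```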
Robust Statistics
- Recover the best fit for the majority of the data
- Detect and reject outliers (see the sketch below)
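To make the contrast concrete, a small sketch (my construction; the line model, data, and scale $\sigma$ are assumptions) comparing an ordinary least-squares line fit with a robust fit via iteratively reweighted least squares, using the Geman-McClure weights that reappear later in the deck:

```python
import numpy as np

# Sketch: least-squares line fit vs. a robust fit that rejects outliers.
rng = np.random.default_rng(1)
x = np.linspace(0.0, 1.0, 50)
y = 2.0 * x + 1.0 + 0.05 * rng.normal(size=50)
y[:5] += 8.0                                   # a few gross outliers

A = np.stack([x, np.ones_like(x)], axis=1)     # design matrix for y = a*x + b

# Ordinary least squares: dragged toward the outliers.
ls_fit, *_ = np.linalg.lstsq(A, y, rcond=None)

# IRLS with Geman-McClure weights w(r) = 2*sigma^2 / (r^2 + sigma^2)^2:
# large residuals get near-zero weight, so the fit follows the majority.
theta, sigma = ls_fit.copy(), 1.0
for _ in range(20):
    r = y - A @ theta
    w = 2.0 * sigma**2 / (r**2 + sigma**2) ** 2
    Aw = A * w[:, None]
    theta = np.linalg.solve(A.T @ Aw, Aw.T @ y)

print("least squares:", ls_fit)                # biased by the outliers
print("robust (IRLS):", theta)                 # close to the true (2.0, 1.0)
```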
Robust PCA
Robust PCA: training images
Robust PCA: comparison of naïve PCA, simple outlier rejection, and Robust PCA
RPCA
In traditional PCA, with coefficients $c_i = B^T d_i$, we minimize
$$\sum_{i=1}^{n} \| d_i - B B^T d_i \|^2 = \sum_{i=1}^{n} \| d_i - B c_i \|^2$$
EM PCA treats this as the limit $\sigma \to 0$ of the probabilistic model $D = BC + \varepsilon$, $\varepsilon \sim \mathcal{N}(0, \sigma^2 I)$, alternating (sketched below):
- E-step: $C = (B^T B)^{-1} B^T D$
- M-step: $B = D C^T (C C^T)^{-1}$
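A minimal sketch of the EM-PCA alternation above (my illustration; the data and dimensions are made up):

```python
import numpy as np

# Sketch of the EM-PCA alternation: columns of D are centered data points d_i.
rng = np.random.default_rng(2)
D = rng.normal(size=(10, 200))
D -= D.mean(axis=1, keepdims=True)

k = 3
B = rng.normal(size=(10, k))                   # random initial basis

for _ in range(50):
    C = np.linalg.solve(B.T @ B, B.T @ D)      # E-step: C = (B^T B)^{-1} B^T D
    B = D @ C.T @ np.linalg.inv(C @ C.T)       # M-step: B = D C^T (C C^T)^{-1}

# B spans the principal subspace (its columns need not be orthonormal).
Q, _ = np.linalg.qr(B)
print("reconstruction error:", np.sum((D - Q @ Q.T @ D) ** 2))
```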
RPCA
Xu and Yuille [1995] minimize
$$\sum_{i=1}^{n} \left[ V_i \, \| d_i - B c_i \|^2 + \eta \, (1 - V_i) \right]$$
where $V_i \in \{0, 1\}$ flags whether sample $i$ is an inlier and $\eta$ is the penalty for discarding a sample. Hard to solve (mixed continuous and discrete optimization).
RPCA
Gabriel and Zamir [1979] minimize a per-element weighted criterion (sketched below)
$$\sum_{i=1}^{n} \sum_{p=1}^{d} w_{pi} \, \big( d_{pi} - (B c_i)_p \big)^2$$
("Low rank approximation of matrices by least squares with any choice of weights", 1979). Impractical for high-dimensional data.
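A sketch of the Gabriel-Zamir idea (my construction, not the paper's code): with the weights fixed, alternate one weighted least-squares solve per column of $C$ and per row of $B$. That per-row, per-column cost is cheap here, but it is exactly what becomes impractical when $p$ is the number of pixels in an image.

```python
import numpy as np

# Sketch of weighted low-rank approximation ("criss-cross" alternation):
# alternate a weighted least-squares solve for each column c_i and row b_p.
rng = np.random.default_rng(3)
p_dim, n, k = 8, 60, 2
D = rng.normal(size=(p_dim, n))                # data matrix, columns d_i
W = rng.uniform(0.1, 1.0, size=(p_dim, n))     # arbitrary per-element weights

B = rng.normal(size=(p_dim, k))
C = rng.normal(size=(k, n))

for _ in range(30):
    for i in range(n):                         # c_i: min sum_p w_pi (d_pi - b_p^T c_i)^2
        Wi = W[:, i]
        C[:, i] = np.linalg.solve(B.T @ (B * Wi[:, None]), B.T @ (Wi * D[:, i]))
    for p in range(p_dim):                     # b_p: min sum_i w_pi (d_pi - b_p^T c_i)^2
        Wp = W[p, :]
        B[p, :] = np.linalg.solve((C * Wp) @ C.T, C @ (Wp * D[p, :]))

print("weighted error:", np.sum(W * (D - B @ C) ** 2))
```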
RPCA
The idea is to replace the quadratic error with a robust function $\rho$, here the Geman-McClure function $\rho(x, \sigma) = x^2 / (x^2 + \sigma^2)$, and minimize
$$\sum_{i=1}^{n} \sum_{p=1}^{d} \rho\!\left( d_{pi} - \mu_p - \sum_{j=1}^{k} b_{pj} c_{ji},\ \sigma_p \right)$$
The objective is approximated by a local quadratic function and minimized by gradient descent (sketched below); the rest is mostly heuristics.
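A gradient-descent sketch of this robust objective (my illustration; the step size, scale $\sigma$, and data are assumptions, and the paper's heuristics such as scale annealing are omitted):

```python
import numpy as np

# Sketch: minimize sum_{p,i} rho(d_pi - mu_p - (B C)_pi, sigma) by gradient descent.
def psi(e, sigma):
    """Derivative of the Geman-McClure rho(x, sigma) = x^2 / (x^2 + sigma^2)."""
    return 2.0 * e * sigma**2 / (e**2 + sigma**2) ** 2

rng = np.random.default_rng(4)
p_dim, n, k = 10, 200, 2
D = rng.normal(size=(p_dim, n))
D[rng.random(D.shape) < 0.05] += 20.0          # sprinkle gross pixel outliers

mu = D.mean(axis=1, keepdims=True)
B = 0.1 * rng.normal(size=(p_dim, k))
C = 0.1 * rng.normal(size=(k, n))
sigma, lr = 1.0, 0.05

for _ in range(500):
    E = D - mu - B @ C                         # residuals e_pi
    P = psi(E, sigma)                          # bounded influence of each residual
    B += lr * P @ C.T                          # descend: dE/dB = -P C^T
    C += lr * B.T @ P
    mu += lr * P.mean(axis=1, keepdims=True)

E = D - mu - B @ C
print("robust energy:", np.sum(E**2 / (E**2 + sigma**2)))
```

Because $\psi$ is bounded, a gross outlier contributes almost nothing to the updates, unlike the quadratic loss.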
RPCA
Robust PCA - Experiment
- 256 training images (120 x 160)
- Obtain 20 RPCA basis images
- 3 hours on a 900 MHz Pentium III in Matlab
Outline
- Robust PCA
- Generalized PCA
  - Clustering points on a line
  - Clustering lines on a plane
  - Clustering hyperplanes in a space
Generalized PCA
"Generalized Principal Component Analysis"
René Vidal, Yi Ma, and Shankar Sastry, UC Berkeley and UIUC
GPCA
GPCA Example 1
GPCA Example 2
GPCA Example 3
GPCA Goals
- Number of subspaces and their dimensions
- A basis for each subspace
- Segmentation of the data
GPCA Ideas
A union of subspaces is exactly the zero set of certain polynomials (noise-free case for now). For example, the union of the two coordinate axes in the plane is the zero set of $p(x, y) = xy$.
Outline
- Robust PCA
- Generalized PCA
  - Clustering points on a line
  - Clustering lines on a plane
  - Clustering hyperplanes in a space
GPCA 1D Case
GPCA 1D Case Cont’d
GPCA 1D Case Cont'd
The coefficient vector has $M_n = n + 1$ unknowns. To have a unique solution (up to scale), we need $\mathrm{rank}(V_n) = n = M_n - 1$, where $V_n$ is the matrix whose rows are the embedded data points $[x_i^n, x_i^{n-1}, \ldots, x_i, 1]$.
GPCA 1D Example
With $n = 2$ groups centered at $\mu_1$ and $\mu_2$:
$$p_n(x) = (x - \mu_1)(x - \mu_2) = x^2 + c_1 x + c_2$$
No polynomial of degree 1 vanishes on both groups, and infinitely many polynomials of degree 3 do; the degree-2 polynomial is unique up to scale. Factoring $p_n$ recovers the group centers (see the sketch below).
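A minimal sketch of the 1D algorithm (my construction; the centers and sample counts are made up): embed the samples, take the null space of $V_n$, and factor the polynomial to recover the group centers.

```python
import numpy as np

# Sketch: cluster 1D points into n = 2 groups by factoring p_n.
mu1, mu2 = -3.0, 2.0
x = np.concatenate([np.full(40, mu1), np.full(60, mu2)])  # noise-free samples

V = np.stack([x**2, x, np.ones_like(x)], axis=1)          # rows [x_i^2, x_i, 1]

# Null space of V_n gives the coefficients of p_n(x) = x^2 + c1 x + c2.
_, _, Vt = np.linalg.svd(V)
c = Vt[-1] / Vt[-1, 0]                                    # normalize leading coeff

roots = np.real(np.roots(c))                              # factor p_n: roots = mu_j
print("recovered centers:", np.sort(roots))               # approx [-3, 2]

# Segment each sample by its nearest recovered center.
labels = np.argmin(np.abs(x[:, None] - roots[None, :]), axis=1)
print("cluster sizes:", np.bincount(labels))
```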
Outline
- Robust PCA
- Generalized PCA
  - Clustering points on a line
  - Clustering lines on a plane
  - Clustering hyperplanes in a space
GPCA 2D Case
Each line through the origin is $L_j = \{ x = [x, y]^T : b_{j1} x + b_{j2} y = 0 \}$. A point lies on the union of the lines iff
$(b_{11} x + b_{12} y = 0)$ or $(b_{21} x + b_{22} y = 0)$ or $\ldots$
GPCA 2D Case Cont'd
The "or" of the line equations becomes a single polynomial equation:
$$p_n(x) = (b_{11} x + b_{12} y) \cdots (b_{n1} x + b_{n2} y) = \sum_{k=0}^{n} c_k \, x^{n-k} y^k = 0$$
GPCA 2D Case Cont'd
Take $n = 2$ for example:
$$p_2(x) = (b_{11} x + b_{12} y)(b_{21} x + b_{22} y)$$
$$\nabla p_2(x) = (b_{21} x + b_{22} y)\, b_1 + (b_{11} x + b_{12} y)\, b_2, \qquad b_j = [b_{j1}, b_{j2}]^T$$
If $x \in L_1$, then $\nabla p_2(x) \sim b_1$; otherwise $\sim b_2$.
GPCA 2D Case Cont'd
Given one point $y_j \in L_j$ on each line, its normal vector is $b_j \sim \nabla p_n(y_j)$. Three things to do:
1. Determine $n$ as $\min\{\, j : \mathrm{rank}(V_j) = j \,\}$ (in 2D, $j = M_j - 1$)
2. Solve $V_n c_n = 0$ for $c_n$
3. Find the normal vectors $b_j$ via $\nabla p_n$ (sketched below)
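A compact sketch of these three steps for two noise-free lines through the origin (my construction; the line directions and the grouping threshold are made up):

```python
import numpy as np

# Sketch: segment 2D points lying on two lines through the origin (noise-free).
rng = np.random.default_rng(6)
t = rng.normal(size=50)
X = np.concatenate([np.outer(t, [2.0, 1.0]),   # line 1: normal b1 ~ [1, -2]
                    np.outer(t, [-1.0, 3.0])]) # line 2: normal b2 ~ [3, 1]

x, y = X[:, 0], X[:, 1]
V = np.stack([x**2, x * y, y**2], axis=1)      # degree-2 Veronese embedding

# Step 2: solve V c = 0; c holds the coefficients of p_2 = c0 x^2 + c1 xy + c2 y^2.
_, _, Vt = np.linalg.svd(V)
c = Vt[-1]

def grad_p2(pt):
    """Gradient of p_2 at pt, parallel to the normal of pt's line."""
    return np.array([2 * c[0] * pt[0] + c[1] * pt[1],
                     c[1] * pt[0] + 2 * c[2] * pt[1]])

# Step 3: each point's gradient direction (up to sign) identifies its line.
G = np.array([grad_p2(pt) for pt in X])
G /= np.linalg.norm(G, axis=1, keepdims=True) + 1e-12
ref = G[0]
labels = (np.abs(G @ ref) > 0.9).astype(int)   # 0 = same line as X[0]
print("recovered normals (up to scale):", grad_p2(X[0]), grad_p2(X[-1]))
```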
Outline
- Robust PCA
- Generalized PCA
  - Clustering points on a line
  - Clustering lines on a plane
  - Clustering hyperplanes in a space
GPCA Hyperplanes
Still assume every subspace is a hyperplane: $d_1 = \cdots = d_n = d = D - 1$, with
$$S_j = \{ x : b_j^T x = b_{j1} x_1 + b_{j2} x_2 + \cdots + b_{jD} x_D = 0 \}$$
GPCA Hyperplanes
The degree-$n$ Veronese embedding in $D$ variables has $M_n = \binom{D+n-1}{n} = \binom{D+n-1}{D-1}$ monomials.
GPCA Hyperplanes
GPCA Hyperplanes
Since we know $n$, we can solve $V_n c_n = 0$ for the coefficients $c_k$ and then recover each normal $b_k$ from $\nabla p_n(x)$. If we knew one point $y_j$ on each $S_j$, finding $b_j \sim \nabla p_n(y_j)$ would be easy.
GPCA Hyperplanes
To get one point $y_j$ on each hyperplane $S_j$:
- Consider a random line $L = \{ t\,v + x_0 \}$
- Obtain $y_j$ by intersecting $L$ with $S_j$: $y_j = t_j v + x_0$
- Find the roots $t_j$ of the univariate polynomial $p_n(t\,v + x_0) = 0$
GPCA Hyperplanes Summary
To summarize (a sketch follows):
- Find $n$, then solve $V_n c_n = 0$ for $c$
- To get the normal $b_j$ of each $S_j$, evaluate $\nabla p_n(x)$
- To get a point on each hyperplane (and the label $j$), solve $p_n(t\,v + x_0) = 0$ for the roots $t_j$ and set $y_j = t_j v + x_0$
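Putting the pieces together, a sketch for two planes through the origin in $\mathbb{R}^3$ (my construction; noise-free, and the helpers `span_points`, `p2`, and `grad_p2` are my own names):

```python
import numpy as np

# Sketch: two planes through the origin in R^3, segmented via GPCA (noise-free).
rng = np.random.default_rng(7)

def span_points(u, v, m):
    """m random points in the plane spanned by u and v."""
    return rng.normal(size=(m, 2)) @ np.stack([np.asarray(u, float),
                                               np.asarray(v, float)])

X = np.concatenate([span_points([1, 0, 1], [0, 1, 0], 60),    # normal ~ [1, 0, -1]
                    span_points([1, 0, 0], [0, 1, -1], 60)])  # normal ~ [0, 1, 1]

# Degree-2 Veronese embedding in D = 3 variables: M_2 = C(4, 2) = 6 monomials.
x, y, z = X.T
V = np.stack([x**2, x*y, x*z, y**2, y*z, z**2], axis=1)
_, _, Vt = np.linalg.svd(V)
c = Vt[-1]                                 # coefficients of p_2

def p2(pt):
    px, py, pz = pt
    return c @ np.array([px**2, px*py, px*pz, py**2, py*pz, pz**2])

def grad_p2(pt):
    px, py, pz = pt
    return np.array([2*c[0]*px + c[1]*py + c[2]*pz,
                     c[1]*px + 2*c[3]*py + c[4]*pz,
                     c[2]*px + c[4]*py + 2*c[5]*pz])

# Intersect a random line L = t*v + x0 with the planes: p_2(t*v + x0) is a
# quadratic in t whose two roots t_j give one point y_j on each plane.
v, x0 = rng.normal(size=3), rng.normal(size=3)
ts = np.array([-1.0, 0.0, 1.0])
coeffs = np.polyfit(ts, [p2(t * v + x0) for t in ts], 2)   # exact quadratic fit
t_roots = np.real(np.roots(coeffs))        # real roots for two real planes

for t_j in t_roots:
    y_j = t_j * v + x0
    b_j = grad_p2(y_j)                     # normal of the plane containing y_j
    print("normal (up to scale):", b_j / np.linalg.norm(b_j))
```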
One More Thing
One More Thing
Previously we assumed $d_1 = \cdots = d_n = D - 1$, i.e. that every subspace is a hyperplane. In general we cannot assume that. Sections 4.2 and 4.3 of the paper discuss how to recursively reduce the dimension to handle subspaces of unequal, unknown dimensions; please read them by yourself.