Slide 1: Sub-sampling for Efficient Spectral Mesh Processing
Rong Liu, Varun Jain and Hao Zhang
GrUVi Lab, Simon Fraser University, Burnaby, Canada
CGI'06, Hangzhou, China, 6/26/2006
Slide 2: Roadmap
- Background
- Nyström Method
- Kernel PCA (KPCA)
- Measuring Nyström Quality using KPCA
- Sampling Schemes
- Applications
- Conclusion and Future Work
Slide 3: Roadmap (same outline as Slide 2; next: Background)
Slide 4: Spectral Applications
Applications built on an "affinity matrix" W and its eigen-decomposition:
- spectral clustering [Ng et al., 02]
- spectral mesh compression [Karni and Gotsman, 00]
- watermarking [Ohbuchi et al., 01]
- spectral mesh segmentation [Liu and Zhang, 04]
- face recognition in eigenspace [Turk, 01]
- spectral mesh correspondence [Jain and Zhang, 06]
- texture mapping using MDS [Zigelman et al., 02]
Slide 5: Spectral Embedding
W_ij = W_ji = 0.56 is the affinity between points i and j (n points in dimension 2).
W = EΛE^T; row i of E gives the coordinates of point i in the embedding space of dimension n.
Slide 6: Bottlenecks
- Computation of W is O(n^2): apply sub-sampling to compute only a partial W.
- Eigenvalue decomposition of W is O(n^3): apply the Nyström method to approximate the eigenvectors of W.
How should we sample to make Nyström work better?
Slide 7: Roadmap (next: Nyström Method)
Slide 8: Sub-sampling
Compute partial affinities. Partition the n points as Z = X ∪ Y, where X holds the l sample points.
Only the affinities within X (block A) and the affinities between X and Y (block B) are computed, reducing the complexity from O(n^2) to O(l·n).
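As a concrete illustration of this partial computation, the sketch below builds only the l × n strip of a Gaussian affinity matrix that involves the sampled points. It is an illustration, not the authors' code: the function and array names, the use of Euclidean distances as a stand-in for geodesic ones, and the sigma parameter are all assumptions.

```python
import numpy as np

def partial_affinities(points, sample_idx, sigma=1.0):
    """Compute only the l x n strip of the Gaussian affinity matrix that
    involves the l sampled points: O(l*n) work instead of O(n^2)."""
    X = points[sample_idx]                                    # the l samples
    d2 = ((X[:, None, :] - points[None, :, :]) ** 2).sum(-1)  # l x n squared distances
    W_strip = np.exp(-d2 / (2.0 * sigma ** 2))
    A = W_strip[:, sample_idx]                                 # affinities within the samples (l x l)
    B = np.delete(W_strip, sample_idx, axis=1)                 # samples vs. remaining points (l x (n-l))
    return A, B
```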
Slide 9: Nyström Method [Williams and Seeger, 2001]
Approximate the eigenvectors of W = [A B; B^T C], with A = UΛU^T.
The approximate eigenvectors are Ū = [U; B^T U Λ^{-1}], reducing the complexity from O(n^3) to O(l^2·n).
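A minimal numpy sketch of the extension formula above, assuming A and B come from a routine like the one after Slide 8; it illustrates the Nyström step and is not the authors' implementation.

```python
import numpy as np

def nystrom_eigenvectors(A, B, tol=1e-10):
    """Approximate eigenvectors of W = [[A, B], [B^T, C]] from A (l x l)
    and B (l x m) only: U_bar = [U; B^T U Lambda^{-1}]."""
    lam, U = np.linalg.eigh(A)                 # A = U Lambda U^T
    order = np.argsort(lam)[::-1]              # largest eigenvalues first
    lam, U = lam[order], U[:, order]
    keep = lam > tol                           # drop numerically zero modes
    lam, U = lam[keep], U[:, keep]
    U_bar = np.vstack([U, B.T @ U / lam])      # extend rows to the unsampled points
    return U_bar, lam
```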
Slide 10: Schur Complement
ŪΛŪ^T = [A B; B^T B^T A^{-1} B], while the true matrix is W = [A B; B^T C].
The approximation error is ||W - ŪΛŪ^T||_F = ||SC||_F, where the Schur complement is SC = C - B^T A^{-1} B.
Practically, SC is not useful for measuring the quality of a sample set.
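For completeness, the (impractical) error measure itself is a direct transcription of SC = C − B^T A^{-1} B and its Frobenius norm; the block names follow the slide, everything else is an assumption.

```python
import numpy as np

def schur_complement_error(A, B, C):
    """||C - B^T A^{-1} B||_F: the Nystrom reconstruction error on the C
    block.  Needing the full C block is exactly why it is impractical."""
    SC = C - B.T @ np.linalg.solve(A, B)
    return np.linalg.norm(SC, 'fro')
```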
Slide 11: Roadmap (next: Kernel PCA (KPCA))
Slide 12: PCA and KPCA [Schölkopf et al., 1998]
PCA: covariance matrix C of the data X (here in dimension 2).
KPCA: covariance matrix C of φ(X), where φ maps X into a high-dimensional (possibly infinite-dimensional) feature space.
φ(X) is implicitly defined by a kernel matrix K, where K_ij = <φ(x_i), φ(x_j)>.
Slide 13: Training Set for KPCA
K = [L M; M^T N], where L is the kernel matrix of the training set and L = EΛE^T.
Projecting all points onto the KPCA components of the training set gives Ē = [E; M^T E Λ^{-1}] · Λ^{-1/2}.
Slide 14: Nyström Method and KPCA
Nyström: W = [A B; B^T C], A = UΛU^T, Ū = [U; B^T U Λ^{-1}].
KPCA with a training set: K = [L M; M^T N], L = EΛE^T, Ē = [E; M^T E Λ^{-1}] · Λ^{-1/2}.
Side by side, the two constructions have the same form (up to the Λ^{-1/2} scaling).
Slide 15: Roadmap (next: Measuring Nyström Quality using KPCA)
Slide 16: When Does Nyström Work Well?
Equivalently, when does a training set work well for KPCA? The training set should minimize the error of projecting the data onto the subspace spanned by the training points.
Slide 17: Objective Function
Minimize the projection error of the data onto the sample subspace; equivalently, maximize an objective Γ that can be evaluated using only the sampled blocks A and B of W = [A B; B^T C].
Slide 18: Comparing Γ and SC
Given two sampling sets S_1 and S_2:
1. test data are generated from a Gaussian distribution;
2. the test is repeated 100 times;
3. Γ and SC disagree on which set is better in only 4% of the trials.
Slide 19: Roadmap (next: Sampling Schemes)
Slide 20: How to Sample: Greedy Scheme
Maximize Γ over sample sets taken from W = [A B; B^T C]. The greedy scheme grows the sample set one point at a time, each time adding the candidate that increases Γ the most.
Best-candidate speed-up: to find, with probability 99%, a candidate among the best 5%, it suffices to pick the best one from a random subset of size 90 ( log(0.01)/log(0.95) ), regardless of the problem size.
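One possible reading of this scheme in code, using the quantity 1^T A^{-1} 1 from Slide 21 as a stand-in for the paper's exact Γ. The candidate-subset size of 90 comes from the slide; the function names and the use of a precomputed full W (kept here only for clarity) are assumptions, since in practice only the affinity entries touching the candidates would be computed on demand.

```python
import numpy as np

def sample_score(W, S):
    """Surrogate for Gamma on sample set S: 1^T A^{-1} 1, where A is the
    affinity block among the chosen samples (see Slide 21)."""
    A = W[np.ix_(S, S)]
    ones = np.ones(len(S))
    return float(ones @ np.linalg.solve(A, ones))

def greedy_best_candidate(W, l, n_candidates=90, seed=0):
    """Grow the sample set one point at a time, scoring only a small random
    candidate subset at each step, so the search cost does not grow with n."""
    rng = np.random.default_rng(seed)
    n = W.shape[0]
    S = [int(rng.integers(n))]                     # arbitrary first sample
    while len(S) < l:
        rest = np.setdiff1d(np.arange(n), S)
        pool = rng.choice(rest, size=min(n_candidates, rest.size), replace=False)
        best = max(pool, key=lambda c: sample_score(W, S + [int(c)]))
        S.append(int(best))
    return S
```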
Slide 21: Properties of Γ
Γ ∈ (0, m), where m is the number of columns of B. Maximizing Γ amounts to maximizing 1^T (A^{-1} 1), given that:
1. A is symmetric;
2. the diagonal entries of A are 1;
3. the off-diagonal entries of A lie in (0, 1).
It can be shown that the maximum is attained when the columns of A are the canonical basis vectors of Euclidean space (i.e., A is the identity).
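A quick numerical illustration of these properties. The setup is an assumption: a small Gaussian affinity block built from random 2-D points, so the diagonal is 1 and the off-diagonals lie in (0, 1).

```python
import numpy as np

rng = np.random.default_rng(1)
m = 5
pts = rng.random((m, 2))
d2 = ((pts[:, None] - pts[None, :]) ** 2).sum(-1)
A = np.exp(-d2 / (2 * 0.5 ** 2))   # symmetric, unit diagonal, off-diagonals in (0, 1)

ones = np.ones(m)
print(ones @ np.linalg.solve(A, ones))          # below m for nearby (redundant) samples
print(ones @ np.linalg.solve(np.eye(m), ones))  # equals m when A is the identity
```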
Slide 22: How to Sample: Farthest Point Scheme
A has ones on its diagonal. For the columns of A to be close to the canonical basis, the off-diagonal entries should be close to zero. Since affinities decay with distance, the distance between each pair of samples should be as large as possible: the samples should be mutually farthest away from each other.
Slide 23: Farthest Point Sampling Scheme (a sketch of the scheme in code follows)
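A minimal sketch of farthest point sampling over a precomputed distance matrix; the `dist` argument, the random first pick, and the O(n) per-step update are illustrative assumptions rather than details from the paper.

```python
import numpy as np

def farthest_point_sampling(dist, l, seed=0):
    """Pick l mutually far-apart samples: repeatedly add the point whose
    distance to the current sample set is largest.  `dist` is an n x n
    (e.g. geodesic) distance matrix."""
    rng = np.random.default_rng(seed)
    n = dist.shape[0]
    samples = [int(rng.integers(n))]
    min_dist = dist[samples[0]].copy()          # distance of each point to its nearest sample
    for _ in range(l - 1):
        nxt = int(np.argmax(min_dist))
        samples.append(nxt)
        min_dist = np.minimum(min_dist, dist[nxt])
    return samples
```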
Slide 24: Roadmap (next: Applications)
Slide 25: Mesh Correspondence
Pipeline for each mesh: M^(1) → D^(1) → W^(1) → embedding EΛ^{-1/2}, and likewise M^(2) → D^(2) → W^(2) → embedding EΛ^{-1/2}; the two embedded meshes are then matched (full pipeline on Slide 36).
Slide 26: Correspondence results without sampling, with farthest point sampling, and with random sampling (vertices sampled: 10, total vertices: 250).
Slide 27: Correspondence results (vertices sampled: 10, total vertices: 2000).
Slide 28: Correspondence error against mesh size. A series of simplified (slimmed) meshes is put in correspondence with the original mesh; the correspondence error at a vertex is defined as the geodesic distance between the matched point and the ground-truth matching point.
Slide 29: Mesh Segmentation
Pipeline: M → D → W → embedding EΛ^{-1/2} → clustering (full pipeline on Slide 37).
Slide 30: Segmentation results. (a, c) obtained using random sampling; (b, d) obtained using farthest point sampling. Faces sampled: 10; the number in brackets is the value of Γ.
Slide 31: Without sampling, it takes 30 s to handle a mesh with 4000 faces (2.2 GHz processor, 1 GB RAM).
Slide 32: Roadmap (next: Conclusion and Future Work)
Slide 33: Conclusion
- The Nyström approximation can be viewed as using a training set in Kernel PCA.
- The objective function Γ effectively quantifies the quality of a sample set.
- Γ leads to two sampling schemes: the greedy scheme and the farthest point scheme.
- The farthest point sampling scheme outperforms random sampling.
Slide 34: Future Work
- Study the influence of the kernel function on the Nyström method.
- Further improve the sampling scheme.
Slide 35: Thank you! Questions?
Slide 36: Mesh Correspondence (details)
1. Given two models M^(1) and M^(2), build the geodesic distance matrices D^(1) and D^(2), where D_ij is the geodesic distance between vertices i and j.
2. Convert D^(1) → W^(1) and D^(2) → W^(2) using a Gaussian kernel.
3. Compute the eigenvalue decompositions of W^(1) and W^(2), and use the corresponding eigenvectors to define the spectrally embedded models; the embedding handles bending, uniform scaling and rigid-body transformations.
4. Compute the correspondence between the two embedded models (a sketch of the pipeline follows).
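A compact sketch of steps 1–4 above, assuming meshes small enough for full eigendecompositions. The Gaussian-width heuristic, the embedding dimension k, the EΛ^{-1/2} scaling taken from Slide 25, and the optimal-assignment matching are illustrative assumptions (eigenvector sign and ordering ambiguities are ignored here).

```python
import numpy as np
from scipy.optimize import linear_sum_assignment

def spectral_embed(D, k=5, sigma=None):
    """Embed vertices as rows of E Lambda^{-1/2}, built from Gaussian
    affinities of a geodesic distance matrix D (steps 2-3)."""
    sigma = D.mean() if sigma is None else sigma
    W = np.exp(-D ** 2 / (2 * sigma ** 2))
    lam, E = np.linalg.eigh(W)
    lam, E = lam[::-1][:k], E[:, ::-1][:, :k]    # k largest eigenpairs
    return E / np.sqrt(lam)

def correspond(D1, D2, k=5):
    """Step 4: match vertices of the two embedded meshes by minimizing the
    total distance between matched embedded points."""
    X1, X2 = spectral_embed(D1, k), spectral_embed(D2, k)
    cost = np.linalg.norm(X1[:, None, :] - X2[None, :, :], axis=-1)
    rows, cols = linear_sum_assignment(cost)     # vertex i of mesh 1 <-> vertex cols[i] of mesh 2
    return cols
```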
Slide 37: Mesh Segmentation (details)
1. Given a model M, define distances between each pair of faces (in some suitable way) and store them in a matrix D.
2. Convert D → W.
3. Compute the eigenvalue decomposition of W and use the eigenvectors to spectrally embed the faces.
4. Cluster the embedded faces (k-means); each cluster corresponds to a segment of the original model (a sketch follows).
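A matching sketch for the segmentation pipeline, with the same caveats; SciPy's `kmeans2` stands in for whatever clustering variant the authors used, and D is assumed to already hold the pairwise face distances from step 1.

```python
import numpy as np
from scipy.cluster.vq import kmeans2

def spectral_segment(D, n_segments, k=5, sigma=None):
    """Steps 2-4: Gaussian affinities, spectral embedding of the faces,
    then k-means on the embedded faces; returns a segment id per face."""
    sigma = D.mean() if sigma is None else sigma
    W = np.exp(-D ** 2 / (2 * sigma ** 2))
    lam, E = np.linalg.eigh(W)
    lam, E = lam[::-1][:k], E[:, ::-1][:, :k]   # k largest eigenpairs
    X = E / np.sqrt(lam)                        # one embedded point per face
    _, labels = kmeans2(X, n_segments, minit='++')
    return labels
```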
Slide 38: Γ and Schur Complement
Γ (to be maximized): given any two sampling sets S_1 and S_2, S_1 is superior to S_2 iff Γ(S_1) > Γ(S_2). Efficient to compute.
Schur complement SC = C - B^T A^{-1} B (to be minimized): S_1 is superior to S_2 iff ||SC(S_1)||_F < ||SC(S_2)||_F. Very expensive to compute.