Published by Denis Perkins. Modified over 9 years ago.
1
Shape Analysis and Retrieval Statistical Shape Descriptors Notes courtesy of Funk et al., SIGGRAPH 2004
2
Outline: Shape Descriptors Statistical Shape Descriptors Singular Value Decomposition (SVD)
3
Shape Matching General approach: Define a function D that takes in two models and returns a measure of their proximity. Example: D(M_1, M_2) < D(M_1, M_3) means M_1 is closer to M_2 than it is to M_3.
4
Shape Descriptors Shape Descriptor: A structured abstraction of a 3D model that is well suited to the challenges of shape matching.
5-8
Matching with Descriptors Preprocessing: Compute database descriptors. Run-Time: Compute the query descriptor, compare it to the database descriptors, and return the best match(es).
9-12
Shape Matching Challenge Need shape descriptor that is: concise to store, quick to compute, efficient to match, and discriminating.
13-17
Shape Matching Challenge Need shape descriptor that is also: invariant to transformations (translation, scale, rotation, mirror); invariant to deformations (different articulated poses); insensitive to noise (e.g., scanned surfaces; image courtesy of Ramamoorthi et al.); insensitive to topology (different tessellations, different genus; images courtesy of Viewpoint & Stanford); and robust to degeneracies (e.g., a model with no bottom; images courtesy of Utah & De Espona).
18
Outline: Shape Descriptors Statistical Shape Descriptors Singular Value Decomposition (SVD)
19
Statistical Shape Descriptors Challenge: Want a simple shape descriptor that is easy to compare and gives a continuous measure of the similarity between two models. Solution: Represent each model by a vector and define the distance between models as the distance between corresponding vectors.
20
Statistical Shape Descriptors Properties: –Structured representation –Easy to compare –Generalizes the matching problem Models represented as points in a fixed dimensional vector space
21
Statistical Shape Descriptors General Approaches: –Retrieval –Clustering –Compression –Hierarchical representation Models represented as points in a fixed dimensional vector space
22
Outline: Shape Descriptors Statistical Shape Descriptors Singular Value Decomposition (SVD)
23
Complexity of Retrieval Given a query Q: Compute the distance D(Q, M_i) to each database model. Sort the database models by proximity. Return the closest matches.
24
Complexity of Retrieval If there are k models in the database and each model is represented by an n-dimensional vector: Computing the distance to each database model takes O(k·n) time. Sorting the database models by proximity takes O(k log k) time. If n is large, retrieval will be prohibitively slow.
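As a sketch of this brute-force pipeline (illustrative names and toy data, not from the slides), the O(k·n) distance pass and O(k log k) sort might look like:

```python
import numpy as np

def retrieve(query, database, top=3):
    """Brute-force retrieval: an O(k*n) distance pass plus an O(k log k) sort."""
    dists = np.linalg.norm(database - query, axis=1)  # distance to each of k models
    order = np.argsort(dists)                         # sort models by proximity
    return order[:top], dists[order[:top]]

# Toy database: k = 5 models, each an n = 4 dimensional descriptor.
db = np.array([[1., 0., 0., 0.],
               [0., 1., 0., 0.],
               [1., 1., 0., 0.],
               [0., 0., 1., 0.],
               [2., 0., 0., 0.]])
idx, d = retrieve(np.array([1., 0., 0., 0.]), db, top=2)
```

For large n, every query pays the full O(k·n) cost, which motivates the dimension-reduction machinery that follows.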
25
Algebra Definition: Given a vector space V and a subspace W ⊆ V, the projection onto W, written π_W, is the map that sends each v ∈ V to the nearest vector in W. If {w_1, …, w_m} is an orthonormal basis for W, then: π_W(v) = ⟨v, w_1⟩ w_1 + … + ⟨v, w_m⟩ w_m
26
Tensor Algebra Definition: The inner product of two n-dimensional vectors v = (v_1, …, v_n) and w = (w_1, …, w_n), written ⟨v, w⟩, is the scalar value defined by: ⟨v, w⟩ = v_1 w_1 + … + v_n w_n
27
Tensor Algebra Definition: The outer product of two n-dimensional vectors v = (v_1, …, v_n) and w = (w_1, …, w_n), written v ⊗ w, is the n×n matrix defined by: (v ⊗ w)_{ij} = v_i w_j
28
Tensor Algebra Definition: The transpose of an m×n matrix M, written M^t, is the n×m matrix with: (M^t)_{ij} = M_{ji}. Property: For any two vectors v and w, the transpose has the property: ⟨Mv, w⟩ = ⟨v, M^t w⟩
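A quick numerical check of these three definitions, using toy vectors in numpy:

```python
import numpy as np

v = np.array([1., 2., 3.])
w = np.array([4., 0., -1.])
M = np.array([[1., 2., 0.],
              [0., 1., 3.],
              [2., 0., 1.]])

inner = v @ w           # <v, w> = v_1*w_1 + ... + v_n*w_n
outer = np.outer(v, w)  # (v (x) w)_ij = v_i * w_j, an n-by-n matrix

# Transpose property: <Mv, w> = <v, M^t w>
lhs = (M @ v) @ w
rhs = v @ (M.T @ w)
```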
29
SVD Compression Key Idea: Given a collection of vectors in n-dimensional space, find a good m-dimensional subspace (m<<n) in which to represent the vectors.
30
SVD Compression Specifically: If P = {p_1, …, p_k} is the initial n-dimensional point set, and {w_1, …, w_m} is an orthonormal basis for the m-dimensional subspace, we will compress the point set by sending: p_i → (⟨p_i, w_1⟩, …, ⟨p_i, w_m⟩)
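In code, this compression map sends each point to its m coefficients against the orthonormal basis. This is a sketch; the basis W below is a hypothetical stand-in for the SVD-derived one introduced later:

```python
import numpy as np

def compress(points, W):
    """Map each p_i to (<p_i, w_1>, ..., <p_i, w_m>).
    W is an (m, n) matrix whose rows are an orthonormal basis of the subspace."""
    return points @ W.T

# Example: project 3-D points onto the xy-plane (n = 3, m = 2).
W = np.array([[1., 0., 0.],
              [0., 1., 0.]])
P = np.array([[1., 2., 0.1],
              [3., -1., -0.2]])
coeffs = compress(P, W)  # each row now holds m = 2 coefficients
```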
31
SVD Compression Challenge: To find the m-dimensional subspace that best captures the initial point information.
32
Variance of the Point Set Given a collection of points P = {p_1, …, p_k} in an n-dimensional vector space, determine how the vectors are distributed across different directions.
33
Variance of the Point Set Define Var_P as the function: Var_P(v) = ⟨p_1, v⟩² + … + ⟨p_k, v⟩², giving the variance of the point set P in the direction v (assume ‖v‖ = 1).
34
Variance of the Point Set More generally, for a subspace W ⊆ V, define the variance of P in the subspace W as: Var_P(W) = ‖π_W(p_1)‖² + … + ‖π_W(p_k)‖². If {w_1, …, w_m} is an orthonormal basis for W, then: Var_P(W) = Var_P(w_1) + … + Var_P(w_m)
35
Variance of the Point Set Example: The variance in the direction v_1 is large, while the variance in the direction v_2 is small. If we want to compress down to one dimension, we should project the points onto v_1.
36
Covariance Matrix Definition: The covariance matrix M_P of a point set P = {p_1, …, p_k} is the symmetric matrix which is the sum of the outer products of the p_i: M_P = p_1 ⊗ p_1 + … + p_k ⊗ p_k
37
Covariance Matrix Theorem: The variance of the point set P in a direction v is equal to: Var_P(v) = ⟨v, M_P v⟩
38
Covariance Matrix Theorem: The variance of the point set P in a direction v is equal to: Var_P(v) = ⟨v, M_P v⟩. Proof: Var_P(v) = ⟨p_1, v⟩² + … + ⟨p_k, v⟩² = ⟨v, (p_1 ⊗ p_1) v⟩ + … + ⟨v, (p_k ⊗ p_k) v⟩ = ⟨v, M_P v⟩, using (p ⊗ p) v = ⟨p, v⟩ p.
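The theorem is easy to verify numerically on random toy points:

```python
import numpy as np

rng = np.random.default_rng(0)
P = rng.standard_normal((10, 4))       # k = 10 points in n = 4 dimensions
v = np.array([1., 0., 0., 0.])         # a unit direction

# Covariance matrix as a sum of outer products: M_P = sum_i p_i (x) p_i
M = sum(np.outer(p, p) for p in P)     # equivalently P.T @ P

var_direct = np.sum((P @ v) ** 2)      # Var_P(v) = sum_i <p_i, v>^2
var_matrix = v @ M @ v                 # <v, M_P v>
```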
39
Singular Value Decomposition Theorem: Every symmetric matrix M can be written as the product: M = O D O^t, where O is a rotation/reflection matrix (O O^t = Id) and D is a diagonal matrix with entries λ_1 ≥ λ_2 ≥ … ≥ λ_n.
40
Singular Value Decomposition Implication: Given a point set P, we can compute its covariance matrix M_P and express it in terms of the factorization: M_P = λ_1 (v_1 ⊗ v_1) + … + λ_n (v_n ⊗ v_n), where {v_1, …, v_n} is an orthonormal basis (the columns of O) and λ_i is the variance of the point set in the direction v_i.
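A sketch of this factorization with numpy, using `eigh` for the symmetric eigendecomposition (toy anisotropic data; the scaling vector is just an illustration):

```python
import numpy as np

rng = np.random.default_rng(1)
P = rng.standard_normal((50, 3)) * np.array([3., 1., 0.1])  # anisotropic cloud
M = P.T @ P                      # covariance matrix M_P

# Symmetric factorization M = O D O^t with orthonormal columns in O.
lam, O = np.linalg.eigh(M)       # eigh returns eigenvalues in ascending order
lam, O = lam[::-1], O[:, ::-1]   # reorder so lambda_1 >= lambda_2 >= ...

# lambda_i equals the variance of P in the direction v_i = O[:, i].
var_in_v1 = np.sum((P @ O[:, 0]) ** 2)
```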
41
Singular Value Decomposition Compression: The subspace spanned by {v_1, …, v_m} is the m-dimensional vector subspace that captures the most variance in the initial point set P. If m is too small, then too much information is discarded and there will be a loss in retrieval accuracy.
42
Singular Value Decomposition Hierarchical Matching: First coarsely compare the query to the database vectors. If the query is coarsely similar to the target, refine the comparison; otherwise, do not refine. O(k·n) matching becomes O(k·m) with m ≪ n and no loss of retrieval accuracy.
43
Singular Value Decomposition Hierarchical Matching: SVD expresses the initial vectors in terms of the eigenbasis: p = ⟨p, v_1⟩ v_1 + … + ⟨p, v_n⟩ v_n. Because there is more variance in v_1 than in v_2, more in v_2 than in v_3, and so on, this gives a hierarchical representation of the data, so that coarse comparisons can be performed by comparing only the first m coefficients.
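A sketch of the coarse-to-fine comparison (hypothetical helper, not from the slides; it assumes descriptors are already expressed in the eigenbasis with coordinates ordered by decreasing variance). Because dropping coordinates can only shrink a Euclidean distance, the coarse distance is a lower bound on the true one, so pruning with it never discards a match within the threshold:

```python
import numpy as np

def coarse_to_fine(query, database, m, threshold):
    """Compare on the first m coefficients; refine only the coarse matches."""
    results = []
    for i, target in enumerate(database):
        coarse = np.linalg.norm(query[:m] - target[:m])   # O(m) per model
        if coarse <= threshold:                           # lower bound on full dist
            results.append((i, np.linalg.norm(query - target)))  # O(n) refinement
    return sorted(results, key=lambda t: t[1])

db = np.array([[1., 0., 0.0, 0.],
               [0., 3., 0.0, 0.],
               [1., 0., 0.1, 0.]])
matches = coarse_to_fine(np.zeros(4), db, m=2, threshold=1.5)
```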
44
Efficient to match? Preprocessing: 1. Compute the SVD factorization. 2. Transform the database descriptors into the eigenbasis. Run-Time: 3. Transform the query.
45-54
Efficient to match? 4. Low-resolution sort. 5. Update the closest matches. 6. Resort. (The coarse distances from the query to the database models are progressively refined, and the sorted list is updated after each refinement.)
55
Singular Value Decomposition Theorem: Every symmetric matrix M can be written as the product: M = O D O^t, where O is a rotation/reflection matrix (O O^t = Id) and D is a diagonal matrix with entries λ_1 ≥ λ_2 ≥ … ≥ λ_n.
56
Singular Value Decomposition Proof: 1. Every symmetric matrix has at least one real eigenvector v. 2. If v is an eigenvector and w is perpendicular to v, then Mw is also perpendicular to v. Since M maps the subspace of vectors perpendicular to v back into itself, we can look at the restriction of M to that subspace and iterate to get the next eigenvector.
57
Singular Value Decomposition Proof (Step 1): Let F(v) be the function on the unit sphere (‖v‖ = 1) defined by: F(v) = ⟨Mv, v⟩
58
Singular Value Decomposition Proof (Step 1): Let F(v) be the function on the unit sphere (‖v‖ = 1) defined by: F(v) = ⟨Mv, v⟩. Then F must have a maximum at some point v_0 (the sphere is compact and F is continuous).
59
Singular Value Decomposition Proof (Step 1): Let F(v) be the function on the unit sphere (‖v‖ = 1) defined by: F(v) = ⟨Mv, v⟩. Then F must have a maximum at some point v_0. Then ∇F(v_0) = λ v_0 for some scalar λ.
60
Singular Value Decomposition If F has a maximum at some point v_0 then ∇F(v_0) = λ v_0. If w_0 is on the sphere, next to v_0, then w_0 − v_0 is nearly perpendicular to v_0. And for any small vector w_1 perpendicular to v_0, v_0 + w_1 is nearly on the sphere.
61
Singular Value Decomposition If F has a maximum at some point v_0 then ∇F(v_0) = λ v_0. For small values of w_0 close to v_0 on the sphere, we have: F(w_0) ≈ F(v_0) + ⟨∇F(v_0), w_0 − v_0⟩. For v_0 to be a maximum, we must have: ⟨∇F(v_0), w_0 − v_0⟩ ≤ 0 for all w_0 near v_0. Since w_0 − v_0 can point in either sign of any direction perpendicular to v_0, ∇F(v_0) must be perpendicular to all vectors that are perpendicular to v_0, and hence must itself be a multiple of v_0.
62
Singular Value Decomposition Proof (Step 1): Let F(v) be the function on the unit sphere (‖v‖ = 1) defined by: F(v) = ⟨Mv, v⟩. Then F must have a maximum at some point v_0. Then ∇F(v_0) = λ v_0.
63
Singular Value Decomposition Proof (Step 1): Let F(v) be the function on the unit sphere (‖v‖ = 1) defined by: F(v) = ⟨Mv, v⟩. Then F must have a maximum at some point v_0, with ∇F(v_0) = λ v_0. But ∇F(v) = 2Mv, so Mv_0 = (λ/2) v_0, and v_0 is an eigenvector of M.
64
Singular Value Decomposition Proof: 1.Every symmetric matrix has at least one eigenvector v. 2.If v is an eigenvector and w is perpendicular to v then Mw is also perpendicular to v.
65
Singular Value Decomposition Proof (Step 2): If w is perpendicular to v, then ⟨v, w⟩ = 0. Since M is symmetric: ⟨Mw, v⟩ = ⟨w, M^t v⟩ = ⟨w, Mv⟩ = λ ⟨w, v⟩ = 0, so Mw is also perpendicular to v.
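The proof is an existence argument, but the same maximization can be carried out numerically. Power iteration (a standard technique, not from the slides) repeatedly applies M and renormalizes; for a symmetric M it generically converges to the eigenvector of largest-magnitude eigenvalue, i.e. the maximizer of F(v) = ⟨Mv, v⟩ on the unit sphere:

```python
import numpy as np

def top_eigenvector(M, iters=200):
    """Power iteration: repeatedly apply M and renormalize."""
    v = np.ones(M.shape[0]) / np.sqrt(M.shape[0])  # arbitrary unit start vector
    for _ in range(iters):
        v = M @ v
        v /= np.linalg.norm(v)
    return v

M = np.array([[2., 1.],
              [1., 2.]])        # symmetric; eigenvalues 3 and 1
v0 = top_eigenvector(M)
lam = v0 @ M @ v0               # Rayleigh quotient F(v0) = <M v0, v0>
```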