1
Alignment: Introduction. Notes courtesy of Funk et al., SIGGRAPH 2004.
2
Outline: Challenge General Approaches Specific Examples
3
Alignment Challenge: The shape of a model does not change when acted on by similarity transformations: scale, rotation, and translation.
4
Alignment Challenge: However, the shape descriptors can change if the model is: –Translated –Scaled –Rotated How do we match shape descriptors across transformations that do not change the shape of a model?
5
Outline: Challenge General Approaches Specific Examples
6
Alignment Approaches: Given the shape descriptors of two models, find the transformation(s) -- translation, scale, and rotation -- that minimize the distance between the two models: –Exhaustive search –Closed form solution –Minimization –Normalization –Invariance
7
Alignment Aside: Because translations and rotations preserve distances, applying such a transformation to one model is equivalent to applying the inverse transformation to the other one: D(T(P), Q) = D(P, T⁻¹(Q)).
8
Alignment Aside: For translations and rotations we can therefore simplify the alignment equation: minₜ D(T(P), Q) = minₜ D(P, T(Q)), so it suffices to search over transformations of one model only.
9
Exhaustive Search Approach: 1. Compare the descriptors at all possible transformations. 2. Find the transformation at which the distance is minimal. 3. Define the model similarity as the value at the minimum.
10
Exhaustive Search: (Figure: exhaustive search for the optimal rotation.)
12
Exhaustive Search Properties: –Always gives the correct answer. –Needs to be performed at run-time and can be very slow to compute: it computes the measure of similarity for every transformation, while we only need the value at the best one.
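As a concrete illustration, here is a minimal sketch of exhaustive search over a single rotational degree of freedom, assuming the descriptors are arrays of values sampled around a circle (so a rotation is a cyclic shift); the descriptor values below are made up for the example:

```python
import numpy as np

def exhaustive_rotation_distance(f, g):
    """Compare two circular descriptors at every discrete rotation
    (cyclic shift) and return the minimal L2 distance and the shift
    achieving it -- a 1-D stand-in for search over 3-D rotations."""
    best_dist, best_shift = float("inf"), 0
    for s in range(len(g)):
        d = np.linalg.norm(f - np.roll(g, s))
        if d < best_dist:
            best_dist, best_shift = d, s
    return best_dist, best_shift

# g is f rotated by 3 samples; exhaustive search recovers the shift.
f = np.array([0.0, 1.0, 4.0, 9.0, 4.0, 1.0])
g = np.roll(f, -3)
dist, shift = exhaustive_rotation_distance(f, g)
```

Note the cost: the distance is evaluated at every sampled transformation, even though only the minimizing one is needed.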
13
Closed Form Solution Approach: Explicitly find the transformation(s) that solve the minimization: T* = argminₜ D(T(P), Q). Properties: –Always gives the correct answer. –Only computes the measure of similarity for the best transformation. –A closed form solution does not always exist. –Often needs to be computed at run-time.
14
Minimization Approach: Coarsely align the models using low frequency information. Progressively refine the alignment by comparing higher frequency components and adjusting the alignment. Converge to the (locally) optimal alignment. Examples: Light Field Descriptors, the Spherical Extent Function.
15
Minimization: (Figure: initial models, their low-frequency components, and the aligned models.)
16
Minimization: (Figure: initial models decomposed into low- and high-frequency components, then aligned.)
17
Minimization Properties: –Can be applied to any type of transformation. –Needs to be computed at run-time. –Difficult to do robustly: given the low-frequency alignment and the computed high-frequency alignment, how do you combine the two? Considerations include the relative size of the high- and low-frequency information, the distribution of information across the low frequencies, and the speed of oscillation.
18
Normalization Approach: Place every model into a canonical coordinate frame, and assume that two models are optimally aligned when each is in its own canonical frame. Examples: center of mass (translation), maximum radius (scale), PCA (rotation).
19
Normalization Properties: –Can be computed in a pre-processing stage. –For some transformations this is guaranteed to give the optimal alignment. –For other transformations the approach is only a heuristic and may fail. (Figure: failure of PCA normalization in aligning rotations.)
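A minimal sketch of the translation and scale parts of normalization, assuming a model given as an n×3 array of points (center of mass fixes translation, maximum radius fixes scale; rotation would additionally be normalized with PCA):

```python
import numpy as np

def normalize_pose(points):
    """Translate so the center of mass is at the origin, then scale
    so the maximum radius is 1 -- a sketch of COM / max-radius
    normalization for a point-sampled model."""
    centered = points - points.mean(axis=0)
    radius = np.linalg.norm(centered, axis=1).max()
    return centered / radius

# Two copies of the same shape, one translated and scaled:
# after normalization they coincide.
shape = np.array([[1.0, 0.0, 0.0], [0.0, 2.0, 0.0], [0.0, 0.0, 3.0]])
moved = 2.5 * shape + np.array([4.0, -1.0, 7.0])
a, b = normalize_pose(shape), normalize_pose(moved)
```

For translation and scale this canonical frame is guaranteed optimal; for rotation (PCA) it is only a heuristic, as the figure caption above notes.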
20
Invariance Approach: Represent every model by a descriptor that is unchanged when the model is transformed, by discarding the information that is transformation-dependent (with respect to scale, rotation, and/or translation).
21
Invariance Review: Is there a general method for addressing these basic types of transformations?

Descriptor                   | Translation | Scale | Rotation
Shape Distributions (D2)     |      +      |   -   |    +
Extended Gaussian Images     |      +      |   -   |    -
Shape Histograms (Shells)    |      -      |   -   |    +
Shape Histograms (Sectors)   |      -      |   +   |    -
Spherical Parameterizations  |      +      |   -   |    -
22
Invariance Properties: –Can be computed in a pre-processing stage. –Works for translation, scale and rotation. –Gives a more compact representation. –Tends to discard valuable, discriminating information. (Figure: example descriptors with no invariance; rotation invariance; translation + rotation invariance; rotation invariance.)
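For example, the Shape Distributions (D2) descriptor from the table achieves translation and rotation invariance by histogramming pairwise point distances, which those transformations leave unchanged. A sketch, with the bin count and distance range chosen arbitrarily for the example:

```python
import numpy as np

def d2_descriptor(points, bins=16, max_dist=8.0):
    """Shape Distributions (D2): a normalized histogram of distances
    between pairs of points.  Pairwise distances are unchanged by
    translation and rotation, so the descriptor is invariant to both
    (but not to scale)."""
    diffs = points[:, None, :] - points[None, :, :]
    dists = np.linalg.norm(diffs, axis=-1)
    i, j = np.triu_indices(len(points), k=1)   # each pair once
    hist, _ = np.histogram(dists[i, j], bins=bins, range=(0.0, max_dist))
    return hist / hist.sum()

rng = np.random.default_rng(0)
pts = rng.normal(size=(50, 3))
# Rotate about z and translate: the descriptor is (nearly) unchanged.
c, s = np.cos(0.7), np.sin(0.7)
Rz = np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])
moved = pts @ Rz.T + np.array([3.0, -2.0, 1.0])
desc_a = d2_descriptor(pts)
desc_b = d2_descriptor(moved)
```

Note the trade-off from the property list: the histogram is compact, but collapsing the geometry to a distance distribution discards discriminating information.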
23
Outline: Challenge General Approaches Specific Examples –Normalization: PCA –Closed Form Solution: Ordered Point Sets
24
PCA Alignment: Treat a surface as a collection of points {p₁, …, pₙ} with center of mass c, and define the variance in direction v (with ‖v‖ = 1) as: Var(v) = Σᵢ ⟨pᵢ − c, v⟩².
25
PCA Alignment: Define the covariance matrix M: M = Σᵢ (pᵢ − c)(pᵢ − c)ᵗ. Find the eigenvalues and eigenvectors of M, and align so that the eigenvectors, sorted by decreasing eigenvalue, map to the x-, y-, and z-axes.
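A sketch of this construction for a point-sampled model, using numpy's symmetric eigensolver (the sign ambiguity of the eigenvectors is deliberately left unresolved here):

```python
import numpy as np

def pca_align(points):
    """Rotate a centered point set so the principal axes of its
    covariance matrix map to the x-, y-, and z-axes, largest
    eigenvalue first.  Eigenvector signs are left unresolved."""
    centered = points - points.mean(axis=0)
    M = centered.T @ centered / len(points)   # 3x3 covariance matrix
    eigvals, eigvecs = np.linalg.eigh(M)      # ascending eigenvalues
    order = np.argsort(eigvals)[::-1]         # largest first
    return centered @ eigvecs[:, order]

# An anisotropic cloud: after alignment the variance decreases
# along x, y, z and the covariance matrix is diagonal.
rng = np.random.default_rng(1)
pts = rng.normal(size=(200, 3)) * np.array([5.0, 2.0, 0.5])
aligned = pca_align(pts)
var = aligned.var(axis=0)
```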
26
PCA Alignment Limitations: –Eigenvectors are only defined up to sign! PCA alignment is only well-defined up to axial flips about the x-, y-, and z-axes.
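One simple way to handle the sign ambiguity, sketched here for two corresponding point sets that are assumed already PCA-aligned, is to enumerate all 8 axial flips (sign choices for x, y, z) and keep the best:

```python
import numpy as np
from itertools import product

def best_axial_flip(p, q):
    """Try all 8 axial flips of q and return the sign choice that
    brings it closest to p, together with the residual error."""
    best_err, best_flip = float("inf"), None
    for signs in product([1.0, -1.0], repeat=3):
        err = np.linalg.norm(p - q * np.array(signs))
        if err < best_err:
            best_err, best_flip = err, signs
    return best_flip, best_err

p = np.array([[1.0, 2.0, 3.0], [4.0, 5.0, 6.0]])
q = p * np.array([-1.0, 1.0, -1.0])   # p with x and z flipped
flip, err = best_axial_flip(p, q)
```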
27
PCA Alignment Limitations: –Assumes that the eigenvalues are distinct, so that the eigenvectors are well-defined (up to sign). This is not always true, which can make PCA alignment unstable.
28
Outline: Challenge General Approaches Specific Examples –Normalization: PCA –Closed Form Solution: Ordered Point Sets
29
Ordered Point Sets Challenge: Given ordered point sets P = {p₁, …, pₙ} and Q = {q₁, …, qₙ}, find the rotation/reflection R minimizing the sum of squared differences: Σᵢ ‖pᵢ − R(qᵢ)‖². (Figure: the points qᵢ rotated by R onto the corresponding points pᵢ.)
30
Review, Vector dot-product: If v = (v₁, …, vₙ) and w = (w₁, …, wₙ) are two n-dimensional vectors, the dot-product of v with w is the sum of the products of the coefficients: ⟨v, w⟩ = Σᵢ vᵢwᵢ.
31
Review, Trace: The trace of an n×n matrix M is the sum of the diagonal entries of M: Trace(M) = Σᵢ Mᵢᵢ. Properties: –Trace(MN) = Trace(NM), i.e. the trace is invariant under cyclic permutation of a product.
32
Review, Trace: If M is any n×n matrix and D is a diagonal n×n matrix, then the trace of MD is the sum of the products of the diagonal entries: Trace(MD) = Σᵢ Mᵢᵢ Dᵢᵢ.
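A quick numeric check of this identity on random matrices:

```python
import numpy as np

# For any M and diagonal D, Trace(MD) depends only on M's diagonal:
# Trace(MD) = sum_i M_ii * D_ii.
rng = np.random.default_rng(2)
M = rng.normal(size=(4, 4))
D = np.diag(rng.normal(size=4))
lhs = np.trace(M @ D)
rhs = np.sum(np.diag(M) * np.diag(D))
```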
33
Review, Matrix multiplication: If M and N are two matrices (of sizes m×n and n×k), then the (i,j)-th entry of the matrix MN is the dot-product of the i-th row vector of M with the j-th column vector of N.
34
Review, Matrix dot-product: If M and N are two m×n matrices, then the i-th diagonal entry of the matrix MᵗN is the dot-product of the i-th column vector of M with the i-th column vector of N: (MᵗN)ᵢᵢ = ⟨Mᵢ, Nᵢ⟩, where Mᵢ and Nᵢ denote the i-th columns.
35
Review, Matrix dot-product: We define the dot-product of two m×n matrices M and N to be the trace of the matrix product: ⟨M, N⟩ = Trace(MᵗN) = Σᵢ ⟨Mᵢ, Nᵢ⟩ (the sum of the dot-products of the column vectors).
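A quick numeric check that the trace form agrees with the column-wise sum:

```python
import numpy as np

# <M, N> = Trace(M^t N) = sum of dot-products of corresponding columns.
rng = np.random.default_rng(3)
M = rng.normal(size=(5, 3))
N = rng.normal(size=(5, 3))
via_trace = np.trace(M.T @ N)
via_columns = sum(M[:, i] @ N[:, i] for i in range(3))
```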
36
Review, SVD Factorization: If M is an m×m matrix, then M can be factored as the product: M = UDVᵗ, where D is a diagonal m×m matrix with non-negative entries and U and V are orthonormal (i.e. rotation/reflection) m×m matrices.
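Numerically, `numpy.linalg.svd` produces exactly this factorization (it returns Vᵗ rather than V):

```python
import numpy as np

# Factor a random square matrix as U D V^t and verify the properties.
rng = np.random.default_rng(4)
M = rng.normal(size=(3, 3))
U, d, Vt = np.linalg.svd(M)          # d holds the diagonal of D
reconstructed = U @ np.diag(d) @ Vt
```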
37
Trace Maximization Claim: If M is an m×m matrix whose SVD factorization is M = UDVᵗ, then R = VUᵗ is the orthonormal transformation maximizing the trace: Trace(RM).
38
Trace Maximization Proof: We can rewrite the trace equation as: Trace(RM) = Trace(RUDVᵗ) = Trace(VᵗRUD). If we set R₀ to be the orthonormal matrix R₀ = VᵗRU, we get: Trace(RM) = Trace(R₀D).
39
Trace Maximization Proof: Since R₀ is orthonormal, each of its entries can have absolute value no larger than one. Since D is diagonal, the value of the trace is the sum of the products of the diagonal entries of D and R₀: Trace(R₀D) = Σᵢ (R₀)ᵢᵢ Dᵢᵢ.
40
Trace Maximization Proof: To maximize the trace we want R₀ to be maximal on the diagonal (i.e. have only 1's there). Thus R₀ is the identity matrix: VᵗRU = I. So the rotation/reflection R that maximizes Trace(RM) is: R = VUᵗ.
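A numeric sanity check of the claim: R = VUᵗ should achieve at least as large a trace as any other orthonormal matrix, here compared against a batch of random ones:

```python
import numpy as np

def trace_maximizing_rotation(M):
    """Return the orthonormal R = V U^t maximizing Trace(RM),
    where M = U D V^t is the SVD of M."""
    U, _, Vt = np.linalg.svd(M)      # Vt is V^t
    return Vt.T @ U.T                # V U^t

rng = np.random.default_rng(5)
M = rng.normal(size=(3, 3))
R_best = trace_maximizing_rotation(M)
best = np.trace(R_best @ M)
others = []
for _ in range(100):
    Q, _ = np.linalg.qr(rng.normal(size=(3, 3)))  # random orthonormal
    others.append(np.trace(Q @ M))
```

By the proof above, the maximal value `best` equals the sum of the singular values of M.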
41
Ordered Point Sets Challenge: Given ordered point sets P = {p₁, …, pₙ} and Q = {q₁, …, qₙ}, find the rotation/reflection R minimizing the sum of squared differences: Σᵢ ‖pᵢ − R(qᵢ)‖².
42
Ordered Point Sets Solution: Use the fact that we can expand the difference: ‖pᵢ − R(qᵢ)‖² = ‖pᵢ‖² − 2⟨pᵢ, R(qᵢ)⟩ + ‖R(qᵢ)‖² to rewrite the equation as: Σᵢ (‖pᵢ‖² − 2⟨pᵢ, R(qᵢ)⟩ + ‖R(qᵢ)‖²).
43
Ordered Point Sets Solution: Use the fact that rotations preserve lengths: ‖R(qᵢ)‖ = ‖qᵢ‖ to rewrite the equation as: Σᵢ (‖pᵢ‖² + ‖qᵢ‖² − 2⟨pᵢ, R(qᵢ)⟩).
44
Ordered Point Sets Solution: Since the value: Σᵢ (‖pᵢ‖² + ‖qᵢ‖²) does not depend on the choice of rotation R, minimizing the sum of squared distances is equivalent to maximizing the sum of dot-products: Σᵢ ⟨pᵢ, R(qᵢ)⟩.
45
Ordered Point Sets Solution: If we let M_P (respectively M_Q) be the 3×n matrix whose columns are the points pᵢ (respectively qᵢ), then we can rewrite the sum of the vector dot-products as the matrix dot-product: Σᵢ ⟨pᵢ, R(qᵢ)⟩ = ⟨M_P, R·M_Q⟩ = Trace(M_Pᵗ R M_Q) = Trace(R M_Q M_Pᵗ), and we can find the maximizing rotation/reflection R by trace maximization applied to M = M_Q M_Pᵗ.
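Putting the whole derivation together gives the closed-form solution (the construction underlying the classical Kabsch/Procrustes alignment); a sketch assuming P and Q are n×3 arrays of corresponding points:

```python
import numpy as np

def align_ordered_points(P, Q):
    """Find the orthonormal R minimizing sum_i ||p_i - R q_i||^2 for
    ordered point sets, via trace maximization: build M = M_Q M_P^t,
    take its SVD M = U D V^t, and return R = V U^t."""
    MP = P.T                    # 3xn matrix whose columns are the p_i
    MQ = Q.T                    # 3xn matrix whose columns are the q_i
    M = MQ @ MP.T               # Trace(R M) = sum_i <p_i, R q_i>
    U, _, Vt = np.linalg.svd(M)
    return Vt.T @ U.T           # V U^t

# Recover a known rotation from corresponding point sets.
rng = np.random.default_rng(6)
P = rng.normal(size=(10, 3))
c, s = np.cos(0.9), np.sin(0.9)
R_true = np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])
Q = P @ R_true                  # rows q_i = R_true^t p_i
R_est = align_ordered_points(P, Q)
```

Because the solution is a single SVD of a 3×3 matrix, this closed form avoids the run-time cost of exhaustive search entirely.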