
Statistics and Shape Analysis


1 Statistics and Shape Analysis
By Marc Sobel

2 Shape similarity
Humans recognize shapes via both local and global features: (i) matching local features between shapes, such as curvature and distance to the centroid, can be statistically modeled by building statistics and parameters that reflect the match; (ii) matching the relationship between global features of the shapes (e.g., are they both apples or not?).
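
As a concrete illustration of such local features, here is a minimal Python sketch that computes, for each vertex of a closed polygonal shape, a turning-angle curvature estimate and the distance to the centroid. The function name and the particular curvature estimate are our own illustrative choices, not taken from the presentation.

import numpy as np

def local_features(vertices):
    """Per-vertex local features for a closed polygonal shape.
    Returns one row per vertex: [turning-angle curvature estimate,
    distance to the shape centroid]."""
    v = np.asarray(vertices, dtype=float)          # (n, 2) ordered contour points
    centroid = v.mean(axis=0)
    dist_to_centroid = np.linalg.norm(v - centroid, axis=1)

    e_in = v - np.roll(v, 1, axis=0)               # incoming edge at each vertex
    e_out = np.roll(v, -1, axis=0) - v             # outgoing edge at each vertex
    cross = e_in[:, 0] * e_out[:, 1] - e_in[:, 1] * e_out[:, 0]
    dot = np.sum(e_in * e_out, axis=1)
    curvature = np.arctan2(cross, dot)             # signed turning angle at the vertex
    return np.column_stack([curvature, dist_to_centroid])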

3 Incorporating both local and global features in shape matching
How can we incorporate both local and global features in shape matching? An obvious paradigm is to model global features as governed by priors, and local features given global features as a likelihood.
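
In symbols (a sketch of the paradigm in standard Bayesian notation, not a formula taken from the slides): writing G for the global features (our notation) and C for the set of correspondences carrying the local feature information, the posterior factors as p(C, G | data) ∝ p(data | C, G) · p(C | G) · p(G), so the global level enters through the prior terms and the local level through the likelihood.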

4 Definitions and Notation
Let u1,…,un be the vertices of one shape and v1,…,vm the vertices of another shape. We would like to build correspondences between the vertices which properly reflect the relationship between the shapes. We use the notation (ui,vj) for a correspondence of this type, and the term particle for a set of such correspondences. Let Xi,l be the l'th local feature measure for vertex i of the first shape and Yj,l the l'th local feature measure for vertex j of the second shape. For now assume these feature measures are observed. We would like to build a particle which reflects the local and global features of interest.
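
A minimal sketch of this notation as data structures (the names are hypothetical, chosen only for illustration): a correspondence is an index pair (i, j), a particle is a set of such pairs, and the feature measures are rows of two matrices X and Y.

from typing import List, Tuple
import numpy as np

Correspondence = Tuple[int, int]   # (i, j): vertex u_i matched to vertex v_j
Particle = List[Correspondence]    # a particle = a set of correspondences

n, m, L = 30, 40, 2                # numbers of vertices and of local features
X = np.zeros((n, L))               # X[i, l] = l-th local feature of u_i (observed)
Y = np.zeros((m, L))               # Y[j, l] = l-th local feature of v_j (observed)

particle: Particle = [(0, 0), (3, 4), (7, 9)]   # one hypothetical particle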

5 Contiguity: An important global feature.
If one shape results from the other via rotation and scaling, then the order of the shape 1 correspondence points should match the order of the shape 2 correspondence points: i.e., if (i1,j1) is one correspondence and (i2,j2) is another, then either i1<i2 and j1<j2, or i1>i2 and j1>j2. We can incorporate this condition into a prior.
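
The contiguity condition is easy to check in code; a small sketch (our own helper, not from the slides):

def is_contiguous(correspondences):
    """True if every pair of correspondences is order-preserving: for (i1, j1)
    and (i2, j2), either i1 < i2 and j1 < j2, or i1 > i2 and j1 > j2."""
    for a, (i1, j1) in enumerate(correspondences):
        for (i2, j2) in correspondences[a + 1:]:
            if not ((i1 < i2 and j1 < j2) or (i1 > i2 and j1 > j2)):
                return False
    return True

print(is_contiguous([(1, 2), (3, 5), (6, 7)]))   # True: orders agree
print(is_contiguous([(1, 2), (3, 1)]))           # False: order is reversed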

6 Notation: We have that:

7 Simple Likelihood
Based on the observed features we form weight statistics: let W denote the weight matrix associated with the features. Therefore, given that a correspondence 'C' belongs to the 'true' set of correspondences, we write the simple likelihood in the form:
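
The slide's likelihood formula itself is not reproduced in this transcript. Purely as an illustration of how a weight matrix W can be built from the observed features, one common stand-in is a Gaussian kernel on feature differences:

import numpy as np

def weight_matrix(X, Y, sigma=1.0):
    """Illustrative weight matrix: W[i, j] is large when the local features of
    vertex i (shape 1) and vertex j (shape 2) agree.  The Gaussian kernel here
    is our own stand-in; the slide's actual formula is not in the transcript."""
    diff = X[:, None, :] - Y[None, :, :]        # (n, m, L) pairwise feature differences
    sq_dist = np.sum(diff ** 2, axis=-1)        # squared feature distance for each pair
    return np.exp(-sq_dist / (2.0 * sigma ** 2))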

8 Complicated Likelihoods
At stage t, putting ω as the parameter, we define the likelihood:

9 Simple and Complicated Priors
Model a prior for all sets of correspondences which are strongly contiguous:
a) a simple prior (we use ω for the weight variable);
b) a complicated prior, which I] gives more weight to diagonals than to other correspondences, and II] can be defined sequentially, based on the fact that

10 Complicated Prior
Put
with 'DIAG[i,j]' referring to the positively oriented diagonal to which (i,j) belongs.
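
A possible reading of DIAG[i,j] (an assumption on our part: we take the positively oriented diagonal through (i, j) to be the set of grid cells sharing the value i − j):

def diag_cells(i, j, n, m):
    """Cells of the n x m correspondence grid on the positively oriented
    diagonal through (i, j), read here as all (a, b) with a - b == i - j.
    (This reading of DIAG[i, j] is our assumption, not stated on the slide.)"""
    d = i - j
    return [(a, a - d) for a in range(n) if 0 <= a - d < m]

print(diag_cells(2, 1, 4, 4))   # [(1, 0), (2, 1), (3, 2)]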

11 Simulating the Posterior Distribution: Simple Prior
We would like to simulate the posterior distribution of contiguous correspondences. We do this by calculating the weights:
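
The weight formula itself is not in the transcript. The sketch below only illustrates the general pattern suggested by the slide: propose contiguous correspondence sets, score each by the product of its entries in the weight matrix W, and normalize to get posterior weights. The proposal scheme and the omission of a proposal-correction term are our own simplifications.

import numpy as np

def propose_contiguous(n, m, rng):
    """Propose one contiguous (order-preserving) correspondence set via a
    random monotone walk through the n x m grid of possible matches."""
    corr, i, j = [], 0, 0
    while i < n and j < m:
        move = rng.integers(3)        # 0: match (i, j); 1: skip vertex i; 2: skip vertex j
        if move == 0:
            corr.append((i, j))
            i, j = i + 1, j + 1
        elif move == 1:
            i += 1
        else:
            j += 1
    return corr

def simulate_posterior(W, n_particles=500, seed=0):
    """Self-normalized weighting sketch over contiguous correspondence sets."""
    rng = np.random.default_rng(seed)
    n, m = W.shape
    particles = [propose_contiguous(n, m, rng) for _ in range(n_particles)]
    weights = np.array([np.prod([W[i, j] for (i, j) in p]) if p else 0.0
                        for p in particles])
    return particles, weights / weights.sum()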

12 Simulating the Posterior Distribution: Complicated Prior
Here we simulate:

13 A Simpler Model
Define the posterior probabilities:
For parameter λ, described below.

14 Weights for the simpler model
The weights for the simpler model are particularly easy to compute: choosing λ to tend to infinity appropriately, we get convergence to the MAP estimator of the simple particle filter.
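
One illustrative way to see the role of λ (our own reading; the slide's formula is not in the transcript): if each particle's weight is raised to the power λ and renormalized, the mass concentrates on the highest-weight particle as λ grows, which is how a MAP-type estimate is recovered in the limit.

import numpy as np

def tempered_weights(weights, lam):
    """Raise particle weights to the power lam and renormalize (illustrative).
    As lam grows, the distribution concentrates on the argmax, i.e. the MAP
    particle of the simple filter."""
    w = np.asarray(weights, dtype=float) ** lam
    return w / w.sum()

w = [0.2, 0.5, 0.3]
for lam in (1, 5, 50):
    print(lam, np.round(tempered_weights(w, lam), 4))
# nearly all mass ends up on the particle with the largest weight (0.5)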

15 Shape Similarity: A more complicated model employing curvature and distance parameters
We have:

16 Simple Likelihood
Based on the observed features we form weight parameters: let W denote the weight matrix associated with the features. Therefore, given that a correspondence 'C' belongs to the 'true' set of correspondences, we write the likelihood in the form:

17 Particle Likelihood
We write the likelihood in the form:

18 Particle Prior
We assume standard priors for the mu's and nu's. We also assume a prior for the set of contiguous correspondences. The particle is updated as follows: define,
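
The update formula is not in the transcript; as a generic illustration only, a standard sequential importance resampling update multiplies in the new likelihood term for each particle, renormalizes, and resamples:

import numpy as np

def particle_update(particles, weights, incremental_lik, rng):
    """One generic SIR-style stage (a standard pattern, not the slides' exact
    definition): reweight each particle by its new likelihood term,
    renormalize, then resample with replacement."""
    w = np.array([wt * incremental_lik(p) for wt, p in zip(weights, particles)])
    w = w / w.sum()
    idx = rng.choice(len(particles), size=len(particles), p=w)
    resampled = [particles[k] for k in idx]
    uniform = np.full(len(particles), 1.0 / len(particles))
    return resampled, uniform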

19 Particle Prior
At stage t we have particles,
Their weights are given by:

