
1 Using the Particle Filter Approach to Building Partial Correspondences between Shapes. Rolf Lakaemper, Marc Sobel, Temple University, Philadelphia, PA, USA

2 Part I: Motivation

3 The Goal: Finding correspondences between feature-points in two (similar) shapes

4 The Motivation: Shape recognition. Classically divided into three steps: 1. Finding correspondences (this talk), 2. Alignment, 3. Shape similarity

5 We want to handle partial correspondences between arbitrary point sets based on local descriptors and certain global constraints. This includes but is not limited to…

6 The simplest case: Closed boundaries vs. Closed boundaries

7 Advanced: Closed polygons vs. polygons representing parts

8 More advanced: Partial matching of unordered 2D point sets

9 …and unordered 3D point sets. A bad example due to insufficient constraints; later we'll see how to improve it.

10 These tasks can be described as an optimization problem. They will only differ in the way the global constraints are defined. We will present a Particle Filter (PF) based solution unifying these problems. The PF system is able to learn properties of the global constraints.

11 General Approach. [Framework diagram. Task: Local Constraints, Global Constraints (GC), GC update rule. PF Framework: Single Particle (a configuration of established correspondences), Particle Filtering.]

12 Part II: Illustration of the Approach using the simple example of correspondences between closed boundary curves

13 The example data: boundary polygons. Each boundary curve is uniformly subsampled, and each shape is represented by an ordered set of boundary points.

14 II.A Local Constraints

15 For each boundary point we compute local feature descriptors, eventually leading to a local correspondence matrix. This matrix describes the local constraints. [Framework diagram as on slide 11.]

16 Computation of local feature descriptors. As examples we use centroid distance and curvature. Remark: this research is not about optimal, new, and fancy local descriptors. On the contrary: for several reasons we use relatively weak descriptors.

17 Centroid distance (normalized so that the average distance = 1): the relative distance to the center of the polygon (the mean of the vertices); extendable to parts. Curvature: e.g. the turn angle.
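A minimal numpy sketch of the two descriptors, assuming each shape is given as an N×2 array of ordered boundary points (the function names and the exact formulations are illustrative, not the authors' code):

```python
import numpy as np

def centroid_distance(points):
    """Distance of every vertex to the polygon centroid (mean of the vertices),
    normalized so that the average distance is 1."""
    d = np.linalg.norm(points - points.mean(axis=0), axis=1)
    return d / d.mean()

def turn_angle(points):
    """Turning angle at every vertex of a closed polygon, one simple curvature proxy."""
    v_in = points - np.roll(points, 1, axis=0)     # incoming edge vectors
    v_out = np.roll(points, -1, axis=0) - points   # outgoing edge vectors
    a_in = np.arctan2(v_in[:, 1], v_in[:, 0])
    a_out = np.arctan2(v_out[:, 1], v_out[:, 0])
    return (a_out - a_in + np.pi) % (2 * np.pi) - np.pi   # wrap into (-pi, pi]
```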

18 Using each descriptor independently, we compute the correspondence probability between all pairs of points. The correspondence is computed in a symmetric way: –How likely is it that p_i in shape1 corresponds to q_k in shape2, relative to all points in shape2, AND –How likely is it that q_k in shape2 corresponds to p_i in shape1, relative to all points in shape1.

19 Example: centroid distance. Compute the correspondence matrix MD1 = [md1_ij], where u_i = centroid distance of point i in shape1, v_j = centroid distance of point j in shape2, and G_σ = Gaussian distribution with standard deviation σ, applied to compare u_i and v_j. Row-normalize MD1.
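A sketch of this step, assuming each entry applies G_σ to the difference of the two centroid distances (σ is a free parameter of the descriptor comparison):

```python
import numpy as np

def row_normalized_similarity(u, v, sigma):
    """MD1: Gaussian-weighted comparison of descriptor value u_i (shape1) with
    v_j (shape2); each row is normalized to sum to 1."""
    diff = u[:, None] - v[None, :]                   # all pairwise differences
    md1 = np.exp(-diff ** 2 / (2.0 * sigma ** 2))    # G_sigma, up to a constant factor
    return md1 / md1.sum(axis=1, keepdims=True)      # row-normalize
```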

20 MD1 describes the correspondence probability from a point in shape1 to a point in shape2. To find the correspondence probability from shape2 to shape1, compute MD2 analogously, but column-normalized.

21 Finally, the correspondence matrix MD is the element-wise product of MD1 and MD2: MD = MD1 .* MD2

22 The correspondence matrix MC using curvature is computed accordingly. The final local correspondence matrix is the joint probability of both features: L = MD .* MC
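Putting slides 19 to 22 together, a sketch of the local matrix L (reusing row_normalized_similarity from above; swapping the arguments and transposing yields the column-normalized direction):

```python
def local_correspondence_matrix(cd1, cd2, cv1, cv2, sigma_d, sigma_c):
    """cd1/cd2: centroid-distance descriptors, cv1/cv2: curvature descriptors
    of shape1 and shape2."""
    md1 = row_normalized_similarity(cd1, cd2, sigma_d)     # shape1 -> shape2
    md2 = row_normalized_similarity(cd2, cd1, sigma_d).T   # shape2 -> shape1, column-normalized
    MD = md1 * md2                                         # MD = MD1 .* MD2
    mc1 = row_normalized_similarity(cv1, cv2, sigma_c)
    mc2 = row_normalized_similarity(cv2, cv1, sigma_c).T
    MC = mc1 * mc2                                         # MC built accordingly
    return MD * MC                                         # L = MD .* MC
```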

23 Correspondences [figure: the matrices MC, MC .* MD, and MD]

24 [figure: the matrices MC, MC .* MD, and MD]

25 Examples of SELF-similarity matrices [figure: MC .* MD]

26 [figure: MC .* MD]

27 Conclusion about L: L defines a probability P_c over the set C of correspondences. L(S1,S2) = L^T(S2,S1), i.e. L is order independent with respect to S1 and S2. This of course does NOT necessarily mean that L is symmetric; L is symmetric ⇔ S1 = S2. Just as a note: M(S1,S1) (the self-similarity matrix) is not necessarily diagonally dominant (see the 'device' example).

28 L defines the weights of single correspondences. Finding the optimal correspondence configuration is the task of finding a certain path in L under certain constraints. In our example, the constraints are: 1) one-to-one correspondences only, 2) order preservation. The following section will formalize the optimization problem.

29 II.B Correspondence as optimization problem

30 Definitions: A grouping g in the set G of all groupings is a configuration of correspondences. Global constraints restrict the search space for our optimization process to G⁻ (a subset of G), the admissible groupings. Using L (= P_c), we define a weight function W over G.

31 We formulate the correspondence problem as one of choosing the grouping ĝ ∈ G⁻, from the set of admissible groupings G⁻, with maximal weight, or, more specifically: Lemma 1: the optimal grouping is complete.
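Written out (the exact form of the weight function W is defined in the paper; it aggregates the local weights L_ij of the correspondences in g):

$$\hat{g} \;=\; \underset{g \in G^{-}}{\arg\max}\; W(g)$$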

32 The optimization problem could typically be solved using dynamic programming We want to use particle filters to solve the correspondence problem Reason: particle filters provide a less restricted framework, which enables us to extend the system to solve more general and complex recognition problems (parts, inner structures, 3D shapes)

33 II.C Using Particle Filters to solve the optimization problem

34 Particle Filters [Framework diagram as on slide 11.]

35 Some general remarks: Particle Filtering, in contrast to e.g. deterministic Dynamic Programming, is a statistical approach. It does not guarantee an optimal solution, only a near-optimal one.

36 But: the weight matrix is built from imprecise local descriptors. Hence a precise, optimal solution does not necessarily make sense anyway.

37 The goal of particle filters (PF) is to estimate the posterior distribution over the entire search space using discrete distributions (constructed dynamically at each of a number of different iterations) based on a limited number of particles. For our optimization problem, we are interested in the strongest particle. Whatever dialect of PF is used, it always consists of two major steps:

38 Let's say we have n particles (hypotheses); each particle has a 'weight'. Step 1: Prediction. Based on additional information, update each particle and compute its new weight. Step 2: Evaluation. Pick n updated particles according to their weights; 'better' particles have a higher chance to survive.

39 In our example problem, a single particle is a set of order preserving correspondences. Its weight is computed using the weights of the single correspondences.

40 Prediction: add a new correspondence (order preserving) and compute the new weight. The new correspondence is picked using the distribution defined by the weight matrix L. Evaluation: residual resampling. Additionally we use a RECEDE step: every m steps, n correspondences are deleted (m > n). This can be seen as an add-on to the update step.
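A sketch of these two steps, with names and details assumed; C is the (particle-specific) global constraint matrix introduced on the following slides, and can be taken as all ones if no global constraints apply:

```python
import numpy as np

def predict(particle, L, C, rng):
    """Prediction: draw one new correspondence (i, j) from the distribution
    defined by the weight matrix L, restricted by the admissibility mask C."""
    P = L * C
    if P.sum() == 0:
        return particle                          # nothing admissible left to add
    k = rng.choice(P.size, p=(P / P.sum()).ravel())
    i, j = np.unravel_index(k, P.shape)
    return particle + [(int(i), int(j))]

def residual_resample(weights, n, rng):
    """Evaluation: residual resampling. Keep floor(n * w) copies of each particle,
    then fill the remaining slots by sampling from the residual weights."""
    w = np.asarray(weights, dtype=float)
    w = w / w.sum()
    counts = np.floor(n * w).astype(int)
    short = n - counts.sum()
    if short > 0:
        resid = n * w - counts
        extra = rng.choice(len(w), size=short, p=resid / resid.sum())
        counts += np.bincount(extra, minlength=len(w))
    return np.repeat(np.arange(len(w)), counts)  # indices of the surviving particles
```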

41 Adding correspondences randomly means we do not need to know a starting point, nor the direction of the path (this is an advantage, though not the main advantage, over dynamic programming). The order constraint rapidly decreases the size of the search space.

42 Updating using the correspondence matrix L as the underlying distribution has two effects: 1. Correspondences of high probability are likely to be picked first. 2. Correspondences in indistinct areas of L have a uniform distribution; this problem is automatically solved, since our system prefers particles with a higher number of correspondences.

43 The main contribution: L is dynamically modified using GLOBAL constraints. These constraints are modeled by a global constraint matrix, which is re-built for each particle in each step. The matrix contains real-valued elements, weakening the admissibility definition to a 'degree of admissibility'.

44 II.D Global constraints in our example

45 Adding Global Constraints [Framework diagram as on slide 11.]

46 Finding the optimal global correspondence between two shapes: we want an optimal correspondence configuration based on L, AND we want to maximize the number of correspondences, AND we want to conserve the point order.

47 We want to find a maximal, order-preserving path in M. We do NOT know the number of correspondences, the start point, or the path direction. (The matrix M is a torus, i.e. circular in both dimensions.)

48 An order-preserving maximal path is a set of correspondences between shape vertices.

49 Order preservation can be formulated in terms of a global constraint matrix with entries {0,1}, for example:
1100000000
0000000000
0001100000
0001100000
0000000000
0000001100
0000001100
0000001100
0000000000
0000000001
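A simplified sketch of such a constraint matrix for the non-cyclic case (the slides treat the matrix as a torus, so the actual rule has to respect cyclic order; names are illustrative):

```python
import numpy as np

def order_constraint_matrix(existing, n1, n2):
    """Binary matrix C for one-to-one, order-preserving matching: entry (i, j) stays
    admissible only if adding the correspondence (i, j) keeps the point order relative
    to every already established correspondence and reuses no point of either shape."""
    C = np.ones((n1, n2), dtype=int)
    for a, b in existing:
        C[a, :] = 0                               # point a of shape1 is already used
        C[:, b] = 0                               # point b of shape2 is already used
        for i in range(n1):
            for j in range(n2):
                if (i - a) * (j - b) < 0:         # the paths would cross: order violated
                    C[i, j] = 0
    return C
```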

50 The particle selection process operates on the product of the local and global matrix. The strict distinction between local and global constraints will be of use in more advanced applications, which we will see later. Our optimization process operates on a dynamically changing matrix.

51 The algorithm:
Construct L
For each particle: initialize C
Loop:
  Prediction: pick a new correspondence from L .* C
  Update C ('GC update rule')
  Evaluation
  Recede
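A compact end-to-end sketch of this loop, building on predict and residual_resample above; the particle weight used here (the sum of the local weights of its correspondences) is only an illustrative stand-in for the weight function W:

```python
import numpy as np

def particle_filter_matching(L, update_C, n_particles=50, n_iters=40,
                             recede_every=5, recede_drop=2, seed=0):
    """update_C(particle, shape) returns the global constraint matrix C for that particle."""
    rng = np.random.default_rng(seed)
    particles = [[] for _ in range(n_particles)]
    weight = lambda p: sum(L[i, j] for i, j in p)           # illustrative W(g)
    for t in range(n_iters):
        # Prediction: grow every particle by one correspondence drawn from L .* C
        particles = [predict(p, L, update_C(p, L.shape), rng) for p in particles]
        # Evaluation: resample the particle set according to the weights
        keep = residual_resample([weight(p) for p in particles], n_particles, rng)
        particles = [list(particles[k]) for k in keep]
        # Recede: every few iterations, delete some correspondences again
        if (t + 1) % recede_every == 0:
            particles = [p[:-recede_drop] if len(p) > recede_drop else p for p in particles]
    return max(particles, key=weight)                       # the strongest particle
```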

52

53 Results, Part II (the alignment was computed using Procrustes analysis)

54 Carriage (identical shapes)

55

56

57 Bat (similar shapes)

58

59

60 Part III: Different tasks, different global constraints

61 Updating Global Constraints [Framework diagram as on slide 11.]

62 Example 1: NO global constraints. C = ones. GC update rule: C = C .* I (the solution to this problem is the configuration consisting of ALL correspondences).

63 Example 2: one-to-one correspondences. GC update rule: [formula shown on slide; see the sketch below]
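The slide's formula is not in the transcript; a plausible reading of the rule for the one-to-one case (an assumption, not the authors' exact definition) is to block the row and column of each established correspondence:

```python
def update_one_to_one(C, new_corr):
    """Once (i, j) is established, neither point may be used again: zero out
    row i and column j of the global constraint matrix."""
    i, j = new_corr
    C = C.copy()
    C[i, :] = 0
    C[:, j] = 0
    return C
```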

64 This optimization problem can be solved precisely using the Hungarian Algorithm. Since no further global constraints are defined, the local descriptor must contain global information. A typical system for this task is Belongie/Malik/Puzicha's 'Shape Context' approach. Drawback: since the local descriptor captures global properties, it does not perform well for partial shape matching.

65 Example 3: one-to-one correspondences and order preservation (our previous example). GC update rule: [formula shown on slide]

66 This optimization problem can be solved precisely using Dynamic Programming (DP). A typical system for this task is Scott/Nowak's DP approach. Drawback: not naturally extendable.

67 Example 4: one-to-one correspondences, order preservation, and PARTIAL matching. GC update rule: ?

68 In contrast to the previous examples, the update rule cannot be defined a priori in an obvious way. The following will motivate how to design the update rule. It will be seen that the rule depends not only on the new correspondence but on the entire grouping. The optimal global matrix will be learned during the PF process.

69 Example: fountain

70

71 Maximizing the weights allows non-central points to find correspondences spread all over the target boundary, so an additional constraint is needed. Adding PRIORS adds domain knowledge: we know the target path should be close to a diagonal.

72 Find dominant diagonals during the PF process

73 The PF system is run twice: the first time to find the best prior, the second time to use the prior to find the best correspondence. The second run requires significantly fewer particles (from ~50 down to ~8).
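A hypothetical sketch of such a diagonal prior (the band shape, parameters, and names are assumptions; the first PF run would estimate the diagonal offset, and the second run multiplies this prior into L):

```python
import numpy as np

def diagonal_band_prior(n1, n2, offset, width):
    """Favor correspondences (i, j) whose cyclic deviation from the diagonal
    j = i * n2 / n1 + offset is small (Gaussian fall-off with the given width)."""
    i = np.arange(n1)[:, None]
    j = np.arange(n2)[None, :]
    dev = (j - (i * n2 / n1 + offset)) % n2
    dev = np.minimum(dev, n2 - dev)              # cyclic distance to the diagonal
    return np.exp(-dev ** 2 / (2.0 * width ** 2))
```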

74 RESULTS for partial matching (parts have 15 points, the target has 50 points). See also [CVPR 08].

75

76 MPEG-7 (10 classes only) retrieval test on parts. PFPA: Particle Filter Procrustes Analysis; OSB: Latecki et al.; DTWCW: dynamic time warping; LCSS: longest common subsequence.

77 Example 5: partial matching in unordered point sets (the experiments for examples 5 and 6 were performed by Shusha Li). GC update rule: ?

78 The update rule involves learning reasonable neighborhoods. (See [ICPR 08] for more details.)

79 Results: left column: shape context; right column: our PF approach based on shape context.

80 Example 6: partial matching in unordered point sets, 3D version. GC update rule: ?

81 The update rule involves learning reasonable neighborhoods. …work in progress… For this task, not only must the GC update rule be adjusted, but the RECEDE step also needs more attention; RECEDE is done based on an analysis of clusters. (More details in the coming months.)

82

83

84 Conclusions: Particle Filters were used to solve the optimization problem of finding the best correspondence configuration. Based on precomputed local constraints and learned global constraints, the framework unifies different optimization tasks.

85 Thanks!

