Using the Particle Filter Approach to Building Partial Correspondences between Shapes
Rolf Lakaemper, Marc Sobel, Temple University, Philadelphia, PA, USA


Part I: Motivation

The Goal: Finding correspondences between feature-points in two (similar) shapes

The Motivation: Shape recognition. Classically divided into three steps:
1. Finding correspondences (this talk)
2. Alignment
3. Shape similarity

We want to handle partial correspondences between arbitrary point sets based on local descriptors and certain global constraints. This includes but is not limited to…

The simplest case: Closed boundaries vs. Closed boundaries

Advanced: Closed polygons vs. polygons representing parts

More advanced: Partial matching of unordered 2D point sets

…and unordered 3D point sets. (A bad example due to insufficient constraints; later we’ll see how to improve it.)

All of these tasks can be described as optimization problems; they differ only in the way the global constraints are defined. We will present a Particle Filter (PF) based solution unifying these problems. The PF system is able to learn properties of the global constraints.

General Approach
[Overview diagram mapping the Task to the PF Framework: Global Constraints (GC), Single Particle (a configuration of established correspondences), Local Constraints, Particle Filtering, GC update rule]

Part II: Illustration of the Approach using the simple example of correspondences between closed boundary curves

The example data: boundary polygons.
– Each boundary curve is uniformly subsampled
– Each shape is represented by an ordered set of boundary points

II.A Local Constraints

For each boundary point we compute local feature descriptors, eventually leading to a local correspondence matrix. This matrix describes the local constraints.
[Overview diagram as before]

Computation of local feature descriptors. As examples we use:
– Centroid distance
– Curvature
Remark: this research is not about optimal, new, and fancy local descriptors. On the contrary: for several reasons we use relatively weak descriptors.

Centroid distance (normalized, average distance = 1): the relative distance to the center of the polygon (the mean of its vertices); extendable to parts.
Curvature: e.g. the turn angle at each vertex. (A sketch of both descriptors follows below.)
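A minimal sketch of these two descriptors in Python/NumPy (the function names and the turn-angle convention are our illustrative choices, not from the slides):

import numpy as np

def centroid_distance(points):
    # points: (n, 2) array of ordered boundary vertices
    center = points.mean(axis=0)                 # center = mean of vertices
    d = np.linalg.norm(points - center, axis=1)
    return d / d.mean()                          # normalize: average distance = 1

def turn_angle(points):
    # signed turn angle at each vertex of the closed polygon (one convention)
    to_prev = np.roll(points, 1, axis=0) - points
    to_next = np.roll(points, -1, axis=0) - points
    a = (np.arctan2(to_prev[:, 1], to_prev[:, 0])
         - np.arctan2(to_next[:, 1], to_next[:, 0]))
    return np.mod(a + np.pi, 2 * np.pi) - np.pi  # wrap to (-pi, pi]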

Using each descriptor independently, we compute the correspondence probability between all pairs of points. The correspondence is computed in a symmetric way:
– How likely is it that p_i in shape 1 corresponds to q_k in shape 2, relative to all points in shape 2, AND
– How likely is it that q_k in shape 2 corresponds to p_i in shape 1, relative to all points in shape 1?

Example: Centroid Distance. Compute the correspondence matrix MD1 = [md1_ij], with
md1_ij = G_σ(u_i − v_j), where
u_i = centroid distance of point i in shape 1,
v_j = centroid distance of point j in shape 2,
G_σ = Gaussian distribution with standard deviation σ.
Row-normalize MD1.

MD1 describes the correspondence probability of a point in shape 1 to a point in shape 2. To find the correspondence probability from shape 2 to shape 1, compute MD2 the same way, but column-normalized.

Finally, the correspondence matrix MD is the element-wise product of MD1 and MD2: MD = MD1 .* MD2

The correspondence matrix MC using curvature is computed accordingly. The final local correspondence matrix is the joint probability of both features: L = MD .* MC (a combined sketch follows below).
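Continuing the sketch above (reusing centroid_distance and turn_angle); the Gaussian similarity and the σ defaults are placeholder choices consistent with the slides, not values from them:

def correspondence_matrix(u, v, sigma):
    # Gaussian similarity G_sigma(u_i - v_j) between all descriptor pairs
    diff = u[:, None] - v[None, :]
    M = np.exp(-0.5 * (diff / sigma) ** 2)
    M1 = M / M.sum(axis=1, keepdims=True)   # MD1: row-normalized (shape1 -> shape2)
    M2 = M / M.sum(axis=0, keepdims=True)   # MD2: column-normalized (shape2 -> shape1)
    return M1 * M2                          # MD = MD1 .* MD2

def local_constraints(shape1, shape2, sigma_d=0.1, sigma_c=0.3):
    # sigma values are illustrative placeholders
    MD = correspondence_matrix(centroid_distance(shape1),
                               centroid_distance(shape2), sigma_d)
    MC = correspondence_matrix(turn_angle(shape1),
                               turn_angle(shape2), sigma_c)
    return MD * MC                          # L = MD .* MC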

[Figures: example correspondences, with the matrices MC, MC.*MD, and MD]

[Figures: examples of SELF-similarity matrices, MC.*MD]

Conclusions about L:
– L defines a probability P_c over the set C of correspondences.
– L(S1,S2) = L(S2,S1)^T, i.e. L is order independent with respect to S1 and S2. This of course does NOT necessarily mean that L is symmetric: L is symmetric ⇔ S1 = S2.
– Just as a note: L(S1,S1) (the self-similarity matrix) is not necessarily diagonally dominant (see the ‘device’ example).

L defines the weights of single correspondences. Finding the optimal correspondence configuration is the task of finding a certain path in L under certain constraints. In our example, the constraints are:
1) One-to-one correspondences only
2) Order preservation
The following section will formalize the optimization problem.

II.B Correspondence as optimization problem

Definitions:
– A grouping g in the set G of all groupings is a configuration of correspondences.
– Global constraints restrict the search space for our optimization process to G⁻ (a subset of G), the admissible groupings.
– Using L (= P_c), we define a weight function W over G.

We formulate the correspondence problem as one of choosing the grouping ĝ ∈ G⁻ from the set of admissible groupings G⁻ with maximal weight; the maximization is written out below.
Lemma 1: the optimal grouping is complete.
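In LaTeX, under one hedged assumption: the slides leave the exact form of W to a figure, and for Lemma 1 to hold, adding an admissible correspondence must not decrease the weight, so an additive form over the local weights L_{ij} is a natural choice:

\[
W(g) \;=\; \sum_{(i,j) \in g} L_{ij},
\qquad
\hat{g} \;=\; \operatorname*{arg\,max}_{g \in G^{-}} W(g).
\]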

The optimization problem could typically be solved using dynamic programming. We want to use particle filters to solve the correspondence problem instead. Reason: particle filters provide a less restricted framework, which enables us to extend the system to solve more general and complex recognition problems (parts, inner structures, 3D shapes).

II.C Using Particle Filters to solve the optimization problem

Particle Filters
[Overview diagram as before]

Some general remarks: Particle Filtering, in contrast to e.g. the deterministic Dynamic Programming, is a statistical approach. It does not guarantee an optimal (but only a near-optimal) solution.

But: the weight matrix is built from imprecise local descriptors. Hence a precise, optimal solution does not necessarily make sense anyway.

The goal of particle filters (PF) is to estimate the posterior distribution over the entire search space using discrete distributions (constructed dynamically at each iteration) based on a limited number of particles. For our optimization problem, we are interested in the strongest particle. Whatever dialect of PF is used, it always consists of two major steps:

Let’s say we have n particles (hypotheses). Each particle has a ‘weight’.
Step 1: Prediction. Corresponding to additional information, update each particle and compute its new weight.
Step 2: Evaluation. Pick n updated particles according to their weights; ‘better’ particles have a higher chance to survive. (A generic sketch of this loop follows below.)
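A minimal generic sketch of the predict/evaluate loop in Python; predict and weight are placeholders for the problem-specific pieces, not functions from the slides:

import random

def particle_filter(particles, predict, weight, iterations):
    for _ in range(iterations):
        # Step 1: Prediction -- update each particle using new information
        particles = [predict(p) for p in particles]
        # Step 2: Evaluation -- resample; better particles survive more often
        w = [weight(p) for p in particles]
        particles = random.choices(particles, weights=w, k=len(particles))
    # for our optimization problem we keep the strongest particle
    return max(particles, key=weight)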

In our example problem, a single particle is a set of order preserving correspondences. Its weight is computed using the weights of the single correspondences.

Prediction: add a new (order-preserving) correspondence and compute the new weight. The new correspondence is picked using the distribution defined by the weight matrix L.
Evaluation: residual subsampling. Additionally we use a RECEDE step: every m steps, n correspondences are deleted (m > n). This can be seen as an add-on to the update step.

Adding correspondences randomly means we do not need to know a starting point, nor the direction of the path (this is an advantage, though not the main advantage, over dynamic programming). The order constraint rapidly decreases the size of the search space.

Updating using the correspondence matrix L as the underlying distribution has two effects:
1. Correspondences of high probability are likely to be picked first.
2. Correspondences in indistinct areas of L have a uniform distribution. This problem is automatically solved, since our system prefers particles with a higher number of correspondences.

The main contribution: L is dynamically modified using GLOBAL constraints. These constraints are modeled by a global constraint matrix, which is rebuilt for each particle in each step. The matrix contains real-valued elements, weakening the admissibility definition to a ‘degree of admissibility’.

II.D Global constraints in our example

Adding Global Constraints
[Overview diagram as before]

Finding the optimal global correspondence between two shapes:
– we want an optimal correspondence configuration based on L, AND
– we want to maximize the number of correspondences,
– we want to conserve the point order.

We want to find a maximal and order-preserving path in M. We do NOT know the number of correspondences, the start point, or the path direction. (Matrix M is a torus, i.e. circular in both dimensions.)

An order-preserving maximal path is a set of correspondences of shape vertices.

Order preservation can be formulated in terms of a global constraint matrix with entries {0,1} (a sketch follows below).
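A minimal sketch of such a {0,1} matrix, continuing the NumPy sketches above; this is our own illustrative implementation of the combined one-to-one and cyclic order-preservation constraints of the running example, since the slides only state the idea:

def admissible(g, cand, n, m):
    # True iff g + [cand] is one-to-one and cyclically order preserving:
    # relative to an anchor correspondence, the indices on both shapes
    # (taken modulo the boundary lengths) must increase together.
    pts = g + [cand]
    i0, j0 = pts[0]
    rel = sorted(((i - i0) % n, (j - j0) % m) for i, j in pts)
    return all(a1 < a2 and b1 < b2
               for (a1, b1), (a2, b2) in zip(rel, rel[1:]))

def constraint_matrix(g, n, m):
    # C[a, b] = 1 iff correspondence (a, b) may still be added to grouping g
    C = np.zeros((n, m))
    for a in range(n):
        for b in range(m):
            C[a, b] = 1.0 if admissible(g, (a, b), n, m) else 0.0
    return C

The matrix is rebuilt for each particle after every prediction step; the particle then samples its next correspondence from L .* C.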

The particle selection process operates on the product of the local and global matrix. The strict distinction between local and global constraints will be of use in more advanced applications, which we will see later. Our optimization process operates on a dynamically changing matrix.

The algorithm (a Python sketch follows below):
Construct L
For each particle: initialize C
Loop:
  Prediction: pick a new correspondence from L .* C
  Update C (‘GC update rule’)
  Evaluation
  Recede
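A compact sketch of the whole loop, reusing local_constraints and constraint_matrix from above. The particle count, step counts, recede schedule, and the additive particle weight are our placeholder choices (the slides leave them open); plain multinomial resampling stands in for the residual subsampling of the slides:

def particle_weight(g, L):
    # sum of the local weights of the established correspondences;
    # a simple choice that rewards larger groupings (cf. Lemma 1)
    return sum(L[i, j] for (i, j) in g)

def correspondence_pf(shape1, shape2, n_particles=50, steps=40,
                      recede_every=5, recede_count=2):
    L = local_constraints(shape1, shape2)
    n, m = L.shape
    particles = [[] for _ in range(n_particles)]
    for step in range(steps):
        # Prediction: each particle grows by one correspondence from L .* C
        for g in particles:
            C = constraint_matrix(g, n, m)        # GC update rule
            P = (L * C).ravel()
            if P.sum() > 0:
                idx = np.random.choice(P.size, p=P / P.sum())
                g.append((idx // m, idx % m))
        # Evaluation: weighted resampling
        w = np.array([particle_weight(g, L) for g in particles])
        if w.sum() > 0:
            picks = np.random.choice(n_particles, n_particles, p=w / w.sum())
            particles = [list(particles[i]) for i in picks]
        # RECEDE: every few steps, delete some correspondences again
        if (step + 1) % recede_every == 0:
            for g in particles:
                for _ in range(min(recede_count, len(g))):
                    g.pop(np.random.randint(len(g)))
    return max(particles, key=lambda g: particle_weight(g, L))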

Results Part II (the alignment was computed using Procrustes analysis)

Carriage (identical shapes)

Bat (similar shapes)

Part III: Different tasks, different global constraints

Updating Global Constraints
[Overview diagram as before]

Example 1: NO global constraints. C = ones.
GC update rule: C = C .* I, i.e. C stays unchanged. (The solution to this problem is the configuration consisting of ALL correspondences.)

Example 2: one-to-one correspondences only.
GC update rule: once a correspondence (i, j) is established, row i and column j of C are zeroed out (a sketch follows below).
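A minimal rendering of this rule (our own sketch; row/column zeroing is the standard way to encode a one-to-one constraint in such a matrix):

def one_to_one_update(C, i, j):
    # after establishing correspondence (i, j), neither point may be used again
    C = C.copy()
    C[i, :] = 0.0
    C[:, j] = 0.0
    return C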

This optimization problem can be solved precisely using the Hungarian Algorithm. Since no further global constraints are defined, the local descriptor must contain global information. A typical system for this task is Belongie/Malik/Puzicha’s ‘Shape Context’ approach. Drawback: since the local descriptor captures global properties, it does not perform too well for partial shape matching.

Example 3: one-to-one correspondences and order preservation (our previous example).
GC update rule: the one-to-one rule above, combined with the {0,1} order-preservation matrix from Part II.

This optimization problem can be solved precisely using Dynamic Programming (DP). A typical system for this task is Scott/Nowak’s DP approach. Drawback: not naturally extendable.

Example 4: one-to-one correspondences, order preservation, and PARTIAL matching.
GC update rule: ?

In contrast to the previous examples, the update rule cannot be defined a priori in an obvious way. The following will motivate how to design the update rule. It will be seen that it depends not only on the new correspondence, but on the entire grouping. The optimal global matrix will be learned during the PF process.

Example: fountain

Maximizing the weights allows non-central points to find correspondences spread all over the target boundary. An additional constraint is needed. Adding PRIORS adds domain knowledge: we know the target path should be close to a diagonal.

Find dominant diagonals during the PF process.

The PF system is run twice: the first time to find the best prior, the second time to use the prior to find the best correspondence. The second run requires significantly fewer particles (from ~50 down to ~8).

RESULTS for partial matching (parts are of size 15 points, target = 50 points). See also [CVPR 08].

MPEG-7 (10 classes only) retrieval test on parts:
– PFPA: Particle Filter Procrustes Analysis
– OSB: Latecki et al.
– DTWCW: dynamic time warping
– LCSS: longest common subsequence

Example 5: partial matching in unordered point sets (the experiments for examples 5 & 6 were performed by Shusha Li).
GC update rule: ?

The update rule involves learning of reasonable neighborhoods. (See [ICPR 08] for more details)

Results:
– Left column: shape context
– Right column: our PF approach based on shape context

Example 6: partial matching in unordered point sets, 3D version.
GC update rule: ?

The update rule involves learning of reasonable neighborhoods. …work in progress… For this task, not only must the GC update rule be adjusted, but the RECEDE step also requires more attention: RECEDE is done based on an analysis of clusters. (More details in the coming months.)

Conclusions:
– Particle Filters were used to solve the optimization problem of finding the best correspondence configuration.
– Based on pre-computed local constraints and learned global constraints, the framework unifies different optimization tasks.

Thanks!