Particle Filters for Shape Correspondence Presenter: Jingting Zeng.


Outline
1. Review of Particle Filter
2. How to Use Particle Filters in Shape Correspondence
3. Further Implementation in Shape Clustering

Part One Review of Particle Filter

What a Particle Filter Is
A particle filter is a technique for implementing a recursive Bayesian filter by Monte Carlo sampling.

How the Particle Filter Algorithm Works
1. Initialize the distribution. The initial distribution can be anything.
2. Observe the system and compute, for each particle, a (proportional) probability that the particle is an accurate representation of the system based on that observation. This value is referred to as the particle's importance weight.
3. Normalize the particle weights.
4. Resample the distribution to get a new distribution: a particle is selected with frequency proportional to its importance weight.
5. Update each particle in the filter according to the prediction of system changes.
6. Repeat from step 2.
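The six steps above can be sketched as a generic bootstrap particle filter. The 1-D tracking example at the bottom (constant drift, Gaussian noise, Gaussian likelihood) is purely hypothetical and only illustrates the loop structure:

```python
import random, math

def particle_filter(num_particles, init, transition, likelihood, observations):
    """Generic bootstrap particle filter following steps 1-6 above.

    init()           -> one initial particle (step 1)
    likelihood(p, z) -> importance weight of particle p given observation z (step 2)
    transition(p)    -> predicted next state of particle p (step 5)
    """
    particles = [init() for _ in range(num_particles)]          # step 1
    for z in observations:
        weights = [likelihood(p, z) for p in particles]         # step 2
        total = sum(weights)
        weights = [w / total for w in weights]                  # step 3
        # step 4: resample with frequency proportional to importance weight
        particles = random.choices(particles, weights=weights, k=num_particles)
        particles = [transition(p) for p in particles]          # step 5
    return particles                                            # step 6: loop

# Hypothetical 1-D tracking example: state = position, drift of +1 per step.
random.seed(0)
est = particle_filter(
    num_particles=500,
    init=lambda: random.uniform(-5, 5),
    transition=lambda p: p + 1.0 + random.gauss(0, 0.2),
    likelihood=lambda p, z: math.exp(-0.5 * (p - z) ** 2),
    observations=[0.1, 0.9, 2.1, 2.9],
)
print(sum(est) / len(est))  # mean prediction near last observation + drift
```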

[Flowchart: M particles are initialized; for each new observation, particle generation, weight computation, weight normalization, and resampling are performed and estimates are output; the loop repeats while more observations remain, then exits.]

Demonstration

Part Two How to Use Particle Filters in Shape Correspondence

Goal
The goal of shape correspondence is to find correspondences between feature points in two (similar) shapes.

What is the data? Shape boundaries, obtained by segmentation and boundary tracking.

Local Feature Extraction
Centroid distance: relative distance to the center of the polygon
Curvature: turning angle
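A minimal sketch of the two local features for a closed polygon. The normalization by the maximum distance is an assumption made here for scale invariance; the slides do not specify it:

```python
import math

def centroid_distance(polygon):
    """Relative distance of each vertex to the polygon centroid."""
    cx = sum(x for x, y in polygon) / len(polygon)
    cy = sum(y for x, y in polygon) / len(polygon)
    d = [math.hypot(x - cx, y - cy) for x, y in polygon]
    m = max(d)
    return [v / m for v in d]  # normalized (an assumption, for scale invariance)

def turning_angle(polygon):
    """Curvature feature: turning angle at each vertex, in radians."""
    n = len(polygon)
    angles = []
    for i in range(n):
        (x0, y0), (x1, y1), (x2, y2) = polygon[i - 1], polygon[i], polygon[(i + 1) % n]
        a1 = math.atan2(y1 - y0, x1 - x0)      # incoming edge direction
        a2 = math.atan2(y2 - y1, x2 - x1)      # outgoing edge direction
        t = a2 - a1
        while t <= -math.pi:                   # wrap to (-pi, pi]
            t += 2 * math.pi
        while t > math.pi:
            t -= 2 * math.pi
        angles.append(t)
    return angles

square = [(0, 0), (1, 0), (1, 1), (0, 1)]
print(centroid_distance(square))   # all vertices equidistant: 1.0 each
print(turning_angle(square))       # pi/2 turn at every corner
```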

Correspondence Matrix The correspondence matrix W measures the correspondence probability between shapes A and B

[Pipeline: centroid-distance and curvature features -> Euclidean distance -> Gaussian distribution -> normalization -> centroid-distance matrix and curvature matrix -> joint probability -> correspondence matrix W]
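This pipeline can be sketched as follows. The Gaussian bandwidth sigma and the element-wise product as the joint probability are assumptions for illustration; the slides only name the stages:

```python
import math

def feature_matrix(feat_a, feat_b, sigma=0.5):
    """Gaussian similarity of one feature (rows: shape A, cols: shape B)."""
    return [[math.exp(-((fa - fb) ** 2) / (2 * sigma ** 2)) for fb in feat_b]
            for fa in feat_a]

def correspondence_matrix(cen_a, cen_b, cur_a, cur_b):
    """Joint probability: element-wise product of the centroid-distance and
    curvature matrices, normalized so all entries of W sum to 1."""
    Mc = feature_matrix(cen_a, cen_b)          # centroid-distance matrix
    Mk = feature_matrix(cur_a, cur_b)          # curvature matrix
    W = [[mc * mk for mc, mk in zip(rc, rk)] for rc, rk in zip(Mc, Mk)]
    s = sum(sum(row) for row in W)
    return [[w / s for w in row] for row in W]

# Tiny hypothetical example: two 3-vertex shapes with similar features.
W = correspondence_matrix([1.0, 0.8, 0.6], [1.0, 0.7, 0.6],
                          [0.5, 1.2, -0.4], [0.5, 1.1, -0.5])
print(W)  # entries sum to 1; diagonal dominates for similar vertices
```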

[Figure: centroid-distance matrix, curvature matrix, and the resulting correspondence matrix W]

Correspondence
Given two shapes S1, S2 with n1 and n2 vertices, we define the set of correspondences C as the set of all pairs of vertices of S1 and S2: C = { (i, j) : 1 <= i <= n1, 1 <= j <= n2 }. The matrix W defines a probability over the set of correspondences: P((i, j)) = W(i, j) / sum over all (k, l) of W(k, l).

Grouping
A grouping g is a member of the power set of C; each element takes the form g = { (i1, j1), ..., (ik, jk) }. Further constraints on groupings (such as requiring the correspondences to be in order) can limit the search space to a subset of the power set.

Optimal Sets of Correspondences
The weight of a grouping is defined in terms of the W entries of its correspondences. The correspondence problem is formulated as choosing, from the set of constrained groupings, the complete grouping with maximal weight.
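A brute-force sketch of this formulation for tiny shapes. Defining the grouping weight as the product of its W entries, and "complete, order-preserving" as one correspondence per vertex of S1 with strictly increasing columns, are assumptions; the slides leave the exact formulas to the original paper:

```python
from itertools import combinations

def grouping_weight(W, grouping):
    """Weight of a grouping: product of the W entries of its correspondences
    (an assumed definition, for illustration)."""
    w = 1.0
    for i, j in grouping:
        w *= W[i][j]
    return w

def best_complete_grouping(W):
    """Exhaustive search over complete, order-preserving groupings:
    every row of W is used once, columns strictly increasing (needs n1 <= n2).
    The particle filter replaces this exponential search on real shapes."""
    n1, n2 = len(W), len(W[0])
    best, best_w = None, -1.0
    for cols in combinations(range(n2), n1):   # ordered column choices
        g = list(zip(range(n1), cols))
        w = grouping_weight(W, g)
        if w > best_w:
            best, best_w = g, w
    return best, best_w

W = [[0.4, 0.1, 0.05],
     [0.05, 0.3, 0.1]]
print(best_complete_grouping(W))  # [(0, 0), (1, 1)] with weight 0.12
```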

About Particle Filters
A single particle contains a grouping; g_t denotes a particle at time t. Particles are built by adding a single correspondence at each iteration. Correspondences are selected based on the updated weight matrix W_t at time t.

Important Steps in PF
Prediction: update each particle and compute its new weight according to W_t. The posterior distribution of a particle at time (iteration) t is given by eq. 1.

Evaluation: pick n updated particles according to their weights; 'better' particles have a higher chance to survive.
Recede: every m steps, n correspondences are deleted (m > n). This can be seen as an add-on to the update step.

Particle Filter Algorithm
1. INIT: t = 1; choose the number of particles; W_t = W; initialize r for the recede step.
2. Prepare the constraint matrices for i = 1..m and compute the resulting distribution.
3. Select a correspondence based on the distribution.
4. PREDICTION: compute the posterior distribution (weight of each particle) using eq. 1.
5. Normalize weights.

6. EVALUATION: compute a new set of particles using residual resampling (RRS), preferentially preserving those particles with dominant weight.
7. RECEDE: if mod(t, r) = 0, delete n < r correspondences in each particle.
8. LOOP: if not all particles are complete, set t = t + 1 and return to step 2; otherwise return the particle with maximum weight as a near-optimal solution.
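The residual resampling (RRS) used in the EVALUATION step can be sketched as follows: each particle is first copied a deterministic number of times proportional to its weight, and only the fractional remainders are sampled randomly, which preserves dominant particles with low variance:

```python
import math, random

def residual_resample(weights, rng=random):
    """Residual resampling: copy particle i floor(N * w_i) times, then fill
    the remaining slots by sampling the residual (fractional) weights.
    Returns a list of N particle indices."""
    n = len(weights)
    counts = [math.floor(n * w) for w in weights]       # deterministic copies
    residual = [n * w - c for w, c in zip(weights, counts)]
    k = n - sum(counts)                                 # slots left to fill
    if k > 0:
        total = sum(residual)
        extra = rng.choices(range(n),
                            weights=[r / total for r in residual], k=k)
        for i in extra:
            counts[i] += 1
    # expand the counts into a flat list of surviving particle indices
    return [i for i, c in enumerate(counts) for _ in range(c)]

random.seed(1)
idx = residual_resample([0.5, 0.3, 0.1, 0.1])
print(idx)  # index 0 survives at least twice, index 1 at least once
```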

Demo: video demonstration.

Shape Correspondence Result

Part Three Further Implementation in Shape Clustering

A New Distance Measure
1. Computation of shape correspondence
2. Pre-alignment using Procrustes Analysis
3. Context-dependent alignment using Force Field Simulation (FFS)
4. Mapping into a feature space (density computation)
5. Comparison of mapped shape and cluster

Step 1 & 2

Step 3 & 4

Soft K-Means-Like Clustering
(1) Initialize the recursion parameter and the cluster matrix with random weights.
(2) Update the weights of the matrix based on the distance of density maps.
(3) Compute all new density maps.
(4) Decrease the recursion parameter.
(5) Go back to step (2) unless convergence is achieved.
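A generic soft k-means sketch in the spirit of steps (1)-(5), on plain feature vectors standing in for density maps. Everything concrete here is an assumption: Euclidean distance, Gaussian soft memberships, and a stiffness parameter beta that is increased each round (equivalent to decreasing a temperature-like recursion parameter):

```python
import math, random

def soft_kmeans(points, k, beta=0.5, beta_growth=1.5, iters=20, seed=0):
    """Soft k-means: soft memberships from distances, weighted-mean centers,
    sharpening the memberships each round by growing beta."""
    rng = random.Random(seed)
    dim = len(points[0])
    centers = [list(p) for p in rng.sample(points, k)]          # step (1)
    for _ in range(iters):
        resp = []                                               # step (2)
        for p in points:
            d = [sum((a - b) ** 2 for a, b in zip(p, c)) for c in centers]
            e = [math.exp(-beta * di) for di in d]
            s = sum(e)
            resp.append([ei / s for ei in e])
        for j in range(k):                                      # step (3)
            w = sum(r[j] for r in resp)
            centers[j] = [sum(r[j] * p[i] for r, p in zip(resp, points)) / w
                          for i in range(dim)]
        beta *= beta_growth                                     # step (4)
    return centers                                              # step (5): fixed iters

# Hypothetical data: two well-separated groups of three points each.
pts = [(0.0, 0.0), (0.1, 0.0), (0.0, 0.1), (5.0, 5.0), (5.1, 5.0), (5.0, 5.1)]
centers = soft_kmeans(pts, k=2)
print(sorted(centers))  # one center near (0, 0), one near (5, 5)
```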

Experiment
55 shapes from the MPEG-7 dataset: 11 groups of 5 shapes each.

References
Theory and Implementation of Particle Filters (slides), Miodrag Bolic
Finding Shape Correspondences with Particle Filters (slides), Rolf Lakaemper
A Context Dependent Distance Measure for Shape Clustering (ISVC 2008)