Tutorial on Particle Filters, assembled and extended by Longin Jan Latecki, Temple University, using slides from Keith Copsey et al.


Tutorial on Particle Filters
assembled and extended by Longin Jan Latecki, Temple University, using slides from:
- Keith Copsey, Pattern and Information Processing Group, DERA Malvern
- D. Fox, J. Hightower, L. Liao, D. Schulz, and G. Borriello, Univ. of Washington, Seattle
- Honggang Zhang, Univ. of Maryland, College Park
- Miodrag Bolic, University of Ottawa, Canada
- Michael Pfeiffer, TU Graz, Austria

Outline
- Introduction to particle filters
  - Recursive Bayesian estimation
- Bayesian importance sampling
  - Sequential Importance Sampling (SIS)
  - Sampling Importance Resampling (SIR)
- Improvements to SIR
  - On-line Markov chain Monte Carlo
- Basic particle filter algorithm
- Example for robot localization
- Conclusions

Particle Filters
- Sequential Monte Carlo methods for on-line learning within a Bayesian framework.
- Also known as:
  - particle filters
  - sequential sampling-importance resampling (SIR)
  - bootstrap filters
  - Condensation trackers
  - interacting particle approximations
  - survival of the fittest

History
- First attempts: simulations of growing polymers
  - M. N. Rosenbluth and A. W. Rosenbluth, “Monte Carlo calculation of the average extension of molecular chains,” Journal of Chemical Physics, vol. 23, no. 2, pp. 356–359, 1955.
- First application in signal processing
  - N. J. Gordon, D. J. Salmond, and A. F. M. Smith, “Novel approach to nonlinear/non-Gaussian Bayesian state estimation,” IEE Proceedings-F, vol. 140, no. 2, pp. 107–113, 1993.
- Books
  - A. Doucet, N. de Freitas, and N. Gordon, Eds., Sequential Monte Carlo Methods in Practice, Springer, 2001.
  - B. Ristic, S. Arulampalam, and N. Gordon, Beyond the Kalman Filter: Particle Filters for Tracking Applications, Artech House Publishers, 2004.
- Tutorials
  - M. S. Arulampalam, S. Maskell, N. Gordon, and T. Clapp, “A tutorial on particle filters for online nonlinear/non-Gaussian Bayesian tracking,” IEEE Transactions on Signal Processing, vol. 50, no. 2, pp. 174–188, 2002.

Problem Statement
- Tracking the state of a system as it evolves over time
- Sequentially arriving (noisy or ambiguous) observations
- We want the best possible estimate of the hidden variables

Solution: Sequential Update
- Storing and processing all incoming measurements is inconvenient and may be impossible
- Recursive filtering:
  - predict the next state pdf from the current estimate
  - update the prediction using sequentially arriving new measurements
- Optimal Bayesian solution: recursively calculate the exact posterior density

Particle filtering ideas
- A particle filter is a technique for implementing a recursive Bayesian filter by Monte Carlo sampling
- The idea: represent the posterior density by a set of random particles with associated weights
- Compute estimates based on these samples and weights
[Figure: posterior density over the sample space represented by weighted particles]

Global Localization of Robot with Sonar

Tools needed
- Recall the law of total probability (marginalization) and Bayes’ rule.
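The two identities referred to here can be written out explicitly as:

```latex
% Law of total probability (marginalisation):
p(x) = \int p(x \mid y)\, p(y)\, dy

% Bayes' rule:
p(x \mid y) = \frac{p(y \mid x)\, p(x)}{p(y)}
```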

Recursive Bayesian estimation (I)
- Recursive filter:
  - System model: x_k = f_k(x_{k-1}, v_{k-1}), with process noise v_{k-1}
  - Measurement model: z_k = h_k(x_k, n_k), with measurement noise n_k
  - Information available: D_k = {z_1, ..., z_k}

Recursive Bayesian estimation (II)
- Seek the posterior p(x_{k+i} | D_k):
  - i = 0: filtering
  - i > 0: prediction
  - i < 0: smoothing
- Prediction:
  p(x_k | D_{k-1}) = ∫ p(x_k | x_{k-1}) p(x_{k-1} | D_{k-1}) dx_{k-1}
  - since the state is Markov: p(x_k | x_{k-1}, D_{k-1}) = p(x_k | x_{k-1})

Recursive Bayesian estimation (III)
- Update:
  p(x_k | D_k) = p(z_k | x_k) p(x_k | D_{k-1}) / p(z_k | D_{k-1})
- where the normalising constant is
  p(z_k | D_{k-1}) = ∫ p(z_k | x_k) p(x_k | D_{k-1}) dx_k
  - since the measurement depends only on the current state: p(z_k | x_k, D_{k-1}) = p(z_k | x_k)

Bayes Filters (second pass)
- System state dynamics: p(x_t | x_{t-1})
- Observation dynamics: p(y_t | x_t)
- We are interested in estimating the system state from noisy observations, i.e. the belief or posterior density Bel(x_t) = p(x_t | y_1, ..., y_t)

From the above, the two steps of the Bayes filter are:
- Predict: p(x_t | y_{1:t-1}) = ∫ p(x_t | x_{t-1}) p(x_{t-1} | y_{1:t-1}) dx_{t-1}
- Update: p(x_t | y_{1:t}) ∝ p(y_t | x_t) p(x_t | y_{1:t-1})

The predict and update steps rely on the Markov assumptions:
- p(x_t | x_{0:t-1}, y_{1:t-1}) = p(x_t | x_{t-1})  (the state depends only on the previous state)
- p(y_t | x_{0:t}, y_{1:t-1}) = p(y_t | x_t)  (the observation depends only on the current state)

Bayes Filter
- How to use it? What else do we need to know?
  - Motion model: p(x_t | x_{t-1})
  - Perceptual model: p(y_t | x_t)
  - Start from the prior belief Bel(x_0)
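As a minimal sketch of these ingredients, the following discrete Bayes filter tracks a belief over a 1-D grid of cells; the motion model, sensor likelihood, and grid size are illustrative assumptions, not values from the slides.

```python
import numpy as np

n = 10
belief = np.full(n, 1.0 / n)          # uniform prior Bel(x_0)

def predict(belief, shift=1, noise=0.1):
    """Motion model p(x_t | x_{t-1}): move right by `shift` cells,
    with probability `noise` of staying put."""
    moved = np.roll(belief, shift)
    return (1 - noise) * moved + noise * belief

def update(belief, likelihood):
    """Perceptual model p(y_t | x_t): multiply the predicted belief
    by the observation likelihood and renormalise."""
    posterior = belief * likelihood
    return posterior / posterior.sum()

# Sensor says the robot is probably near cell 3.
likelihood = np.full(n, 0.1)
likelihood[3] = 0.9
belief = update(predict(belief), likelihood)
print(belief.argmax())   # cell with the highest posterior probability
```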

Example 1
- Step 0: initialization
- Step 1: updating

Example 1 (continued)
- Step 2: predicting
- Step 3: updating
- Step 4: predicting

Classical approximations
- Analytical methods:
  - extended Kalman filter
  - Gaussian sums (Alspach et al. 1971)
  These perform poorly in numerous cases of interest.
- Numerical methods:
  - point-mass approximations
  - splines (Bucy 1971, de Figueiredo 1974, ...)
  These are very complex to implement and not flexible.

Perfect Monte Carlo simulation
- Recall that random samples x_{0:k}^i, i = 1, ..., N, are drawn from the posterior distribution p(x_{0:k} | D_k).
- Represent the posterior distribution using this set of samples or particles.
- It is then easy to approximate expectations of the form
  E[g(x_{0:k})] = ∫ g(x_{0:k}) p(x_{0:k} | D_k) dx_{0:k}
- by the sample average
  E[g(x_{0:k})] ≈ (1/N) Σ_{i=1}^N g(x_{0:k}^i)
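The sample-average approximation above can be sketched in a few lines; the choice of p as a standard normal and g(x) = x² is an illustrative assumption, for which the true expectation is known to be 1.

```python
import numpy as np

# Approximate E[g(x)] under p by (1/N) * sum g(x_i), x_i ~ p.
rng = np.random.default_rng(0)
N = 100_000
samples = rng.standard_normal(N)      # x_i ~ p(x) = N(0, 1)
estimate = np.mean(samples**2)        # (1/N) * sum g(x_i), g(x) = x^2
print(estimate)                       # close to E[x^2] = 1
```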

Random samples and the pdf (I)
- Take p(x) = Gamma(4, 1)
- Generate some random samples
- Plot the histogram and a basic approximation to the pdf
[Figure: histogram of 200 samples]

Random samples and the pdf (II)
[Figures: histograms of 500 and 1000 samples]

Random samples and the pdf (III)
[Figures: histograms of … and 5000 samples]

Importance Sampling
- Unfortunately it is often not possible to sample directly from the posterior distribution, but we can use importance sampling.
- Let p(x) be a pdf from which it is difficult to draw samples.
- Let x^i ~ q(x), i = 1, ..., N, be samples that are easily generated from a proposal pdf q, called an importance density.
- Then an approximation to the density p is given by
  p(x) ≈ Σ_{i=1}^N w^i δ(x − x^i),  where  w^i ∝ p(x^i) / q(x^i)
  and the weights are normalised to sum to one.

Bayesian Importance Sampling
- By drawing samples x_{0:k}^i from a known, easy-to-sample proposal distribution q(x_{0:k} | D_k) we obtain:
  p(x_{0:k} | D_k) ≈ Σ_{i=1}^N w_k^i δ(x_{0:k} − x_{0:k}^i)
- where
  w_k^i ∝ p(x_{0:k}^i | D_k) / q(x_{0:k}^i | D_k)
  are normalised weights.
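A self-normalised importance sampling sketch: the target p is a standard normal known only up to a constant, the proposal q is a wider normal N(0, 2²); both densities are illustrative choices, not taken from the slides. Because the weights are normalised, constant factors in p and q cancel.

```python
import numpy as np

rng = np.random.default_rng(2)
N = 50_000
x = rng.normal(0.0, 2.0, size=N)          # x_i ~ q

p_unnorm = np.exp(-0.5 * x**2)            # p(x) up to a constant
q_unnorm = np.exp(-0.5 * (x / 2.0)**2) / 2.0  # q(x) up to the same constant
w = p_unnorm / q_unnorm                   # raw importance ratios
w /= w.sum()                              # normalised weights

mean_est = np.sum(w * x)                  # E_p[x]   ≈ 0
var_est = np.sum(w * x**2)                # E_p[x^2] ≈ 1
print(round(mean_est, 3), round(var_est, 3))
```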

Sequential Importance Sampling (I)
- Factorizing the proposal distribution:
  q(x_{0:k} | D_k) = q(x_k | x_{0:k-1}, D_k) q(x_{0:k-1} | D_{k-1})
- and remembering that the state evolution is modeled as a Markov process,
- we obtain a recursive estimate of the importance weights:
  w_k^i ∝ w_{k-1}^i · p(z_k | x_k^i) p(x_k^i | x_{k-1}^i) / q(x_k^i | x_{0:k-1}^i, D_k)
- The factorization is obtained by recursively applying Bayes’ rule and the Markov assumptions.

Sequential Importance Sampling (SIS) Particle Filter

SIS Particle Filter Algorithm (k is the index over time and i is the particle index):
  for i = 1:N
      draw a particle x_k^i ~ q(x_k | x_{k-1}^i, z_k)
      assign it a weight w_k^i using the weight recursion
  end
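One SIS step can be sketched as below. The model is an illustrative assumption, not the slides' example: random-walk dynamics x_k = x_{k-1} + v, v ~ N(0, 1), and observation z_k = x_k + n, n ~ N(0, 1). With the prior as the proposal, the weight recursion reduces to w_k = w_{k-1} · p(z_k | x_k).

```python
import numpy as np

rng = np.random.default_rng(3)

def sis_step(particles, weights, z, rng):
    # draw x_k^i from the prior proposal p(x_k | x_{k-1}^i)
    particles = particles + rng.standard_normal(len(particles))
    # weight recursion: w_k^i = w_{k-1}^i * p(z_k | x_k^i), then normalise
    likelihood = np.exp(-0.5 * (z - particles)**2)
    weights = weights * likelihood
    weights /= weights.sum()
    return particles, weights

N = 1000
particles = rng.standard_normal(N)       # samples from p(x_0) = N(0, 1)
weights = np.full(N, 1.0 / N)
particles, weights = sis_step(particles, weights, z=0.5, rng=rng)
print(np.sum(weights * particles))       # weighted posterior mean estimate
```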

Derivation of SIS weights (I)
- The main idea is factorizing:
  q(x_{0:t} | z_{1:t}) = q(x_t | x_{0:t-1}, z_{1:t}) q(x_{0:t-1} | z_{1:t-1})
  and
  p(x_{0:t} | z_{1:t}) ∝ p(z_t | x_{0:t}, z_{1:t-1}) p(x_t | x_{0:t-1}, z_{1:t-1}) p(x_{0:t-1} | z_{1:t-1})
- Our goal is to expand p and q recursively in time t.

Derivation of SIS weights (II)
- Substituting these factorizations into w_t ∝ p(x_{0:t} | z_{1:t}) / q(x_{0:t} | z_{1:t}),
- and under the Markov assumptions
  p(z_t | x_{0:t}, z_{1:t-1}) = p(z_t | x_t)  and  p(x_t | x_{0:t-1}, z_{1:t-1}) = p(x_t | x_{t-1}),
- we obtain the recursion
  w_t^i ∝ w_{t-1}^i · p(z_t | x_t^i) p(x_t^i | x_{t-1}^i) / q(x_t^i | x_{0:t-1}^i, z_{1:t})

SIS Particle Filter Foundation
- At each time step k:
  - random samples x_k^i are drawn from the proposal distribution q(x_k | x_{0:k-1}^i, D_k) for i = 1, ..., N
  - they represent the posterior distribution using a set of samples or particles
- The weights are given by
  w_k^i ∝ p(x_{0:k}^i | D_k) / q(x_{0:k}^i | D_k)
- and q factorizes as
  q(x_{0:k} | D_k) = q(x_k | x_{0:k-1}, D_k) q(x_{0:k-1} | D_{k-1})

Sequential Importance Sampling (II)
- Choice of the proposal distribution:
  - the proposal q(x_k | x_{0:k-1}, D_k) can be chosen to minimize the variance of the importance weights (Doucet et al. 1999)
- Although the common choice is the prior distribution:
  q(x_k | x_{k-1}, D_k) = p(x_k | x_{k-1})
- We then obtain the simple weight update
  w_k^i ∝ w_{k-1}^i p(z_k | x_k^i)

Sequential Importance Sampling (III)
- Illustration of SIS: [figure]
- Degeneracy problems:
  - the variance of the importance ratios increases stochastically over time (Kong et al. 1994; Doucet et al. 1999)
  - in most cases, after a few iterations all but one particle will have negligible weight

Sequential Importance Sampling (IV)
- Illustration of degeneracy: [figure]

SIS - why the variance increase is harmful
- Suppose we want to sample from the posterior, and we choose a proposal density very close to the posterior density.
- Then the importance ratios are all close to 1, and their variance is close to 0.
- So we need the variance of the ratios to stay close to 0 to obtain reasonable estimates:
  - thus a variance increase has a harmful effect on accuracy.

Sampling-Importance Resampling
- SIS suffers from degeneracy problems, so we don’t want to do that!
- Introduce a selection (resampling) step to eliminate samples with low importance ratios and multiply samples with high importance ratios.
- Resampling maps the weighted random measure {x_k^i, w_k^i} onto an equally weighted random measure {x_k^{j*}, 1/N}:
  - by sampling uniformly with replacement from {x_k^i : i = 1, ..., N} with probabilities {w_k^i}
- The scheme generates N_i children for particle i such that Σ_i N_i = N and satisfies E[N_i] = N w_k^i.
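A minimal sketch of multinomial resampling as described above; the four-particle weighted set is an illustrative assumption.

```python
import numpy as np

rng = np.random.default_rng(4)

def resample(particles, weights, rng):
    """Map {x_i, w_i} to an equally weighted set by drawing N indices
    with replacement, index i chosen with probability w_i."""
    N = len(particles)
    idx = rng.choice(N, size=N, replace=True, p=weights)
    return particles[idx], np.full(N, 1.0 / N)

particles = np.array([0.0, 1.0, 2.0, 3.0])
weights = np.array([0.7, 0.1, 0.1, 0.1])
new_particles, new_weights = resample(particles, weights, rng)
print(new_particles)   # the high-weight particle 0.0 tends to be duplicated
```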

Basic SIR Particle Filter - Schematic
Initialisation → importance sampling step → resampling step → extract estimate (each new measurement enters at the importance sampling step)

Basic SIR Particle Filter algorithm (I)
- Initialisation:
  - for i = 1, ..., N, sample x_0^i ~ p(x_0)
  - and set k = 1
- Importance Sampling step:
  - for i = 1, ..., N, sample x_k^i ~ p(x_k | x_{k-1}^i)
  - for i = 1, ..., N, compute the importance weights w_k^i
  - normalise the importance weights so that they sum to one

Basic SIR Particle Filter algorithm (II)
- Resampling step:
  - resample with replacement N particles x_k^{i*}
  - from the set {x_k^i : i = 1, ..., N}
  - according to the normalised importance weights
- Set k := k + 1 and proceed to the Importance Sampling step as the next measurement arrives.
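Putting the two slides together, here is an end-to-end sketch of the basic SIR (bootstrap) filter on a toy problem; the scalar random-walk model and the noise levels are illustrative assumptions, not the slides' example.

```python
import numpy as np

rng = np.random.default_rng(5)
N, T = 2000, 30
true_x = 0.0
particles = rng.standard_normal(N)          # initialisation: x_0^i ~ p(x_0)
estimates = []

for k in range(T):
    true_x += rng.normal(0.0, 0.5)          # simulate the hidden state
    z = true_x + rng.normal(0.0, 0.5)       # noisy measurement
    # importance sampling step: proposal = prior, weight = likelihood
    particles = particles + rng.normal(0.0, 0.5, size=N)
    w = np.exp(-0.5 * ((z - particles) / 0.5)**2)
    w /= w.sum()
    # resampling step: replace with an equally weighted particle set
    idx = rng.choice(N, size=N, replace=True, p=w)
    particles = particles[idx]
    estimates.append(particles.mean())      # extract estimate

print(round(abs(estimates[-1] - true_x), 2))  # tracking error, typically small
```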

Resampling
[Figure: resampling of the weighted particle set along x]

Generic SIR Particle Filter algorithm
M. S. Arulampalam, S. Maskell, N. Gordon, and T. Clapp, “A tutorial on particle filters …,” IEEE Trans. on Signal Processing, 50(2), 2002.

Improvements to SIR (I)
- A variety of resampling schemes with varying performance in terms of the variance of the particles:
  - residual sampling (Liu & Chen, 1998)
  - systematic sampling (Carpenter et al., 1999)
  - mixture of SIS and SIR, only resample when necessary (Liu & Chen, 1995; Doucet et al., 1999)
- Degeneracy may still be a problem:
  - during resampling a sample with a high importance weight may be duplicated many times
  - samples may eventually collapse to a single point

Improvements to SIR (II)
- To alleviate numerical degeneracy problems, sample smoothing methods may be adopted:
  - Roughening (Gordon et al., 1993): adds an independent jitter to the resampled particles.
  - Prior boosting (Gordon et al., 1993): increase the number of samples drawn from the proposal distribution to M > N, but in the resampling stage only draw N particles.
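The roughening step can be sketched as follows. The jitter scale K · E · N^(−1/d) (with E the sample spread and d the state dimension, here 1) follows the heuristic of Gordon et al. (1993); the particle values and the choice K = 0.2 are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(6)

def roughen(particles, K=0.2, rng=rng):
    """Add independent Gaussian jitter so duplicated particles
    produced by resampling become distinct again."""
    N = len(particles)
    spread = particles.max() - particles.min()   # E, the sample spread
    sigma = K * spread * N ** (-1.0)             # N^(-1/d) with d = 1
    return particles + rng.normal(0.0, sigma, size=N)

# after resampling, many particles may coincide:
resampled = np.array([1.0, 1.0, 1.0, 2.0, 2.0])
jittered = roughen(resampled)
print(len(np.unique(jittered)))   # all particles distinct again
```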

Improvements to SIR (III)
- Local Monte Carlo methods for alleviating degeneracy:
  - local linearisation - using an EKF (Doucet, 1999; Pitt & Shephard, 1999) or UKF (Doucet et al., 2000) to estimate the importance distribution
  - rejection methods (Müller, 1991; Doucet, 1999; Pitt & Shephard, 1999)
  - auxiliary particle filters (Pitt & Shephard, 1999)
  - kernel smoothing (Gordon, 1994; Hürzeler & Künsch, 1998; Liu & West, 2000; Musso et al., 2000)
  - MCMC methods (Müller, 1992; Gordon & Whitby, 1995; Berzuini et al., 1997; Gilks & Berzuini, 1998; Andrieu et al., 1999)

Improvements to SIR (IV)
- Illustration of SIR with sample smoothing: [figure]

Ingredients for SMC
- Importance sampling function:
  - Gordon et al.: the prior p(x_k | x_{k-1})
  - the optimal proposal q(x_k | x_{k-1}, z_k)
  - UKF: the pdf from a UKF approximation at the current step
- Redistribution (resampling) scheme:
  - Gordon et al.: SIR
  - Liu & Chen: residual
  - Carpenter et al.: systematic
  - Liu & Chen, Doucet et al.: resample only when necessary
- Careful initialisation procedure (for efficiency)

Particle filters
- Also known as Sequential Monte Carlo methods
- Represent the belief by sets of samples or particles
- The w^i are nonnegative weights called importance factors
- The updating procedure is sequential importance sampling with re-sampling

Example 2: Particle Filter
- Step 0: initialization. Each particle has the same weight.
- Step 1: updating weights. Weights are proportional to p(z|x).

Example 2: Particle Filter (continued)
- Step 2: predicting. Predict the new locations of particles.
- Step 3: updating weights. Weights are proportional to p(z|x).
- Step 4: predicting. Predict the new locations of particles.
- Particles are more concentrated in the region where the person is more likely to be.

Compare Particle Filter with Bayes Filter with Known Distribution
- Predicting: Example 1 vs. Example 2
- Updating: Example 1 vs. Example 2
[Figures: side-by-side comparison of the predicting and updating steps]

Particle Filters

Sensor Information: Importance Sampling

Robot Motion

Sensor Information: Importance Sampling

Robot Motion

Application Examples
- Robot localization
- Robot mapping
- Visual tracking, e.g., human motion (body parts)
- Prediction of (financial) time series, e.g., mapping gold price to stock price
- Target recognition from single or multiple images
- Guidance of missiles
- Contour grouping
- Matlab PF software for tracking
- Nice video demos: