
Tutorial on Particle Filters
Keith Copsey
Pattern and Information Processing Group, DERA Malvern
K.Copsey@signal.dera.gov.uk

Outline
- Introduction to particle filters
- Recursive Bayesian estimation
- Bayesian importance sampling
- Sequential importance sampling (SIS)
- Sampling-importance resampling (SIR)
- Improvements to SIR
- On-line Markov chain Monte Carlo
- Basic particle filter algorithm
- Examples
- Conclusions
- Demonstration

Particle Filters
Sequential Monte Carlo methods for on-line learning within a Bayesian framework. Also known as:
- Particle filters
- Sequential sampling-importance resampling (SIR)
- Bootstrap filters
- Condensation trackers
- Interacting particle approximations
- "Survival of the fittest"

Recursive Bayesian estimation (I)
Recursive filter:
- System model: $x_k = f_k(x_{k-1}, v_{k-1})$, i.e. the transition density $p(x_k \mid x_{k-1})$.
- Measurement model: $y_k = h_k(x_k, w_k)$, i.e. the likelihood $p(y_k \mid x_k)$.
- Information available: the measurements $D_k = \{y_1, \ldots, y_k\}$.

Recursive Bayesian estimation (II)
Seek the posterior $p(x_{k+i} \mid D_k)$:
- i = 0: filtering.
- i > 0: prediction.
- i < 0: smoothing.
Prediction:
$$p(x_k \mid D_{k-1}) = \int p(x_k \mid x_{k-1})\, p(x_{k-1} \mid D_{k-1})\, dx_{k-1},$$
since the state evolution is Markov: $p(x_k \mid x_{k-1}, D_{k-1}) = p(x_k \mid x_{k-1})$.

Recursive Bayesian estimation (III)
Update:
$$p(x_k \mid D_k) = \frac{p(y_k \mid x_k)\, p(x_k \mid D_{k-1})}{p(y_k \mid D_{k-1})},$$
where the normalising constant is
$$p(y_k \mid D_{k-1}) = \int p(y_k \mid x_k)\, p(x_k \mid D_{k-1})\, dx_k,$$
since the measurements are conditionally independent given the state: $p(y_k \mid x_k, D_{k-1}) = p(y_k \mid x_k)$.
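As a minimal numerical sketch of this predict-update recursion (an addition, not from the original slides; the scalar Gaussian model is an assumed example), the two integrals can be evaluated on a discretised one-dimensional grid:

```python
import numpy as np

# Discretised 1-D state space: the filtering integrals become sums on a grid.
grid = np.linspace(-5, 5, 201)
dx = grid[1] - grid[0]

def gauss(x, mu, sigma):
    return np.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))

# Assumed example model: x_k = x_{k-1} + v, v ~ N(0, 0.5^2); y_k = x_k + w, w ~ N(0, 1)
transition = gauss(grid[:, None], grid[None, :], 0.5)   # p(x_k | x_{k-1}) on the grid

prior = gauss(grid, 0.0, 2.0)                 # p(x_{k-1} | D_{k-1})
predicted = transition @ prior * dx           # prediction integral
y = 1.3                                       # newly arrived measurement (made up)
posterior = gauss(y, grid, 1.0) * predicted   # update: likelihood times prediction
posterior /= posterior.sum() * dx             # normalise by p(y_k | D_{k-1})
print("posterior mean:", (grid * posterior).sum() * dx)
```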

Classical approximations
- Analytical methods: extended Kalman filter, Gaussian sums, ... (Alspach et al., 1971). These perform poorly in numerous cases of interest.
- Numerical methods: point-mass approximations, splines (Bucy, 1971; de Figueiredo, 1974, ...). Very complex to implement and not flexible.

Perfect Monte Carlo simulation (I)
Introduce the notation $x_{0:k} = \{x_0, \ldots, x_k\}$ for the state trajectory. Represent the posterior distribution by a set of random samples, or particles, $\{x_{0:k}^{(i)}\}_{i=1}^{N}$, drawn from the posterior distribution itself:
$$p(x_{0:k} \mid D_k) \approx \frac{1}{N} \sum_{i=1}^{N} \delta\big(x_{0:k} - x_{0:k}^{(i)}\big).$$

Perfect Monte Carlo simulation (II)
Easy to approximate expectations of the form:
$$E[g(x_{0:k})] = \int g(x_{0:k})\, p(x_{0:k} \mid D_k)\, dx_{0:k}$$
by the sample average:
$$\hat{E}[g(x_{0:k})] = \frac{1}{N} \sum_{i=1}^{N} g\big(x_{0:k}^{(i)}\big).$$
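A short version of this approximation in code (an added sketch; the Gaussian here is a stand-in for a posterior we can sample directly):

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-in posterior we can sample directly: N(1, 2^2)
samples = rng.normal(loc=1.0, scale=2.0, size=100_000)

g = lambda x: x ** 2       # any function whose expectation we want
print(g(samples).mean())   # Monte Carlo estimate of E[g(x)]; exact value is 1 + 4 = 5
```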

Random samples and the pdf (I)
Take p(x) = Gamma(4,1). Generate some random samples, then plot a histogram as a basic approximation to the pdf (200 samples).
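A sketch reproducing this experiment (added; the bin count and seed are arbitrary choices):

```python
import numpy as np
import matplotlib.pyplot as plt
from scipy.stats import gamma

rng = np.random.default_rng(1)

# 200 samples from Gamma(shape=4, scale=1), as on the slide
samples = rng.gamma(shape=4.0, scale=1.0, size=200)

# Histogram as a basic piecewise-constant approximation to the pdf
plt.hist(samples, bins=30, density=True, alpha=0.6, label="200 samples")

# Overlay the true Gamma(4,1) density for comparison
x = np.linspace(0, 15, 300)
plt.plot(x, gamma.pdf(x, a=4.0), label="true pdf")
plt.legend()
plt.show()
```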

Random samples and the pdf (II)
[figure: histogram approximation to the pdf]

Random samples and the pdf (III)
[figure: histogram approximation to the pdf]

Bayesian Importance Sampling (I)
Unfortunately, it is often not possible to sample directly from the posterior distribution. Circumvent this by drawing from a known, easy-to-sample proposal distribution $q(x_{0:k} \mid D_k)$, giving:
$$E[g(x_{0:k})] = \int g(x_{0:k})\, \frac{p(x_{0:k} \mid D_k)}{q(x_{0:k} \mid D_k)}\, q(x_{0:k} \mid D_k)\, dx_{0:k}.$$

Bayesian Importance Sampling (II)
This can be written as
$$E[g(x_{0:k})] = \frac{1}{p(D_k)} \int g(x_{0:k})\, w_k(x_{0:k})\, q(x_{0:k} \mid D_k)\, dx_{0:k},$$
where
$$w_k(x_{0:k}) = \frac{p(D_k \mid x_{0:k})\, p(x_{0:k})}{q(x_{0:k} \mid D_k)}$$
are unnormalised importance weights. Now:
$$p(D_k) = \int w_k(x_{0:k})\, q(x_{0:k} \mid D_k)\, dx_{0:k}.$$

Bayesian Importance Sampling (III)
Giving:
$$E[g(x_{0:k})] = \frac{\int g(x_{0:k})\, w_k(x_{0:k})\, q(x_{0:k} \mid D_k)\, dx_{0:k}}{\int w_k(x_{0:k})\, q(x_{0:k} \mid D_k)\, dx_{0:k}},$$
so that:
$$\hat{E}[g(x_{0:k})] = \sum_{i=1}^{N} g\big(x_{0:k}^{(i)}\big)\, \tilde{w}_k^{(i)}, \qquad \tilde{w}_k^{(i)} = \frac{w_k^{(i)}}{\sum_{j=1}^{N} w_k^{(j)}},$$
where $\tilde{w}_k^{(i)}$ are normalised importance weights and $x_{0:k}^{(i)}$ are independent random samples from $q(x_{0:k} \mid D_k)$.
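A self-contained importance sampling sketch (added; the Gamma target and Gaussian proposal are assumed examples): estimate the mean of a target density we can only evaluate, never sample.

```python
import numpy as np
from scipy.stats import norm, gamma

rng = np.random.default_rng(2)
N = 50_000

# Target p(x) = Gamma(4,1): pretend we can only evaluate it, not sample it.
# Proposal q(x) = N(4, 3^2): easy to sample, covers the target's support well.
x = rng.normal(4.0, 3.0, size=N)

w = gamma.pdf(x, a=4.0) / norm.pdf(x, 4.0, 3.0)   # unnormalised importance weights
w_tilde = w / w.sum()                             # normalised importance weights

g = lambda x: x
print((g(x) * w_tilde).sum())    # close to 4.0, the mean of Gamma(4,1)
```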

Sequential Importance Sampling (I)
Factorising the proposal distribution:
$$q(x_{0:k} \mid D_k) = q(x_k \mid x_{0:k-1}, D_k)\, q(x_{0:k-1} \mid D_{k-1}),$$
and remembering that the state evolution is modelled as a Markov process, we obtain a recursive estimate of the importance weights:
$$w_k \propto w_{k-1}\, \frac{p(y_k \mid x_k)\, p(x_k \mid x_{k-1})}{q(x_k \mid x_{0:k-1}, D_k)}.$$

Derivation of the SIS weights
Since:
$$p(x_{0:k} \mid D_k) \propto p(y_k \mid x_k)\, p(x_k \mid x_{k-1})\, p(x_{0:k-1} \mid D_{k-1}),$$
we have:
$$w_k = \frac{p(x_{0:k} \mid D_k)}{q(x_{0:k} \mid D_k)} \propto \frac{p(y_k \mid x_k)\, p(x_k \mid x_{k-1})\, p(x_{0:k-1} \mid D_{k-1})}{q(x_k \mid x_{0:k-1}, D_k)\, q(x_{0:k-1} \mid D_{k-1})}$$
and therefore:
$$w_k \propto w_{k-1}\, \frac{p(y_k \mid x_k)\, p(x_k \mid x_{k-1})}{q(x_k \mid x_{0:k-1}, D_k)}.$$

Sequential Importance Sampling (II)
Choice of the proposal distribution: choose the proposal to minimise the variance of the importance weights $w_k$ (Doucet et al., 1999), which gives the optimal proposal $q(x_k \mid x_{0:k-1}, D_k) = p(x_k \mid x_{k-1}, y_k)$. Although optimal, this is usually hard to sample from; the common choice is the prior distribution:
$$q(x_k \mid x_{0:k-1}, D_k) = p(x_k \mid x_{k-1}),$$
in which case the weight update reduces to $w_k \propto w_{k-1}\, p(y_k \mid x_k)$.
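A minimal SIS loop with the prior as proposal (an added sketch; the scalar linear-Gaussian model is an assumed example). With this choice each weight update is just a multiplication by the likelihood, and the effective sample size exposes the degeneracy discussed on the next slide:

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(3)
N, T = 500, 50

# Assumed example model: x_k = 0.9 x_{k-1} + v, v ~ N(0,1); y_k = x_k + w, w ~ N(0,0.5^2)
x_true, y = np.zeros(T), np.zeros(T)
for k in range(1, T):
    x_true[k] = 0.9 * x_true[k - 1] + rng.normal()
    y[k] = x_true[k] + rng.normal(scale=0.5)

particles = rng.normal(size=N)   # samples from the prior p(x_0)
logw = np.zeros(N)               # log-weights, for numerical stability

for k in range(1, T):
    particles = 0.9 * particles + rng.normal(size=N)      # propagate through the prior
    logw += norm.logpdf(y[k], loc=particles, scale=0.5)   # w_k ∝ w_{k-1} p(y_k | x_k)

w = np.exp(logw - logw.max())
w /= w.sum()
print("effective sample size:", 1.0 / np.sum(w ** 2), "of", N)   # collapses without resampling
```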

Sequential Importance Sampling (III)
Illustration of SIS: [figure]
Degeneracy problem: the variance of the importance ratios increases stochastically over time (Kong et al., 1994; Doucet et al., 1999).

SIS - why a variance increase is bad
Suppose we want to sample from the posterior, and choose a proposal density very close to the posterior density. Then
$$\mathrm{E}_q\!\left[\frac{p(x_{0:k} \mid D_k)}{q(x_{0:k} \mid D_k)}\right] = 1 \quad\text{and}\quad \mathrm{Var}_q\!\left[\frac{p(x_{0:k} \mid D_k)}{q(x_{0:k} \mid D_k)}\right] \approx 0.$$
So we expect the variance of the importance ratios to be close to 0 in order to obtain reasonable estimates; thus a variance increase over time has a harmful effect on accuracy.

Sequential Importance Sampling (IV)
Illustration of degeneracy: [figure]

Sampling-Importance Resampling (SIR)
SIS suffers from degeneracy problems, so we don't want to do that! Introduce a selection (resampling) step to eliminate samples with low importance ratios and multiply samples with high importance ratios.
Resampling maps the weighted random measure $\{x_{0:k}^{(i)}, \tilde{w}_k^{(i)}\}$ onto the equally weighted random measure $\{x_{0:k}^{(j)}, 1/N\}$ by sampling uniformly with replacement from $\{x_{0:k}^{(i)}\}_{i=1}^{N}$ with probabilities $\{\tilde{w}_k^{(i)}\}$. The scheme generates $N_i$ children for particle $i$ such that $\sum_{i} N_i = N$ and satisfies $\mathrm{E}[N_i] = N\,\tilde{w}_k^{(i)}$.
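A sketch of the basic multinomial selection step (added; the toy values are made up):

```python
import numpy as np

rng = np.random.default_rng(4)

def multinomial_resample(particles, weights, rng):
    """SIR selection step: draw N indices with replacement, with
    probability proportional to the normalised importance weights."""
    N = len(particles)
    idx = rng.choice(N, size=N, replace=True, p=weights)
    return particles[idx], np.full(N, 1.0 / N)   # equally weighted measure

particles = np.array([-1.0, 0.0, 1.0, 2.0])
weights = np.array([0.05, 0.10, 0.70, 0.15])
new_particles, new_weights = multinomial_resample(particles, weights, rng)
print(new_particles)   # the high-weight particle 1.0 is typically duplicated
```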

Improvements to SIR (I)
Variety of resampling schemes, with varying performance in terms of the variance of the resampled particle counts $N_i$:
- Residual sampling (Liu & Chen, 1998).
- Systematic sampling (Carpenter et al., 1999); see the sketch below.
- Mixture of SIS and SIR: only resample when necessary (Liu & Chen, 1995; Doucet et al., 1999).
Degeneracy may still be a problem:
- During resampling, a sample with a high importance weight may be duplicated many times.
- Samples may eventually collapse to a single point.
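A sketch of two of these ideas (added): systematic resampling, and resampling only when the effective sample size $N_\mathrm{eff} = 1/\sum_i (\tilde{w}_k^{(i)})^2$ falls below a threshold; the N/2 threshold is a common but assumed choice.

```python
import numpy as np

def systematic_resample(weights, rng):
    """Systematic resampling: one uniform draw, N evenly spaced pointers.
    Lower variance in the offspring counts than multinomial resampling."""
    N = len(weights)
    positions = (rng.random() + np.arange(N)) / N
    return np.searchsorted(np.cumsum(weights), positions)

def maybe_resample(particles, weights, rng, threshold=None):
    """Resample only when the effective sample size drops below a threshold."""
    N = len(weights)
    threshold = N / 2 if threshold is None else threshold   # assumed common choice
    if 1.0 / np.sum(weights ** 2) < threshold:
        idx = systematic_resample(weights, rng)
        return particles[idx], np.full(N, 1.0 / N)
    return particles, weights
```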

Improvements to SIR (II)
To alleviate numerical degeneracy problems, sample smoothing methods may be adopted (a roughening sketch follows this list):
- Roughening (Gordon et al., 1993): add an independent jitter to the resampled particles.
- Prior boosting (Gordon et al., 1993): increase the number of samples drawn from the proposal distribution to M > N, but draw only N particles in the resampling stage.
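A roughening sketch (added). The jitter scale follows the Gordon et al. (1993) style, with standard deviation proportional to the per-dimension particle spread times N^(-1/d); the constant K is an assumed tuning parameter.

```python
import numpy as np

def roughen(particles, rng, K=0.2):
    """Add independent Gaussian jitter to resampled particles to combat
    sample impoverishment. `particles` has shape (N, d)."""
    N, d = particles.shape
    spread = particles.max(axis=0) - particles.min(axis=0)   # per-dimension range
    sigma = K * spread * N ** (-1.0 / d)                     # jitter std per dimension
    return particles + rng.normal(scale=sigma, size=particles.shape)
```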

Improvements to SIR (III)
Local Monte Carlo methods for alleviating degeneracy:
- Local linearisation: using an EKF (Doucet, 1999; Pitt & Shephard, 1999) or UKF (Doucet et al., 2000) to estimate the importance distribution.
- Rejection methods (Müller, 1991; Doucet, 1999; Pitt & Shephard, 1999).
- Auxiliary particle filters (Pitt & Shephard, 1999).
- Kernel smoothing (Gordon, 1994; Hürzeler & Künsch, 1998; Liu & West, 2000; Musso et al., 2000).
- MCMC methods (Müller, 1992; Gordon & Whitby, 1995; Berzuini et al., 1997; Gilks & Berzuini, 1998; Andrieu et al., 1999).

Improvements to SIR (IV)
Illustration of SIR with sample smoothing: [figure]

MCMC move step
Improve results by introducing MCMC steps with invariant distribution $p(x_{0:k} \mid D_k)$. By applying a Markov transition kernel, the total variation of the current distribution w.r.t. the invariant distribution can only decrease. This also introduces the possibility of a variable-dimension state space through the use of reversible-jump MCMC (de Freitas et al., 1999; Gilks & Berzuini, 2001).
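A sketch of one such move (added): a random-walk Metropolis-Hastings kernel applied to every particle after resampling. `log_target` (the log of the unnormalised invariant density, assumed vectorised) and the step size are supplied by the user; because the accept/reject rule leaves the target invariant, moved particles remain valid samples.

```python
import numpy as np

def mh_move(particles, log_target, rng, step=0.1):
    """One random-walk Metropolis-Hastings step per particle. The Gaussian
    proposal is symmetric, so the acceptance ratio is just the target ratio."""
    proposals = particles + rng.normal(scale=step, size=particles.shape)
    log_alpha = log_target(proposals) - log_target(particles)
    accept = np.log(rng.random(len(particles))) < log_alpha
    particles = particles.copy()
    particles[accept] = proposals[accept]
    return particles
```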

Ingredients for SMC
Importance sampling function:
- Gordon et al.: the prior, $p(x_k \mid x_{k-1})$.
- Optimal: $p(x_k \mid x_{k-1}, y_k)$.
- UKF: proposal pdf obtained from a UKF.
Redistribution scheme:
- Gordon et al.: SIR.
- Liu & Chen: residual.
- Carpenter et al.: systematic.
- Liu & Chen, Doucet et al.: resample when necessary.
Careful initialisation procedure (for efficiency).

Basic Particle Filter - Schematic
[diagram: initialisation → importance sampling step (incorporating the measurement $y_k$) → extract estimate → resampling step → loop back for the next measurement]

Basic Particle Filter algorithm (I)
Initialisation: set $k = 0$. For $i = 1, \ldots, N$, sample $x_0^{(i)} \sim p(x_0)$ and set $\tilde{w}_0^{(i)} = 1/N$. In practice, to avoid having to take too many samples, for the first step we may want to ensure that we have a reasonable number of particles in the region of high likelihood, perhaps using MCMC techniques.

Basic Particle Filter algorithm (II)
Importance sampling step:
- For $i = 1, \ldots, N$, sample $\hat{x}_k^{(i)} \sim p(x_k \mid x_{k-1}^{(i)})$.
- For $i = 1, \ldots, N$, evaluate the importance weights $w_k^{(i)} = p(y_k \mid \hat{x}_k^{(i)})$.
- Normalise the importance weights: $\tilde{w}_k^{(i)} = w_k^{(i)} / \sum_{j=1}^{N} w_k^{(j)}$, and set the particle set to $\{\hat{x}_k^{(i)}, \tilde{w}_k^{(i)}\}$.

Basic Particle Filter algorithm (III)
Resampling step:
- Resample with replacement N particles $\{x_k^{(i)}\}$ from the set $\{\hat{x}_k^{(i)}\}$ according to the normalised importance weights $\tilde{w}_k^{(i)}$.
- Set $k \leftarrow k + 1$ and proceed to the importance sampling step as the next measurement arrives.
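Putting the three steps together, a compact bootstrap filter sketch (added; the scalar linear-Gaussian model is an assumed example, chosen because a Kalman filter gives the exact answer for comparison):

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(5)
N, T = 1000, 100

# Assumed example model: x_k = 0.9 x_{k-1} + v, v ~ N(0,1); y_k = x_k + w, w ~ N(0,0.5^2)
x_true, y = np.zeros(T), np.zeros(T)
for k in range(1, T):
    x_true[k] = 0.9 * x_true[k - 1] + rng.normal()
    y[k] = x_true[k] + rng.normal(scale=0.5)

particles = rng.normal(size=N)   # initialisation: sample the prior, equal weights
estimates = np.zeros(T)

for k in range(1, T):
    # Importance sampling step (proposal = prior dynamics, weights = likelihood)
    particles = 0.9 * particles + rng.normal(size=N)
    w = norm.pdf(y[k], loc=particles, scale=0.5)
    w /= w.sum()
    estimates[k] = np.sum(w * particles)   # extract estimate (weighted mean)
    # Resampling step: multinomial, with replacement
    particles = particles[rng.choice(N, size=N, p=w)]

print("RMS error:", np.sqrt(np.mean((estimates[1:] - x_true[1:]) ** 2)))
```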

Example - On-line Data Fusion (Marrs, 2000).

Example - Sensor Deployment
Aim: reduce the target standard deviation below some threshold, and keep it there, by placing the minimum number of sensors possible. Sensor positions are chosen according to the particle distribution.

Example - In-situ monitoring of growing semiconductor crystal composition (Si$_{1-x}$Ge$_x$ on a substrate)

Book Advert (or put this in or you're fired)
Sequential Monte Carlo Methods in Practice, editors: Doucet, de Freitas, Gordon; Springer-Verlag (2001).
- Theoretical foundations, plus convergence proofs.
- Efficiency measures.
- Applications: target tracking; missile guidance; image tracking; terrain-referenced navigation; exchange rate prediction; portfolio allocation; ellipsometry; electricity load forecasting; pollution monitoring; population biology; communications and audio engineering.
ISBN 0-387-95146-6, price $79.95.

Conclusions
- On-line Bayesian learning is a realistic proposition for many applications.
- Appropriate for complex non-linear/non-Gaussian models; don't bother if a KF-based solution is adequate.
- Representation of the full posterior pdf, leading to: estimation of moments; estimation of HPD regions; easy handling of multi-modality.
- Model order can be included in the unknowns.
- SMC and KF-based solutions can be mixed.

Tracking Demo
- Illustrates a running particle filter and compares it with a Kalman filter.
- Running as we watch, not pre-recorded.
- Pre-defined scenarios, or design your own.
- Available to play with at coffee and lunch breaks.

2nd Book Advert
Statistical Pattern Recognition, Andrew Webb, DERA. Butterworth Heinemann, paperback, 1999, £29.99, ISBN 0340741643.
Contents: Introduction to SPR; Estimation; Density estimation; Linear discriminant analysis; Nonlinear discriminant analysis - neural networks; Nonlinear discriminant analysis - statistical methods; Classification trees; Feature selection and extraction; Clustering; Additional topics; Measures of dissimilarity; Parameter estimation; Linear algebra; Data; Probability theory.