A brief Introduction to Particle Filters


A brief Introduction to Particle Filters Michael Pfeiffer pfeiffer@igi.tugraz.at 18.05.2004

Agenda Problem Statement Classical Approaches Particle Filters Theory Algorithms Applications

Problem Statement Tracking the state of a system as it evolves over time We have: Sequentially arriving (noisy or ambiguous) observations We want to know: Best possible estimate of the hidden variables

Illustrative Example: Robot Localization (t=0) Sensor model: the sensor makes at most one mistake. Motion model: with small probability, the robot may fail to execute an action.

Illustrative Example: Robot Localization (t=1)

Illustrative Example: Robot Localization (t=2)

Illustrative Example: Robot Localization (t=3)

Illustrative Example: Robot Localization (t=4)

Illustrative Example: Robot Localization (t=5) Trajectory: 1 → 2 → 3 → 4

Applications Tracking aircraft positions from radar Estimating communications signals from noisy measurements Predicting economic data Tracking people or cars in surveillance videos

Bayesian Filtering / Tracking Problem Unknown state vector x0:t = (x0, …, xt) Observation vector z1:t Find the PDF p(x0:t | z1:t) … posterior distribution, or p(xt | z1:t) … filtering distribution Prior information given: p(x0) … prior on the state distribution p(zt | xt) … sensor model p(xt | xt-1) … Markovian state-space model The complete solution provides the optimal estimate together with a measure of its accuracy Sensor and state model are given as transition functions (dependent on some noise variable)

Sequential Update Storing all incoming measurements is inconvenient Recursive filtering: Predict next state pdf from current estimate Update the prediction using sequentially arriving new measurements Optimal Bayesian solution: recursively calculating exact posterior density

Bayesian Update and Prediction Update via Bayes' rule It is usually infeasible to calculate the exact posterior density analytically; this is possible only in restricted cases (Kalman filter, next section)

Agenda Problem Statement Classical Approaches Particle Filters Theory Algorithms Applications

Kalman Filter Optimal solution for linear-Gaussian case Assumptions: State model is known linear function of last state and Gaussian noise signal Sensory model is known linear function of state and Gaussian noise signal Posterior density is Gaussian

Kalman Filter: Update Equations
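The update equations did not survive transcription. The standard predict/correct form for a linear-Gaussian model xt = A xt-1 + vt, zt = H xt + wt is reconstructed below; the symbols A (state transition), H (observation matrix), Q and R (noise covariances) follow the common textbook convention rather than the original slide:

```latex
% Prediction
\hat{x}_{t\mid t-1} = A\,\hat{x}_{t-1\mid t-1}, \qquad
P_{t\mid t-1} = A\,P_{t-1\mid t-1}A^{\top} + Q
% Update (Kalman gain, state correction, covariance correction)
K_t = P_{t\mid t-1}H^{\top}\bigl(H P_{t\mid t-1} H^{\top} + R\bigr)^{-1}
\hat{x}_{t\mid t} = \hat{x}_{t\mid t-1} + K_t\bigl(z_t - H\,\hat{x}_{t\mid t-1}\bigr), \qquad
P_{t\mid t} = (I - K_t H)\,P_{t\mid t-1}
```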

Limitations of Kalman Filtering The assumptions are too strong. We often find: Non-linear models Non-Gaussian noise or posterior Multi-modal distributions Skewed distributions Extended Kalman Filter: local linearization of non-linear models, but still limited to a Gaussian posterior Note: no algorithm can do better in a linear-Gaussian environment

Grid-based Methods Optimal for discrete and finite state space Keep and update an estimate of posterior pdf for every single state No constraints on posterior (discrete) density

Limitations of Grid-based Methods Computationally expensive Only for finite state sets Approximate Grid-based Filter divide continuous state space into finite number of cells Hidden Markov Model Filter Dimensionality increases computational costs dramatically

Agenda Problem Statement Classical Approaches Particle Filters Theory Algorithms Applications

Many different names… Particle Filters (Sequential) Monte Carlo filters Bootstrap filters Condensation Interacting Particle Approximations Survival of the fittest …

Sample-based PDF Representation Monte Carlo characterization of a pdf: Represent the posterior density by a set of N random i.i.d. samples (particles) from the pdf p(x0:t|z1:t) For a large number N of particles this is equivalent to a functional description of the pdf; for N → ∞ it approaches the optimal Bayesian estimate

Sample-based PDF Representation Regions of high density are represented by many particles, or by particles with large weight (uneven partitioning) A discrete approximation of a continuous pdf

Importance Sampling Draw N samples x0:t(i) from an importance sampling distribution π(x0:t|z1:t) Attach to each sample an importance weight Arbitrary functions ft can then be estimated from the weighted samples It is usually impossible to draw efficiently from p itself → importance sampling
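The slide's formulas were lost; reconstructed from the standard formulation, the weights correct for sampling from π instead of p, and expectations are approximated by weighted sums (tilde denotes normalized weights):

```latex
w_t^{(i)} \propto
\frac{p\!\left(x_{0:t}^{(i)} \mid z_{1:t}\right)}
     {\pi\!\left(x_{0:t}^{(i)} \mid z_{1:t}\right)}, \qquad
\mathbb{E}\!\left[f_t(x_{0:t})\right] \approx
\sum_{i=1}^{N} \tilde{w}_t^{(i)}\, f_t\!\left(x_{0:t}^{(i)}\right)
```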

Sequential Importance Sampling (SIS) Augment the samples with the new state at each step Update the weights by factoring the importance density By the Markov assumption it is not necessary to keep all former states in memory
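The recursive weight update, reconstructed from the standard SIS formulation since the slide's equation was lost in transcription (q denotes the importance density):

```latex
w_t^{(i)} \propto w_{t-1}^{(i)} \,
\frac{p\!\left(z_t \mid x_t^{(i)}\right)\, p\!\left(x_t^{(i)} \mid x_{t-1}^{(i)}\right)}
     {q\!\left(x_t^{(i)} \mid x_{t-1}^{(i)}, z_t\right)}
```

That q conditions only on the previous state and the current observation is exactly the Markov assumption referred to above.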

Degeneracy Problem After a few iterations, all but one particle will have negligible weight Updating these particles costs a lot of computation, although their contribution to the approximation of p(xt|z1:t) is almost zero Measure for degeneracy: effective sample size Neff = N / (1 + Var(wt*)), where wt* are the true weights at time t A small Neff indicates severe degeneracy Brute-force solution: use a very large N
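In practice Neff is estimated from the normalized weights as 1/Σ(wt(i))². A minimal sketch (the function name is illustrative, not from the slides):

```python
import numpy as np

def effective_sample_size(weights):
    """Estimate N_eff = 1 / sum(w_i^2) from importance weights."""
    w = np.asarray(weights, dtype=float)
    w = w / w.sum()  # make sure the weights are normalized
    return 1.0 / np.sum(w ** 2)

# Uniform weights: no degeneracy, N_eff equals N
print(effective_sample_size([0.25, 0.25, 0.25, 0.25]))  # 4.0
# One dominant weight: severe degeneracy, N_eff close to 1
print(effective_sample_size([0.97, 0.01, 0.01, 0.01]))
```

For uniform weights the estimate equals N (no degeneracy); as one particle begins to dominate, it approaches 1.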

Choosing the Importance Density Choose π to minimize the variance of the weights Optimal solution: sample from p(xt | xt-1(i), zt) Practical solution: importance density = prior p(xt | xt-1) Drawbacks of the optimal solution: sampling from a complicated distribution, and evaluation of an integral over the new state for the weight update Restricted cases where the optimal choice is feasible: finite state set, Gaussian transition model

Resampling Eliminate particles with small importance weights, concentrate on particles with large weights Sample N times with replacement from the set of particles xt(i) according to the importance weights wt(i): "survival of the fittest" Resample whenever the effective sample size falls below a certain threshold The resulting particles have uniform weight
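One common implementation is systematic resampling, sketched below; this is one possible scheme under the stated assumptions, the slides do not prescribe a particular one:

```python
import numpy as np

def systematic_resample(particles, weights, rng=None):
    """Resample N particles with replacement according to their weights.

    Systematic resampling: a single uniform draw, then N evenly spaced
    positions stepped through the cumulative weight distribution.
    """
    rng = np.random.default_rng() if rng is None else rng
    n = len(particles)
    w = np.asarray(weights, dtype=float)
    w = w / w.sum()
    cs = np.cumsum(w)
    cs[-1] = 1.0  # guard against floating-point round-off
    positions = (rng.random() + np.arange(n)) / n
    indices = np.searchsorted(cs, positions)
    # After resampling, every surviving particle gets uniform weight 1/N
    return particles[indices], np.full(n, 1.0 / n)
```

Particles with large weights are duplicated several times, particles with negligible weights are likely to disappear, and the returned weights are uniform, matching the slide's remark.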

Sampling Importance Resampling (SIR) Filter: Basic Algorithm
1. INIT, t=0: for i=1,…,N: sample x0(i) ~ p(x0); set t:=1
2. IMPORTANCE SAMPLING: for i=1,…,N: sample xt(i) ~ p(xt|xt-1(i)) and set x0:t(i) := (x0:t-1(i), xt(i)); for i=1,…,N: evaluate the importance weights wt(i) = p(zt|xt(i)); normalize the importance weights
3. SELECTION / RESAMPLING: resample with replacement N particles x0:t(i) according to the importance weights; set t:=t+1 and go to step 2
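The three steps above can be sketched as a minimal bootstrap (SIR) filter for an illustrative 1-D random-walk model with Gaussian observation noise; the model, parameter values, and function name are assumptions for the example, not taken from the slides:

```python
import numpy as np

def bootstrap_filter(observations, n_particles=1000,
                     process_std=1.0, obs_std=1.0, rng=None):
    """SIR particle filter for x_t = x_{t-1} + process noise,
    z_t = x_t + observation noise (both Gaussian).

    Returns the posterior-mean estimate of x_t for each observation.
    """
    rng = np.random.default_rng() if rng is None else rng
    # 1. INIT: sample particles from a broad prior p(x0)
    particles = rng.normal(0.0, 5.0, n_particles)
    estimates = []
    for z in observations:
        # 2. IMPORTANCE SAMPLING: propagate through p(x_t | x_{t-1})
        particles = particles + rng.normal(0.0, process_std, n_particles)
        # weights w_t = p(z_t | x_t) (Gaussian likelihood), then normalize
        weights = np.exp(-0.5 * ((z - particles) / obs_std) ** 2)
        weights /= weights.sum()
        estimates.append(np.sum(weights * particles))
        # 3. SELECTION / RESAMPLING: multinomial resampling
        idx = rng.choice(n_particles, size=n_particles, p=weights)
        particles = particles[idx]
    return np.array(estimates)
```

Because the importance density is the prior p(xt|xt-1), the weight update reduces to the likelihood p(zt|xt), exactly as in step 2 of the algorithm.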

Variations Auxiliary Particle Filter: resample at time t-1 with a one-step lookahead (re-evaluate with the new sensory information) Regularisation: resample from a continuous approximation of the posterior p(xt|z1:t)

Visualization of the particle filter cycle: start from an unweighted measure, compute importance weights to approximate p(xt-1|z1:t-1), resample, then move the particles to predict p(xt|z1:t-1)

Particle Filter Demo 1: moving Gaussian + uniform, N=100 particles

Particle Filter Demo 2: moving Gaussian + uniform, N=1000 particles

Particle Filter Demo 3: moving (sharp) Gaussian + uniform, N=100 particles

Particle Filter Demo 4: moving (sharp) Gaussian + uniform, N=1000 particles

Particle Filter Demo 5: mixture of two Gaussians; the filter loses track of the smaller, less pronounced peaks

Obtaining State Estimates from Particles Any estimate of a function f(xt) can be calculated from the discrete PDF approximation Mean: weighted average of the particles MAP estimate: particle with the largest weight Robust mean: mean within a window around the MAP estimate
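Under the same discrete approximation, the three estimators can be sketched as follows (function names and the window size are illustrative choices):

```python
import numpy as np

def mean_estimate(particles, weights):
    """Posterior mean: weighted average of the particles."""
    return np.sum(weights * particles)

def map_estimate(particles, weights):
    """MAP-style estimate: the particle with the largest weight."""
    return particles[np.argmax(weights)]

def robust_mean(particles, weights, window):
    """Weighted mean restricted to a window around the MAP estimate."""
    center = map_estimate(particles, weights)
    mask = np.abs(particles - center) <= window
    w = weights[mask]
    return np.sum(w * particles[mask]) / w.sum()
```

The robust mean discards far-away particles (for instance a secondary mode), which makes it less sensitive to multi-modality than the plain mean.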

Pros and Cons of Particle Filters Pros: estimation of full PDFs; non-Gaussian (e.g. multi-modal) distributions; non-linear state and observation models; parallelizable Cons: degeneracy problem; a high number of particles is needed; computationally expensive; and the linear-Gaussian assumption is often sufficient

Agenda Problem Statement Classical Approaches Particle Filters Theory Algorithms Applications

Mobile Robot Localization Animation by Sebastian Thrun, Stanford http://robots.stanford.edu

Positioning Systems1 Track car position in a given road map Track car position from radio frequency measurements Track aircraft position from estimated terrain elevation Collision avoidance (prediction) Replacement for GPS 1: Gustafsson et al.: Particle Filters for Positioning, Navigation and Tracking. IEEE Transactions on Signal Processing, Vol. 50, 2002

Model Estimation Tracking with multiple motion models: a discrete hidden variable indicates the active model (maneuver) Recovery of a signal from noisy measurements, even if the signal may be absent (e.g. synaptic currents): mixture model of several hypotheses Neural network model selection [de Freitas]1: estimate parameters and architecture of an RBF network from input-output pairs; on-line classification (time-varying classes) 1: de Freitas et al.: Sequential Monte Carlo Methods for Neural Networks. In: Doucet et al.: Sequential Monte Carlo Methods in Practice, Springer Verlag, 2001

Other Applications Visual tracking, e.g. human motion (body parts) Prediction of (financial) time series, e.g. mapping gold price → stock price Quality control in the semiconductor industry Military applications: target recognition from single or multiple images, guidance of missiles

Possible Uses for our Group Reinforcement Learning POMDPs Estimating Opponent States RoboCup: Multi-robot localization and tracking Representation of PDFs Prediction Tasks Preprocessing of Visual Input Identifying Neural Network Layouts or other Hidden System Parameters Applications in Computational Neuroscience (?) Other suggestions?

Sources Doucet, de Freitas, Gordon: Sequential Monte Carlo Methods in Practice, Springer Verlag, 2001 Arulampalam, Maskell, Gordon, Clapp: A Tutorial on Particle Filters for Online Nonlinear/Non-Gaussian Bayesian Tracking, IEEE Transactions on Signal Processing, Vol. 50, 2002

Thank you!