Tracking with focus on the particle filter (part II) Michael Rubinstein IDC

Last time…
– Background: state space, dynamic systems, recursive Bayesian filters
– Restrictive cases: Kalman filter, grid-based filter
– Suboptimal approximations: extended Kalman filter
© Michael Rubinstein

This talk
– Particle filtering: MC integration, Sequential Importance Sampling (SIS), resampling, PF variants
– Multiple-target tracking: BraMBLe: A Bayesian Multiple-Blob Tracker / Isard, MacCormick
© Michael Rubinstein

Dynamic System
State equation: x_k = f_k(x_{k-1}, v_{k-1}), where x_k is the state vector at time instant k, f_k is the state transition function, and v_{k-1} is i.i.d. process noise.
Observation equation: z_k = h_k(x_k, n_k), where z_k is the observation at time instant k, h_k is the observation function, and n_k is i.i.d. measurement noise.
(Figure: the graphical model x_{k-1} → x_k → x_{k+1} with observations z_{k-1}, z_k, z_{k+1}; stochastic diffusion.)
© Michael Rubinstein

Recursive Bayes filter
Prediction: p(x_k | z_{1:k-1}) = ∫ p(x_k | x_{k-1}) p(x_{k-1} | z_{1:k-1}) dx_{k-1}   (1)
Update: p(x_k | z_{1:k}) = p(z_k | x_k) p(x_k | z_{1:k-1}) / p(z_k | z_{1:k-1})   (2)
(Figure: the sample space.)
© Michael Rubinstein

Particle filtering
Many variations, one general concept: represent the posterior pdf by a set of randomly chosen weighted samples (particles). Randomly chosen = Monte Carlo (MC). As the number of samples becomes very large, this characterization becomes an equivalent representation of the true pdf.
(Figure: sample space and posterior.)
© Michael Rubinstein

Particle filtering
Compared to the methods we mentioned last time:
– Can represent any arbitrary distribution (multimodal support)
– Keeps track of as many hypotheses as there are particles
– Approximate representation of a complex model rather than exact representation of a simplified model
The basic building block: importance sampling.
© Michael Rubinstein

Monte Carlo integration
Evaluate complex integrals using probabilistic techniques. Assume we are trying to estimate a complicated integral of a function f over some domain D: F = ∫_D f(x) dx. Also assume there exists some PDF p defined over D.
© Michael Rubinstein

Monte Carlo integration
Then F = ∫_D f(x) dx = ∫_D (f(x)/p(x)) p(x) dx. But ∫_D (f(x)/p(x)) p(x) dx = E_p[ f(x)/p(x) ]. This is true for any PDF p over D!
© Michael Rubinstein

Monte Carlo integration
Now, if we have N i.i.d. random samples x_1, …, x_N sampled from p, then we can approximate F by F̂_N = (1/N) Σ_i f(x_i)/p(x_i). Convergence is guaranteed by the law of large numbers: F̂_N → F almost surely as N → ∞.
© Michael Rubinstein
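A minimal numerical sketch of this estimator (my own illustration, not part of the original slides), using an arbitrary integrand f(x) = exp(-x^2) on [0, 1] with a uniform pdf p:

```python
import numpy as np

def mc_integrate(f, p_sample, p_pdf, n_samples=100_000, seed=0):
    """Estimate the integral of f as the average of f(x_i)/p(x_i) for x_i ~ p."""
    rng = np.random.default_rng(seed)
    x = p_sample(rng, n_samples)           # i.i.d. samples from p
    return np.mean(f(x) / p_pdf(x))        # (1/N) * sum_i f(x_i)/p(x_i)

f = lambda x: np.exp(-x ** 2)
estimate = mc_integrate(
    f,
    p_sample=lambda rng, n: rng.uniform(0.0, 1.0, n),   # sample from U[0, 1]
    p_pdf=lambda x: np.ones_like(x),                    # uniform pdf on [0, 1]
)
print(estimate)   # ~0.7468, the true value of the integral
```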

Importance Sampling (IS)
What about the variance of the estimator? If p(x_i) is very small, f(x_i)/p(x_i) can be arbitrarily large, 'damaging' the average. Design p such that f/p is bounded. Rule of thumb: take p as similar to f as possible. The effect: we get more samples in 'important' areas of f, i.e. where f is large. The ratios f(x_i)/p(x_i) are the importance weights, and p is the importance or proposal density.
© Michael Rubinstein
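To make the rule of thumb concrete, here is a small comparison (my own sketch, with an arbitrary test integrand, not from the slides): the target f(x) = exp(-(x-3)^2) integrates to sqrt(pi) ≈ 1.7725 over the real line; a proposal centered on f gives bounded weights, while a distant proposal gives wildly varying ones.

```python
import numpy as np

def normal_pdf(x, loc, scale):
    return np.exp(-0.5 * ((x - loc) / scale) ** 2) / (scale * np.sqrt(2.0 * np.pi))

rng = np.random.default_rng(0)
f = lambda x: np.exp(-(x - 3.0) ** 2)       # integrates to sqrt(pi) over the real line
n = 10_000

def is_estimate(loc, scale):
    """Importance sampling estimate of the integral with proposal p = N(loc, scale^2)."""
    x = rng.normal(loc, scale, n)           # samples from the proposal
    w = f(x) / normal_pdf(x, loc, scale)    # importance weights f(x_i)/p(x_i)
    return w.mean(), w.std()

print(is_estimate(0.0, 1.0))   # proposal far from f: huge weight variance, noisy estimate
print(is_estimate(3.0, 1.0))   # proposal similar to f: bounded weights, accurate estimate
```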

Convergence of MC integration
Chebyshev's inequality: let X be a random variable with expected value μ and std σ. For any real number k > 0, P(|X − μ| ≥ kσ) ≤ 1/k². For example, for k = √2, it shows that at least half the values lie in the interval (μ − √2σ, μ + √2σ). Let Y_i = f(x_i)/p(x_i); then the MC estimator is F̂_N = (1/N) Σ_i Y_i.
(Portrait: Pafnuty Lvovich Chebyshev.)
© Michael Rubinstein

Convergence of MC integration
By Chebyshev's inequality, since the estimator has expected value F and standard deviation σ/√N (where σ is the std of Y), P(|F̂_N − F| ≥ k σ/√N) ≤ 1/k². Hence, for a fixed threshold, the error decreases at rate 1/√N.
© Michael Rubinstein

Convergence of MC integration
Meaning:
1. To cut the error in half, it is necessary to evaluate 4 times as many samples.
2. The convergence rate is independent of the integrand dimension! In contrast, the convergence rate of grid-based approximations decreases as the dimension increases.
© Michael Rubinstein

IS for Bayesian estimation
We characterize the posterior pdf using a set of samples (particles) x_{0:k}^i and their weights w_k^i. Then the joint posterior density at time k is approximated by p(x_{0:k} | z_{1:k}) ≈ Σ_i w_k^i δ(x_{0:k} − x_{0:k}^i).
© Michael Rubinstein

IS for Bayesian estimation
We draw the samples from the importance density q(x_{0:k} | z_{1:k}) with importance weights w_k^i ∝ p(x_{0:k}^i | z_{1:k}) / q(x_{0:k}^i | z_{1:k}). Sequential update (after some calculation…):
Particle update: x_k^i ~ q(x_k | x_{k-1}^i, z_k)
Weight update: w_k^i ∝ w_{k-1}^i · p(z_k | x_k^i) p(x_k^i | x_{k-1}^i) / q(x_k^i | x_{k-1}^i, z_k)
© Michael Rubinstein

Sequential Importance Sampling (SIS)
FOR i = 1:N
– Draw x_k^i ~ q(x_k | x_{k-1}^i, z_k)
– Update weights: w_k^i = w_{k-1}^i · p(z_k | x_k^i) p(x_k^i | x_{k-1}^i) / q(x_k^i | x_{k-1}^i, z_k)
END
Normalize the weights so that they sum to 1.
© Michael Rubinstein
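A minimal sketch of one SIS step in code (my own illustration, not the lecture's code); it assumes the common suboptimal choice q = p(x_k | x_{k-1}), under which the weight update reduces to multiplying by the likelihood p(z_k | x_k):

```python
import numpy as np

def sis_step(particles, weights, z_k, propagate, likelihood, rng):
    """One SIS iteration.
    particles: (N, d) array of x_{k-1}^i; weights: (N,) normalized weights w_{k-1}^i.
    propagate(particles, rng) draws x_k^i ~ p(x_k | x_{k-1}^i);
    likelihood(z_k, particles) evaluates p(z_k | x_k^i) for each particle."""
    particles = propagate(particles, rng)              # draw from the importance density
    weights = weights * likelihood(z_k, particles)     # w_k^i = w_{k-1}^i * p(z_k | x_k^i)
    weights = weights / weights.sum()                  # normalize
    return particles, weights
```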

State estimates
Any function g(x_k) can be calculated from the discrete pdf approximation: E[g(x_k)] ≈ Σ_i w_k^i g(x_k^i).
Examples:
– Mean (simple weighted average)
– MAP estimate: the particle with the largest weight
– Robust mean: mean within a window around the MAP estimate
(Figure: mean, MAP and robust mean on a multimodal posterior.)
© Michael Rubinstein
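A small sketch of these three estimates computed from a weighted particle set (my own illustrative helper; the window size for the robust mean is an arbitrary choice):

```python
import numpy as np

def point_estimates(particles, weights, window=0.5):
    """particles: (N, d) array; weights: (N,) normalized weights."""
    mean = np.average(particles, weights=weights, axis=0)    # weighted mean
    map_est = particles[np.argmax(weights)]                  # particle with largest weight
    near = np.linalg.norm(particles - map_est, axis=1) < window
    robust = np.average(particles[near], weights=weights[near], axis=0)
    return mean, map_est, robust
```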

Choice of importance density
(Figure: Hsiao et al.)
© Michael Rubinstein

Choice of importance density
Most common (suboptimal): the transitional prior, q(x_k | x_{k-1}^i, z_k) = p(x_k | x_{k-1}^i), for which the weight update simplifies to w_k^i ∝ w_{k-1}^i p(z_k | x_k^i) (compare with the grid filter weight update).
© Michael Rubinstein

The degeneracy phenomenon
Unavoidable problem with SIS: after a few iterations most particles have negligible weights.
– Large computational effort is spent updating particles with very small contribution to p(x_k | z_{1:k}).
Measure of degeneracy – the effective sample size: N_eff = 1 / Σ_i (w_k^i)².
– Uniform weights: N_eff = N; severe degeneracy: N_eff = 1.
© Michael Rubinstein
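In code, the effective sample size from the slide is a one-liner (illustrative sketch; the resampling threshold N_T it is compared against is a design choice):

```python
import numpy as np

def effective_sample_size(weights):
    """N_eff = 1 / sum_i (w_i)^2 for normalized weights."""
    w = weights / weights.sum()
    return 1.0 / np.sum(w ** 2)

print(effective_sample_size(np.full(100, 1.0 / 100)))     # uniform weights -> 100.0
print(effective_sample_size(np.r_[1.0, np.zeros(99)]))    # severe degeneracy -> 1.0
```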

Resampling
The idea: when degeneracy rises above some threshold (i.e. N_eff drops below a threshold N_T), eliminate particles with low importance weights and multiply particles with high importance weights. The new set {x_k^{i*}} is generated by sampling with replacement from the discrete representation of p(x_k | z_{1:k}) such that Pr(x_k^{i*} = x_k^j) = w_k^j; the new weights are reset to 1/N.
© Michael Rubinstein

Resampling
Generate N i.i.d. variables u_i ~ U[0, 1]. Sort them in ascending order. Compare them with the cumulative sum of the normalized weights; each u_i selects the particle whose cumulative-weight interval contains it.
(Illustration: Ristic et al.)
© Michael Rubinstein

Resampling
Complexity: O(N log N); O(N) resampling algorithms exist (a sketch of one follows).
(Figure: Hsiao et al.)
© Michael Rubinstein
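One common O(N) scheme is systematic resampling; here is a minimal sketch (my own illustration; the slide's figure instead sorts N i.i.d. uniforms, which is the O(N log N) variant):

```python
import numpy as np

def systematic_resample(particles, weights, rng):
    """Resample with replacement so that Pr(pick particle j) = w_j, in O(N)."""
    n = len(weights)
    positions = (rng.uniform() + np.arange(n)) / n      # one uniform, stratified offsets
    cumulative = np.cumsum(weights / weights.sum())
    cumulative[-1] = 1.0                                 # guard against rounding error
    indexes = np.searchsorted(cumulative, positions)     # compare with the cumulative sum
    return particles[indexes], np.full(n, 1.0 / n)       # new set has uniform weights

rng = np.random.default_rng(0)
particles = np.arange(5, dtype=float)
weights = np.array([0.05, 0.05, 0.6, 0.25, 0.05])
print(systematic_resample(particles, weights, rng))      # high-weight particles get duplicated
```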

Generic PF
Apply SIS filtering. Calculate N_eff. IF N_eff < N_T, resample. END.
© Michael Rubinstein

Generic PF
(Diagram after Van der Merwe et al.: a uniformly weighted measure approximates the prediction p(x_k | z_{1:k-1}); computing an importance weight for each particle approximates the posterior p(x_k | z_{1:k}); after resampling (if needed), projecting ahead approximates p(x_{k+1} | z_{1:k}).)
© Michael Rubinstein

PF variants
– Sampling Importance Resampling (SIR)
– Auxiliary Sampling Importance Resampling (ASIR)
– Regularized Particle Filter (RPF)
– Local-linearization particle filters
– Multiple-model particle filters (maneuvering targets)
– …
© Michael Rubinstein

Sampling Importance Resampling (SIR)
A.k.a. bootstrap filter, Condensation.
Initialize {x_0^i} from the prior distribution.
For k > 0 do:
– Resample {x_{k-1}^i, w_{k-1}^i} into {x_{k-1}^{i*}, 1/N}
– Predict: x_k^i ~ p(x_k | x_{k-1}^{i*})
– Reweight: w_k^i ∝ p(z_k | x_k^i)
– Normalize the weights
– Estimate x̂_k (for display)
© Michael Rubinstein
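Putting the steps above together, a self-contained toy bootstrap filter on a 1-D random-walk model (my own illustration; the model and noise levels are arbitrary placeholders, not from the talk):

```python
import numpy as np

# Toy model: x_k = x_{k-1} + v_k, v_k ~ N(0, 0.1^2);  z_k = x_k + n_k, n_k ~ N(0, 0.5^2)
rng = np.random.default_rng(1)
N, T = 500, 50
q_std, r_std = 0.1, 0.5

x_true = np.cumsum(rng.normal(0.0, q_std, T))       # simulated ground-truth trajectory
z = x_true + rng.normal(0.0, r_std, T)              # noisy observations

particles = rng.normal(0.0, 1.0, N)                 # initialize from a broad prior
weights = np.full(N, 1.0 / N)
estimates = []
for k in range(T):
    # resample (done every step in the basic SIR / bootstrap filter)
    particles = particles[rng.choice(N, size=N, p=weights)]
    # predict: propagate through the system model
    particles = particles + rng.normal(0.0, q_std, N)
    # reweight by the observation likelihood p(z_k | x_k)
    weights = np.exp(-0.5 * ((z[k] - particles) / r_std) ** 2)
    weights /= weights.sum()
    # estimate (for display): posterior mean
    estimates.append(np.average(particles, weights=weights))

print(np.mean(np.abs(np.array(estimates) - x_true)))   # mean absolute tracking error
```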

Intermission Questions? © Michael Rubinstein

Multiple Targets (MTT/MOT)
Previous challenges:
– Full/partial occlusions
– Entering/leaving the scene
– …
And in addition:
– Estimating the number of objects
– Staying computationally tractable for multiple simultaneous targets
– Interaction between objects
– Many prior works simply run multiple single-target filters
© Michael Rubinstein

BraMBLe: A Bayesian Multiple-Blob Tracker
M. Isard and J. MacCormick, Compaq Systems Research Center, ICCV 2001
Some slides taken from Qi Zhao; some images taken from Isard and MacCormick

BraMBLe
First rigorous particle filter implementation with a variable number of targets. The posterior distribution is constructed over the possible object configurations and their number. Sensor: a single static camera. Tracking: SIR particle filter. Performance: real-time for 1-2 simultaneous objects.
© Michael Rubinstein

The BraMBLe posterior
The tracker estimates p(X_k | Z_{1:k}): the state at frame k (number of objects and their positions, shapes, velocities, …) given the image sequence up to frame k.
© Michael Rubinstein

State space
Hypothesis configuration: the number of objects together with the configuration of each object. Object configuration: identifier, position, velocity, shape.
© Michael Rubinstein

Object model
A person is modeled as a generalized cylinder with a vertical axis in world coordinates (calibrated camera).
© Michael Rubinstein

Observation likelihood
The image is overlaid with a rectangular grid (e.g. every 5 pixels); at each grid point, filter responses are computed on the Y, Cr and Cb channels using a Mexican hat (second derivative of a Gaussian) and a Gaussian filter.
© Michael Rubinstein

Observation likelihood The response values are assumed conditionally independent given X © Michael Rubinstein
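Conditional independence is what makes the likelihood tractable: the log-likelihood of the whole image is a sum of per-grid-point terms, each picked according to whether the hypothesized configuration X labels that grid point foreground or background. A rough sketch of the resulting computation (my own illustration; log_p_fg and log_p_bg stand in for whatever per-point foreground/background models are used):

```python
import numpy as np

def image_log_likelihood(responses, fg_mask, log_p_fg, log_p_bg):
    """responses: (G, d) filter responses at G grid points;
    fg_mask: (G,) boolean, True where the hypothesized configuration X predicts foreground;
    log_p_fg / log_p_bg: per-point log densities of the foreground / background models."""
    per_point = np.where(fg_mask, log_p_fg(responses), log_p_bg(responses))
    return per_point.sum()     # independence across grid points -> the logs simply add
```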

Appearance models
GMMs for the background and foreground are trained using k-means.
© Michael Rubinstein

Observation likelihood © Michael Rubinstein

System (prediction) model
The number of objects can change:
– Each object has a constant probability of remaining in the scene (prediction function).
– At each time step, there is a constant probability that a new object will enter the scene (initialization function).
© Michael Rubinstein

Prediction function
Motion evolution: damped constant velocity. Shape evolution: 1st-order auto-regressive process (ARP) model.
© Michael Rubinstein
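An illustrative sketch of the two evolution rules named above (the parameter names and values are my own placeholders, not BraMBLe's actual constants):

```python
import numpy as np

def predict_object(pos, vel, shape, shape_mean, rng,
                   damping=0.9, alpha=0.9, vel_std=0.05, shape_std=0.02):
    """Propagate one object's configuration by one time step (pos, vel, shape: arrays)."""
    vel = damping * vel + rng.normal(0.0, vel_std, vel.shape)        # damped velocity + noise
    pos = pos + vel                                                   # constant-velocity motion
    shape = shape_mean + alpha * (shape - shape_mean) \
            + rng.normal(0.0, shape_std, shape.shape)                 # 1st-order ARP around the mean shape
    return pos, vel, shape
```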

Particles
Each particle is a hypothesized multi-object configuration: the filter maintains N such points X_k^i and N corresponding weights π_k^i.
© Michael Rubinstein

Estimate
Denote by I the set of existing unique identifiers over all particles. The total probability that the object with identifier i exists is the sum of the weights of the particles whose configuration contains that identifier.
© Michael Rubinstein
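A sketch of that existence estimate (my own illustration; each particle's configuration is represented here simply as the set of identifiers it contains):

```python
import numpy as np

def existence_probability(particle_ids, weights, identifier):
    """particle_ids: length-N list of sets of object identifiers, one set per particle;
    weights: (N,) normalized particle weights."""
    present = np.array([identifier in ids for ids in particle_ids])
    return float(np.sum(np.asarray(weights)[present]))

# e.g. three particles, two of which contain object 7:
print(existence_probability([{7, 2}, {2}, {7}], [0.5, 0.3, 0.2], 7))   # -> 0.7
```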

Results
N = 1000 particles; initialization samples are always generated.
© Michael Rubinstein

Results
A single foreground model cannot distinguish between overlapping objects, which causes identity switches.
© Michael Rubinstein

Parameters © Michael Rubinstein

Summary
The particle filters were shown to produce good approximations under relatively weak assumptions:
– can deal with nonlinearities
– can deal with non-Gaussian noise
– maintain multiple hypotheses
– can be implemented in O(N)
– easy to implement
– adaptively focus on more probable regions of the state space
© Michael Rubinstein

In practice
1. State (object) model
2. System (evolution) model
3. Measurement (likelihood) model
4. Initial (prior) state
5. State estimate (given the pdf)
6. PF specifics
   1. Importance density
   2. Resampling method
Configurations for specific problems can be found in the literature.
© Michael Rubinstein

Thank you! © Michael Rubinstein

References
– Beyond the Kalman Filter / Ristic, Arulampalam, Gordon
  – Online tutorial: A Tutorial on Particle Filters for Online Nonlinear/Non-Gaussian Bayesian Tracking / Arulampalam et al., 2002
– Stochastic Models, Estimation and Control / Peter S. Maybeck
– An Introduction to the Kalman Filter / Greg Welch, Gary Bishop
– Particle Filters: An Overview / Matthias Muhlich
© Michael Rubinstein

Sequential derivation 1
Suppose at time k-1 we have samples {x_{0:k-1}^i, w_{k-1}^i} characterizing p(x_{0:k-1} | z_{1:k-1}). We receive a new measurement z_k and need to approximate p(x_{0:k} | z_{1:k}) using a new set of samples. We choose q such that it factorizes as q(x_{0:k} | z_{1:k}) = q(x_k | x_{0:k-1}, z_{1:k}) q(x_{0:k-1} | z_{1:k-1}), and we can then generate new particles x_k^i ~ q(x_k | x_{0:k-1}^i, z_{1:k}).
© Michael Rubinstein

Sequential derivation 2
For the weight update equation, it can be shown that p(x_{0:k} | z_{1:k}) ∝ p(z_k | x_k) p(x_k | x_{k-1}) p(x_{0:k-1} | z_{1:k-1}), and so w_k^i ∝ w_{k-1}^i · p(z_k | x_k^i) p(x_k^i | x_{k-1}^i) / q(x_k^i | x_{0:k-1}^i, z_{1:k}).
© Michael Rubinstein

Sequential derivation 3
Further, if q(x_k | x_{0:k-1}, z_{1:k}) = q(x_k | x_{k-1}, z_k), then the weight update rule becomes w_k^i ∝ w_{k-1}^i · p(z_k | x_k^i) p(x_k^i | x_{k-1}^i) / q(x_k^i | x_{k-1}^i, z_k) (and we need not store the entire particle paths and the full history of observations). Finally, the (filtered) posterior density is approximated by p(x_k | z_{1:k}) ≈ Σ_i w_k^i δ(x_k − x_k^i).   (3)
© Michael Rubinstein

Choice of importance density
Choose q to minimize the variance of the weights. Optimal choice: q(x_k | x_{k-1}^i, z_k) = p(x_k | x_{k-1}^i, z_k). Usually we can neither sample from it nor solve for the resulting weight update (it works only in some specific cases). Most commonly used (suboptimal) alternative: q(x_k | x_{k-1}^i, z_k) = p(x_k | x_{k-1}^i), i.e. the transitional prior.
© Michael Rubinstein

Generic PF
Resampling reduces degeneracy, but new problems arise…
1. It limits parallelization.
2. Sample impoverishment: particles with high weights are selected many times, which leads to loss of diversity; if the process noise is small, all particles tend to collapse to a single point within a few iterations.
Methods exist to counter this as well…
© Michael Rubinstein