Particle Filter in Tracking


Presenter: Muhammad IR Rao

Presentation Layout:
- Motivation for tracking
- Background: problem overview, applications, challenges, different approaches for tracking, Kalman vs. particle filters
- Model setup (HMM), the Bayesian approach, the recursive Bayes filter, examples

Presentation Layout: Particle Filtering
- Introduction to particle filtering
- MC integration
- Sequential Importance Sampling (SIS)
- Resampling
- Basic particle filter algorithm (SIR)
- Example
- PF variants

Problem Overview
Input: (noisy) sensor measurements.
Goal: estimate the most probable state at time k using the measurements up to time k'. If k' < k this is prediction; if k' > k it is smoothing (k' = k is filtering).
Many problems require estimating the state of a system that changes over time from noisy measurements of that system.

Applications Tracking - the process of locating a moving object (or multiple objects) over time using a camera. Variety of uses: human-computer interaction, security and surveillance, video communication, traffic control…

Applications: ballistics, robotics (robot localization), tracking hands/cars/…, econometrics (stock prediction), navigation, and many more. (Slide credit: Michael Rubinstein)

Challenges in Tracking
- Measurements: noise, errors
- Detection-specific: full/partial occlusions, false positives/false negatives, targets entering/leaving the scene
- Efficiency: multiple models and switching dynamics, multiple targets

Different approaches for object tracking: correlation-based, feature-based, gradient-based, color histograms, Kalman filter, particle filter.

Particle vs. Kalman
Kalman filter:
- Assumes a uni-modal (Gaussian) distribution.
- Predicts a single new state for each tracked object.
- Updates the state based on the error between the predicted state and the observed data.
Particle filter:
- Can represent multi-modal distributions.
- Predicts multiple possible states for each tracked object, each with a different probability.
- Estimates the probabilities of the predicted states from the observed data.

Presentation Layout (recap): model setup (HMM), the Bayesian approach, the recursive Bayes filter, examples.

Hidden Markov Model (HMM)
The state is not directly visible, but an output that depends on the state is visible.
[Figure: graphical model over time steps k-1, k, k+1, with hidden states x_{k-1}, x_k, x_{k+1}, each emitting an observed measurement z_{k-1}, z_k, z_{k+1}.]

Conceptual Example: Weather
States: state 1 is rain or snow, state 2 is cloudy, state 3 is sunny. Suppose the state transition probabilities are given by a 3x3 matrix A with entries a_ij = P(x_k = j | x_{k-1} = i). Given that day 1 is sunny (x_0 = 3), what is the probability that this week will be sun-sun-rain-rain-sun-cloudy-sun?

Conceptual Example: Weather (solution)
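The transition matrix and the worked answer appeared only as figures on the original slides. As a sketch of the solution, assuming the classic transition matrix from Rabiner's tutorial (cited in the references):

    A = [0.4 0.3 0.3
         0.2 0.6 0.2
         0.1 0.1 0.8],   a_ij = P(x_k = j | x_{k-1} = i)

    P(sun, sun, rain, rain, sun, cloudy, sun | x_0 = 3)
      = a_33 * a_33 * a_31 * a_11 * a_13 * a_32 * a_23
      = 0.8 * 0.8 * 0.1 * 0.4 * 0.3 * 0.1 * 0.2
      ≈ 1.54e-4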

Dynamic System
State equation: x_k = f_k(x_{k-1}, v_{k-1}), where x_k is the state vector at time instant k, f_k is the state transition function, and v_k is i.i.d. process noise. The state equation is a.k.a. the evolution equation, system model, or system equation. Note the k subscript, which allows the system model to be time-variant.
Observation equation: z_k = h_k(x_k, w_k), where z_k is the observation at time instant k, h_k is the observation function, and w_k is i.i.d. measurement noise.
[Figure: the chain x_{k-1} -> x_k -> x_{k+1} (stochastic diffusion), with observations z_{k-1}, z_k, z_{k+1}.]

A simple dynamic system (4-dimensional state space), with state x = (p_x, p_y, v_x, v_y)^T.
Constant-velocity motion:
x_k = A x_{k-1} + v_{k-1},   A = [1 0 dt 0; 0 1 0 dt; 0 0 1 0; 0 0 0 1]
Only position is observed:
z_k = H x_k + w_k,   H = [1 0 0 0; 0 1 0 0]
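To make the model concrete, here is a minimal MATLAB sketch of the constant-velocity system; the time step dt and the noise levels are illustrative assumptions, not values from the slides:

    % Simulate the constant-velocity model with position-only measurements.
    dt = 1;
    A  = [1 0 dt 0; 0 1 0 dt; 0 0 1 0; 0 0 0 1];   % state transition
    H  = [1 0 0 0; 0 1 0 0];                       % observe position only
    xk = [0; 0; 1; 0.5];                           % [px; py; vx; vy]
    K  = 50;  Z = zeros(2, K);
    for k = 1:K
        xk = A * xk + 0.1 * randn(4, 1);           % propagate with process noise
        Z(:, k) = H * xk + 0.5 * randn(2, 1);      % noisy position measurement
    end
    plot(Z(1, :), Z(2, :), 'x');                   % the observed track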

The Bayesian Approach
In classical approaches to estimation, the parameter X you are trying to estimate is assumed to be deterministic but unknown. In the Bayesian context, X is instead modeled as a random variable with a prior distribution. This is called a Bayesian approach because it is based directly on Bayes' theorem. [Portrait: Thomas Bayes]

Maximum Likelihood Example

Recursive Bayes filters
Given:
- System models in probabilistic form: p(x_k | x_{k-1}) and p(z_k | x_k) (known statistics of v_k, w_k)
- Initial state p(x_0), also known as the prior
- Measurements z_1, ..., z_k
We are interested in the belief, or posterior density, p(x_k | z_{1:k}).
Assumptions (Markovian process): the system state dynamics satisfy p(x_k | x_{0:k-1}, z_{1:k-1}) = p(x_k | x_{k-1}), and the measurements are conditionally independent given the state, so the observation dynamics satisfy p(z_k | x_{0:k}, z_{1:k-1}) = p(z_k | x_k).

Recursive Bayes filters
Prediction step (a priori): uses the system model to predict forward; deforms/translates/spreads the state pdf due to the random noise.
Update step (a posteriori): updates the prediction in light of new data; tightens the state pdf.

Recursive Bayes filter
Prediction:
p(x_k | z_{1:k-1}) = ∫ p(x_k | x_{k-1}) p(x_{k-1} | z_{1:k-1}) dx_{k-1}    (1)
(system model times the previous posterior)
Update:
p(x_k | z_{1:k}) = p(z_k | x_k) p(x_k | z_{1:k-1}) / p(z_k | z_{1:k-1})    (2)
(measurement model times the current prior, divided by a normalization constant)

Generating Estimates
Knowledge of p(x_k | z_{1:k}) enables us to compute the optimal estimate with respect to any criterion, e.g.:
Minimum mean-square error (MMSE): x̂_k = E[x_k | z_{1:k}] = ∫ x_k p(x_k | z_{1:k}) dx_k
Maximum a posteriori (MAP): x̂_k = arg max p(x_k | z_{1:k})

Example 1
[Figures: Step 0: initialization; Step 1: updating; Step 2: predicting; Step 3: updating; Step 4: predicting.]


Presentation Layout: Particle Filtering
- Introduction to particle filtering
- MC integration
- Sequential Importance Sampling (SIS)
- Resampling
- PF variants
- Example

Particle Filters
Sequential Monte Carlo methods for on-line learning within a Bayesian framework. Known under many names: particle filters, sequential sampling-importance resampling (SIR), bootstrap filters, condensation trackers, interacting particle approximations, survival of the fittest.

Particle filtering ideas
The particle filter is a technique for implementing a recursive Bayesian filter by Monte Carlo sampling. The idea: represent the posterior density by a set of random particles with associated weights, and compute estimates based on these samples and weights. Particle filters are based on the recursive generation of random measures (particles and importance weights) that approximate the distributions of the unknowns. As new observations become available, the particles and the weights are propagated by exploiting Bayes' theorem.
[Figure: a posterior density over the sample space approximated by weighted particles.]

Problem Domain: Monte Carlo Integration
Recall the prediction (1) and update (2) equations. The integral in the prediction step is impossible to evaluate analytically in most cases. If the model is linear and the noise is Gaussian, the recursions reduce to the Kalman filter. Monte Carlo integration is a possible way out of this problem.

Monte Carlo simulation
Recall that an expectation E[g(x)] = ∫ g(x) p(x) dx can be approximated by the sample average (1/N) Σ g(x^i) when the samples x^i are drawn from p(x). Here the random samples are drawn from the posterior distribution, so the posterior is represented by a set of samples, or particles.
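As a toy illustration of Monte Carlo integration (my example, not from the slides), estimate E[x^2] under a standard normal, whose true value is 1:

    % Monte Carlo integration: approximate E[g(x)] with p = N(0,1), g(x) = x^2.
    N = 1e5;
    x = randn(N, 1);                 % samples x^i ~ p(x)
    est = mean(x.^2);                % (1/N) * sum_i g(x^i), should be close to 1
    fprintf('MC estimate of E[x^2]: %.4f\n', est);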

Random samples and the pdf (I)
Take p(x) = Gamma(4,1). Generate some random samples, then plot a histogram as a basic approximation to the pdf. [Figure: histogram of 200 samples against the Gamma(4,1) density.]
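A minimal sketch of this demo, assuming the Statistics and Machine Learning Toolbox for gamrnd/gampdf:

    % Draw 200 samples from Gamma(4,1) and compare the histogram to the pdf.
    N = 200;
    x = gamrnd(4, 1, N, 1);                      % shape 4, scale 1
    histogram(x, 25, 'Normalization', 'pdf');    % empirical approximation
    hold on;
    t = linspace(0, max(x), 400);
    plot(t, gampdf(t, 4, 1), 'LineWidth', 2);    % true density
    legend('samples (histogram)', 'Gamma(4,1) pdf');
    hold off;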

Random samples and the pdf (II)

Importance Sampling
Unfortunately, it is often not possible to sample directly from the posterior distribution, but we can use importance sampling. Let p(x) be a pdf from which it is difficult to draw samples, and let x^i ~ q(x), i = 1, ..., N, be samples that are easily generated from a proposal pdf q, called an importance density. Then an approximation to the density p is given by
p(x) ≈ Σ_{i=1}^N w^i δ(x - x^i), where w^i ∝ p(x^i) / q(x^i) are the normalized importance weights.
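A small sketch of the idea with a made-up target (an assumption for illustration): p(x) ∝ exp(-x^4) is awkward to sample directly, but a standard normal proposal works:

    % Self-normalized importance sampling: estimate E_p[x^2] for
    % p(x) proportional to exp(-x^4), using the proposal q = N(0,1).
    N = 1e5;
    x = randn(N, 1);                     % x^i ~ q
    logw = -x.^4 - (-0.5 * x.^2);        % log of unnormalized p/q
    w = exp(logw);  w = w / sum(w);      % normalized importance weights
    est = sum(w .* x.^2);                % weighted estimate of E_p[x^2]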

Choice of importance density [figure, from Hsiao et al.]

IS for Bayesian Estimation
We characterize the posterior pdf using a set of samples (particles) x_{0:k}^i and their weights w_k^i. The joint posterior density at time k is then approximated by
p(x_{0:k} | z_{1:k}) ≈ Σ_i w_k^i δ(x_{0:k} - x_{0:k}^i).
We draw the samples from the importance density q(x_{0:k} | z_{1:k}), with importance weights w_k^i ∝ p(x_{0:k}^i | z_{1:k}) / q(x_{0:k}^i | z_{1:k}).
Sequential update (after some calculation):
Particle update: draw x_k^i ~ q(x_k | x_{k-1}^i, z_k)
Weight update: w_k^i ∝ w_{k-1}^i p(z_k | x_k^i) p(x_k^i | x_{k-1}^i) / q(x_k^i | x_{k-1}^i, z_k)

Choice of importance density
Most common (suboptimal) choice: the transitional prior, q(x_k | x_{k-1}^i, z_k) = p(x_k | x_{k-1}^i), with which the weight update reduces to w_k^i ∝ w_{k-1}^i p(z_k | x_k^i).

Sequential Importance Sampling (SIS)
FOR i = 1:N
    Draw x_k^i ~ q(x_k | x_{k-1}^i, z_k)    (uses the system model)
    Update the weight w_k^i = w_{k-1}^i p(z_k | x_k^i) p(x_k^i | x_{k-1}^i) / q(x_k^i | x_{k-1}^i, z_k)    (uses the measurement model)
END
Normalize the weights so that Σ_i w_k^i = 1.
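A minimal MATLAB sketch of one SIS step, using the transitional prior as the importance density (so the weight update reduces to multiplying by the likelihood); the scalar random-walk model and Gaussian likelihood are illustrative assumptions:

    % One SIS step for a scalar random-walk model, q = transitional prior.
    N = 500;
    xp = randn(N, 1);  w = ones(N, 1) / N;     % previous particles and weights
    z = 1.2;                                   % the new measurement (assumed)
    sigv = 0.5;  sigw = 0.3;                   % process / measurement noise stds
    x = xp + sigv * randn(N, 1);               % draw x_k^i from p(x_k | x_{k-1}^i)
    w = w .* exp(-0.5 * ((z - x) / sigw).^2);  % multiply by likelihood p(z_k | x_k^i)
    w = w / sum(w);                            % normalize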

State estimates
Any function of the state can be calculated from the discrete pdf approximation, E[f(x_k)] ≈ Σ_i w_k^i f(x_k^i). Examples:
- Mean: the weighted average of the particles (a simple average after resampling)
- MAP estimate: the particle with the largest weight
- Robust mean: the mean within a window around the MAP estimate
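Continuing the sketch above, the three estimates can be read off the particles x and normalized weights w; the window half-width is an arbitrary choice:

    % State estimates from particles x and normalized weights w.
    xmean = sum(w .* x);                            % weighted mean
    [~, imax] = max(w);
    xmap = x(imax);                                 % MAP: largest-weight particle
    in = abs(x - xmap) < 0.5;                       % window around the MAP estimate
    xrobust = sum(w(in) .* x(in)) / sum(w(in));     % robust mean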

Problem Domain: Degeneracy
An unavoidable problem with SIS: after a few iterations, most particles have negligible weights, so a large computational effort is spent updating particles whose contribution to p(x_k | z_{1:k}) is very small. A measure of degeneracy is the effective sample size
N_eff = 1 / Σ_i (w_k^i)^2.
Uniform weights give N_eff = N; severe degeneracy gives N_eff close to 1.
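With normalized weights w, the effective sample size is a one-liner:

    Neff = 1 / sum(w.^2);   % N for uniform weights, near 1 under severe degeneracy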

Resampling
The idea: when degeneracy rises above some threshold (i.e., N_eff falls below a threshold N_T), eliminate particles with low importance weights and multiply particles with high importance weights. The new set {x_k^{i*}} is generated by sampling with replacement from the discrete representation of p(x_k | z_{1:k}) such that P(x_k^{i*} = x_k^j) = w_k^j; the new weights are reset to 1/N.

Resampling
Generate N i.i.d. uniform variables, sort them in ascending order, and compare them with the cumulative sum of the normalized weights. (Illustration from Ristic et al.)
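A compact MATLAB sketch of the compare-against-the-cumulative-sum idea, here in its systematic variant (one ordered grid of uniforms instead of N sorted i.i.d. draws):

    function idx = resample_systematic(w)
    % Return indices of the resampled particles; w is a normalized column vector.
    N = numel(w);
    u = ((0:N-1)' + rand) / N;       % ordered uniforms on [0, 1)
    c = cumsum(w);                   % cumulative sum of the weights
    idx = zeros(N, 1);  j = 1;
    for i = 1:N
        while u(i) > c(j)            % advance to the bin containing u(i)
            j = j + 1;
        end
        idx(i) = j;                  % particle j is duplicated
    end
    end

After resampling, set x = x(idx) and reset all weights to 1/N.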

Resampling [illustration, from Hsiao et al.]

Generic PF
Apply SIS filtering.
Calculate N_eff.
IF N_eff < N_T
    Resample
END

Generic PF (illustration from Van der Merwe et al.)
- Start with a uniformly weighted measure that approximates the prediction p(x_k | z_{1:k-1}).
- Compute for each particle its importance weight to approximate the posterior p(x_k | z_{1:k}); resample if needed to return to a uniformly weighted measure.
- Project ahead (using the system model) to approximate p(x_{k+1} | z_{1:k}).

PF variants
- Sampling Importance Resampling (SIR)
- Auxiliary Sampling Importance Resampling (ASIR)
- Regularized Particle Filter (RPF)
- Local-linearization particle filters
- Multiple-model particle filters (maneuvering targets)
- Mean-shift embedded particle filter (MSEPF)

Sampling Importance Resampling (SIR)
A.k.a. the bootstrap filter or Condensation.
Initialize the particles from the prior distribution p(x_0).
For k > 0 do:
    Resample {x_{k-1}^i, w_{k-1}^i} into the equally weighted set {x_{k-1}^{i*}, 1/N}
    Predict x_k^i ~ p(x_k | x_{k-1}^{i*})
    Reweight w_k^i = p(z_k | x_k^i)
    Normalize the weights
    Estimate x̂_k (for display)
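Putting the pieces together, here is a self-contained bootstrap (SIR) filter for a scalar random-walk model; the model, the noise levels, and resampling at every step are illustrative assumptions, not the presenter's code:

    % Bootstrap (SIR) particle filter for x_k = x_{k-1} + v_k, z_k = x_k + w_k.
    N = 1000;  K = 50;
    sigv = 0.5;  sigw = 0.3;
    xtrue = 0;  x = randn(N, 1);               % particles from the prior
    xhat = zeros(K, 1);
    for k = 1:K
        xtrue = xtrue + sigv * randn;          % simulate the true state
        z = xtrue + sigw * randn;              % and a noisy measurement
        x = x + sigv * randn(N, 1);            % predict: x_k^i from p(x_k | x_{k-1}^i)
        w = exp(-0.5 * ((z - x) / sigw).^2);   % reweight by the likelihood
        w = w / sum(w);                        % normalize
        xhat(k) = sum(w .* x);                 % estimate (weighted mean)
        c = cumsum(w);                         % resample (multinomial)
        x = x(arrayfun(@(u) find(c >= u, 1), rand(N, 1)));
    end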

Example 2: Particle Filter
Step 0: initialization. Each particle has the same weight.
Step 1: updating weights. Weights are proportional to p(z|x).

Step 2: predicting. Predict the new locations of the particles.
Step 3: updating weights. Weights are proportional to p(z|x).
Step 4: predicting. Predict the new locations of the particles. The particles become more concentrated in the region where the person is more likely to be.

Comparison: PF vs. Bayes Estimation
[Figures: the updating and predicting steps of Example 1 (exact Bayes filter) shown side by side with Example 2 (particle filter).]

Numerical Example
Tracking a robot. Objective: track the position of a robot across two states using four particles. States: A and B (x_t^i = A means that particle i is at position A at time t).

Numerical Example
Step 1 (model setup):
The robot starts out equally likely to be in either state: P(A_0) = 0.5, P(B_0) = 0.5.
Motion model: from either state, the probability of moving to (or staying in) B is 2/3, and of moving to (or staying in) A is 1/3: P(A_t | B_{t-1}) = 1/3, P(B_t | A_{t-1}) = 2/3.
Observation model: if the robot is in state s ∈ {A, B}, the probability that our observation is accurate is 80%: P(O_t = s | X_t = s) = 4/5.
We create four particles, each equally likely to be in A or B.
[Diagram: two-state chain over A and B, with transition probabilities 1/3 into A and 2/3 into B.]

Numerical Example
Step 2 (sample): propagate each particle through the motion model:
P(A_1) = P(A_1 | A_0) P(A_0) + P(A_1 | B_0) P(B_0) = (1/3)(1/2) + (1/3)(1/2) = 1/3, and P(B_1) = 2/3.
Say we end up with two particles in A and two particles in B. Suppose we then observe A (O_1 = A). Weights: the two particles in A each have unnormalized weight 4/5 (= P(O_1 = A | X_1 = A)); the two particles in B each have weight 1/5 (= P(O_1 = A | X_1 = B)).

Numerical Example
Step 3 (resample): our particles have unnormalized weights A: 4/5, 4/5 and B: 1/5, 1/5. These add up to 2, so we normalize by dividing through by 2, giving A: 2/5, 2/5 and B: 1/10, 1/10. Now we resample our four particles with replacement. On every draw there is an 80% chance of picking A (2/5 + 2/5) and a 20% chance of picking B (1/10 + 1/10). Say we end up with 3 particles in A and 1 particle in B. We repeat steps 2 and 3 until we have the robot localized.
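The same two-state example can be run as a tiny MATLAB sketch; the observation sequence here is a hypothetical one chosen for illustration:

    % Four-particle filter for the two-state robot (1 = A, 2 = B).
    N = 4;
    obs = [1 1 2 1];                           % assumed observations: A, A, B, A
    x = randi(2, N, 1);                        % each particle equally likely A or B
    for t = 1:numel(obs)
        x = 1 + (rand(N, 1) < 2/3);            % motion: A w.p. 1/3, B w.p. 2/3
        w = 0.2 + 0.6 * (x == obs(t));         % 4/5 if it matches the observation, else 1/5
        w = w / sum(w);                        % normalize
        c = cumsum(w);                         % resample with replacement
        x = x(arrayfun(@(u) find(c >= u, 1), rand(N, 1)));
    end
    fprintf('Estimated P(robot in A) = %.2f\n', mean(x == 1));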

Matlab Example
Color-based probabilistic tracking: these trackers rely on a deterministic search for a window whose color content matches a reference color-histogram model, using the principle of color-histogram distance.
Reference color window: the target object to be tracked forms the reference color window. Its histogram is calculated and used to compute the histogram distance while performing a deterministic search for a matching window.

Matlab Example
System dynamics: a second-order auto-regressive model is chosen for the parameters used to represent our state space, i.e. (x, y). The dynamics are given as x_{t+1} = A x_t + B x_{t-1}. The matrices A and B could be learned from a set of sequences where correct tracks have been obtained.
Observation: the observation y_t is proportional to the histogram distance between the color window at the predicted location in the frame and the reference color window: y_t ∝ Dist(q, q_x), where q is the reference color histogram and q_x is the color histogram at the predicted location.

Matlab Example
Particle filter iteration steps:
1. Initialize x_t for the first frame.
2. Generate a particle set of N particles {x_t^m}, m = 1, ..., N.
3. Predict each particle using the second-order auto-regressive dynamics.
4. Compute the histogram distance for each particle.
5. Weigh each particle based on its histogram distance.
6. Select the target location as the particle with the minimum histogram distance.
7. Resample the particles for the next iteration.

Matlab Example
A step-by-step look, highlighting the concepts applied.
Initialization of the state space for the first frame and calculation of the reference histogram:

    reference = imread('reference.jpg');
    [ref_count, ref_bin] = imhist(rgb2gray(reference));   % imhist needs a grayscale image
    x1 = 45;  y1 = 45;                                    % initial target position

Describing the N particles within a specified window:

    for i = 1:N
        x(1,i,1) = x1 + 50*rand(1) - 50*rand(1);          % scatter +/-50 px around (x1, y1)
        x(2,i,1) = y1 + 50*rand(1) - 50*rand(1);
    end

For each particle, apply the second-order dynamics equation to predict new states:

    if (j == 2)
        x(:,i,j) = A * x(:,i,j-1);                        % only one previous frame available
    else
        x(:,i,j) = A * x(:,i,j-1) + B * x(:,i,j-2);       % x_{t+1} = A*x_t + B*x_{t-1}
    end

Matlab Example
The color window is defined and its histogram is calculated:

    rect = [(x(1,i,j)-15), (x(2,i,j)-15), 30, 30];        % 30x30 window around the particle
    [count, binnumber] = imhist(rgb2gray(imcrop(I(:,:,:,j), rect)));

Calculate the histogram distance:

    for k = 1:numel(count)
        d(i,j) = d(i,j) + (double(count(k)) - double(ref_count(k)))^2;   % squared bin-wise difference
    end

Calculate the normalized weight for each particle:

    w(:,j) = d(:,j) ./ sum(d(:,j));       % normalize the distances
    w(:,j) = ones(N,1) - w(:,j);          % invert: small distance -> large weight
    w(:,j) = w(:,j) ./ sum(w(:,j));       % renormalize so the weights sum to one

Re-sampling step, where the new particle set is chosen:

    for i = 1:N
        x(1,i,j) = state(1,j) + 50*rand(1) - 50*rand(1);  % scatter around the current estimate
        x(2,i,j) = state(2,j) + 50*rand(1) - 50*rand(1);
    end

References
L. R. Rabiner. A tutorial on hidden Markov models and selected applications in speech recognition. Proceedings of the IEEE, 77(2):257-286, 1989.
Andrew M. Fraser. Hidden Markov Models and Dynamical Systems. Society for Industrial and Applied Mathematics, 2008.
Steven M. Kay. Fundamentals of Statistical Signal Processing: Detection Theory. Prentice Hall, 1998.
Branko Ristic, Sanjeev Arulampalam, and Neil Gordon. Beyond the Kalman Filter: Particle Filters for Tracking Applications. Artech House, 2004.
M. P. Wand and M. C. Jones. Kernel Smoothing. Monographs on Statistics and Applied Probability No. 60. Chapman & Hall, 1995.