From Bayesian to Particle Filter


From Bayesian to Particle Filter
San Francisco State University, Mathematics Department
Fang-I Chu

Outline
- Introduction
- Mathematical Background
- Literature Review
- Application: Object Tracking in Video
- Conclusion

Introduction
Why is tracking a moving object important?
- It lets us simulate the moving path of the target.
- Important applications: robotics, medicine (eye movement, neurology), security systems, Wii.
What do we want to do?
- Simple movement: easy.
- Erratic movement: work remains to be done.
How do we approach it?
- Filtering algorithms from computer vision.

Mathematical Background
- Conditional Probability
- Bayes' Theorem
- Prior and Posterior Probability
- Markov Chain

Conditional Probability
Any probability that is revised to take into account the known occurrence of other events. Written as P(A|B) = P(A ∩ B) / P(B): given that event B happens, the probability that event A will happen.

Bayes' Theorem
Law of Total Probability: let A be an event in the sample space S. If events B1, ..., Bn form a partition of S, then the events A ∩ B1, ..., A ∩ Bn form a partition of A, and since they are disjoint we have P(A) = Σi P(A ∩ Bi). Substituting the formula for conditional probability, we obtain P(A) = Σi P(A|Bi) P(Bi).

Bayes' Theorem
- A relatively minor extension of the definition of conditional probability
- Computes P(Bj|A) from P(A|Bj)
Let events B1, ..., Bn form a partition of the space S, with P(Bi) > 0 for each i. Then for j = 1, ..., n,
P(Bj|A) = P(A|Bj) P(Bj) / Σi P(A|Bi) P(Bi).
This form is also known as Bayes' Formula.
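Bayes' Formula is easy to check numerically. The sketch below uses illustrative prior and likelihood values (not taken from the slides) for a three-event partition:

```python
# Bayes' Theorem on a small discrete example: a partition {B1, B2, B3}
# with prior probabilities P(Bj) and likelihoods P(A | Bj).
def bayes_posterior(priors, likelihoods):
    """Return P(Bj | A) for each j via Bayes' Formula."""
    # Law of total probability gives the denominator P(A).
    total = sum(p * l for p, l in zip(priors, likelihoods))
    return [p * l / total for p, l in zip(priors, likelihoods)]

priors = [0.5, 0.3, 0.2]       # P(B1), P(B2), P(B3) -- hypothetical values
likelihoods = [0.1, 0.6, 0.3]  # P(A | Bj) -- hypothetical values
posterior = bayes_posterior(priors, likelihoods)
# the posteriors sum to 1, and B2 becomes the most probable event after observing A
```

Note how the update reverses the prior ordering: B1 has the largest prior, but B2 has by far the largest likelihood, so it dominates the posterior.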

Prior and Posterior Probability
- Prior probabilities: the original probabilities of an outcome
- Posterior probabilities: the probabilities obtained after updating with new information from the experiment

Prior Density Function
Let π(θ) stand for the density function of θ, where θ is a random vector with range Θ. The function π(θ) is called the prior density function; it represents our information about θ before the experiment.

Posterior Density Function
The posterior density function of θ, given observed value x, follows from the definition of conditional probability:
π(θ|x) = f(x|θ) π(θ) / m(x), where m(x) = ∫ f(x|θ) π(θ) dθ.
In words: posterior ∝ likelihood × prior.

Markov Chain
A Markov chain is a stochastic process. Suppose that whenever the process is in state i, there is a fixed probability Pij that it will next be in state j, which can be written as
P(Xn+1 = j | Xn = i, Xn-1 = in-1, ..., X0 = i0) = P(Xn+1 = j | Xn = i) = Pij.
For a Markov chain, this equation states that the conditional distribution of any future state, given the past states and the present state, is independent of the past states and depends only on the present state.
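A short simulation makes the Markov property concrete. The two-state chain below uses hypothetical transition probabilities of our own choosing:

```python
import random

# A two-state Markov chain: the next state is drawn using only the
# present state, never the earlier history.
P = {0: 0.9,   # P(next = 0 | current = 0) -- hypothetical value
     1: 0.5}   # P(next = 0 | current = 1) -- hypothetical value

def step(state, rng):
    return 0 if rng.random() < P[state] else 1

rng = random.Random(0)
path = [0]
for _ in range(10_000):
    path.append(step(path[-1], rng))

# For these values the stationary probability of state 0 solves
# pi0 = 0.9*pi0 + 0.5*(1 - pi0), giving pi0 = 5/6; the long-run
# fraction of time spent in state 0 approaches it.
frac0 = path.count(0) / len(path)
```

The chain "forgets" its starting state: running the same loop from state 1 gives essentially the same long-run fraction.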

Literature Review
- Linear systems: Kalman Filter
- Nonlinear systems: Extended Kalman Filter, Unscented Filter
- Particle Filter: Bayesian filter, Monte Carlo simulation, Sequential Importance Sampling, Sequential Importance Re-sampling, CONDENSATION algorithm

Kalman Filter
Linear systems: the Kalman Filter
- A recursive linear estimator
- Applies only to Gaussian densities
Algorithm components: state-space model, forward recursion, smoothing, diffuseness

Kalman Filter as density propagation

Kalman Filter
State-space model: a special case of the signal-plus-noise model. Under the signal-plus-noise model, with y(t) the response vector, we have y(t) = s(t) + e(t), where the signal vector is a linear function of the state vector, s(t) = H(t) x(t), and the state vectors are assumed to propagate via the state equations x(t+1) = F(t) x(t) + w(t).

Kalman Filter
Writing the state-space model as above, we derive the Best Linear Unbiased Prediction (BLUP) of the state vector based on the innovation vectors, i.e., the one-step prediction errors of the responses.

Kalman Filter
Forward recursions follow two steps:
1. translate the response vectors into the innovation vectors;
2. find the BLUP of the state vector and the signal vector.
Together these give the forward recursion formulas: a prediction of the next state from the current estimate, followed by an update of that prediction using the new innovation.
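The predict/update cycle can be sketched in one dimension. This is a minimal illustration with a random-walk state; the noise variances q and r are illustrative choices of ours, not values from the slides:

```python
# A one-dimensional Kalman filter step for the model
#   x_{t+1} = x_t + w_t,  Var(w_t) = q   (state equation)
#   y_t     = x_t + e_t,  Var(e_t) = r   (observation equation)
def kalman_step(mean, var, z, q=1.0, r=2.0):
    """One predict/update cycle; returns the new posterior mean and variance."""
    # Predict: propagate the estimate through the (identity) dynamics.
    pred_mean, pred_var = mean, var + q
    # Update: weight the innovation (z - pred_mean) by the Kalman gain.
    gain = pred_var / (pred_var + r)
    return pred_mean + gain * (z - pred_mean), (1 - gain) * pred_var

mean, var = 0.0, 10.0               # vague initial belief
for z in [1.2, 0.9, 1.1, 1.0]:      # noisy measurements of a level near 1
    mean, var = kalman_step(mean, var, z)
# the estimate settles near the measurements and the variance shrinks
```

The gain interpolates between trusting the prediction (small pred_var) and trusting the measurement (small r), which is exactly the BLUP property in this scalar case.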

Kalman Filter
Smoothing (backward recursion): incorporates information from new response data into the predictors. The idea of smoothing is to modify the BLUP from being based on the responses up to time t to being based on the responses up to some later time t + k. How much new information should we incorporate into the estimator? Use the most information possible.

Kalman Filter
Diffuseness: assume the initial state is a random vector with mean zero and some variance-covariance matrix. The specific choice of that matrix has a profound effect on the predictors. To produce predictions that can be computed directly without specifying it, we modify the recursions into the diffuse Kalman Filter.

Nonlinear Systems
The Kalman Filter is not enough, because most applications present nonlinear systems. When the observation density is non-Gaussian, the evolving state density is also generally non-Gaussian. Let the conditional observation density be a time-independent function p(z|x).

Non-Gaussian State-density Propagation

Nonlinear Systems
The rule for propagation of the state density over time is
p(xt | Zt) = kt p(zt | xt) p(xt | Zt-1), where p(xt | Zt-1) = ∫ p(xt | xt-1) p(xt-1 | Zt-1) dxt-1
and kt is a normalization constant that does not depend on xt.

Nonlinear Filtering
Apply a nonlinear filter to evaluate the state density over time. Four distinct probability distributions are represented in the nonlinear Bayesian filter; three of them form part of the problem specification and the fourth constitutes the solution. The three specified distributions are:
- the prior density for the state;
- the process density that describes the stochastic dynamics;
- the observation density.

Nonlinear Filtering
We are interested in the second case, where the approach is to approximate the dynamics as a linear process and proceed as for the linear Kalman Filter. The fourth case is simply to integrate the propagation equation on the previous slide directly, using a suitable numerical representation of the state density. Two filtering methods of the second kind: the Extended Kalman Filter and the Unscented Filter.

Extended Kalman Filter (EKF)
- The most common approach
- Reliable for systems that are almost linear on the time scale of the update interval
- Difficult to implement; heavy computational work required
- Algorithm: linearize all nonlinear models, then apply the traditional Kalman Filter

Example of EKF
With linearization, a position reported as 1 meter in reality represents 96.7 cm. In practice the inconsistency can be resolved by introducing additional stabilizing noise, which increases the size of the transformed covariance; this noise leads to biased estimates. Why are EKFs difficult to tune? Because sufficient noise is needed to offset the defects of linearization.

Unscented Filter
- Based on the concept of the unscented transform
- The mean is calculated to a higher order of accuracy than in the EKF
- The algorithm is suitable for any process model, and implementation is rapid (it avoids the linearization computation of the EKF)

Unscented Transform

Example of Unscented Filter
The true mean (dotted covariance contour), the unscented mean (solid contour), and the linearized mean (dashed contour) are plotted together. The unscented mean coincides with the true value: the unscented transform is consistent.

Particle Filter
- Does not require linearization of the relation between the state and the measurement
- Maintains several hypotheses over time, which gives increased robustness
We develop the idea in the following order: Bayesian filter, Monte Carlo simulation, Sequential Importance Sampling, Sequential Importance Re-sampling, CONDENSATION algorithm.

Particle Filter
Bayesian filter: consider the probabilistic inference problem in which the state sequence Xt = {x1, ..., xt} is estimated from the observed evidence Zt = {z1, ..., zt}. The filtered pdf obtained through recursive estimation is
p(xt | Zt) ∝ p(zt | xt) ∫ p(xt | xt-1) p(xt-1 | Zt-1) dxt-1.

Particle Filter
Monte Carlo simulation: the recursive estimation from the Bayesian filter requires strong assumptions to evaluate in closed form. This problem is resolved by using Monte Carlo methods.
Sequential Importance Sampling: avoids the difficulty of sampling directly from the posterior density by sampling from a proposal distribution. The posterior function can be approximated well by drawing weighted samples from a proposal distribution.
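Self-normalized importance sampling can be demonstrated in a few lines. The target and proposal below are illustrative choices of ours (not from the slides): we estimate the mean of a N(2, 1) target known only up to a constant, drawing samples from a broader N(0, 3) proposal:

```python
import math
import random

def target_unnorm(x):
    # N(2, 1) density without its normalizing constant -- the typical
    # situation for a posterior known only up to proportionality.
    return math.exp(-0.5 * (x - 2.0) ** 2)

def proposal_pdf(x, s=3.0):
    # N(0, s^2) proposal density, fully normalized.
    return math.exp(-0.5 * (x / s) ** 2) / (s * math.sqrt(2 * math.pi))

rng = random.Random(1)
xs = [rng.gauss(0.0, 3.0) for _ in range(100_000)]
# importance weight = target / proposal, evaluated at each sample
ws = [target_unnorm(x) / proposal_pdf(x) for x in xs]
# self-normalized estimate of the target mean: weighted average of the samples
est_mean = sum(w * x for w, x in zip(ws, xs)) / sum(ws)
# est_mean is close to 2, the mean of the target
```

The unknown normalizing constant cancels in the weighted average, which is why the sequential version can work with the proportional recursion on the previous slide.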

Particle Filter
Sequential Importance Re-sampling: a re-sampling stage is used to prune those particles with negligible importance weights and to multiply those with higher ones. A posterior density function is iteratively computed; this pdf undergoes a diffusion-reinforcement process, followed by the factored sampling algorithm.
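One common re-sampling scheme (the slides do not fix a particular one) is systematic re-sampling, which uses a single uniform draw to select survivors proportionally to weight:

```python
import random

# Systematic re-sampling: particles with negligible weights are pruned
# and heavy ones are multiplied, using one uniform draw in [0, 1/n).
def systematic_resample(weights, rng):
    """Return n particle indices drawn proportionally to the given weights."""
    n = len(weights)
    total = sum(weights)
    u = rng.random() / n                        # single uniform offset
    indices, i = [], 0
    cumulative = weights[0] / total
    for j in range(n):
        pos = u + j / n                         # evenly spaced positions
        while pos > cumulative:                 # advance to the particle covering pos
            i += 1
            cumulative += weights[i] / total
        indices.append(i)
    return indices

rng = random.Random(7)
# all weight on particle 1 -> every survivor is a copy of particle 1
survivors = systematic_resample([0.0, 1.0, 0.0, 0.0], rng)
# uniform weights -> every particle survives exactly once
kept = systematic_resample([1.0, 1.0, 1.0, 1.0], rng)
```

Because the positions are evenly spaced, systematic re-sampling has lower variance than drawing n independent multinomial indices, which is why it is a popular choice for the pruning/multiplying stage described above.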

Particle Filter
CONDENSATION algorithm (conditional density propagation for visual tracking):
- a particular particle filtering method;
- not necessarily Gaussian;
- probability densities must be established for the dynamics of the object and for the observation process;
- factored sampling generates random variates from a distribution that approximates the posterior; weight coefficients are decided after a sample set is generated from the prior density.

Steps in the CONDENSATION algorithm
1. An element undergoes drift (a deterministic step); identical elements in the set undergo the same drift.
2. Diffusion step: random, so identical elements now split.
3. The sample set for the new time-step is generated, without its weights.
4. The observation step from factored sampling is applied, generating weights from the observation density.
5. The sample-set representation of the state density is obtained.
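The steps above can be sketched as one time-step in one dimension. This is only an illustration: the drift, diffusion, and observation models below are hypothetical stand-ins for the problem-specific densities the algorithm requires:

```python
import math
import random

def condensation_step(particles, z, rng, drift=1.0, diff_sd=0.5, obs_sd=1.0):
    # Steps 1-2: every element undergoes the same deterministic drift;
    # step 3: diffusion is random, so identical elements now split.
    moved = [x + drift + rng.gauss(0.0, diff_sd) for x in particles]
    # Steps 4-5: the observation step assigns a weight to each particle
    # from a (here Gaussian) observation density around the measurement z.
    weights = [math.exp(-0.5 * ((z - x) / obs_sd) ** 2) for x in moved]
    # Step 6: factored sampling draws the new sample set with these weights.
    new_particles = rng.choices(moved, weights=weights, k=len(moved))
    estimate = sum(w * x for w, x in zip(weights, moved)) / sum(weights)
    return new_particles, estimate

rng = random.Random(0)
particles = [rng.uniform(-5.0, 5.0) for _ in range(2000)]   # broad prior sample set
particles, estimate = condensation_step(particles, z=3.0, rng=rng)
# the weighted estimate is pulled toward the observation z = 3
```

No Gaussian assumption is needed on the particle set itself: the broad uniform prior collapses toward the observation in a single step purely through the weights and re-sampling.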

Observation process
The thick line is a hypothesised shape, represented as a parametric spline curve.

Example: CONDENSATION Tracking agile motion in clutter

Object tracking in video
Problem: take a real-time video of a moving object over a certain period of time, track this object, and mark it.
Data:
- 60 consecutive frames selected from 800 frames of 240×320 images extracted from a 55-second real-time video (running dog);
- 60 consecutive frames offered by Toby Breckon from the Matlab Central Forum (dropping ball).
Methods: Kalman Filter, Particle Filter.
Environment: Matlab.

Results for Kalman Filter (data set 1)

Results for Kalman Filter (data set 2)

Results for Particle Filter (data set 1)

Results for Particle Filter (data set 2)

Discussion
Why did data set 1 (running dog) show less precision under the Kalman Filter algorithm? Possible noise factors in data set 1:
- absence of initial background images;
- non-identical background (the background changes along with the running motion);
- multiple moving objects.
Why did we not see the true position (green circle) in the Particle Filter images for both data sets? The true position is marked first as a green circle, and the predicted position is marked second; when the predicted position overlaps the true position, the true position is concealed.

Comparison
- Prediction accuracy: the Kalman Filter is good at the starting frames; the Particle Filter is almost 100% close to the true position.
- Lag effect: the Kalman Filter shows maximal lag under abrupt acceleration or bouncing; the Particle Filter has no lag effect.
- Noise factor: for the Kalman Filter, a possible reason for the increasing lag effect; for the Particle Filter, noise does not influence the accuracy of predicting the true position.
- Implementation (code): the Kalman Filter needs fewer iterations and is simple; the Particle Filter needs more iterations and is complicated.
- Assumption: the Kalman Filter applies only to Gaussian distributions; the Particle Filter is not necessarily Gaussian.

Conclusion
- The Particle Filter is superior to the Kalman Filter in the assumptions it requires, in its ability to deal with noise in the model, and in the accuracy of its prediction.
- The re-sampling stage in the Particle Filter algorithm is considered to play a vital role in increasing the accuracy of the prediction.
- The prediction accuracy of the Kalman Filter algorithm is reduced when the motion of the object involves abrupt acceleration or bouncing (lag effects).
- The traits of the data can be a factor affecting the prediction result.

Future Research
- Design appropriate models to cope with different types of data sets: multi-object tracking, moving objects against non-identical backgrounds, tracking in clutter.
- Compare implementations of the Extended Kalman Filter, Unscented Filter, Kalman Filter, and Particle Filter on a real-time example.

Special thanks to
My thesis advisor: Dr. Mohammad Kafai
My thesis committee: Dr. Yitwah Cheung, Dr. Alexandra Piryatinska
Computer Science professor: Dr. Kaz Okada

Thanks for technical support from
Scott M. Shell, Network Engineer
Allen Yen, Telecommunication Engineer
Mehran Kafai, Computer Science Ph.D. candidate

Thank you for attending! The End.