Sanjay Patil and Ryan Irwin, Intelligent Electronics Systems, Human and Systems Engineering, Center for Advanced Vehicular Systems. URL: www.cavs.msstate.edu/hse/ies/publications/seminars/msstate/2005/particle_filtering/


HUMAN AND SYSTEMS ENGINEERING: Introduction to Particle Filtering
Sanjay Patil and Ryan Irwin
Intelligent Electronics Systems, Human and Systems Engineering
Center for Advanced Vehicular Systems

Page 1 of 22 Introduction to Particle Filtering
Abstract
Most conventional techniques for speech analysis model the speech signal with Gaussian mixture models. Nonlinear approaches are expected to outperform these conventional techniques because of their ability to compensate for mismatched channel conditions and to significantly reduce model complexity. Particle filtering is one such nonlinear method, based on the sequential Monte Carlo technique. Particle filtering works by approximating the target probability distribution with a set of weighted samples, and thereby greatly reduces the complexity associated with the models.

Page 2 of 22 Introduction to Particle Filtering
Drawing samples to represent a probability distribution function; the concept of particles and their weights
[Figure: the same pdf p(x) approximated with 200, 500, and 5000 samples]
Consider some pdf p(x) and generate random samples from it.
The number of samples drawn near a particular value represents the contribution of that region to the distribution function; this contribution is called the weight of the sample.
Each sample is called a "particle".
Conclusion: the more samples drawn, the better the distribution function is represented.
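The sampling idea above can be sketched in a few lines of Python (an illustrative addition, not part of the original deck; the Gaussian pdf, sample sizes, and seed are arbitrary choices):

```python
import random

def empirical_mean(n_samples, mu=0.0, sigma=1.0, seed=0):
    """Draw n_samples 'particles' from N(mu, sigma^2) and return their mean.

    With equal weights 1/N, the weighted sum of particles approximates the
    true mean of the distribution, improving as the particle count grows."""
    rng = random.Random(seed)
    particles = [rng.gauss(mu, sigma) for _ in range(n_samples)]
    weights = [1.0 / n_samples] * n_samples          # equal weights
    return sum(w * x for w, x in zip(weights, particles))

# Matching the 200 / 500 / 5000-sample panels in the figure:
# the error of the sample mean shrinks roughly like 1/sqrt(N).
for n in (200, 500, 5000):
    print(n, abs(empirical_mean(n)))
```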

Page 3 of 22 Introduction to Particle Filtering
Particle filtering algorithm
Different names:
- Sequential Monte Carlo filters
- Bootstrap filters
- Condensation algorithm
- Survival of the fittest
Problem statement:
- Track the state (parameters or hidden variables) as it evolves over time
- Observations arrive sequentially and are noisy and non-Gaussian
- The idea is to obtain the best possible estimate of the hidden variables

Page 4 of 22 Introduction to Particle Filtering
Particle filtering algorithm, continued: general two-stage framework (prediction-update)
Assume that the pdf p(x_{k-1} | y_{1:k-1}) is available at time k-1.
Prediction stage: propagate through the transition density (Chapman-Kolmogorov equation):
  p(x_k | y_{1:k-1}) = ∫ p(x_k | x_{k-1}) p(x_{k-1} | y_{1:k-1}) dx_{k-1}
This is the prior of the state at time k (without the information from the new measurement): the probability of the state given only the previous measurements.
Update stage: combine the predicted prior with the newly available measurement y_k via Bayes' rule:
  p(x_k | y_{1:k}) ∝ p(y_k | x_k) p(x_k | y_{1:k-1})
This is the posterior pdf, obtained from the predicted prior pdf and the new measurement.
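A minimal sketch of one prediction-update cycle in Python (illustrative only; the random-walk transition, Gaussian likelihood, and all numeric values are assumptions, not taken from the deck):

```python
import math
import random

rng = random.Random(1)

def predict(particles, process_std=0.5):
    """Prediction stage: propagate each particle through p(x_k | x_{k-1})
    (here an assumed random walk), giving samples of the prior at time k."""
    return [x + rng.gauss(0.0, process_std) for x in particles]

def update(particles, y, meas_std=0.3):
    """Update stage: weight each particle by the likelihood p(y_k | x_k)
    and normalize, giving a weighted approximation of the posterior."""
    w = [math.exp(-0.5 * ((y - x) / meas_std) ** 2) for x in particles]
    total = sum(w)
    return [wi / total for wi in w]

particles = [rng.gauss(0.0, 1.0) for _ in range(1000)]  # samples of p(x_0)
particles = predict(particles)                          # prior at k = 1
weights = update(particles, y=0.8)                      # posterior weights
estimate = sum(w * x for w, x in zip(weights, particles))
```

The weighted mean of the particles is the point estimate of the state; it falls between the prior mean (0) and the measurement (0.8), as Bayes' rule dictates.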

Page 5 of 22 Introduction to Particle Filtering
Particle filtering algorithm step-by-step (1)
[Figure: timeline of measurements/observations; the states (unknown/hidden) cannot be measured directly]
Known parameters: x_0, p(x_0), p(x_k | x_{k-1}), p(y_k | x_k), noise statistics
Initial set-up: no observations are available yet.
Draw samples to represent x_0 by its distribution p(x_0).

Page 6 of 22 Introduction to Particle Filtering
Particle filtering algorithm step-by-step (2)
Still no observations or measurements are available.
Predict x_1 using the prediction equation.

Page 7 of 22 Introduction to Particle Filtering
Particle filtering algorithm step-by-step (3)
The first observation/measurement becomes available.
Update x_1 using the update equation.

Page 8 of 22 Introduction to Particle Filtering
Particle filtering algorithm step-by-step (4)
The second observation/measurement is NOT yet available.
Predict x_2 using the prediction equation.

Page 9 of 22 Introduction to Particle Filtering
Particle filtering algorithm step-by-step (5)
The second observation/measurement becomes available.
Update x_2 using the update equation.
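The step-by-step sequence above (predict at every time step, update only when a measurement arrives) can be sketched as a loop; this is an illustrative addition with an assumed random-walk model and made-up observation values, where None marks a step with no measurement:

```python
import math
import random

rng = random.Random(7)
N = 2000
particles = [rng.gauss(0.0, 1.0) for _ in range(N)]   # draw from p(x_0)
weights = [1.0 / N] * N

# Observations arriving over time; None marks a step where only the
# prediction stage runs, as in steps (2) and (4) above.
observations = [0.9, None, 1.1]

for y in observations:
    # Prediction stage: propagate particles through p(x_k | x_{k-1}).
    particles = [x + rng.gauss(0.0, 0.4) for x in particles]
    if y is not None:
        # Update stage: reweight particles by the likelihood p(y_k | x_k).
        like = [math.exp(-0.5 * ((y - x) / 0.3) ** 2) for x in particles]
        total = sum(w * l for w, l in zip(weights, like))
        weights = [w * l / total for w, l in zip(weights, like)]

estimate = sum(w * x for w, x in zip(weights, particles))
```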

Page 10 of 22 Introduction to Particle Filtering
Particle filtering - visualization
- Drawing samples
- Predicting the next state
- Updating this state
- What is the remaining step? Resampling.

Page 11 of 22 Introduction to Particle Filtering
Sampling Importance Resampling (SIR) algorithm: why it is necessary
After a few iterations, most particles end up carrying negligible weight (weight degeneracy); resampling replaces the weighted particle set with an equally weighted one concentrated in the high-probability regions.
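A sketch of the resampling step (an illustrative addition; systematic resampling is one common choice, and the particle values, weights, and seed below are made up):

```python
import random

def systematic_resample(particles, weights, seed=3):
    """Systematic resampling: a single random offset u0 places n evenly
    spaced pointers on the cumulative-weight axis; high-weight particles
    are duplicated and low-weight particles tend to be dropped."""
    rng = random.Random(seed)
    n = len(particles)
    u0 = rng.random() / n
    cumulative, c = [], 0.0
    for w in weights:
        c += w
        cumulative.append(c)
    resampled, j = [], 0
    for i in range(n):
        u = u0 + i / n
        while j < n - 1 and cumulative[j] < u:
            j += 1
        resampled.append(particles[j])
    return resampled

particles = [0.0, 1.0, 2.0, 3.0]
weights = [0.1, 0.6, 0.2, 0.1]   # particle 1.0 carries most of the weight
new_particles = systematic_resample(particles, weights)
```

After resampling, all particles have equal weight 1/n again, and the heavily weighted particle (1.0 here) appears multiple times in the new set.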

Page 12 of 22 Introduction to Particle Filtering
Applications
Most applications involve tracking:
- Visual tracking, e.g. human motion (body parts)
- Prediction of financial time series, e.g. gold prices and stocks
- Quality control in the semiconductor industry
- Military applications: target recognition from single or multiple images, guidance of missiles
For the IES NSF-funded project, particle filtering has been used for:
- Time series estimation for the speech signal (Java demo)
- Speaker verification (details on the next slide)

Page 13 of 22 Introduction to Particle Filtering
Speaker Verification
- Time series estimation of the speech signal
- Hypothesis: particle filters approximate the probability distribution of a signal; if a large number of particles is used, the pdf is approximated better. Only an initial guess of the distribution is needed.
- How are we going to achieve this?

Page 14 of 22 Introduction to Particle Filtering
Pattern Recognition Applet
A Java applet that gives a visualization of the algorithms implemented at IES.
Classification of signals:
- PCA - Principal Component Analysis
- LDA - Linear Discriminant Analysis
- SVM - Support Vector Machines
- RVM - Relevance Vector Machines
Tracking of signals:
- LP - Linear Prediction
- KF - Kalman Filtering
- PF - Particle Filtering
URL:

Page 15 of 22 Introduction to Particle Filtering
Classification - Best Case
- The data sets need to be differentiated; classification distinguishes between the sets of data
- The algorithms separate the data sets with a line of discrimination
- For zero error, the line of discrimination should completely separate the classes
- These patterns are easy to classify

Page 16 of 22 Introduction to Particle Filtering
Classification - Worst Case
- Toroidal (ring-shaped) distributions are not classified easily with a straight line
- The error is around 50%, because a straight line separates only half of each class
- A proper line of discrimination for a toroidal would be a circle enclosing only the inner set
- The toroidal is not common in speech patterns

Page 17 of 22 Introduction to Particle Filtering
Classification - Realistic Case
- A more realistic case: two mixed distributions, classified using RVM
- This algorithm gives a more complex line of discrimination
- The more involved computation of RVM yields better results than LDA and PCA
- LDA, PCA, SVM, and RVM are all pattern classification algorithms; more information is given in the online tutorials about each algorithm

Page 18 of 22 Introduction to Particle Filtering
Signal Tracking - Kalman Filter
- The input signals are now time-based, with the x-axis representing time
- Signal tracking algorithms interpolate the data; interpolation ensures that the input samples fall at regular intervals, since sampling is always done at regular intervals
- The Kalman filter is shown here
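A hedged sketch of the kind of Kalman filter the applet demonstrates (an illustrative addition; the scalar random-walk model, noise variances, and observation values are assumptions, not the applet's actual implementation):

```python
def kalman_1d(observations, q=0.01, r=0.25, x0=0.0, p0=1.0):
    """Scalar Kalman filter for a random-walk state observed in noise.

    q: process-noise variance, r: measurement-noise variance,
    x0/p0: initial state estimate and its variance."""
    x, p = x0, p0
    estimates = []
    for y in observations:
        # Predict: the state is unchanged; uncertainty grows by q.
        p = p + q
        # Update: blend prediction and measurement via the Kalman gain.
        k = p / (p + r)
        x = x + k * (y - x)
        p = (1.0 - k) * p
        estimates.append(x)
    return estimates

est = kalman_1d([1.2, 0.9, 1.1, 1.0, 0.95])
```

The gain k shrinks as the state estimate becomes more certain, so later measurements move the estimate less than early ones.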

Page 19 of 22 Introduction to Particle Filtering
Signal Tracking - Particle Filter
- The algorithm models realistic noise: Gaussian noise is generated at each step
- The noise variances and the number of particles can be customized
- The algorithm runs as previously described:
  1. State prediction stage
  2. State update stage
- The average of the black particles is the overall state estimate
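Putting the prediction, update, and resampling stages together, a complete bootstrap-style loop can be sketched as follows (illustrative only; the random-walk model, Gaussian noise levels, observation values, and multinomial resampling are assumptions, not the applet's code):

```python
import math
import random

rng = random.Random(42)

def particle_filter(observations, n=1000, proc_std=0.3, meas_std=0.4):
    """Bootstrap particle filter: predict, update, resample at each step.

    The state estimate at each step is the weighted average of the
    particle cloud, as in the demo where the average of the particles
    gives the overall state estimate."""
    particles = [rng.gauss(0.0, 1.0) for _ in range(n)]
    estimates = []
    for y in observations:
        # 1. State prediction: add Gaussian process noise to each particle.
        particles = [x + rng.gauss(0.0, proc_std) for x in particles]
        # 2. State update: weight particles by the measurement likelihood.
        w = [math.exp(-0.5 * ((y - x) / meas_std) ** 2) for x in particles]
        total = sum(w)
        w = [wi / total for wi in w]
        # The estimate is the weighted average of the particles.
        estimates.append(sum(wi * x for wi, x in zip(w, particles)))
        # 3. Resample (multinomial) so the particles are equally
        #    weighted again for the next step.
        particles = rng.choices(particles, weights=w, k=n)
    return estimates

est = particle_filter([0.5, 1.0, 1.5, 2.0])
```

With a slowly drifting target, the estimates trail the observations slightly because the process-noise variance limits how far the particles can move per step.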

Page 20 of 22 Introduction to Particle Filtering
Summary
Particle filtering promises to be one of the most useful nonlinear estimation techniques for speech applications.
More points to follow.

Page 21 of 22 Introduction to Particle Filtering
References
- S. Haykin and E. Moulines, "From Kalman to Particle Filters," IEEE International Conference on Acoustics, Speech, and Signal Processing, Philadelphia, Pennsylvania, USA, March 2005.
- M. W. Andrews, "Learning and Inference in Nonlinear State-Space Models," Gatsby Unit for Computational Neuroscience, University College London, U.K., December.
- P. M. Djuric, J. H. Kotecha, J. Zhang, Y. Huang, T. Ghirmai, M. Bugallo, and J. Miguez, "Particle Filtering," IEEE Signal Processing Magazine, vol. 20, no. 5, September 2003.
- M. S. Arulampalam, S. Maskell, N. Gordon, and T. Clapp, "A Tutorial on Particle Filters for Online Nonlinear/Non-Gaussian Bayesian Tracking," IEEE Transactions on Signal Processing, vol. 50, no. 2, pp. 174-188, February 2002.
- R. van der Merwe, N. de Freitas, A. Doucet, and E. Wan, "The Unscented Particle Filter," Technical Report CUED/F-INFENG/TR 380, Cambridge University Engineering Department, Cambridge University, U.K., August 2000.
- S. Gannot and M. Moonen, "On the Application of the Unscented Kalman Filter to Speech Processing," International Workshop on Acoustic Echo and Noise Control, Kyoto, Japan, pp. 27-30, September 2003.
- J. P. Norton and G. V. Veres, "Improvement of the Particle Filter by Better Choice of the Predicted Sample Set," 15th IFAC Triennial World Congress, Barcelona, Spain, July 2002.
- J. Vermaak, C. Andrieu, A. Doucet, and S. J. Godsill, "Particle Methods for Bayesian Modeling and Enhancement of Speech Signals," IEEE Transactions on Speech and Audio Processing, vol. 10, no. 3, March 2002.