HUMAN AND SYSTEMS ENGINEERING: Introduction to Particle Filtering
Sanjay Patil and Ryan Irwin
Intelligent Electronics Systems, Human and Systems Engineering
Center for Advanced Vehicular Systems
URL: www.cavs.msstate.edu/hse/ies/publications/seminars/msstate/2005/particle_filtering/
Page 1 of 22 – Abstract
– Conventional techniques in speech recognition applications model speech as Gaussian mixtures; this lacks robustness to noise and mismatched channels.
– Nonlinear techniques model speech as a time-varying, non-stationary signal.
– Particle filtering:
  – a nonlinear method based on sequential Monte Carlo techniques
  – a technique that can be used for prediction or filtering of a signal
  – works by approximating the target probability distribution (e.g., the amplitude of a speech signal)
  – makes it possible to increase the number of Gaussian mixtures to improve the prediction or filtering of the signal.
Page 2 of 22 – Drawing samples to represent a probability distribution function
[Figure: histograms of N = 200, 500, and 5000 samples drawn from a pdf p(x)]
– Consider a pdf p(x) (blue line).
– Generate random samples that can represent this pdf (N = number of samples).
– Plot the histogram of the samples (red lines).
Conclusion (particles and their weights):
– The approximation depends on the number of samples N.
– The amplitude x associated with a sample (i) is its weight.
– Each sample is called a 'particle'.
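A minimal Python sketch (not part of the original slides) of this idea: draw N samples from a known pdf and compare their histogram to the true density. The choice of a two-component Gaussian mixture, and all names below, are illustrative assumptions.

```python
import numpy as np
import matplotlib.pyplot as plt
from scipy import stats

def target_pdf(x):
    # Example target density: a two-component Gaussian mixture (illustrative choice).
    return 0.6 * stats.norm.pdf(x, -1.0, 0.5) + 0.4 * stats.norm.pdf(x, 1.5, 0.8)

def draw_samples(n, rng):
    # Sample from the same mixture: pick a component, then draw from its Gaussian.
    comp = rng.random(n) < 0.6
    return np.where(comp, rng.normal(-1.0, 0.5, n), rng.normal(1.5, 0.8, n))

rng = np.random.default_rng(0)
x_grid = np.linspace(-4, 5, 400)
for n in (200, 500, 5000):                      # same sample sizes as the slide figure
    samples = draw_samples(n, rng)
    plt.figure()
    plt.hist(samples, bins=50, density=True, color="red", alpha=0.6, label=f"{n} samples")
    plt.plot(x_grid, target_pdf(x_grid), "b-", label="pdf p(x)")
    plt.legend()
plt.show()
```

As the slide concludes, the histogram with 5000 samples tracks p(x) far more closely than the one with 200.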
Page 3 of 22 – Hidden states and observations
An experimental set-up (Markov chain and conditional probability):
– A person is behind a curtain.
– He has N urns, each containing M colored balls.
– He picks a ball from one of the urns and shows you its color.
From the sequence of colored balls, determine from which urn each ball was picked.
To find the urn sequence, the starting point is:
– the probability of selecting an urn,
– the probability of picking a particular ball from the selected urn.
Observations (known): the ball colors. States (unknown, hidden): the urns. (See the generative sketch below.)
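A hedged Python sketch of this generative process. The transition and color probabilities below are made-up illustrative values (2 urns, 3 colors), not numbers from the slides; the point is that a Markov chain selects the hidden urn and the observed color depends only on that urn.

```python
import numpy as np

rng = np.random.default_rng(1)

# Illustrative example: N = 2 urns, M = 3 colors (red, green, blue).
transition = np.array([[0.8, 0.2],     # P(next urn | current urn)
                       [0.3, 0.7]])
emission = np.array([[0.6, 0.3, 0.1],  # P(color | urn 0)
                     [0.1, 0.2, 0.7]]) # P(color | urn 1)
colors = ["red", "green", "blue"]

urn = 0                                # hidden state: which urn is in use
states, observations = [], []
for _ in range(10):
    urn = rng.choice(2, p=transition[urn])   # hidden Markov transition
    color = rng.choice(3, p=emission[urn])   # observed ball color
    states.append(urn)
    observations.append(colors[color])

print("hidden urn sequence:", states)        # unknown to the observer
print("observed colors:   ", observations)   # what the observer actually sees
```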
Page 4 of 22 – Particle filtering algorithm
Problem statement: find the state x at a given time instant.
– Observations (known, measured): y_1, y_2, y_3, ..., y_{k-2}, y_{k-1}, y_k, ...
– States (unknown, to be calculated): x_0, x_1, x_2, x_3, ..., x_{k-2}, x_{k-1}, x_k, ...
– Subscripts indicate the time index.
State-space model:
– State-transition equation: x_k = f(x_{k-1}, v_{k-1})
– State-observation equation: y_k = h(x_k, n_k)
(v_k and n_k denote the process and measurement noise.)
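A minimal sketch of such a state-space model in Python. The particular f and h below are the nonlinear growth model commonly used as a benchmark in the particle filtering literature, chosen purely for illustration; they, and the noise levels, are assumptions rather than the slides' model.

```python
import numpy as np

rng = np.random.default_rng(2)
PROCESS_STD, MEASUREMENT_STD = 0.5, 0.3   # assumed noise statistics

def transition(x_prev, k):
    # State-transition equation: x_k = f(x_{k-1}, v_{k-1})
    return (0.5 * x_prev + 25.0 * x_prev / (1.0 + x_prev**2)
            + 8.0 * np.cos(1.2 * k) + rng.normal(0.0, PROCESS_STD))

def observe(x):
    # State-observation equation: y_k = h(x_k, n_k)
    return x**2 / 20.0 + rng.normal(0.0, MEASUREMENT_STD)

# Simulate a short trajectory of hidden states and the measurements they generate.
x, states, measurements = 0.1, [], []
for k in range(1, 21):
    x = transition(x, k)                 # hidden state: cannot be measured directly
    states.append(x)
    measurements.append(observe(x))      # what the filter actually gets to see

print("states:      ", np.round(states, 3))
print("measurements:", np.round(measurements, 3))
```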
Page 5 of 22 – Particle filtering algorithm (continued)
General two-stage framework (prediction and update stages)
Assume that the pdf p(x_{k-1} | y_{1:k-1}) is available at time k-1.
– Prediction stage: compute the prior of the state at time k, i.e. the probability of the state given only the previous measurements, without the new measurement:
  p(x_k | y_{1:k-1}) = ∫ p(x_k | x_{k-1}) p(x_{k-1} | y_{1:k-1}) dx_{k-1}
– Update stage: compute the posterior pdf from the predicted prior pdf and the newly available measurement y_k:
  p(x_k | y_{1:k}) ∝ p(y_k | x_k) p(x_k | y_{1:k-1})
Page 6 of 22 – Particle filtering algorithm step-by-step (1)
Initial set-up: no observations are available yet.
– Known parameters: x_0, p(x_0), p(x_k | x_{k-1}), p(y_k | x_k), noise statistics.
– Draw samples to represent x_0 by its distribution p(x_0).
[Figure: timeline of measurements/observations vs. hidden states (which cannot be measured); N = 5 particles drawn for x_0: (1.00, -1.176, 0.427, 0.906, 1.072)]
Page 7 of 22 – Particle filtering algorithm step-by-step (2)
– Known parameters: x_0, p(x_0), p(x_k | x_{k-1}), p(y_k | x_k), noise statistics.
– Still no observations or measurements are available.
– Predict x_1 by propagating each particle through the state-transition equation.
[Figure: predicted particles for x_1: (0.5370, -0.9480, 0.63080, 1.51697, 0.39145)]
Page 8 of 22 – Particle filtering algorithm step-by-step (3)
– Known parameters: x_0, p(x_0), p(x_k | x_{k-1}), p(y_k | x_k), noise statistics.
– The first observation/measurement is available.
– Update x_1 using the update equation: weight each particle by the likelihood p(y_1 | x_1) and resample.
[Figure: measurement y_1 = 0.42; updated particles for x_1: (0.5370, 0.63080, 0.630, 0.630, 1.0); estimate ≈ 0.685]
Page 9 of 22 – Particle filtering algorithm step-by-step (4)
– Known parameters: x_0, p(x_0), p(x_k | x_{k-1}), p(y_k | x_k), noise statistics.
– The second observation/measurement is NOT yet available.
– Predict x_2 by propagating each particle through the state-transition equation.
[Figure: predicted particles for x_2: (-1.651, 0.831, 1.888, 1.459, 2.540)]
Page 10 of 22 – Particle filtering algorithm step-by-step (5)
– Known parameters: x_0, p(x_0), p(x_k | x_{k-1}), p(y_k | x_k), noise statistics.
– The second observation/measurement is available.
– Update x_2 using the update equation: weight each particle by the likelihood p(y_2 | x_2) and resample.
[Figure: measurement y_2 = -0.01; updated particles for x_2: (-1.651, -1.651, 0.831, 0.831, 1.0); estimate ≈ -0.12]
Page 11 of 22 – Particle filtering algorithm step-by-step (6)
– Known parameters: x_0, p(x_0), p(x_k | x_{k-1}), p(y_k | x_k), noise statistics.
– In general, when the k-th observation/measurement becomes available, predict and then update x_k using the same two equations.
[Figure: timeline of measurements/observations vs. hidden states (which cannot be measured)]
A compact code sketch of this predict-update loop follows.
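A minimal sketch of the loop above as a bootstrap (sampling-importance-resampling) particle filter in Python. The linear-Gaussian model, the noise levels, and N = 500 particles are illustrative assumptions, not the slides' exact settings.

```python
import numpy as np

rng = np.random.default_rng(3)
N = 500                               # number of particles (assumed)
PROCESS_STD, MEAS_STD = 0.5, 0.3      # assumed noise statistics

def f(x):                             # state-transition model: x_k = f(x_{k-1}) + v
    return 0.9 * x

def h(x):                             # observation model: y_k = h(x_k) + n
    return x

# Simulate a hidden trajectory and its noisy measurements.
true_x, xs, ys = 1.0, [], []
for _ in range(50):
    true_x = f(true_x) + rng.normal(0, PROCESS_STD)
    xs.append(true_x)
    ys.append(h(true_x) + rng.normal(0, MEAS_STD))

# Bootstrap particle filter.
particles = rng.normal(1.0, 1.0, N)   # draw samples from p(x_0)
estimates = []
for y in ys:
    # Prediction stage: propagate every particle through the transition model.
    particles = f(particles) + rng.normal(0, PROCESS_STD, N)
    # Update stage: weight by the likelihood p(y_k | x_k), then resample.
    weights = np.exp(-0.5 * ((y - h(particles)) / MEAS_STD) ** 2)
    weights /= weights.sum()
    particles = rng.choice(particles, size=N, p=weights)
    estimates.append(particles.mean())   # mean of the particles = point estimate

print("RMS error:", np.sqrt(np.mean((np.array(estimates) - np.array(xs)) ** 2)))
```

The repeated particle values seen in the slide figures (e.g. two copies of 0.630) are exactly what the resampling step produces: high-weight particles are duplicated, low-weight ones are dropped.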
Page 12 of 22 – Applications
Most applications involve tracking.
– Ice hockey: tracking the players (demo*).
– At IES (NSF-funded project), particle filtering has been used for:
  – time-series estimation for the speech signal^
  – speaker verification: a speech verification algorithm based on HMMs and the particle filtering algorithm.
Ref.* K. Okuma, A. Taleghani, N. de Freitas, J. Little, and D. Lowe, "A Boosted Particle Filter: Multitarget Detection and Tracking," 8th European Conference on Computer Vision (ECCV 2004), Prague, Czech Republic, 2004. http://www.cs.ubc.ca/~nando/publications.html
Ref.^ M. Gabrea, "Robust Adaptive Kalman Filtering-based Speech Enhancement Algorithm," ICASSP 2004, vol. 1, pp. I-301-I-304, May 2004; K. Paliwal, "Estimation of noise variance from the noisy AR signal and its application in speech enhancement," IEEE Transactions on Acoustics, Speech, and Signal Processing, vol. 36, no. 2, pp. 292-294, Feb. 1988.
Page 13 of 22 – Time Series Prediction
Implementation:
– Problem statement: in the presence of noise, estimate the clean speech signal.
– The order defines the number of previous samples used for prediction.
– The noise calculation is based on the Modified Yule-Walker equations (see the sketch after this slide).
– y_t: speech amplitude in the presence of noise; x_t: clean speech signal.
[Block diagram with stages: Feature Extraction, Model Estimation (order of prediction), State Predict, State Update (number of particles); part of the figure from www.bioid.com/sdk/docs/About_Preprocessing.htm]
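As one illustrative piece, here is a minimal Python sketch of estimating AR (prediction) coefficients from a signal via the standard Yule-Walker equations. The slides use the Modified Yule-Walker variant to handle noisy observations, so this is a simplified stand-in; the AR order and the synthetic signal are assumptions.

```python
import numpy as np
from scipy.linalg import solve_toeplitz

def yule_walker(signal, order):
    """Estimate AR coefficients a_1..a_p from the signal's autocorrelation."""
    x = signal - signal.mean()
    # Biased autocorrelation estimates r[0..order].
    r = np.array([np.dot(x[:len(x) - lag], x[lag:]) / len(x)
                  for lag in range(order + 1)])
    # Solve the symmetric Toeplitz system R a = [r_1 ... r_p].
    return solve_toeplitz(r[:order], r[1:order + 1])

# Synthetic AR(2) "speech-like" signal; the true coefficients (1.3, -0.6) are illustrative.
rng = np.random.default_rng(4)
x = np.zeros(4000)
for t in range(2, len(x)):
    x[t] = 1.3 * x[t - 1] - 0.6 * x[t - 2] + rng.normal(0.0, 0.1)

print("estimated AR coefficients:", np.round(yule_walker(x, order=2), 3))  # ~ [1.3, -0.6]
```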
Page 14 of 22 – Speaker Verification
Hypothesis:
– Particle filters approximate the probability distribution of a signal.
– If a large number of particles is used, the pdf is approximated better.
– An attempt will be made to use more Gaussian mixtures compared to the existing system.
– There is a trade-off between the number of passes and the number of particles.
[Block diagram: Feature Extraction → Classifier (Claimed ID, Speaker Model, Imposter Model) → Decision (Accept / Reject); annotation: "Changes will be made here…"]
Page 15 of 22 – Pattern Recognition Applet
A Java applet that gives a visual demonstration of algorithms implemented at IES.
Classification of signals:
– PCA: Principal Component Analysis
– LDA: Linear Discriminant Analysis
– SVM: Support Vector Machines
– RVM: Relevance Vector Machines
Tracking of signals:
– LP: Linear Prediction
– KF: Kalman Filtering
– PF: Particle Filtering
URL: http://www.cavs.msstate.edu/hse/ies/projects/speech/software/demonstrations/applets/util/pattern_recognition/current/index.html
Page 16 of 22 – Classification Algorithms: Best Case
– The data sets need to be differentiated.
– Classification distinguishes between the sets of data, not the individual samples.
– The algorithms separate the data sets with a line of discrimination.
– To have zero error, the line of discrimination should completely separate the classes.
– These patterns are easy to classify.
Page 17 of 22 – Classification Algorithms: Worst Case
– Toroidal (ring-shaped) data sets are not classified easily with a straight line.
– The error should be around 50%, because only about half of each class ends up on the correct side of the line.
– A proper line of discrimination for a toroidal data set would be a circle enclosing only the inner set.
– The toroidal pattern is not common in speech data.
(A small sketch of this failure mode follows.)
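A hedged Python sketch of the point above, using scikit-learn rather than the IES Java applet: a linear discriminant scores near chance on a ring-shaped ("toroidal") data set, while a classifier whose boundary can be a closed curve (here an RBF-kernel SVM, my illustrative choice) separates it. The data generation and parameters are assumptions.

```python
from sklearn.datasets import make_circles
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split

# Ring-shaped ("toroidal") two-class data: an outer ring around an inner ring.
X, y = make_circles(n_samples=1000, noise=0.05, factor=0.4, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# A straight line of discrimination: accuracy should hover around 50%.
lda = LinearDiscriminantAnalysis().fit(X_train, y_train)
print("LDA accuracy:    ", lda.score(X_test, y_test))

# A nonlinear boundary can enclose the inner set and classify almost perfectly.
svm = SVC(kernel="rbf", gamma=2.0).fit(X_train, y_train)
print("RBF-SVM accuracy:", svm.score(X_test, y_test))
```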
Page 18 of 22 – Classification Algorithms: Realistic Case
– A more realistic case: two mixed (overlapping) distributions classified using RVM.
– This algorithm gives a more complex line of discrimination.
– The more involved computation of RVM yields better results than LDA and PCA.
– Again, LDA, PCA, SVM, and RVM are pattern classification algorithms.
– More information is given in the online tutorials about these algorithms.
Page 19 of 22 – Signal Tracking Algorithms: Kalman Filter
– Predicts the next state of the signal given prior information.
– Signals must be time-based, i.e. drawn from left to right; the x-axis represents time.
– The algorithms interpolate the data to ensure periodic sampling.
– The Kalman filter is shown here. (A minimal sketch follows.)
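A minimal 1-D Kalman filter sketch in Python, assuming a random-walk state with linear Gaussian measurements; the model and the noise variances are illustrative assumptions, not the applet's settings.

```python
import numpy as np

rng = np.random.default_rng(5)
Q, R = 0.01, 0.25          # assumed process and measurement noise variances

# Simulate a random-walk signal and its noisy measurements.
true_x = np.cumsum(rng.normal(0, np.sqrt(Q), 100))
ys = true_x + rng.normal(0, np.sqrt(R), 100)

x_est, P = 0.0, 1.0        # initial state estimate and its variance
estimates = []
for y in ys:
    # Predict: propagate the estimate and grow its uncertainty.
    x_pred, P_pred = x_est, P + Q
    # Update: blend the prediction with the new measurement via the Kalman gain.
    K = P_pred / (P_pred + R)
    x_est = x_pred + K * (y - x_pred)
    P = (1.0 - K) * P_pred
    estimates.append(x_est)

print("RMS error:", np.sqrt(np.mean((np.array(estimates) - true_x) ** 2)))
```

Unlike the particle filter sketch earlier, the Kalman filter carries only a mean and a variance, which is exact for linear Gaussian models but not for the nonlinear, non-Gaussian case that motivates particle filtering.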
Page 20 of 22 – Signal Tracking Algorithms: Particle Filter
– The model has realistic noise: Gaussian noise is actually generated at each step.
– The noise variances and the number of particles can be customized.
– The algorithm runs as previously described:
  1. state prediction stage
  2. state update stage
– Each step gives a collection of possible next states of the signal, represented by the black particles.
– The mean value of the particles becomes the predicted state.
Page 21 of 22 – Summary
– Particle filtering promises to be a useful nonlinear technique for prediction and filtering of non-stationary signals such as speech.
– More points to follow.
Page 22 of 22 – References
– S. Haykin and E. Moulines, "From Kalman to Particle Filters," IEEE International Conference on Acoustics, Speech, and Signal Processing, Philadelphia, Pennsylvania, USA, March 2005.
– M. W. Andrews, "Learning and Inference in Nonlinear State-Space Models," Gatsby Computational Neuroscience Unit, University College London, U.K., December 2004.
– P. M. Djuric, J. H. Kotecha, J. Zhang, Y. Huang, T. Ghirmai, M. Bugallo, and J. Miguez, "Particle Filtering," IEEE Signal Processing Magazine, vol. 20, no. 5, pp. 19-38, September 2003.
– M. S. Arulampalam, S. Maskell, N. Gordon, and T. Clapp, "A Tutorial on Particle Filters for Online Nonlinear/Non-Gaussian Bayesian Tracking," IEEE Transactions on Signal Processing, vol. 50, no. 2, pp. 174-188, February 2002.
– R. van der Merwe, N. de Freitas, A. Doucet, and E. Wan, "The Unscented Particle Filter," Technical Report CUED/F-INFENG/TR 380, Cambridge University Engineering Department, Cambridge, U.K., August 2000.
– S. Gannot and M. Moonen, "On the Application of the Unscented Kalman Filter to Speech Processing," International Workshop on Acoustic Echo and Noise Control, Kyoto, Japan, pp. 27-30, September 2003.
– J. P. Norton and G. V. Veres, "Improvement of the Particle Filter by Better Choice of the Predicted Sample Set," 15th IFAC Triennial World Congress, Barcelona, Spain, July 2002.
– J. Vermaak, C. Andrieu, A. Doucet, and S. J. Godsill, "Particle Methods for Bayesian Modeling and Enhancement of Speech Signals," IEEE Transactions on Speech and Audio Processing, vol. 10, no. 3, pp. 173-185, March 2002.
– M. Gabrea, "Robust Adaptive Kalman Filtering-based Speech Enhancement Algorithm," ICASSP 2004, vol. 1, pp. I-301-I-304, May 2004.
– K. Paliwal, "Estimation of noise variance from the noisy AR signal and its application in speech enhancement," IEEE Transactions on Acoustics, Speech, and Signal Processing, vol. 36, no. 2, pp. 292-294, February 1988.