Presentation transcript: "Tutorial on Particle Filters", assembled and extended by Longin Jan Latecki, Temple University.

1 Tutorial on Particle Filters, assembled and extended by Longin Jan Latecki, Temple University, latecki@temple.edu, using slides from: Keith Copsey, Pattern and Information Processing Group, DERA Malvern; D. Fox, J. Hightower, L. Liao, D. Schulz, and G. Borriello, Univ. of Washington, Seattle; Honggang Zhang, Univ. of Maryland, College Park; Miodrag Bolic, University of Ottawa, Canada; Michael Pfeiffer, TU Graz, Austria.

2 Outline
• Introduction to particle filters
  – Recursive Bayesian estimation
• Bayesian importance sampling
  – Sequential importance sampling (SIS)
  – Sampling importance resampling (SIR)
• Improvements to SIR
  – On-line Markov chain Monte Carlo
• Basic particle filter algorithm
• Example for robot localization
• Conclusions

3 Particle Filters
• Sequential Monte Carlo methods for on-line learning within a Bayesian framework.
• Known as:
  – Particle filters
  – Sequential sampling-importance resampling (SIR)
  – Bootstrap filters
  – Condensation trackers
  – Interacting particle approximations
  – Survival of the fittest

4 History
• First attempts: simulations of growing polymers
  – M. N. Rosenbluth and A. W. Rosenbluth, "Monte Carlo calculation of the average extension of molecular chains," Journal of Chemical Physics, vol. 23, no. 2, pp. 356–359, 1956.
• First application in signal processing: 1993
  – N. J. Gordon, D. J. Salmond, and A. F. M. Smith, "Novel approach to nonlinear/non-Gaussian Bayesian state estimation," IEE Proceedings-F, vol. 140, no. 2, pp. 107–113, 1993.
• Books
  – A. Doucet, N. de Freitas, and N. Gordon, Eds., Sequential Monte Carlo Methods in Practice, Springer, 2001.
  – B. Ristic, S. Arulampalam, and N. Gordon, Beyond the Kalman Filter: Particle Filters for Tracking Applications, Artech House Publishers, 2004.
• Tutorials
  – M. S. Arulampalam, S. Maskell, N. Gordon, and T. Clapp, "A tutorial on particle filters for online nonlinear/non-Gaussian Bayesian tracking," IEEE Transactions on Signal Processing, vol. 50, no. 2, pp. 174–188, 2002.

5 Problem Statement
• Tracking the state of a system as it evolves over time.
• Observations arrive sequentially and are noisy or ambiguous.
• We want the best possible estimate of the hidden state variables.

6 Solution: Sequential Update
• Storing and processing all incoming measurements is inconvenient and may be impossible.
• Recursive filtering:
  – Predict the next state pdf from the current estimate.
  – Update the prediction using sequentially arriving new measurements.
• Optimal Bayesian solution: recursively calculating the exact posterior density.

7 Particle filtering ideas
• A particle filter is a technique for implementing a recursive Bayesian filter by Monte Carlo sampling.
• The idea: represent the posterior density by a set of random particles with associated weights.
• Compute estimates based on these samples and weights.
[Figure: a posterior density over the sample space, approximated by weighted particles]

8 Global Localization of Robot with Sonar http://www.cs.washington.edu/ai/Mobile_Robotics/mcl/animations/global-floor.gif

9

10 Tools needed
Recall the law of total probability (or marginalization) and Bayes' rule.
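The slide's two formulas were images and did not survive extraction; in their standard form they are:

$$p(x) = \int p(x \mid y)\, p(y)\, dy, \qquad p(x \mid z) = \frac{p(z \mid x)\, p(x)}{p(z)}.$$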

11 Recursive Bayesian estimation (I)
• Recursive filter, defined by (standard forms below):
  – System model
  – Measurement model
  – Information available
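The models themselves were given as images; a standard formulation consistent with the Arulampalam et al. tutorial cited on slide 4 is:

$$x_k = f_k(x_{k-1}, v_{k-1}) \;\leftrightarrow\; p(x_k \mid x_{k-1}), \qquad z_k = h_k(x_k, n_k) \;\leftrightarrow\; p(z_k \mid x_k),$$

with the information available at time $k$ being the measurement history $z_{1:k} = \{z_1, \dots, z_k\}$.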

12 Recursive Bayesian estimation (II)
• Seek the posterior $p(x_{k+i} \mid z_{1:k})$:
  – i = 0: filtering
  – i > 0: prediction
  – i < 0: smoothing
• Prediction: propagate the previous posterior through the system model (equation below), which follows from the Markov property of the state.
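The prediction equation, reconstructed in its standard (Chapman–Kolmogorov) form:

$$p(x_k \mid z_{1:k-1}) = \int p(x_k \mid x_{k-1})\, p(x_{k-1} \mid z_{1:k-1})\, dx_{k-1},$$

since, by the Markov property, $p(x_k \mid x_{k-1}, z_{1:k-1}) = p(x_k \mid x_{k-1})$.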

13 Recursive Bayesian estimation (III)
• Update: combine the prediction with the likelihood of the new measurement (standard form below).
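The update equation, reconstructed in its standard form:

$$p(x_k \mid z_{1:k}) = \frac{p(z_k \mid x_k)\, p(x_k \mid z_{1:k-1})}{p(z_k \mid z_{1:k-1})}, \qquad p(z_k \mid z_{1:k-1}) = \int p(z_k \mid x_k)\, p(x_k \mid z_{1:k-1})\, dx_k,$$

since, given the current state, the new measurement is independent of the earlier ones.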

14 Bayes Filters (second pass)
• System state dynamics: $p(x_k \mid x_{k-1})$
• Observation dynamics: $p(z_k \mid x_k)$
• We are interested in the belief or posterior density $p(x_k \mid z_{1:k})$: estimating the system state from noisy observations.

15 From the above, the Bayes filter is constructed as two alternating steps:
• Predict: compute $p(x_k \mid z_{1:k-1})$ from the previous posterior via the system model.
• Update: compute $p(x_k \mid z_{1:k})$ from the prediction via the measurement likelihood.

16 Predict and update as above, under the following assumptions: the state evolves as a Markov process, and each measurement depends only on the current state (stated formally below).
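The two standard Bayes-filter assumptions, in formula form:

$$p(x_k \mid x_{0:k-1}, z_{1:k-1}) = p(x_k \mid x_{k-1}), \qquad p(z_k \mid x_{0:k}, z_{1:k-1}) = p(z_k \mid x_k).$$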

17 Bayes Filter
• How to use it? What else do we need to know?
• Motion model: $p(x_k \mid x_{k-1})$
• Perceptual model: $p(z_k \mid x_k)$
• Start from a prior belief $p(x_0)$.

18 Example 1
• Step 0: initialization
• Step 1: updating

19 Example 1 (continued)
• Step 2: predicting
• Step 3: updating
• Step 4: predicting

20 Classical approximations
• Analytical methods:
  – Extended Kalman filter
  – Gaussian sums… (Alspach et al. 1971)
  These perform poorly in numerous cases of interest.
• Numerical methods:
  – point-mass approximations
  – splines (Bucy 1971, de Figueiro 1974…)
  These are very complex to implement and not flexible.

21 Perfect Monte Carlo simulation
• Random samples are drawn from the posterior distribution.
• Represent the posterior distribution using a set of samples or particles.
• Expectations over the posterior are then easy to approximate (standard form below).
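In the standard form:

$$E[g(x_{0:k})] = \int g(x_{0:k})\, p(x_{0:k} \mid z_{1:k})\, dx_{0:k} \approx \frac{1}{N} \sum_{i=1}^{N} g(x_{0:k}^{i}),$$

where the $x_{0:k}^{i}$ are independent draws from the posterior.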

22 Random samples and the pdf (I)
• Take p(x) = Gamma(4,1).
• Generate some random samples.
• Plot a histogram and a basic approximation to the pdf.
[Figure: 200 samples]
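A minimal Matlab sketch of this demo, assuming the Statistics and Machine Learning Toolbox for gamrnd/gampdf; the bin count and plotting range are arbitrary choices, not taken from the slides:

N = 200;                                   % try 500, 1000, 5000, ... as on the next slides
x = gamrnd(4, 1, N, 1);                    % N samples from Gamma(4,1)
t = linspace(0, 15, 400);
histogram(x, 30, 'Normalization', 'pdf');  % histogram scaled as a density
hold on;
plot(t, gampdf(t, 4, 1), 'r', 'LineWidth', 2);
legend('sample histogram', 'Gamma(4,1) pdf');

As the sample size grows, the histogram converges to the true pdf, which is the point of the next two slides.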

23 Random samples and the pdf (II)
[Figures: 500 samples; 1000 samples]

24 Random samples and the pdf (III)
[Figures: 5000 samples; 200000 samples]

25 Importance Sampling
• Unfortunately it is often not possible to sample directly from the posterior distribution, but we can use importance sampling.
• Let p(x) be a pdf from which it is difficult to draw samples.
• Let $x^{i} \sim q(x)$, i = 1, ..., N, be samples that are easily generated from a proposal pdf q, which is called an importance density.
• Then an approximation to the density p is given by the weighted representation below.
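In the standard form:

$$p(x) \approx \sum_{i=1}^{N} w^{i}\, \delta(x - x^{i}), \qquad w^{i} \propto \frac{p(x^{i})}{q(x^{i})}, \qquad \sum_{i=1}^{N} w^{i} = 1.$$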

26 Bayesian Importance Sampling
• By drawing samples $x_{0:k}^{i}$ from a known, easy-to-sample proposal distribution $q(x_{0:k} \mid z_{1:k})$ we obtain a weighted approximation of the posterior, $p(x_{0:k} \mid z_{1:k}) \approx \sum_i w_k^{i}\, \delta(x_{0:k} - x_{0:k}^{i})$, where $w_k^{i} \propto p(x_{0:k}^{i} \mid z_{1:k}) / q(x_{0:k}^{i} \mid z_{1:k})$ are normalized weights.

27

28 Sequential Importance Sampling (I)
• Factorize the proposal distribution: $q(x_{0:k} \mid z_{1:k}) = q(x_k \mid x_{0:k-1}, z_{1:k})\, q(x_{0:k-1} \mid z_{1:k-1})$; the full factorization is obtained by applying this recursively.
• Remembering that the state evolution is modeled as a Markov process,
• we obtain a recursive estimate of the importance weights (below).
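The resulting weight recursion, in the standard form given by Arulampalam et al. (2002):

$$w_k^{i} \propto w_{k-1}^{i}\, \frac{p(z_k \mid x_k^{i})\, p(x_k^{i} \mid x_{k-1}^{i})}{q(x_k^{i} \mid x_{k-1}^{i}, z_k)}.$$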

29 Sequential Importance Sampling (SIS) Particle Filter
SIS Particle Filter Algorithm (k is the index over time and i is the particle index):
for i = 1:N
    draw a particle $x_k^{i} \sim q(x_k \mid x_{k-1}^{i}, z_k)$
    assign it a weight $w_k^{i}$ (via the recursion above)
end
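A Matlab sketch of one SIS time step under the common prior proposal q = p(x_k | x_{k-1}), for which the weight update reduces to multiplication by the likelihood. The model (random-walk motion, Gaussian likelihood) and every name and value below are illustrative assumptions, not the slides' code:

% assumed example models
sigma_v = 1.0; sigma_n = 0.5;                        % assumed noise levels
f_motion = @(x) x;                                   % placeholder motion mean (random walk)
p_lik = @(z, x) exp(-0.5 * ((z - x) / sigma_n).^2);  % Gaussian likelihood p(z | x)

N = 100;
x = randn(N, 1); w = ones(N, 1) / N;                 % particles and weights from step k-1
zk = 0.3;                                            % stand-in measurement at step k

% one SIS step
x = f_motion(x) + sigma_v * randn(N, 1);             % draw x_k^i ~ p(x_k | x_{k-1}^i)
w = w .* p_lik(zk, x);                               % w_k^i = w_{k-1}^i * p(z_k | x_k^i)
w = w / sum(w);                                      % normalise the weights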

30 Derivation of SIS weights (I)
• The main idea is to factorize both the target $p(x_{0:k} \mid z_{1:k})$ and the proposal $q(x_{0:k} \mid z_{1:k})$.
• Our goal is to expand both p and q recursively in time.

31 Derivation of SIS weights (II)

32 ...and, under the Markov assumptions, the ratio reduces to the standard recursion $w_k^{i} \propto w_{k-1}^{i}\, p(z_k \mid x_k^{i})\, p(x_k^{i} \mid x_{k-1}^{i}) / q(x_k^{i} \mid x_{k-1}^{i}, z_k)$.

33 SIS Particle Filter Foundation
• At each time step k:
  – random samples $x_k^{i}$ are drawn from the proposal distribution $q(x_k \mid x_{k-1}^{i}, z_k)$ for i = 1, ..., N;
  – they represent the posterior distribution by a set of weighted samples or particles, since the weights are given by $w_k^{i} \propto p(x_{0:k}^{i} \mid z_{1:k}) / q(x_{0:k}^{i} \mid z_{1:k})$ and q factorizes as $q(x_k \mid x_{k-1}, z_k)\, q(x_{0:k-1} \mid z_{1:k-1})$.

34 Sequential Importance Sampling (II)
• Choice of the proposal distribution:
• Choose the proposal to minimize the variance of the weights (Doucet et al. 1999); the optimal choice is $q(x_k \mid x_{k-1}^{i}, z_k) = p(x_k \mid x_{k-1}^{i}, z_k)$.
• A common simpler choice is the prior distribution, $q(x_k \mid x_{k-1}^{i}, z_k) = p(x_k \mid x_{k-1}^{i})$; we then obtain the weight update $w_k^{i} \propto w_{k-1}^{i}\, p(z_k \mid x_k^{i})$.

35 Sequential Importance Sampling (III)
• Illustration of SIS. [figure]
• Degeneracy problems:
  – The variance of the importance ratios increases stochastically over time (Kong et al. 1994; Doucet et al. 1999).
  – In most cases, after a few iterations all but one particle will have negligible weight.

36 Sequential Importance Sampling (IV)
• Illustration of degeneracy. [figure]

37 SIS: why the variance increase matters
• Suppose we want to sample from the posterior and choose a proposal density very close to the posterior density.
• Then the importance ratios p/q are all close to 1 and their variance is close to 0.
• So we expect a weight variance close to 0 to obtain reasonable estimates; a variance increase therefore has a harmful effect on accuracy.

38

39 Sampling-Importance Resampling
• SIS suffers from degeneracy problems, so we don't want to do that!
• Introduce a selection (resampling) step to eliminate samples with low importance ratios and multiply samples with high importance ratios.
• Resampling maps the weighted random measure $\{x_k^{i}, w_k^{i}\}$ onto an equally weighted random measure $\{x_k^{j}, 1/N\}$ by sampling uniformly with replacement from $\{x_k^{i}, i = 1, \dots, N\}$ with probabilities $\{w_k^{i}\}$.
• The scheme generates $N_i$ children of particle i such that $\sum_i N_i = N$ and satisfies $E[N_i] = N w_k^{i}$.
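A minimal Matlab sketch of the multinomial resampling step described above; the function name and interface are illustrative, not from the slides:

function idx = resample_multinomial(w)
% Draw numel(w) indices with replacement, with P(idx == j) = w(j).
N = numel(w);
c = cumsum(w(:));
c(end) = 1;                       % guard against floating-point round-off
idx = zeros(N, 1);
u = rand(N, 1);
for i = 1:N
    idx(i) = find(c >= u(i), 1);  % first cdf bin that contains u(i)
end
end

After the call, x(idx) is the equally weighted particle set and every weight is reset to 1/N.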

40 Basic SIR Particle Filter: Schematic
[Flow diagram: initialisation, then the importance sampling step driven by each new measurement, then the resampling step, looping back to importance sampling; an estimate is extracted from the weighted particles at each step.]

41 Basic SIR Particle Filter algorithm (I)
• Initialisation: k = 0
  – For i = 1, ..., N, sample $x_0^{i} \sim p(x_0)$,
  – and set k = 1.
• Importance Sampling step:
  – For i = 1, ..., N, sample $x_k^{i} \sim p(x_k \mid x_{k-1}^{i})$.
  – For i = 1, ..., N, compute the importance weights $w_k^{i}$.
  – Normalise the importance weights.

42 Basic SIR Particle Filter algorithm (II)
• Resampling step:
  – Resample with replacement N particles
  – from the set $\{x_k^{i}, i = 1, \dots, N\}$
  – according to the normalised importance weights.
• Set k := k + 1 and proceed to the Importance Sampling step as the next measurement arrives.
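A compact Matlab rendering of this algorithm as a sketch; sample_prior, sample_motion, and likelihood are assumed placeholder names for the model-specific pieces, and the stand-in models below are illustrative only:

% assumed placeholder models (names are not from the slides)
sample_prior  = @(N) randn(N, 1);                  % x_0^i ~ p(x_0)
sample_motion = @(x) x + randn(size(x));           % x_k^i ~ p(x_k | x_{k-1}^i)
likelihood    = @(zk, x) exp(-0.5 * (zk - x).^2);  % p(z_k | x_k), unnormalised
z = randn(50, 1);                                  % stand-in measurement sequence

N = 100;
x = sample_prior(N);                   % initialisation
xhat = zeros(numel(z), 1);
for k = 1:numel(z)
    x = sample_motion(x);              % importance sampling step
    w = likelihood(z(k), x);           % importance weights
    w = w / sum(w);                    % normalise
    xhat(k) = w' * x;                  % extract estimate (posterior mean)
    c = cumsum(w); c(end) = 1;         % resampling step (multinomial)
    x = x(arrayfun(@(u) find(c >= u, 1), rand(N, 1)));
end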

43 Resampling
[Figure: a weighted particle set mapped to an equally weighted set over the state x]

44 Generic SIR Particle Filter algorithm
M. S. Arulampalam, S. Maskell, N. Gordon, and T. Clapp, "A tutorial on particle filters …," IEEE Trans. on Signal Processing, 50(2), 2002.

45 Improvements to SIR (I)
• There is a variety of resampling schemes with varying performance in terms of the variance of the particles:
  – Residual sampling (Liu & Chen, 1998).
  – Systematic sampling (Carpenter et al., 1999).
  – A mixture of SIS and SIR that resamples only when necessary (Liu & Chen, 1995; Doucet et al., 1999).
• Degeneracy may still be a problem:
  – During resampling, a sample with a high importance weight may be duplicated many times.
  – Samples may eventually collapse to a single point.

46 Improvements to SIR (II)
• To alleviate numerical degeneracy problems, sample smoothing methods may be adopted (see the roughening sketch below).
  – Roughening (Gordon et al., 1993): adds an independent jitter to the resampled particles.
  – Prior boosting (Gordon et al., 1993): increase the number of samples drawn from the proposal distribution to M > N, but draw only N particles in the resampling stage.
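A minimal Matlab sketch of roughening; the jitter scale (tuning constant K times the per-dimension sample range times N^(-1/d)) follows Gordon et al.'s recipe, while the particle array and K here are stand-ins (the snippet also assumes Matlab's implicit expansion, R2016b or later):

K = 0.2;                               % assumed tuning constant
xr = randn(500, 2);                    % stand-in: N-by-d resampled particles
[N, d] = size(xr);
E = max(xr, [], 1) - min(xr, [], 1);   % per-dimension range of the particle cloud
sigma = K * E * N^(-1/d);              % jitter standard deviation per dimension
xr = xr + randn(N, d) .* sigma;        % add independent Gaussian jitter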

47 Improvements to SIR (III)
• Local Monte Carlo methods for alleviating degeneracy:
  – Local linearisation: using an EKF (Doucet, 1999; Pitt & Shephard, 1999) or UKF (Doucet et al., 2000) to estimate the importance distribution.
  – Rejection methods (Müller, 1991; Doucet, 1999; Pitt & Shephard, 1999).
  – Auxiliary particle filters (Pitt & Shephard, 1999).
  – Kernel smoothing (Gordon, 1994; Hürzeler & Künsch, 1998; Liu & West, 2000; Musso et al., 2000).
  – MCMC methods (Müller, 1992; Gordon & Whitby, 1995; Berzuini et al., 1997; Gilks & Berzuini, 1998; Andrieu et al., 1999).

48 Improvements to SIR (IV)
• Illustration of SIR with sample smoothing. [figure]

49 Ingredients for SMC
• Importance sampling function:
  – Gordon et al. → the prior $p(x_k \mid x_{k-1})$
  – Optimal → $p(x_k \mid x_{k-1}, z_k)$
  – UKF → a Gaussian pdf obtained from a UKF at each time step
• Redistribution scheme:
  – Gordon et al. → SIR
  – Liu & Chen → residual
  – Carpenter et al. → systematic
  – Liu & Chen, Doucet et al. → resample only when necessary
• Careful initialisation procedure (for efficiency)

50 Particle filters
• Also known as sequential Monte Carlo methods.
• Represent the belief by sets of samples or particles.
• The $w^{i}$ are nonnegative weights called importance factors.
• The updating procedure is sequential importance sampling with resampling.

51 Example 2: Particle Filter
• Step 0: initialization. Each particle has the same weight.
• Step 1: updating weights. Weights are proportional to p(z|x).

52 Example 2: Particle Filter
• Step 2: predicting. Predict the new locations of particles.
• Step 3: updating weights. Weights are proportional to p(z|x).
• Step 4: predicting. Predict the new locations of particles.
• Particles become more concentrated in the region where the person is more likely to be.

53 Compare Particle Filter with Bayes Filter with Known Distribution
[Side-by-side figures: the predicting and updating steps in Example 1 (exact Bayes filter) and Example 2 (particle filter)]

54 Particle Filters

55 Sensor Information: Importance Sampling

56 Robot Motion

57 Sensor Information: Importance Sampling

58 Robot Motion

59 Tracking in 1D: the blue trajectory is the target; the best of 10 particles is shown in red.

60 Matlab code: truex is a vector of 100 positions to be tracked.
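The listing itself was on image-only slides and did not survive extraction. Below is a minimal bootstrap (SIR) sketch of a comparable 1D demo; the random-walk motion model, the Gaussian likelihood, and all noise levels are assumptions, not the original code:

T = 100; N = 10;                            % 100 positions and 10 particles, as on the slides
truex = cumsum(randn(T, 1));                % simulated true trajectory to be tracked
z = truex + 0.5 * randn(T, 1);              % noisy position measurements
x = randn(N, 1);                            % initial particle set
est = zeros(T, 1);
for k = 1:T
    x = x + randn(N, 1);                    % predict with a random-walk motion model
    w = exp(-0.5 * ((z(k) - x) / 0.5).^2);  % Gaussian likelihood p(z | x)
    w = w / sum(w);                         % normalise weights
    est(k) = w' * x;                        % weighted-mean state estimate
    c = cumsum(w); c(end) = 1;              % multinomial resampling
    x = x(arrayfun(@(u) find(c >= u, 1), rand(N, 1)));
end
plot(1:T, truex, 'b', 1:T, est, 'r')        % blue: target, red: estimate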

61–65 (image-only slides: the original Matlab listing for this demo; see the sketch above)

66 Application Examples
• Robot localization
• Robot mapping
• Visual tracking, e.g. human motion (body parts)
• Prediction of (financial) time series, e.g. mapping gold price to stock price
• Target recognition from single or multiple images
• Guidance of missiles
• Contour grouping
• Nice video demos: http://www.cs.washington.edu/ai/Mobile_Robotics/mcl/

67 2nd Book Advert
• Statistical Pattern Recognition, Andrew Webb, DERA
• ISBN 0340741643, paperback, 1999, £29.99, Butterworth Heinemann
• Contents: Introduction to SPR, Estimation, Density estimation, Linear discriminant analysis, Nonlinear discriminant analysis - neural networks, Nonlinear discriminant analysis - statistical methods, Classification trees, Feature selection and extraction, Clustering, Additional topics, Measures of dissimilarity, Parameter estimation, Linear algebra, Data, Probability theory.

68 Homework
• Implement all three particle filter algorithms:
  – SIS Particle Filter Algorithm (slide 29)
  – Basic SIR Particle Filter algorithm (slides 41 and 42)
  – Generic SIR Particle Filter algorithm (slide 44)
• Evaluate their performance on a problem of your choice.
• Groups of two are allowed.
• Submit a report and ready-to-run Matlab code (with a script and the data).
• Present the report to the class.

