Particle Filters
Pieter Abbeel, UC Berkeley EECS
Many slides adapted from Thrun, Burgard, and Fox, Probabilistic Robotics
Motivation

For continuous state spaces there are often no analytical formulas for the Bayes filter updates.

Solution 1: Histogram filters (not studied in this lecture):
- Partition the state space and keep track of the probability of each partition.
- Challenges: what are the dynamics and the measurement model for the partitioned representation? Often a very fine resolution is required to get reasonable results.

Solution 2: Particle filters:
- Represent the belief by a set of random samples.
- Can use the actual dynamics and measurement models.
- Naturally allocate computational resources where they are required (roughly, adaptive resolution).
- Also known as: Monte Carlo filter, survival of the fittest, condensation, bootstrap filter.
Sample-based Localization (sonar)
Problem to be Solved

Given a sample-based representation of Bel(x_t) = P(x_t | z_1, …, z_t, u_1, …, u_t), find a sample-based representation of Bel(x_{t+1}) = P(x_{t+1} | z_1, …, z_t, z_{t+1}, u_1, …, u_{t+1}).
Dynamics Update

Given a sample-based representation of Bel(x_t) = P(x_t | z_1, …, z_t, u_1, …, u_t), find a sample-based representation of P(x_{t+1} | z_1, …, z_t, u_1, …, u_{t+1}).

Solution: for i = 1, 2, …, N, sample x^i_{t+1} from P(X_{t+1} | X_t = x^i_t).
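The dynamics update can be sketched in a few lines. The motion model below is a hypothetical stand-in for P(X_{t+1} | X_t = x^i_t), assumed here to be a linear-Gaussian model (state plus control plus noise); a real filter would sample from the robot's actual dynamics.

```python
import numpy as np

rng = np.random.default_rng(0)

def dynamics_update(particles, u, noise_std=0.1):
    # Hypothetical linear-Gaussian motion model standing in for
    # P(X_{t+1} | X_t = x^i_t): x_{t+1} = x_t + u + Gaussian noise.
    noise = rng.normal(0.0, noise_std, size=particles.shape)
    return particles + u + noise

particles = np.zeros((1000, 2))                  # N = 1000 particles, 2-D state
moved = dynamics_update(particles, u=np.array([1.0, 0.0]))
```

Each particle is propagated independently, so the sample set after the update represents the predicted belief before any measurement is incorporated.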
Sampling Intermezzo
Observation Update

Given a sample-based representation of P(x_{t+1} | z_1, …, z_t), find a sample-based representation of
P(x_{t+1} | z_1, …, z_t, z_{t+1}) = C · P(x_{t+1} | z_1, …, z_t) · P(z_{t+1} | x_{t+1}).

Solution: for i = 1, 2, …, N, set w^(i)_{t+1} = w^(i)_t · P(z_{t+1} | X_{t+1} = x^(i)_{t+1}).

The distribution is then represented by the weighted set of samples.
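The observation update only touches the weights, not the samples. The sensor model below is an assumption for illustration: a 1-D Gaussian likelihood P(z_{t+1} | x_{t+1}) centered on the particle's state.

```python
import numpy as np

def observation_update(weights, particles, z, meas_std=0.5):
    # Hypothetical Gaussian sensor model for P(z_{t+1} | x_{t+1}):
    # the measurement z is the state corrupted by Gaussian noise.
    likelihood = np.exp(-0.5 * ((particles - z) / meas_std) ** 2)
    new_w = weights * likelihood
    return new_w / new_w.sum()       # renormalize so the weights sum to 1

particles = np.array([0.0, 1.0, 2.0, 5.0])
weights = np.full(4, 0.25)
weights = observation_update(weights, particles, z=1.0)
```

After the update, particles near the measurement carry most of the probability mass, while distant particles keep their positions but lose weight.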
Sequential Importance Sampling (SIS) Particle Filter

- Sample x^1_1, x^2_1, …, x^N_1 from P(X_1) and set w^i_1 = 1 for all i = 1, …, N.
- For t = 1, 2, …:
  - Dynamics update: for i = 1, 2, …, N, sample x^i_{t+1} from P(X_{t+1} | X_t = x^i_t).
  - Observation update: for i = 1, 2, …, N, set w^i_{t+1} = w^i_t · P(z_{t+1} | X_{t+1} = x^i_{t+1}).
- At any time t, the distribution is represented by the weighted set of samples {(x^i_t, w^i_t); i = 1, …, N}.
SIS Demo
SIS Particle Filter: Major Issue

The resulting samples are only weighted by the evidence; the samples themselves are never affected by it. The filter therefore fails to concentrate particles and computation in the high-probability regions of the distribution P(x_t | z_1, …, z_t).
Sequential Importance Resampling (SIR)

At any time t, the distribution is represented by the weighted set of samples {(x^i_t, w^i_t); i = 1, …, N}.
- Sample N times from the set of particles; the probability of drawing each particle is given by its importance weight.
- This focuses more particles and computation on the parts of the state space with high probability mass.
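The resampling step can be sketched directly with a weighted draw; the function name and the toy particle set below are illustrative, not from the slides.

```python
import numpy as np

rng = np.random.default_rng(0)

def resample(particles, weights):
    # Draw N indices with replacement, each index chosen with probability
    # equal to its importance weight; resampled weights become uniform.
    n = len(particles)
    idx = rng.choice(n, size=n, p=weights)
    return particles[idx], np.full(n, 1.0 / n)

particles = np.array([0.0, 1.0, 2.0, 3.0])
weights = np.array([0.05, 0.05, 0.85, 0.05])
new_particles, new_weights = resample(particles, weights)
```

High-weight particles are typically duplicated and low-weight ones dropped, which is exactly how computation gets reallocated to the high-probability regions.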
1. Algorithm particle_filter(S_{t-1}, u_t, z_t):
2.   S_t = ∅, η = 0
3.   For i = 1, …, N:                      // generate new samples
4.     Sample index j(i) from the discrete distribution given by w_{t-1}
5.     Sample x^i_t from p(x_t | x_{t-1}, u_t) using x^{j(i)}_{t-1} and u_t
6.     Compute importance weight w^i_t = p(z_t | x^i_t)
7.     Update normalization factor η = η + w^i_t
8.     Insert (x^i_t, w^i_t) into S_t
9.   For i = 1, …, N:
10.    Normalize weights: w^i_t = w^i_t / η
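The full SIR loop above can be condensed into one runnable step. The motion and sensor models here are assumed stand-ins (a 1-D Gaussian random walk and a Gaussian likelihood), not the models from the slides; resampling first (as in the pseudocode) or last is an equivalent ordering choice.

```python
import numpy as np

rng = np.random.default_rng(1)

def sir_step(particles, u, z, motion_std=0.2, meas_std=0.5):
    # One SIR iteration: propagate, weight by the measurement
    # likelihood, then resample in proportion to the weights.
    n = len(particles)
    particles = particles + u + rng.normal(0.0, motion_std, n)   # dynamics
    w = np.exp(-0.5 * ((particles - z) / meas_std) ** 2)         # p(z_t | x^i_t)
    w /= w.sum()                                                 # normalize (η)
    return particles[rng.choice(n, size=n, p=w)]                 # resample

belief = rng.normal(0.0, 2.0, 500)           # broad initial belief
for z in [1.0, 1.1, 0.9]:                    # three measurements near 1.0
    belief = sir_step(belief, u=0.0, z=z)
```

After a few measurements the particle cloud contracts around the true state, which is the behavior the localization demos in the following slides visualize.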
Particle Filters
Sensor Information: Importance Sampling
Robot Motion
Sensor Information: Importance Sampling
Robot Motion
SIR Demo
SIR demo with random
Initial Distribution
After Incorporating Ten Ultrasound Scans
After Incorporating 65 Ultrasound Scans
Estimated Path
Summary – Particle Filters

- Particle filters are an implementation of recursive Bayesian filtering.
- They represent the posterior by a set of weighted samples.
- They can model non-Gaussian distributions.
- New samples are drawn from the proposal distribution; the weights account for the differences between the proposal and the target distribution.
Summary – PF Localization

In the context of localization, the particles are propagated according to the motion model. They are then weighted according to the likelihood of the observations. In a resampling step, new particles are drawn with a probability proportional to the likelihood of the observation.