Particle Filter Localization
Mohammad Shahab, Ahmad Salam AlRefai
Outline: References, Introduction, Bayesian Filtering, Particle Filters, Monte-Carlo Localization, …Visually, The Use of Negative Information, Localization Architecture in GT, What Next?
References:
Sebastian Thrun, Dieter Fox, Wolfram Burgard. "Monte Carlo Localization with Mixture Proposal Distribution."
Wolfram Burgard. "Recursive Bayes Filtering" (presentation slides).
Jan Hoffmann, Michael Spranger, Daniel Göhring, and Matthias Jüngel. "Making Use of What You Don't See: Negative Information in Markov Localization."
Dieter Fox, Jeffrey Hightower, Lin Liao, and Dirk Schulz.
Introduction
Motivation: Where am I?
The Localization Problem: "Using sensory information to locate the robot in its environment is the most fundamental problem to providing a mobile robot with autonomous capabilities." [Cox '91]
Given: a map of the environment (the soccer field) and a sequence of percepts and actions (camera frames, odometry, etc.).
Wanted: an estimate of the robot's state (its pose).
Probabilistic State Estimation
Advantages: can accommodate inaccurate models and imperfect sensors; robust in real-world applications; the best-known approach to many hard robotics problems.
Disadvantages: computationally demanding; rests on simplifying (sometimes false) assumptions; the result is only an approximation.
Bayesian Filter
Bayesian Filters build on two ingredients: Bayes' rule with background knowledge, and the law of total probability.
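The slide's formula images are not in the transcript; the two standard identities it refers to are:

Bayes' rule with background knowledge z:
P(x | y, z) = P(y | x, z) P(x | z) / P(y | z)

Law of total probability:
P(x | z) = \sum_y P(x | y, z) P(y | z)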
Bayesian Filters: notation. Let x(t) be the pose of the robot at time t, o(t) the robot's observation (sensor information), and a(t) the robot's action (odometry). The idea in Bayesian filtering is to find the probability density (distribution) of the belief over x(t).
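In the usual notation (the slide's own equation image is missing), the belief is the posterior over the pose given all data so far:

Bel(x_t) = p(x_t | o_1, ..., o_t, a_1, ..., a_t)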
Bayesian Filters: we apply Bayes' rule to the most recent observation, together with the Markov assumption: past and future data are independent given the current state.
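A reconstruction of the step the slide shows graphically: Bayes' rule applied to o_t gives

Bel(x_t) = \eta \, p(o_t | x_t, o_{1:t-1}, a_{1:t}) \, p(x_t | o_{1:t-1}, a_{1:t})

and under the Markov assumption the first factor simplifies to p(o_t | x_t).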
Bayesian Filters: the denominator is not a function of x(t), so it is replaced by a normalization constant. Applying the law of total probability to the rightmost term in the numerator, and simplifying further, we get the recursive equation.
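Reconstructed from the definitions above (the slide shows it as an image):

Bel(x_t) = \eta \, p(o_t | x_t) \int p(x_t | a_t, x_{t-1}) \, Bel(x_{t-1}) \, dx_{t-1}

where \eta is the normalization constant.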
So any Bayesian estimation problem needs three things: 1. an initial belief distribution Bel(x_0); 2. the next-state probabilities (motion model) p(x_t | a_t, x_{t-1}); 3. the observation likelihood p(o_t | x_t).
Particle Filter
Particle Filter: the belief is modeled as a discrete distribution over m particles, where m is the number of particles, each particle is a hypothetical state estimate, and each carries a weight reflecting confidence in how well that particle explains the data.
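In symbols (a standard formulation; the slide's equation is an image):

Bel(x_t) \approx \{ (x_t^{(i)}, w_t^{(i)}) : i = 1, ..., m \}, with \sum_i w_t^{(i)} = 1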
Particle Filters allow estimation of non-Gaussian, nonlinear processes. The method is also known as: Monte Carlo filter, survival of the fittest, condensation, bootstrap filter.
Monte-Carlo Localization framework: the update combines the observation model p(o_t | x_t), the motion model p(x_t | a_t, x_{t-1}), and the previous belief Bel(x_{t-1}), exactly as in the recursive Bayes filter above.
Monte-Carlo Localization (figure)
Monte-Carlo Localization: the algorithm (a code sketch follows this list).
1. Using the previous samples, project ahead by generating new samples from the motion model.
2. Reweight each sample based on the new sensor information; one approach is to compute w_i = p(o_t | x_t^(i)) for each i.
3. Normalize the weight factors over all m particles.
4. Resample (or not), and return to step 1.
The normalized weights define the (approximate) distribution over the state.
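A minimal Python sketch of one such update, assuming the motion model and observation likelihood are supplied as callables (they are left abstract in the slides):

```python
import random

def mcl_update(particles, weights, action, observation,
               motion_model, obs_likelihood, resample=True):
    """One Monte-Carlo Localization update step (sketch).

    particles: list of poses; weights: normalized weights of equal length.
    motion_model(pose, action) samples from p(x_t | a_t, x_{t-1});
    obs_likelihood(observation, pose) evaluates p(o_t | x_t).
    Both callables are placeholders standing in for the slides' models.
    """
    m = len(particles)

    # Step 1: project each sample ahead with the motion model.
    particles = [motion_model(p, action) for p in particles]

    # Step 2: reweight each sample with the new sensor information.
    weights = [w * obs_likelihood(observation, p)
               for w, p in zip(weights, particles)]

    # Step 3: normalize the weights over all m particles.
    total = sum(weights)
    weights = [w / total for w in weights] if total > 0 else [1.0 / m] * m

    # Step 4: optionally resample in proportion to the weights, then the
    # loop starts again at step 1 with the next action and observation.
    if resample:
        particles = random.choices(particles, weights=weights, k=m)
        weights = [1.0 / m] * m

    return particles, weights
```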
Monte-Carlo Localization: algorithm (figure). The diagram shows steps 2 and 3 applied for all m particles, then step 1 for all m particles after step 4.
Monte-Carlo Localization: state estimation, i.e. pose calculation. Options include: the weighted mean; the particle with the highest weight; or (as in GT2005) finding the cell (particle subset) with the highest total weight and computing the mean over that subset. The most crucial part of MCL is the calculation of the weights; other alternatives can be imagined.
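A sketch of the first two options, assuming particles are (x, y, theta) tuples with normalized weights (the GT2005 cell-based variant is not shown):

```python
import math

def estimate_pose(particles, weights):
    """Weighted-mean pose from a particle set; particles are (x, y, theta).

    The heading is averaged on the circle through its sine/cosine
    components, since an arithmetic mean of angles is not meaningful.
    """
    x = sum(w * p[0] for w, p in zip(weights, particles))
    y = sum(w * p[1] for w, p in zip(weights, particles))
    s = sum(w * math.sin(p[2]) for w, p in zip(weights, particles))
    c = sum(w * math.cos(p[2]) for w, p in zip(weights, particles))
    return x, y, math.atan2(s, c)

def best_particle(particles, weights):
    """Alternative estimate: the single particle with the highest weight."""
    i = max(range(len(particles)), key=lambda k: weights[k])
    return particles[i]
```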
Advantages of particle filters (MCL): they can model non-linear system dynamics and sensor models; no Gaussian noise assumptions are needed; in practice they perform well in the presence of large amounts of noise and assumption violations (e.g. the Markov assumption, the weighting model); and the implementation is simple.
Disadvantages: higher computational cost, with complexity growing exponentially in the state dimension; and in some applications the filter is more likely to diverge with more accurate measurements!
…Visually
One-Dimensional Illustration of the Bayes Filter (a sequence of five figure-only slides)
Applying Particle Filters to Location Estimation (a sequence of five figure-only slides)
Negative Information
Making Use of Negative Information (a sequence of four figure-only slides)
Mathematical Modeling: notation. t: time; l: landmark; z: observation; u: action; s: state; *: negative information; r: sensing range; o: possible occlusion.
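A plausible reading of this notation (the slide's own formulas are images and not in the transcript): when landmark l is not seen, the belief is updated with a negative-information likelihood of the form

p(z*_t,l | s_t, r, o)

i.e. the probability of not observing l from state s_t given the sensing range r and possible occlusion o, used in place of the usual observation likelihood p(z_t,l | s_t).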
Algorithm (the per-landmark update, with the branch bodies filled in as a plausible reading of the model above):
if (landmark l detected) then
  weight the particle by the observation likelihood p(z_t,l | s_t)
else
  weight the particle by the negative-information likelihood p(z*_t,l | s_t, r, o)
end if
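The same branch expressed as a small Python helper; all names here are hypothetical placeholders, since the slides do not give the actual models:

```python
def landmark_weight(pose, landmark, measurement,
                    positive_likelihood, negative_likelihood):
    """Per-landmark particle weighting with negative information (sketch).

    measurement is the sensed observation of `landmark`, or None if the
    landmark was not detected. positive_likelihood stands in for
    p(z_t | s_t); negative_likelihood stands in for p(z*_t | s_t, r, o).
    Both callables are hypothetical placeholders, not the paper's models.
    """
    if measurement is not None:
        # Landmark l detected: the usual observation update.
        return positive_likelihood(measurement, pose, landmark)
    # Landmark l not detected: weight by how plausible it is to miss the
    # landmark from this pose (sensing range, possible occlusion).
    return negative_likelihood(pose, landmark)
```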
Experiments: particle distribution with 100 particles (MCL), and 2000 particles to get a better representation; not using negative information vs. using negative information; entropy H as an information-theoretic quality measure of the position estimate.
Results (a sequence of three figure-only slides)
German Team Localization Architecture
German Team Self-Localization Classes (figure)
Cognition (figure)
What Next? Monte Carlo is bad for accurate sensors?! There are different types of localization techniques besides particle filters: Kalman, multi-hypothesis tracking, grid, and topological. What is the difference between them, and which one is better? All these issues, and a lot more, will be discussed in our next presentation (next week), Inshallah.
Future
Guidance
Holding Our Bags
Medicine
Dancing…
Understand and Feel
Play With
Or Maybe…
Questions