Probabilistic Robotics: Bayes Filter Implementations
Bayes Filter Reminder
Prediction: \overline{bel}(x_t) = \int p(x_t \mid u_t, x_{t-1}) \, bel(x_{t-1}) \, dx_{t-1}
Correction: bel(x_t) = \eta \, p(z_t \mid x_t) \, \overline{bel}(x_t)
Gaussians
Univariate, p(x) \sim N(\mu, \sigma^2):
  p(x) = \frac{1}{\sqrt{2\pi}\,\sigma} \exp\left(-\frac{(x-\mu)^2}{2\sigma^2}\right)
Multivariate, p(\mathbf{x}) \sim N(\boldsymbol\mu, \Sigma):
  p(\mathbf{x}) = \frac{1}{(2\pi)^{d/2} |\Sigma|^{1/2}} \exp\left(-\frac{1}{2} (\mathbf{x}-\boldsymbol\mu)^T \Sigma^{-1} (\mathbf{x}-\boldsymbol\mu)\right)
Properties of Gaussians
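The properties referred to here can be written out explicitly; these are the standard identities (linear transformation of a Gaussian, and the product of two univariate Gaussian densities):

```latex
% A linear transformation of a Gaussian stays Gaussian:
X \sim \mathcal{N}(\mu, \sigma^2)
  \;\Rightarrow\; aX + b \sim \mathcal{N}(a\mu + b,\; a^2\sigma^2)

% The product of two Gaussian densities is (up to normalization) Gaussian,
% with a precision-weighted mean and a reduced variance:
\mathcal{N}(\mu_1, \sigma_1^2) \cdot \mathcal{N}(\mu_2, \sigma_2^2)
  \;\propto\; \mathcal{N}\!\left(
    \frac{\sigma_2^2 \mu_1 + \sigma_1^2 \mu_2}{\sigma_1^2 + \sigma_2^2},\;
    \frac{\sigma_1^2 \sigma_2^2}{\sigma_1^2 + \sigma_2^2}
  \right)
```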
Multivariate Gaussians
We stay in the "Gaussian world" as long as we start with Gaussians and perform only linear transformations.
Discrete Kalman Filter
Estimates the state x of a discrete-time controlled process that is governed by the linear stochastic difference equation
  x_t = A_t x_{t-1} + B_t u_t + \varepsilon_t
with a measurement
  z_t = C_t x_t + \delta_t
Components of a Kalman Filter
A_t: matrix (n×n) that describes how the state evolves from t-1 to t without controls or noise.
B_t: matrix (n×m) that describes how the control u_t changes the state from t-1 to t.
C_t: matrix (k×n) that describes how to map the state x_t to an observation z_t.
\varepsilon_t, \delta_t: random variables representing the process and measurement noise, assumed to be independent and normally distributed with covariances R_t and Q_t, respectively.
Bayes Filter Reminder
Prediction: \overline{bel}(x_t) = \int p(x_t \mid u_t, x_{t-1}) \, bel(x_{t-1}) \, dx_{t-1}
Correction: bel(x_t) = \eta \, p(z_t \mid x_t) \, \overline{bel}(x_t)
Kalman Filter Algorithm
1. Algorithm Kalman_filter(\mu_{t-1}, \Sigma_{t-1}, u_t, z_t):
2. Prediction:
3.   \bar\mu_t = A_t \mu_{t-1} + B_t u_t
4.   \bar\Sigma_t = A_t \Sigma_{t-1} A_t^T + R_t
5. Correction:
6.   K_t = \bar\Sigma_t C_t^T (C_t \bar\Sigma_t C_t^T + Q_t)^{-1}
7.   \mu_t = \bar\mu_t + K_t (z_t - C_t \bar\mu_t)
8.   \Sigma_t = (I - K_t C_t) \bar\Sigma_t
9. Return \mu_t, \Sigma_t
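The algorithm above maps directly to a few lines of NumPy. The following is an illustrative sketch (the function and argument names are ours, not from the slides):

```python
import numpy as np

def kalman_filter(mu, Sigma, u, z, A, B, C, R, Q):
    """One prediction-correction step of the linear Kalman filter.

    mu, Sigma : previous belief N(mu, Sigma)
    u, z      : control and measurement at time t
    A, B, C   : dynamics, control, and measurement matrices
    R, Q      : process and measurement noise covariances
    """
    # Prediction: propagate the belief through the linear dynamics.
    mu_bar = A @ mu + B @ u
    Sigma_bar = A @ Sigma @ A.T + R
    # Correction: compute the Kalman gain, then fuse the measurement.
    K = Sigma_bar @ C.T @ np.linalg.inv(C @ Sigma_bar @ C.T + Q)
    mu_new = mu_bar + K @ (z - C @ mu_bar)
    Sigma_new = (np.eye(len(mu)) - K @ C) @ Sigma_bar
    return mu_new, Sigma_new
```

For a 1D constant-position model with equal predicted and measurement variances, the gain comes out to 0.5 and the posterior mean lands halfway between prediction and measurement, as expected.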
Linear Gaussian Systems: Dynamics
Dynamics are a linear function of state and control plus additive noise:
  x_t = A_t x_{t-1} + B_t u_t + \varepsilon_t
  p(x_t \mid u_t, x_{t-1}) = N(x_t;\, A_t x_{t-1} + B_t u_t,\, R_t)
Linear Gaussian Systems: Observations
Observations are a linear function of the state plus additive noise:
  z_t = C_t x_t + \delta_t
  p(z_t \mid x_t) = N(z_t;\, C_t x_t,\, Q_t)
Linear Gaussian Systems: Initialization
Initial belief is normally distributed:
  bel(x_0) = N(x_0;\, \mu_0, \Sigma_0)
Kalman Filter Updates in 1D
Kalman Filter Updates
[Figure slides: 1D Gaussian beliefs before and after the measurement and motion updates]
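A minimal sketch of the 1D updates these figures illustrate, assuming the simple additive motion model \bar\mu = \mu + u (function names are ours):

```python
def predict_1d(mu, var, u, var_act):
    # Prediction: shift the mean by the motion, add motion noise to the variance.
    return mu + u, var + var_act

def correct_1d(mu_bar, var_bar, z, var_obs):
    # Correction: the gain K trades off prior confidence vs. measurement
    # confidence; the posterior variance always shrinks.
    K = var_bar / (var_bar + var_obs)
    return mu_bar + K * (z - mu_bar), (1.0 - K) * var_bar
```

Note that the corrected variance (1 - K) * var_bar is smaller than either input variance, which is why incorporating a measurement always sharpens the belief.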
Linear Gaussian Systems: Dynamics
Linear Gaussian Systems: Observations
The Prediction-Correction-Cycle: Prediction
The Prediction-Correction-Cycle: Correction
The Prediction-Correction-Cycle: Prediction and Correction
Kalman Filter Summary
Highly efficient: polynomial in measurement dimensionality k and state dimensionality n: O(k^{2.376} + n^2)
Optimal for linear Gaussian systems!
Most robotics systems are nonlinear!
Nonlinear Dynamic Systems
Most realistic robotic problems involve nonlinear functions:
  x_t = g(u_t, x_{t-1}) + \varepsilon_t
  z_t = h(x_t) + \delta_t
Linearity Assumption Revisited
Non-linear Function
EKF Linearization (1)
EKF Linearization (2)
EKF Linearization (3)
EKF Linearization: First-Order Taylor Series Expansion
Prediction:
  g(u_t, x_{t-1}) \approx g(u_t, \mu_{t-1}) + G_t (x_{t-1} - \mu_{t-1}),
  where G_t = \frac{\partial g(u_t, \mu_{t-1})}{\partial x_{t-1}} is the Jacobian of g at the previous mean.
Correction:
  h(x_t) \approx h(\bar\mu_t) + H_t (x_t - \bar\mu_t),
  where H_t = \frac{\partial h(\bar\mu_t)}{\partial x_t} is the Jacobian of h at the predicted mean.
EKF Algorithm
1. Algorithm Extended_Kalman_filter(\mu_{t-1}, \Sigma_{t-1}, u_t, z_t):
2. Prediction:
3.   \bar\mu_t = g(u_t, \mu_{t-1})
4.   \bar\Sigma_t = G_t \Sigma_{t-1} G_t^T + R_t
5. Correction:
6.   K_t = \bar\Sigma_t H_t^T (H_t \bar\Sigma_t H_t^T + Q_t)^{-1}
7.   \mu_t = \bar\mu_t + K_t (z_t - h(\bar\mu_t))
8.   \Sigma_t = (I - K_t H_t) \bar\Sigma_t
9. Return \mu_t, \Sigma_t
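The EKF step differs from the linear filter only in evaluating the nonlinear models g and h at the mean and their Jacobians G and H for the covariances. A sketch, with g, G, h, H passed in as callables (names are ours):

```python
import numpy as np

def ekf_step(mu, Sigma, u, z, g, G, h, H, R, Q):
    """One EKF prediction-correction step.

    g(u, x), h(x)   : nonlinear dynamics and measurement models
    G(u, x), H(x)   : their Jacobians, evaluated at a given point
    R, Q            : process and measurement noise covariances
    """
    # Prediction: the mean goes through g itself; the covariance goes
    # through the Jacobian G evaluated at the previous mean.
    mu_bar = g(u, mu)
    Gt = G(u, mu)
    Sigma_bar = Gt @ Sigma @ Gt.T + R
    # Correction: linearize the measurement model h around the predicted mean.
    Ht = H(mu_bar)
    K = Sigma_bar @ Ht.T @ np.linalg.inv(Ht @ Sigma_bar @ Ht.T + Q)
    mu_new = mu_bar + K @ (z - h(mu_bar))
    Sigma_new = (np.eye(len(mu)) - K @ Ht) @ Sigma_bar
    return mu_new, Sigma_new
```

A quick sanity check: with linear g and h (identity Jacobians) this reduces exactly to the Kalman filter of the earlier slide.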
EKF Summary
Highly efficient: polynomial in measurement dimensionality k and state dimensionality n: O(k^{2.376} + n^2)
Not optimal!
Can diverge if nonlinearities are large!
Works surprisingly well even when all assumptions are violated!
Unscented Transform
Sigma points:
  \mathcal{X}^{[0]} = \mu
  \mathcal{X}^{[i]} = \mu \pm \left(\sqrt{(n+\lambda)\Sigma}\right)_i, i = 1, \dots, 2n
Weights:
  w_m^{[0]} = \lambda/(n+\lambda), \quad w_c^{[0]} = w_m^{[0]} + (1 - \alpha^2 + \beta)
  w_m^{[i]} = w_c^{[i]} = 1/(2(n+\lambda)), \quad i = 1, \dots, 2n
Pass sigma points through the nonlinear function: \mathcal{Y}^{[i]} = g(\mathcal{X}^{[i]})
Recover mean and covariance:
  \mu' = \sum_i w_m^{[i]} \mathcal{Y}^{[i]}
  \Sigma' = \sum_i w_c^{[i]} (\mathcal{Y}^{[i]} - \mu')(\mathcal{Y}^{[i]} - \mu')^T
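The unscented transform is short enough to sketch in full. This follows the sigma-point construction above, using one common parameterization of alpha, beta, kappa (the function name and defaults are ours):

```python
import numpy as np

def unscented_transform(mu, Sigma, g, alpha=1e-3, beta=2.0, kappa=0.0):
    """Propagate N(mu, Sigma) through a nonlinear function g via sigma points."""
    n = len(mu)
    lam = alpha ** 2 * (n + kappa) - n
    # Sigma points: the mean, plus/minus the columns of a matrix square root.
    S = np.linalg.cholesky((n + lam) * Sigma)
    pts = np.vstack([mu, mu + S.T, mu - S.T])          # shape (2n+1, n)
    # Weights for recovering the mean (wm) and covariance (wc).
    wm = np.full(2 * n + 1, 1.0 / (2 * (n + lam)))
    wc = wm.copy()
    wm[0] = lam / (n + lam)
    wc[0] = lam / (n + lam) + (1.0 - alpha ** 2 + beta)
    # Pass each sigma point through g, then recover mean and covariance.
    Y = np.array([g(p) for p in pts])
    mu_y = wm @ Y
    diff = Y - mu_y
    Sigma_y = (wc[:, None] * diff).T @ diff
    return mu_y, Sigma_y
```

For a linear g the transform is exact, which makes a convenient sanity check: g(x) = 2x + 1 applied to N(1, 4) must give N(3, 16).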
Linearization via Unscented Transform (EKF vs. UKF)
UKF Sigma-Point Estimate (2) (EKF vs. UKF)
UKF Sigma-Point Estimate (3) (EKF vs. UKF)
UKF Algorithm
UKF Summary
Highly efficient: same asymptotic complexity as the EKF; a constant factor slower in typical practical applications
Better linearization than the EKF: accurate in the first two terms of the Taylor expansion (EKF: only the first term)
Derivative-free: no Jacobians needed
Still not optimal!
Particle Filters
Represent belief by random samples
Estimation of non-Gaussian, nonlinear processes
Also known as: Monte Carlo filter, survival of the fittest, condensation, bootstrap filter, particle filter
Filtering: [Rubin, 88], [Gordon et al., 93], [Kitagawa, 96]
Computer vision: [Isard and Blake, 96, 98]
Dynamic Bayesian networks: [Kanazawa et al., 95]
Particle Filter Algorithm
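The sample-weight-resample cycle can be sketched for a toy 1D model. This is an illustrative sketch, not the slides' own pseudocode; the random-walk motion model and Gaussian measurement model are our assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)

def particle_filter_step(particles, u, z, motion_noise, meas_noise):
    """One step of a particle filter for a 1D random-walk model (illustrative).

    Motion model:      x_t = x_{t-1} + u + N(0, motion_noise^2)
    Measurement model: z_t = x_t + N(0, meas_noise^2)
    """
    M = len(particles)
    # 1. Propagate each particle by sampling from the motion model.
    particles = particles + u + rng.normal(0.0, motion_noise, M)
    # 2. Weight each particle by the measurement likelihood p(z | x).
    w = np.exp(-0.5 * ((z - particles) / meas_noise) ** 2)
    w /= w.sum()
    # 3. Resample M particles with replacement, proportional to weight.
    idx = rng.choice(M, size=M, p=w)
    return particles[idx]
```

Starting all particles at 0 and stepping with u = 1 and a measurement z = 1 concentrates the cloud around 1, as the motion and measurement models agree.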
Resampling
Given: a set S = {⟨x^{[i]}, w^{[i]}⟩ | i = 1, …, M} of weighted samples.
Wanted: a random sample, where the probability of drawing x^{[i]} is given by w^{[i]}.
Typically done M times with replacement to generate the new sample set X_t.
Resampling Algorithm
Resampling
[Figure: roulette-wheel selection vs. stochastic universal sampling over the weights w_1, …, w_n]
Roulette wheel: one binary search per draw, O(n log n)
Stochastic universal sampling (systematic resampling): linear time complexity, easy to implement, low variance
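Systematic resampling draws one uniform offset and then places M equally spaced pointers on the cumulative weight distribution, i.e. one spin of a roulette wheel with M equally spaced arrows. A minimal sketch (the function name is ours):

```python
import numpy as np

def systematic_resample(weights, rng=None):
    """Stochastic universal (systematic) resampling: O(M), low variance.

    Returns the indices of the M drawn particles; a particle with weight
    w_i is drawn either floor(M * w_i) or ceil(M * w_i) times.
    """
    rng = rng if rng is not None else np.random.default_rng()
    M = len(weights)
    # One random offset, then M equally spaced pointers in [0, 1).
    positions = (rng.uniform() + np.arange(M)) / M
    cumsum = np.cumsum(weights)
    cumsum[-1] = 1.0                      # guard against floating-point round-off
    # Each pointer selects the particle whose cumulative-weight bin it falls in.
    return np.searchsorted(cumsum, positions)
```

Because the pointers are equally spaced, the variance of the particle counts is much lower than with M independent roulette-wheel draws, at strictly linear cost.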
Summary
Particle filters are an implementation of recursive Bayesian filtering.
They represent the posterior by a set of weighted samples.
In the context of localization, the particles are propagated according to the motion model.
They are then weighted according to the likelihood of the observations.
In a re-sampling step, new particles are drawn with a probability proportional to the likelihood of the observation.