1
SLAM: Simultaneous Localization and Mapping: Part II
by Tim Bailey and Hugh Durrant-Whyte
Presented by Chang Young Kim
These slides are based on: Probabilistic Robotics, S. Thrun, W. Burgard, D. Fox, MIT Press, 2005. Many images are also taken from Probabilistic Robotics. http://www.probabilistic-robotics.com
2
Overview
Review SLAM
Reducing complexity: State Augmentation, Partitioned Updates, Sparsification
Data association: Batch Gating, SIFT, Multi-Hypothesis
Future works
3
What is SLAM?
A robot is exploring an unknown, static environment.
Given: the robot's controls and observations of nearby features.
Estimate: the map of features and the path of the robot.
4
Terminology
Robot state (or pose): position and heading, $x_t = (x, y, \theta)$; $x_{1:t} = \{x_1, x_2, \ldots, x_t\}$
Robot controls: robot motion and manipulation, $u_t$; $u_{1:t} = \{u_1, u_2, \ldots, u_t\}$
Sensor measurements: range scans, images, etc., $z_t$; $z_{1:t} = \{z_1, z_2, \ldots, z_t\}$
Landmarks or map: $m$
5
Terminology
Observation model: $p(z_t \mid x_t, m)$, the probability of a measurement $z_t$ given that the robot is at pose $x_t$ in map $m$.
Motion model: $p(x_t \mid u_t, x_{t-1})$, the probability that action $u_t$ carries the robot from $x_{t-1}$ to $x_t$.
6
SLAM Algorithm
Prediction step (using the motion model) and Update step (using the observation model); the recursion is sketched below.
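A sketch of the two steps in standard Bayes-filter form, using the notation above (this is the textbook recursion from Probabilistic Robotics):

Prediction:  $\overline{Bel}(x_t, m) = \int p(x_t \mid u_t, x_{t-1}) \, Bel(x_{t-1}, m) \, dx_{t-1}$

Update:  $Bel(x_t, m) = \eta \, p(z_t \mid x_t, m) \, \overline{Bel}(x_t, m)$

where $\eta$ is a normalizing constant.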
7
EKF State Space Model
Prediction and Update, linearized about the current estimate; a sketch of the standard equations follows.
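For reference, a sketch of the standard EKF prediction and update equations in this setting (assumed notation: $f$ the motion model, $h$ the observation model, $F_t$ and $H_t$ their Jacobians, $Q_t$ and $R_t$ the process and measurement noise covariances):

Prediction:
$\bar{\mu}_t = f(\mu_{t-1}, u_t)$
$\bar{P}_t = F_t P_{t-1} F_t^T + Q_t$

Update:
$K_t = \bar{P}_t H_t^T (H_t \bar{P}_t H_t^T + R_t)^{-1}$
$\mu_t = \bar{\mu}_t + K_t (z_t - h(\bar{\mu}_t))$
$P_t = (I - K_t H_t) \bar{P}_t$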
8
EKF-SLAM
Maintained values: the mean of $Bel(x_t, m)$ and its covariance matrix $P_t$.
With N landmarks the state is a (3+2N)-dimensional Gaussian (3 pose components plus 2 coordinates per landmark).
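A minimal sketch of that state layout (the variable names and prior values are illustrative assumptions, not from the slides):

import numpy as np

N = 5                              # example number of landmarks
dim = 3 + 2 * N                    # (3+2N)-dimensional state

mu = np.zeros(dim)                 # mean: [x, y, theta, m1x, m1y, ..., mNx, mNy]
P = np.eye(dim) * 1e6              # covariance: large initial landmark uncertainty
P[:3, :3] = np.zeros((3, 3))       # pose assumed known exactly at the start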
9
Overview
Review SLAM
Reducing complexity: State Augmentation, Partitioned Updates, Sparsification
Data association: Batch Gating, SIFT, Multi-Hypothesis
Future works
10
EKF-SLAM: Complexity
O(N³) with N landmarks, due to the covariance matrix and the matrix multiplications with the Jacobians. Can it handle hundreds of dimensions?
The cost can be reduced by approximation methods:
State augmentation for the prediction stage
Partitioned updates for the update stage
Sparsification using an information form
11
State Augmentation
Prediction: during prediction only the vehicle state changes; the map is static.
Solution: state augmentation. Separate the state into an augmented state vector and update only the affected matrices.
12
State Augmentation
Covariance prediction: because the map is static, only the vehicle-related blocks of the covariance need to be recomputed, reducing the prediction cost from O(N³) to O(N).
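A sketch of why the reduction holds, in assumed block notation ($P_{vv}$ vehicle covariance, $P_{vm}$ vehicle-map cross-covariance, $P_{mm}$ map covariance, $F_v$ the motion Jacobian, $Q$ process noise):

$P_t = \begin{bmatrix} P_{vv} & P_{vm} \\ P_{vm}^T & P_{mm} \end{bmatrix}, \qquad \bar{P}_{t+1} = \begin{bmatrix} F_v P_{vv} F_v^T + Q & F_v P_{vm} \\ (F_v P_{vm})^T & P_{mm} \end{bmatrix}$

Only the vehicle row and column change; the O(N²)-sized block $P_{mm}$ is copied unchanged, so the prediction step costs only O(N).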
13
Partitioned Updates
Update: each observation update touches the whole covariance matrix.
Solution: partitioned updates with a local submap. Confine the update to a small local region, update only that region at each step, and update the whole map only at a much lower frequency.
14
Partitioned Updates
Local state: updated by local SLAM at every time step.
Global state: the local submap is periodically registered into the global map.
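A minimal sketch of that scheme in Python (the map interfaces, names, and registration period are all illustrative assumptions, not from the slides):

def partitioned_slam(controls, observations, local_map, global_map, register_every=100):
    """Run full-rate SLAM in a small local submap and register it into the
    global map only at a much lower frequency."""
    for t, (u, z) in enumerate(zip(controls, observations)):
        local_map.predict(u)      # prediction touches only the local state
        local_map.update(z)       # update touches only the local covariance
        if (t + 1) % register_every == 0:
            global_map.register(local_map)        # full-map update, done rarely
            local_map = global_map.spawn_local()  # start a fresh local submap
    return global_map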
15
Sparsification
The state $Bel(x_t, m)$ and its covariance matrix $P_t$ define a Gaussian density, described by its two central moments. The same Gaussian can be written in moment form ($P_t$) or in information form ($Y_t = P_t^{-1}$).
In the information matrix $Y_t$, many of the off-diagonal components are very close to 0 and can be set to zero, making it sparse (sparsification).
16
Sparsification
Using the sparse information form, the covariance prediction cost is reduced from O(N³) to O(N).
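For reference, a sketch of the standard information-form quantities and the additive measurement update in that form (assumed notation: $\xi_t$ the information vector, $H_t$ and $R_t$ as before):

$Y_t = P_t^{-1}, \qquad \xi_t = P_t^{-1} \mu_t$

$Y_t = \bar{Y}_t + H_t^T R_t^{-1} H_t, \qquad \xi_t = \bar{\xi}_t + H_t^T R_t^{-1} \left( z_t - h(\bar{\mu}_t) + H_t \bar{\mu}_t \right)$

Sparsification then keeps $Y_t$ sparse by zeroing its near-zero off-diagonal entries.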
17
Overview
Review SLAM
Computational complexity: State Augmentation, Partitioned Updates, Sparsification
Data association: Batch Gating, SIFT, Multi-Hypothesis
Future works
18
Data Association Problem
Which observation belongs to which landmark?
A robust SLAM must consider the possible data associations.
Three key methods: Batch Gating, SIFT, Multi-Hypothesis.
19
Batch Gating
Basic principle of batching: RANSAC.
Gating: candidate associations are constrained by the robot position estimate.
(Figure: given the true robot movement, the left association hypothesis is the one chosen by gating.)
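A minimal sketch of the kind of individual gate that batch gating builds on, as a Mahalanobis-distance test against a chi-square bound (the threshold value and all names are illustrative assumptions):

import numpy as np

def passes_gate(innovation, S, gate=9.21):
    """Accept a measurement-landmark pairing if its squared Mahalanobis
    distance is below a chi-square gate.
    innovation: z - h(x); S: innovation covariance H P H^T + R.
    gate=9.21 is the 99% chi-square bound for 2 degrees of freedom."""
    d2 = innovation.T @ np.linalg.solve(S, innovation)
    return d2 < gate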
20
SIFT
Gating alone is not enough for reliable data association.
SIFT features have "landmark quality" for SLAM: SIFT correspondences tend to be reliable and recognizable under varying conditions.
21
Multi-Hypothesis Data Association
Generate a separate track estimate for each association hypothesis; low-likelihood tracks are pruned.
FastSLAM is inherently a multi-hypothesis solution because its data association is done on a per-particle basis.
(Figure: each of particles #1 ... #N carries its own pose estimate (x, y, θ) and its own estimates of Landmark 1 ... Landmark M.)
22
Per-Particle Data Association
Was the observation generated by the red or the blue landmark?
P(observation | red) = 0.3, P(observation | blue) = 0.7
Per-particle data association: pick the most probable match; if the probability is too low, generate a new landmark.
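A minimal sketch of that per-particle rule (the likelihood function, threshold, and particle/landmark interfaces are illustrative assumptions, not from the slides):

import numpy as np

NEW_LANDMARK_THRESHOLD = 0.05  # assumed: below this likelihood, spawn a new landmark

def associate(particle, z, measurement_likelihood):
    """Score the observation z against every landmark in this particle's map,
    pick the most probable match, and create a new landmark if even the best
    match is too unlikely."""
    if not particle.landmarks:
        return particle.add_landmark(z)
    likelihoods = [measurement_likelihood(z, particle.pose, lm)
                   for lm in particle.landmarks]
    best = int(np.argmax(likelihoods))
    if likelihoods[best] < NEW_LANDMARK_THRESHOLD:
        return particle.add_landmark(z)
    return best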
23
Future Works
Large-scale mapping involving many vehicles in mixed environments, with sensor networks and dynamic landmarks.
Delayed data fusion in place of batch association, and iterative smoothing, to improve estimation quality and robustness.