1
Mobile Robot Localization and Mapping Using Range Sensor Data
Dr. Joel Burdick, Dr. Stergios Roumeliotis, Samuel Pfister, Kristo Kriechbaum
2
Motivation

Goal: Improved Localization
- The absolute position error of a robot accumulates due to:
  - Wheel slippage
  - Sensor drift
  - Sensor error
- Without correction, the error growth is unbounded.
- Accurate knowledge of absolute position is necessary for effective navigation and accurate mapping.

Goal: Efficient Map Representation
- By abstracting raw sensor data into representative features we:
  - Require less storage space
  - Reduce the computational complexity of many mapping and localization algorithms
- The data representation must not filter out too much useful data.
3
Testbed Equipment

Mobile Robot
- On-board Pentium III PC

SICK Laser Range Scanner
- 8 meter range / 1 mm resolution
- 180 degree field of view

[Figure: robot with laser scanner and its range measurements]
4
Goal: Improved Localization

Scan Matching: Align two range scans taken at different poses to calculate an improved displacement estimate.

Method (sketched below):
- Discard outliers
- Correspond closest points across scans
- Use an iterative Maximum Likelihood algorithm to calculate an optimal displacement estimate
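As a rough illustration of this loop, the sketch below pairs closest points between two 2-D scans, discards distant pairs as outliers, and re-estimates the displacement on each iteration. The names (closest_point_correspondences, estimate_displacement, max_dist) are illustrative, and the closed-form step shown is a plain unweighted least-squares fit, not the authors' weighted Maximum Likelihood estimator described on the following slides.

```python
import numpy as np

def rot(phi):
    """2-D rotation matrix for angle phi."""
    return np.array([[np.cos(phi), -np.sin(phi)],
                     [np.sin(phi),  np.cos(phi)]])

def closest_point_correspondences(scan_i, scan_j_in_i, max_dist=0.3):
    """Pair each transformed point of scan j with its nearest neighbour in
    scan i; pairs farther apart than max_dist are discarded as outliers."""
    pairs = []
    for k, q in enumerate(scan_j_in_i):
        d = np.linalg.norm(scan_i - q, axis=1)
        m = int(np.argmin(d))
        if d[m] < max_dist:
            pairs.append((m, k))
    return pairs

def estimate_displacement(a, b):
    """Closed-form least-squares rigid transform mapping points b onto points a
    (an unweighted stand-in for the weighted ML step of the later slides)."""
    ca, cb = a.mean(axis=0), b.mean(axis=0)
    H = (b - cb).T @ (a - ca)                      # 2x2 cross-covariance
    phi = np.arctan2(H[0, 1] - H[1, 0], H[0, 0] + H[1, 1])
    return ca - rot(phi) @ cb, phi

def scan_match(scan_i, scan_j, p0, phi0, n_iter=20):
    """Iteratively re-correspond closest points and re-estimate the
    displacement (p, phi) that aligns scan j with scan i."""
    p, phi = np.asarray(p0, dtype=float), float(phi0)
    for _ in range(n_iter):
        pairs = closest_point_correspondences(scan_i, scan_j @ rot(phi).T + p)
        if not pairs:
            break
        a = scan_i[[m for m, _ in pairs]]
        b = scan_j[[k for _, k in pairs]]
        p, phi = estimate_displacement(a, b)
    return p, phi
```

Given two overlapping scans as (N, 2) arrays, scan_match(scan_i, scan_j, p_odom, phi_odom) refines an odometry-based initial guess of the displacement.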
5
Weighted Approach

[Figure: correspondence errors and combined uncertainties, x500, with a 1 m scale bar]

Explicit models of uncertainty and noise sources are developed for each scan point (a simplified noise-propagation example follows below), taking into account:
- Sensor noise & errors
- Point correspondence uncertainty

Improvements vs. the unweighted method:
- More accurate displacement estimate
- More realistic covariance estimate
- Increased robustness to initial conditions
- Improved convergence properties
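The slide does not give the noise model itself; as one hedged example, the sketch below propagates range and bearing noise to a 2-D covariance by a first-order (Jacobian) approximation. The function name point_covariance and the sigma values are placeholders, and the correspondence-uncertainty term is not modelled here.

```python
import numpy as np

def point_covariance(r, theta, sigma_r=0.005, sigma_theta=np.deg2rad(0.5)):
    """First-order propagation of range/bearing noise to the Cartesian
    covariance of one scan point x = r * [cos(theta), sin(theta)].
    The noise magnitudes are illustrative, not the scanner's actual spec."""
    J = np.array([[np.cos(theta), -r * np.sin(theta)],
                  [np.sin(theta),  r * np.cos(theta)]])   # d(x)/d(r, theta)
    S = np.diag([sigma_r**2, sigma_theta**2])
    return J @ S @ J.T
```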
6
Weighted Formulation

Goal: Estimate the displacement (p_ij, phi_ij) between poses i and j from the measured range data.

Measurement model: each measured scan point is the true range corrupted by sensor noise and bias,
  u_i^k = u_i^k(true) + n_i^k + b_i^k   (and likewise for pose j).

Error between the k-th scan point pair:
  e_ij^k = u_i^k - R(phi_ij) u_j^k - p_ij,   where R(phi_ij) is the rotation matrix of phi_ij.

This error splits into a correspondence error, a noise error, and a bias error.
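Substituting the measurement model into the error term makes the three components named on the slide explicit. The rendering below is a hedged reconstruction; the paper's exact symbols may differ.

```latex
% Hedged reconstruction of the error decomposition (notation may differ from the paper)
\begin{align*}
  u_i^k &= \bar{u}_i^k + n_i^k + b_i^k, \qquad u_j^k = \bar{u}_j^k + n_j^k + b_j^k \\
  \varepsilon_{ij}^k &= u_i^k - R(\phi_{ij})\,u_j^k - p_{ij} \\
  &= \underbrace{\bar{u}_i^k - R(\phi_{ij})\,\bar{u}_j^k - p_{ij}}_{\text{correspondence error}}
   + \underbrace{n_i^k - R(\phi_{ij})\,n_j^k}_{\text{noise error}}
   + \underbrace{b_i^k - R(\phi_{ij})\,b_j^k}_{\text{bias error}}
\end{align*}
```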
7
Maximum Likelihood Estimation

Non-linear optimization problem: maximize the likelihood of obtaining the errors {e_ij^k} given the displacement.
- The position displacement estimate is obtained in closed form.
- The orientation estimate is found using 1-D numerical optimization or series expansion approximation methods (see the sketch below).
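A hedged sketch of this two-step structure: for a fixed orientation the covariance-weighted translation has a closed form, and the orientation is then found by 1-D numerical optimization of the negative log-likelihood. The name weighted_ml_displacement is illustrative, the per-pair covariances are held fixed for simplicity (the full formulation lets them depend on the orientation), and SciPy's bounded scalar minimizer stands in for whichever 1-D method the authors used.

```python
import numpy as np
from scipy.optimize import minimize_scalar

def rot(phi):
    """2-D rotation matrix for angle phi."""
    return np.array([[np.cos(phi), -np.sin(phi)],
                     [np.sin(phi),  np.cos(phi)]])

def weighted_ml_displacement(a, b, covs):
    """Estimate (p, phi) from corresponding point pairs (a_k, b_k), assuming
    Gaussian errors e_k = a_k - rot(phi) @ b_k - p with covariances covs[k]."""
    W = np.array([np.linalg.inv(C) for C in covs])        # information matrices

    def closed_form_p(phi):
        r = a - b @ rot(phi).T                            # residuals a_k - R b_k
        p = np.linalg.solve(W.sum(axis=0), np.einsum('kij,kj->i', W, r))
        return p, r

    def neg_log_likelihood(phi):
        p, r = closed_form_p(phi)
        e = r - p
        return float(np.einsum('ki,kij,kj->', e, W, e))   # sum of e_k' W_k e_k

    res = minimize_scalar(neg_log_likelihood,
                          bounds=(-np.pi, np.pi), method='bounded')
    p, _ = closed_form_p(res.x)
    return p, res.x
```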
8
Localization Results

Weighted vs. unweighted matching of two poses: 512 trials with different initial displacements within
- +/- 15 degrees of the actual angular displacement
- +/- 150 mm of the actual spatial displacement

The weighted method showed:
- Increased robustness to inaccurate initial displacement guesses
- Fewer iterations for convergence
9
Localization Results

Displacement estimate errors at the end of the path:
- Odometry: 950 mm
- Unweighted: 490 mm
- Weighted: 120 mm

The weighted method also provides:
- A more accurate covariance estimate
- Improved knowledge of measurement uncertainty
- Better fusion with other sensors
10
Goal: Efficient Mapping

Line Fitting: Given a group of range measurements, each with a unique uncertainty, determine the set of optimally fit lines as well as the uncertainty of those lines.

Method (the first two steps are sketched below):
- Group roughly collinear points using the Hough Transform
- Calculate the optimal line fit using a Maximum Likelihood framework, with each point weighted according to its individual uncertainty
- Calculate the uncertainty of the line fit
- Merge similar lines across data scans for further map simplification

[Figure: weighted line fit - range scan points with point uncertainties, fit line with line uncertainty]
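The sketch below illustrates grouping and fitting under simplifying assumptions: a coarse Hough accumulator collects roughly collinear points, and each group is then fit with a scalar-weighted total least-squares line. The names (hough_group, weighted_line_fit), the bin sizes, and the use of a single scalar weight per point (instead of the full per-point covariance of the Maximum Likelihood fit) are assumptions for illustration; line-uncertainty computation and line merging are not shown.

```python
import numpy as np

def hough_group(points, n_theta=180, rho_res=0.05, min_votes=20):
    """Group roughly collinear 2-D points with a coarse Hough transform.
    Returns one list of point indices per well-supported (theta, rho) cell;
    overlapping groups are left to a later merging step."""
    thetas = np.linspace(0.0, np.pi, n_theta, endpoint=False)
    normals = np.vstack([np.cos(thetas), np.sin(thetas)])    # (2, n_theta)
    rho_idx = np.round((points @ normals) / rho_res).astype(int)

    cells = {}
    for k in range(points.shape[0]):
        for t in range(n_theta):
            cells.setdefault((t, rho_idx[k, t]), []).append(k)
    return [idx for idx in cells.values() if len(idx) >= min_votes]

def weighted_line_fit(points, weights):
    """Fit a line n . x = d to a group of points, weighting each point by a
    scalar (e.g. the inverse trace of its covariance)."""
    w = weights / weights.sum()
    c = w @ points                                           # weighted centroid
    d = points - c
    M = d.T @ (d * w[:, None])                               # weighted scatter matrix
    _, evecs = np.linalg.eigh(M)
    n = evecs[:, 0]                    # smallest-eigenvalue direction = line normal
    return n, float(n @ c)
```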
11
Line Fitting Results

Map built from laser range scans taken from 10 poses in a hallway.
- Fig. A: Raw points with associated uncertainties (7200 raw range points)
- Fig. B: Fit lines with associated uncertainties (114 fit lines)
- Fig. C: Final line map after line merging (46 fit lines)

Final data compression: 98.7%
12
Line Fitting Results

[Figure: raw range data points taken in the lab (10 poses, 7200 points); legend: robot poses, range scan points, 50x covariance (3σ)]
13
Line Fitting Results

[Figure: 141 lines fit; legend: robot poses, fit lines, line endpoints, 50x covariance (3σ)]
14
Line Fitting Results

[Figure: 74 lines after merging (97.9% compression); legend: robot poses, fit lines, line endpoints, 50x covariance (3σ)]
15
Future Work
- Transition to a CCD camera as the primary sensor
- Extend the theory to include non-planar features
- Extract features invariant to small changes in robot displacement
- Identify a metric to measure which features maximally distinguish between locations
- Establish a general framework for automatic feature selection
- Merge multiple sensors for optimal localization and mapping