Mobile Robot Localization and Mapping Using Range Sensor Data Dr. Joel Burdick, Dr. Stergios Roumeliotis, Samuel Pfister, Kristo Kriechbaum.


Motivation

Goal: Improved Localization
The absolute position error of a robot accumulates due to:
- Wheel slippage
- Sensor drift
- Sensor error
Without correction the error growth is unbounded. Accurate knowledge of absolute position is necessary for effective navigation and accurate mapping.

Goal: Efficient Map Representation
By abstracting raw sensor data into representative features we:
- require less storage space
- reduce the computational complexity of many mapping and localization algorithms
The data representation must not discard too much useful data.

Testbed Equipment
Mobile Robot
- On-board Pentium III PC
SICK Laser Range Scanner
- 8 meter range / 1 mm resolution
- 180 degree field-of-view

Goal: Improved Localization
Scan Matching: align two range scans taken at different poses to compute an improved displacement estimate.
Method:
- Discard outliers
- Correspond closest points across the two scans
- Use an iterative Maximum Likelihood algorithm to calculate an optimal displacement estimate
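The "correspond closest points / discard outliers" step can be sketched as a brute-force nearest-neighbor pairing with a distance gate. This is a minimal illustration: the threshold value `max_dist` is made up, since the slides do not specify the outlier test.

```python
import numpy as np

def correspond(scan_a, scan_b, max_dist=0.3):
    """Pair each point in scan_a with its closest point in scan_b,
    discarding pairs farther apart than max_dist as outliers.
    (max_dist is a hypothetical threshold, not from the slides.)"""
    diff = scan_a[:, None, :] - scan_b[None, :, :]   # all pairwise offsets
    dist = np.linalg.norm(diff, axis=2)              # pairwise distances
    j = dist.argmin(axis=1)                          # closest point in scan_b
    keep = dist[np.arange(len(scan_a)), j] <= max_dist
    return np.arange(len(scan_a))[keep], j[keep]     # index pairs (i, j)
```

With the pairs in hand, the iterative estimator alternates correspondence and displacement re-estimation until convergence.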

Weighted Approach
Explicit models of uncertainty and noise sources are developed for each scan point, taking into account:
- Sensor noise and errors
- Point correspondence uncertainty
Improvements vs. the unweighted method:
- More accurate displacement estimate
- More realistic covariance estimate
- Increased robustness to initial conditions
- Improved convergence properties
(Figure: combined uncertainties, covariance scaled x500 with a 1 m scale bar, and correspondence errors.)

Weighted Formulation
Goal: estimate the displacement (p_ij, φ_ij) between poses i and j.
The measured range data from poses i and j is the true range plus sensor noise and bias.
The error between the k-th scan point pair, where R(φ_ij) denotes rotation by φ_ij, decomposes into:
- Correspondence error
- Noise error
- Bias error
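The equations on this slide were lost in transcription; a plausible reconstruction of the error decomposition, following the weighted scan-matching formulation (the orientation symbol φ_ij and the sign conventions are assumptions):

```latex
% Measured scan point k from pose i: true range plus noise and bias
u_i^k = \bar{u}_i^k + n_i^k + b_i^k
% Error between the k-th corresponded pair, after transforming
% scan j into frame i by the displacement (p_{ij}, \phi_{ij}):
\varepsilon_{ij}^k = u_i^k - R(\phi_{ij})\, u_j^k - p_{ij}
% which decomposes into correspondence, noise, and bias terms:
\varepsilon_{ij}^k = \varepsilon_{\mathrm{corr}}^k
                   + \varepsilon_{\mathrm{noise}}^k
                   + \varepsilon_{\mathrm{bias}}^k
```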

Maximum Likelihood Estimation
Maximize the likelihood of obtaining the errors {ε_ij^k} given the displacement (a non-linear optimization problem).
- The position displacement estimate is obtained in closed form
- The orientation estimate is found using 1-D numerical optimization, or series-expansion approximation methods
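The two-step estimator described here can be sketched as a 1-D search over orientation with the closed-form weighted translation computed at each candidate angle. This is a simplified illustration using scalar per-point weights; the actual method weights each pair by its full uncertainty model and uses a more refined 1-D optimizer than a grid search.

```python
import numpy as np

def weighted_displacement(a, b, weights, thetas):
    """Estimate the displacement (theta, p) aligning scan b to scan a by
    minimizing sum_k w_k ||a_k - R(theta) b_k - p||^2.
    For fixed theta the optimal p is the weighted mean of the residuals
    (closed form); theta is then found by 1-D search over `thetas`."""
    best = None
    for th in thetas:
        c, s = np.cos(th), np.sin(th)
        R = np.array([[c, -s], [s, c]])
        r = a - b @ R.T                                # residuals a_k - R b_k
        p = np.average(r, axis=0, weights=weights)     # closed-form translation
        e = r - p
        cost = np.sum(weights * np.einsum('ij,ij->i', e, e))
        if best is None or cost < best[0]:
            best = (cost, th, p)
    return best[1], best[2]
```

Recovering a known displacement from synthetic corresponded points exercises both the closed-form step and the orientation search.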

Localization Results
Weighted vs. unweighted matching of two poses: 512 trials with different initial displacements within:
- +/- 15 degrees of the actual angular displacement
- +/- 150 mm of the actual spatial displacement
The weighted method shows:
- Increased robustness to inaccurate initial displacement guesses
- Fewer iterations for convergence

Localization Results
Displacement estimate errors at the end of the path:
- Odometry: 950 mm
- Unweighted: 490 mm
- Weighted: 120 mm
The weighted method also gives:
- A more accurate covariance estimate
- Improved knowledge of measurement uncertainty
- Better fusion with other sensors

Goal: Efficient Mapping
Line Fitting: given a group of range measurements, each with a unique uncertainty, determine the set of optimally fit lines as well as the uncertainty of those lines.
Method:
- Group roughly collinear points using the Hough Transform
- Calculate the optimal line fit using a Maximum Likelihood framework, with each point weighted according to its individual uncertainty
- Calculate the uncertainty of the line fit
- Merge similar lines across data scans for further map simplification
(Figure: weighted line fit, showing range scan points, point uncertainties, the fit line, and line uncertainty.)
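The weighted line-fit step can be illustrated with a scalar-weighted total-least-squares fit, where the line normal is the smallest-eigenvalue eigenvector of the weighted covariance of the points. This is a simplification of the method above: the actual framework weights each point by its full individual uncertainty and also propagates the uncertainty of the fitted line.

```python
import numpy as np

def weighted_line_fit(points, weights):
    """Fit a line n . x = d minimizing the weighted sum of squared
    perpendicular distances (weighted total least squares).
    The normal n is the eigenvector of the weighted covariance
    matrix with the smallest eigenvalue."""
    w = weights / weights.sum()
    mean = w @ points                      # weighted centroid lies on the line
    d = points - mean
    cov = (d * w[:, None]).T @ d           # weighted 2x2 covariance
    vals, vecs = np.linalg.eigh(cov)       # eigenvalues in ascending order
    n = vecs[:, 0]                         # normal = least-spread direction
    return n, n @ mean                     # line parameters (n, d)
```

For exactly collinear input the smallest eigenvalue is zero and every point satisfies n . x = d.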

Line Fitting Results
Map built from laser range scans taken from 10 poses in a hallway.
- Fig. A: raw range points with associated uncertainties
- Fig. B: fit lines with associated uncertainties
- Fig. C: final line map after line merging (46 fit lines)
Final data compression: 98.7%

Line Fitting Results
Raw range data: 7200 points taken in the lab from 10 poses.
(Figure legend: robot poses, range scan points, 50x covariance (3σ).)

Line Fitting Results
141 lines fit.
(Figure legend: robot poses, fit lines, line endpoints, 50x covariance (3σ).)

Line Fitting Results
74 lines after merging (97.9% compression).
(Figure legend: robot poses, fit lines, line endpoints, 50x covariance (3σ).)

Future Work
- Transition to a CCD camera as the primary sensor
- Extend the theory to include non-planar features
- Extract features invariant to small changes in robot displacement
- Identify a metric to measure which features maximally distinguish between locations
- Establish a general framework for automatic feature selection
- Merge multiple sensors for optimal localization and mapping