Sensor Fusion for Mobile Robotics Localization


Sensor Fusion for Mobile Robotics Localization

Until now we have presented the main principles and features of incremental and absolute (environment-referred) localization systems. Could you summarize their main features and differences? What are the main problems of each category?

Robot Sensors – Localization: Incremental vs Absolute. Do you recognize the difference between the two categories? Infrared ranging (IR modulator + receiver), magnetometer, GPS, accelerometer, camera, linear encoder, sonar ranging, gyroscope, rotary encoder, laser triangulation, laser rangefinder, compass.

Robot Sensors – Localization: good features and bad features of incremental vs absolute sensors.

Example: localization with encoders. The vehicle is driven along a corridor, localized only by encoders mounted on the wheels. Problem: the left wheel has a smaller radius than the nominal value, so the ideal path (the path the vehicle assumes it lies on) drifts away from the real one.
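A minimal numeric sketch of the corridor example (all geometry values made up for illustration): dead-reckoning a differential-drive pose from encoder ticks, once with the true wheel radii and once with the nominal left radius that is 2% too large with respect to the worn wheel. Equal tick counts on both wheels then still produce a curved estimated path.

```python
import math

def integrate_odometry(ticks, r_left, r_right, ticks_per_rev, track):
    """Dead-reckon a differential-drive pose (x, y, theta) from wheel
    encoder ticks. ticks is a list of (left_ticks, right_ticks) per step."""
    x = y = theta = 0.0
    for tl, tr in ticks:
        dl = 2 * math.pi * r_left * tl / ticks_per_rev   # left wheel travel
        dr = 2 * math.pi * r_right * tr / ticks_per_rev  # right wheel travel
        d = (dl + dr) / 2.0                 # forward displacement
        dtheta = (dr - dl) / track          # heading change
        x += d * math.cos(theta + dtheta / 2.0)
        y += d * math.sin(theta + dtheta / 2.0)
        theta += dtheta
    return x, y, theta

# Vehicle drives straight down a corridor: equal ticks on both wheels.
ticks = [(100, 100)] * 50
# Assumed geometry: nominal radius 0.10 m, track 0.5 m, 1000 ticks/rev.
true_pose = integrate_odometry(ticks, 0.10, 0.10, 1000, 0.5)
# The real left wheel is 2% smaller, but odometry uses the nominal radius,
# which is equivalent to integrating with mismatched radii:
est_pose = integrate_odometry(ticks, 0.098, 0.10, 1000, 0.5)
print(true_pose)  # straight line along x, theta = 0
print(est_pose)   # heading and lateral position drift despite equal ticks
```

The drift grows without bound with travelled distance: this is the defining weakness of purely incremental localization.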

Importance of Uncertainty Estimation. By estimating the uncertainty it is possible to detect and avoid accidents, and also to combine information from different sensors.

Uncertainty Estimation – Sensor Fusion. Event: the vehicle is also localized with another sensor referred to the environment (for example laser triangulation, a camera, etc.).

Uncertainty Estimation – Sensor Fusion. Without using uncertainty, the two estimates can only be combined with a simple average.

Uncertainty Estimation – Sensor Fusion. Using uncertainty, the estimates are combined with a weighted average: sensor fusion.
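The contrast between the two slides above can be shown in one dimension with made-up numbers: a simple average ignores how reliable each sensor is, while inverse-variance weighting pulls the fused estimate toward the more accurate sensor and always reduces the variance below that of either input.

```python
def fuse(z1, var1, z2, var2):
    """Inverse-variance weighted fusion of two independent estimates of the
    same quantity; the fused variance is always <= min(var1, var2)."""
    w1 = 1.0 / var1
    w2 = 1.0 / var2
    z = (w1 * z1 + w2 * z2) / (w1 + w2)
    var = 1.0 / (w1 + w2)
    return z, var

# Odometry says x = 10.0 m (sigma = 0.5 m); laser says x = 10.4 m (sigma = 0.1 m).
simple_avg = (10.0 + 10.4) / 2.0          # 10.2: treats both sensors equally
z, var = fuse(10.0, 0.5**2, 10.4, 0.1**2)
print(simple_avg)
print(z, var)  # pulled toward the more accurate laser estimate
```

Here the fused estimate lands near 10.38 m with a standard deviation below the laser's own 0.1 m, which is exactly why the uncertainty estimate is worth maintaining.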

Questions?

Incremental vs Global Localization. Main classification of vehicle localization: INCREMENTAL LOCALIZATION: the current vehicle pose at time t is evaluated with respect to the information obtained in the previous localization at time t-1. GLOBAL LOCALIZATION: the current vehicle pose at time t is evaluated with respect to a global reference system.

Incremental and Global Localization. Σ0 is the global reference system. The vehicle is globally localized with a direct estimation of H0,k, or incrementally localized using the concatenation of the estimations Hi,j.
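The concatenation of the Hi,j can be sketched with 2D homogeneous transforms (a toy trajectory, not from the lecture): the incremental pose H0,k is the product of the per-step transforms, so any error in one step propagates into all later poses.

```python
import math

def h(x, y, theta):
    """2D homogeneous transform (3x3): rotation by theta, translation (x, y)."""
    c, s = math.cos(theta), math.sin(theta)
    return [[c, -s, x], [s, c, y], [0.0, 0.0, 1.0]]

def matmul(a, b):
    return [[sum(a[i][k] * b[k][j] for k in range(3)) for j in range(3)]
            for i in range(3)]

# Each step: advance 1 m along the current x axis, then turn 90 degrees left.
steps = [h(1.0, 0.0, math.pi / 2), h(1.0, 0.0, math.pi / 2)]
H0k = [[1.0, 0.0, 0.0], [0.0, 1.0, 0.0], [0.0, 0.0, 1.0]]  # identity = Sigma_0
for Hi in steps:
    H0k = matmul(H0k, Hi)   # H0,k = H0,1 * H1,2 * ... * Hk-1,k
print(H0k[0][2], H0k[1][2])  # final position (1, 1), heading 180 degrees
```

Global localization estimates H0,k directly instead, so its error does not accumulate along the chain.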


Classification of the most-used sensors:
INCREMENTAL LOCALIZATION: encoders on wheels; gyroscope + magnetometers; laser scanner (comparison with the previous acquisition); camera looking at the floor.
GLOBAL LOCALIZATION: triangulation systems; ultrasound beacons; laser scanner (comparison with a map); camera looking at the ceiling.

Incremental and Global Localization.
Feature                          INCREMENTAL   GLOBAL
Drift in pose estimation         HIGH          NO
Measurement update rate          HIGH          LOW
Repeatability                    –             –
Needs environment information    NO            YES

Odometric – Global Navigation Fusion. First issue: time alignment, due to the different update rates of the two systems.
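One common way to handle the time-alignment issue (an assumed approach, not necessarily the one used in the lecture) is to interpolate the high-rate odometric estimate to the timestamp of the slower global measurement before fusing them.

```python
import bisect

def interpolate_pose(t, t0, p0, t1, p1):
    """Linearly interpolate a 1D pose sample to time t, with t0 <= t <= t1.
    (For a full planar pose, the heading must be interpolated on angles.)"""
    a = (t - t0) / (t1 - t0)
    return (1 - a) * p0 + a * p1

# Odometry at 100 Hz, laser at 10 Hz (made-up samples): align the odometric
# x estimate to the laser timestamp.
odo = [(0.00, 0.00), (0.01, 0.05), (0.02, 0.10), (0.03, 0.15)]  # (t, x)
t_laser = 0.015
times = [t for t, _ in odo]
i = bisect.bisect_right(times, t_laser) - 1   # bracketing odometry samples
x_at_laser = interpolate_pose(t_laser, odo[i][0], odo[i][1],
                              odo[i + 1][0], odo[i + 1][1])
print(x_at_laser)  # roughly 0.075, halfway between the two samples
```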

Odometric – Global Navigation Fusion. Second issue: the sensor fusion itself, and how to continue (Matlab example).

Incremental and Global Localization.
Feature                          INCREMENTAL   GLOBAL   SENSOR FUSION
Drift in pose estimation         HIGH          NO       NO
Measurement update rate          HIGH          LOW      HIGH
Repeatability                    –             –        HIGH, SMOOTH TRAJECTORY
Needs environment information    NO            YES      YES

Example: Use of encoders + gyro + laser triangulation … my first industrial AGV

Sensor Fusion. 1° STEP (a) and (b): fuse the incremental sensors (encoders + gyro): high frequency of update, but drift. 2° STEP: fuse the result with laser triangulation: no drift, but low repeatability (especially in motion or with a low number of reflectors). After the 1° and 2° STEP: high frequency of update and no drift.

1° STEP (a): fusion between the incremental systems. xR is calibrated as a function of the manoeuvre, and the increments are fused through the kinematic equations (we have already seen this example).

1° STEP (b): real-time covariance estimation. X is the pose (position and attitude); wk is the vector of the uncertainty parameters, modelled as white noise. This part takes into account the correlation as a function of time.
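A minimal sketch of real-time covariance estimation for the pose X = (x, y, theta), under an assumed unicycle odometry model (the lecture's exact model and noise parameters are not given here): the covariance is propagated through the linearized kinematics as P = F P Fᵀ + G Q Gᵀ, so the pose uncertainty grows step by step.

```python
import math

def propagate(P, pose, d, Q):
    """EKF-style covariance prediction for one odometry step of length d.
    P: 3x3 pose covariance; Q: 2x2 covariance of the noisy increment
    (d, dtheta), assumed white."""
    theta = pose[2]
    c, s = math.cos(theta), math.sin(theta)
    # F = df/dX: Jacobian of the kinematic update w.r.t. the pose
    F = [[1.0, 0.0, -d * s],
         [0.0, 1.0,  d * c],
         [0.0, 0.0,  1.0]]
    # G = df/du: Jacobian w.r.t. the noisy increment (d, dtheta)
    G = [[c, 0.0],
         [s, 0.0],
         [0.0, 1.0]]
    def mm(A, B):
        return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
                 for j in range(len(B[0]))] for i in range(len(A))]
    def T(A):
        return [list(r) for r in zip(*A)]
    FPFt = mm(mm(F, P), T(F))
    GQGt = mm(mm(G, Q), T(G))
    return [[FPFt[i][j] + GQGt[i][j] for j in range(3)] for i in range(3)]

P = [[0.0] * 3 for _ in range(3)]       # perfectly known initial pose
Q = [[0.01**2, 0.0], [0.0, 0.005**2]]   # assumed white noise on (d, dtheta)
pose = (0.0, 0.0, 0.0)                  # drive straight: theta stays 0
for _ in range(10):
    P = propagate(P, pose, 0.1, Q)
print(P[0][0], P[2][2])  # both variances grow monotonically with each step
```

The heading uncertainty also leaks into the lateral position through the off-diagonal terms, which is exactly the time correlation the slide refers to.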

2° STEP (a): estimation of the covariance of the laser triangulation as a function of the manoeuvre, using (1) the state of the encoders and (2) the laser quality factor. 2° STEP (b): fusion between the environment-referred and the incremental estimations.
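The second step can be sketched in one dimension (an assumed Kalman-style update with made-up numbers, not the lecture's implementation): the laser measurement covariance is inflated by a quality factor before fusing it with the incremental prediction, so a poor triangulation fix automatically carries less weight.

```python
def kalman_update(x_pred, P_pred, z, R_base, quality):
    """Fuse the incremental prediction (x_pred, P_pred) with a global
    measurement z. quality in (0, 1]: motion or few visible reflectors
    lowers the quality and inflates the measurement covariance R."""
    R = R_base / quality
    K = P_pred / (P_pred + R)        # Kalman gain
    x = x_pred + K * (z - x_pred)    # fused estimate
    P = (1.0 - K) * P_pred           # fused covariance
    return x, P

# Good laser fix: the fused estimate moves close to the measurement.
x, P = kalman_update(x_pred=10.0, P_pred=0.25, z=10.4, R_base=0.01, quality=1.0)
print(x, P)
# Poor fix (vehicle in motion, few reflectors): fusion trusts odometry more.
x2, P2 = kalman_update(10.0, 0.25, 10.4, 0.01, quality=0.1)
print(x2, P2)
```

In both cases the fused covariance stays below the prediction covariance, so the global sensor removes the odometric drift without discarding the high-rate incremental estimate.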

Confidence intervals: 2 sigma vs 30 sigma.

Delay

List of symbols:

X is the pose (position and attitude)