

Accuracy Evaluation of Stereo Vision Aided Inertial Navigation for Indoor Environments D. Grießbach, D. Baumbach, A. Börner, S. Zuev German Aerospace Center (DLR), Institute of Robotics and Mechatronics, Dept. of Data Processing for Optical Systems, Rutherfordstr. 2, Berlin

Overview
- Introduction
- Integrated Positioning System
- Inertial Navigation
- Stereo Vision
- Real Time Framework
- Experimental Results
- Conclusions and Outlook

Accuracy Evaluation of Stereo Vision Aided Inertial Navigation for Indoor Environments > Denis Grießbach > ISPRS 2013 > Cape Town

Motivation
- Navigation: determination of the 6 DoF pose (position, attitude)
- Path planning, collision avoidance, …

Introduction
- Assumption:
  - No single technology is able to provide accurate position and attitude
  - A multi-sensor fusion approach is needed
- DLR research aims at:
  - Generic developments
  - Indoor/outdoor capability
  - No infrastructure/external referencing
  - No maps (unknown environments)
  - No a priori assumptions
  - Passive system
  - Real time

Inertial Navigation – Inertial Measurement Unit
- Technology: inertial navigation, based on dead reckoning
- Core component: Inertial Measurement Unit (IMU)
  - Angular velocity [deg/s]
  - Acceleration [m/s²]
- Microelectromechanical system (MEMS) IMU:
  - Small
  - Lightweight
  - Low cost
  - Low energy consumption
  - Very robust
  - Already widely used

Inertial Navigation – Mechanization
- Strapdown mechanization for IMU integration
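The strapdown equations themselves are only shown graphically on the slide; a minimal single-step sketch of the idea, assuming a locally level navigation frame, small time steps, and gravity along -z (an illustrative simplification, not the paper's exact mechanization), could look like:

```python
import numpy as np

def quat_mult(q, r):
    """Hamilton product of two quaternions in (w, x, y, z) order."""
    w0, x0, y0, z0 = q
    w1, x1, y1, z1 = r
    return np.array([
        w0*w1 - x0*x1 - y0*y1 - z0*z1,
        w0*x1 + x0*w1 + y0*z1 - z0*y1,
        w0*y1 - x0*z1 + y0*w1 + z0*x1,
        w0*z1 + x0*y1 - y0*x1 + z0*w1,
    ])

def quat_to_rot(q):
    """Rotation matrix (body -> navigation frame) from a unit quaternion."""
    w, x, y, z = q
    return np.array([
        [1 - 2*(y*y + z*z), 2*(x*y - w*z),     2*(x*z + w*y)],
        [2*(x*y + w*z),     1 - 2*(x*x + z*z), 2*(y*z - w*x)],
        [2*(x*z - w*y),     2*(y*z + w*x),     1 - 2*(x*x + y*y)],
    ])

def strapdown_step(q, v, p, gyro, accel, dt, g=np.array([0.0, 0.0, -9.81])):
    """One dead-reckoning step: integrate the angular rate into attitude,
    then the gravity-compensated acceleration into velocity and position."""
    # attitude update: small-angle quaternion from the gyro increment
    dtheta = gyro * dt
    angle = np.linalg.norm(dtheta)
    if angle > 0:
        axis = dtheta / angle
        dq = np.concatenate(([np.cos(angle / 2)], np.sin(angle / 2) * axis))
    else:
        dq = np.array([1.0, 0.0, 0.0, 0.0])
    q = quat_mult(q, dq)
    q /= np.linalg.norm(q)
    # rotate specific force to the navigation frame, remove gravity, integrate
    a_nav = quat_to_rot(q) @ accel + g
    v = v + a_nav * dt
    p = p + v * dt + 0.5 * a_nav * dt**2
    return q, v, p
```

A level, stationary IMU measures a specific force of +9.81 m/s² upward; after gravity compensation the integrated velocity and position stay at zero, which is the sanity check usually applied to such a mechanization.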

Inertial Navigation – Mechanization
- Strapdown mechanization for IMU integration
- Integration errors:
  - Bias → drift
  - Noise → random walk
- MEMS IMUs aggravate the problem:
  - High noise
  - Less stable bias
  - Less stable scale factors
  - g-dependent errors
- Aiding systems are necessary!
  - GPS/Galileo
  - Camera (stereo vision)
  - …
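The two error mechanisms above can be reproduced numerically: a constant gyro bias integrates into an angle error that grows linearly with time, while white noise integrates into a random walk whose 1σ envelope grows with the square root of time. A small illustration (the bias and noise figures are hypothetical, not the sensor values from the paper):

```python
import numpy as np

rng = np.random.default_rng(0)
dt, n = 0.01, 100_000     # 100 Hz for brevity (the paper's IMU runs at 400 Hz)
bias = 0.01               # deg/s constant gyro bias (hypothetical)
sigma = 0.1               # deg/s white-noise level, 1 sigma (hypothetical)

# measured rate while the true rate is zero
gyro = bias + sigma * rng.standard_normal(n)
# integrated attitude error [deg]
angle = np.cumsum(gyro) * dt

t_total = n * dt
drift = bias * t_total                              # deterministic: grows ~ t
random_walk_1sigma = sigma * np.sqrt(t_total * dt)  # stochastic: grows ~ sqrt(t)
print(angle[-1], drift, random_walk_1sigma)
```

After 1000 s the deterministic drift dominates: 10 deg from the bias versus a random-walk envelope of about 0.3 deg, which is why bias stability matters so much for MEMS IMUs.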

Stereo Vision – Visual Odometry
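This slide is figure-only in the original. The core idea of stereo visual odometry — triangulate tracked features in two consecutive stereo frames and recover the rigid motion between the two point sets — can be sketched as follows (an illustrative setup, not the paper's exact pipeline; the closed-form Kabsch/Horn solution stands in for whatever robust estimator is actually used):

```python
import numpy as np

def triangulate(uv_left, uv_right, f, baseline, cx, cy):
    """Pinhole triangulation for a rectified stereo pair.
    uv_left/uv_right: (N, 2) pixel coordinates of matched features."""
    disparity = uv_left[:, 0] - uv_right[:, 0]
    z = f * baseline / disparity
    x = (uv_left[:, 0] - cx) * z / f
    y = (uv_left[:, 1] - cy) * z / f
    return np.column_stack([x, y, z])

def rigid_transform(P, Q):
    """Least-squares R, t with Q ~= P @ R.T + t (Kabsch/Horn)."""
    cp, cq = P.mean(axis=0), Q.mean(axis=0)
    H = (P - cp).T @ (Q - cq)
    U, _, Vt = np.linalg.svd(H)
    # guard against a reflection in the SVD solution
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
    R = Vt.T @ D @ U.T
    t = cq - R @ cp
    return R, t
```

Triangulating the same tracked features in frame k and frame k+1 and feeding the two point sets to `rigid_transform` yields the frame-to-frame camera motion, which is then chained into the trajectory.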

Real Time Data Handling

Experimental Results – Setup
- Visually aided navigation with:
  - Stereo camera system (15 Hz)
  - Low-cost MEMS-based IMU (400 Hz)
  - Low-cost MEMS-based inclinometer
- Sigma Point Kalman filter for sensor fusion
- "Given" prerequisites:
  - Synchronized measurements
  - Calibrated cameras
  - Calibrated IMU
  - Registered sensors
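The distinguishing step of a Sigma Point (unscented) Kalman filter is how it propagates uncertainty: instead of linearizing, it pushes a small deterministic set of sigma points through the nonlinear model and recovers mean and covariance from weighted sums. A minimal sketch of that transform (standard Merwe-style weights; parameter values are illustrative, not the paper's tuning):

```python
import numpy as np

def unscented_transform(mean, cov, f, alpha=0.1, beta=2.0, kappa=0.0):
    """Propagate a Gaussian (mean, cov) through a nonlinear function f
    using 2n+1 sigma points."""
    n = mean.size
    lam = alpha**2 * (n + kappa) - n
    # columns of S are the sigma-point offsets
    S = np.linalg.cholesky((n + lam) * cov)
    sigmas = np.vstack([mean, mean + S.T, mean - S.T])
    wm = np.full(2 * n + 1, 0.5 / (n + lam))   # mean weights
    wc = wm.copy()                             # covariance weights
    wm[0] = lam / (n + lam)
    wc[0] = wm[0] + (1 - alpha**2 + beta)
    Y = np.array([f(s) for s in sigmas])
    y_mean = wm @ Y
    diff = Y - y_mean
    y_cov = (wc[:, None] * diff).T @ diff
    return y_mean, y_cov
```

For a linear function the transform is exact, which makes a convenient self-check; its value over an EKF shows up when f is the nonlinear strapdown/measurement model.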

Experimental Results – Navigation

Experimental Results – Navigation
- Unknown indoor environment
- Path of about 317 m, covering 4 floors
- Closed loop for evaluation
- 21 similar runs
- Tracker uses about 60 features
- Frame-to-frame accuracy: 5 mm / 0.2 deg (2 mm / 0.1 deg for the viewing axis)
- Final distance error < 1 % of total path length
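The closed-loop criterion above is simple to state precisely: the distance between the first and last estimated position of a nominally closed path, divided by the travelled path length. A small helper (hypothetical, for illustrating the metric):

```python
import numpy as np

def closed_loop_error(positions):
    """Relative closed-loop error: gap between the first and last pose
    divided by the total travelled path length."""
    positions = np.asarray(positions, dtype=float)
    steps = np.diff(positions, axis=0)
    path_length = np.linalg.norm(steps, axis=1).sum()
    gap = np.linalg.norm(positions[-1] - positions[0])
    return gap / path_length
```

For the numbers in this evaluation, a 317 m path ending 2.7 m from its start (the mean absolute closed-loop error of the next slide) corresponds to 2.7 / 317, about 0.85 %, consistent with the < 1 % claim.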

Experimental Results – Closed Loop Error

Closed-loop error statistics over the 21 runs (mean and 1σ standard deviation for x [m], y [m], z [m] and absolute [m]); the absolute closed-loop error has a mean of 2.7 m with a 1σ standard deviation of 1.3 m.

Experimental Results – 3D Point Cloud
- Parallel processing chain with dense stereo matching using Semi-Global Matching (SGM)
- Combined with the navigation solution to generate a 3D point cloud
- Not yet real time
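Combining the dense matching output with the navigation solution boils down to back-projecting each valid disparity to a 3D point and transforming it with the camera pose at that frame. A sketch for a rectified pair (intrinsics and pose names are illustrative):

```python
import numpy as np

def disparity_to_world(disparity, f, baseline, cx, cy, R_wc, t_wc):
    """Back-project a rectified disparity image to 3D camera-frame points,
    then transform them into the world frame with the pose (R_wc, t_wc)
    taken from the navigation solution."""
    h, w = disparity.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    valid = disparity > 0            # zero disparity = no match
    z = f * baseline / disparity[valid]
    x = (u[valid] - cx) * z / f
    y = (v[valid] - cy) * z / f
    pts_cam = np.column_stack([x, y, z])
    return pts_cam @ R_wc.T + t_wc
```

Accumulating the per-frame outputs over the trajectory yields the combined 3D point cloud of the environment.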

Conclusion
- A framework for multi-sensor navigation was presented
- The framework was applied to visually aided inertial navigation with low-cost MEMS components and stereo vision
- Accuracy of < 1 % of the total path length (over 21 runs)
- Low-textured scenes cause short periods of pure inertial navigation
- Uncompensated IMU errors remain (scale error, g-sensitivity, …)
- Additional processing of a high-density 3D point cloud

Outlook
- Creation of reference data sets (camera, IMU, GPS, etc.) with ground truth measurements for indoor and outdoor environments
  - Synchronized
  - Calibrated
  - Registered
- Advancing the modelling and calibration of low-cost IMUs
- Integrating absolute position measurements:
  - GNSS measurements for outdoor use
  - RFID/Wi-Fi/Bluetooth measurements for indoor use
- Seamless outdoor/indoor navigation