An Inexpensive Method for Evaluating the Localization Performance of a Mobile Robot Navigation System
Harsha Kikkeri, Gershon Parent, Mihai Jalobeanu, and Stan Birchfield
Microsoft Robotics

Motivation
Goal: automatically measure the performance of a mobile robot navigation system.
Purpose:
- Internal comparison: how is my system improving over time?
- External comparison: how does my system compare to others?
Requirements:
- Repeatable: not just playback of a recorded file, but running the system again (with environment dynamics)
- Reproducible: others should be able to measure the performance of their system in their environment
- Comparable: need to compare solutions with different hardware and sensors, in different environments
- Inexpensive: cost should not be a barrier to use
We focus only on localization performance here.

Scalability
The system should scale:
- in space (large environments)
- in time (long runs)
- in variety (different types of environments)
Simplicity is key to scalability:
- low setup time
- easy calibration
- inexpensive components
- non-intrusive

Previous work
- Datasets (Radish, New College, SLAM datasets): do not always have ground truth
- SLAM with ground truth (Rawseeds, Freiburg, TUM): use prerecorded data, do not scale easily
- Qualitative evaluation (RoboCupRescue, RoboCup@Home): focus is on achieving a particular task
- Benchmarking initiatives (EURON, RoSta, PerMIS, RTP): have not yet created a definitive set of metrics/benchmarks for navigation
- Comparison on a small scale (Teleworkbench): small scale only
- Retroreflective markers and laser (Tong and Barfoot, ICRA 2011): requires a laser, subject to occlusion

Our approach
- A checkerboard pattern yields the 3D pose of the camera relative to the target (a sketch of this step follows below)
- Convert to the 2D pose of the robot on the floor
[Figure: robot-mounted camera viewing an overhead checkerboard landmark, with x and y axes marked on the landmark]
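As an illustration of the pose-estimation step, here is a minimal sketch (ours, not the authors' code) using OpenCV's checkerboard detector and PnP solver; the board layout, square size, and the calibrated intrinsics (K, dist) are assumptions:

```python
import cv2
import numpy as np

PATTERN = (7, 5)   # inner corners per row/column (assumed layout)
SQUARE = 0.03      # square size in meters (assumed)

def camera_pose_from_target(image, K, dist):
    """Return (R, t), the pose of the checkerboard target in the camera frame."""
    gray = cv2.cvtColor(image, cv2.COLOR_BGR2GRAY)
    found, corners = cv2.findChessboardCorners(gray, PATTERN)
    if not found:
        return None
    corners = cv2.cornerSubPix(
        gray, corners, (11, 11), (-1, -1),
        (cv2.TERM_CRITERIA_EPS + cv2.TERM_CRITERIA_MAX_ITER, 30, 1e-3))
    # 3D corner coordinates in the target frame (checkerboard plane z = 0)
    obj = np.zeros((PATTERN[0] * PATTERN[1], 3), np.float32)
    obj[:, :2] = np.mgrid[0:PATTERN[0], 0:PATTERN[1]].T.reshape(-1, 2) * SQUARE
    ok, rvec, tvec = cv2.solvePnP(obj, corners, K, dist)
    if not ok:
        return None
    R, _ = cv2.Rodrigues(rvec)
    return R, tvec
```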

A useful instrument
Laser level:
- Upward-facing laser provides a plumb-up line
- Downward-facing laser provides a plumb-down line
- Horizontal laser (not used)
- Self-leveling, so the plumb lines are parallel to gravity
Used to determine the point on the ground directly below the origin of the target.

Procedure
Calibration:
- internal camera parameters
- external camera parameters w.r.t. robot (position, tilt)
- floor parameters under each landmark (tilt)
Map-building:
- Build the map
- When under a landmark, the user presses a button
- Pose estimation + calibration → robot pose w.r.t. landmark
- Store robot pose w.r.t. map*
Runtime:
- Generate a sequence of waypoints
- When the robot thinks it is under a landmark,* pose estimation + calibration → robot pose w.r.t. landmark
- Error is the difference between the pose at runtime and the pose at map-building (see the sketch below)
*Note: any type of map can be used
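A minimal sketch (ours, under the definitions above) of that error computation for a 2D pose (x, y, θ) expressed w.r.t. the same landmark:

```python
import math

def pose_error(map_pose, run_pose):
    """Each pose is (x, y, theta) in meters/radians w.r.t. the same landmark.
    Returns (position error in mm, orientation error in degrees)."""
    dx = run_pose[0] - map_pose[0]
    dy = run_pose[1] - map_pose[1]
    dth = run_pose[2] - map_pose[2]
    dth = math.atan2(math.sin(dth), math.cos(dth))  # wrap to [-pi, pi]
    return 1000.0 * math.hypot(dx, dy), math.degrees(abs(dth))
```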

Coordinate systems
Frames: image, camera, robot, landmark, world.
- Calibration: image ↔ camera (internal camera parameters, 2D/3D Euclidean) and camera ↔ robot (external camera parameters, 3D Euclidean)
- Pose estimation: camera ↔ landmark (2D/3D Euclidean, relative metric)
- Localization (what we want): robot ↔ landmark (2D Euclidean), and optionally landmark ↔ world (2D Euclidean, absolute metric)
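As a rough illustration (assumed values, illustrative names), the 2D links of this chain can be composed as homogeneous transforms:

```python
import numpy as np

def se2(x, y, theta):
    """3x3 homogeneous matrix for a planar rigid motion."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s, x],
                     [s,  c, y],
                     [0.0, 0.0, 1.0]])

# robot w.r.t. world = (landmark w.r.t. world) @ (robot w.r.t. landmark)
T_world_landmark = se2(4.0, 2.0, 0.10)        # surveyed landmark pose (assumed)
T_landmark_robot = se2(0.02, -0.01, 0.005)    # from pose estimation + calibration
T_world_robot = T_world_landmark @ T_landmark_robot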

Camera-to-robot calibration
Need to determine:
- rotation between camera and robot (3 parameters)
- translation between camera and robot (3 parameters)
→ 6 parameters
If the floor were completely flat and the camera were mounted perfectly upright, then
  x_r = x − d_rc cos θ_rc
  y_r = y − d_rc sin θ_rc
  θ_r = θ − θ_a
where (x, y, θ) is the camera pose, (x_r, y_r, θ_r) is the robot pose, (d_rc, θ_rc) is the camera offset from the robot center, and θ_a is the camera roll.
But the floor is often not flat, and the camera is never upright.
[Figure: top-down view of the robot showing the camera, driving direction, and wheel base]

Camera-to-robot calibration
When the floor is not flat and the camera is not upright, then
  x_r = x − d_rc cos θ_rc − z sin τ_c cos(φ_c + θ) − z sin τ_f cos φ_f
  y_r = y − d_rc sin θ_rc − z sin τ_c sin(φ_c + θ) − z sin τ_f sin φ_f
  θ_r = θ − θ_a
Estimate:
- tilt of the camera w.r.t. the floor normal (τ_c)
- azimuth of the camera tilt plane w.r.t. the forward direction of the robot (φ_c)
- tilt of the floor w.r.t. gravity (τ_f)
- azimuth of the floor tilt plane w.r.t. the positive x axis of the landmark (φ_f)
Method: rotate the robot incrementally through 360 degrees; the rotation axis is perpendicular to the floor, so the optical axis traces a cone. (A transcription of the correction appears below.)
[Figure: side view showing the floor, gravity, floor normal, optical axis, radii r_c and r_f, and tilts τ_c, τ_f]
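A direct transcription of the correction above into a hypothetical helper (a sketch; the sin/cos split in the y-row is our reading of the slide, so treat it as an assumption):

```python
import math

def robot_pose_from_camera(x, y, theta, z,
                           d_rc, theta_rc, theta_a,  # camera offset and roll
                           tau_c, phi_c,             # camera tilt and its azimuth
                           tau_f, phi_f):            # floor tilt and its azimuth
    """Convert the camera pose (x, y, theta) at height z into the robot pose."""
    x_r = (x - d_rc * math.cos(theta_rc)
             - z * math.sin(tau_c) * math.cos(phi_c + theta)
             - z * math.sin(tau_f) * math.cos(phi_f))
    y_r = (y - d_rc * math.sin(theta_rc)
             - z * math.sin(tau_c) * math.sin(phi_c + theta)
             - z * math.sin(tau_f) * math.sin(phi_f))
    theta_r = theta - theta_a
    return x_r, y_r, theta_r
```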

Calibration geometry
(A sequence of slides builds up the construction; the figure shows the floor, the overhead landmark, gravity, the robot, and the camera center.)
- The axis of rotation is the floor normal, tilted by τ_f from gravity (and hence from the landmark's plumb line).
- The optical axis is tilted by τ_c from the axis of rotation, so as the robot rotates, the optical axis sweeps a cone.
- Pose estimation at one heading gives (x_1, z_1); after rotating the robot half a turn (the two headings are 180° apart), it gives (x_2, z_2).
From these,
  sin τ_c = (x_2 − x_1) / 2z
  sin τ_f = (x_2 + x_1) / 2z,  where z = (z_1 + z_2) / 2
Note: x_1 + (x_2 − x_1)/2 = (x_2 + x_1)/2.
Radius of the traced circle: r_c = z sin τ_c = (x_2 − x_1)/2.
Distance from the landmark center to the circle center: r_f = z sin τ_f = (x_2 + x_1)/2.
(A sketch of this computation follows.)
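A minimal sketch (ours, not the authors' code) of recovering the two tilts from the pair of pose estimates taken half a turn apart:

```python
import math

def tilts_from_spin(x1, z1, x2, z2):
    """(x1, z1), (x2, z2): landmark position from pose estimation at two
    robot headings 180 degrees apart."""
    z = 0.5 * (z1 + z2)
    tau_c = math.asin((x2 - x1) / (2.0 * z))  # camera tilt w.r.t. floor normal
    tau_f = math.asin((x2 + x1) / (2.0 * z))  # floor tilt w.r.t. gravity
    return tau_c, tau_f
```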

Calibration geometry (from real data)
[Figure: plots of the measured azimuth angles and tilt angles, and a top-down view of the traced circle]

Evaluating accuracy
- Mounted the camera to the carriage of a CNC machine
- Moved to different known (x, y, θ) and measured the pose
- Covered area: 1.3 × 0.6 m
→ Position error: μ = 5 mm, σ = 2 mm, max = 11 mm
→ Angular error: μ = 0.3°, σ = 0.2°, max = 1°

Evaluating accuracy
- Placed the robot at 20 random positions under one landmark
→ Position error usually < 20 mm; orientation error usually < 1°

Evaluating accuracy
- 15 landmarks across 2 buildings
- Placed the robot at 5 canonical positions
→ Position error usually < 20 mm; orientation error usually < 1°

Evaluating accuracy
Our accuracy is comparable to other systems:
- GTvision/GTlaser from Ceriani et al., AR 2009 (Rawseeds)
- mocap from Kümmerle et al., AR 2009
- retroreflective from Tong and Barfoot, ICRA 2011
Our system is scalable to large environments:
- scales to arbitrarily large environments
- scales to very large single-floor environments (with an additional step)

Evaluating accuracy
Two different buildings on the Microsoft campus.

Evaluating accuracy
- Automated runs in 2 different environments
- Accuracy comparable
- Easy to set up
- Easy to maintain

Computing global coordinates
Theodolite:
- A horizontal laser emanates from a pan-tilt head and reflects off a mirror
- Measures (w.r.t. gravity): horizontal distance to the mirror, pan angle to the mirror, and tilt angle to the mirror (not used)

Computing global coordinates
For target positions:
- Repeatedly measure the distance and angle for each triplet of targets with line-of-sight
→ 2D Euclidean coordinates of all targets in a common global coordinate system
- The high accuracy of the theodolite removes nearly all drift
- Drift can be checked by adding all angles around a loop and comparing with 360 degrees (a sketch of this check follows)
[Figure: loop of targets with measured lengths l_12, l_23, l_34, l_45, l_15, l_67, l_78 and the angles between them; theodolite and reflector]
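A small illustrative helper (hypothetical, following the slide's description) for that loop-closure check:

```python
def loop_drift_deg(pan_angles_deg):
    """Sum the measured pan angles around a closed loop of targets and
    return the deviation from the expected 360 degrees."""
    return sum(pan_angles_deg) - 360.0
```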

Computing global coordinates
For target orientation: place the reflector under several positions within the target (only 2 are needed).
Given l_1, l_2, α (from the theodolite) and t_length (known), find β.
Naïve solution (sensitive to noise):
  sin β = (l_1 − l_2 cos α) / t_length
Better solution (the key is to use only measured values):
  tan β = (l_1 − l_2 cos α) / (l_2 sin α),
where (l_1 − l_2 cos α)² + (l_2 sin α)² is the squared distance between the two reflector positions. (A sketch follows.)
[Figure: theodolite, reflector positions under a target of length t_length, with distances l_1, l_2 and angles α, β]
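A sketch contrasting the two estimates of β. The naïve form relies on the nominal t_length; the better form uses only measured values, with l_2 sin α as the denominator (our reconstruction of the slide's garbled formula, so treat it as an assumption):

```python
import math

def beta_naive(l1, l2, alpha, t_length):
    # sensitive to noise: relies on the nominal target length
    return math.asin((l1 - l2 * math.cos(alpha)) / t_length)

def beta_better(l1, l2, alpha):
    # uses only measured values l1, l2, alpha
    return math.atan2(l1 - l2 * math.cos(alpha), l2 * math.sin(alpha))
```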

Navigation contest
Microsoft and Adept are organizing the Kinect Autonomous Mobile Robot Contest at IROS 2014 in Chicago.

Conclusion
A system for evaluating the localization accuracy of navigation:
- Inexpensive
- Easy to set up
- Easy to maintain
- Highly accurate
- Scalable to arbitrarily large environments
- Scalable to arbitrarily long runs (in time or space)
- With a theodolite, global coordinates are possible
We have begun long-term, large-scale comparisons (results forthcoming).
Mobile robot navigation contest at IROS 2014.

Thanks!