Visual Odometry – MTP FY03 Year End Review, Oct 20-24, 2003. Yang Cheng, Machine Vision Group, Section 348. Phone:

Similar presentations
The fundamental matrix F

3D Reconstruction Using Aerial Images: A Dense Structure from Motion Pipeline. Ramakrishna Vedantam.
Odometry Error Modeling: Three novel methods to model random odometry error.
Literature Review: Safe Landing Zone Identification Presented by Keith Sevcik.
Luis Mejias, Srikanth Saripalli, Pascual Campoy and Gaurav Sukhatme.
Reducing Drift in Parametric Motion Tracking
Hybrid Position-Based Visual Servoing
IR Lab, 16th Oct 2007 Zeyn Saigol
Optimization & Learning for Registration of Moving Dynamic Textures Junzhou Huang 1, Xiaolei Huang 2, Dimitris Metaxas 1 Rutgers University 1, Lehigh University.
Computer Vision REU Week 2, Adam Kavanaugh. Video Canny: put Canny into a loop in order to process multiple frames of a video sequence.
MASKS © 2004 Invitation to 3D vision Lecture 11 Vision-based Landing of an Unmanned Air Vehicle.
Robust Video Stabilization Based on Particle Filter Tracking of Projected Camera Motion (IEEE 2009). Junlan Yang, University of Illinois, Chicago.
Autonomous Robot Navigation Panos Trahanias ΗΥ475 Fall 2007.
Adam Rachmielowski 615 Project: Real-time monocular vision-based SLAM.
Video Processing EN292 Class Project By Anat Kaspi.
A Self-Supervised Terrain Roughness Estimator for Off-Road Autonomous Driving David Stavens and Sebastian Thrun Stanford Artificial Intelligence Lab.
Introduction to Kalman Filter and SLAM Ting-Wei Hsu 08/10/30.
Active Simultaneous Localization and Mapping Stephen Tully, : Robotic Motion Planning This project is to actively control the.
Passive Object Tracking from Stereo Vision Michael H. Rosenthal May 1, 2000.
Discriminative Training of Kalman Filters P. Abbeel, A. Coates, M
Visual Odometry for Ground Vehicle Applications David Nister, Oleg Naroditsky, James Bergen Sarnoff Corporation, CN5300 Princeton, NJ CPSC 643, Presentation.
Computer Science Department Andrés Corrada-Emmanuel and Howard Schultz Presented by Lawrence Carin from Duke University Autonomous precision error in low-
Weighted Range Sensor Matching Algorithms for Mobile Robot Displacement Estimation Sam Pfister, Kristo Kriechbaum, Stergios Roumeliotis, Joel Burdick Mechanical.
Bootstrapping a Heteroscedastic Regression Model with Application to 3D Rigid Motion Evaluation Bogdan Matei Peter Meer Electrical and Computer Engineering.
Lecture 12: Structure from motion CS6670: Computer Vision Noah Snavely.
Overview and Mathematics Bjoern Griesbach
Vision Guided Robotics
Jason Li Jeremy Fowers Ground Target Following for Unmanned Aerial Vehicles.
1 DARPA TMR Program Collaborative Mobile Robots for High-Risk Urban Missions Second Quarterly IPR Meeting January 13, 1999 P. I.s: Leonidas J. Guibas and.
Kalman filter and SLAM problem
Zereik E., Biggio A., Merlo A. and Casalino G. EUCASS 2011 – 4-8 July, St. Petersburg, Russia.
Autonomous Robot Localisation, Path Planning, Navigation, and Obstacle Avoidance The Problem: –Robot needs to know where it is? where it’s going? is there.
Abstract Design Considerations and Future Plans In this project we focus on integrating sensors into a small electrical vehicle to enable it to navigate.
Real-time Dense Visual Odometry for Quadrocopters Christian Kerl
Driver’s View and Vehicle Surround Estimation using Omnidirectional Video Stream Abstract Our research is focused on the development of novel machine vision.
Localisation & Navigation
Landing a UAV on a Runway Using Image Registration Andrew Miller, Don Harper, Mubarak Shah University of Central Florida ICRA 2008.
3D SLAM for Omni-directional Camera
Vision-based Landing of an Unmanned Air Vehicle
Flow Separation for Fast and Robust Stereo Odometry [ICRA 2009]
Sérgio Ronaldo Barros dos Santos (ITA-Brazil)
PRE-DECISIONAL DRAFT: For Planning and Discussion Purposes Only Test Plan Review MSL Focused Technology Instrument Placement Validation Test Plan for 2D/3D.
Complete Pose Determination for Low Altitude Unmanned Aerial Vehicle Using Stereo Vision Luke K. Wang, Shan-Chih Hsieh, Eden C.-W. Hsueh 1 Fei-Bin Hsaio.
Young Ki Baik, Computer Vision Lab.
Forward-Scan Sonar Tomographic Reconstruction PHD Filter Multiple Target Tracking Bayesian Multiple Target Tracking in Forward Scan Sonar.
Binocular Stereo #1. Topics 1. Principle 2. binocular stereo basic equation 3. epipolar line 4. features and strategies for matching.
Visual SLAM. SPL Seminar (Fri), Young Ki Baik, Computer Vision Lab.
December 9, 2014. Computer Vision Lecture 23: Motion Analysis. Now we will talk about… Motion Analysis.
Real-Time Simultaneous Localization and Mapping with a Single Camera (Mono SLAM) Young Ki Baik Computer Vision Lab. Seoul National University.
Raquel A. Romano, Scientific Computing Seminar, May 12, 2004: Projective Geometry for Computer Vision.
Chapter 5 Multi-Cue 3D Model- Based Object Tracking Geoffrey Taylor Lindsay Kleeman Intelligent Robotics Research Centre (IRRC) Department of Electrical.
State Estimation for Autonomous Vehicles
Visual Odometry David Nister, CVPR 2004
Vision-based SLAM Enhanced by Particle Swarm Optimization on the Euclidean Group Vision seminar : Dec Young Ki BAIK Computer Vision Lab.
Lecture 9 Feature Extraction and Motion Estimation Slides by: Michael Black Clark F. Olson Jean Ponce.
Visual Odometry for Ground Vehicle Applications David Nistér, Oleg Naroditsky, and James Bergen Sarnoff Corporation CN5300 Princeton, New Jersey
3D head pose estimation from multiple distant views X. Zabulis, T. Sarmis, A. A. Argyros Institute of Computer Science, Foundation for Research and Technology.
G. Casalino, E. Zereik, E. Simetti, A. Turetta, S. Torelli and A. Sperindè EUCASS 2011 – 4-8 July, St. Petersburg, Russia.
Person Following with a Mobile Robot Using Binocular Feature-Based Tracking Zhichao Chen and Stanley T. Birchfield Dept. of Electrical and Computer Engineering.
Mobile Robot Localization and Mapping Using Range Sensor Data Dr. Joel Burdick, Dr. Stergios Roumeliotis, Samuel Pfister, Kristo Kriechbaum.
Basilio Bona DAUIN – Politecnico di Torino
Astrobiology Science and Technology for Exploring Planets (ASTEP) Mid-Year Review August 4, 2004 Robust Autonomous Instrument Placement for Rovers (JPL:
Correspondence and Stereopsis. Introduction Disparity – Informally: difference between two pictures – Allows us to gain a strong sense of depth Stereopsis.
Localization Life in the Atacama 2004 Science & Technology Workshop January 6-7, 2005 Daniel Villa Carnegie Mellon Matthew Deans QSS/NASA Ames.
Paper – Stephen Se, David Lowe, Jim Little
+ SLAM with SIFT Se, Lowe, and Little Presented by Matt Loper
Vision Based Motion Estimation for UAV Landing
Florian Shkurti, Ioannis Rekleitis, Milena Scaccia and Gregory Dudek
Presentation transcript:

Visual Odometry
Yang Cheng, Machine Vision Group, Section 348. Phone: /16/2003
MTP FY03 Year End Review – Oct 20-24, 2003

Outline of this Talk
– Brief history
– Algorithm
– Software structure and interface
– Software features
– Ground truth measurement
– Some results
– Future work

Brief History
– H. Moravec's PhD thesis, "Obstacle Avoidance and Navigation in the Real World by a Seeing Robot Rover," Stanford University, 1980.
– Larry Matthies' PhD thesis, "Dynamic Stereo Vision," Oct. 1989, CMU.
– A version of visual odometry in C was implemented at JPL in the early 1990s.
– A C++ version of visual odometry was implemented by the MTP Slope Navigation task led by Larry Matthies.
– Visual odometry was ported to CLARAty and demonstrated onboard motion estimation on Rocky 8.
– Visual odometry has been used successfully for slip compensation by the Slope Navigation task.
– Visual odometry has been officially integrated into the MER navigation software and demonstrated successfully.
– A few other versions of visual odometry have been developed in the academic and industrial communities.

Visual Odometry
Goal: use a stereo image sequence to track 3-D point features (landmarks) and thereby estimate the motion of the vehicle.
Pipeline (inputs: images and prior motion; output: estimated motion):
feature selection → feature stereo matching → feature gap analysis → feature tracking → rigidity test → least-median-of-squares → Schonemann motion estimation → maximum-likelihood motion estimation → visual odometry fusion.
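In code, the loop reads roughly as below. This C++ skeleton is only an illustration of the data flow between the stages named above; every type and function in it is a hypothetical stub, not the actual JPL/CLARAty interface.

    #include <vector>

    // Hypothetical stand-ins for the real data structures.
    struct StereoPair { /* left and right images */ };
    struct Landmark   { double prev[3]; double cur[3]; };  // 3-D feature, previous/current frame
    struct Motion     { double position[3]; double attitude[3]; double cov[6][6]; };

    // Stage stubs named after the pipeline boxes on the slide.
    std::vector<Landmark> selectFeatures(const StereoPair&) { return {}; }        // Forstner operator
    void stereoMatchFeatures(const StereoPair&, std::vector<Landmark>&) {}        // pyramid search
    void gapAnalysis(std::vector<Landmark>&) {}                                   // Gaussian error model
    void trackFeatures(const StereoPair&, const Motion&, std::vector<Landmark>&) {}
    void rigidityTest(std::vector<Landmark>&) {}                                  // reject non-rigid tracks
    Motion schonemannEstimate(const std::vector<Landmark>&) { return {}; }        // LMedS + closed form
    Motion maxLikelihoodEstimate(const std::vector<Landmark>&, const Motion&) { return {}; }
    Motion fuseMotion(const Motion& vo, const Motion&) { return vo; }

    // One visual odometry step: stereo pair and prior motion in, motion estimate out.
    Motion visualOdometryStep(const StereoPair& frame, const Motion& inputMotion) {
        std::vector<Landmark> lm = selectFeatures(frame);
        stereoMatchFeatures(frame, lm);
        gapAnalysis(lm);
        trackFeatures(frame, inputMotion, lm);        // predict and re-match in the new frame
        rigidityTest(lm);
        Motion init = schonemannEstimate(lm);         // robust initial motion
        Motion ml = maxLikelihoodEstimate(lm, init);  // iterative refinement with covariances
        return fuseMotion(ml, inputMotion);           // fuse with the input (e.g. wheel) motion
    }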

Feature (Landmark) Selection
A landmark is an image patch that must exhibit enough intensity variation to allow it to be localized in subsequent images. Landmarks are selected from an interest image computed over the input image with the Forstner operator.
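For illustration, a minimal Forstner-style interest measure can be computed from the local structure tensor. This sketch assumes a row-major grayscale image in a std::vector; it shows the standard operator, not the flight implementation.

    #include <vector>

    // Forstner interest measure: accumulate the structure tensor
    // N = [Ixx Ixy; Ixy Iyy] over a window around each pixel and score it with
    // w = det(N) / trace(N).  High w marks patches whose intensity variation
    // allows localization in both directions.
    std::vector<float> forstnerInterest(const std::vector<float>& img,
                                        int rows, int cols, int half) {
        std::vector<float> interest(img.size(), 0.0f);
        auto at = [&](int r, int c) { return img[r * cols + c]; };
        for (int r = half + 1; r < rows - half - 1; ++r) {
            for (int c = half + 1; c < cols - half - 1; ++c) {
                double ixx = 0.0, iyy = 0.0, ixy = 0.0;
                for (int dr = -half; dr <= half; ++dr) {
                    for (int dc = -half; dc <= half; ++dc) {
                        // Central-difference gradients.
                        double gx = 0.5 * (at(r + dr, c + dc + 1) - at(r + dr, c + dc - 1));
                        double gy = 0.5 * (at(r + dr + 1, c + dc) - at(r + dr - 1, c + dc));
                        ixx += gx * gx;  iyy += gy * gy;  ixy += gx * gy;
                    }
                }
                double tr = ixx + iyy, det = ixx * iyy - ixy * ixy;
                if (tr > 1e-9) interest[r * cols + c] = float(det / tr);
            }
        }
        return interest;  // landmarks are local maxima of this interest image
    }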

Feature Stereo Matching (Pyramid Search)
[Figure: a feature in the left image is matched in the right image by a coarse-to-fine pyramid search.]
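A coarse-to-fine search of this kind can be sketched as follows, assuming rectified stereo (1-D disparity search) and a simple SAD window score; the actual correlator and pyramid depth are not specified on the slide.

    #include <vector>
    #include <cmath>
    #include <limits>

    struct Image {
        int rows, cols;
        std::vector<float> pix;
        float at(int r, int c) const { return pix[r * cols + c]; }
    };

    // Halve the image (2x2 box filter) to build the next pyramid level.
    Image downsample(const Image& in) {
        Image out{in.rows / 2, in.cols / 2, std::vector<float>((in.rows / 2) * (in.cols / 2))};
        for (int r = 0; r < out.rows; ++r)
            for (int c = 0; c < out.cols; ++c)
                out.pix[r * out.cols + c] = 0.25f * (in.at(2*r, 2*c) + in.at(2*r, 2*c + 1) +
                                                     in.at(2*r + 1, 2*c) + in.at(2*r + 1, 2*c + 1));
        return out;
    }

    // Sum of absolute differences between windows at (r, cl) in L and (r, cr) in R.
    double sad(const Image& L, const Image& R, int r, int cl, int cr, int half) {
        double s = 0.0;
        for (int dr = -half; dr <= half; ++dr)
            for (int dc = -half; dc <= half; ++dc)
                s += std::fabs(L.at(r + dr, cl + dc) - R.at(r + dr, cr + dc));
        return s;
    }

    // Coarse-to-fine disparity search for the feature at (row, col).
    int pyramidStereoMatch(const Image& left, const Image& right,
                           int row, int col, int maxDisp, int half = 4, int levels = 3) {
        std::vector<Image> lp{left}, rp{right};
        for (int i = 1; i < levels; ++i) {
            lp.push_back(downsample(lp.back()));
            rp.push_back(downsample(rp.back()));
        }
        int disp = 0, lo = 0, hi = maxDisp >> (levels - 1);
        for (int lev = levels - 1; lev >= 0; --lev) {
            int r = row >> lev, c = col >> lev;
            double best = std::numeric_limits<double>::max();
            for (int d = lo; d <= hi; ++d) {
                int cr = c - d;  // right-image column for disparity d
                if (r - half < 0 || r + half >= lp[lev].rows ||
                    c - half < 0 || c + half >= lp[lev].cols ||
                    cr - half < 0 || cr + half >= rp[lev].cols) continue;
                double s = sad(lp[lev], rp[lev], r, c, cr, half);
                if (s < best) { best = s; disp = d; }
            }
            // At the next (finer) level, search only around the doubled estimate.
            lo = 2 * disp - 2;  if (lo < 0) lo = 0;
            hi = 2 * disp + 2;
        }
        return disp;  // full-resolution disparity
    }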

Feature Gap Analysis and Triangulation Error
[Figures: the gap between the left-camera and right-camera rays; triangulation error vs. feature location, vs. correlation peak, and vs. ray gap; the Gaussian error model.]
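The geometric part of the gap analysis is standard two-ray triangulation: find the closest approach of the left and right rays, take the midpoint as the feature position, and use the gap (and range) to scale the Gaussian error model. The error scaling at the end is an assumed, illustrative form, not the model on the slide.

    #include <cmath>

    struct Vec3 { double x, y, z; };
    static Vec3   add(Vec3 a, Vec3 b) { return {a.x + b.x, a.y + b.y, a.z + b.z}; }
    static Vec3   sub(Vec3 a, Vec3 b) { return {a.x - b.x, a.y - b.y, a.z - b.z}; }
    static Vec3   mul(Vec3 a, double s) { return {a.x * s, a.y * s, a.z * s}; }
    static double dot(Vec3 a, Vec3 b) { return a.x * b.x + a.y * b.y + a.z * b.z; }
    static double norm(Vec3 a) { return std::sqrt(dot(a, a)); }

    // Triangulate a feature from the two camera rays (center c, unit direction d).
    // Returns the midpoint of the segment of closest approach; *gap receives the
    // segment length, which is zero only if the rays actually intersect.
    Vec3 triangulate(Vec3 c0, Vec3 d0, Vec3 c1, Vec3 d1, double* gap) {
        Vec3 w = sub(c0, c1);
        double b = dot(d0, d1), e = dot(d0, w), f = dot(d1, w);
        double den = 1.0 - b * b;          // near zero for near-parallel rays
        if (den < 1e-12) den = 1e-12;
        double t0 = (b * f - e) / den;     // parameter along ray 0
        double t1 = (f - b * e) / den;     // parameter along ray 1
        Vec3 p0 = add(c0, mul(d0, t0));
        Vec3 p1 = add(c1, mul(d1, t1));
        *gap = norm(sub(p0, p1));
        return mul(add(p0, p1), 0.5);
    }

    // Assumed, illustrative error model: stereo range error grows roughly with
    // range squared, inflated by the ray gap (a poor match has a large gap).
    double rangeSigma(double range, double gap) { return 1e-3 * range * range + gap; }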

Feature Tracking
[Figure: a feature Qp from the previous stereo frame is predicted into the current frame and re-matched in the left and right images as Qc.]
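Prediction maps each landmark's previous 3-D position through the prior motion estimate and projects it into the current image, so the correlation search for Qc covers only a small window. The sketch below assumes simple pinhole intrinsics rather than the CAHVOR models actually used.

    // Prior motion (previous-to-current camera frame) and a pixel location.
    struct Pose  { double R[3][3]; double t[3]; };
    struct Pixel { double row, col; };

    // Predict where a landmark seen at p (3-D, previous camera frame) should
    // appear in the current image, given the prior motion and assumed pinhole
    // intrinsics: focal length f (pixels) and principal point (cr, cc).
    Pixel predictFeature(const Pose& prior, const double p[3],
                         double f, double cr, double cc) {
        double q[3];
        for (int i = 0; i < 3; ++i)
            q[i] = prior.R[i][0] * p[0] + prior.R[i][1] * p[1] +
                   prior.R[i][2] * p[2] + prior.t[i];
        // Pinhole projection, camera looking along +z.
        return {cr + f * q[1] / q[2], cc + f * q[0] / q[2]};
    }
    // The tracker then re-correlates the landmark template only within a window
    // (cf. VO_VO_TRACK_WINDOW_SIZE) around this predicted location.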

Motion Estimation (Least Squares vs. Maximum Likelihood)
Least-squares estimation:
– A closed-form solution: the rotation R, under an orthogonality constraint, is estimated first; the translation T is then estimated.
– Does not reflect the quality of the individual observations.
– It is fast, but the resulting motion estimates can be substantially inferior.
Maximum-likelihood estimation:
– A nonlinear optimization solution that fully reflects the error model.
– It is relatively slow, needs an initial estimate, and is sensitive to outliers.
– Its motion estimates are in general much superior to the least-squares estimates.

Least-Squares Estimation
Merit function: $\min_{R,T} \sum_i \| Q_{c,i} - (R\,Q_{p,i} + T) \|^2$
Orthogonality constraint: $R^\top R = I$, $\det R = 1$
Solution: with centroids $\bar Q_p, \bar Q_c$ and cross-covariance $H = \sum_i (Q_{p,i} - \bar Q_p)(Q_{c,i} - \bar Q_c)^\top = U \Sigma V^\top$, the rotation is $R = V U^\top$ (sign-corrected so that $\det R = 1$) and the translation is $T = \bar Q_c - R\,\bar Q_p$ (Schonemann / orthogonal Procrustes).
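In Eigen, the closed form is a few lines; this is a sketch of the standard orthogonal Procrustes (Kabsch) computation, not the original code.

    #include <Eigen/Dense>
    #include <vector>

    // Closed-form least-squares motion: find R (orthonormal, det = +1) and T
    // minimizing sum_i || Qc[i] - (R * Qp[i] + T) ||^2.
    void leastSquaresMotion(const std::vector<Eigen::Vector3d>& Qp,
                            const std::vector<Eigen::Vector3d>& Qc,
                            Eigen::Matrix3d& R, Eigen::Vector3d& T) {
        const double n = static_cast<double>(Qp.size());
        Eigen::Vector3d mp = Eigen::Vector3d::Zero(), mc = Eigen::Vector3d::Zero();
        for (size_t i = 0; i < Qp.size(); ++i) { mp += Qp[i]; mc += Qc[i]; }
        mp /= n; mc /= n;                                   // centroids
        Eigen::Matrix3d H = Eigen::Matrix3d::Zero();        // cross-covariance
        for (size_t i = 0; i < Qp.size(); ++i)
            H += (Qp[i] - mp) * (Qc[i] - mc).transpose();
        Eigen::JacobiSVD<Eigen::Matrix3d> svd(H, Eigen::ComputeFullU | Eigen::ComputeFullV);
        Eigen::Matrix3d D = Eigen::Matrix3d::Identity();
        D(2, 2) = (svd.matrixV() * svd.matrixU().transpose()).determinant(); // reflection guard
        R = svd.matrixV() * D * svd.matrixU().transpose();  // rotation is estimated first...
        T = mc - R * mp;                                    // ...then translation
    }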

Maximum-Likelihood Estimation
Merit function: $\min_{R,T} \sum_i r_i^\top W_i^{-1} r_i$, where $r_i = Q_{c,i} - (R\,Q_{p,i} + T)$ and $W_i$ is the covariance matrix of feature $i$.
Solution: linearize the merit function and determine the three attitude and three translation parameters iteratively (page 150 of Larry Matthies' thesis).
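One Gauss-Newton iteration of such a linearization might look as follows; the left-multiplied small-angle update and the 6x6 normal equations here are a generic choice, since the slide does not show the exact parameterization.

    #include <Eigen/Dense>
    #include <vector>

    // One Gauss-Newton step for min sum_i r_i^T W_i^{-1} r_i with
    // r_i = Qc[i] - (R * Qp[i] + T).  The update is a 6-vector
    // [dtheta; dt]: R <- exp([dtheta]x) * R,  T <- T + dt.
    void maxLikelihoodStep(const std::vector<Eigen::Vector3d>& Qp,
                           const std::vector<Eigen::Vector3d>& Qc,
                           const std::vector<Eigen::Matrix3d>& W,   // feature covariances
                           Eigen::Matrix3d& R, Eigen::Vector3d& T) {
        Eigen::Matrix<double, 6, 6> A = Eigen::Matrix<double, 6, 6>::Zero();
        Eigen::Matrix<double, 6, 1> b = Eigen::Matrix<double, 6, 1>::Zero();
        for (size_t i = 0; i < Qp.size(); ++i) {
            Eigen::Vector3d rp = R * Qp[i];
            Eigen::Vector3d r  = Qc[i] - rp - T;              // residual
            Eigen::Matrix3d S;                                // S = [rp]x (skew matrix)
            S <<      0, -rp.z(),  rp.y(),
                 rp.z(),       0, -rp.x(),
                -rp.y(),  rp.x(),       0;
            Eigen::Matrix<double, 3, 6> J;                    // dr/d[dtheta; dt]
            J.block<3,3>(0,0) = S;                            // dr/ddtheta = [R*Qp]x
            J.block<3,3>(0,3) = -Eigen::Matrix3d::Identity(); // dr/ddt = -I
            Eigen::Matrix3d Winv = W[i].inverse();
            A += J.transpose() * Winv * J;
            b -= J.transpose() * Winv * r;
        }
        Eigen::Matrix<double, 6, 1> d = A.ldlt().solve(b);    // normal equations
        Eigen::Vector3d dtheta = d.head<3>();
        double a = dtheta.norm();
        if (a > 1e-12)                                        // Rodrigues via angle-axis
            R = Eigen::AngleAxisd(a, dtheta / a).toRotationMatrix() * R;
        T += d.tail<3>();
    }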

Visual Odometry Interface
VOMotionStart(leftCam, rightCam, ParameterFile, leftImage, rightImage, leftDisp, InitialMotion)
VOMotion(leftImage, rightImage, leftDisp, InitialMotion, *estMotion)
– Camera models: CAHV, CAHVOR, CAHVORE
– Motion file: position[3], attitude[3], covariance[6][6]
– leftDisp: the disparity image generated by stereo processing
– The parameter file contains 48 parameters
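The slide gives only the call names and argument order. In the sketch below, all type names and exact signatures are assumptions made for illustration; it shows how a client might drive the interface, not the actual header.

    // Hypothetical type names; the real ones are not given on the slide.
    struct CameraModel;   // CAHV / CAHVOR / CAHVORE camera model
    struct Image;         // intensity image
    struct DispImage;     // disparity image from stereo processing
    struct VOMotionData { double position[3]; double attitude[3]; double covariance[6][6]; };

    // Declarations matching the calls listed on the slide (signatures assumed).
    int VOMotionStart(CameraModel* leftCam, CameraModel* rightCam,
                      const char* parameterFile, Image* leftImage, Image* rightImage,
                      DispImage* leftDisp, VOMotionData* initialMotion);
    int VOMotion(Image* leftImage, Image* rightImage, DispImage* leftDisp,
                 VOMotionData* initialMotion, VOMotionData* estMotion);

    // Sketch of a client loop: initialize once with the first stereo pair, then
    // call VOMotion with each new pair, seeding it with the prior motion estimate
    // (e.g. from wheel odometry) and reading back the refined motion + covariance.
    void runVO(CameraModel* lc, CameraModel* rc, int numFrames) {
        Image *li = nullptr, *ri = nullptr;
        DispImage* ld = nullptr;
        VOMotionData init = {}, est = {};
        VOMotionStart(lc, rc, "vo_params.txt", li, ri, ld, &init);
        for (int k = 1; k < numFrames; ++k) {
            // ... acquire the next stereo pair and disparity into li / ri / ld ...
            VOMotion(li, ri, ld, &init, &est);
            init = est;   // the current estimate seeds the next step
        }
    }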

Some VO Parameters

    VO_MAX_NUM_VO_FEATURES           600     features
    VO_MIN_NUM_VO_FEATURES           8       features
    VO_VO_MAX_PIXEL_OFFSET           1       pixels
    VO_MAX_VO_ITERATIONS             50      iterations
    VO_VO_CORR_WINDOW_ROWS           9       pixels
    VO_VO_CORR_WINDOW_COLS           9       pixels
    VO_VO_TRACK_WINDOW_SIZE          50      pixels
    VO_VO_SELECT_WINDOW_SIZE         9       pixels
    VO_VO_NUM_IMAGE_PAIRS            4       images
    VO_VO_IMAGE_ROWS                 640     pixels
    VO_VO_IMAGE_COLS                 480     pixels
    VO_SCHONEMANN_ITERATIONS         50      iterations
    VO_VO_MIN_DIST_FEATURE           0.5     meters
    VO_VO_MAX_DIST_FEATURE           20.0    meters
    VO_VO_AFFINE_MATCH_FLAG          0       Boolean
    VO_MAX_DELTA
    VO_DEFAULT_VO_MIN_CORRELATION    0.8     correlation

Arroyo Data Collection
1. An image sequence covering about 8 meters (20 cm steps) was collected at the JPL arroyo in March.
2. Onboard IMU, wheel odometry, and other data were collected.
3. Ground truth data (position and attitude) were collected with a total station.

Semiautomatic Rover Position and Attitude Measurement
– Total station and prism.
– Three points are measured at each stop; from these, the position and attitude can be determined (see the sketch below).
– Pitch, roll, heading error < 0.5 degree; position error < 3 mm.
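Given the three measured prism points and their known rover-frame coordinates, the pose follows from the same rigid-body fit used for motion estimation. This sketch assumes a ZYX (heading-pitch-roll) Euler convention, which may differ from the convention actually used.

    #include <Eigen/Dense>
    #include <cmath>

    // Rover pose from three prisms: body[i] are the (known, surveyed) prism
    // positions in the rover frame, world[i] the total-station measurements.
    // Solves world = R * body + t, then extracts heading/pitch/roll.
    void roverPose(const Eigen::Vector3d body[3], const Eigen::Vector3d world[3],
                   double& heading, double& pitch, double& roll, Eigen::Vector3d& t) {
        Eigen::Vector3d mb = (body[0] + body[1] + body[2]) / 3.0;
        Eigen::Vector3d mw = (world[0] + world[1] + world[2]) / 3.0;
        Eigen::Matrix3d H = Eigen::Matrix3d::Zero();
        for (int i = 0; i < 3; ++i)
            H += (body[i] - mb) * (world[i] - mw).transpose();
        Eigen::JacobiSVD<Eigen::Matrix3d> svd(H, Eigen::ComputeFullU | Eigen::ComputeFullV);
        Eigen::Matrix3d D = Eigen::Matrix3d::Identity();
        D(2, 2) = (svd.matrixV() * svd.matrixU().transpose()).determinant();
        Eigen::Matrix3d R = svd.matrixV() * D * svd.matrixU().transpose();
        t = mw - R * mb;
        heading = std::atan2(R(1, 0), R(0, 0));   // yaw about Z
        pitch   = std::asin(-R(2, 0));            // rotation about Y
        roll    = std::atan2(R(2, 1), R(2, 2));   // rotation about X
    }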

Motion Estimation (X)

Motion Estimation (Y)

Motion Estimation (Z)

Heading Estimation

Roll Estimation

Pitch Estimation

VO Fusion (front and rear Haz cameras)
[Plots: absolute error in X, Y, and Z vs. image step, for rear-only VO vs. fused rear + front VO.]

VO Fusion (front and rear Haz cameras)
[Plots: absolute error in heading, pitch, and roll, for rear-only VO vs. fused rear + front VO.]

Heading Estimation

Heading Estimation (1)

Heading Estimation (2)

Field Test
Location: Johnson Valley, Mojave Desert, CA; sandy slopes of up to 20-25°.
Logistics: 4 days, 4 people
–1.5 days of setup and break-down
–2.5 days of experimentation
Motivation: the Mars Yard is too small and has no slopes
–The size is mostly a factor for visual odometry, which looks far beyond the traverse distance

Sample of Images

Field Test Results: Visual Odometry vs. Ground Truth
– Error (0.37 m) is less than 1.5% of the distance traveled (29 m).
– Ground truth data were collected with a Leica Total Station and four rover-mounted prisms.
– (The marked area is expanded, and rotated, in the next slide.)

Field Test Results: Slip Compensation / Path Following
[Plot: desired path, carrot heading, visual odometry pose, and kinematics pose.]
There is a noticeable bias between the visual odometry pose and the kinematics pose in the y direction of many estimates. This is due to the downhill slippage of the rover; the bias is compensated for in the slip compensation algorithm, one form of which is sketched below.
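One simple way to realize such compensation: estimate slip as the discrepancy between the kinematic (wheel odometry) increment and the VO increment, and steer toward the carrot from the VO pose. This is an illustration of the idea, with assumed 2-D poses, not the task's actual algorithm.

    #include <cmath>

    struct Pose2D { double x, y, heading; };

    // Slip estimate for one step: the motion the wheels reported minus the
    // motion visual odometry actually observed, expressed in the rover frame.
    Pose2D slipEstimate(const Pose2D& kinPrev, const Pose2D& kinCur,
                        const Pose2D& voPrev,  const Pose2D& voCur) {
        double dxk = kinCur.x - kinPrev.x, dyk = kinCur.y - kinPrev.y;
        double dxv = voCur.x - voPrev.x,   dyv = voCur.y - voPrev.y;
        double c = std::cos(voPrev.heading), s = std::sin(voPrev.heading);
        // Rotate the world-frame discrepancy into the rover frame.
        Pose2D slip;
        slip.x =  c * (dxk - dxv) + s * (dyk - dyv);   // along-track slip
        slip.y = -s * (dxk - dxv) + c * (dyk - dyv);   // cross-track (downhill) slip
        slip.heading = (kinCur.heading - kinPrev.heading) - (voCur.heading - voPrev.heading);
        return slip;
    }

    // Path following: steer toward the carrot from the VO pose, so accumulated
    // slip does not bias the path the rover actually drives.
    double carrotHeading(const Pose2D& voPose, double carrotX, double carrotY) {
        return std::atan2(carrotY - voPose.y, carrotX - voPose.x);
    }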

MER VO Test (Rough)

MER Test

Target Approach
[Figures: (a) the target and (b) the designated target, showing changes in FOV; target tracking at time t1 and time t2 (avoiding an obstacle), from the 1st frame to the 37th frame after 10 m.]

Integrated 2D/3D Tracker (not tested onboard in the integrated system)
[Block diagram. Components include: hazard cameras, stereo process, depth map, and navigator; IMU, wheel odometry, and visual odometry feeding the rover pose estimate (+ uncertainty) and motion commands to the R8 locomotor; mast cameras, mast kinematics, computed mast pointing angles, and the pan-tilt controller; a designated target DT(r,c) in the right image converted to DT(x,y,z) by single-point stereo; a designated-point visual tracker (Harris multiple-feature extractor, non-flat-surface filter, adaptive view-based matching, normalized cross correlation, grow-feature window, surface normals) that tracks once with the 4 mm camera, then seeds and tracks again with the 16 mm camera; a 1st-tier 2D location estimate refined by an affine tracker into a 2nd-tier estimate, with large-uncertainty checks re-seeding the expected 2D location. Output: DT(r,c) at t0 + Δt.]
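Of these components, the normalized cross correlation matcher is simple to show in isolation. The sketch below scores a template at every offset in a search window around the expected 2-D location and applies a minimum-correlation test (cf. VO_DEFAULT_VO_MIN_CORRELATION); it is generic NCC, not the tracker's actual code.

    #include <vector>
    #include <cmath>

    struct Image {
        int rows, cols;
        std::vector<float> pix;
        float at(int r, int c) const { return pix[r * cols + c]; }
    };

    // Normalized cross correlation of template T against image I at (r0, c0).
    double ncc(const Image& I, const Image& T, int r0, int c0) {
        double mi = 0.0, mt = 0.0, n = double(T.rows) * T.cols;
        for (int r = 0; r < T.rows; ++r)
            for (int c = 0; c < T.cols; ++c) { mi += I.at(r0 + r, c0 + c); mt += T.at(r, c); }
        mi /= n; mt /= n;
        double num = 0.0, di = 0.0, dt = 0.0;
        for (int r = 0; r < T.rows; ++r)
            for (int c = 0; c < T.cols; ++c) {
                double a = I.at(r0 + r, c0 + c) - mi, b = T.at(r, c) - mt;
                num += a * b;  di += a * a;  dt += b * b;
            }
        return (di > 0.0 && dt > 0.0) ? num / std::sqrt(di * dt) : -1.0;
    }

    // Search a window around the expected 2-D location for the best NCC score;
    // a best score below minCorr means the target is considered lost.
    bool trackTemplate(const Image& img, const Image& tmpl,
                       int expRow, int expCol, int win, double minCorr,
                       int& bestRow, int& bestCol) {
        double best = -1.0;
        for (int r = expRow - win; r <= expRow + win; ++r)
            for (int c = expCol - win; c <= expCol + win; ++c) {
                if (r < 0 || c < 0 || r + tmpl.rows > img.rows || c + tmpl.cols > img.cols)
                    continue;
                double score = ncc(img, tmpl, r, c);
                if (score > best) { best = score; bestRow = r; bestCol = c; }
            }
        return best >= minCorr;
    }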

Tracking Results over Rough Terrain
[Video: tracking, with views from the 4 mm camera and the 16 mm camera.]

Ground Truth Data Collection System
– Automatically tracks the position of one prism and finds the three other prisms when the rover stops.
– Simplifies and speeds up the collection of ground truth data in field tests.
– Locates the rover frame in the world frame and in the initial rover frame.
– +/- 2 mm position accuracy; +/- 0.3° orientation accuracy.

Future Work
– A real-time visual odometry
– Data fusion with other sensors (IMU, …) to achieve better estimation
– Visual odometry applications