Visual Odometry for Ground Vehicle Applications. David Nistér, Oleg Naroditsky, James Bergen. Sarnoff Corporation, CN5300, Princeton, NJ. CPSC 643, Presentation 4. Journal of Field Robotics, Vol. 23, No. 1, 2006.

Most Closely Related Work
Stereo Visual Odometry – Agrawal 2007: integrates visual odometry with IMU/GPS and refines the result with bundle adjustment.
Monocular Visual Odometry – Campbell 2005: estimates 3-DOF camera motion with an optical-flow method.
A Visual Odometry System – Olson 2003: relies on an absolute orientation sensor; applies the Förstner interest operator in the left image and matches from left to right; uses approximate prior knowledge; iteratively selects landmark points.
Previous work of this paper – Nistér 2004: estimates ego-motion using a hand-held camera; real-time algorithm based on RANSAC.

Other Related Work: Simultaneous Localization and Mapping (SLAM); robot navigation with an omnidirectional camera.

Motivation. Olson: uses an absolute orientation sensor; Förstner interest operator in the left image, matched from left to right; approximate prior knowledge; iterative selection of landmark points. Nistér: uses purely visual information; Harris corner detection in all images with feature-to-feature tracking; no prior knowledge; RANSAC-based estimation in real time.

Feature Detection: Harris Corner Detection. Search for the local maxima of the corner strength s = det(A) - k (trace A)^2, where A = \sum_{(x,y) \in W} w(x,y) [ I_x^2, I_x I_y ; I_x I_y, I_y^2 ]. Here det is the determinant, trace is the trace, k is a constant, W is the window area, I_x and I_y are the derivatives of the input image, and w is the weight function.

Feature Detection: Four Sweeps to Compute the Strength. Compute the derivative products I_x^2, I_y^2 and I_x I_y with derivative filters. Sum them horizontally with a box filter. Sum them vertically with a box filter. Evaluate the corner strength from the summed products.
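For illustration only, a minimal Python/NumPy sketch of this separable four-sweep computation; the central-difference derivative filter, the 5-pixel box window, and k = 0.06 are assumptions, not necessarily the paper's exact choices.

import numpy as np
from scipy.ndimage import convolve1d

def harris_strength(img, k=0.06, win=5):
    """Harris corner strength via four separable sweeps:
    derivative products, horizontal box sum, vertical box sum, det - k*trace^2."""
    img = img.astype(np.float64)
    # Sweep 1: image derivatives (assumed central-difference filter) and their products.
    ix = convolve1d(img, [-1.0, 0.0, 1.0], axis=1)
    iy = convolve1d(img, [-1.0, 0.0, 1.0], axis=0)
    ixx, iyy, ixy = ix * ix, iy * iy, ix * iy
    box = np.ones(win)
    # Sweep 2: horizontal summation of the products over the window.
    ixx, iyy, ixy = (convolve1d(p, box, axis=1) for p in (ixx, iyy, ixy))
    # Sweep 3: vertical summation of the products over the window.
    ixx, iyy, ixy = (convolve1d(p, box, axis=0) for p in (ixx, iyy, ixy))
    # Sweep 4: corner strength s = det(A) - k * (trace A)^2.
    return ixx * iyy - ixy * ixy - k * (ixx + iyy) ** 2

Local maxima of the returned strength map then give the detected feature points.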

Feature Detection (figures): detected feature points; feature tracks superimposed through the images.

Feature Matching: Two-Directional Matching. Compute the normalized correlation over a patch region between feature points in two consecutive input images. Match feature points within a circular search area, keeping only pairs that have the maximum correlation in both directions.
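A minimal sketch of this mutual-consistency matching in Python; the helper names, the fixed-size patches, and the 50-pixel search radius are illustrative assumptions, not the paper's implementation.

import numpy as np

def ncc(a, b):
    """Normalized cross-correlation between two equally sized patches."""
    a = a - a.mean()
    b = b - b.mean()
    denom = np.sqrt((a * a).sum() * (b * b).sum()) + 1e-12
    return float((a * b).sum() / denom)

def mutual_matches(patches1, patches2, pts1, pts2, radius=50.0):
    """Match features in both directions; keep pairs that are each other's best."""
    n1, n2 = len(patches1), len(patches2)
    score = np.full((n1, n2), -np.inf)
    for i in range(n1):
        for j in range(n2):
            # Only consider candidates inside the circular search area.
            if np.linalg.norm(pts1[i] - pts2[j]) <= radius:
                score[i, j] = ncc(patches1[i], patches2[j])
    best12 = score.argmax(axis=1)   # best match in image 2 for each feature in image 1
    best21 = score.argmax(axis=0)   # best match in image 1 for each feature in image 2
    return [(i, int(best12[i])) for i in range(n1)
            if np.isfinite(score[i, best12[i]]) and best21[best12[i]] == i]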

Robust Estimation: The Monocular Scheme. Sample the matched points into groups of five. Treat each group as a five-point relative pose problem. Use RANSAC to select the best-supported hypothesis. Estimate the camera motion from the supporting matches. Chain the current estimate into the coordinate frame of the previous one.
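As a rough illustration of this step, the sketch below uses OpenCV's built-in five-point RANSAC (cv2.findEssentialMat) and pose recovery rather than the paper's own solver; the calibration matrix K and the matched point arrays are assumed to come from the matching stage.

import cv2

def monocular_motion(pts1, pts2, K):
    """Relative pose between two frames from matched points (Nx2 float arrays)."""
    # Five-point relative pose hypotheses scored inside RANSAC.
    E, inliers = cv2.findEssentialMat(pts1, pts2, K, method=cv2.RANSAC,
                                      prob=0.999, threshold=1.0)
    # Decompose the essential matrix; the translation is recovered only up to scale.
    _, R, t, _ = cv2.recoverPose(E, pts1, pts2, K, mask=inliers)
    return R, t, inliers

Successive (R, t) estimates are then composed so that each new pose is expressed in the coordinate frame of the previous one, as the slide describes.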

Robust Estimation: The Stereo Scheme. Match the feature points between the stereo images, then triangulate them into 3D points. Estimate the camera motion with RANSAC from the 3D points observed in consecutive frames.
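A condensed sketch of one way to realize this scheme: triangulation from a calibrated stereo pair followed by robust 3D-to-2D pose estimation. The projection matrices P_left and P_right, the matched point arrays, and the use of cv2.solvePnPRansac are assumptions for illustration, not necessarily the paper's exact pose solver.

import cv2

def stereo_motion(P_left, P_right, left_pts, right_pts, next_left_pts, K):
    """Triangulate stereo matches into 3D points, then estimate the motion to the
    next frame from 3D-2D correspondences with RANSAC."""
    # Triangulate: left_pts / right_pts are 2xN arrays of matched stereo points.
    X_h = cv2.triangulatePoints(P_left, P_right, left_pts, right_pts)
    X = (X_h[:3] / X_h[3]).T                      # Nx3 Euclidean 3D points
    # Robust pose of the next (left) camera from 3D points and their 2D projections.
    ok, rvec, tvec, inliers = cv2.solvePnPRansac(X, next_left_pts.T, K, None)
    R, _ = cv2.Rodrigues(rvec)                    # rotation vector -> rotation matrix
    return ok, R, tvec, inliers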

Experiments: Different Platforms

Experiments: Speed and Accuracy

Experiments: Visual Odometry vs. Differential GPS

Experiments: Visual Odometry vs. Inertial Navigation System (INS)

Experiments: Visual Odometry vs. Wheel Recorder

Conclusion and Future Work. Conclusion: a real-time ego-motion estimation system that works with both a monocular camera and a stereo head; the results are accurate and robust. Future work: integrate visual odometry with a Kalman filter; use sampling methods with multimodal distributions.