Real-time Dense Visual Odometry for Quadrocopters Christian Kerl, 11.05.2012

Real-time Dense Visual Odometry for Quadrocopters Christian Kerl

Outline Motivation Hardware & Software Approach Problems Ideas

Motivation
Quadrocopters need sensors to fly in unknown environments
– Motion
– Position
– Obstacles
Restricted on-board sensors
– IMU
– Visual navigation (no GPS)
Restricted computing resources
=> Autonomous system

Motivation
Standard approach to visual odometry:
– Sparse feature tracking in intensity / color images
– Examples: Jakob, ETH Zurich, TU Graz, MIT
– On-board frame rates 10 Hz
Our approach:
– Using full RGB-D image information
– No feature tracking

Hardware – Asctec Pelican

Hardware – Asctec Pelican
[Photo callouts: IMU, AutoPilot Board, Atom Board]

Hardware – Asctec Pelican
IMU
– 3 axis magnetometer, gyroscope, accelerometer
AutoPilot Board
– Highlevel + Lowlevel Processor (ARM)
Atom Board
– Intel Atom Z GHz
– 1 GB RAM
– 7 Mini-USB Ports
– Wireless LAN
600 g payload

Software – Asctec Pelican
ROS drivers for Asctec Pelican from ETH Zurich
– Nonlinear dynamic inversion for position control
– Luenberger observer for data fusion
– Updated version using an Extended Kalman Filter to be presented at ICRA 2012
– Needs absolute position input from an external source
– Allows commanding accelerations, velocities, or positions

Hardware – Asus Xtion Pro Live
24 bit RGB image
16 bit depth image
30 Hz
150 g
+ On-camera RGB and depth image registration
+ Time-synchronized depth and RGB image
- Rolling shutter
- Auto exposure

Approach Estimate transformation minimizing squared intensity error (energy minimization)

Approach
Linearize the error function and minimize => solve the normal equations
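The equations on this slide are images and did not survive the transcript; the following is a reconstruction in standard notation (warp w, stacked residual r, Jacobian J), not a verbatim copy of the slides:

```latex
% Photometric energy over all pixels x_i of the first image
E(\xi) = \sum_i \big( I_2(w(x_i, \xi)) - I_1(x_i) \big)^2
% First-order linearization of the stacked residual
r(\xi) \approx r(0) + J \xi
% Setting dE/d\xi = 0 yields the normal equations
J^\top J \, \xi = -J^\top r
```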

Analysis
Estimate transformation minimizing squared intensity error (energy minimization)
[Plots: energy along X translation and Y translation]

Hovering Image Data from Quadrocopter

Trajectory along camera z-axis Image Data from Quadrocopter

Problems
– Motion blur
– Auto exposure
– Dynamic objects (humans)

Problems – Motion Blur

Problems – Auto Exposure

Problems – Dynamic Objects

Ideas
– Weighted Least Squares
– Initial motion estimate between 2 consecutive frames from IMU data fusion
– Multiple iterations per level, convergence checks
– Regularization term to minimize / constrain least squares solution
– Minimization of intensity and depth error

Ideas – Weighted Least Squares
Assign smaller weights to outlier residuals => weight calculation

Ideas – Weighted Least Squares Influence function – Tukey weight – Huber weight
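As a sketch of the two influence functions named on the slide (the constants and function names below are mine, not from the slides; the constants are the usual 95%-efficiency tuning values):

```python
import numpy as np

# Standard tuning constants for 95% asymptotic efficiency under Gaussian noise
HUBER_K = 1.345
TUKEY_B = 4.6851

def huber_weight(r, k=HUBER_K):
    """Huber: w(r) = 1 for |r| <= k, else k/|r| (outliers weighted down linearly)."""
    a = np.abs(r)
    return np.where(a <= k, 1.0, k / np.maximum(a, 1e-12))

def tukey_weight(r, b=TUKEY_B):
    """Tukey biweight: w(r) = (1 - (r/b)^2)^2 for |r| <= b, 0 beyond (hard cutoff)."""
    w = (1.0 - (r / b) ** 2) ** 2
    return np.where(np.abs(r) <= b, w, 0.0)
```

The Tukey weight fully suppresses gross outliers, while the Huber weight only damps them, which is why the two behave differently on dynamic objects.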

Ideas – Weighted Least Squares Weighted error
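Written out in standard iteratively-reweighted-least-squares notation (a reconstruction; the slide equation is not in the transcript), the weighted error and the resulting weighted normal equations are:

```latex
% Each residual is down-weighted by w_i = w(r_i) from the influence function
E(\xi) = \sum_i w(r_i) \, r_i(\xi)^2
% Weighted normal equations, with W = diag(w_1, \dots, w_n)
J^\top W J \, \xi = -J^\top W r
```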

Ideas – Weighted Least Squares
Influence on energy function
[Plots: X translation w/o weights vs. w/ Huber weights]

Ideas – Weighted Least Squares
Influence on energy function
[Plots: Y translation w/o weights vs. w/ Huber weights]

Ideas – Weighted Least Squares
– Robustification with respect to dynamic objects
– Slightly degrades tracking performance
– How to choose parameter b?
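One common answer to the parameter question (my suggestion, not something stated on the slides) is to tie b to a robust estimate of the residual scale, recomputed each iteration, e.g. via the median absolute deviation:

```python
import numpy as np

def robust_scale(residuals):
    """MAD scale estimate; the factor 1.4826 makes it consistent with the
    standard deviation for Gaussian-distributed inliers."""
    med = np.median(residuals)
    return 1.4826 * np.median(np.abs(residuals - med))
```

For the Tukey weight one would then set b = 4.6851 * robust_scale(r), so the cutoff adapts to the current residual distribution instead of being a fixed constant.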

Ideas – Initialization from IMU
Use transformation from IMU data fusion as initial estimate

Ideas – Multiple Iterations
– Perform multiple optimization steps per image pyramid level
– Stop when increment below threshold
– Bad frames / diverging results can be recognized and skipped
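The iteration scheme above can be sketched as a toy Gauss-Newton loop (names and structure are mine; the real system would build the residual and Jacobian from the warped RGB-D images at each pyramid level):

```python
import numpy as np

def optimize_level(residual_fn, jacobian_fn, xi, max_iters=10, eps=1e-6):
    """Gauss-Newton on one pyramid level with a convergence check:
    stop once the increment norm drops below eps."""
    for _ in range(max_iters):
        r = residual_fn(xi)
        J = jacobian_fn(xi)
        delta = np.linalg.solve(J.T @ J, -J.T @ r)
        xi = xi + delta
        if np.linalg.norm(delta) < eps:
            break
    return xi

def coarse_to_fine(levels, xi):
    """Run levels coarsest-first, warm-starting each from the previous result."""
    for residual_fn, jacobian_fn in levels:
        xi = optimize_level(residual_fn, jacobian_fn, xi)
    return xi
```

A divergence check fits naturally into the same loop: if the increment or the error grows instead of shrinking, the frame can be flagged as bad and skipped.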

Summary / Discussion
– Weighted Least Squares needs more work (especially weight calculation)
– Initialization from IMU promising
– Multiple iterations for increased accuracy and divergence detection promising, but computationally expensive
– Jumps in trajectory are really problematic! => Ideas welcome!