Real-time Dense Visual Odometry for Quadrocopters Christian Kerl
Outline
– Motivation
– Hardware & Software
– Approach
– Problems
– Ideas
Motivation
– Quadrocopters need sensors to fly in unknown environments: motion, position, obstacles
– Restricted on-board sensors: IMU, visual navigation (no GPS)
– Restricted computing resources
– Autonomous system
Motivation
Standard approach to visual odometry:
– Sparse feature tracking in intensity / color images
– Examples: Jakob, ETH Zurich, TU Graz, MIT
– On-board frame rates: 10 Hz
Our approach:
– Use the full RGB-D image information
– No feature tracking
Hardware – Asctec Pelican
Hardware – Asctec Pelican
[Photo: components labeled IMU, AutoPilot Board, Atom Board]
Hardware – Asctec Pelican
– IMU: 3-axis magnetometer, gyroscope, accelerometer
– AutoPilot Board: high-level + low-level processor (ARM)
– Atom Board: Intel Atom Z GHz, 1 GB RAM, 7 Mini-USB ports, Wireless LAN
– 600 g payload
Software – Asctec Pelican
– ROS drivers for the Asctec Pelican from ETH Zurich
– Nonlinear dynamic inversion for position control
– Luenberger observer for data fusion
– Updated version using an Extended Kalman Filter to be presented at ICRA 2012
– Needs absolute position input from an external source
– Allows commanding accelerations, velocities, or positions
Hardware – Asus Xtion Pro Live
– 24-bit RGB image
– 16-bit depth image
– 30 Hz
– 150 g
+ On-camera RGB and depth image registration
+ Time-synchronized depth and RGB images
- Rolling shutter
- Auto exposure
Approach
Estimate the camera motion xi by minimizing the squared intensity error over all pixels (energy minimization):
E(xi) = sum_i ( I2( w(x_i, xi) ) - I1(x_i) )^2
where w(x, xi) warps pixel x from the first image into the second using its depth and the rigid-body motion xi.
Approach
Linearization with a first-order Taylor expansion of the residuals around the current estimate:
minimize sum_i ( r_i + J_i * delta_xi )^2
=> solve the normal equations J^T J delta_xi = -J^T r
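A minimal 1-D sketch of this linearize-and-solve scheme, using a hypothetical toy intensity signal instead of the actual RGB-D images (all names here are illustrative, not from the real pipeline):

```python
import math

# Toy 1-D "scene" intensity; the second "image" is the scene shifted
# by the true (unknown) camera translation t_true.
def scene(x):
    return math.exp(-x * x)

t_true = 0.3
I1 = scene                          # reference image
I2 = lambda x: scene(x - t_true)    # image after camera translation

xs = [i * 0.1 - 2.0 for i in range(41)]   # pixel coordinates

t = 0.0  # initial motion estimate
for _ in range(20):
    JtJ, Jtr = 0.0, 0.0
    for x in xs:
        r = I2(x + t) - I1(x)       # photometric residual
        # image gradient of the warped image (central differences)
        J = (I2(x + t + 1e-4) - I2(x + t - 1e-4)) / 2e-4
        JtJ += J * J
        Jtr += J * r
    dt = -Jtr / JtJ                 # solve the (scalar) normal equation
    t += dt
    if abs(dt) < 1e-10:
        break
```

In the real system the unknown is a 6-DoF twist and J^T J is a 6x6 matrix accumulated over all valid depth pixels, but the structure of each Gauss-Newton step is the same.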
Analysis
Estimate transformation minimizing squared intensity error (energy minimization)
[Plots: energy function over X translation and over Y translation]
Hovering Image Data from Quadrocopter
Trajectory along camera z-axis Image Data from Quadrocopter
Problems
– Motion blur
– Auto exposure
– Dynamic objects (humans)
Problems – Motion Blur
Problems – Auto Exposure
Problems – Dynamic Objects
Ideas
– Weighted least squares
– Initial motion estimate between two consecutive frames from IMU data fusion
– Multiple iterations per pyramid level, convergence checks
– Regularization term to constrain the least-squares solution
– Minimization of intensity and depth error
Ideas – Weighted Least Squares
Assign smaller weights to outlier residuals => weight calculation via an influence function
Ideas – Weighted Least Squares
Influence function:
– Tukey weight
– Huber weight
Ideas – Weighted Least Squares
Weighted error: E(xi) = sum_i w(r_i) * r_i^2
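A small sketch of the two weight functions and their effect on the error. The tuning constants k and b are the textbook defaults for these M-estimators, not values from this work (the slides leave b as an open question):

```python
def huber_weight(r, k=1.345):
    """Huber weight: 1 inside the band |r| <= k, k/|r| outside."""
    return 1.0 if abs(r) <= k else k / abs(r)

def tukey_weight(r, b=4.685):
    """Tukey biweight: smooth decay, exactly 0 beyond |r| > b."""
    if abs(r) > b:
        return 0.0
    u = r / b
    return (1.0 - u * u) ** 2

# Weighted squared error: large residuals (outliers, e.g. pixels on a
# moving person) contribute far less than in the unweighted sum.
residuals = [0.1, -0.2, 0.05, 8.0]   # last one is an outlier
E_plain = sum(r * r for r in residuals)
E_huber = sum(huber_weight(r) * r * r for r in residuals)
```

Huber keeps a linear influence for outliers, while Tukey rejects them entirely, which matches the trade-off on the next slides: stronger robustness to dynamic objects, but a risk of discarding useful pixels if the cut-off is too tight.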
Ideas – Weighted Least Squares
Influence on the energy function
[Plots: X translation w/o weights vs. w/ Huber weights]
[Plots: Y translation w/o weights vs. w/ Huber weights]
Ideas – Weighted Least Squares
+ Robustification with respect to dynamic objects
- Slightly degrades tracking performance
Open question: how to choose the parameter b?
Ideas – Initialization from IMU Use transformation from IMU data fusion as initial estimate
Ideas – Multiple Iterations
– Perform multiple optimization steps per image pyramid level
– Stop when the increment falls below a threshold
– Bad frames / diverging estimates can be recognized and skipped
Summary / Discussion
– Weighted least squares needs more work (especially the weight calculation)
– Initialization from IMU is promising
– Multiple iterations per level promise increased accuracy and divergence detection, but are computationally expensive
– Jumps in the trajectory are really problematic => ideas welcome!