Vision-Based Motion Estimation for UAV Landing
Cory Sharp, Omid Shakernia
Department of EECS, University of California at Berkeley
Outline
- Motivation
- Vision-based ego-motion estimation
- Evaluation of motion estimates
- Vision system hardware/software
- Landing target design/tracking
- Active camera control
- Flight videos
Motivation
Goal: Autonomous UAV landing on a ship's flight deck
Challenges:
- Hostile operating environments: high winds, pitching flight deck, ground effect
- UAV undergoing changing nonlinear dynamics
Why the vision sensor?
- Passive sensor (for stealth)
- Gives relative UAV motion to the flight deck
(U.S. Navy photo)
Objective for Vision-Based Landing
Preliminaries
- The motion of the camera relative to the inertial frame is given by g = (R, T) in SE(3)
- Given a fixed point p on the landing pad, its coordinates in the camera frame are q = R p + T
- The body angular and linear velocities of the camera are denoted ω and v
- The coordinates q of a point in the camera frame satisfy dq/dt = ω × q + v
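A minimal numpy sketch of the camera-frame point kinematics, assuming the common convention dq/dt = ω × q + v (the slide's own equation images are not reproduced, so the sign convention here is an assumption):

```python
import numpy as np

def hat(w):
    """Skew-symmetric matrix of w, so that hat(w) @ q == np.cross(w, q)."""
    return np.array([[0.0,  -w[2],  w[1]],
                     [w[2],  0.0,  -w[0]],
                     [-w[1], w[0],  0.0]])

def point_velocity(q, omega, v):
    """Velocity of a fixed world point's camera-frame coordinates:
    dq/dt = hat(omega) @ q + v (sign convention assumed)."""
    return hat(omega) @ q + v
```

The hat operator turns the cross product into a matrix multiplication, which is what lets the motion constraints later in the deck be written as linear systems.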
Camera Model
- Calibrated pinhole camera
- Perspective projection: the image of a point q = (X, Y, Z) is denoted x = q/Z = (X/Z, Y/Z, 1)
- Important identity: x × x = 0 (a vector crossed with itself vanishes)
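A sketch of the calibrated pinhole model in numpy (the slide's own equations were images; this is the standard form):

```python
import numpy as np

def project(q):
    """Calibrated pinhole projection: the homogeneous image of a
    camera-frame point q = (X, Y, Z) is x = q / Z = (X/Z, Y/Z, 1)."""
    X, Y, Z = q
    return np.array([X / Z, Y / Z, 1.0])
```

The identity x × x = 0 is what allows an "up to scale" equality between two homogeneous image vectors to be rewritten as an exact cross-product (or hat-matrix) equation, as in the planar essential constraint on the next slide.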
Planar Essential Constraint
- All points q on the landing pad satisfy N^T q = d, where N is the unit normal of the landing plane and d its distance from the camera
- Image correspondences x1, x2 satisfy the planar essential constraint x2 × ((R + (1/d) T N^T) x1) = 0
- (Figure: current camera position and desired camera position, related through the planar constraint by feature points on the landing pad)
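The constraint can be checked numerically: for points on the plane N^T q = d, the homography H = R + (1/d) T N^T maps one image to the other up to scale, so the cross product of x2 with H x1 vanishes. A sketch with hypothetical numbers:

```python
import numpy as np

def rot_z(theta):
    """Rotation about the camera z-axis (used as an example motion)."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])

def planar_H(R, T, N, d):
    """Homography induced by the plane N^T q = d between the two views."""
    return R + np.outer(T, N) / d

# Synthetic check (all numbers hypothetical):
N, d = np.array([0.0, 0.0, 1.0]), 5.0          # plane Z = 5 in view 1
R, T = rot_z(0.1), np.array([0.2, -0.1, 0.3])  # relative camera motion
p = np.array([1.0, 2.0, 5.0])                  # point on the plane (N.p = d)
q1, q2 = p, R @ p + T                          # camera-frame coordinates
x1, x2 = q1 / q1[2], q2 / q2[2]                # homogeneous image points
H = planar_H(R, T, N, d)
residual = np.cross(x2, H @ x1)                # planar constraint: ~ 0
```

Because N^T q1 = d for plane points, H q1 = R q1 + T = q2 exactly, so the residual is zero up to floating-point error.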
Planar Differential Essential Constraint
- For optical flow, image points satisfy the differential planar essential constraint
- We have developed a technique to uniquely estimate the differential planar essential matrix B
- We have developed an algorithm to decompose B into its motion and structure parameters
Differential Motion Estimation
- The differential planar essential constraint on B stacks into a linear system F b = u, where u is the vector of image velocities, F is a matrix of image points, and b is the vector of entries of B
- With at least 4 image points, F has rank 8
- Using linear least squares, B is recovered up to ker(F): one component is the LLSE of B, and the other spans ker(F)
- Observation: the remaining one-dimensional ambiguity can be resolved, so B is solved for uniquely
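The exact entries of F depend on the form of the constraint (not reproduced on the slide), but the linear-algebra step itself — a least-squares solve plus a one-dimensional kernel extracted from the SVD — can be sketched generically:

```python
import numpy as np

def recover_up_to_kernel(F, u, tol=1e-10):
    """Least-squares solution of F b = u together with a basis of ker(F).
    Any b_ls + alpha * kernel fits the data equally well; a separate
    constraint (as on the slide) must fix alpha."""
    b_ls, *_ = np.linalg.lstsq(F, u, rcond=None)
    _, s, Vt = np.linalg.svd(F)
    rank = int(np.sum(s > tol * s[0]))
    kernel = Vt[rank:].T            # columns span ker(F)
    return b_ls, kernel

# Synthetic rank-8 system in 9 unknowns (numbers hypothetical):
rng = np.random.default_rng(0)
F = rng.standard_normal((12, 8)) @ rng.standard_normal((8, 9))  # rank 8
b_true = rng.standard_normal(9)
u = F @ b_true
b_ls, kernel = recover_up_to_kernel(F, u)
```

With rank(F) = 8 and 9 unknowns, the kernel is one-dimensional, matching the slide's observation that one extra scalar condition suffices to pin down B.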
Vision Performance Evaluation
Performance evaluation of the vision sensor:
- under varying levels of noise in image correspondences and optical flow
- under different camera distances to the landing plane
- under different camera motions relative to the landing plane
Observations:
- the planar algorithm performs better under measurement noise than the general 8-point algorithm
- motion estimates improve as the camera approaches the landing pad
Noise Sensitivity
(Plots: discrete case and differential case)
Observation: the planar case is more robust to noise than the general case
Depth Sensitivity
(Plots, discrete and differential cases: translation and rotation mean error in degrees vs. the depth-variation ratio (zmax − zmin)/zmin, for the planar and 8-point algorithms at a noise level of 0.5 pixels)
Observations:
- Motion estimates improve as the camera approaches the landing pad
- Translation estimates are more sensitive to noise than rotation estimates
Motion Sensitivity
(Plots, discrete and differential cases: translation and rotation mean error vs. translation–rotation axis pairing (X-X through Z-Z), for the 8-point and planar algorithms at a noise level of 3 pixels)
Observation: estimates are worst when the translation is parallel to the surface normal
Vision in Control Loop
(Block diagram: the vision computer performs image processing and corner finding, feature point correspondence, motion estimation, and camera pan/tilt control; it exchanges the helicopter state and motion estimates over RS-232 with the navigation computer, which runs the control strategy and the Vehicle Control Language)
UAV Testbed
Vision System Hardware
- Ampro embedded PC (Little Board P5/x): low-power Pentium 233 MHz running Linux
- 440 MB flashdisk HD, robust to body vibration
- Runs the motion estimation algorithm and controls the PTZ camera
- Motion estimation algorithms written and optimized in C++ using LAPACK, giving motion estimates at 30 Hz
Vision System Software
Nonlinear Motion Estimation
- Minimize the reprojection error using Newton-Raphson iteration
- Gaussian elimination is used to solve for dλ
- Iteratively, this drives dβ to 0
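A sketch of the idea, not the authors' exact parameterization: Newton-style refinement of a hypothetical camera translation T by minimizing reprojection error, with each update obtained by Gaussian elimination (np.linalg.solve) on the normal equations:

```python
import numpy as np

def project_xy(q):
    """Image coordinates (X/Z, Y/Z) of a camera-frame point."""
    return q[:2] / q[2]

def refine_translation(points, obs, T0, iters=10):
    """Newton/Gauss-Newton minimization of reprojection error over a
    translation T (a simplified stand-in for the slide's unknowns)."""
    T = np.array(T0, dtype=float)
    for _ in range(iters):
        res, J = [], []
        for p, x in zip(points, obs):
            X, Y, Z = p + T
            res.append(x - np.array([X / Z, Y / Z]))
            # Jacobian of the projection with respect to T
            J.append([[1 / Z, 0.0, -X / Z**2],
                      [0.0, 1 / Z, -Y / Z**2]])
        r, J = np.concatenate(res), np.concatenate(J)
        T += np.linalg.solve(J.T @ J, J.T @ r)  # Gaussian elimination step
    return T

# Hypothetical scene: recover a known translation from exact observations
points = np.array([[1.0, 0.0, 5.0], [0.0, 1.0, 6.0],
                   [-1.0, 1.0, 4.0], [2.0, -1.0, 7.0]])
T_true = np.array([0.3, -0.2, 0.5])
obs = np.array([project_xy(p + T_true) for p in points])
T_est = refine_translation(points, obs, T0=[0.0, 0.0, 0.0])
```

On noise-free data the iteration drives the update (the analogue of dβ on the slide) to zero and recovers the true motion.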
Pan/Tilt Camera Control
- Feature tracking issue: feature points can leave the field of view
- Pan/tilt control increases the usable range of motion of the UAV
- The pan/tilt controller drives all feature points toward the center of the image
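One simple way to realize such a controller — a proportional law on the feature centroid; the gain and sign conventions here are hypothetical, not taken from the slides:

```python
import numpy as np

def pan_tilt_rates(features, gain=0.5, center=(0.0, 0.0)):
    """Proportional pan/tilt command that drives the centroid of the
    tracked feature points toward the image center.
    features: (N, 2) array of image coordinates."""
    cx, cy = np.mean(features, axis=0)
    pan_rate = -gain * (cx - center[0])   # cancel horizontal offset
    tilt_rate = -gain * (cy - center[1])  # cancel vertical offset
    return pan_rate, tilt_rate
```

Centering the centroid keeps all features well inside the field of view, which is what extends the UAV's range of motion relative to a fixed camera.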
Coordinate Frames
Flight Test Results
Vision-Based State Estimate, RMS Error
- Position error within 5 cm
- Rotation error within 5 degrees
Vision Ground Station
Flight Video
Pitching Landing Deck
- Landing deck simulates the motion of a ship at sea
- 6 electrically actuated cylindrical shafts
- Motion parameters: sea state (frequency and amplitude of waves), ship speed, direction into waves
- Stiffened aluminum construction
- Dimensions: 8' × 6'
Moving Landing Pad
Hovering Above Deck
Landing on Deck
Papers Published
- Cory Sharp, Omid Shakernia, Shankar Sastry, "A Vision System for Landing an Unmanned Aerial Vehicle," submitted to ICRA 2001.
- Omid Shakernia, Yi Ma, T. John Koo, Shankar Sastry, "Landing an Unmanned Air Vehicle: Vision Based Motion Estimation and Nonlinear Control," Asian Journal of Control, Vol. 1, No. 3, Sept. 1999.
- Omid Shakernia, Yi Ma, Joao Hespanha, Shankar Sastry, "Vision Guided Landing of an Unmanned Air Vehicle," IEEE Conf. on Decision and Control, Phoenix, Arizona, Dec. 1999.