Vision Based Motion Estimation for UAV Landing

Presentation transcript:

Vision Based Motion Estimation for UAV Landing
Cory Sharp, Omid Shakernia
Department of EECS, University of California at Berkeley

Outline
- Motivation
- Vision based ego-motion estimation
- Evaluation of motion estimates
- Vision system hardware/software
- Landing target design/tracking
- Active camera control
- Flight videos

Motivation
Goal: Autonomous UAV landing on a ship's flight deck.
Challenges:
- hostile operating environments: high winds, pitching flight deck, ground effect
- UAV undergoing changing nonlinear dynamics
Why the vision sensor?
- passive sensor (for stealth)
- gives relative UAV motion to the flight deck
(U.S. Navy photo)

Objective for Vision Based Landing

Preliminaries
The motion of the camera relative to the inertial frame is given by g(t) = (R(t), T(t)) ∈ SE(3). Given a fixed point p on the landing pad, its coordinates in the camera frame are X(t) = R(t) p + T(t). The body angular and linear velocities (ω, v) are defined by ω̂ = Ṙ Rᵀ and v = Ṫ − ω̂ T. The coordinates of a point in the camera frame then satisfy Ẋ = ω̂ X + v.

Camera Model
Calibrated pinhole camera with perspective projection: the image of a point X = (X₁, X₂, X₃)ᵀ in the camera frame is denoted x = X / X₃, so that X = λ x with scale λ = X₃ (the depth). An important identity used throughout: x̂ x = 0, where x̂ is the skew-symmetric matrix satisfying x̂ y = x × y.
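As a minimal numerical sketch (illustrative function names, not the flight code), the projection model and the identity above can be checked directly:

```python
import numpy as np

def hat(w):
    """Skew-symmetric matrix so that hat(w) @ y == np.cross(w, y)."""
    return np.array([[0.0, -w[2], w[1]],
                     [w[2], 0.0, -w[0]],
                     [-w[1], w[0], 0.0]])

def project(X):
    """Perspective projection of a 3-D point X to a homogeneous image point."""
    return X / X[2]

X = np.array([1.0, 2.0, 4.0])    # point in the camera frame, depth 4
x = project(X)                   # image point (0.25, 0.5, 1)
lam = X[2]                       # the scale is the depth

assert np.allclose(lam * x, X)               # X = lambda * x
assert np.allclose(hat(x) @ x, np.zeros(3))  # identity: hat(x) x = 0
```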

Planar Essential Constraint
All points X on the landing pad lie on a plane, Nᵀ X = d, where N is the unit surface normal and d the distance to the plane. Image correspondences x₁ (desired camera position) and x₂ (current camera position) of feature points on the landing pad then satisfy the planar essential constraint x̂₂ H x₁ = 0, with H = R + (1/d) T Nᵀ.
[Figure: current and desired camera positions viewing feature points on the landing pad, related by the planar constraint.]
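The constraint can be verified on a synthetic planar scene (a sketch with illustrative motion values; `rodrigues` is a helper assumed here, not from the original slides):

```python
import numpy as np

def hat(w):
    return np.array([[0.0, -w[2], w[1]],
                     [w[2], 0.0, -w[0]],
                     [-w[1], w[0], 0.0]])

def rodrigues(w):
    """Rotation matrix exp(hat(w)) via Rodrigues' formula."""
    th = np.linalg.norm(w)
    if th < 1e-12:
        return np.eye(3)
    K = hat(w / th)
    return np.eye(3) + np.sin(th) * K + (1 - np.cos(th)) * K @ K

# Plane N^T X = d in the first (desired) camera frame
N, d = np.array([0.0, 0.0, 1.0]), 2.0
R = rodrigues(np.array([0.05, -0.1, 0.02]))   # illustrative camera motion
T = np.array([0.3, -0.2, 0.1])
H = R + np.outer(T, N) / d                    # planar homography

for a, b in [(0, 0), (1, 0), (0, 1), (0.5, -0.5)]:
    X1 = np.array([a, b, d])                  # point on the plane Z = d
    X2 = R @ X1 + T                           # same point, second frame
    x1, x2 = X1 / X1[2], X2 / X2[2]           # image points
    # planar essential constraint: hat(x2) H x1 = 0
    assert np.allclose(np.cross(x2, H @ x1), 0.0, atol=1e-12)
```

This holds because H x₁ = (R X₁ + T (Nᵀ X₁)/d)/Z₁ = X₂/Z₁, which is parallel to x₂.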

Planar Differential Essential Constraint
For optical flow, an image point x on the landing pad and its velocity ẋ satisfy the differential planar essential constraint x̂ ẋ = x̂ B x, where B = ω̂ + (1/d) v Nᵀ. We have developed a technique to uniquely estimate the differential planar essential matrix B, and an algorithm to decompose B into its motion and structure parameters.

Differential Motion Estimation
Stacking the constraint x̂ B x = x̂ ẋ over the tracked features yields a linear system F b = c, where b is the vector of entries of B, F is a matrix built from the image points, and c is a vector of image velocities. With at least 4 image points, F has rank 8. Linear least squares therefore recovers B only up to ker(F): the minimum-norm solution gives the LLSE of B, and since x̂ x = 0 for every image point, ker(F) is spanned by the entries of the identity matrix. Observation: the remaining one-parameter family B + γI can be resolved to solve uniquely for B.
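A minimal sketch of this least-squares recovery on synthetic data (not the onboard implementation; resolving the B + γI ambiguity by forcing the middle eigenvalue of the symmetric part of B to zero is the standard continuous-homography normalization, assumed here rather than taken from the slides):

```python
import numpy as np

def hat(w):
    return np.array([[0.0, -w[2], w[1]],
                     [w[2], 0.0, -w[0]],
                     [-w[1], w[0], 0.0]])

def estimate_B(xs, xdots):
    """LLSE of the differential planar essential matrix B from image points
    xs and image velocities xdots, via hat(x) B x = hat(x) xdot.
    With b = vec(B) (column-major), hat(x) B x = kron(x^T, hat(x)) b."""
    F = np.vstack([np.kron(x, hat(x)) for x in xs])            # 3n x 9, rank 8
    c = np.concatenate([hat(x) @ xd for x, xd in zip(xs, xdots)])
    b0, *_ = np.linalg.lstsq(F, c, rcond=None)                 # min-norm LLSE
    B0 = b0.reshape(3, 3, order="F")
    # ker(F) = span{vec(I)} since hat(x) x = 0.  Fix B = B0 + gamma*I using
    # the fact that sym(B) = sym(v N^T)/d has middle eigenvalue zero.
    gamma = -np.linalg.eigvalsh((B0 + B0.T) / 2)[1]
    return B0 + gamma * np.eye(3)

# Synthetic planar scene: plane Z = 2, illustrative motion (omega, v)
N, d = np.array([0.0, 0.0, 1.0]), 2.0
omega, v = np.array([0.1, -0.2, 0.3]), np.array([0.5, 0.2, -0.1])
B_true = hat(omega) + np.outer(v, N) / d

xs, xdots = [], []
for a, b in [(0, 0), (1, 0), (0, 1), (1, 1), (-1, 0.5)]:
    X = np.array([a, b, d])                        # point on the landing plane
    Xdot = B_true @ X                              # planar points: Xdot = B X
    x = X / X[2]
    xdot = Xdot / X[2] - X * Xdot[2] / X[2] ** 2   # induced image velocity
    xs.append(x); xdots.append(xdot)

B = estimate_B(xs, xdots)
assert np.allclose(B, B_true, atol=1e-8)
```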

Vision Performance Evaluation
Performance of the vision sensor was evaluated:
- under varying levels of noise in image correspondences and optical flow
- under different camera distances to the landing plane
- under different camera motions relative to the landing plane
Observations:
- the planar algorithm performs better under measurement noise than the general 8-point algorithm
- motion estimates improve as the camera approaches the landing pad

Noise Sensitivity
[Plots: estimation error vs. noise level, discrete case and differential case.]
Observation: the planar case is more robust to noise than the general case.

Depth Sensitivity
[Plots, differential case: translation and rotation mean error (degrees) vs. depth-variation ratio (zmax − zmin)/zmin at a noise level of 0.5 pixels, comparing the planar and 8-point algorithms; discrete case analogous.]
Observations:
- motion estimates improve as the camera approaches the landing pad
- translation estimates are more sensitive to noise than rotation estimates

Motion Sensitivity
[Plots, discrete case: translation and rotation mean error vs. translation-rotation axis pairing (X-X through Z-Z) at a noise level of 3 pixels, comparing the planar and 8-point algorithms; differential case analogous.]
Observation: estimates are worst when the translation is parallel to the surface normal.

Vision in Control Loop
[Block diagram: on the vision computer, image processing and corner finding feed feature point correspondence and then motion estimation, which also drives camera pan/tilt control; the vision computer exchanges helicopter state and motion estimates with the navigation computer over RS-232, where the control strategy issues commands through the Vehicle Control Language.]

UAV Testbed

Vision System Hardware
Ampro embedded PC (Little Board P5/x):
- low-power Pentium 233 MHz running Linux
- 440 MB flashdisk HD, robust to body vibration
- runs the motion estimation algorithm
- controls the PTZ camera
Motion estimation algorithms:
- written and optimized in C++ using LAPACK
- give motion estimates at 30 Hz

Vision System Software

Nonlinear Motion Estimation
Minimize the reprojection error using Newton-Raphson iteration: Gaussian elimination solves for the update δλ at each step, and iterating drives δβ to 0.
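The general scheme can be illustrated with a Gauss-Newton-style reprojection-error minimization (a sketch under assumed names, refining only a camera translation T with a finite-difference Jacobian; the actual onboard solver iterates over the full pose):

```python
import numpy as np

def project(X):
    """Pinhole projection to 2-D image coordinates."""
    return X[:2] / X[2]

def residuals(T, P, u):
    """Stacked reprojection errors of 3-D points P translated by T."""
    return np.concatenate([project(Xi + T) - ui for Xi, ui in zip(P, u)])

def refine_translation(P, u, T0, iters=10, eps=1e-6):
    """Newton-type iteration: linearize the residuals, solve the normal
    equations for the update, and repeat until the update vanishes."""
    T = T0.astype(float)
    for _ in range(iters):
        r = residuals(T, P, u)
        # finite-difference Jacobian of the residuals w.r.t. T
        J = np.column_stack([
            (residuals(T + eps * e, P, u) - r) / eps
            for e in np.eye(3)
        ])
        dT = np.linalg.solve(J.T @ J, -J.T @ r)   # Gauss-Newton step
        T = T + dT
    return T

# Synthetic check: recover a known translation from exact observations
P = [np.array(p, float) for p in [(0, 0, 5), (1, 0, 5), (0, 1, 6), (1, 1, 7)]]
T_true = np.array([0.2, -0.1, 0.5])
u = [project(Xi + T_true) for Xi in P]
T = refine_translation(P, u, np.zeros(3))
assert np.allclose(T, T_true, atol=1e-5)
```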

Pan/Tilt Camera Control
Feature tracking issue: features can leave the field of view. Pan/tilt increases the effective range of motion of the UAV; the pan/tilt control law drives the feature points toward the center of the image.
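A centering law of this kind can be sketched as a simple proportional controller (gains, signs, and function names are illustrative assumptions, not the actual control law from the slides):

```python
import numpy as np

def pan_tilt_command(features, gain=0.5):
    """Proportional pan/tilt command driving the centroid of the tracked
    feature points toward the image center (origin of normalized image
    coordinates).  Positive pan/tilt conventions are assumed."""
    centroid = np.mean(features, axis=0)   # (u, v) centroid of the features
    pan = -gain * centroid[0]              # horizontal offset -> pan rate
    tilt = -gain * centroid[1]             # vertical offset -> tilt rate
    return pan, tilt

# Features displaced to the upper right: command re-centers the camera
features = np.array([[0.3, 0.2], [0.5, 0.4], [0.4, 0.3]])
pan, tilt = pan_tilt_command(features)
assert pan < 0 and tilt < 0   # camera moves back toward the feature centroid
```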

Coordinate Frames

Flight Test Results

Vision Based State Estimate, RMS Error
Position error to within 5 cm
Rotation error to within 5 deg

Vision Ground Station

Flight Video

Pitching Landing Deck
Landing deck to simulate the motion of a ship at sea:
- 6 electrically actuated cylindrical shafts
- motion parameters: sea state (frequency and amplitude of waves), ship speed, direction into waves
- stiffened aluminum construction
- dimensions: 8' x 6'

Moving Landing Pad

Hovering Above Deck

Landing on Deck

Papers Published
- A Vision System for Landing an Unmanned Aerial Vehicle. Cory Sharp, Omid Shakernia, Shankar Sastry. Submitted to ICRA 2001.
- Landing an Unmanned Air Vehicle: Vision Based Motion Estimation and Nonlinear Control. Omid Shakernia, Yi Ma, T. John Koo, Shankar Sastry. Asian Journal of Control, Vol. 1, No. 3, Sept. 1999.
- Vision Guided Landing of an Unmanned Air Vehicle. Omid Shakernia, Yi Ma, Joao Hespanha, Shankar Sastry. IEEE Conf. on Decision and Control, Phoenix, Arizona, Dec. 1999.