Beard & McLain, “Small Unmanned Aircraft,” Princeton University Press, 2012. Chapter 13: Vision-Based Guidance (slides).


Slide 1: Vision-Based Guidance

Slide 2: Gimbal and Camera Reference Frames

Slide 3: Gimbal Reference Frames

Slide 4: Camera Reference Frame

Slide 5: Pinhole Camera Model
–Describes the mathematical relationship between the coordinates of a 3-D point and its projection onto the 2-D image plane.
–Assumes a point aperture and no lenses.
–A good first-order approximation.
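The projection described above can be sketched in a few lines. This is a minimal illustration of the pinhole model only; the camera-frame axis convention and the pixel-unit focal length are assumptions, not taken from the slides:

```python
def pinhole_project(p_c, f):
    """Project a 3-D point onto the image plane of an ideal pinhole camera.

    p_c : 3-vector in camera coordinates (assumed convention: x right,
          y down, z along the optical axis), meters.
    f   : focal length expressed in pixels.
    Returns the image-plane coordinates (eps_x, eps_y) in pixels.
    """
    x, y, z = p_c
    if z <= 0.0:
        raise ValueError("point must be in front of the camera")
    # Similar triangles: each image coordinate is scaled by f / z,
    # so absolute depth is lost in the projection.
    return f * x / z, f * y / z
```

Note that scaling `p_c` by any positive constant gives the same pixel coordinates, which is why a single camera cannot recover range without extra assumptions (e.g. the flat-earth model later in the chapter).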

Slide 6: Pinhole Camera Model (figure: image plane)

Slide 7: Camera Model (figure: image plane)

Slide 8: Camera Model (figure: image plane)

Slide 9: Camera Model (figure: image plane)

Slide 10: Gimbal Pointing
Two scenarios:
–Point the gimbal at a given world coordinate: “point at this GPS location.”
–Point the gimbal so that the optical axis aligns with a given point in the image plane: “point at this object.”
Gimbal dynamics: assume rate control inputs.
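Because the gimbal is assumed to accept rate commands, both scenarios reduce to the same servo step once the desired azimuth and elevation are known (from a world coordinate in Scenario 1, or from an image-plane point in Scenario 2). A minimal sketch, where the proportional structure and the gains are illustrative assumptions:

```python
def gimbal_rate_commands(alpha_az, alpha_el, alpha_az_des, alpha_el_des,
                         k_az=1.0, k_el=1.0):
    """Proportional rate commands driving the gimbal azimuth and elevation
    toward their desired values.

    alpha_az, alpha_el         : current gimbal angles, rad.
    alpha_az_des, alpha_el_des : desired gimbal angles, rad.
    k_az, k_el                 : illustrative proportional gains.
    Returns the commanded azimuth and elevation rates (rad/s).
    """
    u_az = k_az * (alpha_az_des - alpha_az)
    u_el = k_el * (alpha_el_des - alpha_el)
    return u_az, u_el
```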

Slide 11: Scenario 1: Point Gimbal at World Coordinate

Slide 12: Scenario 2: Point Gimbal at Object in Image

Slide 13: Gimbal Pointing Angles

Slide 14: Gimbal Pointing Angles (cont.)
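One common way to extract pointing angles, sketched here under an assumed body-frame convention (x forward, y right, z down): given a unit line-of-sight vector, azimuth and elevation follow directly from its components.

```python
import math

def gimbal_angles(l_b):
    """Azimuth and elevation that point the optical axis along the
    line-of-sight vector l_b = (lx, ly, lz) in the body frame
    (assumed convention: x forward, y right, z down).
    """
    lx, ly, lz = l_b
    n = math.sqrt(lx*lx + ly*ly + lz*lz)
    lx, ly, lz = lx/n, ly/n, lz/n     # normalize to a unit vector
    alpha_az = math.atan2(ly, lx)     # pan angle, positive to the right
    alpha_el = math.asin(-lz)         # tilt angle, positive above horizon
    return alpha_az, alpha_el
```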

Slide 15: Geolocation

Slide 16: Range to Target: Flat-Earth Model

Slide 17: Geolocation Errors

Slide 18: Geolocation Using the EKF

Slide 19: Geolocation Using the EKF (cont.)
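The chapter's filter works with the full nonlinear measurement model; as a simpler, hedged sketch of the same idea, noisy per-frame geolocation fixes of a stationary target can be fused with a linear Kalman filter. The class name, dimensions, and noise values here are all illustrative assumptions, not the book's formulation:

```python
import numpy as np

class StationaryTargetFilter:
    """Kalman filter that fuses noisy per-frame fixes of a stationary
    target's north-east position. The measurement model is identity, and
    the stationary-target process model leaves the state and covariance
    unchanged between fixes. R and P0 are illustrative noise levels."""

    def __init__(self, R=25.0, P0=1.0e4):
        self.t_hat = None            # estimated target position (n, e)
        self.P = P0 * np.eye(2)      # estimate covariance
        self.R = R * np.eye(2)       # per-fix measurement covariance

    def update(self, z):
        z = np.asarray(z, dtype=float)
        if self.t_hat is None:
            self.t_hat = z.copy()    # initialize from the first fix
            return self.t_hat
        K = self.P @ np.linalg.inv(self.P + self.R)   # Kalman gain
        self.t_hat = self.t_hat + K @ (z - self.t_hat)
        self.P = (np.eye(2) - K) @ self.P
        return self.t_hat
```

Averaging fixes this way is what drives the geolocation error down over time even though each individual frame's fix is noisy.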

Slide 20: Geolocation Architecture (block diagram: vision processing, gimbal, GPS, attitude estimation, smoother, geolocation)

Slide 21: Target Motion in the Image Plane

Slide 22: Pixel Low-Pass Filtering and Differentiation

Slide 23: Digital Approximation
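A standard digital approximation for the low-pass-filtered derivative of a pixel coordinate is the Tustin (bilinear) discretization of the transfer function s/(τs + 1), a band-limited "dirty derivative." The τ and Ts values in the usage below are illustrative:

```python
class DirtyDerivative:
    """Band-limited differentiator: Tustin discretization of s/(tau*s + 1).

    tau : low-pass time constant, seconds.
    Ts  : sample period, seconds.
    """
    def __init__(self, tau, Ts):
        self.a1 = (2.0 * tau - Ts) / (2.0 * tau + Ts)
        self.a2 = 2.0 / (2.0 * tau + Ts)
        self.x_prev = None
        self.xdot = 0.0

    def update(self, x):
        if self.x_prev is None:
            self.x_prev = x        # first sample: no rate estimate yet
            return 0.0
        # xdot[n] = a1*xdot[n-1] + a2*(x[n] - x[n-1])
        self.xdot = self.a1 * self.xdot + self.a2 * (x - self.x_prev)
        self.x_prev = x
        return self.xdot
```

Applied independently to each pixel coordinate of the tracked target, this yields a smoothed estimate of the target's motion in the image plane.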

Slide 24: Apparent Motion

Slide 25: Apparent Motion (cont.)

Slide 26: Apparent Motion (cont.)

Slide 27: Total Motion in the Image Plane

Slide 28: Time to Collision

Slide 29: Time to Collision: Looming
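The looming idea: for an object of fixed physical size approaching at a roughly constant closing speed, the ratio of its image size to that size's growth rate estimates time to collision, with no range measurement needed. A hedged sketch (function and variable names are illustrative):

```python
def time_to_collision(image_size, image_size_rate):
    """Time-to-collision estimate from looming.

    For physical size S at range L with closing speed -Ldot, the image
    size is s = f*S/L, so sdot/s = -Ldot/L and
        t_c = L / (-Ldot) = s / sdot.
    image_size      : current image size s (e.g. pixels).
    image_size_rate : its time derivative sdot (pixels/s).
    """
    if image_size_rate <= 0.0:
        return float('inf')   # image not growing: no predicted collision
    return image_size / image_size_rate
```

For example, an object at 100 m closing at 10 m/s whose image is 1 px wide grows at 0.1 px/s, giving the expected 10 s to collision.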

Slide 30: Time to Collision: Flat-Earth Model

Slide 31: Precision Landing

Slide 32: Proportional Navigation (PN)

Slide 33: Acceleration Command
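Classic proportional navigation commands lateral acceleration proportional to the inertial line-of-sight rotation rate; driving that rate to zero puts the vehicle on a collision course with the target. This sketch is the generic textbook PN law, not necessarily the exact form on the slide; the navigation constant N is a design choice, typically 3 to 5:

```python
def pn_accel_command(N, closing_speed, los_rate):
    """Proportional navigation acceleration command: a_cmd = N * Vc * lambda_dot.

    N             : navigation constant (dimensionless, typically 3-5).
    closing_speed : closing velocity Vc toward the target, m/s.
    los_rate      : inertial line-of-sight rotation rate lambda_dot, rad/s.
    Returns the commanded acceleration normal to the line of sight, m/s^2.
    """
    return N * closing_speed * los_rate
```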

Slide 34: Polar Converting Logic

Slide 35: Polar Converting Logic (cont.)