Vision Based UAV Landing


Vision Based UAV Landing Raimi Shah Mentor: Or Dantsker

Motivation
- Lab is developing a solar-powered aircraft and must maximize efficiency by minimizing energy consumption
- The aircraft has no landing gear → less weight → less drag → less energy consumed
- Because the aircraft is large and fragile, it cannot land by conventional means; a 15' x 15' net will catch it
- Wingspan: 157.5 in (4.00 m); Length: 70.0 in (1.78 m); Weight: 4.4 lb (2.0 kg)

Overview
- Use computer vision to determine the aircraft's location relative to the landing net
- Write C++ code using OpenCV libraries to detect the target frame and calculate the (x, y, z) distance
- The target frame will hold the net and be bright orange, making it easily distinguishable from the background
- Once the distance can be calculated accurately, the code will be integrated into the UAV flight computer, providing the autopilot with the aircraft's relative location during landing
- Will also profile GPU/CPU performance (frame rate vs. power consumed)
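The "bright orange, easily distinguishable" criterion amounts to a color threshold. In the real pipeline this would be done per-frame with OpenCV (cv::cvtColor to HSV plus cv::inRange); the sketch below just shows the underlying per-pixel test in plain C++, with hue/saturation/value thresholds that are illustrative assumptions, not calibrated values from the project.

```cpp
#include <algorithm>
#include <cmath>

struct HSV { double h, s, v; };   // h in degrees [0,360), s and v in [0,1]

// Standard RGB -> HSV conversion; r, g, b are normalized to [0,1].
HSV rgbToHsv(double r, double g, double b) {
    double mx = std::max({r, g, b}), mn = std::min({r, g, b});
    double d = mx - mn, h = 0.0;
    if (d > 0.0) {
        if (mx == r)      h = 60.0 * std::fmod((g - b) / d, 6.0);
        else if (mx == g) h = 60.0 * ((b - r) / d + 2.0);
        else              h = 60.0 * ((r - g) / d + 4.0);
        if (h < 0.0) h += 360.0;
    }
    double s = (mx > 0.0) ? d / mx : 0.0;
    return {h, s, mx};
}

// Does this pixel count as "target orange"? Thresholds are hypothetical:
// hue 10-40 degrees with high saturation and brightness.
bool isOrange(double r, double g, double b) {
    HSV p = rgbToHsv(r, g, b);
    return p.h >= 10.0 && p.h <= 40.0 && p.s >= 0.5 && p.v >= 0.5;
}
```

Applying this test to every pixel yields the binary mask that the contour-detection step (next slide) operates on; tuning the thresholds against real footage is the "color calibration" item listed under Goals.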

Technique
- Filter the image and detect the target
- Detect the target contours, producing a target outline
- Use the largest outline to represent the target
- Bound the contour with a 'RotatedRect', yielding target information including area, vertices, width, and height
- Compute the distance to the target using: Distance = (actual width × focal length†) / image width
- Compute the x, y, z offset from the center of the target
- Once the camera is close enough that the outer frame leaves the field of view, a smaller marker placed in the center will be tracked until the landing is completed

† Focal length is pre-computed in camera pixels using a known distance
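The range and offset math above is pure pinhole-camera geometry and can be sketched without OpenCV. The function and parameter names below are illustrative, not taken from the project code: calibration inverts the distance formula to recover the focal length in pixels, and the lateral offset scales the pixel error from the image center by the metres-per-pixel ratio at the estimated range.

```cpp
#include <cassert>
#include <cmath>

// One-time calibration: focal length in pixels, from the target's apparent
// width (pixels) when viewed at a known distance. Rearranged from
// Distance = actualWidth * focalPx / pixelWidth.
double focalLengthPx(double pixelWidth, double knownDistance, double actualWidth) {
    return pixelWidth * knownDistance / actualWidth;
}

// Range to the target from its apparent width in the current frame
// (same units as actualWidth, e.g. metres).
double distanceToTarget(double actualWidth, double focalPx, double pixelWidth) {
    return actualWidth * focalPx / pixelWidth;
}

// x (or y) offset of the target's center from the image center, converted
// from pixels to metres using the scale distance / focalPx.
double offsetFromCenter(double targetCenterPx, double imageCenterPx,
                        double distance, double focalPx) {
    return (targetCenterPx - imageCenterPx) * distance / focalPx;
}
```

For example, if the 4.57 m (15 ft) frame appears 200 px wide at a measured 10 m, the calibrated focal length makes a later 100 px apparent width correspond to a 20 m range; the width and height of the fitted 'RotatedRect' would supply the pixel measurements.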

Goals
- Color calibration for object detection of the orange frame
- Small-scale model in lab
- Calculate distance to target in z-direction
- Frame-rate profiling
- Calculate distance to target in x- and y-directions
- Process real-world video of the landing target to verify the code
- Create an interface with the flight control board
- Deploy onto the aircraft

Questions?