Optic Flow QuadCopter Control


Optic Flow QuadCopter Control
Update Presentation 1 (Weeks 1-4)
Oscar Merry

Contents
Introduction
OpenCV
Honeybee Vision
Research Paper 1 - AR Drone LabVIEW Toolkit
Research Paper 2 - Optical Flow Quadrotor Controller
Other Research Papers
General Problems
Aims of Project
Discussion

Introduction
Optical flow: the estimation of the motion field created by a moving camera with respect to a rigid scene.
Last 4 weeks: literature analysis; research into optic flow; research into honeybee navigation; research into OpenCV.

OpenCV
Open-source C/C++ library for computer vision.
Built-in functions for many of the techniques used in optical-flow processing: the Canny edge detector, feature tracking, Lucas-Kanade and Horn-Schunck optical flow, and Shi-Tomasi corner detection.
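To make the Lucas-Kanade idea concrete, here is a minimal single-window NumPy sketch of the technique (OpenCV provides a robust pyramidal version as cv2.calcOpticalFlowPyrLK; this toy version is only illustrative):

```python
import numpy as np

def lucas_kanade_window(prev, curr, x, y, win=7):
    """Estimate the (u, v) flow at pixel (x, y) from one window by
    solving the Lucas-Kanade least-squares system A v = b built from
    the brightness-constancy constraint Ix*u + Iy*v + It = 0."""
    r = win // 2
    p, c = prev.astype(float), curr.astype(float)
    # Central-difference spatial gradients and the temporal gradient.
    Ix = (np.roll(p, -1, axis=1) - np.roll(p, 1, axis=1)) / 2.0
    Iy = (np.roll(p, -1, axis=0) - np.roll(p, 1, axis=0)) / 2.0
    It = c - p
    sl = np.s_[y - r:y + r + 1, x - r:x + r + 1]
    A = np.stack([Ix[sl].ravel(), Iy[sl].ravel()], axis=1)
    b = -It[sl].ravel()
    # Least squares; a near-singular A means the window is textureless
    # (the aperture problem).
    flow, *_ = np.linalg.lstsq(A, b, rcond=None)
    return flow  # (u, v) in pixels per frame

# Synthetic check: a Gaussian bump shifted one pixel to the right.
yy, xx = np.mgrid[0:64, 0:64]
frame0 = np.exp(-((xx - 32) ** 2 + (yy - 32) ** 2) / 50.0)
frame1 = np.exp(-((xx - 33) ** 2 + (yy - 32) ** 2) / 50.0)
u, v = lucas_kanade_window(frame0, frame1, 28, 32)
```

The estimated flow comes out close to (1, 0), matching the one-pixel shift; the real OpenCV routines add image pyramids and iteration to handle larger motions.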

Honeybee Vision
Each compound eye of the honeybee contains ~5,000 ommatidia.
Each ommatidium has 9 photoreceptor cells, grouped into 3 classes: ultraviolet-sensitive, blue-sensitive, and green-sensitive.
Bees control their flight speed by keeping optical flow constant (the same process is used for landing); this is done via the green photoreceptors.
Bees recognise objects both through strong contrasts in luminance or colour and through optical flow.
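The constant-optic-flow landing strategy can be sketched in a toy simulation (an assumed model for illustration, not taken from the slides): command forward speed so that the ventral flow v/h stays at a setpoint while descending along a fixed glide angle, and height and speed then decay together toward a near-zero-speed touchdown.

```python
import numpy as np

w_set = 2.0                 # ventral optic-flow setpoint, rad/s
theta = np.radians(15)      # fixed glide angle
dt, h = 0.01, 10.0          # timestep (s) and initial height (m)
log = []
for _ in range(3000):       # 30 s of simulated descent
    v = w_set * h           # speed command that keeps v / h constant
    h -= v * np.tan(theta) * dt  # sink rate set by the glide slope
    log.append((h, v))
```

Because v is proportional to h, both decay exponentially, so the vehicle arrives at the ground with vanishing speed without ever measuring its height, which is the appeal of the bee strategy for a camera-only quadcopter.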

Research Paper 1 - AR Drone LabVIEW Toolkit
Michael Mogenson's master's thesis: "A software framework for the control of low-cost quadrotor aerial robots."
Created a LabVIEW toolkit for the control of the Parrot AR Drone.
http://dl.dropboxusercontent.com/u/22965474/Michael_Mogenson_Thesis.pdf

AR Drone LabVIEW Toolkit
The toolkit consists of multiple virtual instruments (VIs).
Regular comms VIs: a Main VI for control commands and comms management; a Video VI for video reading and decoding; a Nav Data VI for reading navigation data.
'Thinking' VIs: a State VI to estimate position in X, Y, Z Cartesian coordinates; various image-processing VIs (fast blob detection, dense optical flow, image-space conversion, ROI VI).
The toolkit also has a wrapper for the OpenCV library.
http://dl.dropboxusercontent.com/u/22965474/Michael_Mogenson_Thesis.pdf

AR Drone LabVIEW Toolkit – State VI
Populates a rotation matrix between the drone's coordinate system and the ground using the Euler orientation angles from the navigation data.
Applies the rotation matrix to the velocities measured from optical flow on the bottom camera.
The rotated velocities are integrated into a position using the timestamp data.
Suffers from drift.
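The State VI's dead-reckoning step can be sketched as follows (the Euler convention is assumed, as the thesis slide does not specify the rotation order):

```python
import numpy as np

def rotation_matrix(roll, pitch, yaw):
    """Body-to-ground rotation from Z-Y-X Euler angles
    (assumed convention for this sketch)."""
    cr, sr = np.cos(roll), np.sin(roll)
    cp, sp = np.cos(pitch), np.sin(pitch)
    cy, sy = np.cos(yaw), np.sin(yaw)
    Rz = np.array([[cy, -sy, 0], [sy, cy, 0], [0, 0, 1]])
    Ry = np.array([[cp, 0, sp], [0, 1, 0], [-sp, 0, cp]])
    Rx = np.array([[1, 0, 0], [0, cr, -sr], [0, sr, cr]])
    return Rz @ Ry @ Rx

def integrate_flow(samples, dt=0.05):
    """Dead-reckon ground-frame position from body-frame
    optic-flow velocities and Euler angles."""
    pos = np.zeros(3)
    for vx, vy, vz, roll, pitch, yaw in samples:
        pos += rotation_matrix(roll, pitch, yaw) @ np.array([vx, vy, vz]) * dt
    return pos

# 1 s of level flight at 1 m/s along the body x-axis, heading 90 deg:
pos = integrate_flow([(1.0, 0.0, 0.0, 0.0, 0.0, np.pi / 2)] * 20)
```

The drift problem is visible in the structure: any constant bias in the measured velocities is multiplied by dt every step, so position error grows linearly with time and is never corrected by an absolute reference.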

AR Drone LabVIEW Toolkit – Achievements
Benefit of LabVIEW: modifications can be made without recompiling.
Demonstrations: face tracking; indoor hallway navigation (via vanishing point); flying through a hoop.
Problems: indoor hallway navigation fails at 90-degree turns or when facing a wall; the State Estimation VI suffers from drift.

Research Paper 2 - Optical Flow Quadrotor Controller
"Optical Flow-Based Controller for Reactive and Relative Navigation dedicated to a Four Rotor Rotorcraft"
Eduardo Rondon, Isabelle Fantoni-Coichot, Anand Sanchez, Guillaume Sanahuja
Produced a controller for a quadrotor based on the optical flow from 2 cameras (one for velocity regulation, one for obstacle avoidance).
Implemented obstacle avoidance for indoor navigation.
http://ieeexplore.ieee.org.eresources.shef.ac.uk/stamp/stamp.jsp?tp=&arnumber=5354483

Optical Flow Quadrotor controller http://ieeexplore.ieee.org.eresources.shef.ac.uk/stamp/stamp.jsp?tp=&arnumber=5354483

Optical Flow Quadrotor Controller - Achievements
Velocity regulation via an optical-flow controller.
Compares the inverse time-to-contact against a threshold to stop the vehicle if an obstacle is detected.
Has both lateral and altitude avoidance.
http://ieeexplore.ieee.org.eresources.shef.ac.uk/stamp/stamp.jsp?tp=&arnumber=5354483
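Inverse time-to-contact can be estimated directly from the expansion (divergence) of the flow field; the sketch below uses the standard relation for pure approach toward a frontal surface, where the flow is radial (u = x/tau, v = y/tau, hence div(flow) = 2/tau), though the paper's exact formulation may differ:

```python
import numpy as np

def inverse_ttc(u, v, spacing=1.0):
    """Estimate 1/tau from dense flow fields u(x, y), v(x, y)
    as half the mean divergence of the flow."""
    du_dx = np.gradient(u, spacing, axis=1)
    dv_dy = np.gradient(v, spacing, axis=0)
    return 0.5 * float(np.mean(du_dx + dv_dy))

# Synthetic radial expansion field for tau = 2 s:
y, x = np.mgrid[-20:21, -20:21].astype(float)
inv_tau = inverse_ttc(x / 2.0, y / 2.0)

# Threshold rule: brake if predicted contact is sooner than 2.5 s.
stop = inv_tau > 0.4
```

Working with 1/tau rather than tau is convenient for exactly the thresholding used in the paper: it stays finite at zero approach speed, whereas tau itself diverges.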

Other Research Papers
"Optic flow based slope estimation for autonomous landing", de Croon et al.: achieved slope following and autonomous landing.
"Combined Optic-Flow and Stereo-Based Navigation of Urban Canyons for a UAV", Hrabar et al.: used optical flow to balance the UAV in a canyon and stereo vision to navigate T- and L-junctions.
"An adaptive vision-based autopilot for mini flying machines guidance, navigation and control": used optic flow and IMU data for guidance, navigation and control, specifically automatic hovering, landing, and target tracking; used feature tracking to reduce the optic-flow computation.

General Problems
Camera calibration: for the depth of a scene to be known, the camera must be calibrated on a known object (or the height above the ground must be known).
Video delay: video encoding, decoding, and transmission must be fast enough for closed-loop control.
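The "height above ground" route to metric scale is simple for a level, downward-facing pinhole camera (a standard relation, not specific to any of the papers above): pixel flow of p px/s at focal length f px, seen from height h metres, corresponds to a metric ground speed of v = p * h / f.

```python
def metric_velocity(pixel_flow_px_s, height_m, focal_px):
    """Convert bottom-camera pixel flow to metric ground speed,
    assuming a level, downward-facing camera over flat ground."""
    return pixel_flow_px_s * height_m / focal_px

# 60 px/s of flow at f = 600 px from 1 m altitude -> 0.1 m/s
speed = metric_velocity(60.0, 1.0, 600.0)
```

This is why the AR Drone pairs its bottom camera with an ultrasonic altimeter: without h, optic flow alone gives velocity only up to an unknown scale factor.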

Project Aims
Hovering stabilisation
Position and velocity control
Smooth landing execution
Obstacle recognition and avoidance (with a focus on methods that have similarities to honeybees, e.g. flower recognition)

Discussion
Communication with the hardware needs formalising:
State commands (roll angle, pitch angle, yaw rate, climb rate, ??)
Video frames