October 28, 2011
Adrian Fletcher (CECS), Jacob Schreiver (ECE), Justin Clark (CECS), & Nathan Armentrout (ECE)
Sponsor: Dr. Adrian Lauf


• A subset of Unmanned Aerial Vehicles (UAVs)
  ◦ Predator
  ◦ Raptor
• Very small, maneuverable, and lightweight
• MAV categories
  ◦ Fixed-wing
  ◦ Rotary-wing
  ◦ Flapping-wing
• Used for homeland & battlefield applications
  ◦ Surveillance
  ◦ Reconnaissance

• Dr. Lauf is a new assistant professor in the CECS department, joining from Wright State University
• His research is in embedded system design with applications to UAVs and MAVs
  ◦ Communications & networking
  ◦ Controls
  ◦ Navigation
  ◦ Autonomous flight
  ◦ Multi-agent systems
(Image courtesy of Dr. Lauf)

• Flapping-wing MAV
• Sensors are limited to
  ◦ Gyroscopes (MEMS)
  ◦ 3-axis accelerometers (MEMS)
  ◦ Monocular camera with transceiver unit
• Optical navigation is necessary for autonomous operation
(Image courtesy of Dr. Lauf)


• Develop an optical navigation software subsystem
  ◦ User-selected destination
  ◦ Semi-autonomous operation
  ◦ Adaptable to flapping-wing MAVs
  ◦ Operates in a closed, static environment
    - A classroom with tables and chairs
    - No moving objects

• Preflight operations
  ◦ Calibrate the camera
  ◦ Place the test rig in the room
  ◦ Start the optical navigation software
  ◦ Choose a destination
• Mid-flight operations
  ◦ Move the camera to simulate flight
  ◦ Follow the suggested navigational output

• Requirements
  ◦ Communicate real-time navigation output
  ◦ Create a 3D model of the environment
  ◦ Plan a path from the current location to a selected destination
  ◦ Work in any closed, static environment
• Restrictions
  ◦ Non-stereoscopic (monocular) camera

• Two major components
  ◦ Camera transceiver unit
  ◦ Computer with vision software
• Connected via a 1.9 GHz RF channel

• OpenCV
• JavaCV
• NetBeans Integrated Development Environment (IDE)

• OpenCV: an open-source computer vision library originally developed by Intel
  ◦ Image processing
  ◦ Object recognition
  ◦ Machine learning
  ◦ 3D reconstruction
• JavaCV: a wrapper for OpenCV
  ◦ Allows us to use OpenCV from a Java environment (see the sketch below)
  ◦ Includes added functionality
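To make the toolchain concrete, here is a minimal sketch (not from the presentation; the file names are placeholders) of calling OpenCV's C API from Java through the 2011-era JavaCV bindings:

```java
import static com.googlecode.javacv.cpp.opencv_core.*;
import static com.googlecode.javacv.cpp.opencv_highgui.*;
import static com.googlecode.javacv.cpp.opencv_imgproc.*;

public class EdgeDemo {
    public static void main(String[] args) {
        // Load one camera frame as grayscale ("frame.png" is a placeholder).
        IplImage frame = cvLoadImage("frame.png", CV_LOAD_IMAGE_GRAYSCALE);
        IplImage edges = cvCreateImage(cvGetSize(frame), IPL_DEPTH_8U, 1);
        // Canny edge detection: hysteresis thresholds 50/150, 3x3 Sobel aperture.
        cvCanny(frame, edges, 50, 150, 3);
        cvSaveImage("edges.png", edges);
    }
}
```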

• Free, open-source IDE
• Supports multiple languages, including Java
• Includes many developer helper functions
  ◦ GUI & form builder
  ◦ Software debugger
  ◦ Unit testing
  ◦ Code completion
  ◦ Integrated Subversion (SVN)


• Goal: Find a prominent object in view
• Why: Needed to initialize object tracking and learning
• How: Use the "Snake" algorithm (sketched below)
  ◦ Based on active contour detection
  ◦ "Constricts" around strong contours
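The slides name the algorithm but show no code; as a rough sketch, OpenCV's legacy module (wrapped by JavaCV as opencv_legacy) exposes an active-contour routine, cvSnakeImage. The energy weights and window size below are illustrative assumptions, not values from the project:

```java
import static com.googlecode.javacv.cpp.opencv_core.*;
import static com.googlecode.javacv.cpp.opencv_legacy.*;

public class SnakeSketch {
    // Iteratively "constricts" an initial contour of n points around
    // strong contours in a grayscale frame.
    static void refine(IplImage gray, CvPoint points, int n) {
        float[] alpha = { 0.5f };  // continuity weight (assumed)
        float[] beta  = { 0.5f };  // curvature weight (assumed)
        float[] gamma = { 0.9f };  // image-energy weight (assumed)
        cvSnakeImage(gray, points, n, alpha, beta, gamma,
                CV_VALUE,                 // single coefficient for all points
                cvSize(21, 21),           // search window around each point
                cvTermCriteria(CV_TERMCRIT_ITER, 100, 0.0),
                1);                       // precompute the image gradient
    }
}
```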


• Goal: Provide short-term tracking so the system can confirm that the object seen during the learning phase is the same object
• Why: Assists the long-term (learning) tracker
• How:
  ◦ Lucas-Kanade optical flow algorithm
    - Tracks motion using scattered points on the object (see the sketch below)
  ◦ CamShift algorithm
    - Reduces picture color depth and computes color histograms
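A sketch of the Lucas-Kanade step through the same JavaCV C bindings (illustrative only; the point count and window size are assumptions). Corners are seeded on the detected object and tracked frame to frame:

```java
import static com.googlecode.javacv.cpp.opencv_core.*;
import static com.googlecode.javacv.cpp.opencv_imgproc.*;
import static com.googlecode.javacv.cpp.opencv_video.*;

public class LkSketch {
    // Seeds corners in the previous frame and tracks them into the current one.
    static CvPoint2D32f track(IplImage prevGray, IplImage currGray, int maxPoints) {
        CvPoint2D32f prevPts = new CvPoint2D32f(maxPoints);
        CvPoint2D32f currPts = new CvPoint2D32f(maxPoints);
        int[] count = { maxPoints };

        // Scratch images required by the corner detector.
        IplImage eig  = cvCreateImage(cvGetSize(prevGray), IPL_DEPTH_32F, 1);
        IplImage temp = cvCreateImage(cvGetSize(prevGray), IPL_DEPTH_32F, 1);
        cvGoodFeaturesToTrack(prevGray, eig, temp, prevPts, count,
                0.01, 10, null, 3, 0, 0.04);

        byte[] status = new byte[maxPoints];
        cvCalcOpticalFlowPyrLK(prevGray, currGray, null, null,
                prevPts, currPts, count[0],
                cvSize(21, 21), 3, status, null,
                cvTermCriteria(CV_TERMCRIT_ITER | CV_TERMCRIT_EPS, 30, 0.01),
                0);
        return currPts;  // status[i] != 0 marks points found in the new frame
    }
}
```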


• Goal: Establish a model for an object during the learning phase
• Why:
  ◦ Recover from object occlusion
  ◦ Provide a basis for egomotion (camera motion) estimation
• How:
  ◦ SURF algorithm (sketched below)
  ◦ Haar-like features
  ◦ Machine learning
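As an illustrative sketch of the SURF step (the Hessian threshold of 500 is an assumed value), keypoints and descriptors can be extracted through JavaCV's wrapping of the C API:

```java
import static com.googlecode.javacv.cpp.opencv_core.*;
import static com.googlecode.javacv.cpp.opencv_features2d.*;

public class SurfSketch {
    // Extracts SURF keypoints and descriptors from a grayscale image;
    // the descriptors form the object model matched in later frames.
    static CvSeq describe(IplImage gray, CvMemStorage storage) {
        CvSeq keypoints = new CvSeq(null);
        CvSeq descriptors = new CvSeq(null);
        // 500 = Hessian threshold (assumed); 1 = extended 128-element descriptors.
        cvExtractSURF(gray, null, keypoints, descriptors, storage,
                cvSURFParams(500, 1), 0);
        return descriptors;
    }
}
```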


• Goal: Establish no-fly zones for the current environment
• Why:
  ◦ Collision avoidance
  ◦ Path planning
  ◦ Data visualization
• How: Egomotion recovery combined with stereo-vision techniques (see the sketch below)
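The slides don't specify the math; one standard stereo-vision building block, sketched here under the assumption that matched points from the tracker are available as 2xN float matrices, is estimating the fundamental matrix between two frames of the moving monocular camera, treating them as a stereo pair:

```java
import static com.googlecode.javacv.cpp.opencv_core.*;
import static com.googlecode.javacv.cpp.opencv_calib3d.*;

public class EgomotionSketch {
    // Estimates the fundamental matrix from matched points in two frames
    // (pts1, pts2: 2xN CV_32FC1 matrices of corresponding image points).
    static CvMat fundamental(CvMat pts1, CvMat pts2) {
        CvMat f = CvMat.create(3, 3, CV_32FC1);
        // RANSAC rejects bad matches; the 1.0 px threshold and 0.99
        // confidence level are assumed parameters.
        int found = cvFindFundamentalMat(pts1, pts2, f, CV_FM_RANSAC, 1.0, 0.99, null);
        return found != 0 ? f : null;
    }
}
```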

• Goal: Provide navigational output to the user
• Why: Builds the framework for autonomous navigation
• How:
  ◦ Modified navigation algorithms (a hypothetical illustration follows)
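The slides leave these algorithms unspecified; purely as a hypothetical illustration of what "navigational output" could reduce to, this computes a yaw correction toward a selected destination in the reconstructed world frame (all names here are invented, not from the slides):

```java
public class HeadingSketch {
    // Suggested yaw change (radians) to turn from the current pose toward
    // the destination, both expressed in the reconstructed world frame.
    static double suggestedYaw(double x, double y, double yaw,
                               double destX, double destY) {
        double desired = Math.atan2(destY - y, destX - x);
        double delta = desired - yaw;
        // Wrap into (-pi, pi] so "turn left" vs. "turn right" is unambiguous.
        while (delta > Math.PI)   delta -= 2 * Math.PI;
        while (delta <= -Math.PI) delta += 2 * Math.PI;
        return delta;
    }
}
```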

• Goal: Provide data visualization and user input capability
• Why:
  ◦ Destination selection
  ◦ Navigational output
  ◦ Internal troubleshooting
• How:
  ◦ NetBeans GUI builder (a hypothetical fragment follows)
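As a hypothetical fragment, not the team's GUI: the NetBeans GUI builder generates Swing code, so destination selection could be wired by listening for clicks on the panel that displays the camera feed:

```java
import java.awt.event.MouseAdapter;
import java.awt.event.MouseEvent;
import javax.swing.JLabel;

public class DestinationPicker {
    // Attaches a click listener to the component showing the camera feed;
    // the clicked pixel becomes the selected destination (hypothetical wiring).
    static void install(JLabel videoPanel) {
        videoPanel.addMouseListener(new MouseAdapter() {
            @Override public void mouseClicked(MouseEvent e) {
                int u = e.getX(), v = e.getY();
                System.out.println("Destination pixel: (" + u + ", " + v + ")");
            }
        });
    }
}
```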


• Applications
  ◦ Camera calibration (sketched below)
  ◦ Verification of egomotion estimation
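A sketch of OpenCV's standard chessboard corner detection, the usual front end to cvCalibrateCamera2; the board dimensions and flags are assumptions, not the team's settings:

```java
import static com.googlecode.javacv.cpp.opencv_core.*;
import static com.googlecode.javacv.cpp.opencv_calib3d.*;
import static com.googlecode.javacv.cpp.opencv_imgproc.*;

public class CalibSketch {
    // Detects inner chessboard corners in one grayscale calibration frame.
    static int findCorners(IplImage gray, CvPoint2D32f corners) {
        CvSize pattern = cvSize(9, 6);       // inner corners (assumed board)
        int[] cornerCount = new int[1];
        int found = cvFindChessboardCorners(gray, pattern, corners, cornerCount,
                CV_CALIB_CB_ADAPTIVE_THRESH | CV_CALIB_CB_NORMALIZE_IMAGE);
        if (found != 0) {
            // Refine to sub-pixel accuracy before feeding cvCalibrateCamera2.
            cvFindCornerSubPix(gray, corners, cornerCount[0], cvSize(11, 11),
                    cvSize(-1, -1),
                    cvTermCriteria(CV_TERMCRIT_ITER | CV_TERMCRIT_EPS, 30, 0.1));
        }
        return cornerCount[0];
    }
}
```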

• Integrated JavaCV & OpenCV with the NetBeans IDE
• Interfaced with a variety of cameras
• Camera calibration & test rig built

• Module integration
  ◦ Object recognition
  ◦ Object tracking
  ◦ Machine learning
• 3D reconstruction
  ◦ Obtain depth perception
• Egomotion & stereo techniques
• Destination selection
• Path planning
• Improved graphical user interface (GUI)


Adrian P. Lauf, P. George Huang
Wright State University, Center for Micro Aerial Vehicle Studies (CMAVS)

Guidance and Control
• Unlike traditional UAVs, MAVs have limited power and computational resources; they qualify as deeply embedded systems
• Weight restrictions are the primary obstacle for onboard processing systems; in some cases, aircraft weigh less than 7 grams
• The need for autonomy requires the integration of on-board and off-board processing and guidance capabilities
• This hybrid scheme permits computationally intensive operations to run without weight restrictions
• Various sensor inputs can be used to aid local and global navigation objectives: video camera images, MEMS gyroscopes, and other heterogeneous mounted sensors

On-board Hardware
• Each MAV (Micro Aerial Vehicle) is equipped with an on-board computing module, the Guidance and Inertial Navigation Assistant (GINA)
• Based on schematics developed in UC Berkeley's WarpWing project, modified to reduce weight and remove unneeded components
• Onboard processing allows for vehicle stability in flight
• An integrated IEEE radio protocol permits two-way radio communications: radio telemetry, external commands, and video image capture and transmission
• Without modification, GINA 2.1 weighs over 2.2 grams; development will target a weight of 1.5 grams or less

Local Control Loops
• MEMS-based gyroscopes onboard GINA provide information about the aircraft's stability
• Simple PID control can be used to keep the aircraft level and stable (a generic sketch follows below)
• Filtering functions can mitigate hysteresis caused by wing motion and control-surface actuators
• The onboard microprocessor is capable of handling these high-rate, low-complexity tasks
• Feedback from PID control can be sent off-board for processing via the radios
• Actuator control can be directly handled by the microprocessor; inputs to the system from external sources do not directly actuate control surfaces

Off-board Control
• Off-line image analysis permits identification of navigation objectives and obstacles
• Frame-to-frame analysis allows the system to construct a model of its environment and surroundings
• Information contained in the world model can be used to make navigation decisions
• Multiple-aircraft implementations can build the world model more quickly and accurately, permitting joint and distributed operation in an unknown scenario and allowing distributed agents to augment the accuracy of existing models
• Commands issued as a result of image analysis can be used as inputs to the PID control and navigation routines onboard the aircraft

(Figures: an airframe and drivetrain example of a CMAVS flapping-wing aircraft; existing receivers and actuators; gyroscope output from a GINA module; a base-station mote used for the off-board computer)
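Since the poster leans on simple PID loops for attitude stabilization, here is a generic discrete PID step as a sketch; this is not GINA firmware, and the gains are placeholders:

```java
public class Pid {
    private final double kp, ki, kd;   // placeholder gains, tuned per airframe
    private double integral, prevError;

    Pid(double kp, double ki, double kd) {
        this.kp = kp; this.ki = ki; this.kd = kd;
    }

    // One control step: error = desired attitude - gyro-estimated attitude,
    // dt = loop period in seconds. Returns the actuator command.
    double update(double error, double dt) {
        integral += error * dt;
        double derivative = (error - prevError) / dt;
        prevError = error;
        return kp * error + ki * integral + kd * derivative;
    }
}
```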

• OpenTLD
• JavaCV ObjectFinder (usage sketched below)
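For reference, typical usage of JavaCV's bundled ObjectFinder sample class (file names here are placeholders) looks roughly like this; it matches SURF features between a template image and a scene:

```java
import com.googlecode.javacv.ObjectFinder;
import static com.googlecode.javacv.cpp.opencv_highgui.*;

public class FinderSketch {
    public static void main(String[] args) throws Exception {
        IplImage object = cvLoadImage("object.png", CV_LOAD_IMAGE_GRAYSCALE);
        IplImage scene  = cvLoadImage("scene.png",  CV_LOAD_IMAGE_GRAYSCALE);
        ObjectFinder finder = new ObjectFinder(object);
        // Returns the four corners of the located object as x,y pairs,
        // or null when no confident match is found.
        double[] corners = finder.find(scene);
        if (corners != null) {
            System.out.println("Object found; first corner: ("
                    + corners[0] + ", " + corners[1] + ")");
        }
    }
}
```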


• Object detection (4 total)
• Object tracking (8 total)
  ◦ Optical flow: Lucas-Kanade (2)
  ◦ Optical flow: Horn & Schunck (1)
  ◦ CamShift (1)
• Object recognition (11 total)
  ◦ SURF (3)
  ◦ Haar-like (3)
  ◦ SIFT (1)
• Machine learning (4 total)
  ◦ P-N learning (1)
• 3D reconstruction (10 total)
  ◦ Egomotion (5)
  ◦ Stereo vision (3)