Group 99: Mobile Vision for Autonomous Navigation and Reconnaissance. Jay Silver and Kevin Wortman, advised by Bill Ross.

Group 99 Vision-aided Navigation for Off-road Robotics
ACC testbed development and the DARPA FCS-PerceptOR program
- Reconnaissance: remote vision and monitoring, interactive map/video, and 3D modeling
- Navigation: collision avoidance and improved position estimation
- Reconnaissance and navigation on limited-payload platforms (UGV, UAV, etc.)
- Autonomous systems to support Future Combat Systems (DARPA TTO, PerceptOR)
- Vision systems for collision avoidance under time-to-goal and bandwidth constraints in unstructured outdoor environments (for example, lightly wooded scenarios)
Robotic Testbed (ACC)
- All-terrain robotic vehicle (ruggedized, 2 m/s)
- Digital fisheye video (180-degree fisheye, 12-bit, 60 fps, megapixel)
- 6-DOF IMU (134 Hz, 3 gyros + 3 accelerometers)
- Wireless Ethernet
- Onboard compute (2 Pentium IIIs)
- Offboard processing (dual Pentium IV)
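
A minimal sketch (not part of the original deck) that restates the testbed specification above as a single Python record; the TestbedConfig name and field names are hypothetical, and the values simply echo the specs listed on the slide.

```python
from dataclasses import dataclass

@dataclass
class TestbedConfig:
    """Hypothetical summary of the ACC testbed sensor and compute suite."""
    max_speed_mps: float = 2.0      # ruggedized all-terrain vehicle, 2 m/s
    camera_fov_deg: float = 180.0   # digital fisheye video
    camera_bit_depth: int = 12
    camera_fps: int = 60
    camera_megapixels: float = 1.0
    imu_rate_hz: int = 134          # 6-DOF IMU: 3 gyros + 3 accelerometers
    onboard_cpus: int = 2           # Pentium III class, on the vehicle
    offboard_cpus: int = 2          # dual Pentium IV, off the vehicle
```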

Group 99 Autonomous Navigation

Group 99 Navigation
- Resolution = f(polar radius)
- Many resolutions to create new perspectives
- Path finding with probabilistic obstacle models

Group 99 Space Variant Resolutions
[Figure: the Cartesian image I(x, y) is remapped into polar coordinates I(r, theta), with radial bins of varying width.]
- Standard fisheye: resolution = constant, distorted depth/range perception
- Space variant: resolution = f(r), accurate depth/range perception
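
As a rough illustration of the remapping sketched above, the following Python snippet (an assumption, not the project's implementation) resamples a Cartesian image I(x, y) into a polar image I(r, theta) whose radial sampling is governed by a caller-supplied resolution function; the name to_space_variant_polar and the nearest-neighbour sampling are illustrative choices.

```python
import numpy as np

def to_space_variant_polar(img, res_fn, n_r=128, n_theta=256):
    """Resample a Cartesian image I(x, y) into a polar image I(r, theta).

    res_fn maps a normalized radial index in [0, 1] to a radius in pixels
    (monotonically increasing). A linear res_fn reproduces constant-resolution
    sampling; a nonlinear one gives the space-variant case, resolution = f(r).
    """
    h, w = img.shape[:2]
    cy, cx = (h - 1) / 2.0, (w - 1) / 2.0
    radii = res_fn(np.linspace(0.0, 1.0, n_r))            # space-variant radii
    thetas = np.linspace(0.0, 2.0 * np.pi, n_theta, endpoint=False)
    rr, tt = np.meshgrid(radii, thetas, indexing="ij")
    xs = np.clip(np.round(cx + rr * np.cos(tt)).astype(int), 0, w - 1)
    ys = np.clip(np.round(cy + rr * np.sin(tt)).astype(int), 0, h - 1)
    return img[ys, xs]                                    # nearest-neighbour lookup
```

For example, to_space_variant_polar(gray, lambda u: 300 * u**2) concentrates samples near the image center and spreads them toward the periphery.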

Group 99 Space Variant Resolutions

Group 99 Space Variant Resolutions
The resolution function is the integral of the angular shift with respect to eccentricity: R(ε) = ∫₀^ε Δθ(e) de. Approximate the integral with a sum and find a least-squares best-fit polynomial (see the sketch below).
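
A short NumPy sketch of the numerical recipe stated above: accumulate the angular shift over eccentricity with a trapezoidal sum, then fit a least-squares polynomial to the result. The sinusoidal shift profile is a placeholder for measured data, and the variable names are assumptions.

```python
import numpy as np

# Placeholder angular-shift samples (degrees) over eccentricity; in practice
# these would be measured (e.g. from optical flow) on the testbed imagery.
eccentricity = np.linspace(0.0, 90.0, 64)            # degrees from optical axis
angular_shift = np.sin(np.radians(eccentricity))     # placeholder profile

# Resolution function = integral of angular shift w.r.t. eccentricity,
# approximated by a cumulative trapezoidal sum.
resolution = np.concatenate(([0.0], np.cumsum(
    0.5 * (angular_shift[1:] + angular_shift[:-1]) * np.diff(eccentricity))))

# Least-squares best-fit polynomial, convenient as a fast lookup.
res_poly = np.poly1d(np.polyfit(eccentricity, resolution, deg=4))
```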

Group 99 Space Variant Resolutions
Equal shifts along the radius of expansion for objects at equal range, so boundary tracking is simplified.
[Figures: bird's-eye view of angular shift (red = high angular shift, blue = low angular shift); space-variant resolution function plotted as inverse resolution (degrees/pixel) against eccentricity (degrees), with the sensor limit marked.]

Group 99 Space Variant Resolutions
[Figure: comparison of fixed resolution vs. space-variant resolution.]

Group 99 Navigation
- Resolution = f(polar radius)
- Many resolutions to create new perspectives (next)
- Path finding with probabilistic obstacle models

Group 99 Multiple Resolutions
[Figure: red = high resolution, green = medium resolution (1/2), blue = low resolution (1/4).]
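
One plausible reading of the multi-resolution scheme above is a simple image pyramid holding full, half, and quarter resolution views of the same frame; the sketch below (hypothetical resolution_pyramid helper, 2x2 block averaging) is an illustration under that assumption, not the original code.

```python
import numpy as np

def resolution_pyramid(img, levels=3):
    """Build a multi-resolution pyramid: full, 1/2, and 1/4 resolution.

    Each level is a 2x2 block average of the previous one, providing the
    coarser perspectives used alongside the high-resolution view.
    """
    pyramid = [np.asarray(img, dtype=np.float32)]
    for _ in range(levels - 1):
        prev = pyramid[-1]
        h, w = (prev.shape[0] // 2) * 2, (prev.shape[1] // 2) * 2
        prev = prev[:h, :w]
        down = (prev[0::2, 0::2] + prev[1::2, 0::2] +
                prev[0::2, 1::2] + prev[1::2, 1::2]) / 4.0
        pyramid.append(down)
    return pyramid
```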

Group 99 Navigation
- Resolution = f(polar radius)
- Many resolutions to create new perspectives
- Path finding with probabilistic obstacle models (next)

Group 99 Path Finding with Probabilistic Obstacle Models
[Figure: two planned paths compared, one with minimum time to goal heavily rewarded and one with danger avoidance heavily rewarded.]
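
The trade-off named above can be made concrete with a toy grid planner: every step pays a fixed time cost plus a danger cost proportional to the obstacle probability of the cell being entered, so the two weights decide whether minimum time to goal or danger avoidance dominates. This is a hedged Dijkstra-style sketch with a hypothetical plan_path function, not the planner actually used in the project.

```python
import heapq
import numpy as np

def plan_path(p_obstacle, start, goal, time_weight=1.0, danger_weight=10.0):
    """Grid path planner over a probabilistic obstacle map.

    p_obstacle[i, j] is the estimated probability that cell (i, j) is blocked.
    Each move costs time_weight plus danger_weight times the obstacle
    probability of the destination cell.
    """
    h, w = p_obstacle.shape
    dist = np.full((h, w), np.inf)
    prev = {}
    dist[start] = 0.0
    frontier = [(0.0, start)]
    while frontier:
        d, cell = heapq.heappop(frontier)
        if cell == goal:
            break
        if d > dist[cell]:
            continue
        i, j = cell
        for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            ni, nj = i + di, j + dj
            if 0 <= ni < h and 0 <= nj < w:
                nd = d + time_weight + danger_weight * p_obstacle[ni, nj]
                if nd < dist[ni, nj]:
                    dist[ni, nj] = nd
                    prev[(ni, nj)] = cell
                    heapq.heappush(frontier, (nd, (ni, nj)))
    # Reconstruct the path by walking back from the goal.
    path, cell = [goal], goal
    while cell != start:
        cell = prev[cell]
        path.append(cell)
    return path[::-1]
```

For instance, plan_path(p, (0, 0), (9, 9), time_weight=1.0, danger_weight=50.0) strongly favors low-risk cells, while danger_weight=0.1 approaches the shortest (minimum-time) route.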

Group 99 Ideal Path Finding with Probabilistic Obstacle Models

Group 99 Navigation
- Resolution = f(polar radius)
- Many resolutions to create new perspectives
- Path finding with probabilistic obstacle models
Next: Reconnaissance