Hi_Lite Scott Fukuda Chad Kawakami

Background ► The DARPA Grand Challenge ► The Defense Advanced Research Projects Agency (DARPA) established a contest in 2004 to set a new level of accomplishment for autonomous land vehicles. This was in response to a goal set by Congress that by 2015 twenty percent of U.S. military vehicles on a battlefield would be unmanned. In the Grand Challenge, contestants vied for a one-million-dollar prize by racing over a 150-mile course.

HI_Lite Vehicle ► Modified from a Columbia Par Car electric scout vehicle ► Akamai Research LLC hosted and led the Hawaii effort with volunteers and sponsorship from e-Vehicles of Hawaii. ► The car was built in only three months. ► HI_Lite stayed in consideration up to the semifinals of the DARPA challenge.

HI_Lite Vehicle

Current Status ► The vehicle is currently dismantled. ► No longer bound by the DARPA contest rules. ► Looking to improve operational capabilities/strategies and robotic controls, integrate wireless feedback and multiple remote controllers (including human operators), and allow for “urban” environments.

Our Project ► To explore the capabilities of the vehicle’s “eyes” and derive a primitive but useful “environment describing” system. ► Couple simple optical devices to create an enhanced “environment describing” system.

Block Diagram ► Software  Small Vision System  Matlab (or C++) ► Hardware  Videre Design digital stereo camera head  Laser pointers  Lenses

Block Diagram - Hardware ► Videre Design variable-baseline digital stereo camera head  Provides left and right video/images for depth/3D information ► Laser pointers  Create a 3D laser grid and illuminate the grid on objects within the camera's view to calibrate depth information ► If necessary, external "pre-processing" will be done using colored lenses as filters to enable the camera to see the lasers as lines instead of points ► If necessary, use the SICK Laser Range Detector to establish numerical calibration points and calibrate/tune the stereo camera against the SICK's data
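
A minimal MATLAB sketch of the laser-based depth calibration idea: if the SICK sensor (or the known laser-grid geometry) reports the true distance at a few laser-spot pixels, the stereo scale factor can be fit from those points. All numbers below, and the assumption that distance varies as 1/disparity, are illustrative placeholders rather than measured data.

```matlab
% Hypothetical calibration sketch: fit the stereo scale factor from the
% disparities measured at laser-spot pixels and the distances reported by
% the SICK range detector at those same spots (all values are made up).
d_laser = [42.0 38.5 35.0 32.3];          % disparity (pixels) at each laser spot
z_sick  = [ 5.0  5.5  6.0  6.5];          % SICK distance (feet) at each spot

% For an ideal rectified pair, Z = f*B / d, so Z is linear in 1/d.
p  = polyfit(1 ./ d_laser, z_sick, 1);    % p(1) ~ f*B, p(2) ~ residual offset
fB = p(1);
fprintf('Estimated f*B = %.1f pixel-feet\n', fB);

% Any later disparity can now be mapped to a calibrated distance estimate.
z_est = fB / 30.0 + p(2);                 % e.g. a disparity of 30 pixels
```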

Block Diagram - Software ► Small Vision System – Videre bundled software  Record stereo images  Rectify the images to account for distortion  Perform stereo correlation to compute a range image (depth contours) ► Matlab - interpret data and provide useful outputs  Object detection calculations using edge detection and boundary tracing on disparity images  Correlate objects with numerical distance points from laser grid
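
As a rough illustration of the Matlab side of this pipeline, the sketch below assumes SVS has already rectified the pair and written its disparity result out as an 8-bit grayscale image (the filename 'disparity.pgm' and the scale factor are hypothetical); it converts that disparity map into a range image using Z = f*B / d.

```matlab
% Minimal sketch (not the project's actual code): load an SVS-produced
% disparity image and convert it to a range image (depth contours).
disp_img = double(imread('disparity.pgm'));   % 8-bit disparity map, 0 = no match

fB = 180;                        % stereo scale f*B from the laser calibration
valid = disp_img > 0;            % pixels with a successful stereo match
range = zeros(size(disp_img));
range(valid) = fB ./ disp_img(valid);         % Z = f*B / d

figure; imagesc(range); colorbar;
title('Range image (units follow the f*B calibration)');
```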

Software – In Depth ► Object Detection ► Philosophy  Simpler = Better!  Why? ► We don’t need to record or identify the objects we “see”; we only need to avoid hitting them

Software – In Depth ► Object Detection ► Algorithm  SVS library performs disparity calculations which give us depth information that can be converted to a 3D image
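
A hedged sketch of the disparity-to-3D conversion mentioned above, assuming an ideal rectified stereo pair; the focal length, baseline, and principal point are placeholder values, not the Videre head's actual calibration.

```matlab
% Reproject a disparity map to a 3D point cloud (illustrative values only).
f  = 600;  B = 0.3;  cx = 320;  cy = 240;      % hypothetical camera parameters
d  = double(imread('disparity.pgm'));          % disparity map from SVS

[u, v] = meshgrid(1:size(d,2), 1:size(d,1));   % pixel coordinates
valid  = d > 0;

Z = f * B ./ d(valid);                         % depth from disparity
X = (u(valid) - cx) .* Z / f;                  % lateral position
Y = (v(valid) - cy) .* Z / f;                  % vertical position

plot3(X, Z, -Y, '.'); xlabel('X'); ylabel('Z'); zlabel('height');
```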

Software – In Depth ► Using the disparity image, look for edges that will distinguish objects, and trace the boundaries around each object
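
The edge-detection and boundary-tracing step might look roughly like the following in Matlab with the Image Processing Toolbox; this is only an illustrative sketch of the idea, with a hypothetical input filename.

```matlab
% Sketch of edge detection plus boundary tracing on the disparity image.
d = imread('disparity.pgm');

edges  = edge(d, 'canny');            % depth discontinuities show up as edges
filled = imfill(edges, 'holes');      % close object interiors
filled = bwareaopen(filled, 50);      % drop tiny speckles

[boundaries, labels] = bwboundaries(filled, 'noholes');
imshow(d); hold on;
for k = 1:numel(boundaries)
    b = boundaries{k};
    plot(b(:,2), b(:,1), 'r', 'LineWidth', 2);   % trace each object's outline
end
```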

Software – In Depth ► To ensure each object is complete, check for cohesion in the object's depth by comparing pixel values within the disparity image, since the disparity image encodes depth as pixel brightness (brighter = closer)

Software – In Depth ► Separate the objects, then output each object's information and distance (determined from the laser grid)
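
A sketch of this last step, continuing the segmentation idea above: label the detected regions, then report each object's image position and an approximate distance using the laser-calibrated scale factor (the filename, fB, and threshold values are placeholders).

```matlab
% Label segmented regions of the disparity image and report each object's
% centroid and approximate distance (illustrative sketch only).
d      = imread('disparity.pgm');                 % 8-bit disparity from SVS
mask   = bwareaopen(imfill(edge(d, 'canny'), 'holes'), 50);
labels = bwlabel(mask);
fB     = 180;                                     % laser-calibrated stereo scale

stats = regionprops(labels, double(d), 'Centroid', 'MeanIntensity');
for k = 1:numel(stats)
    dist = fB / stats(k).MeanIntensity;           % Z = f*B / d
    fprintf('Object %d at pixel (%.0f, %.0f), ~%.1f ft away\n', ...
            k, stats(k).Centroid(1), stats(k).Centroid(2), dist);
end
```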

Software – In Depth ► Potential problems with the algorithm:  Tiny objects  Overlapping objects  Oddly shaped objects

Test Bed 1 ► [Diagram: targets placed at 5 feet, 5 1/2 feet, and 6 feet from the camera]

Test Bed 2 ► [Diagram: targets at 5 feet, 5 1/2 feet, and 6 feet; filters and laser “array” added for distance measurement]

Test Bed 3 ► [Diagram: targets at 5 feet, 5 1/2 feet, and 6 feet; filters and laser “array”]

Test Bed 4 ► [Diagram: targets at 5 feet, 5 1/2 feet, and 6 feet; filters and laser “array”]

Work to be Done ► Implement a simple, useful algorithm for object detection ► Test the effects of coupling the devices ► Move from still-image to video functionality ► Testing, testing, testing!

Hi-Lite Gantt Chart – Scott Fukuda and Chad Kawakami ► Timeline: weekly, 18-Feb through 29-Apr (18-Feb, 25-Feb, 4-Mar, 11-Mar, 18-Mar, 25-Mar, 1-Apr, 8-Apr, 15-Apr, 22-Apr, 29-Apr) ► Tests: Setup Test Bed 1, Run Test Bed 1, Setup Test Bed 2, Run Test Bed 2, Data organize algorithm, Exploratory additional tests

QUESTIONS?