Computer Vision: Parallelize or Paralyze
Team Purple Threads
CSE Capstone 2012, April 2012


Abstract
Purple Threads
Project Description
Motivation
System Overview
First Steps
Target Drone Platform
Turret System
Final Status

Purple Threads
Top Row: Duc Tran, Aviral Shrivastava (Sponsor), Gabriel Silva
Bottom Row: Nicholas Moulin, Craig Hartmann, Anthony Russo, Nadim Hoque

Project Description
This project is a real-world system of robots that visually demonstrates the benefits of parallelization for computer vision applications.

Motivation
Computer vision systems have a wide variety of applications, but they are very processor intensive.
Parallelization allows more advanced computer vision techniques to be used by removing the bottleneck on the processor.
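As a hedged illustration of that point rather than the team's code, the sketch below splits a per-frame color-threshold pass across worker threads so that no single core processes the whole frame; the function names, thread count, and HSV range are placeholders.

```cpp
// Sketch only: parallelize a per-frame color threshold by splitting the image
// into horizontal strips, one per worker thread. Assumes OpenCV and C++11.
#include <opencv2/opencv.hpp>
#include <functional>
#include <thread>
#include <vector>

// Threshold one horizontal strip of the HSV frame into the shared mask.
static void thresholdStrip(const cv::Mat& hsv, cv::Mat& mask, int rowBegin, int rowEnd)
{
    cv::Mat src = hsv.rowRange(rowBegin, rowEnd);
    cv::Mat dst = mask.rowRange(rowBegin, rowEnd);
    // Placeholder HSV range; a real system would tune this to the target's color.
    cv::inRange(src, cv::Scalar(5, 100, 100), cv::Scalar(25, 255, 255), dst);
}

cv::Mat thresholdParallel(const cv::Mat& bgrFrame, int numThreads = 4)
{
    cv::Mat hsv;
    cv::cvtColor(bgrFrame, hsv, cv::COLOR_BGR2HSV);
    cv::Mat mask(hsv.size(), CV_8UC1);

    // Each thread handles a disjoint band of rows, so no locking is needed.
    std::vector<std::thread> workers;
    int rowsPerThread = hsv.rows / numThreads;
    for (int i = 0; i < numThreads; ++i) {
        int begin = i * rowsPerThread;
        int end = (i == numThreads - 1) ? hsv.rows : begin + rowsPerThread;
        workers.emplace_back(thresholdStrip, std::cref(hsv), std::ref(mask), begin, end);
    }
    for (std::thread& t : workers) t.join();
    return mask;
}
```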

System Overview
The project consists of two different robots:
–Target Drone Platform
  Detect & track projectiles
  Avoid projectiles
–Turret System
  Detect & track the target
  Hit the target with foam darts

First Steps
Requirement Elicitation
–High Level
–Low Level
Hardware/Software Specification
Risk Management
Project Timeline
Budget
Configuring the Development Environment

Target Drone Platform
Hardware
–Traxxas Slash VXL*
–ArduPilot Mega w/ IMU Shield
–ION Intel Atom Motherboard*
–Microsoft Xbox 360 Kinect Sensor*
*Other names and brands may be claimed as the property of others.

Target Drone Platform
Software
–EMGU CV (OpenCV wrapper for C#)
–ArduRover code for the ArduPilot
–Microsoft Robotics Developer Studio 4*
–Kinect for Windows SDK v1.0*
*Other names and brands may be claimed as the property of others.

Target Drone Platform
Implementation
–Assemble target drone hardware
  Components
  Power system
–Set up wireless access to the drone
–Configure the drone software stack
–Implement platform services through RDS

Target Drone Platform
Implementation
–Obtain RGB & depth image data from the Kinect*
–Projectile detection
  Color threshold
–Projectile tracking
–Collision avoidance algorithm
*Other names and brands may be claimed as the property of others.
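The team's implementation used the EMGU CV wrapper in C#; as a rough C++/OpenCV sketch of the projectile-tracking step only, the fragment below takes the binary mask produced by the color threshold and reports the centroid of the largest remaining blob. All function names and parameters here are illustrative, not the team's code.

```cpp
// Sketch: given a binary mask from the color threshold, track the projectile
// by taking the centroid of the largest connected blob in the mask.
#include <opencv2/opencv.hpp>
#include <algorithm>
#include <vector>

bool findProjectileCentroid(const cv::Mat& mask, cv::Point2f& centroid)
{
    cv::Mat work = mask.clone();  // findContours may modify its input
    std::vector<std::vector<cv::Point>> contours;
    cv::findContours(work, contours, cv::RETR_EXTERNAL, cv::CHAIN_APPROX_SIMPLE);
    if (contours.empty())
        return false;

    // Take the largest blob as the dart; a foam dart covers very few pixels,
    // which is exactly the tracking difficulty noted under "Setbacks" below.
    auto largest = std::max_element(contours.begin(), contours.end(),
        [](const std::vector<cv::Point>& a, const std::vector<cv::Point>& b) {
            return cv::contourArea(a) < cv::contourArea(b);
        });

    cv::Moments m = cv::moments(*largest);
    if (m.m00 <= 0.0)
        return false;
    centroid = cv::Point2f(static_cast<float>(m.m10 / m.m00),
                           static_cast<float>(m.m01 / m.m00));
    return true;
}
```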

Target Drone Platform
Setbacks
–Foam dart too small to track accurately
–Hardware too heavy for the original shocks
–Depth frame coordinates are offset from RGB frame coordinates, with no translation function
–Dr. Shrivastava crashed the car into a tree!

Turret System
Hardware
–USB Missile Turret
–Microsoft Xbox 360 Kinect Sensor*
–Arduino Uno
–Servos
Software
–OpenCV
–libfreenect
–Ubuntu 10.04*
*Other names and brands may be claimed as the property of others.
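The Arduino Uno is what drives the turret's servos (see "Program the Arduino to control the servos" on the next slide). Below is a minimal sketch of that piece, assuming pan/tilt angles arrive over USB serial as two bytes; the pin numbers and the one-byte-per-axis protocol are placeholders, not the team's actual firmware.

```cpp
// Minimal Arduino sketch (assumed protocol): read a pan byte and a tilt byte
// over serial and move the two turret servos accordingly.
#include <Servo.h>

Servo panServo;
Servo tiltServo;

void setup() {
    Serial.begin(9600);
    panServo.attach(9);    // placeholder pins
    tiltServo.attach(10);
}

void loop() {
    if (Serial.available() >= 2) {
        int pan  = Serial.read();   // 0-180 degrees
        int tilt = Serial.read();   // 0-180 degrees
        panServo.write(constrain(pan, 0, 180));
        tiltServo.write(constrain(tilt, 0, 180));
    }
}
```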

Turret System
Implementation
–Obtain RGB/depth images from the Kinect* sensor
–Object detection
  Color threshold
  Cascade classification
–C USB turret driver
–Program the Arduino to control the servos
–C NetDuino driver
*Other names and brands may be claimed as the property of others.
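For the cascade-classification branch of object detection, the sketch below shows one plausible shape of that step using OpenCV's CascadeClassifier; the cascade file name and the detection parameters are assumptions, not the team's trained model or tuning.

```cpp
// Sketch: detect targets in the turret's RGB frame with a trained cascade.
#include <opencv2/opencv.hpp>
#include <vector>

std::vector<cv::Rect> detectTargets(const cv::Mat& bgrFrame)
{
    // Loaded once; "target_cascade.xml" is a placeholder for a trained cascade.
    static cv::CascadeClassifier cascade("target_cascade.xml");

    cv::Mat gray;
    cv::cvtColor(bgrFrame, gray, cv::COLOR_BGR2GRAY);
    cv::equalizeHist(gray, gray);

    std::vector<cv::Rect> hits;
    // scaleFactor, minNeighbors, and minSize are typical defaults, not the team's values.
    cascade.detectMultiScale(gray, hits, 1.1, 3, 0, cv::Size(30, 30));
    return hits;
}
```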

Turret System
Implementation
–Equations created to track the target
  Depth: disparity to depth (ft)
    d_feet = (disparity - 824.8) / 15.2
    d_feet = (0.1236 * tan(disparity / 2842.5 + 1.1863)) * 3.28
  Pixels per foot, depending on depth
    Pixels/Foot = … * log(d_feet)
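The same conversions written out as a small sketch: the linear fit is taken directly from the slide, while the 1.1863 term inside the tangent and the 3.2808 meters-to-feet factor in the second form are assumptions based on the commonly cited libfreenect disparity-to-depth approximation.

```cpp
// Sketch: two ways to convert a Kinect raw disparity value to distance in feet.
#include <cmath>

double disparityToFeetLinear(double disparity)
{
    return (disparity - 824.8) / 15.2;            // linear fit from the slide
}

double disparityToFeetTan(double disparity)
{
    // Widely used libfreenect approximation, which yields meters (assumed constants).
    double meters = 0.1236 * std::tan(disparity / 2842.5 + 1.1863);
    return meters * 3.2808;                        // meters to feet
}
```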

Turret System
Setbacks
–Original turret configuration not precise enough
–Unable to implement the libfreenect driver on the PS3*
–Turret caught on fire
–Turret mounting unstable
*Other names and brands may be claimed as the property of others.

Turret System
Turret Tracking and Hitting a Moving Car

Turret System
Effect of Parallelization on Turret System

Final Status

Questions?

Backup Slides

Turret System
Implementation
–Equations to track the target
  Horizontal rotation
  Vertical rotation
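As a hedged stand-in for these rotation equations (not the team's actual derivation), the sketch below converts the target's pixel offset from the image center into pan and tilt angles using the Kinect's nominal field of view, roughly 57 degrees horizontal and 43 degrees vertical for the 640x480 stream; both FOV values and the linear pixel-to-angle mapping are assumptions.

```cpp
// Sketch: map a target's pixel position to pan/tilt angles relative to the camera axis.
#include <utility>

// Returns {panDeg, tiltDeg} for a target at pixel (px, py) in a width x height frame.
std::pair<double, double> pixelOffsetToAngles(double px, double py,
                                              int width = 640, int height = 480)
{
    const double hFovDeg = 57.0;   // nominal Kinect v1 horizontal FOV (assumption)
    const double vFovDeg = 43.0;   // nominal Kinect v1 vertical FOV (assumption)

    // Degrees per pixel times offset from the image center (linear approximation).
    double panDeg  = (px - width  / 2.0) * (hFovDeg / width);
    double tiltDeg = (height / 2.0 - py) * (vFovDeg / height);
    return {panDeg, tiltDeg};
}
```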