Real-Time Object Tracking System. Adam Rossi, Meaghan Zorij. 12-04-03.


Project Objectives
– Identify a user-selected object of specific size, shape, and color in the detectable region; once the object is found, it will be tracked in real time
– Desired object color is selectable by the user
– Tracking can be started / stopped at any time by the user
– Laser directed at the target object while tracking
– Real-time video stream displayed with the target object identified
– Coordinates of the locked-on object relative to the platform displayed

Specifications
– Calibration must be completed prior to starting tracking (first time only)
– Detectable region spans 150°, from 1 foot to 10 feet from the base platform
– Maximum of 3 objects of different colors within the detectable region
  – Each object must be square and either red, green, or blue
  – Objects can vary in size from 1 to 2 feet square
  – Objects must be separated by at least 1 foot

Specifications (cont’d)

– The object must remain stationary until it has been located (as indicated when the laser turns on)
– Once located, the object can be moved in the horizontal direction at a maximum speed of 1 foot per second

Operational Overview
Steps of operation:
1. Calibrate system
2. Select target object’s color
3. Start tracking
4. Locate object
5. Track in real time
6. Stop tracking
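The steps above can be sketched as a simple control sequence. `TrackerStub` and all of its method names are illustrative placeholders, not the project’s actual interfaces:

```python
class TrackerStub:
    """Minimal stand-in for the real controller, which would drive the
    camera, ultrasonic sensor, servo, and laser."""
    def __init__(self):
        self.log = []
        self.frames_left = 3  # pretend tracking runs for 3 updates

    def calibrate(self): self.log.append("calibrate")
    def select_color(self): self.log.append("select"); return "red"
    def start_tracking(self): self.log.append("start")
    def locate(self, color): self.log.append("locate"); return (0, 0)
    def track(self, pos): self.log.append("track"); return pos
    def stop_tracking(self): self.log.append("stop")

    @property
    def tracking(self):
        self.frames_left -= 1
        return self.frames_left >= 0

def run(system):
    system.calibrate()             # one-time, first run only
    color = system.select_color()  # user picks red / green / blue
    system.start_tracking()
    pos = system.locate(color)     # object stationary until laser turns on
    while system.tracking:
        pos = system.track(pos)    # real-time updates while object moves
    system.stop_tracking()
    return system.log
```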

Analytical Components
5 main components:
– Graphical User Interface: user interface; also includes calibration, configuration, and diagnostic interfaces
– Controller: the “smarts” of the system; performs calculations of object location based on information from the camera and ultrasonic sensor
– CCD Camera: locates target object candidates
– Ultrasonic Linear Position Sensor: determines distances of objects
– Laser: indicates tracking of object

Analytical Components (cont’d)

Graphical User Interface
– Live video stream with target object indicated
– User-selectable object
– Coordinates of object
– Diagnostics
– Calibration

Main GUI

Diagnostics GUI

Controller
– Interfaces the ultrasonic sensor / laser system and the CCD camera system
– Determines the target object from the GUI setting
– Provides information about the target object to the GUI while tracking
– Calculates coordinates of the object using information from the ultrasonic sensor and CCD camera system
– Directs laser position based upon analysis of data from camera and sensor
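One plausible form of the controller’s coordinate calculation is converting the platform’s pan angle plus the ultrasonic range reading into Cartesian coordinates. The function name and angle convention below are assumptions for illustration; the slides do not specify them:

```python
import math

def object_coordinates(pan_angle_deg, distance_ft):
    """Convert pan angle and ultrasonic range into (x, y) in feet,
    relative to the platform. Assumed convention: 0° is straight
    ahead, positive angles sweep to the right."""
    theta = math.radians(pan_angle_deg)
    x = distance_ft * math.sin(theta)  # lateral offset
    y = distance_ft * math.cos(theta)  # forward distance
    return x, y

x, y = object_coordinates(30.0, 4.0)
print(round(x, 2), round(y, 2))  # → 2.0 3.46
```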

CCD Camera
– Creative Labs Webcam Pro (OV511+ chipset), 640x480 (native)
– Interface camera using the “Video for Linux” API
– Object detection algorithm
  – Color segmentation
  – Template matching / edge detection
– Provides the controller with each detected object’s coordinates, size, and color
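Since the specification limits objects to red, green, or blue, the color-segmentation step can be quite simple. A minimal sketch, assuming a dominant-channel rule over RGB pixels; the project’s actual algorithm and thresholds are not given in the slides:

```python
def classify_pixel(r, g, b, margin=40):
    """Label a pixel red/green/blue when one channel dominates the
    other two by at least `margin` (an arbitrary illustrative
    threshold); otherwise treat it as background."""
    channels = {"red": r, "green": g, "blue": b}
    best = max(channels, key=channels.get)
    others = [v for name, v in channels.items() if name != best]
    if channels[best] - max(others) >= margin:
        return best
    return None  # background / ambiguous

print(classify_pixel(200, 60, 50))   # → red
print(classify_pixel(90, 80, 100))   # → None (no channel dominates)
```

Connected regions of same-labeled pixels would then be passed to the template-matching / edge-detection stage to confirm the square shape.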

Ultrasonic Linear Position Sensor
– Polaroid Instrumentation Package (from Acroname)
  – Polaroid 6500 Series Sonar Ranging Module (eases interfacing to the transducer)
  – Polaroid Electrostatic Transducer
– Specifications
  – Distance measurements from 6 inches to 35 feet
  – Accuracy of +/- 1% over the range of operation
  – Multiple-object detection when objects are separated by at least 3 inches

Ultrasonic Linear Position Sensor (cont’d)
– Provides distance calculations to the controller (maximum of three measurements per position)
– Interfaces to the HC12 with little additional circuitry
  – Only a pull-up resistor is needed on the ECHO input into the HC12 (recommended value is 4.7 kΩ)
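The ranging module reports a round-trip echo time, from which distance follows directly. This sketch assumes sound travels at roughly 1125 ft/s at room temperature; the constant used in the actual firmware is not given in the slides:

```python
# Approximate speed of sound at room temperature; varies with
# temperature, which is one source of ranging error.
SPEED_OF_SOUND_FT_S = 1125.0

def distance_ft(echo_time_s):
    """Range from round-trip echo time: the pulse travels out and
    back, so divide the total path by two."""
    return echo_time_s * SPEED_OF_SOUND_FT_S / 2.0

# A target about 10 ft away echoes after roughly 17.8 ms:
print(round(distance_ft(0.01778), 2))  # → 10.0
```

Sound covers about 1.1 ft per millisecond, so microsecond-scale timer resolution on the HC12 keeps timing quantization error well below the sensor’s stated +/- 1% accuracy.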

Laser
– Receives target object position from the controller
– Turned on when aligned with the target object

Test Strategies
– Unit tests of each subsystem
– Diagnostics: controller tests each subsystem
– Demonstrate real-time tracking capability
  – One object
  – Multiple objects
– Demonstrate tracking of objects at speeds up to 1 foot/sec

Difficulties
– Ultrasonic sensor
  – Specifications provided by the manufacturer may not be accurate
  – Microprocessor timing may not be precise enough to accurately measure distance
– Camera
  – Achieving real-time video processing
  – Locating the target object accurately
– Accuracy of positioning the laser to track the object

Division of Work
Adam
– Camera subsystem
– Controller
– GUI
Meaghan
– Sensor subsystem
– Laser subsystem
– Servo operation

Work Schedule

Costs

Questions