Tracking Migratory Birds Around Large Structures, by Arik Brooks and Nicholas Patrick. Senior Design Project 2003-2004, Bradley University, Department of Electrical and Computer Engineering.


Tracking Migratory Birds Around Large Structures
by Arik Brooks and Nicholas Patrick
Senior Design Project 2003-2004
Bradley University, Department of Electrical and Computer Engineering

Outline
1. Background
2. Project Summary
3. Previous Work
4. Detailed Description
   1. System Block Diagram
   2. Subsystems
   3. Modes of Operation
   4. Design Equations

Outline (continued)
5. Preliminary Design Work
6. Datasheet
7. Schedule
8. Standards/Patents
9. References
10. Equipment List

Background
Every year, many birds are killed when their migration path takes them near tall structures. These bird kills usually occur on overcast nights, and one widely accepted theory is that the birds are unwilling to leave the lighted area near a structure and end up colliding with it.

Project Summary
The purpose of this project is to implement a system that tracks, in real time, the trajectories of birds flying within the field of view of a pair of cameras mounted on a rotatable boom. The positions of the birds are determined using stereoscopic vision: the two cameras are mounted parallel to each other a known distance apart.

Project Summary
The system output is a display depicting a three-dimensional representation of the trajectories, along with data describing them. Inputs to the system include the position of the boom, images captured by the cameras, calibration information, and a confidence-level threshold.

Previous Work
Seniors Brian Crombie and Matt Zivney worked on a senior project in Spring 2003 with the goal of tracking birds around tall structures via stereoscopic imaging. They achieved basic object tracking in a laboratory environment, but with major limitations. The groundwork laid in their project (algorithms, design equations, software organization, etc.) will be used as the starting point for our system.

Detailed Description

System Block Diagram

Hardware Block Diagram

Subsystems
- Cameras
- Boom
- Frame Grabber
- PC
- Display and Interface

Camera Subsystem
The camera subsystem includes two cameras mounted in parallel a known distance apart, allowing objects to be located in space.
Inputs:
- Photons: images from the environment within the field of view of the cameras
- Synchronization signal: signal from an external source (the frame grabber) to coordinate the capturing of images
Outputs:
- Data: image data transmitted to the frame grabber
Operation in modes:
- The cameras capture images continuously

Boom Subsystem
The boom subsystem holds the cameras in parallel and is rotated by a stepper motor. The position of the boom is determined from the output of an encoder.
Inputs:
- Stepper motor control signal: rotates the boom in two directions
Outputs:
- Encoder output: signal to the PC used to determine the current angle of the boom
Operation in modes:
- The boom operates (changes position) only in Setup mode
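As a minimal illustration of the encoder-to-angle conversion the boom subsystem relies on, here is a sketch in Python. The encoder resolution (`counts_per_rev`) is an assumed value for illustration; the slides do not specify the actual encoder used.

```python
def boom_angle_deg(encoder_counts, counts_per_rev=4096):
    """Convert accumulated encoder counts to a boom angle in degrees.

    counts_per_rev is a hypothetical resolution, not a value from
    the project; a real system would use the datasheet value for
    its encoder and account for the index (home) position.
    """
    return (encoder_counts % counts_per_rev) * 360.0 / counts_per_rev
```

For example, a quarter revolution of a 4096-count encoder reads back as 90 degrees regardless of how many full turns preceded it.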

Frame Grabber Subsystem
The frame grabber simultaneously captures images from both cameras and supplies the data to the PC.
Inputs:
- Data: image data from the cameras
- Setup: configuration information from the PC
Outputs:
- Image data to the PC
- Synchronization signal: signal to the cameras to coordinate the capture of images
Operation in modes:
- The frame grabber operates continuously along with the cameras

PC Subsystem
Inputs:
- Image data: arrays of intensity information from the frame grabber representing the collected images
- Encoder: angle information from the boom encoder
- Desired boom position: input from the user
- Real-time/Delay: user input determining whether or not to calculate and display the trajectory information in real time
- Calibration input: calibration data for the cameras being used
- Confidence level: user-defined level of non-linearity in trajectories allowable for consideration

PC Subsystem (continued)
Outputs:
- Display: trajectories displayed in a three-dimensional representation with a graphical user interface
- Statistics: pertinent information about the objects' locations and trajectories (e.g., number of birds within x distance of the cameras, maximum velocity, etc.)
- Raw data: data file containing all position data for later analysis
Operation in modes:
- The PC operates continuously in every mode

Display and Interface Subsystem
The trajectories will be displayed on a standard computer monitor. The user will interface with the system using a standard computer keyboard and mouse.
Inputs:
- Display information
- User inputs
Outputs:
- Image display
- User data
Operation in modes:
- The display and interface are used in Setup and Display modes

Modes of Operation
- Setup
- Monitoring
- Data Acquisition
- Display and Computation

Setup Mode

Monitoring Mode

Data Acquisition Mode

Display and Computation Mode

Design Equations
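The equations on this slide were not preserved in the transcript. For the parallel two-camera geometry the project describes, the standard stereo triangulation relations are as follows; this is a generic textbook formulation, not necessarily the slide's exact equations. Here f is the focal length in pixels, B the baseline between the cameras, and (x_L, y) and (x_R, y) the image coordinates of a matched feature in the left and right cameras.

```latex
\begin{align*}
d &= x_L - x_R          && \text{disparity between matched features}\\
Z &= \frac{f\,B}{d}     && \text{depth along the optical axis}\\
X &= \frac{Z\,x_L}{f},
\quad
Y = \frac{Z\,y}{f}      && \text{lateral and vertical position}
\end{align*}
```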

Preliminary Design Work
Based on preliminary work performed in the laboratory, it was determined that a better method of transient object correlation is needed to track a large number of objects at one time. When objects cross paths or pass close to each other, the current transient correlation algorithm fails to differentiate between them accurately, and errors occur.

Preliminary Design Work

The basic flow of the software to be designed, including improved organization and a better correlation method, was determined.
Preprocessing:
- Read in the image; record the initial time stamp and the time between frame grabs
- Discard areas that are not within the field of view of both cameras
- Perform a background subtraction to extract moving objects
- Threshold and convert each image to black and white
- Apply filters
- Find the areas/centroids of all objects
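The background-subtraction, thresholding, and centroid steps above can be sketched as follows. This is an illustrative Python toy (the project planned its implementation in MATLAB and C++): frames are plain lists of lists of pixel intensities, and all foreground pixels are treated as a single blob, with no connected-component labeling or filtering.

```python
def preprocess(frame, background, thresh):
    """Sketch of the preprocessing pipeline: subtract the background,
    threshold to a binary (black/white) image, then compute the area
    and centroid of the foreground pixels."""
    h, w = len(frame), len(frame[0])
    # Background subtraction followed by a threshold -> binary image.
    binary = [[1 if abs(frame[r][c] - background[r][c]) > thresh else 0
               for c in range(w)] for r in range(h)]
    # Area and centroid of all foreground pixels (single-blob sketch).
    pixels = [(r, c) for r in range(h) for c in range(w) if binary[r][c]]
    if not pixels:
        return binary, 0, None
    area = len(pixels)
    centroid = (sum(r for r, _ in pixels) / area,
                sum(c for _, c in pixels) / area)
    return binary, area, centroid
```

A real implementation would also mask regions outside the shared field of view and apply the noise filters the slide lists before extracting centroids.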

Preliminary Design Work
Correlation/Trajectory:
- Input the areas/centroids found in preprocessing
- Save the data for later use
- Find every "possible" 3-D position for the objects in the present frame (to be "possible", a pair must be within 30 pixels of each other in horizontal position between the two cameras)
- continued...

Preliminary Design Work
Correlation/Trajectory (continued):
- Search for the closest position to the predicted position, within the user-defined threshold, for each object based on its previous two locations
- Search for objects that were first detected in the previous frame, based on closest position and area within a threshold (different from the user-defined threshold)
- Correlate any remaining objects between the two cameras based on closest horizontal distance and area
- Calculate new predicted positions for any object with two or more data points in time
- Display
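The prediction-and-nearest-match step described above can be sketched as a constant-velocity predictor plus greedy nearest-neighbor assignment. This is a hypothetical illustration, not the project's actual code; a full version would also implement the area-based and cross-camera fallback matches the slides describe.

```python
import math

def predict_next(p_two_ago, p_prev):
    """Constant-velocity prediction from an object's previous two
    positions: step forward by the last observed displacement."""
    return tuple(2 * a - b for a, b in zip(p_prev, p_two_ago))

def match_detections(tracks, detections, threshold):
    """Greedy nearest-neighbor matching: each track claims the
    unclaimed detection closest to its predicted position, if that
    distance is within the user-defined threshold.

    tracks: {track_id: (pos_two_frames_ago, pos_previous_frame)}
    detections: list of positions observed in the current frame
    """
    assignments = {}
    unclaimed = set(range(len(detections)))
    for track_id, (p_two_ago, p_prev) in tracks.items():
        predicted = predict_next(p_two_ago, p_prev)
        best, best_dist = None, threshold
        for i in unclaimed:
            dist = math.dist(predicted, detections[i])
            if dist < best_dist:
                best, best_dist = i, dist
        if best is not None:
            assignments[track_id] = best
            unclaimed.discard(best)
    return assignments  # detections left unclaimed would seed new tracks
```

Greedy assignment is the simplest choice; when two birds compete for the same detection (the crossing-paths failure mode noted earlier), a globally optimal assignment would be more robust.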

Datasheet
- Average Migratory Bird Size (AMBS): TBD
- Max # of objects tracked simultaneously: TBD
- Max distance from cameras: TBD
- Min distance from cameras: TBD
- Max location error: TBD
- Light level sensitivity:
  - Lab cameras: 0.22 Lux
  - Low-light cameras: Lux
- Max frame rate: TBD
- System latency: TBD
- Max trackable bird speed: TBD
- Total volume of space observed: TBD
- Boom rotation step resolution: TBD

Test Plan
There will be four primary test procedures performed to verify the system specifications:
- Location accuracy: track an AMBS object along known trajectories (including trajectories proceeding primarily toward and away from the cameras) and compare the measured and actual locations
- Max/min distance from cameras: track an AMBS object along known trajectories and check the accuracy/ability to track
- Max # of objects: TBD
- Contrast resolution: track objects of various known intensities in front of a variety of backgrounds
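The location-accuracy and max-distance tests above are ultimately bounded by disparity quantization: differentiating the standard stereo depth relation z = f*B/d shows the depth error growing with the square of the range. A small sketch, with the caveat that the focal length, baseline, and disparity error below are made-up illustrative values, not the project's parameters:

```python
def depth_error(z, focal_px, baseline_m, disparity_err_px=1.0):
    """Approximate depth uncertainty at range z for a parallel
    stereo rig, from differentiating z = f*B/d:
    dz ~= z**2 * dd / (f * B)."""
    return z * z * disparity_err_px / (focal_px * baseline_m)
```

For example, with an assumed 1000-pixel focal length and a 0.5 m baseline, a one-pixel disparity error costs about 0.2 m at 10 m range but 0.8 m at 20 m, which is why the far-range tracking tests deserve particular attention.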

Schedule
- 1/22: Research/develop algorithms to improve tracking and correlation; determine the final output to the user and the layout of the user interface (Both)
- 1/29: Implement final preprocessing code in C++ (Nick); implement improved algorithms in MATLAB for testing (Arik)
- 2/5: Continued (Both)
- 2/12: Continued (Both)
- 2/19: Integrate new cameras into the system (Nick); port MATLAB code to C++ (Arik)
- 2/26: Develop the graphical user interface for the system and continue other software development (Both)

Schedule (continued)
- 3/4: Continued (Both)
- 3/11: Test system in a near-real environment (Both)
- 3/18: Attend wet T-shirt contest in Cancun (Both)
- 3/25: Develop and implement the final boom system and stepper motor (Both)
- 4/1: Continued; create test plan and final specifications (Both)
- 4/8: Test system (Both)
- 4/15: Continued; make any necessary changes; prepare for Expo presentation (Both)
- 4/22: Prepare final report and presentation (Both)
- 5/6: Give presentation (Both)

Standards
There are no overarching standards that apply to bird tracking, but several standards are used to interface the cameras to the PC.
NTSC:
- The cameras selected produce NTSC-compatible signals, the standard in North America
- The frame grabber converts NTSC inputs to digital images
DirectX:
- DirectX is a de facto standard for Microsoft Windows that includes a programming interface to video capture devices such as frame grabbers
- DirectX was chosen over proprietary APIs to maintain a maximum amount of hardware independence

Patents
- Patent #6,366,691: Stereoscopic image processing apparatus and method
- Patent #6,028,954: Method and apparatus for three-dimensional position measurement
- Patent #6,035,067: Apparatus for tracking objects in video sequences and methods therefor
- Patent #5,812,269: Triangulation-based 3-D imaging and processing method and system

References
- Pinhole camera model and image processing reference; equations relating focal length to zoom
- Light levels for various times of day and weather conditions
- Estimating position when synchronized cameras are not available; using line-lock cameras
- Equation relating focal length to target object size, distance, and CCD width; measurements for various CCD sizes
- Project proposal from the previous group
- Chen, Tieh-Yuh; Bovik, Alan Conrad; Cormack, Lawrence K. "Stereoscopic Ranging by Matching Image Modulations," IEEE Transactions on Image Processing, vol. 8, no. 6, June 1999.

Equipment List
Cameras and lenses:
- Lab: Sanyo VCB-3444 with Rainbow L8DC4P auto-iris lens
- Low light: Hitachi KP-200E ($920) with DV10x7.5A-SA2 auto-iris lens ($273)

Equipment List
Video capture card:
- Data Translation DT3132 dual frame grabber
  - Supports simultaneous acquisition of images from two sources
  - Programmable through DirectX

Equipment List
PC:
- Windows 2000 or later OS
- DirectX 8.1 or later installed
- One PCI slot for the frame grabber
- Enough processing power for real-time operation
- Development software:
  - DirectX 8.1 SDK
  - Microsoft Visual Studio 6.0
  - MATLAB 6.5 with Image Processing Toolbox

Tracking Migratory Birds Around Large Structures Questions?