Instantaneous Geo-location of Multiple Targets from Monocular Airborne Video.

Tracking algorithm overview
The optical-flow (OF) module computes a differential optical flow, which separates the camera's own motion (airborne imagery) from the motion of the objects observed by that camera. The second phase is a module that performs the actual tracking of the objects segmented in Phase 1.
Fig 3: Algorithm flow chart (Start → read images → Phase 1: found ROI? → Phase 2: confidence check → low: return to Phase 1; high: geo-location)
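The two-phase loop in the flow chart can be sketched as follows. The segmentation, tracking, and geo-location calls are toy stand-ins (the real modules are described on the slides that follow); only the control flow mirrors the chart.

```python
def segment_moving_target(frame):
    """Phase 1 stand-in: return an ROI if a moving target is present."""
    return frame.get("roi")                     # None when no ROI is found

def update_track(track, frame):
    """Phase 2 stand-in: advance the track and report a confidence index."""
    return track, frame.get("confidence", 1.0)

def geolocate(track):
    """Stand-in for the one-shot geo-location step."""
    return ("geo", track)

def run_pipeline(frames, conf_threshold=0.5):
    track, locations = None, []
    for frame in frames:
        if track is None:                       # Phase 1: read images, find ROI
            roi = segment_moving_target(frame)
            if roi is not None:
                track = roi                     # hand the ROI to Phase 2
            continue
        track, confidence = update_track(track, frame)
        if confidence < conf_threshold:         # low confidence: restart Phase 1
            track = None
        else:                                   # high confidence: geo-locate
            locations.append(geolocate(track))
    return locations

frames = [{"roi": None}, {"roi": (10, 10, 32, 32)},
          {"confidence": 0.9}, {"confidence": 0.2}, {"roi": (12, 12, 32, 32)}]
print(run_pipeline(frames))   # one geo-location before the low-confidence reset
```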

Optical flow based target segmentation
1. Extract OF on feature points
2. Analyze histogram of OF
3. Identify BG flow
4. Subtract BG flow from the entire OF
5. Segment out the dominating target
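A minimal numpy sketch of these steps: the background (BG) flow is taken as the mode of a coarse 2-D histogram of the per-feature flow vectors, and features with a large residual after subtracting it are kept as the moving target. The bin count and threshold are illustrative, not values from the paper.

```python
import numpy as np

def segment_by_flow(flow, residual_thresh=2.0, bins=16):
    """flow: (N, 2) array of optical-flow vectors at feature points."""
    # Step 2: coarse 2-D histogram of the flow vectors
    hist, u_edges, v_edges = np.histogram2d(flow[:, 0], flow[:, 1], bins=bins)
    # Step 3: BG flow = center of the most populated bin
    iu, iv = np.unravel_index(np.argmax(hist), hist.shape)
    bg = np.array([(u_edges[iu] + u_edges[iu + 1]) / 2,
                   (v_edges[iv] + v_edges[iv + 1]) / 2])
    # Step 4: subtract the BG flow from the entire OF field
    residual = flow - bg
    # Step 5: the dominating target = features with large residual motion
    return np.linalg.norm(residual, axis=1) > residual_thresh

rng = np.random.default_rng(0)
bg_flow = rng.normal([5.0, -1.0], 0.2, size=(90, 2))    # camera-induced flow
tgt_flow = rng.normal([12.0, 4.0], 0.2, size=(10, 2))   # independently moving target
mask = segment_by_flow(np.vstack([bg_flow, tgt_flow]))
print(int(mask.sum()))   # number of features flagged as target
```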

Tracking module and thread expansion
The algorithm then processes the region of interest and detects as many features as possible with the KLT feature tracker. If the confidence index drops below a predetermined level, Phase 2 aborts, Phase 1 is invoked again, and the process restarts from the beginning. The tracking module can perform multi-target tracking through a thread-and-semaphore coding scheme (Thread 1, Thread 2).
Fig 4: Phase 2 expanded to two different targets
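An illustrative version of the thread-and-semaphore scheme: one worker thread per tracked target, with a semaphore bounding how many trackers run concurrently. The per-frame tracking work is a stand-in counter, not real KLT.

```python
import threading

MAX_CONCURRENT_TRACKERS = 2
tracker_slots = threading.BoundedSemaphore(MAX_CONCURRENT_TRACKERS)
results = {}
results_lock = threading.Lock()

def track_target(target_id, n_frames):
    with tracker_slots:                 # wait for a free tracking slot
        processed = 0
        for _ in range(n_frames):       # stand-in for per-frame KLT updates
            processed += 1
        with results_lock:              # record this thread's result safely
            results[target_id] = processed

threads = [threading.Thread(target=track_target, args=(tid, 100))
           for tid in ("target_1", "target_2", "target_3")]
for t in threads:
    t.start()
for t in threads:
    t.join()
print(sorted(results.items()))
```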

Tracking demo on ground vehicles

One shot target localization
Figure: camera geometry — image plane, inertial frame axes (x, Y, Z), camera focal point, target point Pt, normal point Pn, ground point Po, and camera altitude h.
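A minimal flat-ground version of one-shot localization from this geometry: back-project the target's pixel through the camera focal point and intersect the ray with the ground plane at height 0, given the camera position (including altitude h) and orientation. The intrinsics and pose below are illustrative values, not the paper's calibration.

```python
import numpy as np

def geolocate(pixel, K, R, cam_pos):
    """Intersect the back-projected pixel ray with the ground plane z = 0.
    K: 3x3 intrinsics; R: camera-to-inertial rotation; cam_pos: (x, y, h)."""
    ray_cam = np.linalg.inv(K) @ np.array([pixel[0], pixel[1], 1.0])
    ray = R @ ray_cam                  # ray direction in the inertial frame
    t = -cam_pos[2] / ray[2]           # scale so the ray reaches z = 0
    return cam_pos + t * ray           # ground intersection point Po

K = np.array([[800.0, 0, 320], [0, 800.0, 240], [0, 0, 1]])  # toy intrinsics
R = np.diag([1.0, -1.0, -1.0])        # camera looking straight down
cam_pos = np.array([0.0, 0.0, 100.0])  # UAV at altitude h = 100 m
print(geolocate((320, 240), K, R, cam_pos))   # nadir pixel maps to (0, 0, 0)
```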

UAV altitude estimation using a ground feature point
Figure 6: Feature points available on the ground.
Figure 7: Altitude estimation using stereo camera positions (inertial frame x, Y, Z; camera poses at times t and t + dt; altitude h; ground feature point P).
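The idea can be sketched in a simplified 2-D form, assuming flat ground and level flight (assumptions of this sketch, not necessarily of the paper): two views of the same ground feature, separated by a known baseline, triangulate the altitude as h = d / (cot θ1 − cot θ2), where θ1 and θ2 are the depression angles to the feature.

```python
import math

def altitude_from_two_views(baseline, theta1, theta2):
    """baseline: horizontal distance flown between the two shots (m);
    theta1, theta2: depression angles (rad) to the same ground feature.
    Flat ground, level flight: h = d / (cot th1 - cot th2)."""
    return baseline / (1 / math.tan(theta1) - 1 / math.tan(theta2))

# Synthetic check: UAV at h = 100 m, feature 300 m ahead at the first shot,
# 250 m ahead after flying a 50 m baseline.
t1 = math.atan2(100, 300)
t2 = math.atan2(100, 250)
print(altitude_from_two_views(50, t1, t2))   # recovers h = 100 m
```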

Particle filtering
The output of the target localization algorithm needs refinement, since it depends only on noisy sensor measurements (GPS/IMU). Particle filtering is a nonlinear Bayesian tracking method that recursively estimates the current state from the previous states. Given a set of particles, the algorithm repeatedly alternates prediction steps and measurement steps based on the current observation.

Particle filtering
Update step: the measurement is made by comparing the current observation vector with the projected observation vector. The observation vector is defined as z = (u, v, h), where u and v are the target position in the image coordinate frame and h is the altitude of the camera.
Prediction step: the target is assumed to obey the following dynamic model.
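A toy particle filter over the (u, v, h) observation described above. Since the slide's dynamic model is not reproduced here, a random-walk model stands in for it; the update step weights particles by a Gaussian likelihood on z = (u, v, h) and resamples. All noise parameters are illustrative.

```python
import numpy as np

rng = np.random.default_rng(1)
N = 2000
particles = rng.normal([0.0, 0.0, 100.0], [20.0, 20.0, 10.0], size=(N, 3))

def predict(particles, motion_std=(2.0, 2.0, 0.5)):
    """Prediction step: propagate particles through a random-walk model."""
    return particles + rng.normal(0.0, motion_std, size=particles.shape)

def update(particles, z, meas_std=(3.0, 3.0, 1.0)):
    """Update step: weight by Gaussian likelihood of z = (u, v, h), resample."""
    w = np.exp(-0.5 * np.sum(((particles - z) / meas_std) ** 2, axis=1))
    w /= w.sum()
    idx = rng.choice(len(particles), size=len(particles), p=w)
    return particles[idx]

z_true = np.array([15.0, -8.0, 102.0])
for _ in range(10):                 # repeat predict/update on a fixed observation
    particles = update(predict(particles), z_true)
estimate = particles.mean(axis=0)
print(np.round(estimate))           # particle mean converges near z_true
```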