Babol University of Technology, Presentation: Alireza Asvadi


Object Tracking. Machine Vision course, Prof. M. Ezoji. Presentation: Alireza Asvadi, ECE Dept., Babol University of Technology, Winter 2012.

What is tracking? Estimating the trajectory of an object over time by locating its position in every frame. “Tracking by detection” vs. “tracking with dynamics”: In tracking by detection, we have a strong model of the object, detect it independently in each frame, and record its position over time. In tracking with dynamics, we use the object position estimated from measurements but also incorporate the position predicted by a dynamics model.

Tracking by detection (illustrated with frames at time t, t+1, ..., t+n): 1. Detect the object independently in each frame. 2. Link up the detected instances to form a track. Ref: Master's thesis, Alireza Asvadi, “Object Tracking from Video Sequence (using color and texture information and RBF Networks)”.

Methods for object detection: point detectors; template- and density-based appearance models; background modeling / background subtraction; shape models; other methods (supervised classifiers, contour evolution, ...). Ref: Master's thesis, Alireza Asvadi, “Object Tracking from Video Sequence (using color and texture information and RBF Networks)”.

Point detectors: Point detectors are used to find interest points in images, which have an expressive texture in their respective localities. The object is represented by points. Commonly used interest point detectors include the Moravec, Harris, and SIFT detectors. KLT control flow: start; take a new image; select good features; track features; if features are lost, replace them; store the image; repeat while images remain; stop. Ref: Ankit Gupta (1999183), Vikas Nair (1999219), supervisor Prof. M. Balakrishnan, Electrical Engineering Department, IIT Delhi.
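A minimal sketch of the KLT loop above, assuming the MATLAB Computer Vision Toolbox and a hypothetical sequence of image files; detectHarrisFeatures plays the role of "select good features" and vision.PointTracker the role of "track features":

% Sketch of the KLT control flow above (assumes the Computer Vision Toolbox;
% the frame file names are hypothetical).
prev = rgb2gray(imread('frame_0001.png'));     % take first image
pts  = detectHarrisFeatures(prev);             % select good features
tracker = vision.PointTracker('MaxBidirectionalError', 2);
initialize(tracker, pts.Location, prev);
for k = 2:100                                  % while images remain
    frame = rgb2gray(imread(sprintf('frame_%04d.png', k)));
    [loc, valid] = step(tracker, frame);       % track features into the new frame
    if nnz(valid) < 20                         % if too many features are lost, replace them
        pts = detectHarrisFeatures(frame);
        setPoints(tracker, pts.Location);
    end
end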

Ref: M. Sonka, V. Hlavac, R. Boyle, “Image Processing, Analysis, and Machine Vision,” 3rd Edition, 2008. Ref: A. Yilmaz, O. Javed, and M. Shah, “Object tracking: A survey,” ACM Computing Surveys, Vol. 38, No. 4, pp. 1–45, December 2006.

Template matching: (block diagram: template image and input image I(x,y) are correlated to produce an output image O(x,y)). Template matching is a brute-force method of searching the image for a region similar to the object template. The position of the template in the current image is computed with a similarity measure, for example: NCC = normalized cross-correlation, SSD = sum of squared differences, SAD = sum of absolute differences. Ref: Ankit Gupta (1999183), Vikas Nair (1999219), supervisor Prof. M. Balakrishnan, Electrical Engineering Department, IIT Delhi.

Template matching, NCC: the normalized cross-correlation between the template and an image section is
cor = sum_i (x_i - mean(x)) * (y_i - mean(y)) / sqrt( sum_i (x_i - mean(x))^2 * sum_i (y_i - mean(y))^2 )
where x_i are the gray levels of the template, mean(x) is the average gray level of the template, y_i are the gray levels of the source image section, mean(y) is the average gray level of that section, and the sums run over the N pixels of the section (N = template image size = columns * rows). The value cor lies between -1 and +1, with larger values indicating a stronger match between the two images. In MATLAB: C = normxcorr2(template, A). Affine transformations can be used to confirm a match. Ref: M. Sonka, V. Hlavac, R. Boyle, “Image Processing, Analysis, and Machine Vision,” 3rd Edition, 2008.

(Example: frame, template, and NCC response map.) In MATLAB, for a template TEMP and an image block IMGBLK of the same size: SSD = sum((TEMP(:) - IMGBLK(:)).^2); SAD = sum(abs(TEMP(:) - IMGBLK(:)));
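A minimal template-localization sketch built around the normxcorr2 call mentioned above; the file names are hypothetical:

% Locate a template in an image with normalized cross-correlation
% (hypothetical file names; the template must be smaller than the image).
A        = rgb2gray(imread('frame.png'));
template = rgb2gray(imread('template.png'));
C = normxcorr2(template, A);                   % correlation map, values in [-1, 1]
[ypeak, xpeak] = find(C == max(C(:)), 1);      % location of the strongest response
% normxcorr2 pads the result, so subtract the template size to get the
% top-left corner of the matched region in A.
yoff = ypeak - size(template, 1);
xoff = xpeak - size(template, 2);
bbox = [xoff + 1, yoff + 1, size(template, 2), size(template, 1)];   % [x y width height]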

Shape matching can be performed in a way similar to template-matching-based tracking. Shape models are usually in the form of edge maps. Ref: A. Yilmaz, O. Javed, and M. Shah, “Object tracking: A survey,” ACM Computing Surveys, Vol. 38, No. 4, pp. 1–45, December 2006.

Background subtraction: Background subtraction is often a good enough detector in applications where the background is known and all trackable objects look different from the background. Its most important limitation is the requirement of a stationary camera. Ref: M. Sonka, V. Hlavac, R. Boyle, “Image Processing, Analysis, and Machine Vision,” 3rd Edition, 2008.
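A minimal running-average background-subtraction sketch for a stationary camera; the frame file names, learning rate, and threshold below are illustrative assumptions, not values from the slides:

% Running-average background subtraction (hypothetical file names;
% alpha and thresh are assumed values).
bg     = double(rgb2gray(imread('frame_0001.png')));   % initialize background with the first frame
alpha  = 0.05;                                         % background learning rate
thresh = 30;                                           % foreground threshold in gray levels
for k = 2:100
    frame = double(rgb2gray(imread(sprintf('frame_%04d.png', k))));
    fg = abs(frame - bg) > thresh;             % pixels far from the background model are foreground
    bg = (1 - alpha) * bg + alpha * frame;     % slowly adapt the background to the current frame
    imshow(fg); drawnow;
end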

Supervised classifiers: Object detection can be performed by learning different object views automatically from a set of examples by means of a supervised learning mechanism. During testing, the classifier assigns a score to the test data indicating its degree of membership in the positive class. The maximum classification score over image regions estimates the position of the object. Ref: A. Yilmaz, O. Javed, and M. Shah, “Object tracking: A survey,” ACM Computing Surveys, Vol. 38, No. 4, pp. 1–45, December 2006.
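A sliding-window sketch of this idea; scoreFcn stands for a hypothetical, already-trained classifier (it is not defined in the slides), and the window size and stride are assumptions:

% Sliding-window detection sketch (scoreFcn is a hypothetical, already-trained
% classifier that maps a gray patch to a real-valued score).
I = rgb2gray(imread('frame.png'));             % hypothetical input frame
w = 32; h = 32; stride = 8;                    % assumed window size and step
best = -inf; bestPos = [1 1];
for y = 1:stride:(size(I, 1) - h + 1)
    for x = 1:stride:(size(I, 2) - w + 1)
        s = scoreFcn(I(y:y+h-1, x:x+w-1));     % classification score of this region
        if s > best
            best = s; bestPos = [x y];         % keep the maximum-score position
        end
    end
end
% bestPos estimates the object position; best is its classification score.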

See details in the Mean Shift slides.

Mean Shift algorithm:

Mean shift: widely used, with various enhancements (e.g., by Robert Collins):
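A minimal sketch of one flat-kernel mean-shift search, assuming a back-projection (weight) map W is already available with large values at pixels resembling the target histogram; the function name and window parameters are illustrative:

% One flat-kernel mean-shift search on a back-projection map W
% (function and variable names are illustrative).
function [cx, cy] = meanShiftWindow(W, cx, cy, w, h)
    for iter = 1:20
        x1 = max(1, round(cx - w/2));  x2 = min(size(W, 2), round(cx + w/2));
        y1 = max(1, round(cy - h/2));  y2 = min(size(W, 1), round(cy + h/2));
        patch = W(y1:y2, x1:x2);
        [X, Y] = meshgrid(x1:x2, y1:y2);
        m = sum(patch(:));
        if m == 0, break; end                  % no target-like pixels in the window
        ncx = sum(X(:) .* patch(:)) / m;       % weighted centroid = new window center
        ncy = sum(Y(:) .* patch(:)) / m;
        if hypot(ncx - cx, ncy - cy) < 0.5     % converged: the window barely moves
            cx = ncx; cy = ncy; break;
        end
        cx = ncx; cy = ncy;                    % shift the window and iterate
    end
end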

Problems with tracking-by-detection methods: occlusions (the target can be hidden for several frames) and similar-looking objects.

Problems with tracking-by-detection methods: the detector may also simply fail to detect the object (illustrated by the NCC map for the first frame in a template-matching example).

Tracking with dynamics: observation (detected object) + dynamics. Key idea: given a model of expected motion, predict where the object will occur in the next frame. Prediction vs. correction: if the observation model is too strong, tracking is reduced to repeated detection; if the dynamics model is too strong, we end up ignoring the data.

Filtering problem: estimate the state (c) by combining a prediction (a) with a measurement (b). In the Kalman filter, the prediction, the measurement, and the resulting estimate each correspond to a Gaussian distribution.

Kalman filter assumptions: the Kalman filter model assumes that the state of the system at time t evolves from the state at time t-1 according to a linear state-transition model, and that measurements of the system are taken according to a linear measurement model (notation chosen to match the reference). R. Faragher, “Understanding the Basis of the Kalman Filter Via a Simple and Intuitive Derivation,” IEEE Signal Processing Magazine, September 2012.
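Written out in the notation of Faragher's paper (the slide's own equations were lost in transcription):

x_t = F_t x_{t-1} + B_t u_t + w_t   (state evolution: F_t is the state-transition matrix, B_t the control matrix, u_t the control vector, and w_t is zero-mean Gaussian process noise)
z_t = H_t x_t + v_t                 (measurement model: H_t maps the state to the measurement, and v_t is zero-mean Gaussian measurement noise)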

The Kalman filter: state vector: position and velocity. Control information: the applied force. The relationship between the force applied during the time period Δt and the resulting position and velocity can be written in matrix form. R. Faragher, “Understanding the Basis of the Kalman Filter Via a Simple and Intuitive Derivation,” IEEE Signal Processing Magazine, September 2012.
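Written out (the slide's own equations were lost in transcription; this is the standard constant-acceleration kinematics for a mass m pushed by a force f over a time step Δt, as in Faragher's train example):

x_t = x_{t-1} + v_{t-1} Δt + (f_t/m) Δt^2 / 2
v_t = v_{t-1} + (f_t/m) Δt

In matrix form, with state [x_t; v_t]:

[ x_t ]   [ 1  Δt ] [ x_{t-1} ]   [ Δt^2/2 ]
[ v_t ] = [ 0   1 ] [ v_{t-1} ] + [   Δt   ] * (f_t/m)

i.e. F_t = [1 Δt; 0 1] and B_t u_t = [Δt^2/2; Δt] * f_t/m in the state-evolution equation above.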

The Kalman filter: the information from the predictions and the measurements is combined to provide the best possible estimate of the location of the train. Key fact: the product of two Gaussian functions is another Gaussian function. R. Faragher, “Understanding the Basis of the Kalman Filter Via a Simple and Intuitive Derivation,” IEEE Signal Processing Magazine, September 2012.
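A worked one-dimensional version of this fact, as derived in Faragher's paper: multiplying a predicted Gaussian with mean mu1 and variance sigma1^2 by a measured Gaussian with mean mu2 and variance sigma2^2 gives (up to normalization) a Gaussian with

mu_fused      = (mu1*sigma2^2 + mu2*sigma1^2) / (sigma1^2 + sigma2^2)
sigma_fused^2 = (sigma1^2 * sigma2^2) / (sigma1^2 + sigma2^2)

which can be rewritten as mu_fused = mu1 + K*(mu2 - mu1) and sigma_fused^2 = (1 - K)*sigma1^2 with K = sigma1^2 / (sigma1^2 + sigma2^2), the scalar form of the Kalman gain.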

The Kalman filter: R. Faragher, “Understanding the Basis of the Kalman Filter Via a Simple and Intuitive Derivation,” IEEE Signal Processing Magazine, September 2012.

The Kalman filter: R. Faragher, “Understanding the Basis of the Kalman Filter Via a Simple and Intuitive Derivation,” IEEE Signal Processing Magazine, September 2012.

The Kalman filter equations (read details in the references):
Predict (a priori): xhat_k^- = A xhat_{k-1} + B u_k is the best prediction prior to the measurement z_k, with error covariance P_k^- = A P_{k-1} A' + Q, where Q is the process noise covariance.
Update (a posteriori): the residual (z_k - H xhat_k^-) is weighted by the optimal Kalman gain K_k = P_k^- H' (H P_k^- H' + R)^{-1} to give the a posteriori estimate xhat_k = xhat_k^- + K_k (z_k - H xhat_k^-), with P_k = (I - K_k H) P_k^-.
G. Welch, G. Bishop, “An Introduction to the Kalman Filter,” 1995.
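A minimal MATLAB sketch of one predict/update cycle for a constant-velocity 2-D position tracker built from these equations; the time step and the noise covariances Q and R are illustrative values, not taken from the slides:

% One Kalman predict/update cycle for a constant-velocity model
% (dt, Q and R are assumed values).
dt = 1;                                            % time step between frames
A  = [1 0 dt 0; 0 1 0 dt; 0 0 1 0; 0 0 0 1];       % state transition for [x; y; vx; vy]
H  = [1 0 0 0; 0 1 0 0];                           % only the position is measured
Q  = 0.01 * eye(4);                                % process noise covariance
R  = 4 * eye(2);                                   % measurement noise covariance

xhat = [0; 0; 0; 0];  P = eye(4);                  % initial state estimate and covariance
z = [12.3; 7.8];                                   % detected position in this frame (example)

% Predict (a priori estimate)
xpred = A * xhat;
Ppred = A * P * A' + Q;

% Update (a posteriori estimate)
K    = Ppred * H' / (H * Ppred * H' + R);          % Kalman gain
xhat = xpred + K * (z - H * xpred);                % correct the prediction with the residual
P    = (eye(4) - K * H) * Ppred;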

The Kalman filter: if the measurement error is small (the observation model is very strong), tracking reduces to repeated detection; if the prediction error is small (the dynamics model is very strong), the filter ends up ignoring the data.

The Kalman filter: G. Welch, G. Bishop, “An Introduction to the Kalman Filter,” 1995.

The Kalman filter:

Problems: the result of applying the Kalman filter alone is still not good enough. The Kalman estimate is a combination of prediction and measurement, and so far we have used every measurement. What if we omit uninformative observations or highly unlikely measurements? That is data association.

Data association: so far, we have assumed the entire set of measurements to be relevant to determining the state. In reality, there may be uninformative measurements, measurements that did not originate from the target of interest, or measurements that belong to other tracked objects. Data association is the task of determining which measurements go with which tracks. A simple strategy: only pay attention to the measurement that is closest to the prediction.

Tracking = detection (observation) + dynamics + data association. Data association by gating: omit measurements outside the gate (say, a circle of radius 50 pixels around the prediction). (Figure: result of applying data association/gating; note the values predicted by the Kalman filter, shown in green.)
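A minimal gating sketch corresponding to this slide: keep only detections within the example gate radius of 50 pixels around the Kalman prediction, then associate the closest one; xpred and H reuse the names from the Kalman sketch above, and the detection list is made up for illustration:

% Gate the detections around the Kalman prediction, then associate the closest one.
pred = xpred(1:2);                     % predicted object position [x; y] from the filter above
Z    = [110 45; 118 52; 300 290]';     % candidate detections, one per column (made-up example)
gateRadius = 50;                       % gate radius in pixels (the slide's example value)

d      = sqrt(sum((Z - pred).^2, 1));  % distance of each detection from the prediction
inGate = d <= gateRadius;              % discard measurements outside the gate
if any(inGate)
    cand   = Z(:, inGate);
    [~, i] = min(d(inGate));
    z = cand(:, i);                    % associated measurement: closest detection inside the gate
else
    z = H * xpred;                     % no detection survives gating: fall back to the prediction
end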

Summary:
Tracking by detection: detect the object independently in each frame (tracking = detection). Methods: point detectors, template matching, density-based appearance models, shape models, background subtraction, ...
Tracking with dynamics: incorporate object dynamics into tracking (tracking = detection (observation) + dynamics). Methods: filtering methods, Kalman filter, ...
Applying data association: eliminate highly unlikely measurements (tracking = detection (observation) + dynamics + data association). Methods: track matching, gating, ...

References:
D. A. Forsyth, J. Ponce, “Computer Vision: A Modern Approach,” Prentice Hall, 2nd Edition, 2012.
M. Sonka, V. Hlavac, R. Boyle, “Image Processing, Analysis, and Machine Vision,” 3rd Edition, 2008.
A. Yilmaz, O. Javed, and M. Shah, “Object Tracking: A Survey,” ACM Computing Surveys, Vol. 38, No. 4, pp. 1–45, December 2006.
R. Faragher, “Understanding the Basis of the Kalman Filter Via a Simple and Intuitive Derivation,” IEEE Signal Processing Magazine, September 2012.
G. Welch, G. Bishop, “An Introduction to the Kalman Filter,” 1995.