COMP 9517 Computer Vision Motion and Tracking 6/11/2018

Presentation transcript:

COMP 9517 Computer Vision: Motion and Tracking (COMP 9517 S2, 2009)

Introduction
- A changing scene may be observed via a sequence of images
- Changes in an image sequence provide features for:
  - detecting moving objects
  - computing their trajectories
  - computing the motion of the viewer in the world
  - recognising objects based on their behaviours
  - detecting and recognising activities

Applications
- Motion-based recognition: human identification based on gait, automatic object detection, etc.
- Automated surveillance: monitoring a scene to detect suspicious activities or unlikely events
- Video indexing: automatic annotation and retrieval of videos in multimedia databases
- Human-computer interaction: gesture recognition, eye-gaze tracking for data input to computers, etc.
- Traffic monitoring: real-time gathering of traffic statistics to direct traffic flow
- Vehicle navigation: video-based path planning and obstacle-avoidance capabilities

Motion Phenomena
- Still camera, single moving object, constant background
- Still camera, several moving objects, constant background
- Moving camera, relatively constant scene
- Moving camera, several moving objects

Image Subtraction
- Detecting an object moving across a constant background
- The forward and rear edges of the object advance only a few pixels per frame
- By subtracting the image It from the previous image It-1, these edges should be evident as the only pixels significantly different from zero

Image Subtraction
- Derive a background image from a set of video frames at the beginning of the video sequence
- The background image is then subtracted from each subsequent frame to create a difference image
- Enhance the difference image to fuse neighbouring regions and remove noise
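The background-derivation step above can be sketched as a pixel-wise median over the initial frames; the function names and the choice of a median (rather than, say, a running mean) are illustrative, not part of the slides.

```python
import numpy as np

def estimate_background(frames):
    """Background model: pixel-wise median over the initial frames,
    which is robust to objects passing through during that interval."""
    return np.median(np.stack(frames).astype(float), axis=0)

def difference_image(frame, background, tau):
    """Binary difference image: 1 where the frame departs from the
    background by more than the intensity threshold tau."""
    return (np.abs(frame.astype(float) - background) > tau).astype(np.uint8)
```

In practice the binary image would then be enhanced with morphological operations and small-region removal, as the algorithm on the next slide describes.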

Image Subtraction: Detect changes between two images
Input: images It and It−Δ; an intensity threshold τ
Output: a binary image Iout; a set of bounding boxes B
1. For all pixels [r, c] in the input images, set Iout[r, c] = 1 if |It[r, c] − It−Δ[r, c]| > τ, and Iout[r, c] = 0 otherwise
2. Perform connected components extraction on Iout
3. Remove small regions, assuming they are noise
4. Perform a closing of Iout using a small disk to fuse neighbouring regions
5. Compute the bounding boxes of all remaining regions of changed pixels
6. Return Iout and the bounding boxes B of regions of changed pixels
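A minimal Python/NumPy sketch of this procedure follows; the morphological closing (step 4) is omitted for brevity, and a simple BFS labelling stands in for a library connected-components routine.

```python
import numpy as np
from collections import deque

def detect_changes(img_t, img_prev, tau, min_area=5):
    """Threshold |It - It-delta|, extract 4-connected components,
    drop regions smaller than min_area as noise, and return the
    binary mask plus bounding boxes (min_r, min_c, max_r, max_c)."""
    diff = np.abs(img_t.astype(int) - img_prev.astype(int))
    mask = (diff > tau).astype(np.uint8)
    seen = np.zeros(mask.shape, dtype=bool)
    H, W = mask.shape
    boxes = []
    for r in range(H):
        for c in range(W):
            if mask[r, c] and not seen[r, c]:
                # BFS over one 4-connected component
                q = deque([(r, c)])
                seen[r, c] = True
                pixels = []
                while q:
                    y, x = q.popleft()
                    pixels.append((y, x))
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if 0 <= ny < H and 0 <= nx < W \
                                and mask[ny, nx] and not seen[ny, nx]:
                            seen[ny, nx] = True
                            q.append((ny, nx))
                if len(pixels) < min_area:
                    for y, x in pixels:      # small region: treat as noise
                        mask[y, x] = 0
                else:
                    ys = [p[0] for p in pixels]
                    xs = [p[1] for p in pixels]
                    boxes.append((min(ys), min(xs), max(ys), max(xs)))
    return mask, boxes
```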

Motion Vector
- Motion field: a 2-D array of 2-D vectors representing the motion of 3-D scene points
- The motion vectors in the image represent the displacements of the images of moving 3-D points
- Tail at time t and head at time t+Δ
- Instantaneous velocity estimate at time t

Motion Vector
- Two assumptions:
  - The intensity of a 3-D scene point and that of its neighbours remains nearly constant during the time interval
  - The intensity differences observed along the images of the edges of objects are nearly constant during the time interval
- Image flow: the motion field computed under the assumption that image intensity near corresponding points is relatively constant
- Two methods for computing image flow:
  - Sparse: point correspondence-based method
  - Dense: spatial & temporal gradient-based method

Point Correspondence-based Method
- A sparse motion field can be computed by identifying pairs of points that correspond in two images taken at times ti and ti+Δ
- Interesting points:
  - Corner points
  - Centroids of persistent moving regions
- Two steps:
  1. Detect interesting points
  2. Search for corresponding points

Point Correspondence-based Method
- Detect interesting points:
  - Detect corner points
    - Kirsch edge operator
    - Frei-Chen ripple operator
  - Interest operator
    - Computes intensity variance in the vertical, horizontal and diagonal directions
    - Interest point if the minimum of these four variances exceeds a threshold

Point Correspondence-based Method
Finding interesting points of a given input image:

procedure detect_corner_points(I, V) {
  for (r = 0 to MaxRow - 1)
    for (c = 0 to MaxCol - 1)
      if (I[r,c] is a border pixel) continue;
      else if (interest_operator(I, r, c, w) >= t)
        add [(r,c), (r,c)] to set V;
}

procedure interest_operator(I, r, c, w) {
  v1 = variance of intensity of horizontal pixels I[r,c-w] ... I[r,c+w];
  v2 = variance of intensity of vertical pixels I[r-w,c] ... I[r+w,c];
  v3 = variance of intensity of diagonal pixels I[r-w,c-w] ... I[r+w,c+w];
  v4 = variance of intensity of diagonal pixels I[r-w,c+w] ... I[r+w,c-w];
  return min(v1, v2, v3, v4);
}
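The pseudocode above translates directly into Python/NumPy; this sketch skips a border of width w rather than testing each pixel for border membership.

```python
import numpy as np

def interest_operator(I, r, c, w):
    """Minimum of the intensity variances along the horizontal,
    vertical and two diagonal lines of half-length w through (r, c)."""
    offs = np.arange(-w, w + 1)
    lines = [
        I[r, c + offs],           # horizontal
        I[r + offs, c],           # vertical
        I[r + offs, c + offs],    # main diagonal
        I[r + offs, c - offs],    # anti-diagonal
    ]
    return min(line.var() for line in lines)

def detect_corner_points(I, w, t):
    """Return all (r, c) whose interest measure reaches threshold t,
    skipping a border of width w around the image."""
    H, W = I.shape
    return [(r, c)
            for r in range(w, H - w)
            for c in range(w, W - w)
            if interest_operator(I, r, c, w) >= t]
```

A flat region or a straight edge has at least one zero-variance line, so only corner-like points survive the minimum.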

Point Correspondence-based Method
- Search for corresponding points: given an interesting point Pj from I1, take its neighbourhood in I1 and find the best correlating neighbourhood in I2, under the assumption that the amount of movement is limited
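A sketch of this search, using sum of squared differences as the (dis)similarity measure in place of correlation; the window sizes and the `best_match` name are illustrative assumptions.

```python
import numpy as np

def best_match(I1, I2, p, w=2, search=5):
    """For the interesting point p in I1, find the point in I2 whose
    (2w+1)x(2w+1) neighbourhood matches best (minimal SSD). The search
    is limited to +/- search pixels, reflecting the limited-movement
    assumption."""
    r, c = p
    patch = I1[r - w:r + w + 1, c - w:c + w + 1].astype(float)
    H, W = I2.shape
    best_pt, best_ssd = p, float("inf")
    for dr in range(-search, search + 1):
        for dc in range(-search, search + 1):
            rr, cc = r + dr, c + dc
            if w <= rr < H - w and w <= cc < W - w:
                cand = I2[rr - w:rr + w + 1, cc - w:cc + w + 1].astype(float)
                ssd = float(np.sum((patch - cand) ** 2))
                if ssd < best_ssd:
                    best_pt, best_ssd = (rr, cc), ssd
    return best_pt
```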

Spatial & Temporal Gradient-based Method
- Assumptions:
  - The object reflectivity and the illumination of the object do not change during the time interval
  - The distance of the object from the camera or light sources does not vary significantly over this interval
  - Each small intensity neighbourhood Nx,y at time t1 is observed in some shifted position Nx+Δx,y+Δy at time t2
- These assumptions may not hold exactly in practice, but they provide a useful approximation

Spatial & Temporal Gradient-based Method
Optical flow equation
- Taylor series (multivariable version):
  f(x+Δx, y+Δy, t+Δt) ≈ f(x, y, t) + (∂f/∂x)Δx + (∂f/∂y)Δy + (∂f/∂t)Δt  (1)
- Assuming the optical flow vector V = [Δx, Δy] carries the intensity neighbourhood N1(x, y) at time t1 to an identical intensity neighbourhood N2(x+Δx, y+Δy) at time t2 leads to:
  f(x+Δx, y+Δy, t2) = f(x, y, t1)  (2)
- By combining (1) and (2) and ignoring the higher-order terms, a linear constraint can be developed:
  (∂f/∂x)Δx + (∂f/∂y)Δy + (∂f/∂t)Δt = 0  (3)

Spatial & Temporal Gradient-based Method
- The optical flow equation provides a constraint that can be applied at every pixel position
- However, the optical flow equation does not give a unique solution, so further constraints are required
- Example: applying the optical flow equation to a group of adjacent pixels, and assuming that all of them have the same velocity, reduces the optical flow computation to solving a linear system by the least-squares method
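This least-squares scheme (the Lucas-Kanade approach) can be sketched as follows for a single patch; the finite-difference choices for the gradients are illustrative assumptions.

```python
import numpy as np

def lucas_kanade_patch(I1, I2):
    """Estimate one flow vector (u, v) for a whole patch, assuming all
    pixels share the same velocity: stack the optical-flow constraints
    fx*u + fy*v + ft = 0 and solve them by least squares."""
    I1 = I1.astype(float)
    I2 = I2.astype(float)
    # central differences for spatial gradients, forward for temporal
    fx = (np.roll(I1, -1, axis=1) - np.roll(I1, 1, axis=1)) / 2.0
    fy = (np.roll(I1, -1, axis=0) - np.roll(I1, 1, axis=0)) / 2.0
    ft = I2 - I1
    # drop the wrap-around border introduced by np.roll
    fx, fy, ft = fx[1:-1, 1:-1], fy[1:-1, 1:-1], ft[1:-1, 1:-1]
    A = np.stack([fx.ravel(), fy.ravel()], axis=1)
    b = -ft.ravel()
    (u, v), *_ = np.linalg.lstsq(A, b, rcond=None)
    return u, v
```

On a smooth pattern translated by a small amount, the recovered (u, v) approximates the true shift.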

Motion Tracking
- When moving points are not tagged with unique texture or colour information, the characteristics of the motion itself must be used to collect points into trajectories
- Assumptions:
  - The location of an object changes smoothly over time
  - The velocity of an object changes smoothly over time (speed and direction)
  - An object can be at only one location in space at a given time
  - Two objects cannot occupy the same location at the same time

Motion Tracking
- Trajectory: the sequence of image points Ti = (pi,1, pi,2, ..., pi,n) when an object i is observed at time instants t = 1, 2, ..., n
- Difference vector between successive points of a trajectory: Vi,t = pi,t+1 − pi,t
- Smoothness of velocity at time t (a common path-coherence formulation, weighting direction change against speed change with weight w):
  Si,t = w (1 − (Vi,t−1 · Vi,t) / (|Vi,t−1| |Vi,t|)) + (1 − w)(1 − 2 sqrt(|Vi,t−1| |Vi,t|) / (|Vi,t−1| + |Vi,t|))
- Total smoothness: the sum of Si,t over all trajectories i and time instants t
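The path-coherence style smoothness above can be sketched directly; the formulation (and the default weight w = 0.5) is a common choice, assumed here rather than taken from the slides.

```python
import numpy as np

def smoothness(p_prev, p_cur, p_next, w=0.5):
    """Smoothness of velocity at p_cur: 0 for uniform straight-line
    motion, larger for sharp turns (first term) or speed changes
    (second term)."""
    v1 = np.asarray(p_cur, float) - np.asarray(p_prev, float)
    v2 = np.asarray(p_next, float) - np.asarray(p_cur, float)
    n1, n2 = np.linalg.norm(v1), np.linalg.norm(v2)
    direction = 1 - np.dot(v1, v2) / (n1 * n2)       # 0 if no turn
    speed = 1 - 2 * np.sqrt(n1 * n2) / (n1 + n2)     # 0 if same speed
    return w * direction + (1 - w) * speed
```

Summing this over all points of all candidate trajectories gives the total smoothness to be minimised when assigning points to tracks.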

Tracking with Dynamic Models
- Tracking can be considered as the problem of generating an inference about the motion of an object given a sequence of images
- Tracking is properly thought of as a probabilistic inference problem
- The moving object has an internal state, which is measured at each frame
- Measurements are combined to estimate the state of the object

Tracking with Dynamic Models
Given:
- Xi: the state of the object in the ith frame
- Yi: the measurement obtained in the ith frame
There are three main problems:
- Prediction: predict the state for the ith frame, having seen the measurements y0, y1, ..., yi−1
- Data association: select the measurements that are related to the object's state
- Correction: correct the predicted state using the obtained relevant measurement yi

Tracking with Dynamic Models
Independence Assumptions
- Only the immediate past matters: P(Xi | X1, ..., Xi−1) = P(Xi | Xi−1)
- Measurements depend only on the current state: P(Yi | X1, ..., Xi, Y1, ..., Yi−1) = P(Yi | Xi)
- Graphically, a hidden Markov chain: ... → Xi−1 → Xi → Xi+1 → ..., with each state Xi emitting its measurement Yi

Tracking with Dynamic Models
Tracking as Inference
- Prediction: P(Xi | y0, ..., yi−1) = ∫ P(Xi | Xi−1) P(Xi−1 | y0, ..., yi−1) dXi−1
- Correction: P(Xi | y0, ..., yi) ∝ P(yi | Xi) P(Xi | y0, ..., yi−1)

Tracking with Dynamic Models
Kalman Filtering
- Dynamic model (linear, with Gaussian noise):
  xi ~ N(Di xi−1, Σdi)
  yi ~ N(Mi xi, Σmi)
- Define: x̄i⁻ and Σi⁻ as the mean and covariance of the state before the measurement yi arrives; x̄i⁺ and Σi⁺ after it has been incorporated
- Prediction:
  x̄i⁻ = Di x̄i−1⁺
  Σi⁻ = Di Σi−1⁺ Di^T + Σdi

Tracking with Dynamic Models
Kalman Filtering
- Correction:
  Kalman gain: Ki = Σi⁻ Mi^T (Mi Σi⁻ Mi^T + Σmi)^−1
  x̄i⁺ = x̄i⁻ + Ki (yi − Mi x̄i⁻)
  Σi⁺ = (I − Ki Mi) Σi⁻

Tracking with Dynamic Models
Kalman Filtering – the algorithm (1-D)
- Start assumption: x̄0⁻ and σ0⁻ are known
- Update equation: prediction
  x̄i⁻ = di x̄i−1⁺
  (σi⁻)² = σdi² + (di σi−1⁺)²
- Update equation: correction
  x̄i⁺ = (x̄i⁻ σmi² + mi yi (σi⁻)²) / (σmi² + mi² (σi⁻)²)
  (σi⁺)² = σmi² (σi⁻)² / (σmi² + mi² (σi⁻)²)
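A minimal sketch of the 1-D algorithm, using the equivalent gain form of the correction step; the function name and parameters are illustrative.

```python
def kalman_1d(y_seq, d, m, sigma_d, sigma_m, x0, s0):
    """1-D Kalman filter: state model x_i ~ N(d*x_{i-1}, sigma_d^2),
    measurement model y_i ~ N(m*x_i, sigma_m^2).
    Returns the corrected state means for each measurement."""
    x, s2 = x0, s0 ** 2
    out = []
    for y in y_seq:
        # prediction
        x_pred = d * x
        s2_pred = sigma_d ** 2 + (d ** 2) * s2
        # correction via the scalar Kalman gain
        k = s2_pred * m / (m ** 2 * s2_pred + sigma_m ** 2)
        x = x_pred + k * (y - m * x_pred)
        s2 = s2_pred * sigma_m ** 2 / (m ** 2 * s2_pred + sigma_m ** 2)
        out.append(x)
    return out
```

With a static state and repeated identical measurements, the estimate converges monotonically toward the measured value.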

Tracking with Dynamic Models
Kalman Filtering – the algorithm (2-D)
- Start assumption: x̄0⁻ and Σ0⁻ are known
- Update equation: prediction
  x̄i⁻ = Di x̄i−1⁺
  Σi⁻ = Di Σi−1⁺ Di^T + Σdi
- Update equation: correction
  Ki = Σi⁻ Mi^T (Mi Σi⁻ Mi^T + Σmi)^−1
  x̄i⁺ = x̄i⁻ + Ki (yi − Mi x̄i⁻)
  Σi⁺ = (I − Ki Mi) Σi⁻
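For tracking, a standard instantiation is the constant-velocity model: state [x, y, vx, vy], of which only the position is measured. The noise levels q and r below are illustrative assumptions.

```python
import numpy as np

def kalman_track(measurements, q=1e-3, r=0.25):
    """Constant-velocity Kalman tracker for 2-D positions (unit time
    step). Returns the corrected state [x, y, vx, vy] at each frame."""
    D = np.array([[1, 0, 1, 0],
                  [0, 1, 0, 1],
                  [0, 0, 1, 0],
                  [0, 0, 0, 1]], float)   # dynamics: x += vx, y += vy
    M = np.array([[1, 0, 0, 0],
                  [0, 1, 0, 0]], float)   # only position is observed
    Q = q * np.eye(4)                     # process noise covariance
    R = r * np.eye(2)                     # measurement noise covariance
    x = np.array([*measurements[0], 0.0, 0.0])
    P = np.eye(4)
    track = []
    for y in measurements:
        # prediction
        x = D @ x
        P = D @ P @ D.T + Q
        # correction
        K = P @ M.T @ np.linalg.inv(M @ P @ M.T + R)
        x = x + K @ (np.asarray(y, float) - M @ x)
        P = (np.eye(4) - K @ M) @ P
        track.append(x.copy())
    return track
```

On measurements moving in a straight line, the velocity components of the state converge to the true per-frame displacement.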

Tracking with Dynamic Models
Particle Filtering
- A non-linear dynamic model
- Also known variously as: sequential Monte Carlo method, bootstrap filtering, the condensation algorithm, interacting particle approximations, and survival of the fittest
- The conditional state density at time t is represented by a set of sample particles with weights (sampling probabilities)
- The weights define the importance of a sample (its observation frequency)

Tracking with Dynamic Models
Particle Filtering (illustration from A. Blake and M. Isard, Active Contours)

Tracking with Dynamic Models
Particle Filtering
- The common sampling scheme is importance sampling:
  - Selection: select N random samples
  - Prediction: generate new samples with zero-mean Gaussian error and a non-negative function
  - Correction: weights corresponding to the new samples are computed using the measurement equation, which can be modelled as a Gaussian density

Tracking with Dynamic Models
Particle Filtering – the algorithm (SIS; after A. Blake and M. Isard)
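The selection/prediction/correction loop can be sketched as a bootstrap (sampling-importance-resampling) filter for a 1-D state observed directly with Gaussian noise; the random-walk motion model and the parameter values are illustrative assumptions, not the slides' model.

```python
import numpy as np

def particle_filter(measurements, n=500, motion_std=1.0, meas_std=1.0, seed=0):
    """Bootstrap particle filter: resample particles by weight
    (selection), diffuse them with zero-mean Gaussian noise
    (prediction), and reweight them with the Gaussian measurement
    density (correction). Returns the weighted-mean state estimates."""
    rng = np.random.default_rng(seed)
    particles = rng.normal(measurements[0], meas_std, n)
    w = np.full(n, 1.0 / n)
    estimates = []
    for y in measurements:
        # selection: resample particles in proportion to their weights
        idx = rng.choice(n, size=n, p=w)
        particles = particles[idx]
        # prediction: propagate with zero-mean Gaussian process noise
        particles = particles + rng.normal(0.0, motion_std, n)
        # correction: reweight with the Gaussian measurement likelihood
        w = np.exp(-0.5 * ((y - particles) / meas_std) ** 2)
        w /= w.sum()
        estimates.append(float(np.sum(w * particles)))
    return estimates
```

Unlike the Kalman filter, nothing here requires the motion or measurement models to be linear; both updates act on samples, so any density that can be sampled and evaluated will do.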

Tracking can be complex
- Loss of information caused by projection of the 3-D world onto a 2-D image
- Noise in images
- Complex object motion
- Non-rigid or articulated nature of objects
- Partial and full object occlusions
- Complex object shapes
- Scene illumination changes
- Real-time processing requirements

References
Yilmaz, A., Javed, O., and Shah, M. (2006). Object tracking: A survey. ACM Computing Surveys 38(4).