COMP 9517 Computer Vision Motion and Tracking 6/11/2018
COMP 9517 S2, 2009
Introduction
A changing scene may be observed via a sequence of images
Changes in an image sequence provide features for:
detecting objects that are moving
computing their trajectories
computing the motion of the viewer in the world
recognising objects based on their behaviours
detecting and recognising activities
Applications
Motion-based recognition: human identification based on gait, automatic object detection, etc.
Automated surveillance: monitoring a scene to detect suspicious activities or unlikely events
Video indexing: automatic annotation and retrieval of videos in multimedia databases
Human-computer interaction: gesture recognition, eye gaze tracking for data input to computers, etc.
Traffic monitoring: real-time gathering of traffic statistics to direct traffic flow
Vehicle navigation: video-based path planning and obstacle avoidance
Motion Phenomena
Still camera, single moving object, constant background
Still camera, several moving objects, constant background
Moving camera, relatively constant scene
Moving camera, several moving objects
Image Subtraction
Detecting an object moving across a constant background
The forward and rear edges of the object advance only a few pixels per frame
By subtracting the image It from the previous image It-1, these edges should be evident as the only pixels significantly different from zero
Image Subtraction
Derive a background image from a set of video frames at the beginning of the video sequence
Subtract the background image from each subsequent frame to create a difference image
Enhance the difference image to fuse neighbouring regions and remove noise
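The background-model idea above can be sketched in Python with NumPy. A pixel-wise median over the initial frames is one common choice of background model; the threshold value and the toy data are illustrative assumptions, not from the slides:

```python
import numpy as np

def estimate_background(frames):
    # Pixel-wise median over the initial frames approximates a static background
    return np.median(np.stack(frames), axis=0)

def difference_image(frame, background, tau=25.0):
    # Binary mask of pixels that differ from the background by more than tau
    return (np.abs(frame.astype(float) - background) > tau).astype(np.uint8)

# Toy example: near-constant background, then one bright 3x3 "object"
rng = np.random.default_rng(0)
bg = np.full((20, 20), 100.0)
frames = [bg + rng.normal(0.0, 1.0, bg.shape) for _ in range(5)]
B = estimate_background(frames)

frame = bg.copy()
frame[5:8, 5:8] = 200.0          # object appears in this frame
mask = difference_image(frame, B)
```

With a threshold well above the noise level, only the 9 object pixels survive in the mask.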
Image Subtraction
Detect changes between two images
Input: images It and It-Δ, and an intensity threshold τ
Output: a binary image Iout and a set of bounding boxes B
1. For all pixels [r, c] in the input images, set Iout[r, c] = 1 if |It[r, c] - It-Δ[r, c]| > τ, and Iout[r, c] = 0 otherwise
2. Perform connected components extraction on Iout
3. Remove small regions, assuming they are noise
4. Perform a closing of Iout using a small disk to fuse neighbouring regions
5. Compute the bounding boxes of all remaining regions of changed pixels
6. Return Iout[r, c] and the bounding boxes B of the regions of changed pixels
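The steps above can be sketched in Python, assuming SciPy is available for the morphological closing and connected-component labelling; the threshold, minimum area, and toy frames are illustrative assumptions:

```python
import numpy as np
from scipy import ndimage

def detect_changes(I_t, I_prev, tau=20.0, min_area=4):
    # 1. Threshold the absolute frame difference
    diff = np.abs(I_t.astype(float) - I_prev.astype(float))
    out = diff > tau
    # Closing with a small structuring element fuses neighbouring regions
    out = ndimage.binary_closing(out, structure=np.ones((3, 3)))
    # 2. Connected components extraction
    labels, n = ndimage.label(out)
    boxes = []
    for sl in ndimage.find_objects(labels):
        region = labels[sl] > 0
        if region.sum() < min_area:
            out[sl][region] = False      # 3. Remove small regions as noise
        else:
            boxes.append(sl)             # bounding box as a (row, col) slice pair
    return out.astype(np.uint8), boxes

prev = np.zeros((30, 30))
curr = prev.copy()
curr[10:15, 10:15] = 255.0               # a 5x5 object appears
mask, boxes = detect_changes(curr, prev)
```

Here `find_objects` returns each region's bounding box directly as a pair of slices, which stands in for the set B of the algorithm.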
Motion Vector
Motion field: a 2-D array of 2-D vectors representing the motion of 3-D scene points
The motion vectors in the image represent the displacements of the images of moving 3-D points
Tail at time t and head at time t+Δ
Instantaneous velocity estimate at time t
Motion Vector
Two assumptions:
The intensity of a 3-D scene point and that of its neighbours remains nearly constant during the time interval
The intensity differences observed along the images of the edges of objects are nearly constant during the time interval
Image flow: the motion field computed under the assumption that image intensity near corresponding points is relatively constant
Two methods for computing image flow:
Sparse: point correspondence-based method
Dense: spatial and temporal gradient-based method
Point Correspondence-based Method
A sparse motion field can be computed by identifying pairs of points that correspond in two images taken at times ti and ti+Δ
Interesting points:
Corner points
Centroids of persistent moving regions
Two steps:
1. Detect interesting points
2. Search for corresponding points
Point Correspondence-based Method
Detect interesting points:
Detect corner points using, e.g., the Kirsch edge operator or the Frei-Chen ripple operator
Interest operator:
Computes intensity variance in the vertical, horizontal and diagonal directions
A pixel is an interest point if the minimum of these four variances exceeds a threshold
Point Correspondence-based Method
Finding interesting points of a given input image:

procedure detect_corner_points(I, V) {
    for (r = 0 to MaxRow - 1)
        for (c = 0 to MaxCol - 1)
            if (I[r,c] is a border pixel) continue;
            else if (interest_operator(I, r, c, w) >= t)
                add [(r,c), (r,c)] to set V;
}

procedure interest_operator(I, r, c, w) {
    v1 = variance of intensity of horizontal pixels I[r,c-w] ... I[r,c+w];
    v2 = variance of intensity of vertical pixels I[r-w,c] ... I[r+w,c];
    v3 = variance of intensity of diagonal pixels I[r-w,c-w] ... I[r+w,c+w];
    v4 = variance of intensity of diagonal pixels I[r-w,c+w] ... I[r+w,c-w];
    return min(v1, v2, v3, v4);
}
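The interest operator above translates directly to NumPy; the window size, threshold, and the synthetic corner image are illustrative assumptions:

```python
import numpy as np

def interest_operator(I, r, c, w=2):
    # Variance of intensities along the four lines through (r, c);
    # the point is "interesting" if the smallest variance is large
    idx = np.arange(-w, w + 1)
    v1 = np.var(I[r, c + idx])          # horizontal
    v2 = np.var(I[r + idx, c])          # vertical
    v3 = np.var(I[r + idx, c + idx])    # main diagonal
    v4 = np.var(I[r + idx, c - idx])    # anti-diagonal
    return min(v1, v2, v3, v4)

def detect_corner_points(I, w=2, t=50.0):
    rows, cols = I.shape
    # Skip border pixels where the window would fall outside the image
    return [(r, c) for r in range(w, rows - w)
                   for c in range(w, cols - w)
                   if interest_operator(I, r, c, w) >= t]

I = np.zeros((11, 11))
I[5:, 5:] = 255.0                        # a bright square with a corner at (5, 5)
pts = detect_corner_points(I)
```

Note how edge pixels such as (5, 8) are rejected because the variance along the edge direction is zero, while the corner at (5, 5) has large variance in all four directions.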
Point Correspondence-based Method
Search for corresponding points:
Given an interesting point Pj from I1, take its neighbourhood in I1 and find the best correlating neighbourhood in I2, under the assumption that the amount of movement is limited
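A minimal sketch of this correspondence search, using the sum of squared differences (SSD) as the matching score and a bounded search window; the window sizes and toy images are illustrative assumptions:

```python
import numpy as np

def best_match(I1, I2, p, w=2, search=5):
    # For interest point p in I1, find the position in I2 whose (2w+1)^2
    # neighbourhood best matches p's neighbourhood (smallest SSD), searching
    # only within +/- search pixels -- the limited-motion assumption
    r, c = p
    patch = I1[r - w:r + w + 1, c - w:c + w + 1].astype(float)
    best, best_ssd = None, np.inf
    for dr in range(-search, search + 1):
        for dc in range(-search, search + 1):
            r2, c2 = r + dr, c + dc
            if r2 - w < 0 or c2 - w < 0 or r2 + w >= I2.shape[0] or c2 + w >= I2.shape[1]:
                continue                 # candidate window falls outside I2
            cand = I2[r2 - w:r2 + w + 1, c2 - w:c2 + w + 1].astype(float)
            ssd = np.sum((patch - cand) ** 2)
            if ssd < best_ssd:
                best, best_ssd = (r2, c2), ssd
    return best

I1 = np.zeros((20, 20)); I1[8:11, 8:11] = 255.0     # a small blob
I2 = np.zeros((20, 20)); I2[10:13, 11:14] = 255.0   # same blob, shifted by (2, 3)
match = best_match(I1, I2, (9, 9))
```

The returned match minus the original point gives one sparse motion vector; normalised cross-correlation can replace SSD when intensities change between frames.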
Spatial & Temporal Gradient-based Method
Assumptions:
The reflectivity and the illumination of the object do not change during the time interval
The distance of the object from the camera or light sources does not vary significantly over this interval
Each small intensity neighbourhood Nx,y at time t1 is observed in some shifted position Nx+Δx,y+Δy at time t2
These assumptions may not hold exactly in practice, but they provide a useful approximation
Spatial & Temporal Gradient-based Method
Optical flow equation
Taylor series (multivariable version):
  I(x+Δx, y+Δy, t+Δt) ≈ I(x, y, t) + (∂I/∂x)Δx + (∂I/∂y)Δy + (∂I/∂t)Δt + higher order terms   (1)
Assuming the optical flow vector V = [Δx, Δy] carries the intensity neighbourhood N1(x, y) at time t1 to an identical intensity neighbourhood N2(x+Δx, y+Δy) at time t2 leads to
  I(x+Δx, y+Δy, t+Δt) = I(x, y, t)   (2)
By combining (1) and (2) and ignoring the higher order terms, a linear constraint can be developed:
  (∂I/∂x)Δx + (∂I/∂y)Δy + (∂I/∂t)Δt = 0   (3)
Spatial & Temporal Gradient-based Method
The optical flow equation provides a constraint that can be applied at every pixel position
However, the optical flow equation alone does not give a unique solution, so further constraints are required
Example: applying the optical flow equation to a group of adjacent pixels, and assuming that all of them have the same velocity, reduces the optical flow computation to solving a linear system by the least squares method
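The least-squares idea (the Lucas-Kanade approach) can be sketched in Python: stack the optical flow constraint Ix·u + Iy·v + It = 0 for every pixel in a small patch and solve for the shared velocity (u, v). The patch size and the synthetic ramp image are illustrative assumptions:

```python
import numpy as np

def lucas_kanade_patch(I1, I2, r, c, w=2):
    # Solve Ix*u + Iy*v + It = 0 in least squares over a (2w+1)^2 patch,
    # assuming all pixels in the patch share the same velocity (u, v)
    I1 = I1.astype(float); I2 = I2.astype(float)
    Iy, Ix = np.gradient(I1)            # np.gradient returns d/drow then d/dcol
    It = I2 - I1                        # temporal derivative (frame difference)
    sl = (slice(r - w, r + w + 1), slice(c - w, c + w + 1))
    A = np.stack([Ix[sl].ravel(), Iy[sl].ravel()], axis=1)
    b = -It[sl].ravel()
    (u, v), *_ = np.linalg.lstsq(A, b, rcond=None)
    return u, v                         # u = column (x) flow, v = row (y) flow

I1 = np.tile(np.arange(20, dtype=float), (20, 1))   # intensity ramp in x
I2 = I1 - 1.0                                       # same ramp shifted right by one pixel
u, v = lucas_kanade_patch(I1, I2, 10, 10)
```

For this pure horizontal translation the recovered flow is (u, v) ≈ (1, 0); in textured regions the normal equations are well conditioned and the solution is unique.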
Motion Tracking
When moving points are not tagged with unique texture or colour information, the characteristics of the motion itself must be used to collect points into trajectories
Assumptions:
The location of an object changes smoothly over time
The velocity of an object changes smoothly over time (speed and direction)
An object can be at only one location in space at a given time
Two objects cannot occupy the same location at the same time
Motion Tracking
Trajectory: the sequence of image points Ti = (pi,1, pi,2, ..., pi,n) when an object i is observed at time instants t = 1, 2, ..., n
Difference vector between two successive points of a trajectory: di,t = pi,t+1 - pi,t
Smoothness of velocity (one common form, due to Sethi and Jain) combines a direction term and a speed term:
  si,t = w (1 - cos θ) + (1 - w) (1 - 2 sqrt(|di,t-1| |di,t|) / (|di,t-1| + |di,t|))
where θ is the angle between di,t-1 and di,t and 0 ≤ w ≤ 1
Total smoothness: the sum of si,t over all trajectories and all time instants
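A path-coherence style smoothness score in the spirit of Sethi and Jain can be sketched in Python: it is zero for a trajectory whose displacement vectors keep the same direction and speed, and grows as either changes. The weighting and the toy trajectories are illustrative assumptions:

```python
import numpy as np

def path_coherence(traj, w=0.5):
    # Smoothness of one trajectory: penalise changes in direction and in
    # speed between consecutive displacement vectors; traj is (n, 2) points
    d = np.diff(np.asarray(traj, dtype=float), axis=0)   # displacement vectors
    n1 = np.linalg.norm(d[:-1], axis=1)
    n2 = np.linalg.norm(d[1:], axis=1)
    cos = np.sum(d[:-1] * d[1:], axis=1) / (n1 * n2)
    direction = 1.0 - cos                                # 0 when direction unchanged
    speed = 1.0 - 2.0 * np.sqrt(n1 * n2) / (n1 + n2)     # 0 when speed unchanged
    return np.sum(w * direction + (1.0 - w) * speed)     # total smoothness

straight = [(0, 0), (1, 1), (2, 2), (3, 3)]              # constant velocity
jagged   = [(0, 0), (1, 1), (0, 2), (3, 3)]              # direction and speed change
```

Trajectory assignment then amounts to choosing the point-to-trajectory labelling that minimises the total smoothness over all objects.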
Tracking with Dynamic Models
Tracking can be considered as the problem of generating an inference about the motion of an object given a sequence of images
Tracking is properly thought of as a probabilistic inference problem
The moving object has an internal state, which is measured at each frame
Measurements are combined to estimate the state of the object
Tracking with Dynamic Models
Given:
Xi: the state of the object at the ith frame
Yi: the measurement obtained in the ith frame
There are three main problems:
Prediction: predict the state for the ith frame having seen the measurements y0, y1, ..., yi-1
Data association: select the measurements that are related to the object's state
Correction: correct the predicted state using the relevant measurement yi
Tracking with Dynamic Models
Independence Assumptions:
Only the immediate past matters: P(Xi | X0, ..., Xi-1) = P(Xi | Xi-1)
Measurements depend only on the current state: Yi depends only on Xi
(Graphical model: the states form a chain ... Xi-1 -> Xi -> Xi+1 ..., with each measurement Yi generated from its state Xi)
Tracking with Dynamic Models
Tracking as Inference
Prediction:
  P(Xi | y0, ..., yi-1) = ∫ P(Xi | Xi-1) P(Xi-1 | y0, ..., yi-1) dXi-1
Correction:
  P(Xi | y0, ..., yi) ∝ P(yi | Xi) P(Xi | y0, ..., yi-1)
Tracking with Dynamic Models
Kalman Filtering
Dynamic model (linear, with Gaussian noise):
  xi ~ N(Di xi-1; Σdi)    (state)
  yi ~ N(Mi xi; Σmi)    (measurement)
Define:
  Xi^-, Σi^-: mean and covariance of P(Xi | y0, ..., yi-1), before measurement yi
  Xi^+, Σi^+: mean and covariance of P(Xi | y0, ..., yi), after measurement yi
Prediction:
  Xi^- = Di X(i-1)^+
  Σi^- = Σdi + Di Σ(i-1)^+ Di^T
Tracking with Dynamic Models
Kalman Filtering
Correction (with Kalman gain Ki):
  Ki = Σi^- Mi^T (Mi Σi^- Mi^T + Σmi)^-1
  Xi^+ = Xi^- + Ki (yi - Mi Xi^-)
  Σi^+ = (I - Ki Mi) Σi^-
Tracking with Dynamic Models
Kalman Filtering – the algorithm (1-D)
Model: xi = di xi-1 + ξ, ξ ~ N(0, σdi²); yi = mi xi + η, η ~ N(0, σmi²)
Start assumption: X0 and σ0 are known
Update equation – prediction:
  Xi^- = di X(i-1)^+
  (σi^-)² = σdi² + di² (σ(i-1)^+)²
Update equation – correction:
  Xi^+ = (Xi^- σmi² + mi yi (σi^-)²) / (σmi² + mi² (σi^-)²)
  (σi^+)² = σmi² (σi^-)² / (σmi² + mi² (σi^-)²)
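A minimal 1-D Kalman filter in Python, using the standard scalar prediction/correction equations for the model xi = d·xi-1 + noise, yi = m·xi + noise. The parameter values, prior, and measurement sequence are illustrative assumptions:

```python
def kalman_1d(ys, d=1.0, m=1.0, sd=1.0, sm=1.0, x0=0.0, s0=1.0):
    # Scalar Kalman filter: state model x_i = d*x_{i-1} + N(0, sd^2),
    # measurement model y_i = m*x_i + N(0, sm^2); returns posterior means
    x, s2 = x0, s0 ** 2
    means = []
    for y in ys:
        # prediction
        x_pred = d * x
        s2_pred = sd ** 2 + (d ** 2) * s2
        # correction, via the scalar Kalman gain
        K = m * s2_pred / (sm ** 2 + m ** 2 * s2_pred)
        x = x_pred + K * (y - m * x_pred)
        s2 = s2_pred * (1.0 - K * m)
        means.append(x)
    return means

# Nearly constant state observed in noise: with a broad prior (s0 large) and
# small process noise, the estimate settles near the true value 5
est = kalman_1d([5.1, 4.9, 5.0, 5.2, 4.8], sd=0.01, sm=1.0, s0=100.0)
```

The gain K balances trust in the prediction against trust in the measurement: a large predicted variance gives K near 1/m (follow the data), a small one gives K near 0 (follow the model).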
Tracking with Dynamic Models
Kalman Filtering – the algorithm (2-D)
Start assumption: X0 and Σ0 are known
Update equation – prediction:
  Xi^- = Di X(i-1)^+
  Σi^- = Σdi + Di Σ(i-1)^+ Di^T
Update equation – correction:
  Ki = Σi^- Mi^T (Mi Σi^- Mi^T + Σmi)^-1
  Xi^+ = Xi^- + Ki (yi - Mi Xi^-)
  Σi^+ = (I - Ki Mi) Σi^-
Tracking with Dynamic Models
Particle Filtering
Handles non-linear dynamic models
Also known variously as: sequential Monte Carlo method, bootstrap filtering, the condensation algorithm, interacting particle approximations, and survival of the fittest
The conditional state density at time t is represented by a set of samples (particles) with weights (sampling probabilities); the weight defines the importance of a sample (its observation frequency)
Tracking with Dynamic Models
Particle Filtering (figure from A. Blake and M. Isard, Active Contours)
Tracking with Dynamic Models
Particle Filtering
The common sampling scheme is importance sampling:
Selection: select N random samples from the current particle set in proportion to their weights
Prediction: generate new samples from the selected ones by applying the dynamic model with zero-mean Gaussian error
Correction: weights corresponding to the new samples are computed using the measurement equation, which can be modelled as a Gaussian density
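The selection/prediction/correction loop above can be sketched as a bootstrap particle filter for a simple 1-D random-walk state observed in Gaussian noise; the noise levels, particle count, and measurement sequence are illustrative assumptions:

```python
import numpy as np

def particle_filter(ys, n=1000, sd=0.5, sm=0.5, seed=0):
    # Bootstrap particle filter: resample by weight, diffuse, reweight
    rng = np.random.default_rng(seed)
    x = rng.normal(0.0, 1.0, n)                # initial particle set (prior)
    w = np.full(n, 1.0 / n)
    estimates = []
    for y in ys:
        # selection: resample particles in proportion to their weights
        x = rng.choice(x, size=n, p=w)
        # prediction: propagate each particle with zero-mean Gaussian error
        x = x + rng.normal(0.0, sd, n)
        # correction: reweight by the Gaussian measurement likelihood
        w = np.exp(-0.5 * ((y - x) / sm) ** 2)
        w /= w.sum()
        estimates.append(np.sum(w * x))        # posterior mean estimate
    return estimates

est = particle_filter([2.0, 2.1, 1.9, 2.0])
```

Unlike the Kalman filter, nothing here requires linear dynamics or a Gaussian posterior; the particle cloud can represent multi-modal densities, at the cost of Monte Carlo error.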
Tracking with Dynamic Models
Particle Filtering – the SIS (sequential importance sampling) algorithm (A. Blake and M. Isard)
Tracking can be complex
Loss of information caused by projection of the 3-D world onto a 2-D image
Noise in images
Complex object motion
Non-rigid or articulated nature of objects
Partial and full object occlusions
Complex object shapes
Scene illumination changes
Real-time processing requirements
References
Yilmaz, A., Javed, O., and Shah, M. 2006. Object tracking: A survey. ACM Computing Surveys 38(4).