Tracking. Many slides adapted from Kristen Grauman, Deva Ramanan.

Feature tracking. So far, we have only considered optical flow estimation in a pair of images. If we have more than two images, we can compute the optical flow from each frame to the next. Given a point in the first image, we can in principle reconstruct its path by simply "following the arrows."

Tracking challenges. Ambiguity of optical flow: find good features to track. Large motions: use discrete search instead of Lucas-Kanade. Changes in shape, orientation, color: allow some matching flexibility. Occlusions and disocclusions: need a mechanism for deleting and adding features. Drift (errors may accumulate over time): need to know when to terminate a track.

Handling large displacements. Define a small area around a pixel as the template. Match the template against each pixel within a search area in the next image, just like stereo matching. Use a match measure such as SSD or correlation. After finding the best discrete location, use Lucas-Kanade to get a sub-pixel estimate.
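A minimal sketch of this discrete search in Python/NumPy, assuming grayscale frames as 2D arrays; the window size and search radius are illustrative parameters, not values from the slides:

    import numpy as np

    def track_feature_ssd(prev_img, next_img, x, y, half_win=7, search_radius=15):
        """Return the best matching location in next_img for the patch around (x, y)."""
        template = prev_img[y - half_win:y + half_win + 1,
                            x - half_win:x + half_win + 1].astype(float)
        best_ssd, best_xy = np.inf, (x, y)
        for dy in range(-search_radius, search_radius + 1):
            for dx in range(-search_radius, search_radius + 1):
                cx, cy = x + dx, y + dy
                patch = next_img[cy - half_win:cy + half_win + 1,
                                 cx - half_win:cx + half_win + 1]
                if patch.shape != template.shape:  # skip windows that fall off the image
                    continue
                ssd = np.sum((patch.astype(float) - template) ** 2)  # sum of squared differences
                if ssd < best_ssd:
                    best_ssd, best_xy = ssd, (cx, cy)
        return best_xy  # best discrete location; refine with Lucas-Kanade for sub-pixel accuracy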

Tracking over many frames. Select features in the first frame. For each subsequent frame: update positions of tracked features (discrete search or Lucas-Kanade); terminate inconsistent tracks (compute similarity with the corresponding feature in the previous frame, or in the first frame where it was visible); start new tracks if needed.

Shi-Tomasi feature tracker. Find good features using the eigenvalues of the second-moment matrix; the key idea is that "good" features to track are the ones that can be tracked reliably. From frame to frame, track with Lucas-Kanade and a pure translation model, which is more robust for small displacements and can be estimated from smaller neighborhoods. Check the consistency of tracks by affine registration to the first observed instance of the feature; the affine model is more accurate for larger displacements, and comparing to the first frame helps to minimize drift. J. Shi and C. Tomasi. Good Features to Track. CVPR 1994.
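A minimal sketch using OpenCV's standard goodFeaturesToTrack (Shi-Tomasi corners) and pyramidal Lucas-Kanade tracker; the file names and parameter values are illustrative assumptions:

    import cv2

    prev_gray = cv2.imread("frame0.png", cv2.IMREAD_GRAYSCALE)
    next_gray = cv2.imread("frame1.png", cv2.IMREAD_GRAYSCALE)

    # Shi-Tomasi corners: keep points whose second-moment matrix has large eigenvalues.
    pts = cv2.goodFeaturesToTrack(prev_gray, maxCorners=200,
                                  qualityLevel=0.01, minDistance=10)

    # Frame-to-frame tracking with pyramidal Lucas-Kanade (pure translation per window).
    new_pts, status, err = cv2.calcOpticalFlowPyrLK(prev_gray, next_gray, pts, None,
                                                    winSize=(21, 21), maxLevel=3)

    # Keep only the points that were tracked successfully.
    good_new = new_pts[status.ravel() == 1]
    good_old = pts[status.ravel() == 1]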

Tracking example. J. Shi and C. Tomasi. Good Features to Track. CVPR 1994.

Tracking with dynamics. Key idea: given a model of expected motion, predict where objects will occur in the next frame, even before seeing the image. This restricts the search for the object and improves estimates, since measurement noise is reduced by trajectory smoothness.

General model for tracking. The moving object of interest is characterized by an underlying state X. The state X gives rise to measurements or observations Y. At each time t, the state changes to Xt and we get a new observation Yt. (Graphical model: a chain X1 → X2 → … → Xt, with each state Xt emitting an observation Yt.)

Steps of tracking. Prediction: what is the next state of the object given past measurements?

Steps of tracking. Prediction: what is the next state of the object given past measurements? Correction: compute an updated estimate of the state from prediction and measurements.

Steps of tracking. Prediction: what is the next state of the object given past measurements? Correction: compute an updated estimate of the state from prediction and measurements. Tracking can be seen as the process of propagating the posterior distribution of state given measurements across time.

Simplifying assumptions. Only the immediate past matters (dynamics model).

Simplifying assumptions. Only the immediate past matters (dynamics model). Measurements depend only on the current state (observation model).

Simplifying assumptions. Only the immediate past matters (dynamics model). Measurements depend only on the current state (observation model). Together these give the hidden Markov model structure X1 → X2 → … → Xt with observations Y1, Y2, …, Yt.
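In symbols (a standard hidden Markov model factorization, using the notation above):
\[
P(X_t \mid X_0, \dots, X_{t-1}) = P(X_t \mid X_{t-1}) \quad \text{(dynamics model)}
\]
\[
P(Y_t \mid X_0, \dots, X_t, Y_0, \dots, Y_{t-1}) = P(Y_t \mid X_t) \quad \text{(observation model)}
\]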

Tracking as induction. Base case: assume we have an initial prior that predicts the state in the absence of any evidence, P(X0). At the first frame, correct this given the value of Y0 = y0. Induction step: given the corrected estimate for frame t, predict for frame t+1, then correct for frame t+1.

Tracking as induction. Base case: assume we have an initial prior that predicts the state in the absence of any evidence, P(X0). At the first frame, correct this given the value of Y0 = y0.

Induction step: Prediction. Prediction involves representing P(Xt | y0, …, yt−1) given the corrected estimate from the previous frame, P(Xt−1 | y0, …, yt−1). The derivation below uses the law of total probability, conditioning on Xt−1, and the independence (Markov) assumption.
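Written out (a standard Bayes filter prediction step, in the notation above):
\[
P(X_t \mid y_0, \dots, y_{t-1}) = \int P(X_t, X_{t-1} \mid y_0, \dots, y_{t-1})\, dX_{t-1} \quad \text{(law of total probability)}
\]
\[
= \int P(X_t \mid X_{t-1}, y_0, \dots, y_{t-1})\, P(X_{t-1} \mid y_0, \dots, y_{t-1})\, dX_{t-1} \quad \text{(conditioning on } X_{t-1}\text{)}
\]
\[
= \int P(X_t \mid X_{t-1})\, P(X_{t-1} \mid y_0, \dots, y_{t-1})\, dX_{t-1} \quad \text{(independence assumption)}
\]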

Induction step: Correction. Correction involves computing P(Xt | y0, …, yt) given the predicted value P(Xt | y0, …, yt−1) and the new measurement yt. The derivation below uses Bayes' rule, the independence assumption (the observation yt depends only on the state Xt), and conditioning on Xt to obtain the normalizing constant.
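Written out (a standard Bayes filter correction step, in the notation above):
\[
P(X_t \mid y_0, \dots, y_t) = \frac{P(y_t \mid X_t)\, P(X_t \mid y_0, \dots, y_{t-1})}{P(y_t \mid y_0, \dots, y_{t-1})} \quad \text{(Bayes' rule and independence)}
\]
\[
\text{with } P(y_t \mid y_0, \dots, y_{t-1}) = \int P(y_t \mid X_t)\, P(X_t \mid y_0, \dots, y_{t-1})\, dX_t \quad \text{(conditioning on } X_t\text{)}
\]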

Summary: Prediction and correction. Prediction combines the dynamics model with the corrected estimate from the previous step; correction combines the observation model with the predicted estimate.

Linear Dynamic Models. Dynamics model: the state undergoes a linear transformation plus Gaussian noise. Observation model: the measurement is a linearly transformed state plus Gaussian noise.
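In one common notation (D_t: transition matrix, M_t: measurement matrix; the symbols are an assumption of this writeup, not fixed by the slide):
\[
x_t \sim \mathcal{N}(D_t\, x_{t-1},\ \Sigma_{d_t}) \quad \text{(dynamics model)}
\]
\[
y_t \sim \mathcal{N}(M_t\, x_t,\ \Sigma_{m_t}) \quad \text{(observation model)}
\]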

Example: Constant velocity (1D). The state vector is position and velocity; the measurement is position only (Greek letters denote noise terms).
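A sketch of the constant-velocity model, with p position, v velocity, and Δt the time step (symbols are illustrative):
\[
x_t = \begin{bmatrix} p_t \\ v_t \end{bmatrix}
= \begin{bmatrix} 1 & \Delta t \\ 0 & 1 \end{bmatrix}
\begin{bmatrix} p_{t-1} \\ v_{t-1} \end{bmatrix} + \varepsilon,
\qquad
y_t = \begin{bmatrix} 1 & 0 \end{bmatrix} x_t + \delta
\]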

Example: Constant acceleration (1D). The state vector is position, velocity, and acceleration; the measurement is position only (Greek letters denote noise terms).
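Analogously for constant acceleration, as a first-order sketch (some formulations add a \(\tfrac{1}{2}\Delta t^2\) term to the position row):
\[
x_t = \begin{bmatrix} p_t \\ v_t \\ a_t \end{bmatrix}
= \begin{bmatrix} 1 & \Delta t & 0 \\ 0 & 1 & \Delta t \\ 0 & 0 & 1 \end{bmatrix}
\begin{bmatrix} p_{t-1} \\ v_{t-1} \\ a_{t-1} \end{bmatrix} + \varepsilon,
\qquad
y_t = \begin{bmatrix} 1 & 0 & 0 \end{bmatrix} x_t + \delta
\]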

The Kalman filter. A method for tracking linear dynamic models with Gaussian noise. The predicted/corrected state distributions are Gaussian, so you only need to maintain the mean and covariance, and the calculations are easy (all the integrals can be done in closed form).

Propagation of Gaussian densities

The Kalman filter: 1D state. Predict: given the corrected state from the previous time step and all measurements up to the current one, predict the distribution over the current state (mean and std. dev. of the predicted state). Make a measurement. Correct: given the prediction of the state and the current measurement, update the prediction of the state (mean and std. dev. of the corrected state). Time advances from t−1 to t and the cycle repeats.

1D Kalman filter: Prediction. The linear dynamic model defines how the predicted state evolves, with noise. We want to estimate the distribution of the next predicted state, i.e., update its mean and variance.
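With 1D dynamics \(x_t \sim \mathcal{N}(d\,x_{t-1},\ \sigma_d^2)\), the standard prediction updates are:
\[
\mu_t^- = d\,\mu_{t-1}^+, \qquad (\sigma_t^-)^2 = \sigma_d^2 + d^2\,(\sigma_{t-1}^+)^2
\]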

1D Kalman filter: Correction. The observation model maps the state to measurements. Given the predicted state and the current measurement, we want to estimate the corrected distribution, i.e., update its mean and variance.
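With the 1D observation model \(y_t \sim \mathcal{N}(m\,x_t,\ \sigma_m^2)\), the standard correction updates are:
\[
\mu_t^+ = \frac{\mu_t^-\,\sigma_m^2 + m\,y_t\,(\sigma_t^-)^2}{\sigma_m^2 + m^2\,(\sigma_t^-)^2},
\qquad
(\sigma_t^+)^2 = \frac{\sigma_m^2\,(\sigma_t^-)^2}{\sigma_m^2 + m^2\,(\sigma_t^-)^2}
\]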

Prediction vs. correction. What if there is no prediction uncertainty? The measurement is ignored. What if there is no measurement uncertainty? The prediction is ignored.

The Kalman filter: General case. The same predict/correct cycle applies to the full vector-valued model; the complete set of update equations is given below.
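A standard statement of the vector-valued predict/correct equations, using the D/M notation introduced above (superscripts − and + denote predicted and corrected quantities):
\[
\text{Predict:}\quad \bar{x}_t^- = D_t\,\bar{x}_{t-1}^+, \qquad \Sigma_t^- = D_t\,\Sigma_{t-1}^+\,D_t^\top + \Sigma_{d_t}
\]
\[
\text{Correct:}\quad K_t = \Sigma_t^-\,M_t^\top\left(M_t\,\Sigma_t^-\,M_t^\top + \Sigma_{m_t}\right)^{-1}
\]
\[
\bar{x}_t^+ = \bar{x}_t^- + K_t\left(y_t - M_t\,\bar{x}_t^-\right), \qquad \Sigma_t^+ = \left(I - K_t M_t\right)\Sigma_t^-
\]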

Kalman filter pros and cons. Pros: simple updates, compact and efficient. Cons: unimodal distribution (only a single hypothesis), and a restricted class of motions defined by the linear model.

Propagation of general densities

Factored sampling. Represent the state distribution non-parametrically. Prediction: sample points from the prior density for the state, P(X). Correction: weight the samples according to P(Y|X). M. Isard and A. Blake, CONDENSATION -- conditional density propagation for visual tracking, IJCV 29(1):5-28, 1998.

Particle filtering. We want to use sampling to propagate densities over time (i.e., across frames in a video sequence). At each time step, represent the posterior P(Xt|Yt) with a weighted sample set. The previous time step's sample set P(Xt-1|Yt-1) is passed to the next time step as the effective prior. M. Isard and A. Blake, CONDENSATION -- conditional density propagation for visual tracking, IJCV 29(1):5-28, 1998.

Particle filtering. Start with weighted samples from the previous time step. Sample and shift according to the dynamics model; the spread due to randomness gives the predicted density P(Xt|Yt-1). Weight the samples according to the observation density to arrive at the corrected density estimate P(Xt|Yt). M. Isard and A. Blake, CONDENSATION -- conditional density propagation for visual tracking, IJCV 29(1):5-28, 1998.
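A minimal sketch of one resample/predict/weight cycle in this spirit, assuming a toy 1D state with a random-walk dynamics model and Gaussian observation likelihood (all parameters and measurements are illustrative):

    import numpy as np

    def particle_filter_step(particles, weights, y_t, dynamics_std=1.0, obs_std=2.0):
        """One CONDENSATION-style update for a 1D state (toy model)."""
        n = len(particles)

        # Resample from the previous posterior (the effective prior for this step).
        idx = np.random.choice(n, size=n, p=weights)
        particles = particles[idx]

        # Predict: sample and shift according to the dynamics model (here, a random walk).
        particles = particles + np.random.normal(0.0, dynamics_std, size=n)

        # Correct: weight samples by the observation likelihood P(y_t | x_t), then normalize.
        weights = np.exp(-0.5 * ((y_t - particles) / obs_std) ** 2)
        weights /= weights.sum()
        return particles, weights

    # Usage: start from a broad prior and filter a short sequence of measurements.
    particles = np.random.normal(0.0, 10.0, size=500)
    weights = np.ones(500) / 500
    for y in [1.2, 1.9, 3.1, 4.0]:                # hypothetical measurements
        particles, weights = particle_filter_step(particles, weights, y)
    state_estimate = np.sum(weights * particles)  # posterior mean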

Particle filtering results http://www.robots.ox.ac.uk/~misard/condensation.html

Tracking issues. Initialization: manual, background subtraction, or detection.

Tracking issues. Initialization. Obtaining the observation and dynamics models: a generative observation model "renders" the state on top of the image and compares; a discriminative observation model uses a classifier or detector score; the dynamics model is learned (very difficult) or specified using domain knowledge.

Tracking issues. Initialization. Obtaining the observation and dynamics models. Prediction vs. correction: if the dynamics model is too strong, tracking will end up ignoring the data; if the observation model is too strong, tracking is reduced to repeated detection.

Tracking issues. Initialization. Obtaining the observation and dynamics models. Prediction vs. correction. Data association: what if we don't know which measurements to associate with which tracks?

Tracking issues. Initialization. Obtaining the observation and dynamics models. Prediction vs. correction. Data association. Drift: errors caused by the dynamics model, the observation model, and data association tend to accumulate over time.

Drift D. Ramanan, D. Forsyth, and A. Zisserman. Tracking People by Learning their Appearance. PAMI 2007.

Data association. So far, we've assumed the entire measurement to be relevant to determining the state. In reality, there may be uninformative measurements (clutter), or measurements may belong to different tracked objects. Data association is the task of determining which measurements go with which tracks.

Data association. Simple strategy: only pay attention to the measurement that is "closest" to the prediction.

Data association. Simple strategy: only pay attention to the measurement that is "closest" to the prediction. This doesn't always work…

Data association. Simple strategy: only pay attention to the measurement that is "closest" to the prediction. More sophisticated strategy: keep track of multiple state/observation hypotheses, which can be done with particle filtering. This is a general problem in computer vision; there is no easy solution.

Recall: Generative part-based models. h: assignment of features to parts. (Figure: model and candidate parts.)

Tracking people by learning their appearance. Person model = appearance + structure (+ dynamics). Structure and dynamics are generic; appearance is person-specific. Trying to acquire an appearance model "on the fly" can lead to drift. Instead, one can use the whole sequence to initialize the appearance model and then keep it fixed while tracking. Given strong structure and appearance models, tracking can essentially be done by repeated detection (with some smoothing). D. Ramanan, D. Forsyth, and A. Zisserman. Tracking People by Learning their Appearance. PAMI 2007.

Tracking people by learning their appearance: tracker overview. D. Ramanan, D. Forsyth, and A. Zisserman. Tracking People by Learning their Appearance. PAMI 2007.

Representing people

Bottom-up initialization: Clustering D. Ramanan, D. Forsyth, and A. Zisserman. Tracking People by Learning their Appearance. PAMI 2007.

Top-down initialization: Exploit “easy” poses D. Ramanan, D. Forsyth, and A. Zisserman. Tracking People by Learning their Appearance. PAMI 2007.

Tracking by model detection D. Ramanan, D. Forsyth, and A. Zisserman. Tracking People by Learning their Appearance. PAMI 2007.

Example results http://www.ics.uci.edu/~dramanan/papers/pose/index.html