Computer Vision Non-linear tracking Marc Pollefeys COMP 256 Some slides and illustrations from D. Forsyth, M. Isard, T. Darrell …

Computer Vision 2 Tentative class schedule
Jan 16/18: Introduction
Jan 23/25: Cameras / Radiometry
Jan 30/Feb 1: Sources & Shadows / Color
Feb 6/8: Linear filters & edges / Texture
Feb 13/15: Multi-View Geometry / Stereo
Feb 20/22: Optical flow / Project proposals
Feb 27/Mar 1: Affine SfM / Projective SfM
Mar 6/8: Camera Calibration / Segmentation
Mar 13/15: Spring break
Mar 20/22: Fitting / Prob. Segmentation
Mar 27/29: Silhouettes and Photoconsistency / Linear tracking
Apr 3/5: Project Update / Non-linear Tracking
Apr 10/12: Object Recognition
Apr 17/19: Range data
Apr 24/26: Final project

Computer Vision 3 Final project presentation
No further assignments; focus on the project
Final presentation: presentation and/or demo (your choice, but let me know)
Short paper due April 22 by 23:59 (preferably LaTeX, IEEE proc. style)
Final presentation/demo April 24 and 26

Computer Vision 4 Bayes Filters We want to estimate the system state from noisy observations. The model has two parts: the system state dynamics (how the state evolves over time) and the observation dynamics (how measurements are generated from the state). We are interested in the belief, i.e. the posterior density of the state given the observations.
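A minimal sketch of the underlying model in standard notation; the symbols x_t (hidden state) and z_t (observation) are assumed here, not taken from the slide:

    x_t \sim p(x_t \mid x_{t-1})              % state dynamics
    z_t \sim p(z_t \mid x_t)                  % observation dynamics
    \mathrm{Bel}(x_t) = p(x_t \mid z_{1:t})   % belief (posterior density)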

Computer Vision 5 From the model above we construct the two steps of the Bayes filter: a predict step and an update step. Recall the law of total probability (used in the predict step) and Bayes' rule (used in the update step).
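With the notation assumed above, the two steps take the standard form:

    \text{Predict:}\quad p(x_t \mid z_{1:t-1}) = \int p(x_t \mid x_{t-1})\, p(x_{t-1} \mid z_{1:t-1})\, dx_{t-1}
    \text{Update:}\quad  p(x_t \mid z_{1:t}) \propto p(z_t \mid x_t)\, p(x_t \mid z_{1:t-1})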

Computer Vision 6 The predict and update steps rely on the Markov process assumptions: the current state depends only on the previous state, and the current observation depends only on the current state.
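Written out, again in the assumed generic notation:

    p(x_t \mid x_{0:t-1}, z_{1:t-1}) = p(x_t \mid x_{t-1})
    p(z_t \mid x_{0:t}, z_{1:t-1}) = p(z_t \mid x_t)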

Computer Vision 7 Bayes Filter How do we use it, and what else do we need to know? A motion model (the state dynamics), a perceptual model (the observation likelihood), and a prior belief to start from.

Computer Vision 8 Example 1 Step 0: initialization Step 1: updating

Computer Vision 9 Example 1 (continued) Step 2: predicting Step 3: updating Step 4: predicting

Computer Vision 10 Several types of Bayes filters They differ in how they represent probability densities:
–Kalman filter
–Multi-hypothesis filter
–Grid-based approach
–Topological approach
–Particle filter

Computer Vision 11 Kalman Filter Recall the general problem. The Kalman filter assumes linear dynamics, a linear observation model, and Gaussian noise, so its belief is a unimodal Gaussian. Advantage: computational efficiency. Disadvantage: the assumptions are too restrictive for many tracking problems.
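A minimal sketch of one Kalman predict/update cycle under those linear-Gaussian assumptions; the matrices A, C, Q, R and the variable names are illustrative, not taken from the slides:

    import numpy as np

    def kalman_step(mu, Sigma, z, A, C, Q, R):
        """One predict/update cycle of a linear-Gaussian Kalman filter.
        mu, Sigma : current belief (mean and covariance)
        z         : new measurement
        A, Q      : state transition matrix and process noise covariance
        C, R      : observation matrix and measurement noise covariance
        """
        # Predict: propagate the belief through the linear dynamics
        mu_pred = A @ mu
        Sigma_pred = A @ Sigma @ A.T + Q

        # Update: correct the prediction with the measurement via the Kalman gain
        S = C @ Sigma_pred @ C.T + R                  # innovation covariance
        K = Sigma_pred @ C.T @ np.linalg.inv(S)       # Kalman gain
        mu_new = mu_pred + K @ (z - C @ mu_pred)
        Sigma_new = (np.eye(len(mu)) - K @ C) @ Sigma_pred
        return mu_new, Sigma_new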

Computer Vision 12

Computer Vision 13

Computer Vision 14 Multi-hypothesis Tracking The belief is a mixture of Gaussians; each Gaussian hypothesis is tracked with its own Kalman filter, and the weights are decided on the basis of how well each hypothesis predicts the sensor measurements.
Advantage:
–can represent multimodal (mixture-of-Gaussian) beliefs
Disadvantage:
–computationally expensive
–difficult to decide on hypotheses

Computer Vision 15 Grid-based Approaches Use a discrete, piecewise-constant representation of the belief: tessellate the environment into small patches, each holding the belief that the object is in it.
Advantage:
–able to represent arbitrary distributions over the discrete state space
Disadvantage:
–computational and space complexity of keeping the position grid in memory and updating it
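A minimal 1-D histogram-filter sketch of this idea; the motion kernel and likelihood arrays below are illustrative assumptions, not from the slides:

    import numpy as np

    def grid_filter_step(belief, motion_kernel, likelihood):
        """One predict/update cycle on a discrete 1-D position grid.
        belief        : array of probabilities, one per grid cell
        motion_kernel : array giving P(moved by k cells), centered
        likelihood    : array of P(z | cell) for the current measurement
        """
        # Predict: convolve the belief with the motion model
        predicted = np.convolve(belief, motion_kernel, mode="same")
        # Update: weight each cell by the measurement likelihood, then renormalize
        updated = predicted * likelihood
        return updated / updated.sum()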

Computer Vision 16 Topological approaches The state space is a graph:
–nodes represent the object's possible locations (e.g. a room)
–edges represent connectivity (e.g. a hallway)
Advantage:
–efficiency, because the state space is small
Disadvantage:
–coarseness of the representation

Computer Vision 17 Particle filters Also known as sequential Monte Carlo methods. The belief is represented by a set of samples (particles), each carrying a nonnegative weight called an importance factor. The updating procedure is sequential importance sampling with re-sampling.
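In symbols (generic notation assumed, since the slide's own symbols are not reproduced here):

    \mathrm{Bel}(x_t) \approx \{ \langle x_t^{(i)}, w_t^{(i)} \rangle \}_{i=1}^{N}, \qquad w_t^{(i)} \ge 0, \qquad \sum_{i=1}^{N} w_t^{(i)} = 1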

Computer Vision 18 Example 2: Particle Filter Step 0: initialization Each particle has the same weight Step 1: updating weights. Weights are proportional to p(z|x)

Computer Vision 19 Example 2: Particle Filter Step 2: predicting. Predict the new locations of the particles. Step 3: updating weights. Weights are proportional to p(z|x). Step 4: predicting. Predict the new locations of the particles. Particles become more concentrated in the region where the person is more likely to be.

Computer Vision 20 Comparing the particle filter with a Bayes filter with known distribution (figure: predicting and updating steps of Example 1 and Example 2 shown side by side)

Computer Vision 21 Comments on Particle Filters
Advantages:
–able to represent arbitrary densities
–converge to the true posterior even for non-Gaussian, nonlinear systems
–efficient in the sense that particles tend to focus on regions of high probability
Disadvantage:
–worst-case complexity grows exponentially in the dimension of the state space

Computer Vision 22 Particle Filtering in CV: Initial Particle Set Particles at t = 0 are drawn from a wide prior because of the large initial uncertainty, e.g.:
–a Gaussian with large covariance
–a uniform distribution
(figure from MacCormick & Blake, 1998: the state includes shape & position; the prior is more constrained for shape)
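A minimal sketch of drawing such an initial particle set for a 2-D position state; the names and the choice of a broad Gaussian are illustrative assumptions:

    import numpy as np

    def init_particles(n, mean, cov):
        """Draw n particles from a wide Gaussian prior and give them equal weight."""
        particles = np.random.multivariate_normal(mean, cov, size=n)
        weights = np.full(n, 1.0 / n)
        return particles, weights

    # e.g. large uncertainty over a hypothetical 640x480 image:
    # particles, weights = init_particles(1000, mean=[320, 240], cov=np.diag([200**2, 150**2]))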

Computer Vision 23 Particle Filtering: Sampling Normalize the N particle weights π(1), π(2), ..., π(N) so that they sum to 1. Resample particles by picking randomly and uniformly in the [0, 1] range N times
–analogous to spinning a roulette wheel whose arc lengths equal the particle weights
This adaptively focuses particles on promising areas of the state space. (figure courtesy of D. Fox)
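A minimal sketch of this roulette-wheel (multinomial) resampling, assuming numpy arrays of particles and normalized weights:

    import numpy as np

    def resample(particles, weights):
        """Roulette-wheel resampling: draw N uniform numbers in [0, 1] and map each
        onto the cumulative weight distribution (arc lengths = particle weights)."""
        n = len(particles)
        cumulative = np.cumsum(weights)
        cumulative[-1] = 1.0                      # guard against rounding error
        picks = np.random.uniform(0.0, 1.0, size=n)
        indices = np.searchsorted(cumulative, picks)
        new_particles = particles[indices]
        new_weights = np.full(n, 1.0 / n)         # equal weights after resampling
        return new_particles, new_weights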

Computer Vision 24 Particle Filtering: Prediction Update each particle using the generative form of the dynamics: a deterministic component (aka "drift") plus a random component (aka "diffusion"). The drift may be nonlinear (i.e., a different displacement for each particle). Each particle diffuses independently, typically modeled with Gaussian noise.
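A minimal sketch of this prediction step; the drift function and noise scale are illustrative assumptions (any, possibly nonlinear, drift could be substituted):

    import numpy as np

    def predict(particles, drift, noise_std):
        """Propagate each particle: deterministic drift plus independent Gaussian diffusion.
        drift     : function mapping a particle state to its predicted state
        noise_std : standard deviation of the diffusion noise
        """
        drifted = np.array([drift(p) for p in particles])                     # drift (may be nonlinear)
        diffused = drifted + np.random.normal(0.0, noise_std, drifted.shape)  # diffusion
        return diffused

    # e.g. a hypothetical constant-velocity drift for an [x, y, vx, vy] state:
    # drift = lambda p: np.array([p[0] + p[2], p[1] + p[3], p[2], p[3]])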

Computer Vision 25 Particle Filtering: Measurement For each particle s(i), compute the new weight π(i) as the measurement likelihood: π(i) = P(z | s(i)). Enforcing plausibility: particles that represent impossible configurations are given zero likelihood
–e.g., positions outside of the image
(figure from MacCormick & Blake, 1998: a snake measurement likelihood method)
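A minimal sketch of this weighting step, assuming a user-supplied likelihood function and image bounds (both illustrative, not from the slides):

    import numpy as np

    def measure(particles, z, likelihood, image_size):
        """Weight each particle by the measurement likelihood P(z | s_i),
        giving zero weight to implausible states (e.g. outside the image)."""
        width, height = image_size
        weights = np.array([likelihood(z, p) for p in particles])
        # Enforce plausibility: zero out particles whose position leaves the image
        inside = (particles[:, 0] >= 0) & (particles[:, 0] < width) & \
                 (particles[:, 1] >= 0) & (particles[:, 1] < height)
        weights = weights * inside
        return weights / weights.sum()            # normalize so weights sum to 1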

Computer Vision 26 Particle Filtering Steps (aka CONDENSATION): sample (resampling occurs here), drift, diffuse, then measure against the measurement likelihood (figure from Isard & Blake, 1998). One full iteration is sketched below.
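Tying the sketches above together, one full CONDENSATION-style iteration might look like this (all helper names are the hypothetical ones defined in the earlier sketches):

    def condensation_step(particles, weights, z, drift, noise_std, likelihood, image_size):
        """One iteration: resample, drift + diffuse, then measure."""
        particles, weights = resample(particles, weights)          # sampling
        particles = predict(particles, drift, noise_std)           # drift + diffuse
        weights = measure(particles, z, likelihood, image_size)    # measurement update
        return particles, weights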

Computer Vision 27 Particle Filtering Visualization (courtesy of M. Isard): 1-D system; the red curve is the measurement likelihood

Computer Vision 28 CONDENSATION: Example State Posterior (from Isard & Blake, 1998). Note how the initial distribution "sharpens".

Computer Vision 29 Example: Contour-based Head Template Tracking courtesy of A. Blake

Computer Vision 30 Example: Recovering from Distraction from Isard & Blake, 1998

Computer Vision 31 Obtaining a State Estimate Note that no explicit state estimate is maintained, just a "cloud" of particles; an estimate at a particular time can be obtained by querying the current particle set. Some approaches (sketched below):
–"Mean" particle: weighted sum of the particles, with confidence given by the inverse variance
–What we really want is a mode finder: the mean of the tallest peak
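A minimal sketch of the "mean particle" estimate (the mode-finding variant is not shown; names are illustrative):

    import numpy as np

    def mean_estimate(particles, weights):
        """Weighted-mean state estimate with an inverse-variance confidence score."""
        mean = np.average(particles, axis=0, weights=weights)
        var = np.average((particles - mean) ** 2, axis=0, weights=weights)
        confidence = 1.0 / var.sum()              # higher when particles agree
        return mean, confidence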

Computer Vision 32 CONDENSATION: Estimating Target State (from Isard & Blake, 1998). Figure shows the state samples (thickness proportional to weight) and the mean of the weighted state samples.

Computer Vision 33 More examples

Computer Vision 34 Multi-Modal Posteriors When the posterior has multiple peaks, the MAP estimate is just the tallest one. This is fine when one peak dominates, but when the peaks have comparable heights we may sometimes pick the wrong one, and committing to a single possibility can lead to mistracking
–we want a wider sense of the posterior distribution so as to keep track of other good candidate states
(figure adapted from [Hong, 1995]: multiple peaks in the measurement likelihood)

Computer Vision 35 MCMC-based particle filter Models interaction between targets (a higher-dimensional state space). CNN video example (Khan, Balch & Dellaert, PAMI 2005).

Computer Vision 36 Next class: recognition