Formation et Analyse d’Images Session 8

Formation et Analyse d’Images Session 8 Daniela Hall 14 November 2005

Course Overview
- Session 1 (19/09/05): Human vision, homogeneous coordinates, camera models
- Session 2 (26/09/05): Tensor notation, image transformations, homography computation
- Session 3 (3/10/05): Camera calibration, reflection models, color spaces
- Session 4 (10/10/05): Pixel-based image analysis
- 17/10/05: course is replaced by Modélisation surfacique

Course overview
- Session 5 + 6 (24/10/05, 9:45 – 12:45): Contrast description, Hough transform
- Session 7 (7/11/05): Kalman filter
- Session 8 (14/11/05): Tracking of regions, pixels, and lines
- Session 9 (21/11/05): Gaussian filter operators
- Session 10 (5/12/05): Scale space
- Session 11 (12/12/05): Stereo vision, epipolar geometry
- Session 12 (16/01/06): Exercises and questions

Session overview
- Tracking of objects
- Architecture of the robust tracker
- Tracking using the Kalman filter
- Tracking using CONDENSATION

Robust tracking of objects (block diagram): Predict turns the list of targets into a list of predictions; Detection takes measurements in the predicted search regions and in trigger regions, which create new targets; Correct updates the list of targets from the measurements.

Tracking system: detects the position of the targets at each time instant (e.g. using background differencing).

Tracking system
- Supervisor: calls image acquisition, target observation and detection in a cycle
- Target observation module: ensures robust tracking by predicting target positions using a Kalman filter
- Detection module: verifies the predicted positions by measuring detection energy within the search region given by the Kalman filter; creates new targets by evaluating detection energy within trigger regions
- Parameters: noise threshold, detection energy threshold, parameters for splitting and merging
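A minimal sketch of how such a supervisor cycle could be wired together; the function names (acquire, observe, detect, correct) are hypothetical placeholders, not the actual interfaces of the course's tracker:

```python
def supervisor_cycle(acquire, observe, detect, correct, targets, trigger_regions):
    """One iteration of the supervisor loop (hypothetical interface).

    acquire()              -> current image
    observe(targets)       -> predicted search regions (Kalman predictions)
    detect(image, regions) -> measurements found inside the given regions
    correct(targets, meas) -> updated list of targets (correction + new targets)
    """
    image = acquire()
    predictions = observe(targets)
    measurements = detect(image, predictions + trigger_regions)
    return correct(targets, measurements)
```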

Detection by background differencing
Let I = (IR, IG, IB) be the image and B = (BR, BG, BB) the background. Compute a binary difference image Id, where all pixels whose difference from the background is larger than the noise threshold w are set to one. Then compute the connected components of Id to find the pixels that belong to each target. For each target, compute the mean and covariance of its pixels. The covariance is transformed into the width and height of the bounding box and the orientation of the target.
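As an illustration, a detector along these lines could be sketched as follows; the noise threshold w, the minimum blob size, and the use of scipy.ndimage.label for connected components are assumptions, not the exact implementation described in the course:

```python
import numpy as np
from scipy import ndimage

def detect_targets(I, B, w):
    """Sketch of target detection by background differencing.

    I, B : HxWx3 arrays (current image and background).
    w    : noise threshold on the per-pixel color difference.
    Returns a list of (mean, covariance) pairs, one per detected blob.
    """
    # Binary difference image: 1 where the color difference exceeds w.
    diff = np.linalg.norm(I.astype(float) - B.astype(float), axis=2)
    Id = diff > w

    # Connected components of the binary image give the candidate targets.
    labels, n = ndimage.label(Id)

    targets = []
    for k in range(1, n + 1):
        pts = np.argwhere(labels == k)      # (row, col) coordinates of the blob
        if len(pts) < 20:                   # ignore tiny noise blobs (arbitrary size)
            continue
        mean = pts.mean(axis=0)             # target position
        cov = np.cov(pts.T)                 # spatial covariance of the blob
        # Width, height and orientation of the bounding box follow from the
        # eigenvalues/eigenvectors of cov (not shown here).
        targets.append((mean, cov))
    return targets
```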

Real-time target detection
Computing connected components over the whole image is computationally expensive.
Idea: restrict the search for targets to a small number of search regions. These regions are:
- entry regions marked by the user
- search regions obtained from the Kalman filter, which predicts the next most likely position of each current target

Background adaptation to increase robustness of detection
In long-term tracking, the illumination of the scene changes. Image differencing with a static background then causes many false detections. The background is therefore updated regularly with a running average, B_t = (1 − α)·B_{t−1} + α·I_t, where α = 0.1 is the background adaptation parameter. Background adaptation allows the background to incorporate slow illumination changes.
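A minimal sketch of this update, assuming the usual running-average (exponential forgetting) form:

```python
ALPHA = 0.1  # background adaptation parameter from the slide

def update_background(B, I, alpha=ALPHA):
    """Running-average background update: B_t = (1 - alpha) * B_{t-1} + alpha * I_t."""
    return (1.0 - alpha) * B + alpha * I
```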

Example: detection module
Parameter: the detection energy threshold
- threshold too high: targets are missed or targets are split
- threshold too low: false detections
Problem: the energy threshold depends on illumination and target appearance.

Session overview
- Tracking of objects
- Architecture of the robust tracker
- Tracking using the Kalman filter
- Tracking using CONDENSATION

Tracking
Targets are represented by their position (x, y) and covariance. A first-order Kalman filter is used to predict the position of each target in the next frame. The Kalman filter provides a ROI in which to look for the target; the ROI is computed from the a posteriori estimate x̂_k and from the a posteriori error covariance P_k.
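A hedged sketch of the prediction step and of how a search region could be derived from the error covariance; the matrix names (A, Q) and the 3-sigma box are illustrative assumptions:

```python
import numpy as np

def predict(x, P, A, Q):
    """Standard Kalman prediction step: a priori estimate and covariance."""
    x_pred = A @ x
    P_pred = A @ P @ A.T + Q
    return x_pred, P_pred

def search_region(x_pred, P_pred, n_sigma=3.0):
    """Axis-aligned ROI around the predicted position.

    The half-sizes are n_sigma standard deviations of the positional
    uncertainty, taken from the 2x2 position block of the covariance.
    """
    cx, cy = x_pred[0], x_pred[1]
    sx = n_sigma * np.sqrt(P_pred[0, 0])
    sy = n_sigma * np.sqrt(P_pred[1, 1])
    return (cx - sx, cy - sy, cx + sx, cy + sy)   # (xmin, ymin, xmax, ymax)
```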

Example

Example: tracking a bouncing ball
Specifications:
- constant background
- colored ball
Problems:
- noisy observations
- motion blur
- rapid motion changes
Thanks to B. Fisher (University of Edinburgh) for providing slides and figures for this example. http://homespages.inf.ed.ac.uk/rbf/AVAUDIO/lect8.pdf

Ball physical model
- Position: z_k = (x, y)
- Position update: z_k = z_{k−1} + v_{k−1}·Δt
- Velocity update: v_k = v_{k−1} + a_{k−1}·Δt
- Acceleration (gravity, pointing down): a_k = (0, g)^T
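A small sketch of this motion model in Python; both the step function and the equivalent linear state-space matrices (for a state x = (x, y, x', y')) follow directly from the update equations above:

```python
import numpy as np

def ball_step(z, v, dt, g=9.81):
    """One step of the ball model: z is the position (x, y), v the velocity (x', y')."""
    a = np.array([0.0, g])        # constant acceleration (gravity, image y axis down)
    z_next = z + v * dt           # z_k = z_{k-1} + v_{k-1} * dt
    v_next = v + a * dt           # v_k = v_{k-1} + a_{k-1} * dt
    return z_next, v_next

def transition_matrices(dt):
    """Equivalent linear form x_k = A x_{k-1} + B a, with x = (x, y, x', y')."""
    A = np.array([[1, 0, dt, 0],
                  [0, 1, 0, dt],
                  [0, 0, 1,  0],
                  [0, 0, 0,  1]], dtype=float)
    B = np.array([[0,  0],
                  [0,  0],
                  [dt, 0],
                  [0, dt]], dtype=float)
    return A, B
```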

Robust tracking of objects: Kalman filter formulation (measurement, state vector, state equation, prediction, state control).

Robust tracking of objects: Kalman filter noise parameters (measurement noise error covariance, temporal matrix, process noise error covariance). The parameter a affects the computation speed: a large a increases the uncertainty and therefore the size of the search regions.

Kalman filter successes

Kalman filter failures

Kalman filter analysis
- smooths noisy observations
- the dynamic model fails at bounce and stop
- could also estimate the ball radius
- could plot a boundary of 95% likelihood of the ball position (the boundary would grow when the fit is bad)
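For the last point, a 95% boundary can be computed from the 2x2 position covariance of the estimate; a minimal sketch (5.991 is the 95% quantile of the chi-square distribution with 2 degrees of freedom):

```python
import numpy as np

def confidence_ellipse(mean, cov, chi2_95=5.991, n_points=100):
    """Points on the 95% likelihood boundary of a 2D Gaussian position estimate."""
    vals, vecs = np.linalg.eigh(cov)              # principal axes of the covariance
    radii = np.sqrt(chi2_95 * vals)               # semi-axis lengths of the ellipse
    t = np.linspace(0.0, 2.0 * np.pi, n_points)
    circle = np.stack([np.cos(t), np.sin(t)])     # unit circle, shape (2, n_points)
    return mean[:, None] + vecs @ (radii[:, None] * circle)
```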

Session overview
- Tracking of objects
- Architecture of the robust tracker
- Tracking using the Kalman filter
- Tracking using CONDENSATION

Tracking by CONDENSATION
CONDENSATION: Conditional Density Propagation. Also known as particle filtering.
Ref: M. Isard and A. Blake, "CONDENSATION - Conditional Density Propagation for Visual Tracking", International Journal of Computer Vision, 29(1), 1998. http://www.robots.ox.ac.uk/%7Econtours/

CONDENSATION tracking
- keeps multiple hypotheses
- updates them using new data
- selects hypotheses probabilistically
- copes with very noisy data and with changes of the process state
- tunable computational load (by choosing the number of particles)

CONDENSATION algorithm
Given a set of N hypotheses at time k, H_k = {x_{1,k}, ..., x_{N,k}}, with associated probabilities {p(x_{1,k}), ..., p(x_{N,k})}.
Repeat N times to generate H_{k+1}:
1. randomly select a hypothesis x_{u,k} from H_k with probability p(x_{u,k})
2. generate a new state vector s_k from a distribution centered at x_{u,k}
3. get the new state vector using the dynamic model x_{k+1} = f(s_k) and the Kalman filter
4. evaluate the probability p(z_{k+1} | x_{k+1}) of the observed data z_{k+1} given the state x_{k+1}
5. use Bayes' rule to get p(x_{k+1} | z_{k+1})
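A generic sketch of these five steps as a bootstrap particle filter; the Gaussian diffusion, the dynamic model f and the likelihood function are placeholders for the problem-specific choices:

```python
import numpy as np

rng = np.random.default_rng(0)

def condensation_step(hypotheses, probs, f, likelihood, z_next, diffusion=1.0):
    """One CONDENSATION iteration: (H_k, p(H_k)) -> (H_{k+1}, p(H_{k+1})).

    hypotheses : (N, d) array of state vectors x_{i,k}
    probs      : (N,) array of associated probabilities (sums to 1)
    f          : dynamic model, maps a sampled state s_k to a predicted x_{k+1}
    likelihood : function giving p(z_{k+1} | x_{k+1})
    z_next     : new observation z_{k+1}
    """
    N, d = hypotheses.shape
    # 1. randomly select hypotheses according to their probabilities
    idx = rng.choice(N, size=N, p=probs)
    selected = hypotheses[idx]
    # 2. generate new state vectors from a distribution centered on each selection
    samples = selected + rng.normal(0.0, diffusion, size=(N, d))
    # 3. propagate the samples through the dynamic model
    predicted = np.array([f(s) for s in samples])
    # 4. evaluate p(z_{k+1} | x_{k+1}) for the observed data
    weights = np.array([likelihood(z_next, x) for x in predicted])
    # 5. normalize (Bayes' rule up to a constant) to get p(x_{k+1} | z_{k+1})
    weights = weights / weights.sum()
    return predicted, weights
```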

CONDENSATION algorithm (figure from the book: Isard and Blake, Active Contours)

Why does CONDENSATION tracking work?
- many slightly different hypotheses increase the chance that one of them fits better
- the dynamic model allows switching between different motion models (motion models of the bouncing ball: bounce, freefall, stop)
- sampling by probability weeds out bad hypotheses

Tracking of the bouncing ball
1. Select 100 hypotheses x_k with probabilities p(x_k)
2. Use the estimated covariance P to create state samples s_k
3. Define a situation switching model

Tracking of bouncing ball If in STOP situation: y'=0 If in BOUNCE: x'=-0.7x', also add some random y' motion, y'=y'+r. If in FREEFALL: use freefall motion model. y'=gΔt and x'=x'+r then use Kalman filter for predicting ^xk 4. estimate hypothesis goodness by 1/||Hxk – zk||2 p(xk) is estimated from the goodness by normalization.

Example of sampling effects

Kalman filter failures fixed

Comparison: Kalman filter vs. CONDENSATION
Kalman filter:
- assumes a Gaussian motion model
- easy to parametrize
- fast
CONDENSATION:
- can track objects with non-Gaussian motion
- very good for multi-modal motion models
- simple algorithm
- reasonably fast