Joint Tracking of Features and Edges
Stan Birchfield and Shrinivas Pundlik, Clemson University


ABSTRACT

Sparse features have traditionally been tracked from frame to frame independently of one another. We propose a framework in which features are tracked jointly. Combining ideas from Lucas-Kanade and Horn-Schunck, the estimated motion of a feature is influenced by the estimated motion of neighboring features. The approach also handles the problem of tracking edges in a unified way, estimating the motion perpendicular to the edge and using the motion of neighboring features to resolve the aperture problem. Results are shown on several image sequences to demonstrate the improved performance of the approach.

CONCLUSION

➢ Principles from Horn-Schunck can be used to track features together
➢ Results demonstrate improved tracking performance over standard Lucas-Kanade
➢ The aperture problem is overcome, so that features can be tracked in untextured regions
➢ Future work:
  ➢ applying robust penalty functions to prevent smoothing across motion discontinuities
  ➢ explicit modeling of occlusions
  ➢ interpolation of dense optical flow from sparse feature points

Algorithm: Standard Lucas-Kanade

For each feature i:
1. Initialize u_i ← (0, 0)^T
2. Set λ_i ← 0
3. For pyramid level n − 1 down to 0:
   (a) Compute the gradient matrix Z_i
   (b) Repeat until convergence:
       i.   Compute the difference I_t between the first image and the shifted second image:
            I_t(x, y) = I_1(x, y) − I_2(x + u_i, y + v_i)
       ii.  Compute the error vector e_i
       iii. Solve Z_i u'_i = e_i for the incremental motion u'_i
       iv.  Add the incremental motion to the overall estimate: u_i ← u_i + u'_i
   (c) Expand to the next level: u_i ← β u_i, where β is the pyramid scale factor

Algorithm: Joint Lucas-Kanade

For each feature i:
1. Initialize u_i ← (0, 0)^T
2. Initialize λ_i
For pyramid level n − 1 down to 0:
1. For each feature i, compute the gradient matrix Z_i
2. Repeat until convergence:
   (a) For each feature i,
       i.   Determine the expected motion (û_i, v̂_i) from the neighboring features
       ii.
            Compute the difference I_t between the first image and the shifted second image:
            I_t(x, y) = I_1(x, y) − I_2(x + u_i, y + v_i)
       iii. Compute the error vector e_i
       iv.  Solve Z_i u'_i = e_i for the incremental motion u'_i
       v.   Add the incremental motion to the overall estimate: u_i ← u_i + u'_i
3. Expand to the next level: u_i ← β u_i, where β is the pyramid scale factor

EXPERIMENTAL RESULTS

[Table: the average angular error (AE) in degrees and the average endpoint error (EP) in pixels of the two algorithms, Standard LK (OpenCV) and Joint LK (our algorithm), on the Rubber Whale, Hydrangea, Venus, and Dimetrodon sequences; the numeric values do not survive in this transcript.]

[Figure: input image, gradient magnitude, Standard LK, and Joint LK results, comparing the algorithms when the scene does not contain much texture, as is often the case in indoor man-made environments.]

LUCAS-KANADE AND HORN-SCHUNCK

Notation: I is the image; I_x and I_y are the image derivatives in the x and y directions; I_t is the temporal derivative; (u, v) is the pixel displacement in the x and y directions.

Optic flow constraint equation:

    I_x u + I_y v + I_t = 0

Lucas-Kanade (sparse feature tracking) assumes the unknown displacement u of a pixel is constant within some neighborhood, i.e., it finds the displacement of a small window centered around a pixel by minimizing

    E_LK(u, v) = K_ρ ∗ (I_x u + I_y v + I_t)^2,

where K_ρ ∗ denotes convolution with an integration window of size ρ. Differentiating with respect to u and v and setting the derivatives to zero leads to a linear system:

    [ K_ρ ∗ (I_x I_x)   K_ρ ∗ (I_x I_y) ] [ u ]     [ K_ρ ∗ (I_x I_t) ]
    [ K_ρ ∗ (I_x I_y)   K_ρ ∗ (I_y I_y) ] [ v ] = − [ K_ρ ∗ (I_y I_t) ]

Horn-Schunck (dense optic flow) regularizes the unconstrained optic flow equation by imposing a global smoothness term. It computes global displacement functions u(x, y), v(x, y) by minimizing

    E_HS(u, v) = ∫_Ω [ (I_x u + I_y v + I_t)^2 + λ ( |∇u|^2 + |∇v|^2 ) ] dx dy,

where λ is the regularization parameter and Ω is the image domain. The minimum of the functional is found by solving the corresponding Euler-Lagrange equations, leading to:

    I_x (I_x u + I_y v + I_t) − λ ∇²u = 0
    I_y (I_x u + I_y v + I_t) − λ ∇²v = 0

JOINT TRACKING OF FEATURES

Joint tracking combines the algorithms of Lucas-Kanade and Horn-Schunck, i.e., it aggregates global information to improve the tracking of sparse feature points (cf. Bruhn et al., IJCV 2005). The joint Lucas-Kanade energy functional, for N features, is

    E_JLK = Σ_{i=1}^{N} [ E_D(i) + λ_i E_S(i) ],

where the data term E_D(i) = Σ_{(x,y) ∈ W_i} (I_x u_i + I_y v_i + I_t)^2 is summed over the window W_i around feature i, and the smoothness term E_S(i) = (u_i − û_i)^2 + (v_i − v̂_i)^2 penalizes deviation from the expected motion (û_i, v̂_i) computed from the neighboring features. Differentiating E_JLK with respect to the displacement (u_i, v_i) gives a 2N×2N matrix equation, whose (2i − 1)th and (2i)th rows are given by:

    ( λ_i + Σ I_x I_x ) u_i + ( Σ I_x I_y ) v_i = λ_i û_i − Σ I_x I_t
    ( Σ I_x I_y ) u_i + ( λ_i + Σ I_y I_y ) v_i = λ_i v̂_i − Σ I_y I_t

where the sums are taken over the window W_i.
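As a concrete illustration, the joint update above can be sketched in NumPy: each feature solves its regularized 2×2 system (Z_i + λ_i I) u_i = e_i + λ_i û_i in Jacobi-style sweeps, with û_i taken as the mean motion of the other features. This is a minimal single-level sketch, not the paper's implementation: the synthetic frames, window size, λ value, and the choice of all features as mutual neighbors are illustrative assumptions.

```python
import numpy as np

def image(x, y):
    # Smooth synthetic intensity pattern standing in for a real frame (assumption)
    return np.sin(0.3 * x) + np.cos(0.2 * y) + 0.1 * x * y / 50.0

H = W = 64
ys, xs = np.mgrid[0:H, 0:W].astype(float)
true_u, true_v = 0.4, 0.2                 # ground-truth displacement (small, sub-pixel)
I1 = image(xs, ys)
I2 = image(xs - true_u, ys - true_v)      # scene translated by (true_u, true_v)

# Derivatives for the optic flow constraint I_x u + I_y v + I_t = 0
Iy, Ix = np.gradient(I1)                  # np.gradient returns (d/drow, d/dcol)
It = I2 - I1

features = [(20, 20), (20, 40), (40, 20), (40, 40)]   # (row, col) window centers
r = 5                                      # window half-size, so W_i is 11x11
lam = 1.0                                  # smoothness weight lambda_i (assumption)

# Per-feature gradient matrix Z_i and error vector e_i (constant here,
# since for this small displacement we skip re-warping I2 each sweep)
Z, e = [], []
for (row, col) in features:
    wx = Ix[row - r:row + r + 1, col - r:col + r + 1]
    wy = Iy[row - r:row + r + 1, col - r:col + r + 1]
    wt = It[row - r:row + r + 1, col - r:col + r + 1]
    Z.append(np.array([[np.sum(wx * wx), np.sum(wx * wy)],
                       [np.sum(wx * wy), np.sum(wy * wy)]]))
    e.append(-np.array([np.sum(wx * wt), np.sum(wy * wt)]))

# Jacobi-style joint iterations: each feature is pulled toward the
# expected motion u_hat_i, here the mean motion of the other features
U = np.zeros((len(features), 2))
for _ in range(50):
    U_hat = np.array([np.delete(U, i, axis=0).mean(axis=0)
                      for i in range(len(features))])
    for i in range(len(features)):
        A = Z[i] + lam * np.eye(2)         # (2i-1)th and (2i)th rows for feature i
        U[i] = np.linalg.solve(A, e[i] + lam * U_hat[i])

print(U.mean(axis=0))                      # close to (true_u, true_v)
```

With a well-textured window the λ_i I term barely perturbs the standard Lucas-Kanade solution; for a low-texture window, where Z_i is nearly singular, the same term keeps the system well-conditioned and pulls the estimate toward the neighbors' motion, which is how the aperture problem is resolved.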