Optical Flow Estimation


Optical Flow Estimation Lucas-Kanade Algorithm and its Extensions Many slides from Alexei Efros and Steve Seitz

2-D Motion ≠ Optical Flow 2-D motion: projection of 3-D motion, depending on the 3-D object motion and the projection model. Optical flow: "perceived" 2-D motion based on changes in image patterns; it also depends on illumination and object surface texture. Nevertheless, people often use these two terms interchangeably. (Classic example: a static sphere under changing illumination yields optical flow but no 2-D motion; a spinning uniform sphere under static illumination yields 2-D motion but no optical flow.)

Problem Definition: Optical Flow How to estimate pixel motion from image H to image I? Why is it useful? Depth (3D) reconstruction, motion detection and tracking of moving objects, compression, mosaics, and more…

Problem Definition: Optical Flow How to estimate pixel motion from image H to image I? Solve the pixel correspondence problem: given a pixel in H, look for nearby pixels of the same color in I. Key assumptions: color constancy, i.e., a point in H looks the same in I (for grayscale images this is brightness constancy); small motion, i.e., points do not move very far. This is called the optical flow problem.

Motion estimation: Optical flow Goal: estimate the motion of each pixel

Classes of Techniques Feature-based methods: extract visual features (corners, textured areas) and track them over multiple frames. Sparse motion fields, but possibly robust tracking; suitable especially when image motion is large (tens of pixels). Direct methods: directly recover image motion from spatio-temporal image brightness variations; motion vectors are recovered without an intermediate feature-extraction step. Dense motion fields, but more sensitive to appearance variations; suitable for video and when image motion is small (< 10 pixels).
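To make the two families concrete, here is a minimal sketch using OpenCV (a library choice of this transcript's editor, not prescribed by the slides): sparse feature-based tracking with pyramidal Lucas-Kanade versus the dense direct method of Farnebäck. The file names and parameter values are hypothetical.

```python
# Minimal sketch: feature-based (sparse) vs. direct (dense) optical flow with OpenCV.
import cv2

prev_gray = cv2.imread("frame0.png", cv2.IMREAD_GRAYSCALE)  # hypothetical frames
next_gray = cv2.imread("frame1.png", cv2.IMREAD_GRAYSCALE)

# Feature-based: detect good corners, then track them with pyramidal Lucas-Kanade.
p0 = cv2.goodFeaturesToTrack(prev_gray, maxCorners=200, qualityLevel=0.01, minDistance=7)
p1, status, err = cv2.calcOpticalFlowPyrLK(prev_gray, next_gray, p0, None,
                                           winSize=(21, 21), maxLevel=3)
sparse_flow = (p1 - p0)[status.ravel() == 1]        # one motion vector per tracked corner

# Direct (dense): recover a flow vector at every pixel from brightness variations.
dense_flow = cv2.calcOpticalFlowFarneback(prev_gray, next_gray, None,
                                          pyr_scale=0.5, levels=3, winsize=15,
                                          iterations=3, poly_n=5, poly_sigma=1.2, flags=0)
print(sparse_flow.shape, dense_flow.shape)          # (M, 1, 2) vectors vs. (H, W, 2) field
```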

Optical flow constraints (grayscale images) Let's take a closer look at these constraints. Brightness constancy: Q: what's the equation? A: H(x, y) = I(x+u, y+v). Small motion: u and v are less than 1 pixel, so suppose we take the Taylor series expansion of I:
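A reconstruction of the equations that appeared as graphics on this slide (the standard first step of the Lucas-Kanade derivation):

\[ H(x, y) = I(x+u, y+v), \qquad I(x+u, y+v) \approx I(x, y) + \frac{\partial I}{\partial x}\, u + \frac{\partial I}{\partial y}\, v = I(x, y) + I_x u + I_y v. \]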

Optical flow equation Combining these two equations yields the optical flow constraint, written out below. In the limit, as u and v go to zero, this first-order approximation is good.
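The combined constraint (a reconstruction of the slide's graphic):

\[ 0 = I(x+u, y+v) - H(x, y) \approx \big(I(x, y) - H(x, y)\big) + I_x u + I_y v = I_t + \nabla I \cdot \begin{bmatrix} u \\ v \end{bmatrix}, \]

i.e., the brightness constancy (optical flow) constraint \( I_x u + I_y v + I_t \approx 0 \), where \( I_t \) is the temporal difference between the two images.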

Optical flow equation Q: How many equations and how many unknowns do we have per pixel? A: u and v are unknown, 1 equation. Intuitively, what does this constraint mean? The component of the flow in the gradient direction is determined; the component of the flow parallel to an edge is unknown. This explains the barber pole illusion: http://www.sandlotscience.com/Ambiguous/barberpole.htm
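In symbols (my reconstruction of the remark above): dividing the constraint by \( \|\nabla I\| \) shows that only the normal component of the flow is determined,

\[ \frac{\nabla I}{\|\nabla I\|} \cdot \begin{bmatrix} u \\ v \end{bmatrix} = -\frac{I_t}{\|\nabla I\|}, \]

while the component perpendicular to \( \nabla I \) (parallel to the edge) is left unconstrained.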

Getting more Equations How to get more equations for a pixel? Basic idea: impose additional constraints. The most common is to assume that the flow field is smooth locally; one method is to assume the pixel's neighbors have the same (u, v). If we use a 5x5 window, this gives us 25 equations per pixel, stacked as shown below!
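Stacking the constraint for the 25 pixels \( p_1, \dots, p_{25} \) in the window gives an over-constrained linear system (a reconstruction of the slide's graphic):

\[ \underbrace{\begin{bmatrix} I_x(p_1) & I_y(p_1) \\ \vdots & \vdots \\ I_x(p_{25}) & I_y(p_{25}) \end{bmatrix}}_{A \; (25 \times 2)} \underbrace{\begin{bmatrix} u \\ v \end{bmatrix}}_{d \; (2 \times 1)} = \underbrace{-\begin{bmatrix} I_t(p_1) \\ \vdots \\ I_t(p_{25}) \end{bmatrix}}_{b \; (25 \times 1)}. \]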

Lucas-Kanade Scheme We have an over-constrained equation set. Solution: solve the least squares problem; the minimum least squares solution is given by the solution (in d) of the normal equations below. The summations are over all pixels in the K x K window. This technique was first proposed by Lucas & Kanade (1981).
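The normal equations \( A^T A \, d = A^T b \), written out (a reconstruction of the slide's graphic):

\[ \begin{bmatrix} \sum I_x^2 & \sum I_x I_y \\ \sum I_x I_y & \sum I_y^2 \end{bmatrix} \begin{bmatrix} u \\ v \end{bmatrix} = -\begin{bmatrix} \sum I_x I_t \\ \sum I_y I_t \end{bmatrix}. \]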

When Can We Solve This? The optimal (u, v) satisfies the Lucas-Kanade equation. When is this solvable? A^T A should be invertible; the eigenvalues of A^T A should not be too small (noise); A^T A should be well-conditioned, i.e., λ1/λ2 should not be too large (λ1 = larger eigenvalue). A^T A is invertible when there is no aperture problem.
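A minimal sketch of this check in Python/NumPy (my own illustration, not code from the slides; the thresholds min_eig and max_cond are hypothetical and depend on image scaling):

```python
import numpy as np

def lk_solvable(Ix, Iy, min_eig=1e-2, max_cond=100.0):
    """Decide whether A^T A for one window is well-conditioned enough for Lucas-Kanade.
    Ix, Iy: image-gradient values over a K x K window (2-D arrays)."""
    AtA = np.array([[np.sum(Ix * Ix), np.sum(Ix * Iy)],
                    [np.sum(Ix * Iy), np.sum(Iy * Iy)]])
    lam_small, lam_large = np.linalg.eigvalsh(AtA)   # eigenvalues in ascending order
    # Reject windows with tiny eigenvalues (noise) or a large eigenvalue ratio
    # (aperture problem: all gradients point in roughly the same direction).
    return lam_small > min_eig and (lam_large / max(lam_small, 1e-12)) < max_cond
```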

The Aperture Problem

The Aperture Problem Let M = A^T A = ∑ ∇I (∇I)^T and let the right-hand side be A^T b = -∑ ∇I I_t (the quantities in the normal equations above). Algorithm: at each pixel, compute d = (u, v) by solving M d = A^T b. M is singular if all gradient vectors point in the same direction, e.g., along an edge; it is of course trivially singular if the summation is over a single pixel or if there is no texture, i.e., only normal flow is available (aperture problem). Corners and textured areas are OK.

Local Patch Analysis

Edge: large gradients, all in the same direction; large λ1, small λ2.

Low-texture region: gradients have small magnitude; small λ1, small λ2.

High-texture region: gradients differ and have large magnitudes; large λ1, large λ2.

Observation This is a two-image problem, BUT we can measure sensitivity/uncertainty by looking at just one of the images! This tells us which pixels are easy to track and which are hard; very useful later on when we do feature tracking...
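For instance (an illustration with OpenCV, not something the slides prescribe), the smaller eigenvalue of A^T A can be computed per pixel from a single image; the threshold below is hypothetical:

```python
import cv2
import numpy as np

gray = cv2.imread("frame0.png", cv2.IMREAD_GRAYSCALE).astype(np.float32)  # hypothetical file
# Smaller eigenvalue of the structure tensor (A^T A) in a 7x7 window around each
# pixel: large values mark the pixels that are easy to track (corners, texture).
min_eig = cv2.cornerMinEigenVal(gray, blockSize=7)
trackable = min_eig > 0.01 * min_eig.max()          # hypothetical threshold
print(f"{trackable.sum()} candidate pixels out of {trackable.size}")
```

This is essentially the Shi-Tomasi "good features to track" criterion used for feature selection.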

Errors in Lucas-Kanade What are the potential causes of errors in this procedure? Suppose A^T A is easily invertible and there is not much noise in the image; errors arise when our assumptions are violated: brightness constancy is not satisfied; the motion is not small; a point does not move like its neighbors, i.e., the window size is too large (what is the ideal window size?).

Iterative Refinement Iterative Lucas-Kanade algorithm: (1) estimate the velocity at each pixel by solving the Lucas-Kanade equations; (2) warp H towards I using the estimated flow field, using image warping techniques (easier said than done); (3) repeat until convergence. A sketch of this loop follows below.
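A compact sketch in Python/NumPy/SciPy (my own illustration under the slide's assumptions; note that it warps I towards H rather than H towards I, which is equivalent up to a sign convention, and it recomputes gradients each iteration rather than using the caching trick mentioned later):

```python
import numpy as np
from scipy.ndimage import map_coordinates, uniform_filter

def lk_flow(H, I, win=5):
    """One Lucas-Kanade pass: dense (u, v) from float image H to I via the
    windowed normal equations (window means stand in for window sums)."""
    Iy, Ix = np.gradient(I)                            # np.gradient: axis 0 = y, axis 1 = x
    It = I - H
    Sxx = uniform_filter(Ix * Ix, win); Sxy = uniform_filter(Ix * Iy, win)
    Syy = uniform_filter(Iy * Iy, win); Sxt = uniform_filter(Ix * It, win)
    Syt = uniform_filter(Iy * It, win)
    det = Sxx * Syy - Sxy ** 2
    det = np.where(np.abs(det) < 1e-9, np.inf, det)    # ill-conditioned pixels -> zero flow
    u = (-Syy * Sxt + Sxy * Syt) / det
    v = ( Sxy * Sxt - Sxx * Syt) / det
    return u, v

def iterative_lk(H, I, win=5, n_iters=10):
    """Estimate flow, warp I with it, re-estimate the residual, and accumulate."""
    h, w = H.shape
    yy, xx = np.mgrid[0:h, 0:w].astype(np.float64)
    u = np.zeros((h, w)); v = np.zeros((h, w))
    for _ in range(n_iters):
        warped = map_coordinates(I, [yy + v, xx + u], order=1, mode="nearest")
        du, dv = lk_flow(H, warped, win)               # residual motion after warping
        u, v = u + du, v + dv
    return u, v
```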

Optical Flow: Iterative Estimation (figure sequence: starting from an initial guess x0, each iteration produces an estimate x, which is then updated with the next residual estimate; d is used for displacement here instead of u)

Optical Flow: Iterative Estimation Some Implementation Issues: Warp one image, take derivatives of the other so you don’t need to re-compute the gradient after each iteration. Often useful to low-pass filter the images before motion estimation (for better derivative estimation, and linear approximations to image intensity)
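For example, a common pre-processing choice (an editor's illustration, not prescribed by the slides) is a small Gaussian blur followed by derivative filters:

```python
import cv2

def preprocess(img, sigma=1.0):
    """Low-pass filter an image, then compute its spatial derivatives."""
    img = cv2.GaussianBlur(img.astype("float32"), ksize=(0, 0), sigmaX=sigma)
    Ix = cv2.Sobel(img, cv2.CV_32F, 1, 0, ksize=3) / 8.0   # d/dx (normalized 3x3 Sobel)
    Iy = cv2.Sobel(img, cv2.CV_32F, 0, 1, ksize=3) / 8.0   # d/dy
    return img, Ix, Iy
```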

Optical Flow: Aliasing Temporal aliasing causes ambiguities in optical flow because images can have many pixels with the same intensity, i.e., how do we know which 'correspondence' is correct? (Figure: actual shift vs. estimated shift; the nearest match is correct when there is no aliasing, and incorrect when there is aliasing.) To overcome aliasing: coarse-to-fine estimation.

Revisiting the small motion assumption Is this motion small enough? Probably not—it’s much larger than one pixel (2nd order terms dominate) How might we solve this problem?

Reduce the resolution!

Coarse-to-fine optical flow estimation (figure: Gaussian pyramids of image H and image I; a motion of u = 10 pixels at full resolution corresponds to 5, 2.5, and 1.25 pixels at successively coarser pyramid levels)

Coarse-to-fine optical flow estimation (figure: run iterative L-K at the coarsest level of the Gaussian pyramids of image H and image I, then repeatedly warp & upsample and run iterative L-K again at the next finer level)

Multi-resolution Lucas-Kanade Algorithm Compute iterative LK at the highest (coarsest) level. For each level i: take the flow u(i-1), v(i-1) from level i-1; upsample the flow to create u*(i), v*(i) matrices at twice the resolution for level i; multiply u*(i), v*(i) by 2; compute It from a block displaced by u*(i), v*(i); apply LK to get u'(i), v'(i) (the correction in flow); add the corrections to obtain the flow at the i-th level, i.e., u(i) = u*(i) + u'(i), v(i) = v*(i) + v'(i). A sketch follows below.
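A sketch of this coarse-to-fine loop (my own illustration; the lk_step argument is a hypothetical hook for a single-level LK routine such as the iterative_lk sketch above, and the parameter values are arbitrary):

```python
import cv2
import numpy as np

def pyramid_lk(H, I, lk_step, levels=4):
    """Coarse-to-fine Lucas-Kanade. lk_step(H, I) must return a dense (du, dv)
    flow correction between two roughly aligned images."""
    pyr_H, pyr_I = [H.astype(np.float32)], [I.astype(np.float32)]
    for _ in range(levels - 1):                          # build Gaussian pyramids
        pyr_H.append(cv2.pyrDown(pyr_H[-1]))
        pyr_I.append(cv2.pyrDown(pyr_I[-1]))

    u = v = None
    for Hl, Il in zip(reversed(pyr_H), reversed(pyr_I)): # coarsest -> finest
        h, w = Hl.shape
        if u is None:
            u = np.zeros((h, w), np.float32); v = np.zeros((h, w), np.float32)
        else:
            # Upsample the coarser flow to this level and multiply by 2,
            # since one pixel at the coarser level spans two pixels here.
            u = 2.0 * cv2.resize(u, (w, h), interpolation=cv2.INTER_LINEAR)
            v = 2.0 * cv2.resize(v, (w, h), interpolation=cv2.INTER_LINEAR)
        # Warp I by the current flow, then estimate only the residual correction.
        xx, yy = np.meshgrid(np.arange(w, dtype=np.float32),
                             np.arange(h, dtype=np.float32))
        warped = cv2.remap(Il, xx + u, yy + v, interpolation=cv2.INTER_LINEAR,
                           borderMode=cv2.BORDER_REPLICATE)
        du, dv = lk_step(Hl, warped)
        u = (u + du).astype(np.float32)
        v = (v + dv).astype(np.float32)
    return u, v

# e.g. (hypothetical): u, v = pyramid_lk(H, I, lk_step=iterative_lk)
```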

Optical Flow: Iterative Estimation Some Implementation Issues: Warping is not easy (ensure that errors in warping are smaller than the estimate refinement) Warp one image, take derivatives of the other so you don’t need to re-compute the gradient after each iteration. Often useful to low-pass filter the images before motion estimation (for better derivative estimation, and linear approximations to image intensity)

Optical Flow Results

Optical Flow Results

Optical Flow Results

Other Classical Methods Spatial smoothness [e.g., Horn & Schunck:81, Anandan:89]: a heuristic, violated at depth discontinuities. Increase aperture [e.g., Lucas & Kanade]: local singularities at degenerate image regions. Increase the analysis window to large image regions => global model constraints: numerically stable, but requires prior model selection: planar (2D) world model [e.g., Bergen et al.:92, Irani et al.:92+94, Black et al.] or 3D world model [e.g., Hanna et al.:91+93, Stein & Shashua:97, Irani et al.:1999].