Optic Flow and Motion Detection Cmput 615 Martin Jagersand.


Image motion
Somehow quantify the frame-to-frame differences in image sequences:
1. Image intensity difference.
2. Optic flow: 2-dim image motion computation.

Motion is used to:
– Attention: detect and direct using eye and head motions
– Control: locomotion, manipulation, tools
– Vision: segment, depth, trajectory

Small camera re-orientation Note: Almost all pixels change!

Moving cameras are like stereo
– Locations of points on the object (the “structure”)
– The change in spatial location between the two cameras (the “motion”)

Classes of motion
– Still camera, single moving object
– Still camera, several moving objects
– Moving camera, still background
– Moving camera, moving objects

The optic flow field
– Vector field over the image: [u,v] = f(x,y), where [u,v] is the velocity vector and (x,y) the image position.
– FOE, FOC: focus of expansion / contraction.

Motion/Optic flow vectors
How to compute?
– Solve the pixel correspondence problem: given a pixel in Im1, look for the same pixel in Im2.
Possible assumptions:
1. Color constancy: a point in H looks the same in I. For grayscale images, this is brightness constancy.
2. Small motion: points do not move very far.
This is called the optical flow problem.

Optic/image flow
Assume:
1. Image intensities from object points remain constant over time.
2. Image displacement/motion is small.

Taylor expansion of intensity variation
– Keep linear terms.
– Use the constancy assumption and rewrite.
– Notice: a linear constraint, but no unique solution.
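The linearized constraint can be checked numerically. A minimal sketch with a synthetic ramp image (the image, gradients, and displacement are hypothetical, chosen so the linearization is exact):

```python
import numpy as np

# Brightness-constancy check on a synthetic linear ramp (hypothetical data).
# For I(x, y) = 2x + 3y, translating the scene by (u, v) changes intensity
# by exactly -(Ix*u + Iy*v), so the linearized constraint Ix*u + Iy*v + It = 0
# holds with zero residual.
x, y = np.meshgrid(np.arange(16, dtype=float), np.arange(16, dtype=float))
I1 = 2.0 * x + 3.0 * y                         # frame at time t
u_true, v_true = 1.0, 2.0                      # known motion
I2 = 2.0 * (x - u_true) + 3.0 * (y - v_true)   # frame at t+1 after motion

Ix, Iy = 2.0, 3.0        # spatial gradients, known analytically for the ramp
It = I2 - I1             # temporal difference (constant over the image)

residual = Ix * u_true + Iy * v_true + It
print(float(np.abs(residual).max()))  # 0.0
```

For real images the constraint only holds approximately, which is why the following slides solve it in a least-squares sense.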

Aperture problem
Rewrite as a dot product:

  ∇Im · (δx, δy) = −It δt,  i.e.  n · f = k

Each pixel gives one equation in two unknowns: n · f = k.
Minimum-length solution: can only detect the flow component normal to the line (i.e. along the gradient direction).
The motion of a line cannot be recovered using only local information.
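The ambiguity can be made concrete. In this sketch the gradient and temporal derivative are made-up numbers; the point is that the minimum-norm ("normal flow") solution and any flow shifted along the edge satisfy the same single constraint:

```python
import numpy as np

# One constraint, two unknowns: the minimum-norm ("normal flow") solution
# lies along the gradient, and any motion along the edge is invisible.
# Gradient and temporal derivative are made-up numbers.
grad = np.array([3.0, 4.0])    # (Ix, Iy), |grad| = 5
It = -10.0                     # temporal derivative

# Minimum-length flow satisfying grad . f = -It:
f_normal = (-It / grad.dot(grad)) * grad        # [1.2, 1.6], along grad
# Adding any multiple of the perpendicular direction changes nothing:
perp = np.array([-grad[1], grad[0]])
f_other = f_normal + 7.0 * perp
print(float(grad.dot(f_normal) + It))  # 0.0
print(float(grad.dot(f_other) + It))   # 0.0 -- the aperture ambiguity
```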

Aperture problem 2

The flow continuity constraint
– Flows of nearby pixels or patches are (nearly) equal.
– Two equations, two unknowns:
  n1 · f = k1
  n2 · f = k2
– A unique solution f exists, provided n1 and n2 are not parallel.

Sensitivity to error
– n1 and n2 might be almost parallel.
– Tiny errors in the estimates of the k’s or n’s can lead to huge errors in the estimate of f.

Using several points
– Typically solve for motion in 2x2, 4x4 or larger image patches.
– Over-determined equation system: dIm = M u
– Can be solved in the least-squares sense using Matlab: u = M\dIm
– Can also be solved using the normal equations: u = (MᵀM)⁻¹ Mᵀ dIm
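The same patch solve, sketched in NumPy rather than Matlab (the gradients below are random stand-ins for real image derivatives):

```python
import numpy as np

# Patch flow in the least-squares sense, mirroring the Matlab u = M\dIm.
# Rows of M are [Ix, Iy] at each pixel of a hypothetical 3x3 patch.
rng = np.random.default_rng(0)
u_true = np.array([0.5, -0.25])        # flow to recover
M = rng.normal(size=(9, 2))            # stand-in gradients
dIm = M @ u_true                       # noiseless temporal differences

u_lstsq = np.linalg.lstsq(M, dIm, rcond=None)[0]
u_normal = np.linalg.solve(M.T @ M, M.T @ dIm)   # normal-equations form
print(np.allclose(u_lstsq, u_true), np.allclose(u_normal, u_true))  # True True
```

Both routes give the same answer here; the normal-equations form is what the next slides analyze via the eigenvalues of MᵀM (there written AᵀA).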

3-6D Optic flow
Generalize to many degrees of freedom (DOFs).

All 6 freedoms: X, Y, rotation, scale, aspect, shear

Conditions for solvability
The SSD-optimal (u, v) satisfies the optic flow equation. When is this solvable?
– AᵀA should be invertible.
– The entries of AᵀA should not be too small (noise).
– AᵀA should be well-conditioned: study the eigenvalues; λ1/λ2 should not be too large (λ1 = larger eigenvalue).

Edge
– gradients very large or very small
– large λ1, small λ2

Low texture region
– gradients have small magnitude
– small λ1, small λ2

High textured region
– gradients are different, large magnitudes
– large λ1, large λ2
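The three regimes can be reproduced with toy gradient sets (the patches below are invented to land squarely in each regime):

```python
import numpy as np

def structure_eigs(grads):
    """Eigenvalues of A^T A for rows [Ix, Iy], returned largest first."""
    A = np.asarray(grads, dtype=float)
    return np.sort(np.linalg.eigvalsh(A.T @ A))[::-1]

# Hypothetical patches, one per regime:
edge    = [[5.0, 0.0]] * 8              # strong gradients, one direction
flat    = [[0.01, 0.02]] * 8            # tiny gradients
texture = [[5.0, 0.0], [0.0, 5.0]] * 4  # strong, mixed directions

for name, g in [("edge", edge), ("flat", flat), ("texture", texture)]:
    l1, l2 = structure_eigs(g)
    print(name, round(float(l1), 4), round(float(l2), 4))
```

The edge patch gives large λ1 and λ2 = 0, the flat patch gives two tiny eigenvalues, and the textured patch gives two large ones, exactly the cases on the slides.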

Observation
This is a two-image problem, BUT:
– We can measure sensitivity by just looking at one of the images!
– This tells us which pixels are easy to track and which are hard, which is very useful later on when we do feature tracking...

Errors in Optic flow computation
What are the potential causes of errors in this procedure?
– Suppose AᵀA is easily invertible
– Suppose there is not much noise in the image
When our assumptions are violated:
– Brightness constancy is not satisfied
– The motion is not small
– A point does not move like its neighbors
– The window size is too large
– What is the ideal window size?

Iterative Refinement
Used in the SSD/Lucas-Kanade tracking algorithm:
1. Estimate velocity at each pixel by solving the Lucas-Kanade equations.
2. Warp H towards I using the estimated flow field (use image warping techniques).
3. Repeat until convergence.
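A one-dimensional version of this loop fits in a few lines. This is a sketch, not the lecture's exact algorithm: the "images" are analytic sinusoids and the gradient is computed analytically rather than by differencing:

```python
import numpy as np

# Iterative 1-D Lucas-Kanade: H(x) = sin(x), and I is H shifted by d_true.
# Each pass linearizes around the current warp and updates the estimate.
x = np.linspace(0.0, 2.0 * np.pi, 400)
d_true = 0.30                    # true shift (sub-sample motion)
I = np.sin(x - d_true)           # "second frame"

d = 0.0                          # current estimate
for _ in range(10):
    Hw = np.sin(x - d)           # warp H by the current estimate
    grad = np.cos(x - d)         # spatial derivative of the warped signal
    It = I - Hw                  # residual temporal difference
    # Linearization: It ~ -(d_true - d) * grad, so the LS residual shift is:
    r_est = -grad.dot(It) / grad.dot(grad)
    d += r_est
print(round(d, 6))  # 0.3
```

The first pass recovers most of the shift; the warp-and-repeat structure removes the remaining linearization error within a few iterations.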

Revisiting the small motion assumption
Is this motion small enough?
– Probably not: it’s much larger than one pixel (2nd-order terms dominate).
– How might we solve this problem?

Reduce the resolution!

Coarse-to-fine optical flow estimation
[Figure: Gaussian pyramids of images H and I. A motion of u = 10 pixels at the finest level appears as u = 5, 2.5, and 1.25 pixels at successively coarser levels.]

Coarse-to-fine optical flow estimation
[Figure: run iterative L-K at the coarsest pyramid level, then repeatedly warp & upsample the flow and run iterative L-K at the next finer level of the Gaussian pyramids of H and I.]
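The numbers on the pyramid slides follow from a simple rule: each downsampling halves the apparent displacement. A trivial sketch:

```python
# Why the pyramid helps (a back-of-envelope sketch): halving the resolution
# halves the apparent displacement, so a u = 10 pixel motion at the finest
# level is only 1.25 pixels three levels up, small enough for linearized L-K.
u_finest = 10.0
apparent = [u_finest / 2**level for level in range(4)]
print(apparent)  # [10.0, 5.0, 2.5, 1.25]
```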

Application: MPEG compression

HW-accelerated computation of flow vectors
Norbert’s trick: use an MPEG card to speed up motion computation.

Other applications:
– Recursive depth recovery: Kostas and Jane
– Motion control (we will cover)
– Segmentation
– Tracking

Lab:
– Assignment 1
– Purpose: intro to image capture and processing; hands-on optic flow experience.
– See the www page for details.
– Suggestions welcome!

Organizing Optic Flow Cmput 615 Martin Jagersand

Questions to think about
Readings: book chapter, Fleet et al. paper. Compare the methods in the paper and lecture:
1. Any major differences?
2. How densely can flow be estimated (how many flow vectors per unit area)?
3. How densely in time do we need to sample?

Organizing different kinds of motion
Two examples:
1. Greg Hager paper: planar motion
2. Mike Black et al.: attempt to find a low-dimensional subspace for complex motion

Remember: The optic flow field
– Vector field over the image: [u,v] = f(x,y), where [u,v] is the velocity vector and (x,y) the image position.
– FOE, FOC: focus of expansion / contraction.

(Parenthesis) Euclidean world motion -> image
Let us assume there is one rigid object moving with translational velocity T and rotational velocity w = dR/dt.
For a given point P on the object, we have p = f P/z.
The velocity of the point is V = −T − w × P.
Therefore, the image velocity is v = dp/dt = f (z V − Vz P)/z².

Component-wise:
– Motion due to translation: depends on depth.
– Motion due to rotation: independent of depth.
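Both properties can be verified directly from the slide's formula v = f(zV − Vz P)/z². The motions and points below are arbitrary test values:

```python
import numpy as np

def image_velocity(P, T, w, f=1.0):
    """Image velocity of the projection of 3-D point P, from the slide's
    v = f (z V - Vz P) / z^2 with point velocity V = -T - w x P."""
    P = np.asarray(P, float)
    V = -np.asarray(T, float) - np.cross(np.asarray(w, float), P)
    X, Y, Z = P
    return f * np.array([V[0] * Z - X * V[2], V[1] * Z - Y * V[2]]) / Z**2

T = [1.0, 0.0, 0.0]   # hypothetical sideways translation
w = [0.0, 0.1, 0.0]   # hypothetical rotation about the Y axis

near = image_velocity([0.0, 0.0, 2.0], T, [0.0, 0.0, 0.0])
far  = image_velocity([0.0, 0.0, 8.0], T, [0.0, 0.0, 0.0])
print(near.tolist(), far.tolist())   # translational flow shrinks with depth

rot_near = image_velocity([0.0, 0.0, 2.0], [0.0, 0.0, 0.0], w)
rot_far  = image_velocity([0.0, 0.0, 8.0], [0.0, 0.0, 0.0], w)
print(rot_near.tolist(), rot_far.tolist())  # identical: rotation is depth-free
```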

Remember last lecture:
– Solving for the motion of a patch.
– Over-determined equation system: Im_t = M u
– Can be solved e.g. in the least-squares sense using Matlab: u = M\Im_t

3-6D Optic flow
Generalize to many degrees of freedom (DOFs): Im = M u

Know what type of motion (Greg Hager, Peter Belhumeur)
u′_i = A u_i + d
E.g. planar object => affine motion model: I_t = g(p_t, I_0)

Mathematical Formulation
Define a “warped image” g:
– f(p, x) = x′ (warping function), p = warp parameters
– I(x, t) (image at location x at time t)
– g(p, I_t) = (I(f(p, x_1), t), I(f(p, x_2), t), …, I(f(p, x_N), t))′
Define the Jacobian of the warping function: M(p, t)
Model:
– I_0 = g(p_t, I_t) (image I, variation model g, parameters p)
– ΔI = M(p_t, I_t) Δp (local linearization M)
Compute motion parameters:
– Δp = (MᵀM)⁻¹ Mᵀ ΔI, where M = M(p_t, I_t)

Planar 3D motion
From geometry we know that the correct plane-to-plane transform is:
1. for a perspective camera, the projective homography
2. for a linear camera (orthographic, weak-, para-perspective), the affine warp

Planar Texture Variability 1 – Affine Variability
– Affine warp function
– Corresponding image variability
– Discretized for images

On The Structure of M
u′_i = A u_i + d
Planar object + linear (infinite) camera -> affine motion model with six freedoms: X, Y, rotation, scale, aspect, shear.
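The six affine freedoms can be enumerated explicitly. A sketch with a hypothetical step size eps; the rotation uses the linearized (small-angle) form so every warp stays within the affine model:

```python
import numpy as np

def affine_warp(pts, A, d):
    """Apply u' = A u + d to an array of 2-D points (rows)."""
    return pts @ np.asarray(A, float).T + np.asarray(d, float)

pts = np.array([[1.0, 0.0], [0.0, 1.0]])   # two probe points
eps = 0.1                                  # hypothetical parameter step
warps = {
    "X translation": (np.eye(2), [eps, 0.0]),
    "Y translation": (np.eye(2), [0.0, eps]),
    "rotation":      ([[1.0, -eps], [eps, 1.0]], [0.0, 0.0]),  # small-angle
    "scale":         ([[1.0 + eps, 0.0], [0.0, 1.0 + eps]], [0.0, 0.0]),
    "aspect":        ([[1.0 + eps, 0.0], [0.0, 1.0 - eps]], [0.0, 0.0]),
    "shear":         ([[1.0, eps], [eps, 1.0]], [0.0, 0.0]),
}
for name, (A, d) in warps.items():
    print(name, affine_warp(pts, A, d).round(2).tolist())
```

Each entry perturbs the identity warp in exactly one of the six named directions, which is what gives M its six-column structure.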

Planar Texture Variability 2 – Projective Variability
– Homography warp
– Projective variability (the slide gives the expression and the definitions of its terms)

Planar motion under perspective projection
Perspective plane-to-plane transforms are defined by homographies.

Planar-perspective motion 3
– In practice it is hard to compute the 8-parameter model stably from one image, and impossible to find out-of-plane variation.
– Estimate the variability basis from several images (figure: computed vs. estimated basis).

Another idea (Black, Fleet): Organizing flow fields
– Express the flow field f in a subspace basis m.
– Different “mixing” coefficients a correspond to different motions.

Example: Image discontinuities

Mathematical formulation
Minimize an objective function over the mixing coefficients, scoring the difference between the observed flow and the basis combination with a robust error norm.
– Motion basis
– Robust error norm
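A sketch of this fit with invented basis fields and Geman-McClure-style weights (the paper's exact norm and optimizer are not shown on the slide, so this is only illustrative): iteratively reweighted least squares downweights large residuals, so a few corrupted flow vectors do not pull the coefficients off.

```python
import numpy as np

# Robust subspace-flow fit via IRLS (illustrative data). f is a flow field
# lying in the span of 3 hypothetical basis fields M, plus a few outliers.
rng = np.random.default_rng(1)
M = rng.normal(size=(100, 3))          # basis fields, one column each
a_true = np.array([2.0, -1.0, 0.5])    # true mixing coefficients
f = M @ a_true
f[:5] += 50.0                          # corrupt a few flow vectors

a = np.linalg.lstsq(M, f, rcond=None)[0]   # plain LS start (biased by outliers)
for _ in range(20):                    # IRLS with Geman-McClure-style weights
    r = f - M @ a
    w = 1.0 / (1.0 + r**2) ** 2        # big residuals get tiny weight
    Mw = M * w[:, None]
    a = np.linalg.solve(M.T @ Mw, Mw.T @ f)
print(np.round(a, 3).tolist())  # [2.0, -1.0, 0.5]
```

Plain least squares is visibly biased by the five outliers; after reweighting, their weights are effectively zero and the clean inliers determine the coefficients.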

Experiment
Moving camera
– 4x4 pixel patches
– The tree in the foreground separates well.

Experiment: Characterizing lip motion
Very non-rigid!

Summary
Three types of visual motion extraction:
1. Optic (image) flow: find x, y image velocities.
2. 3-6D motion: find object pose change in image coordinates, based on more spatial derivatives (top-down).
3. Group flow vectors into global motion patterns (bottom-up).
Visual motion is still not a satisfactorily solved problem.

Sensing and Perceiving Motion Cmput 610 Martin Jagersand

How come this is perceived as motion?
Im = sin(t) U5 + cos(t) U6
Im = f1(t) U1 + … + f6(t) U6

Counterphase sin grating
Spatio-temporal pattern
– Time t, spatial x, y

Counterphase sin grating
Spatio-temporal pattern
– Time t, spatial x, y
Rewrite as a sum: sin(Kx) cos(ωt) = ½ sin(Kx − ωt) + ½ sin(Kx + ωt)
Result: the standing wave is a superposition of two moving waves.
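The superposition claim is the trigonometric identity sin(Kx)cos(ωt) = ½[sin(Kx − ωt) + sin(Kx + ωt)], easy to confirm numerically (wavenumber, frequency, and the sampled instant below are arbitrary):

```python
import numpy as np

# Verify sin(Kx)cos(wt) = 0.5*sin(Kx - wt) + 0.5*sin(Kx + wt) numerically:
# a counterphase grating equals a leftward plus a rightward moving wave,
# each of half amplitude.
x = np.linspace(0.0, 1.0, 64)
t = 0.37                              # arbitrary instant
K, w = 2 * np.pi * 3, 2 * np.pi * 5   # arbitrary wavenumber and frequency

standing = np.sin(K * x) * np.cos(w * t)
moving = 0.5 * (np.sin(K * x - w * t) + np.sin(K * x + w * t))
print(bool(np.allclose(standing, moving)))  # True
```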

Analysis:
– Only one term: motion left or right.
– Mixture of both: standing wave.
– Direction can flip between left and right.

Reichardt detector
QT movie

Several motion models
– Gradient: in computer vision
– Correlation: in biological vision
– Spatiotemporal filters: unifying model

Spatial response: Gabor function
Definition:
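A standard 1-D Gabor is a Gaussian envelope times a sinusoid; the slide's exact parameters are not reproduced in the transcript, so sigma and frequency here are illustrative:

```python
import numpy as np

def gabor(x, sigma=1.0, freq=1.0, phase=0.0):
    """1-D Gabor: Gaussian envelope times a sinusoidal carrier."""
    return np.exp(-x**2 / (2.0 * sigma**2)) * np.cos(2.0 * np.pi * freq * x + phase)

x = np.linspace(-3.0, 3.0, 301)
even = gabor(x)                      # cosine (even) phase
odd = gabor(x, phase=-np.pi / 2.0)   # sine (odd) phase: the quadrature pair
peak = float(even.max())
print(round(peak, 3))                # 1.0, at the center of the envelope
```

The even/odd pair built here is exactly what the later energy-model slide squares and sums.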

Temporal response: Adelson & Bergen ’85
Note: terms from the Taylor expansion of sin(t).
Spatio-temporal: D = D_s D_t

Receptor response to counterphase grating
Separable convolution

Simplified:
– For our grating (θ = 0):
– Write as a sum of components: exp(…)(a cos… + b sin…)

Space-time receptive field

Combined cells
[Figure: spatial (Spat) and temporal (Temp) profiles, shown both separately and combined (Comb).]

Result:
More directionally specific response.

Energy model:
– Sum odd and even phase components.
– Quadrature rectifier.
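A sketch of the quadrature idea with illustrative sinusoidal filters (not the lecture's actual kernels): squaring and summing the even- and odd-phase responses gives an energy that is independent of stimulus phase.

```python
import numpy as np

# Phase-invariant motion energy: square and sum the responses of an even
# filter and its odd (90-degree shifted) quadrature partner.
t = np.linspace(0.0, 2.0 * np.pi, 1000, endpoint=False)
even, odd = np.cos(t), np.sin(t)       # quadrature filter pair

energies = []
for phase in (0.0, 1.0, 2.5):          # stimulus phase varies...
    stimulus = np.cos(t + phase)
    resp_e = stimulus.dot(even) / len(t)   # even-filter response
    resp_o = stimulus.dot(odd) / len(t)    # odd-filter response
    energies.append(resp_e**2 + resp_o**2)
print([round(e, 6) for e in energies])  # [0.25, 0.25, 0.25] -- constant
```

Either response alone oscillates with the stimulus phase; their squared sum does not, which is why the energy model responds to motion rather than to absolute phase.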

Adaption: Motion aftereffect

Where is motion processed?

Higher effects:

Equivalence: Reichardt detectors and spatiotemporal filters

Conclusion
– Evolutionarily, motion detection is important.
– Early processing is modeled by the Reichardt detector or spatio-temporal filters.
– Higher processing is poorly understood.