Optic Flow and Motion Detection Cmput 498/613 Martin Jagersand Readings: Papers and chapters on 613 web page.



Image motion
Somehow quantify the frame-to-frame differences in image sequences:
1. Image intensity difference.
2. Optic flow: 2-dim image motion computation

Motion is used to:
Attention: Detect and direct using eye and head motions
Control: Locomotion, manipulation, tools
Vision: Segment, depth, trajectory

Small camera re-orientation Note: Almost all pixels change!

MOVING CAMERAS ARE LIKE STEREO Locations of points on the object (the “structure”) The change in spatial location between the two cameras (the “motion”)

Classes of motion
Still camera, single moving object
Still camera, several moving objects
Moving camera, still background
Moving camera, moving objects

The optic flow field
Vector field over the image: [u,v] = f(x,y), where [u,v] is the velocity vector and (x,y) the image position
FOE, FOC: Focus of Expansion, Focus of Contraction

Optic/image flow
Assumption: Image intensities from object points remain constant over time.

Taylor expansion of intensity variation
Keep linear terms
Use constancy assumption and rewrite
Notice: Linear constraint, but no unique solution
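The equations themselves did not survive this transcript; a standard reconstruction of the constancy assumption and its first-order Taylor expansion, using I for image intensity and (u, v) for the flow at pixel (x, y), is:

```latex
I(x + u\,\delta t,\; y + v\,\delta t,\; t + \delta t) = I(x, y, t)
\quad\Longrightarrow\quad
\frac{\partial I}{\partial x}\,u \;+\; \frac{\partial I}{\partial y}\,v \;+\; \frac{\partial I}{\partial t} \;=\; 0
```

This is one linear equation per pixel in the two unknowns (u, v), which is why no unique solution exists locally.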

Aperture problem
Rewrite as a dot product: (∂I/∂x, ∂I/∂y) · f = ∇Im · f = -∂I/∂t
Each pixel gives one equation in two unknowns: f · n = k, with n = ∇Im/‖∇Im‖ and k = -(∂I/∂t)/‖∇Im‖
Notice: Can only detect the flow component normal to the gradient direction
The motion of a line cannot be recovered using only local information

Aperture problem 2

The flow continuity constraint
Flows of nearby patches are nearly equal
Two equations, two unknowns: f · n1 = k1, f · n2 = k2
A unique solution exists, provided n1 and n2 are not parallel
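A minimal numerical sketch of this step, combining two made-up (non-parallel) normal-flow constraints; the values of n1, n2, k1, k2 are illustrative only, not measured from an image:

```python
import numpy as np

# Two normal-flow constraints f·n1 = k1, f·n2 = k2 from nearby patches.
n1, k1 = np.array([1.0, 0.0]), 2.0   # constraint from a vertical edge (gradient along x)
n2, k2 = np.array([0.0, 1.0]), -1.0  # constraint from a horizontal edge (gradient along y)

N = np.vstack([n1, n2])                      # rows are the normal directions
f = np.linalg.solve(N, np.array([k1, k2]))   # unique since n1, n2 are not parallel
```

When n1 and n2 approach parallel, N becomes ill-conditioned, which is exactly the sensitivity problem discussed on the next slide.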

Sensitivity to error
n1 and n2 might be almost parallel
Tiny errors in the estimates of the k's or n's can lead to huge errors in the estimate of f

Sometimes the continuity constraint is false

Using several points
Typically solve for motion in 2x2, 4x4 or larger patches
Overdetermined equation system: dI = Mu
Can be solved in e.g. the least-squares sense using Matlab: u = M\dI
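The overdetermined system dI = Mu can be sketched in a few lines of NumPy, with np.linalg.lstsq playing the role of the slide's Matlab backslash. The function name and gradient choices are illustrative, not part of the lecture:

```python
import numpy as np

def patch_flow(I0, I1):
    """Least-squares optic flow [u, v] for one image patch.

    Builds M with rows [Ix, Iy] and dI = -It, then solves dI = M u
    in the least-squares sense (the slide's u = M\\dI).
    """
    Iy, Ix = np.gradient(I0.astype(float))     # spatial derivatives (axis 0 = y)
    It = I1.astype(float) - I0.astype(float)   # temporal derivative
    M = np.stack([Ix.ravel(), Iy.ravel()], axis=1)
    dI = -It.ravel()
    u, *_ = np.linalg.lstsq(M, dI, rcond=None)
    return u  # estimated [u, v]
```

This only recovers translation; the patch must contain gradients in more than one direction for M to have full rank.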

3-6D Optic flow
Generalize to many degrees of freedom (DOFs)

All 6 freedoms: X, Y, Rotation, Scale, Aspect, Shear
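The six image-plane freedoms can be written out as basis flow fields. A sketch, where the generator fields (e.g. rotation = (-y, x)) follow the usual affine parameterization u' = A u + d; the grid size and dictionary names are arbitrary choices for illustration:

```python
import numpy as np

# Sample the six affine basis flow fields on a small grid centered at the origin.
yy, xx = np.mgrid[-2:3, -2:3].astype(float)

basis = {
    "x-trans":  (np.ones_like(xx), np.zeros_like(yy)),  # d = (1, 0)
    "y-trans":  (np.zeros_like(xx), np.ones_like(yy)),  # d = (0, 1)
    "rotation": (-yy, xx),    # A = [[0, -1], [1, 0]]
    "scale":    (xx, yy),     # A = I (uniform expansion)
    "aspect":   (xx, -yy),    # A = diag(1, -1)
    "shear":    (yy, xx),     # A = [[0, 1], [1, 0]]
}
```

Any small affine motion of a patch is a linear combination of these six fields, which is what puts the extra columns into the M-matrix of the 3-6D formulation.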

Flow vectors
Norbert's trick: Use an MPEG card to speed up motion computation

Application: MPEG compression

Other applications:
Recursive depth recovery: Kostas Danilidis and Jane Mulligan
Motion control (we will cover)
Segmentation
Tracking

Lab:
Purpose:
– Hands-on optic flow experience
– Simple translation tracking
Posted on www page.
Prepare through readings (on-line papers).
Lab tutorial: How to capture.
Questions: or course newsgroup.
Due in 1 week.

Organizing different kinds of motion
Two ideas:
1. Encode different motion structure in the M-matrix. Examples in 4, 6, 8 DOF image-plane tracking. 3D a bit harder.
2. Attempt to find a low-dimensional subspace for the set of motion vectors. Can use e.g. PCA.

Remember: The optic flow field
Vector field over the image: [u,v] = f(x,y), where [u,v] is the velocity vector and (x,y) the image position
FOE, FOC: Focus of Expansion, Focus of Contraction

Remember last lecture:
Solving for the motion of a patch
Overdetermined equation system: Im_t = Mu
Can be solved in e.g. the least-squares sense using Matlab: u = M\Im_t

3-6D Optic flow
Generalize to many degrees of freedom (DOFs): Im = Mu

Know what type of motion (Greg Hager, Peter Belhumeur)
u'_i = A u_i + d
E.g. planar object => affine motion model: I_t = g(p_t, I_0)

Mathematical Formulation
Define a “warped image” g
– f(p,x) = x' (warping function), p = warp parameters
– I(x,t) (image at location x at time t)
– g(p, I_t) = (I(f(p,x_1),t), I(f(p,x_2),t), …, I(f(p,x_N),t))’
Define the Jacobian of the warping function
– M(p,t) = ∂g/∂p
Model
– I_0 = g(p_t, I_t) (image I, variation model g, parameters p)
– ΔI = M(p_t, I_t) Δp (local linearization M)
Compute motion parameters
– Δp = (M^T M)^{-1} M^T ΔI, where M = M(p_t, I_t)
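The final step, Δp = (MᵀM)⁻¹MᵀΔI, as a one-function NumPy sketch; M and ΔI are assumed to be already assembled from the warp Jacobian and image difference:

```python
import numpy as np

def motion_update(M, dI):
    """One Gauss-Newton parameter update: dp = (M^T M)^{-1} M^T dI.

    M:  N x k Jacobian of the warped image w.r.t. the warp parameters p
    dI: length-N image difference vector
    """
    # Solve the normal equations instead of forming the inverse explicitly.
    return np.linalg.solve(M.T @ M, M.T @ dI)
```

In a tracker this update is applied iteratively, re-warping the image and re-evaluating M (or keeping M fixed, in efficient variants) until Δp is small.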

Planar 3D motion
From geometry we know that the correct plane-to-plane transform is:
1. for a perspective camera, the projective homography
2. for a linear camera (orthographic, weak-, para-perspective), the affine warp

Planar Texture Variability 1: Affine Variability
Affine warp function
Corresponding image variability
Discretized for images

On The Structure of M
u'_i = A u_i + d
Planar object + linear (infinite) camera -> affine motion model: X, Y, Rotation, Scale, Aspect, Shear

Planar Texture Variability 2: Projective Variability
Homography warp
Corresponding projective variability

Planar motion under perspective projection
Perspective plane-to-plane transforms are defined by homographies
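The homography itself is not shown in this transcript; in standard homogeneous coordinates, a plane-to-plane perspective transform is:

```latex
\begin{pmatrix} x' \\ y' \\ w' \end{pmatrix}
\;\sim\;
\begin{pmatrix}
h_{11} & h_{12} & h_{13} \\
h_{21} & h_{22} & h_{23} \\
h_{31} & h_{32} & h_{33}
\end{pmatrix}
\begin{pmatrix} x \\ y \\ 1 \end{pmatrix},
\qquad
x \mapsto \left(\frac{x'}{w'},\; \frac{y'}{w'}\right)
```

H is defined only up to scale, leaving the 8 free parameters mentioned two slides below.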

Planar-perspective motion 3
In practice it is hard to compute the 8-parameter model stably from one image, and impossible to find out-of-plane variation
Estimate a variability basis from several images: Computed vs. Estimated

Another idea (Black, Fleet): Organizing flow fields
Express flow field f in subspace basis m
Different “mixing” coefficients a correspond to different motions

Example: Image discontinuities

Mathematical formulation
Let the flow field be expressed in the motion basis: f = Σ_i a_i m_i
Minimize an objective function that applies a robust error norm to the brightness-constancy residual

Experiment: Moving camera
4x4 pixel patches
Tree in foreground separates well

Experiment: Characterizing lip motion
Very non-rigid!

Summary
Three types of visual motion extraction:
1. Optic (image) flow: Find x,y image velocities
2. 3-6D motion: Find object pose change in image coordinates based on more spatial derivatives (top down)
3. Group flow vectors into global motion patterns (bottom up)
Visual motion is still not a satisfactorily solved problem

Sensing and Perceiving Motion in Humans and Animals Martin Jagersand

How come this is perceived as motion?
Im = sin(t)U5 + cos(t)U6
Im = f1(t)U1 + … + f6(t)U6

Counterphase sin grating
Spatio-temporal pattern
– Time t, spatial x, y

Counterphase sin grating
Spatio-temporal pattern
– Time t, spatial x, y
Rewrite as a sum of two moving-wave components
Result: Standing wave is the superposition of two moving waves
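The superposition referred to here is the trigonometric product-to-sum identity; for a grating with spatial frequency k and temporal frequency ω:

```latex
\sin(kx)\,\cos(\omega t)
\;=\;
\tfrac{1}{2}\,\sin(kx - \omega t)
\;+\;
\tfrac{1}{2}\,\sin(kx + \omega t)
```

That is, the counterphase (standing) grating equals a rightward- plus a leftward-moving wave, each at half amplitude.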

Analysis:
Only one term: motion left or right
Mixture of both: standing wave
Direction can flip between left and right

Reichardt detector
QT movie

Several motion models
Gradient: in computer vision
Correlation: in biological vision
Spatiotemporal filters: unifying model

Spatial response: Gabor function
Definition:
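The definition did not survive this transcript; a standard 2-D Gabor with an isotropic Gaussian envelope (width σ), carrier frequency f and phase φ, all free parameters here, is:

```latex
g(x, y) \;=\; \exp\!\left(-\,\frac{x^2 + y^2}{2\sigma^2}\right)\,
\cos\!\left(2\pi f x + \phi\right)
```

The even (φ = 0) and odd (φ = π/2) versions form the quadrature pair used later in the energy model.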

Temporal response (Adelson, Bergen ’85)
Note: Terms from the Taylor expansion of sin(t)
Spatio-temporal: D = D_s D_t

Receptor response to counterphase grating
Separable convolution

Simplified:
For our grating (θ = 0)
Write as a sum of components: = exp(…)·(a·cos… + b·sin…)

Space-time receptive field

Combined cells
Spatial: Temporal: Both: Combined:

Result:
More directionally specific response

Energy model:
Sum odd and even phase components
Quadrature rectifier
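A toy 1-D sketch of the quadrature idea: squaring and summing the even- and odd-phase responses gives an output that does not depend on the phase of the stimulus. The stimulus and the filters are illustrative stand-ins, not the actual Adelson-Bergen spatio-temporal filters:

```python
import numpy as np

# One period sampled at 256 points; stimulus is the same grating at two phases.
t = np.linspace(0, 2 * np.pi, 256, endpoint=False)
signal_phase0 = np.cos(3 * t)
signal_phase1 = np.cos(3 * t + 1.0)

def energy(sig):
    """Quadrature energy: even-phase response^2 + odd-phase response^2."""
    even = np.dot(sig, np.cos(3 * t)) / len(t)  # even-phase filter response
    odd = np.dot(sig, np.sin(3 * t)) / len(t)   # odd-phase (quadrature) response
    return even**2 + odd**2
```

Either response alone oscillates with stimulus phase; their squared sum is constant, which is the phase-invariance the energy model is after.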

Adaption: Motion aftereffect

Where is motion processed?

Higher effects:

Equivalence: Reichardt and spatio-temporal filter models

Conclusion
Evolutionarily, motion detection is important
Early processing is modeled by Reichardt detectors or spatio-temporal filters
Higher processing is poorly understood