Motion Field and Optical Flow
May 2004
Acknowledgment for the slides: Professor Yuan-Fang Wang and Professor Octavia Camps

Motion Field and Optical Flow
So far, our algorithms have dealt with a single, static image. In the real world, a static pattern is a rarity; continuous motion and change are the rule. Human eyes are well equipped to take advantage of motion or change in an image sequence.
– In our discussions, an image sequence is a discrete set of N images taken at discrete time instants (which may or may not be uniformly spaced in time).
Image changes are due to the relative motion between the camera and the scene (illumination being constant).

Example: Ullman's Concentric Counter-Rotating Cylinder Experiment
Two concentric cylinders of different radii, with a random dot pattern on both surfaces (the cylinder surfaces and boundaries themselves are not displayed):
– Stationary: the viewer is not able to tell the two cylinders apart
– Counter-rotating: the two structures become apparent

Example (cont.)
Motion helps in:
– segmentation (two structures)
– identification (two cylinders)

General Scenario
Possibilities:
– the camera moving, the scene stationary
– the camera stationary, objects moving
– both the camera and the scene moving

Visual Motion
Allows us to compute useful properties of the 3D world with very little knowledge. Example: time to collision.

Time to Collision
An object of height L moves toward a camera of focal length f with constant velocity v:
– At time t = 0 the object is at distance D(0) = D_o
– At time t it is at D(t) = D_o − vt
– It will crash into the camera at the time τ when D(τ) = D_o − vτ = 0, i.e. τ = D_o / v
© Octavia Camps

Time to Collision (cont.)
The image of the object has size l(t) = f L / D(t). Taking the derivative with respect to time gives l'(t) = f L v / D(t)². And their ratio is l(t) / l'(t) = D(t) / v.
© Octavia Camps

Time to Collision (cont.)
And the time to collision is τ(t) = D(t) / v = l(t) / l'(t). The ratio l / l' can be directly measured from the image, so τ can be found without knowing L or D_o or v!
© Octavia Camps
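The ratio above is easy to check numerically. A minimal sketch: the scene values f, L, D_o and v below are made up purely to simulate the image measurements l(t); the estimator itself uses only image sizes.

```python
# Time-to-collision from image measurements alone.
def time_to_collision(l_now, l_next, dt):
    """tau ~ l / (dl/dt); needs no knowledge of L, Do, or v."""
    dl_dt = (l_next - l_now) / dt
    return l_now / dl_dt

# Simulate an object of height L approaching at constant velocity v
# (made-up values, used only to generate the measurements).
f, L, Do, v = 1.0, 2.0, 100.0, 10.0

def image_size(t):
    # Perspective projection: l(t) = f * L / D(t), with D(t) = Do - v*t.
    return f * L / (Do - v * t)

dt = 0.001
tau = time_to_collision(image_size(0.0), image_size(dt), dt)
print(round(tau, 2))   # close to the true Do / v = 10.0
```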

Comparison between Motion Analysis and Stereo
Stereo: two or more frames, separated by a baseline. Motion: N frames, separated in time.
– The baseline is usually larger in stereo than in motion, so motion disparities tend to be smaller
– Stereo images are taken at the same time, whereas motion disparities can be due to scene motion
– There can be more than one transformation between frames
© Octavia Camps

Why a Multitude of Formulations?
The camera can:
– be stationary
– execute simple translational motion
– undergo general motion with both translation and rotation
The object(s) can:
– be stationary
– execute simple 2D motion parallel to the image plane
– undergo general motion with both 3D translation and rotation

Why a Multitude of Formulations? (cont.)
– There may be multiple moving and stationary objects in a scene
– The camera motion may be known or unknown
– The shape of the object may be known or unknown
– The motion of the object may be known or unknown
– etc., etc. ...

Motion Field (MF)
The MF assigns a velocity vector to each pixel in the image. These velocities are INDUCED by the RELATIVE MOTION between the camera and the 3D scene. The MF can be thought of as the projection of the 3D velocities onto the image plane.
© Octavia Camps

MF Estimation
We will use the apparent motion of brightness patterns observed in an image sequence. This apparent motion is called OPTICAL FLOW (OF).
© Octavia Camps

MF ≠ OF
Consider a smooth, Lambertian, uniform sphere rotating around a diameter in front of a camera:
– MF ≠ 0, since the points on the sphere are moving
– OF = 0, since there are no moving patterns in the images
© Octavia Camps

MF ≠ OF (cont.)
Consider a still, smooth, specular, uniform sphere in front of a stationary camera, with a moving light source:
– MF = 0, since the points on the sphere are not moving
– OF ≠ 0, since there is a moving pattern in the images
© Octavia Camps

Approximation of the MF
Nevertheless, keeping in mind that MF ≠ OF, we will assume that the apparent brightness of moving objects remains constant, and hence we will estimate the OF instead (since the MF cannot really be observed!).
© Octavia Camps

Brightness Constancy Equation
Let P be a moving point in 3D:
– At time t, P has coordinates (X(t), Y(t), Z(t))
– Let p = (x(t), y(t)) be the coordinates of its image at time t
– Let I(x(t), y(t), t) be the brightness at p at time t
Brightness Constancy Assumption: as P moves over time, I(x(t), y(t), t) remains constant.
© Octavia Camps

Approaches
– Feature based: tracking significant image features across multiple frames; sparse motion vectors at a few feature points
– Flow based: matching intensity profiles across multiple frames; dense motion fields
– Correlation based: parametric model
– Spatial-temporal filtering: biology-motivated approach

Feature-Based Tracking
– Image features (patterns) with unique and invariant intensity profiles
– Detect such patterns (corners, edges, etc.) in sequences of images
– Track their motion over time
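As a toy illustration of the feature-based idea, the sketch below tracks a small intensity patch from one frame to the next by minimizing the sum of squared differences (SSD). The two frames and the patch are invented for the example; real trackers use detected corners and search windows rather than an exhaustive scan.

```python
# Toy feature tracker: exhaustive SSD match of a patch in the next frame.
def ssd(patch, frame, ox, oy):
    # Sum of squared differences between the patch and the frame region
    # whose top-left corner is at (ox, oy).
    return sum((patch[j][i] - frame[oy + j][ox + i]) ** 2
               for j in range(len(patch)) for i in range(len(patch[0])))

def track(patch, frame):
    # Return the (ox, oy) offset in `frame` with the smallest SSD.
    h, w = len(patch), len(patch[0])
    H, W = len(frame), len(frame[0])
    return min(((ox, oy) for oy in range(H - h + 1)
                         for ox in range(W - w + 1)),
               key=lambda p: ssd(patch, frame, p[0], p[1]))

frame1 = [[0, 0, 0, 0, 0],
          [0, 9, 8, 0, 0],
          [0, 7, 9, 0, 0],
          [0, 0, 0, 0, 0]]
frame2 = [[0, 0, 0, 0, 0],      # same feature, shifted to the right
          [0, 0, 0, 9, 8],
          [0, 0, 0, 7, 9],
          [0, 0, 0, 0, 0]]
patch = [row[1:3] for row in frame1[1:3]]   # the corner-like feature
print(track(patch, frame2))                  # feature found at (3, 1)
```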

Examples
(The original slides show feature-tracking results as images.)

Computing Optical Flow
Differentiating the brightness constancy I(x(t), y(t), t) = constant with respect to t (chain rule) gives the Optical Flow Constraint Equation:
I_x u + I_y v + I_t = 0, i.e. ∇I · (u, v) + I_t = 0
where (u, v) = (dx/dt, dy/dt) is the local motion vector.

Physical Interpretation of the Optical Flow Constraint Equation
– Q: what is ∇I = (I_x, I_y)? A: the image brightness gradient direction; known (spatial derivatives)
– Q: what is (u, v)? A: the local motion vector; unknown
– Q: what is I_t? A: the change of brightness at a location with respect to time; known (temporal derivative)
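A quick numerical sanity check of the constraint: for a brightness pattern that rigidly translates with a known velocity (u, v), the residual I_x u + I_y v + I_t should vanish. The pattern below is an arbitrary smooth function chosen only for illustration.

```python
# Numerical check of the optical flow constraint Ix*u + Iy*v + It = 0.
import math

u, v = 1.5, -0.7                      # true motion (pixels per unit time)

def I(x, y, t):
    # A smooth brightness pattern rigidly translating by (u, v) per unit time.
    return math.sin(0.3 * (x - u * t)) * math.cos(0.2 * (y - v * t))

h = 1e-4                              # step for central differences
x, y, t = 5.0, 3.0, 0.0
Ix = (I(x + h, y, t) - I(x - h, y, t)) / (2 * h)   # spatial derivatives
Iy = (I(x, y + h, t) - I(x, y - h, t)) / (2 * h)
It = (I(x, y, t + h) - I(x, y, t - h)) / (2 * h)   # temporal derivative

residual = Ix * u + Iy * v + It       # vanishes for the true (u, v)
print(abs(residual) < 1e-6)           # True
```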

Optical Flow Constraint
– No spatial change in brightness induces no temporal change in brightness: no discernible motion
– Motion perpendicular to the local gradient induces no temporal change in brightness: no discernible motion
– Motion in the direction of the local gradient induces a temporal change in brightness: discernible motion
Only the motion component in the direction of the local gradient induces a temporal change in brightness, and only that component is discernible.

The Aperture Problem
The Image Brightness Constancy Assumption only provides the OF component in the direction of the spatial image gradient.
© Octavia Camps

Difficulty
One equation with two unknowns: the aperture problem.
– Spatial derivatives use only a few adjacent pixels (limited aperture and visibility)
– Many combinations of (u, v) will satisfy the equation: they all lie on a constraint line in the (u, v) plane

– If the intensity gradient is zero: no constraints on (u, v); the flow must be interpolated from other places
– If the intensity gradient is nonzero but constant: one constraint on (u, v); only the component along the gradient is recoverable
– If the intensity gradient is nonzero and changing: multiple constraints on (u, v); the motion is recoverable
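When the gradient direction varies within a small window, the per-pixel constraints can be stacked and solved in a least-squares sense (the core idea behind the Lucas-Kanade method, which the slides do not name explicitly). A sketch with hand-picked gradient triples standing in for real image derivatives:

```python
# Least-squares flow from several constraints Ix*u + Iy*v = -It.
def solve_flow(constraints):
    """constraints: list of (Ix, Iy, It) triples from a small window."""
    a = b = c = p = q = 0.0
    for Ix, Iy, It in constraints:       # accumulate normal equations
        a += Ix * Ix; b += Ix * Iy; c += Iy * Iy
        p += -Ix * It; q += -Iy * It
    det = a * c - b * b                  # zero when all gradients align
    if abs(det) < 1e-12:
        raise ValueError("aperture problem: gradients do not constrain (u, v)")
    u = (c * p - b * q) / det            # invert the 2x2 system
    v = (a * q - b * p) / det
    return u, v

# Two constraints with different gradient directions pin down the motion
# (It is generated from a chosen true motion, for illustration only):
true_u, true_v = 2.0, 1.0
window = [(1.0, 0.0, -(1.0 * true_u + 0.0 * true_v)),
          (0.0, 1.0, -(0.0 * true_u + 1.0 * true_v))]
print(solve_flow(window))                # (2.0, 1.0)
```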

Temporal Coherency
Caveats:
– (u, v) must stay the same across several frames
– scenes must be highly textured
– otherwise, (u, v) at the same location may actually refer to different object points

May 2004Motion/Optic Flow32 Results

Spatial-Temporal Filtering: Motivation
– It has been known since Hubel and Wiesel ("Receptive fields of single neurones in the cat's striate cortex", 1959) that simple cells in the visual cortex act like bar or edge detectors
– They function like linear filters: their receptive field profiles represent a weighting function (the spatial impulse response of a linear system)
– However, their response is not temporally dependent

Temporal Filtering
Consider stacking the images of a sequence one next to another into a volume. The x-t (and y-t) slices have structure similar to the x-y slices, and oriented filters can be used to detect motion.


Approach
The response of a motion-detection cell must vary both temporally and spatially. A popular function for this is the Gabor function: a Gaussian-modulated sine curve.

Why the Gabor Function?
– The filter is separable, meaning that the filter shape in each dimension can be designed separately and then combined
– By Fourier transform theory, the sigma controls the spread in a dimension and the omega controls the center frequency
⇒ band-pass filters of different orientation and frequency selectivity
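A minimal sketch of the Gabor function and its separability; the parameter values below are arbitrary. A space-time Gabor is simply a product of 1-D Gabors, which is why each dimension's spread (sigma) and center frequency (omega) can be designed independently.

```python
# 1-D Gabor: a Gaussian-modulated sinusoid.
import math

def gabor_1d(x, sigma, omega, phase=0.0):
    # sigma controls the spread; omega controls the center frequency.
    return math.exp(-x * x / (2 * sigma ** 2)) * math.cos(omega * x + phase)

def gabor_space_time(x, t, sx, wx, st, wt):
    # Separability: the space-time filter is a product of 1-D Gabors,
    # one over space (sx, wx) and one over time (st, wt).
    return gabor_1d(x, sx, wx) * gabor_1d(t, st, wt)

print(gabor_space_time(0.0, 0.0, 2.0, 1.0, 2.0, 1.0))   # 1.0 at the origin
```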


Beyond OF: Structure from Motion
It is possible to recover the camera motion and scene structure from a video sequence. Information about the movement of brightness patterns at only a few points can be used to determine the motion of the camera.
– In general, seven points are sufficient to determine the motion uniquely.
In the case of pure translation, the optical flow vectors all pass through a single point when extended. This point, where the optical flow is zero, is called the focus of expansion.
– It is the image of the ray along which the camera moves.
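For the pure-translation case, the focus of expansion (FOE) can be estimated as the least-squares intersection point of the extended flow vectors. A sketch on synthetic radial flow; the FOE location and sample points are made up for the example.

```python
# Estimate the FOE as the point closest to all extended flow lines.
def estimate_foe(points, flows):
    # The line through (x, y) with direction (u, v) constrains the FOE
    # (X, Y) linearly: v*X - u*Y = v*x - u*y. Accumulate the 2x2
    # normal equations of the least-squares system and solve.
    a = b = c = p = q = 0.0
    for (x, y), (u, v) in zip(points, flows):
        r = v * x - u * y
        a += v * v; b += -v * u; c += u * u
        p += v * r; q += -u * r
    det = a * c - b * b
    X = (c * p - b * q) / det
    Y = (a * q - b * p) / det
    return X, Y

# Synthetic radial flow expanding from an FOE at (10, 20):
foe = (10.0, 20.0)
pts = [(12.0, 20.0), (10.0, 25.0), (14.0, 24.0)]
flw = [(x - foe[0], y - foe[1]) for x, y in pts]
print(estimate_foe(pts, flw))    # recovers (10.0, 20.0)
```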

MF & OF Summary
– Motion field
– Optical flow
– MF is not the same as OF
– Optical flow constraint equation
– Aperture problem

Next Class and Next Week
– Thursday: application to image databases: the MPEG-7 standard, and a presentation by Till Quack on a web image search engine
– FRIDAY discussion: review of optical flow. Note that this will be the last discussion session; Monday is a holiday, so *all* students are encouraged to attend the Friday meeting.
– Next Tuesday: project review and discussions
– Next Thursday: course review and conclusions