Multi-view geometry

Multi-view geometry problems
Structure: Given projections of the same 3D point in two or more images, compute the 3D coordinates of that point.
[Figure: cameras 1, 2, 3 with poses (R1, t1), (R2, t2), (R3, t3) observing an unknown 3D point. Slide credit: Noah Snavely]
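
As a concrete illustration of the structure problem, here is a minimal linear (DLT) triangulation sketch in Python/NumPy. The calibration matrix, camera poses, and 3D point below are made-up values for the example, not anything taken from the slides.

```python
import numpy as np

def triangulate_point(P1, P2, x1, x2):
    """Linear (DLT) triangulation: recover the 3D point X such that
    x1 ~ P1 X and x2 ~ P2 X, given pixel coordinates x1, x2."""
    A = np.vstack([
        x1[0] * P1[2] - P1[0],
        x1[1] * P1[2] - P1[1],
        x2[0] * P2[2] - P2[0],
        x2[1] * P2[2] - P2[1],
    ])
    _, _, Vt = np.linalg.svd(A)       # null vector of A minimizes the algebraic error
    X = Vt[-1]
    return X[:3] / X[3]               # dehomogenize

# Synthetic two-camera setup (assumed values).
K = np.diag([800.0, 800.0, 1.0])
P1 = K @ np.hstack([np.eye(3), np.zeros((3, 1))])                  # K [I | 0]
P2 = K @ np.hstack([np.eye(3), np.array([[-0.5], [0.0], [0.0]])])  # K [I | t], baseline along x
X_true = np.array([0.1, -0.2, 5.0, 1.0])
x1 = P1 @ X_true; x1 = x1[:2] / x1[2]
x2 = P2 @ X_true; x2 = x2[:2] / x2[2]
print(triangulate_point(P1, P2, x1, x2))   # ~ [0.1, -0.2, 5.0]
```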

Multi-view geometry problems
Stereo correspondence: Given a point in one of the images, where could its corresponding points be in the other images?
[Figure: the same three-camera setup. Slide credit: Noah Snavely]

Multi-view geometry problems
Motion: Given a set of corresponding points in two or more images, compute the camera parameters.
[Figure: cameras 1, 2, 3 with unknown poses (R1, t1), (R2, t2), (R3, t3). Slide credit: Noah Snavely]

Two-view geometry

Epipolar geometry
Baseline – the line connecting the two camera centers
Epipolar plane – a plane containing the baseline (a 1D family)
Epipoles = intersections of the baseline with the image planes = projections of the other camera center = vanishing points of the motion direction
[Figure: a 3D point X projecting to x and x' in the two images]

The Epipole
Photo by Frank Dellaert

Epipolar geometry
Baseline – the line connecting the two camera centers
Epipolar plane – a plane containing the baseline (a 1D family)
Epipoles = intersections of the baseline with the image planes = projections of the other camera center = vanishing points of the motion direction
Epipolar lines – intersections of the epipolar plane with the image planes (always come in corresponding pairs)
[Figure: a 3D point X projecting to x and x' in the two images]
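
The statement that each epipole is the projection of the other camera's center can be checked directly. A small sketch with an assumed calibration and relative pose (none of these values come from the slides):

```python
import numpy as np

# Assumed setup: camera 1 at the origin, camera 2 displaced by t (no rotation).
K = np.diag([700.0, 700.0, 1.0])
R = np.eye(3)
t = np.array([1.0, 0.0, 0.2])

P1 = K @ np.hstack([np.eye(3), np.zeros((3, 1))])   # K [I | 0]
P2 = K @ np.hstack([R, t.reshape(3, 1)])            # K [R | t]

# Camera centers in homogeneous world coordinates.
C1 = np.array([0.0, 0.0, 0.0, 1.0])
C2 = np.append(-R.T @ t, 1.0)

# Epipole = image of the *other* camera's center.
e = P1 @ C2          # epipole in image 1
e_prime = P2 @ C1    # epipole in image 2
print(e / e[2], e_prime / e_prime[2])
```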

Example: Converging cameras

Example: Motion parallel to image plane

Example: Motion perpendicular to image plane

Points move along lines radiating from the epipole: the “focus of expansion”.
For motion along the optical axis (perpendicular to the image plane), the epipole coincides with the principal point.

Example: Motion perpendicular to image plane

Epipolar constraint
If we observe a point x in one image, where can the corresponding point x' be in the other image?
[Figure: a candidate 3D point X projecting to x and x']

Epipolar constraint
Potential matches for x have to lie on the corresponding epipolar line l'.
Potential matches for x' have to lie on the corresponding epipolar line l.
[Figure: candidate 3D points X along the ray through x all project onto the epipolar line l' in the other image]

Epipolar constraint example

Epipolar constraint: Calibrated case
The intrinsic and extrinsic parameters of the cameras are known, and the world coordinate system is set to that of the first camera.
Then the projection matrices are given by K[I | 0] and K'[R | t].
We can multiply the projection matrices (and the image points) by the inverses of the calibration matrices to get normalized image coordinates:
x_hat = K^-1 x_pixel,  x_hat' = K'^-1 x'_pixel
so that the two cameras become [I | 0] and [R | t].
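
A tiny sketch of that normalization step, with an assumed calibration matrix:

```python
import numpy as np

# Assumed intrinsics: focal length 700 px, principal point (320, 240).
K = np.array([[700.0,   0.0, 320.0],
              [  0.0, 700.0, 240.0],
              [  0.0,   0.0,   1.0]])

x_pixel = np.array([350.0, 260.0, 1.0])    # homogeneous pixel coordinates
x_hat = np.linalg.inv(K) @ x_pixel         # normalized coordinates x_hat = K^-1 x
print(x_hat)                               # ~ [0.043, 0.029, 1.0]
```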

Epipolar constraint: Calibrated case
In the two camera coordinate systems the same point satisfies x' = Rx + t (the observed image points equal these vectors up to scale, with x = (x, y, 1)^T in normalized coordinates).
The vectors Rx, t, and x' are coplanar: they all lie in the epipolar plane.
[Figure: X, the two camera centers, and the epipolar plane spanned by Rx and t]

Epipolar constraint: Calibrated case
Recall: the vectors Rx, t, and x' are coplanar (x' = Rx + t), so the scalar triple product vanishes:
x' · [t × (Rx)] = 0

Epipolar constraint: Calibrated case
Rewriting the triple product with the cross-product matrix [t_x] gives the essential matrix constraint:
x'^T E x = 0,  with E = [t_x] R
Essential Matrix (Longuet-Higgins, 1981)
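
A sketch that builds E = [t_x] R for an assumed rotation and translation and checks the constraint on one synthetic correspondence:

```python
import numpy as np

def skew(t):
    """Cross-product matrix [t_x], so that skew(t) @ v == np.cross(t, v)."""
    return np.array([[   0.0, -t[2],  t[1]],
                     [  t[2],   0.0, -t[0]],
                     [ -t[1],  t[0],   0.0]])

# Assumed relative pose: small rotation about the y-axis, translation mostly along x.
theta = 0.1
R = np.array([[ np.cos(theta), 0.0, np.sin(theta)],
              [ 0.0,           1.0, 0.0          ],
              [-np.sin(theta), 0.0, np.cos(theta)]])
t = np.array([1.0, 0.0, 0.1])

E = skew(t) @ R                 # essential matrix

# Synthetic 3D point in camera-1 coordinates and its two normalized projections.
X = np.array([0.3, -0.2, 4.0])
x = X / X[2]
Xp = R @ X + t
xp = Xp / Xp[2]

print(xp @ E @ x)               # ~ 0: the epipolar constraint x'^T E x = 0
```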

Epipolar constraint: Calibrated case
E x is the epipolar line associated with x (l' = E x).
Recall: a line is given by ax + by + c = 0, or equivalently l^T x = 0 with l = (a, b, c)^T and x = (x, y, 1)^T.

Epipolar constraint: Calibrated case
Properties of the essential matrix:
E x is the epipolar line associated with x (l' = E x)
E^T x' is the epipolar line associated with x' (l = E^T x')
E e = 0 and E^T e' = 0 (the epipoles are the null vectors of E and E^T)
E is singular (rank two)
E has five degrees of freedom (three for rotation, two for the translation direction)
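
These properties are easy to verify numerically. A short check, again with an arbitrary assumed pose and the same cross-product helper:

```python
import numpy as np

def skew(t):
    return np.array([[   0.0, -t[2],  t[1]],
                     [  t[2],   0.0, -t[0]],
                     [ -t[1],  t[0],   0.0]])

R = np.eye(3)                      # assumed pose: pure translation for simplicity
t = np.array([1.0, 0.0, 0.1])
E = skew(t) @ R

print(np.linalg.svd(E, compute_uv=False))   # singular values (s, s, 0) -> rank two

e = -R.T @ t        # epipole in image 1: direction of camera 2's center
e_prime = t         # epipole in image 2: direction of camera 1's center
print(E @ e, E.T @ e_prime)                 # both ~ 0
```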

Epipolar constraint: Uncalibrated case
The calibration matrices K and K' of the two cameras are unknown.
We can write the epipolar constraint in terms of the unknown normalized coordinates:
x_hat'^T E x_hat = 0,  where x_hat = K^-1 x and x_hat' = K'^-1 x'

Epipolar constraint: Uncalibrated case
Substituting the normalized coordinates turns this into a constraint on the pixel coordinates themselves:
x'^T F x = 0,  with F = K'^-T E K^-1
Fundamental Matrix (Faugeras and Luong, 1992)

Epipolar constraint: Uncalibrated case
Properties of the fundamental matrix:
F x is the epipolar line associated with x (l' = F x)
F^T x' is the epipolar line associated with x' (l = F^T x')
F e = 0 and F^T e' = 0
F is singular (rank two)
F has seven degrees of freedom (nine entries, minus an overall scale, minus the rank-two constraint det F = 0)
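
A sketch that checks these properties on a fundamental matrix built from assumed intrinsics and pose (so that F = K'^-T E K^-1 is known exactly):

```python
import numpy as np

def skew(t):
    return np.array([[   0.0, -t[2],  t[1]],
                     [  t[2],   0.0, -t[0]],
                     [ -t[1],  t[0],   0.0]])

# Assumed synthetic setup.
K = Kp = np.diag([700.0, 700.0, 1.0])
R, t = np.eye(3), np.array([1.0, 0.0, 0.1])
E = skew(t) @ R
F = np.linalg.inv(Kp).T @ E @ np.linalg.inv(K)

# One synthetic correspondence in pixel coordinates.
X = np.array([0.3, -0.2, 4.0])
x = K @ (X / X[2])
Xp = R @ X + t
xp = Kp @ (Xp / Xp[2])

l_prime = F @ x       # epipolar line of x in image 2
l = F.T @ xp          # epipolar line of x' in image 1
print(xp @ l_prime, x @ l)        # both ~ 0: each point lies on its epipolar line

# Epipoles: null vectors of F and F^T (last right/left singular vectors).
U, S, Vt = np.linalg.svd(F)
print(Vt[-1], U[:, -1])           # e and e', in homogeneous coordinates up to scale
```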

Estimating the fundamental matrix

The eight-point algorithm
1. Solve a homogeneous linear system using eight or more matches: each correspondence gives one linear equation x'^T F x = 0 in the nine entries of F.
2. Enforce the rank-2 constraint (take the SVD of F and zero out the smallest singular value).
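
A minimal NumPy sketch of those two steps; the function name and the n x 2 input format are my own conventions, not from the slides:

```python
import numpy as np

def eight_point(x1, x2):
    """Estimate F from n >= 8 correspondences (x1, x2: n x 2 arrays of pixel coords).
    Each match contributes one linear equation x2_h^T F x1_h = 0 in the 9 entries of F."""
    n = x1.shape[0]
    A = np.zeros((n, 9))
    for i in range(n):
        u, v = x1[i]
        up, vp = x2[i]
        A[i] = [up * u, up * v, up, vp * u, vp * v, vp, u, v, 1.0]
    # Step 1: solve the homogeneous system A f = 0 (right singular vector
    # associated with the smallest singular value).
    _, _, Vt = np.linalg.svd(A)
    F = Vt[-1].reshape(3, 3)
    # Step 2: enforce rank 2 by zeroing the smallest singular value of F.
    U, S, Vt = np.linalg.svd(F)
    S[2] = 0.0
    return U @ np.diag(S) @ Vt
```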

Problem with eight-point algorithm

Poor numerical conditioning: with raw pixel coordinates, the entries of the linear system range over several orders of magnitude.
This can be fixed by rescaling the data.

The normalized eight-point algorithm (Hartley, 1995)
1. Center the image data at the origin and scale it so the mean squared distance between the origin and the data points is 2 pixels.
2. Use the eight-point algorithm to compute F from the normalized points.
3. Enforce the rank-2 constraint (for example, take the SVD of F and zero out the smallest singular value).
4. Transform the fundamental matrix back to the original units: if T and T' are the normalizing transformations in the two images, then the fundamental matrix in the original coordinates is T'^T F T.
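
A sketch of the normalization and denormalization steps, reusing the eight_point function from the sketch above. The scaling follows the convention stated on the slide (mean squared distance of 2 from the origin):

```python
import numpy as np

def normalize_points(x):
    """Translate points (n x 2) to zero mean and scale them so that the mean
    squared distance from the origin is 2. Returns the transformed points and T."""
    mean = x.mean(axis=0)
    mean_sq_dist = ((x - mean) ** 2).sum(axis=1).mean()
    s = np.sqrt(2.0 / mean_sq_dist)
    T = np.array([[s, 0.0, -s * mean[0]],
                  [0.0, s, -s * mean[1]],
                  [0.0, 0.0, 1.0]])
    x_h = np.hstack([x, np.ones((x.shape[0], 1))])
    return (x_h @ T.T)[:, :2], T

def normalized_eight_point(x1, x2):
    x1n, T1 = normalize_points(x1)
    x2n, T2 = normalize_points(x2)
    F_norm = eight_point(x1n, x2n)        # eight-point algorithm on normalized points
    F = T2.T @ F_norm @ T1                # back to original units: F = T'^T F_norm T
    return F / F[2, 2]
```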

Nonlinear estimation
Linear estimation minimizes the sum of squared algebraic distances between the points x'_i and the epipolar lines F x_i (or the points x_i and the epipolar lines F^T x'_i):
sum_i (x'_i^T F x_i)^2
Nonlinear approach: minimize the sum of squared geometric (point-to-epipolar-line) distances
sum_i [ d(x'_i, F x_i)^2 + d(x_i, F^T x'_i)^2 ]
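
One way to set this up is to compute the vector of point-to-epipolar-line distances and hand it to a generic nonlinear least-squares routine (for instance scipy.optimize.least_squares) over a suitable rank-2 parameterization of F; only the residual computation is sketched here, with my own function name:

```python
import numpy as np

def epipolar_distances(F, x1, x2):
    """Symmetric geometric residuals: distance from each x2_i to the line F x1_i
    and from each x1_i to the line F^T x2_i (x1, x2: n x 2 pixel coordinates)."""
    x1_h = np.hstack([x1, np.ones((len(x1), 1))])
    x2_h = np.hstack([x2, np.ones((len(x2), 1))])
    l2 = x1_h @ F.T   # rows are the epipolar lines F x1_i in image 2
    l1 = x2_h @ F     # rows are the epipolar lines F^T x2_i in image 1
    d2 = np.abs(np.sum(l2 * x2_h, axis=1)) / np.linalg.norm(l2[:, :2], axis=1)
    d1 = np.abs(np.sum(l1 * x1_h, axis=1)) / np.linalg.norm(l1[:, :2], axis=1)
    return np.concatenate([d1, d2])       # squared and summed, this is the cost above
```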

Comparison of estimation algorithms

                 8-point     Normalized 8-point     Nonlinear least squares
Av. Dist. 1      —           0.92 pixel             0.86 pixel
Av. Dist. 2      —           0.85 pixel             0.80 pixel

The Fundamental Matrix Song

From epipolar geometry to camera calibration
Estimating the fundamental matrix is known as “weak calibration”.
If we know the calibration matrices of the two cameras, we can compute the essential matrix: E = K'^T F K.
The essential matrix gives us the relative rotation and translation between the cameras, i.e. their extrinsic parameters (the translation only up to scale).
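
A sketch of those two steps under the slide's notation: form E = K'^T F K, then factor E into the four candidate (R, t) pairs via its SVD; the physically valid pair is the one that puts triangulated points in front of both cameras (cheirality check, not shown). The function names are my own; in practice a library routine such as OpenCV's recoverPose does the decomposition and the cheirality test together.

```python
import numpy as np

def essential_from_fundamental(F, K, K_prime):
    """E = K'^T F K (the slide's formula, with K' the second camera's intrinsics)."""
    return K_prime.T @ F @ K

def decompose_essential(E):
    """Return the four (R, t) candidates for an essential matrix E = [t_x] R.
    The translation is recovered only up to scale and sign."""
    U, _, Vt = np.linalg.svd(E)
    if np.linalg.det(U) < 0:      # make sure we end up with proper rotations
        U = -U
    if np.linalg.det(Vt) < 0:
        Vt = -Vt
    W = np.array([[0.0, -1.0, 0.0],
                  [1.0,  0.0, 0.0],
                  [0.0,  0.0, 1.0]])
    R1 = U @ W @ Vt
    R2 = U @ W.T @ Vt
    t = U[:, 2]
    return [(R1, t), (R1, -t), (R2, t), (R2, -t)]
```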