Camera calibration and epipolar geometry

Camera calibration and epipolar geometry Odilon Redon, Cyclops, 1914

Review: Alignment. What is the geometric relationship between pictures taken by cameras that share the same center? How many points do we need to estimate a homography? How do we estimate a homography?
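As a reminder of the answer from the alignment lectures, here is a minimal numpy sketch of homography estimation by the direct linear transform; the function name and input layout are illustrative assumptions, not material from these slides. Four correspondences give a minimal solution (8 degrees of freedom); more points are handled by the same homogeneous least-squares step.

```python
import numpy as np

def estimate_homography(src, dst):
    """DLT: estimate H (3x3, up to scale) from n >= 4 point pairs.
    src, dst: (n, 2) arrays of corresponding image points."""
    A = []
    for (x, y), (u, v) in zip(src, dst):
        # Each correspondence contributes two rows of the constraint matrix.
        A.append([-x, -y, -1, 0, 0, 0, u * x, u * y, u])
        A.append([0, 0, 0, -x, -y, -1, v * x, v * y, v])
    # Homogeneous least squares: h is the singular vector of the smallest singular value.
    _, _, Vt = np.linalg.svd(np.asarray(A))
    H = Vt[-1].reshape(3, 3)
    return H / H[2, 2]
```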

Geometric vision Goal: Recovery of 3D structure What cues in the image allow us to do this?

Visual cues: Shading. (Image: Merle Norman Cosmetics, Los Angeles. Slide credit: S. Seitz)

Visual cues: Shading, Texture. (Image: The Visual Cliff, by William Vandivert, 1960. Slide credit: S. Seitz)

Visual cues: Shading, Texture, Focus. (Image: From The Art of Photography, Canon. Slide credit: S. Seitz)

Visual cues: Shading, Texture, Focus, Perspective. (Slide credit: S. Seitz)

Visual cues: Shading, Texture, Focus, Perspective, Motion. (Slide credit: S. Seitz)

Our goal: Recovery of 3D structure. We will focus on perspective and motion. We need multi-view geometry because recovery of structure from one image is inherently ambiguous.

Recall: Pinhole camera model

Pinhole camera model
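The projection equation shown on this slide is not reproduced in the transcript; below is a small numpy sketch of the model it describes, x ≅ K [R | t] X, with placeholder intrinsics, pose, and point values.

```python
import numpy as np

# Placeholder intrinsics K and pose [R | t] (illustrative values only).
K = np.array([[800., 0., 320.],
              [0., 800., 240.],
              [0., 0., 1.]])
R, t = np.eye(3), np.array([[0.], [0.], [1.]])
P = K @ np.hstack([R, t])             # 3x4 camera projection matrix

X = np.array([0.1, -0.2, 5.0, 1.0])   # homogeneous 3D point
x = P @ X
x = x[:2] / x[2]                      # perspective divide -> pixel coordinates
print(x)
```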

Camera coordinate system. Principal axis: line from the camera center perpendicular to the image plane. Normalized (camera) coordinate system: camera center is at the origin and the principal axis is the z-axis. Principal point (p): point where the principal axis intersects the image plane (origin of the normalized coordinate system).

Principal point offset. Camera coordinate system: origin is at the principal point. Image coordinate system: origin is in the corner.

Principal point offset

Principal point offset calibration matrix
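The calibration matrix on this slide is missing from the transcript; in the standard notation (an assumption here, following Hartley & Zisserman) it reads:

```latex
K = \begin{bmatrix} f & 0 & p_x \\ 0 & f & p_y \\ 0 & 0 & 1 \end{bmatrix},
\qquad
x = K \,[\, I \mid 0 \,]\, X_{\text{cam}}
```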

Pixel coordinates. Pixel size: m_x pixels per meter in the horizontal direction, m_y pixels per meter in the vertical direction.
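Folding the pixel scale factors into the calibration matrix gives, in the same assumed notation, with (x_0, y_0) = (m_x p_x, m_y p_y) the principal point in pixels:

```latex
\alpha_x = m_x f, \quad \alpha_y = m_y f, \qquad
K = \begin{bmatrix} \alpha_x & 0 & x_0 \\ 0 & \alpha_y & y_0 \\ 0 & 0 & 1 \end{bmatrix}
```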

Camera rotation and translation. In general, the camera coordinate frame is related to the world coordinate frame by a rotation and a translation: Xcam = R(X – C), where Xcam are the (non-homogeneous) coordinates of the point in the camera frame, X the coordinates of the point in the world frame, and C the coordinates of the camera center in the world frame.

Camera rotation and translation. In non-homogeneous coordinates, x = K R (X – C), so the camera matrix can be written as P = K [R | –RC] (i.e. t = –RC). Note: C is the null space of the camera projection matrix (PC = 0).
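A quick numpy check of the null-space note, with placeholder intrinsics and pose (not values from the slides):

```python
import numpy as np

R = np.eye(3)                          # placeholder rotation
C_tilde = np.array([1.0, 2.0, 3.0])    # camera center in world coordinates
K = np.diag([800., 800., 1.])          # placeholder intrinsics

# P = K [R | -R C]  (world-to-image projection)
P = K @ np.hstack([R, (-R @ C_tilde).reshape(3, 1)])

C = np.append(C_tilde, 1.0)            # homogeneous camera center
print(P @ C)                           # ~ [0, 0, 0]: PC = 0
```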

Camera parameters. Intrinsic parameters: principal point coordinates, focal length, pixel magnification factors, skew (non-rectangular pixels), radial distortion. Extrinsic parameters: rotation and translation relative to the world coordinate system.

Camera calibration. Given n points with known 3D coordinates Xi and known image projections xi, estimate the camera parameters.

Camera calibration. Writing xi × (P Xi) = 0 gives two linearly independent equations per correspondence in the entries of P.

Camera calibration. P has 11 degrees of freedom (12 parameters, but the scale is arbitrary). One 2D/3D correspondence gives us two linearly independent equations, so 6 correspondences are needed for a minimal solution; with more, solve by homogeneous least squares.
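A minimal sketch of this homogeneous least-squares step, assuming the correspondences are given as plain arrays (the function name and input layout are illustrative):

```python
import numpy as np

def calibrate_dlt(X, x):
    """Estimate the 3x4 camera matrix P from n >= 6 correspondences.
    X: (n, 3) known 3D points, x: (n, 2) their image projections."""
    A = []
    for (Xw, Yw, Zw), (u, v) in zip(X, x):
        Xh = [Xw, Yw, Zw, 1.0]
        # Two independent rows of x_i x (P X_i) = 0 per correspondence.
        A.append([0, 0, 0, 0] + [-c for c in Xh] + [v * c for c in Xh])
        A.append(Xh + [0, 0, 0, 0] + [-u * c for c in Xh])
    _, _, Vt = np.linalg.svd(np.asarray(A))
    return Vt[-1].reshape(3, 4)        # P, defined up to scale
```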

Camera calibration. Note: for coplanar points that satisfy Π^T X = 0, we will get degenerate solutions (Π, 0, 0), (0, Π, 0), or (0, 0, Π).

Camera calibration. Once we've recovered the numerical form of the camera matrix, we still have to figure out the intrinsic and extrinsic parameters. This is a matrix decomposition problem, not an estimation problem (see F&P sec. 3.2, 3.3).
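One common way to carry out that decomposition is an RQ factorization of the left 3×3 block of P; a sketch using scipy (sign handling is simplified, and the function name is an assumption):

```python
import numpy as np
from scipy.linalg import rq

def decompose_P(P):
    """Split P ~ K [R | t] into intrinsics K, rotation R, translation t."""
    M = P[:, :3]
    K, R = rq(M)                        # M = K R, with K upper triangular
    # Make the diagonal of K positive (absorb signs into R).
    S = np.diag(np.sign(np.diag(K)))
    K, R = K @ S, S @ R
    t = np.linalg.solve(K, P[:, 3])     # since P = K [R | t]
    return K / K[2, 2], R, t
```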

Two-view geometry. Scene geometry (structure): given corresponding points in two or more images, where is the pre-image of these points in 3D? Correspondence (stereo matching): given a point in just one image, how does it constrain the position of the corresponding point x' in another image? Camera geometry (motion): given a set of corresponding points in two images, what are the cameras for the two views?

Triangulation. Given projections of a 3D point in two or more images (with known camera matrices), find the coordinates of the point.

Triangulation. We want to intersect the two visual rays corresponding to x1 and x2, but because of noise and numerical errors, they don't meet exactly.

Triangulation: Geometric approach. Find the shortest segment connecting the two viewing rays and let X be the midpoint of that segment.
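A sketch of the geometric approach, assuming the camera centers O1, O2 and ray directions d1, d2 have already been back-projected from x1 and x2 (names and inputs are assumptions):

```python
import numpy as np

def midpoint_triangulation(O1, d1, O2, d2):
    """O1, O2: camera centers; d1, d2: ray directions (3-vectors).
    Returns the midpoint of the shortest segment between the two rays."""
    # Solve for ray parameters s, t minimizing |(O1 + s d1) - (O2 + t d2)|^2.
    A = np.stack([d1, -d2], axis=1)           # 3x2 system
    b = O2 - O1
    (s, t), *_ = np.linalg.lstsq(A, b, rcond=None)
    return 0.5 * ((O1 + s * d1) + (O2 + t * d2))
```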

Triangulation: Linear approach Cross product as matrix multiplication:
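The matrix form the slide refers to (omitted in the transcript) is the standard skew-symmetric matrix:

```latex
a \times b = [a]_{\times}\, b,
\qquad
[a]_{\times} =
\begin{bmatrix}
0 & -a_3 & a_2 \\
a_3 & 0 & -a_1 \\
-a_2 & a_1 & 0
\end{bmatrix}
```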

Triangulation: Linear approach. From x1 × (P1X) = 0 and x2 × (P2X) = 0: two independent equations each, in terms of the three unknown entries of X.
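A linear-approach sketch that stacks these equations for both views and solves in homogeneous coordinates by SVD (input conventions are assumptions):

```python
import numpy as np

def triangulate_linear(P1, P2, x1, x2):
    """x1, x2: (u, v) image points; P1, P2: 3x4 camera matrices.
    Returns the 3D point X (non-homogeneous) minimizing algebraic error."""
    def rows(P, u, v):
        # Two independent rows of the constraint x x (P X) = 0.
        return [u * P[2] - P[0],
                v * P[2] - P[1]]
    A = np.array(rows(P1, *x1) + rows(P2, *x2))
    _, _, Vt = np.linalg.svd(A)
    X = Vt[-1]
    return X[:3] / X[3]
```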

Triangulation: Nonlinear approach. Find X that minimizes the reprojection error d(x1, P1X)² + d(x2, P2X)².

Epipolar geometry. Baseline – line connecting the two camera centers. Epipolar plane – plane containing the baseline (1D family). Epipoles – intersections of the baseline with the image planes = projections of the other camera center = vanishing points of the camera motion direction. Epipolar lines – intersections of the epipolar plane with the image planes (always come in corresponding pairs).

Example: Converging cameras

Example: Motion parallel to image plane

Example: Forward motion Epipole has same coordinates in both images. Points move along lines radiating from e: “Focus of expansion”

Epipolar constraint X x x’ If we observe a point x in one image, where can the corresponding point x’ be in the other image?

Epipolar constraint. Potential matches for x have to lie on the corresponding epipolar line l'. Potential matches for x' have to lie on the corresponding epipolar line l.

Epipolar constraint example Source: K. Grauman

Epipolar constraint: Calibrated case X x x’ Assume that the intrinsic and extrinsic parameters of the cameras are known We can multiply the projection matrix of each camera (and the image points) by the inverse of the calibration matrix to get normalized image coordinates We can also set the global coordinate system to the coordinate system of the first camera

Epipolar constraint: Calibrated case X x x’ t R Camera matrix: [I|0] X = (u, v, w, 1)T x = (u, v, w)T Camera matrix: [RT | –RTt] Vector x’ in second coord. system has coordinates Rx’ in the first one The vectors x, t, and Rx’ are coplanar

Epipolar constraint: Calibrated case X x x’ Essential Matrix (Longuet-Higgins, 1981) The vectors x, t, and Rx’ are coplanar

Epipolar constraint: Calibrated case X x x’ E x’ is the epipolar line associated with x’ (l = E x’) ETx is the epipolar line associated with x (l’ = ETx) E e’ = 0 and ETe = 0 E is singular (rank two) E has five degrees of freedom (up to scale)

Epipolar constraint: Uncalibrated case X x x’ The calibration matrices K and K’ of the two cameras are unknown We can write the epipolar constraint in terms of unknown normalized coordinates:

Epipolar constraint: Uncalibrated case X x x’ Fundamental Matrix (Faugeras and Luong, 1992)

Epipolar constraint: Uncalibrated case X x x’ F x’ is the epipolar line associated with x’ (l = F x’) FTx is the epipolar line associated with x (l’ = FTx) F e’ = 0 and FTe = 0 F is singular (rank two) F has seven degrees of freedom

The eight-point algorithm. With x = (u, v, 1)^T and x' = (u', v', 1)^T, minimize Σ_i (x_i^T F x'_i)² under the constraint ||F||² = 1.

The eight-point algorithm. Meaning of the error sum: Euclidean distances between points x_i and epipolar lines F x'_i (or points x'_i and epipolar lines F^T x_i), multiplied by a scale factor. Nonlinear approach: minimize Σ_i [ d(x_i, F x'_i)² + d(x'_i, F^T x_i)² ].

Problem with the eight-point algorithm: poor numerical conditioning. Can be fixed by rescaling the data.

The normalized eight-point algorithm (Hartley, 1995). Center the image data at the origin, and scale it so that the mean squared distance between the origin and the data points is 2. Use the eight-point algorithm to compute F from the normalized points. Enforce the rank-2 constraint (for example, take the SVD of F and throw out the smallest singular value). Transform the fundamental matrix back to the original units: if T and T' are the normalizing transformations in the two images, then the fundamental matrix in the original coordinates is T^T F T'.
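A compact numpy sketch of these four steps (the function names and the (n, 2) input layout are assumptions; the convention x^T F x' = 0 follows the earlier slides):

```python
import numpy as np

def normalize(pts):
    """Translate to zero mean and scale so the mean squared distance to the origin is 2."""
    c = pts.mean(axis=0)
    s = np.sqrt(2.0 / np.mean(np.sum((pts - c) ** 2, axis=1)))
    T = np.array([[s, 0, -s * c[0]], [0, s, -s * c[1]], [0, 0, 1]])
    return (pts - c) * s, T

def normalized_eight_point(x, xp):
    """x, xp: (n, 2) arrays of matched points in images 1 and 2, n >= 8."""
    xn, T = normalize(x)
    xpn, Tp = normalize(xp)
    # Each match gives one row of the linear system sum_ij F_ij x_i x'_j = 0.
    A = np.array([np.outer([u, v, 1.0], [up, vp, 1.0]).ravel()
                  for (u, v), (up, vp) in zip(xn, xpn)])
    _, _, Vt = np.linalg.svd(A)
    F = Vt[-1].reshape(3, 3)
    # Enforce the rank-2 constraint by zeroing the smallest singular value.
    U, S, Vt2 = np.linalg.svd(F)
    F = U @ np.diag([S[0], S[1], 0.0]) @ Vt2
    # Undo the normalization: F in original coordinates is T^T F T'.
    return T.T @ F @ Tp
```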

Comparison of estimation algorithms:

                        8-point     Normalized 8-point    Nonlinear least squares
Av. Dist. 1 (pixels)    2.33        0.92                  0.86
Av. Dist. 2 (pixels)    2.18        0.85                  0.80

Epipolar transfer. Assume the epipolar geometry is known. Given projections x1 and x2 of the same point in two images, how can we compute the projection x3 of that point in a third image? Compute the epipolar lines l31 = F13^T x1 and l32 = F23^T x2; x3 lies at their intersection. When does epipolar transfer fail?
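A sketch of the transfer step in homogeneous coordinates, assuming F13, F23 and the two image points are already available (names are illustrative):

```python
import numpy as np

def epipolar_transfer(F13, F23, x1, x2):
    """x1, x2: homogeneous points in images 1 and 2.
    Returns x3, the intersection of the two transferred epipolar lines."""
    l31 = F13.T @ x1           # epipolar line of x1 in image 3
    l32 = F23.T @ x2           # epipolar line of x2 in image 3
    x3 = np.cross(l31, l32)    # two lines intersect at their cross product
    return x3 / x3[2]
    # Transfer fails when l31 and l32 coincide, i.e. when the epipolar lines
    # are the same (for example, when the point lies in the plane of the
    # three camera centers).
```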

Next time: Stereo