CS 558 Computer Vision
Lecture IX: Dimensionality Reduction
CS 558 Computer Vision
Supplementary Lecture: Single View and Epipolar Geometry
Slides adapted from S. Lazebnik
Outline
Single view geometry
Epipolar geometry
Single-view geometry
Odilon Redon, Cyclops, 1914
Our goal: recovery of 3D structure
Recovery of structure from a single image is inherently ambiguous: an image point x could be the projection of any 3D point X along the same viewing ray.
Ames room: http://en.wikipedia.org/wiki/Ames_room
Our goal: recovery of 3D structure
Because a single view is ambiguous, we will need multi-view geometry.
Recall: pinhole camera model
Principal axis: the line from the camera center perpendicular to the image plane.
Normalized (camera) coordinate system: the camera center is at the origin and the principal axis is the z-axis.
Principal point
Principal point p: the point where the principal axis intersects the image plane.
Normalized coordinate system: the origin is at the principal point p = (p_x, p_y).
Image coordinate system: the origin is in the corner of the image.
How do we go from the normalized coordinate system to the image coordinate system? By adding the principal point offset.
Principal point offset
With principal point (p_x, p_y), projection becomes (X, Y, Z) -> (f X/Z + p_x, f Y/Z + p_y).
Principal point offset
Collecting the focal length and the principal point (p_x, p_y) into the calibration matrix K = [[f, 0, p_x], [0, f, p_y], [0, 0, 1]] gives x = K [I | 0] X_cam.
Pixel coordinates
m_x pixels per meter in the horizontal direction, m_y pixels per meter in the vertical direction; the pixel size is (1/m_x) by (1/m_y) meters. Converting K from meters to pixels multiplies it by diag(m_x, m_y, 1), giving pixel focal lengths alpha_x = f m_x and alpha_y = f m_y.
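As a small sketch of how these intrinsics combine (all numeric values below are made up for illustration):

```python
import numpy as np

# Hypothetical intrinsics: focal length f in meters, pixel densities
# m_x, m_y in pixels/meter, principal point (p_x, p_y) in pixels.
f = 0.05
m_x, m_y = 8000.0, 8000.0
p_x, p_y = 320.0, 240.0

# Calibration matrix in pixel units: alpha_x = f*m_x, alpha_y = f*m_y.
K = np.array([[f * m_x, 0.0,     p_x],
              [0.0,     f * m_y, p_y],
              [0.0,     0.0,     1.0]])

# A point on the principal axis projects to the principal point.
X_cam = np.array([0.0, 0.0, 2.0])   # 2 m in front of the camera
x_h = K @ X_cam                     # homogeneous image point
x = x_h[:2] / x_h[2]                # -> (320, 240), the principal point
```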
Camera rotation and translation
In general, the camera coordinate frame is related to the world coordinate frame by a rotation and a translation: X_cam = R (X_world - C), where X_cam are the coordinates of the point in the camera frame, X_world the coordinates of the point in the world frame, and C the (nonhomogeneous) coordinates of the camera center in the world frame.
Camera rotation and translation
In non-homogeneous coordinates, x = K R (X_world - C), so the projection matrix is P = K [R | -RC] = K [R | t] with t = -RC.
Note: the homogeneous camera center C is the null space of the camera projection matrix (PC = 0).
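A quick numeric sketch of P = K [R | -RC] and the null-space property; the intrinsics and pose below are invented example values:

```python
import numpy as np

# Made-up calibration matrix and camera pose.
K = np.array([[800.0, 0.0, 320.0],
              [0.0, 800.0, 240.0],
              [0.0,   0.0,   1.0]])
theta = np.deg2rad(30)
R = np.array([[np.cos(theta), 0.0, np.sin(theta)],
              [0.0,           1.0, 0.0],
              [-np.sin(theta), 0.0, np.cos(theta)]])  # rotation about y-axis
C = np.array([1.0, 0.5, -2.0])                        # camera center, world frame

t = -R @ C
P = K @ np.hstack([R, t[:, None]])                    # 3x4 projection matrix

# C is the null space of P: the homogeneous camera center projects to zero.
C_h = np.append(C, 1.0)
residual = P @ C_h                                    # ~ [0, 0, 0]
```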
Camera parameters
Intrinsic parameters: principal point coordinates, focal length, pixel magnification factors, skew (non-rectangular pixels), radial distortion.
Extrinsic parameters: rotation and translation relative to the world coordinate system.
Camera calibration (source: D. Hoiem)
Given n points with known 3D coordinates X_i and known image projections x_i, estimate the camera parameters.
Camera calibration: linear method
Each correspondence x_i = P X_i yields two linearly independent equations, obtained from the cross product x_i × (P X_i) = 0.
Camera calibration: linear method
P has 11 degrees of freedom (12 parameters, but the scale is arbitrary).
One 2D/3D correspondence gives us two linearly independent equations, so 6 correspondences are needed for a minimal solution.
With more correspondences, solve by homogeneous least squares.
Camera calibration: linear method
Note: for coplanar points that satisfy Π^T X = 0, we will get degenerate solutions (Π, 0, 0), (0, Π, 0), or (0, 0, Π).
Camera calibration: linear method
Advantages: easy to formulate and solve.
Disadvantages: doesn't directly tell you the camera parameters; doesn't model radial distortion; can't impose constraints such as known focal length or orthogonality.
For these reasons, non-linear methods are preferred: define the error as the difference between projected points and measured points, and minimize it using Newton's method or another non-linear optimizer. Source: D. Hoiem
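A minimal sketch of the linear (DLT) method described above, assuming noiseless, non-coplanar correspondences and no data normalization; the function name is my own:

```python
import numpy as np

def calibrate_dlt(X, x):
    """Estimate the 3x4 projection matrix P from n >= 6 world points X (n, 3)
    and image points x (n, 2) by homogeneous least squares (DLT sketch)."""
    rows = []
    for Xw, (u, v) in zip(X, x):
        Xh = np.append(Xw, 1.0)
        # Two independent rows per point, from x_i cross (P X_i) = 0.
        rows.append(np.concatenate([np.zeros(4), -Xh, v * Xh]))
        rows.append(np.concatenate([Xh, np.zeros(4), -u * Xh]))
    A = np.array(rows)
    _, _, Vt = np.linalg.svd(A)
    return Vt[-1].reshape(3, 4)   # right singular vector of smallest sigma
```

With exact data, the recovered P matches the true one up to an arbitrary scale (and sign), reflecting the 11 degrees of freedom.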
Multi-view geometry problems
Structure: given projections of the same 3D point in two or more images, compute the 3D coordinates of that point. (Figure: three cameras with poses R_1, t_1 through R_3, t_3 observing an unknown point. Slide credit: Noah Snavely)
Multi-view geometry problems
Stereo correspondence: given a point in one of the images, where could its corresponding points be in the other images? (Slide credit: Noah Snavely)
Multi-view geometry problems
Motion: given a set of corresponding points in two or more images, compute the camera parameters R_i, t_i. (Slide credit: Noah Snavely)
Triangulation
Given projections x_1, x_2 of a 3D point in two or more images (with known camera matrices and centers O_1, O_2), find the coordinates X of the point.
Triangulation
We want to intersect the two visual rays R_1, R_2 corresponding to x_1 and x_2, but because of noise and numerical errors they don't meet exactly.
Triangulation: geometric approach
Find the shortest segment connecting the two viewing rays and let X be the midpoint of that segment.
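One way to realize the midpoint method (a sketch; the ray directions need not be unit length, and the rays are assumed non-parallel):

```python
import numpy as np

def triangulate_midpoint(O1, d1, O2, d2):
    """Midpoint of the shortest segment between rays O1 + s*d1 and O2 + t*d2.
    The connecting segment must be orthogonal to both ray directions."""
    A = np.array([[d1 @ d1, -(d1 @ d2)],
                  [d1 @ d2, -(d2 @ d2)]])
    b = np.array([d1 @ (O2 - O1), d2 @ (O2 - O1)])
    s, t = np.linalg.solve(A, b)          # closest points' ray parameters
    return 0.5 * ((O1 + s * d1) + (O2 + t * d2))
```

When the rays actually intersect (no noise), the midpoint coincides with the intersection point.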
Triangulation: linear approach
The cross product can be written as matrix multiplication: a × b = [a]_x b, where [a]_x = [[0, -a_3, a_2], [a_3, 0, -a_1], [-a_2, a_1, 0]].
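In code, the skew-symmetric matrix [a]_x is a small helper (a sketch; the function name is my own):

```python
import numpy as np

def skew(a):
    """Return [a]_x such that skew(a) @ b == np.cross(a, b)."""
    return np.array([[0.0,  -a[2],  a[1]],
                     [a[2],  0.0,  -a[0]],
                     [-a[1], a[0],  0.0]])
```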
Triangulation: linear approach
The constraints x_1 × (P_1 X) = 0 and x_2 × (P_2 X) = 0 each contribute two independent equations in the three unknown entries of X; stack them and solve by homogeneous least squares.
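The stacked homogeneous system can be sketched as follows for two views (pixel coordinates x1, x2; the helper name is my own, and noiseless data is assumed in the test):

```python
import numpy as np

def triangulate_linear(P1, P2, x1, x2):
    """Linear (DLT) triangulation: two equations per view from
    x_i cross (P_i X) = 0, solved by SVD. x1, x2 are (u, v) pairs."""
    A = np.vstack([
        x1[0] * P1[2] - P1[0],
        x1[1] * P1[2] - P1[1],
        x2[0] * P2[2] - P2[0],
        x2[1] * P2[2] - P2[1],
    ])
    _, _, Vt = np.linalg.svd(A)
    X = Vt[-1]                    # homogeneous solution
    return X[:3] / X[3]
```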
Triangulation: nonlinear approach
Find X that minimizes the reprojection error d(x_1, x'_1)^2 + d(x_2, x'_2)^2, where x'_i is the projection of X into image i.
Two-view geometry
Epipolar geometry
Baseline: the line connecting the two camera centers.
Epipoles: the intersections of the baseline with the image planes; equivalently, the projections of the other camera center.
Epipolar plane: a plane containing the baseline (a one-parameter family).
The epipole
Photo by Frank Dellaert
Epipolar geometry
Epipolar lines: the intersections of an epipolar plane with the image planes; they always come in corresponding pairs.
Example: converging cameras
Example: motion parallel to the image plane
Example: motion perpendicular to the image plane
The epipole has the same coordinates e in both images. Points move along lines radiating from e: the "focus of expansion".
Epipolar constraint
If we observe a point x in one image, where can the corresponding point x' be in the other image?
Epipolar constraint
Potential matches for x have to lie on the corresponding epipolar line l'.
Potential matches for x' have to lie on the corresponding epipolar line l.
Epipolar constraint example
Epipolar constraint: calibrated case
Assume that the intrinsic and extrinsic parameters of the cameras are known.
We can multiply the projection matrix of each camera (and the image points) by the inverse of the calibration matrix to get normalized image coordinates.
We can also set the global coordinate system to the coordinate system of the first camera; then the projection matrix of the first camera is [I | 0].
Epipolar constraint: calibrated case
With the second camera related to the first by rotation R and translation t, the point satisfies X = RX' + t, and the vectors x, t, and Rx' are coplanar.
Epipolar constraint: calibrated case
Coplanarity of x, t, and Rx' gives x · (t × Rx') = 0, i.e. x^T E x' = 0 with essential matrix E = [t]_x R (Longuet-Higgins, 1981).
Epipolar constraint: calibrated case
E x' is the epipolar line associated with x' (l = E x').
E^T x is the epipolar line associated with x (l' = E^T x).
E e' = 0 and E^T e = 0.
E is singular (rank two).
E has five degrees of freedom.
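These properties can be checked numerically; in this sketch the relative pose (R, t) and the 3D point are made-up values:

```python
import numpy as np

def skew(a):
    """[a]_x so that skew(a) @ b == np.cross(a, b)."""
    return np.array([[0.0, -a[2], a[1]],
                     [a[2], 0.0, -a[0]],
                     [-a[1], a[0], 0.0]])

# Made-up relative pose between the two cameras.
theta = np.deg2rad(10)
R = np.array([[np.cos(theta), -np.sin(theta), 0.0],
              [np.sin(theta),  np.cos(theta), 0.0],
              [0.0, 0.0, 1.0]])
t = np.array([1.0, 0.2, 0.0])

E = skew(t) @ R                 # essential matrix E = [t]_x R

# Rank two: the smallest singular value is (numerically) zero.
sv = np.linalg.svd(E, compute_uv=False)

# Epipolar constraint: a 3D point X in the second camera's frame projects
# to x' ~ X there and to x ~ R X + t in the first camera, so x^T E x' = 0.
X = np.array([0.5, -0.3, 4.0])
x, xp = R @ X + t, X
residual = x @ E @ xp           # ~ 0
```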
Epipolar constraint: uncalibrated case
The calibration matrices K and K' of the two cameras are unknown.
We can write the epipolar constraint in terms of unknown normalized coordinates: x̂^T E x̂' = 0, where x = K x̂ and x' = K' x̂'.
Epipolar constraint: uncalibrated case
Substituting x̂ = K^-1 x and x̂' = K'^-1 x' gives x^T F x' = 0 with fundamental matrix F = K^-T E K'^-1 (Faugeras and Luong, 1992).
Epipolar constraint: uncalibrated case
F x' is the epipolar line associated with x' (l = F x').
F^T x is the epipolar line associated with x (l' = F^T x).
F e' = 0 and F^T e = 0.
F is singular (rank two).
F has seven degrees of freedom.
The eight-point algorithm
With x = (u, v, 1)^T and x' = (u', v', 1)^T, each correspondence gives one linear equation in the entries of F. Minimize Σ_i (x_i^T F x'_i)^2 under the constraint F_33 = 1.
The eight-point algorithm
Meaning of the error: each term x_i^T F x'_i equals the Euclidean distance between the point x_i and the epipolar line F x'_i (or between x'_i and F^T x_i) multiplied by a scale factor.
Nonlinear approach: minimize the true sum of squared distances, Σ_i [d(x_i, F x'_i)^2 + d(x'_i, F^T x_i)^2].
Problem with the eight-point algorithm
Poor numerical conditioning; this can be fixed by rescaling the data.
The normalized eight-point algorithm (Hartley, 1995)
Center the image data at the origin, and scale it so that the mean squared distance between the origin and the data points is 2 pixels^2.
Use the eight-point algorithm to compute F from the normalized points.
Enforce the rank-2 constraint (for example, take the SVD of F and set the smallest singular value to zero).
Transform the fundamental matrix back to the original units: if T and T' are the normalizing transformations in the two images, then the fundamental matrix in the original coordinates is T^T F T'.
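The steps above can be sketched as follows, using the convention from these slides (x_i^T F x'_i = 0); the function names are my own, and the test below uses noiseless synthetic data:

```python
import numpy as np

def normalize(pts):
    """Translate the centroid to the origin and scale so the average
    distance from the origin is sqrt(2); return points and transform T."""
    c = pts.mean(axis=0)
    s = np.sqrt(2) / np.linalg.norm(pts - c, axis=1).mean()
    T = np.array([[s, 0.0, -s * c[0]],
                  [0.0, s, -s * c[1]],
                  [0.0, 0.0, 1.0]])
    return (pts - c) * s, T

def eight_point(x, xp):
    """Normalized eight-point algorithm: x, xp are (n, 2) arrays, n >= 8,
    satisfying x_i^T F xp_i = 0."""
    xn, T = normalize(x)
    xpn, Tp = normalize(xp)
    # One linear equation in the 9 entries of F per correspondence.
    A = np.array([[u * up, u * vp, u, v * up, v * vp, v, up, vp, 1.0]
                  for (u, v), (up, vp) in zip(xn, xpn)])
    _, _, Vt = np.linalg.svd(A)
    F = Vt[-1].reshape(3, 3)
    U, s, Vt2 = np.linalg.svd(F)              # enforce rank 2
    F = U @ np.diag([s[0], s[1], 0.0]) @ Vt2
    return T.T @ F @ Tp                       # back to original units
```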
Comparison of estimation algorithms

              8-point       Normalized 8-point    Nonlinear least squares
Av. Dist. 1   2.33 pixels   0.92 pixel            0.86 pixel
Av. Dist. 2   2.18 pixels   0.85 pixel            0.80 pixel
From epipolar geometry to camera calibration
Estimating the fundamental matrix is known as "weak calibration".
If we know the calibration matrices of the two cameras, we can estimate the essential matrix: E = K^T F K'.
The essential matrix gives us the relative rotation and translation between the cameras, that is, their extrinsic parameters.
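As a closing sketch of E = K^T F K' (the pose and calibration matrix below are invented; F is built from a known E so the recovery can be checked exactly):

```python
import numpy as np

def skew(a):
    """[a]_x so that skew(a) @ b == np.cross(a, b)."""
    return np.array([[0.0, -a[2], a[1]],
                     [a[2], 0.0, -a[0]],
                     [-a[1], a[0], 0.0]])

# Made-up ground truth: E from (R, t), then F = K^-T E K'^-1 (here K' = K).
K = np.array([[500.0, 0.0, 320.0],
              [0.0, 500.0, 240.0],
              [0.0,   0.0,   1.0]])
R = np.eye(3)
t = np.array([1.0, 0.0, 0.2])
E_true = skew(t) @ R
F = np.linalg.inv(K).T @ E_true @ np.linalg.inv(K)

# With known calibration, the essential matrix is recovered as E = K^T F K'.
E = K.T @ F @ K
```

A valid essential matrix has two equal singular values and one zero singular value, which is what constrains its five degrees of freedom.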