Multiple View Geometry Marc Pollefeys University of North Carolina at Chapel Hill Modified by Philippos Mordohai

2 Outline: computation of the camera matrix P; introduction to epipolar geometry estimation. Reading: Chapters 7 and 9 of “Multiple View Geometry in Computer Vision” by Hartley and Zisserman. Note (2011): the slides refer to chapter numbers from the previous edition of the book; I try to update them, but they may be off by one.

3 Pinhole camera

4 Camera anatomy: camera center, principal point, principal ray.

5 Camera calibration

6 Resectioning: establishing correspondences between 3D entities and image entities, and using them to compute the camera matrix P.

7 Basic equations Similar to last week’s H estimation

8 Basic equations. Minimal solution: P has 11 dof and each point correspondence gives 2 independent equations, so 5½ correspondences are needed (say 6). Over-determined solution: for n ≥ 6 points, minimize ||Ap|| subject to the constraint ||p|| = 1.

9 Degenerate configurations: (i) camera and points lie on a twisted cubic; (ii) points lie on a plane, or on a single line passing through the projection center.

10 Data normalization: translate so that the centroid of the points is at the origin, then scale isotropically so that the average distance from the origin is √2 for image points and √3 for 3D points.
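As a concrete illustration, here is a minimal numpy sketch of this normalization step (the target average distances √2 for image points and √3 for world points follow the Hartley–Zisserman recommendation; the function name is my own):

```python
import numpy as np

def normalize_points(pts):
    """Similarity transform T mapping the points so their centroid is at the
    origin and their average distance from it is sqrt(d) (d = 2 or 3)."""
    pts = np.asarray(pts, dtype=float)                 # n x d, inhomogeneous
    d = pts.shape[1]
    centroid = pts.mean(axis=0)
    scale = np.sqrt(d) / np.linalg.norm(pts - centroid, axis=1).mean()
    T = np.eye(d + 1)
    T[:d, :d] *= scale
    T[:d, d] = -scale * centroid
    pts_h = np.hstack([pts, np.ones((len(pts), 1))])   # homogeneous coordinates
    return (T @ pts_h.T).T, T
```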

11 Geometric error: assume the 3D points are known much more accurately than the image measurements, so only the error in the image is minimized.

12 Gold Standard algorithm. Objective: given n ≥ 6 3D-to-2D point correspondences {Xᵢ ↔ xᵢ}, determine the maximum likelihood estimate of P. Algorithm: (i) Linear solution: (a) normalization; (b) DLT: form the 2n×12 matrix A and solve for Ap = 0 subject to ||p|| = 1; p is the singular vector of A corresponding to the smallest singular value. (ii) Minimization of geometric error: using the linear estimate as a starting point, minimize the geometric error Σᵢ d(xᵢ, P X̃ᵢ)². (iii) Denormalization.
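A sketch of the linear (DLT) part of the algorithm, reusing the `normalize_points` helper above: each correspondence contributes two rows to the 2n×12 matrix A, and p is taken as the singular vector of A for the smallest singular value:

```python
def camera_dlt(X_world, x_img):
    """Linear resectioning: X_world is n x 3, x_img is n x 2, n >= 6; returns 3x4 P."""
    Xh, Tw = normalize_points(X_world)            # normalized homogeneous 3D points
    xh, Ti = normalize_points(x_img)              # normalized homogeneous image points
    rows = []
    for Xi, (u, v, w) in zip(Xh, xh):
        rows.append(np.hstack([np.zeros(4), -w * Xi,  v * Xi]))
        rows.append(np.hstack([ w * Xi, np.zeros(4), -u * Xi]))
    A = np.array(rows)                            # 2n x 12
    _, _, Vt = np.linalg.svd(A)
    P_norm = Vt[-1].reshape(3, 4)                 # ||p|| = 1 minimizer of ||Ap||
    return np.linalg.inv(Ti) @ P_norm @ Tw        # denormalization
```

The nonlinear step (ii) would then refine this P by minimizing the reprojection error, e.g. with a Levenberg–Marquardt routine.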

13 Calibration example: (i) Canny edge detection; (ii) straight-line fitting to the detected edges; (iii) intersecting the lines to obtain the image corners. Typical precision < 1/10 pixel. (HZ rule of thumb: use 5n constraints for n unknowns.)
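A rough OpenCV sketch of this corner-extraction pipeline; the file name and the use of `HoughLinesP` for the line-fitting step are my own illustrative choices:

```python
import cv2
import numpy as np

img = cv2.imread("calib_target.png", cv2.IMREAD_GRAYSCALE)   # hypothetical image
edges = cv2.Canny(img, 50, 150)                               # (i) edge detection
segs = cv2.HoughLinesP(edges, 1, np.pi / 180, threshold=80,
                       minLineLength=40, maxLineGap=5)        # (ii) line fitting

def seg_to_line(x1, y1, x2, y2):
    return np.cross([x1, y1, 1.0], [x2, y2, 1.0])             # homogeneous line p1 x p2

lines = [seg_to_line(*s[0]) for s in segs]
corners = []                                                  # (iii) intersect line pairs
for i in range(len(lines)):
    for j in range(i + 1, len(lines)):
        p = np.cross(lines[i], lines[j])
        if abs(p[2]) > 1e-8:                                  # skip (near-)parallel lines
            corners.append(p[:2] / p[2])
```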

14 Errors in the world: errors in the image and in the world points.

15 Restricted camera estimation. Find the best fit that satisfies: skew s is zero; pixels are square; principal point is known; complete calibration matrix K is known. Minimize geometric error: impose the constraints through the parameterization. Image error only: 9 parameters vs. 2n measurements; image and world error: 3n + 9 parameters vs. 5n measurements. Minimize algebraic error: assume a map from parameters q to P = K[R | −RC], i.e. p = g(q), and minimize ||A g(q)|| (9 instead of 12 parameters).

16 Restricted camera estimation. Initialization: use the general DLT, then clamp values to the desired ones, e.g. s = 0, αx = αy. Note: this can sometimes cause a big jump in the error. Alternative initialization: use the general DLT and impose soft constraints, gradually increasing their weights.

17 Exterior orientation (hand-eye calibration): calibrated camera, position and orientation unknown → pose estimation with 6 dof → 3 points give a minimal solution (4 solutions in general).
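In practice this is usually done with a library PnP solver; a minimal synthetic sketch using OpenCV's general `solvePnP` (a DLT-plus-refinement solver, not the minimal 3-point method mentioned on the slide):

```python
import cv2
import numpy as np

K = np.array([[800., 0., 320.], [0., 800., 240.], [0., 0., 1.]])   # assumed calibration
X = np.random.rand(6, 3) * 2.0 + [0.0, 0.0, 4.0]   # six world points in front of the camera
x = (K @ X.T).T
x = x[:, :2] / x[:, 2:]                            # project with P = K[I | 0]

ok, rvec, tvec = cv2.solvePnP(X, x, K, None)       # estimate pose from 3D-2D matches
R, _ = cv2.Rodrigues(rvec)                         # here R should be ~identity, t ~ 0
C = -R.T @ tvec                                    # camera center
```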

18 Experimental evaluation: the algebraic method minimizes 12 errors instead of 2n = 396.

19 Covariance estimation. Expected ML residual error: ε_res = σ(1 − d/(2n))^{1/2}, where d is the number of camera parameters. Example: n = 197, residual error 0.365, σ = 0.37.

20 Covariance for the estimated camera. Compute the Jacobian J of the measured points with respect to the camera parameters at the ML solution; then Σ = σ²(JᵀJ)⁻¹ (the variance of each parameter can be found on the diagonal). Confidence ellipsoid for the camera center: (C − C̄)ᵀ Σ_C⁻¹ (C − C̄) ≤ k², where k² is obtained from the inverse cumulative chi-square distribution with 3 degrees of freedom at the chosen confidence level (the chi-square distribution is the distribution of a sum of squares).
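A sketch of how this covariance and the ellipsoid threshold k² might be computed numerically; the residual function `r`, the forward-difference Jacobian and the 95% confidence level are illustrative assumptions:

```python
import numpy as np
from scipy.stats import chi2

def parameter_covariance(r, q_ml, sigma, eps=1e-6):
    """Sigma_q = sigma^2 (J^T J)^{-1}, J = Jacobian of the residuals at the ML solution q_ml."""
    r0 = np.asarray(r(q_ml))
    J = np.empty((len(r0), len(q_ml)))
    for k in range(len(q_ml)):
        dq = np.zeros_like(q_ml, dtype=float)
        dq[k] = eps
        J[:, k] = (np.asarray(r(q_ml + dq)) - r0) / eps    # forward differences
    return sigma ** 2 * np.linalg.inv(J.T @ J)             # parameter variances on the diagonal

k2 = chi2.ppf(0.95, df=3)   # threshold for a 95% confidence ellipsoid of the 3-dof camera center
```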


22 Radial distortion: examples at short and long focal length.



25 Correction of distortion. Choice of the distortion function and its center. Computing the parameters of the distortion function: (i) minimize the reprojection error with the distortion parameters as additional unknowns; (ii) straighten lines that are known to be straight in the scene.
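A sketch of option (i), assuming the common two-coefficient polynomial model L(r) = 1 + κ₁r² + κ₂r⁴ (the slides do not commit to a particular distortion function):

```python
import numpy as np
from scipy.optimize import least_squares

def distort(x, k1, k2, center):
    """Apply radial distortion about `center` to ideal image points x (n x 2)."""
    d = x - center
    r2 = (d ** 2).sum(axis=1, keepdims=True)
    return center + d * (1.0 + k1 * r2 + k2 * r2 ** 2)

def residuals(params, x_ideal, x_measured):
    k1, k2, cx, cy = params
    return (distort(x_ideal, k1, k2, np.array([cx, cy])) - x_measured).ravel()

# x_ideal: pinhole projections of known 3D points, x_measured: detected image points
# sol = least_squares(residuals, x0=[0.0, 0.0, width / 2, height / 2],
#                     args=(x_ideal, x_measured))
```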

26 Placing virtual models in video (Sony TRV900, miniDV): with unmodelled vs. modelled radial distortion; bundle adjustment is needed to avoid drift of the virtual object throughout the sequence.

27 Some typical calibration algorithms: Tsai calibration; Zhang calibration. Z. Zhang. A flexible new technique for camera calibration. IEEE Transactions on Pattern Analysis and Machine Intelligence, 22(11):1330–1334, 2000. Z. Zhang. Flexible camera calibration by viewing a plane from unknown orientations. International Conference on Computer Vision (ICCV'99), Corfu, Greece, pages 666–673, September 1999.
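Zhang's planar-target method is essentially what OpenCV's calibration routine implements; a minimal usage sketch, with the board size and file names as placeholders:

```python
import glob
import cv2
import numpy as np

board = (9, 6)                                         # inner corners of the checkerboard
objp = np.zeros((board[0] * board[1], 3), np.float32)
objp[:, :2] = np.mgrid[0:board[0], 0:board[1]].T.reshape(-1, 2)   # plane Z = 0

obj_pts, img_pts = [], []
for fname in glob.glob("calib_*.png"):                 # several views of the plane
    gray = cv2.imread(fname, cv2.IMREAD_GRAYSCALE)
    found, corners = cv2.findChessboardCorners(gray, board)
    if found:
        obj_pts.append(objp)
        img_pts.append(corners)

rms, K, dist, rvecs, tvecs = cv2.calibrateCamera(obj_pts, img_pts,
                                                 gray.shape[::-1], None, None)
```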

28 Related topics: calibration from vanishing points; calibration from the absolute conic.

29 Outline: computation of the camera matrix P; introduction to epipolar geometry estimation.

30 Three questions: (i) Correspondence geometry: given an image point x in the first image, how does this constrain the position of the corresponding point x′ in the second image? (ii) Camera geometry (motion): given a set of corresponding image points {xᵢ ↔ x′ᵢ}, i = 1, …, n, what are the cameras P and P′ for the two views? (iii) Scene geometry (structure): given corresponding image points xᵢ ↔ x′ᵢ and cameras P, P′, what is the position of their pre-image X in space?

31 The epipolar geometry: C, C′, x, x′ and X are coplanar.

32 The epipolar geometry: what if only C, C′ and x are known?

33 The epipolar geometry: all points on the epipolar plane π project onto l and l′.

34 The epipolar geometry: family of planes π and lines l and l′, all intersecting in the epipoles e and e′.

35 The epipolar geometry. Epipoles e, e′: intersection of the baseline with the image plane; projection of the other camera's projection center; vanishing point of the camera motion direction. An epipolar plane is a plane containing the baseline (a 1-D family). An epipolar line is the intersection of an epipolar plane with the image plane (epipolar lines always come in corresponding pairs).

36 Example: converging cameras

37 Example: motion parallel with the image plane (the simple stereo case → rectification).

38 Example: forward motion (epipoles e and e′).

39 The fundamental matrix F: the algebraic representation of epipolar geometry. We will see that the mapping from a point to its epipolar line is a (singular) correlation, i.e. a projective mapping from points to lines, represented by the fundamental matrix F.

40 The fundamental matrix F: correspondence condition. The fundamental matrix satisfies the condition that for any pair of corresponding points x ↔ x′ in the two images, x′ᵀ F x = 0.

41 The fundamental matrix F is the unique 3×3 rank-2 matrix that satisfies x′ᵀFx = 0 for all x ↔ x′. (i) Transpose: if F is the fundamental matrix for (P, P′), then Fᵀ is the fundamental matrix for (P′, P). (ii) Epipolar lines: l′ = Fx and l = Fᵀx′. (iii) Epipoles: e′ lies on all epipolar lines, so e′ᵀFx = 0 for all x, hence e′ᵀF = 0; similarly Fe = 0. (iv) F has 7 d.o.f.: 9 − 1 (homogeneous scale) − 1 (rank-2 constraint). (v) F is a correlation, a projective mapping from a point x to a line l′ = Fx (not a proper correlation, i.e. not invertible).
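A small numerical check of these properties, building F from two camera matrices via F = K′⁻ᵀ[t]×R K⁻¹ (the particular K, R, t values are arbitrary test data):

```python
import numpy as np

def skew(t):
    return np.array([[0, -t[2], t[1]], [t[2], 0, -t[0]], [-t[1], t[0], 0.0]])

K = np.array([[800., 0., 320.], [0., 800., 240.], [0., 0., 1.]])
R, t = np.eye(3), np.array([1.0, 0.2, 0.1])
P1 = K @ np.hstack([np.eye(3), np.zeros((3, 1))])
P2 = K @ np.hstack([R, t.reshape(3, 1)])
F = np.linalg.inv(K).T @ skew(t) @ R @ np.linalg.inv(K)   # same K in both views here

X = np.array([0.3, -0.2, 5.0, 1.0])                # a 3D point and its two projections
x1, x2 = P1 @ X, P2 @ X
assert abs(x2 @ F @ x1) < 1e-6                     # correspondence condition x'^T F x = 0
l2 = F @ x1                                        # epipolar line of x1 in the second image
e1 = np.linalg.svd(F)[2][-1]                       # right null vector: F e = 0
e2 = np.linalg.svd(F.T)[2][-1]                     # left null vector: e'^T F = 0
assert np.linalg.norm(F @ e1) < 1e-6 and np.linalg.norm(F.T @ e2) < 1e-6
```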

42 Projective transformation and invariance. The derivation is based purely on projective concepts: F is invariant to projective transformations of 3-space. The map (P, P′) → F is unique, but F → (P, P′) is not unique (the cameras are recovered only up to a projective transformation). Canonical form: P = [I | 0], P′ = [M | m], for which F = [m]× M.

43 The essential matrix: the specialization of the fundamental matrix to calibrated cameras (the calibration K is removed), E = [t]× R = K′ᵀ F K. It has 5 d.o.f. (3 for R, 2 for t up to scale). E is an essential matrix if and only if two of its singular values are equal and the third is zero.
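A short sketch of how the singular-value condition is typically enforced on a noisy estimate: replace the two largest singular values by their mean and zero the third (the noise model here is purely illustrative):

```python
import numpy as np

def skew(t):
    return np.array([[0, -t[2], t[1]], [t[2], 0, -t[0]], [-t[1], t[0], 0.0]])

R, t = np.eye(3), np.array([1.0, 0.2, 0.1])
E_true = skew(t) @ R                                # exact E: singular values (s, s, 0)
E_noisy = E_true + 1e-2 * np.random.randn(3, 3)     # e.g. output of a linear estimator

U, s, Vt = np.linalg.svd(E_noisy)
sigma = (s[0] + s[1]) / 2.0
E_hat = U @ np.diag([sigma, sigma, 0.0]) @ Vt       # closest matrix satisfying the constraint
```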

44 Four possible reconstructions from E (only one solution places the reconstructed points in front of both cameras).
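A sketch of the standard decomposition of E into the four candidate poses and of the cheirality test that selects among them; x1 and x2 are assumed to be one correspondence in normalized (calibrated) image coordinates:

```python
import numpy as np

def decompose_essential(E):
    """Four candidate poses P' = [R | t] for P = [I | 0]."""
    U, _, Vt = np.linalg.svd(E)
    if np.linalg.det(U) < 0:                       # force proper rotations
        U = -U
    if np.linalg.det(Vt) < 0:
        Vt = -Vt
    W = np.array([[0., -1., 0.], [1., 0., 0.], [0., 0., 1.]])
    R1, R2, t = U @ W @ Vt, U @ W.T @ Vt, U[:, 2]
    return [(R1, t), (R1, -t), (R2, t), (R2, -t)]

def triangulate(P1, P2, x1, x2):
    """Linear triangulation of one correspondence x1 <-> x2 (each (u, v))."""
    A = np.vstack([x1[0] * P1[2] - P1[0], x1[1] * P1[2] - P1[1],
                   x2[0] * P2[2] - P2[0], x2[1] * P2[2] - P2[1]])
    X = np.linalg.svd(A)[2][-1]
    return X / X[3]

def pick_solution(E, x1, x2):
    """Keep the candidate that puts the point in front of both cameras."""
    P1 = np.hstack([np.eye(3), np.zeros((3, 1))])
    for R, t in decompose_essential(E):
        P2 = np.hstack([R, t.reshape(3, 1)])
        X = triangulate(P1, P2, x1, x2)
        if X[2] > 0 and (R @ X[:3] + t)[2] > 0:    # positive depth in both views
            return R, t
```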