
1 Multiple View Geometry. Marc Pollefeys, University of North Carolina at Chapel Hill. Modified by Philippos Mordohai.

2 Outline
Parameter estimation
Robust estimation (RANSAC)
Chapter 3 of "Multiple View Geometry in Computer Vision" by Hartley and Zisserman

3 Parameter estimation
2D homography: given a set of (x_i, x_i'), compute H (x_i' = H x_i)
3D-to-2D camera projection: given a set of (X_i, x_i), compute P (x_i = P X_i)
Fundamental matrix: given a set of (x_i, x_i'), compute F (x_i'^T F x_i = 0)
Trifocal tensor: given a set of (x_i, x_i', x_i''), compute T

4 Number of measurements required
At least as many independent equations as degrees of freedom are required.
Example (2D homography): 2 independent equations per point and 8 degrees of freedom, so 4 points suffice: 4 x 2 ≥ 8.

5 Approximate solutions
Minimal solution: 4 points yield an exact solution for H.
More points:
–No exact solution, because measurements are inexact ("noise")
–Search for the "best" solution according to some cost function
–Algebraic or geometric/statistical cost

6 Gold Standard algorithm
A cost function that is optimal under some assumptions.
The computational algorithm that minimizes it is called the "Gold Standard" algorithm.
Other algorithms can then be compared to it.

7 Direct Linear Transformation (DLT)
x_i' = H x_i is an equality of homogeneous vectors, so x_i' × (H x_i) = 0. Writing h for the 9-vector of the entries of H, the cross product gives a linear system A_i h = 0, where A_i is a 3x9 matrix built from x_i and x_i'.

8 The equations are linear in h.
Only 2 of the 3 equations are linearly independent (indeed, 2 equations per point); the third row may be dropped only if w_i' ≠ 0.
This holds for any homogeneous representation of x_i', e.g. (x_i', y_i', 1).

9 Direct Linear Transformation (DLT)
Solving for H: stacking the A_i for 4 point correspondences gives A of size 8x9 (or 12x9 if all three rows are kept), but of rank 8.
The trivial solution h = 0 is not of interest.
The 1-D null-space yields the solution of interest; pick, for example, the solution with ||h|| = 1.

10 Direct Linear Transformation (DLT)
Over-determined case (more than 4 points): no exact solution because measurements are inexact, i.e. "noise".
Find an approximate solution:
- An additional constraint is needed to avoid h = 0, e.g. ||h|| = 1
- Ah = 0 is not possible, so minimize ||Ah||

11 Singular Value Decomposition
Homogeneous least-squares: decompose A = U D V^T. The h that minimizes ||Ah|| subject to ||h|| = 1 is the last column of V, i.e. the right singular vector corresponding to the smallest singular value.

12 DLT algorithm
Objective: Given n ≥ 4 2D-to-2D point correspondences {x_i ↔ x_i'}, determine the 2D homography matrix H such that x_i' = H x_i.
Algorithm:
(i) For each correspondence x_i ↔ x_i', compute A_i. Usually only the first two rows are needed.
(ii) Assemble the n 2x9 matrices A_i into a single 2n x 9 matrix A.
(iii) Obtain the SVD of A. The solution for h is the last column of V.
(iv) Determine H from h.
A minimal code sketch follows.
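
A minimal Python/NumPy sketch of the basic (unnormalized) DLT described above; the function name and array conventions are illustrative, and Hartley normalization of the points is omitted for brevity:

```python
import numpy as np

def dlt_homography(x, xp):
    """Basic DLT: estimate H (3x3) from n >= 4 correspondences.
    x, xp: (n, 2) arrays of points such that xp_i ~ H x_i.
    Note: for good conditioning the points should be normalized first
    (Hartley normalization), omitted here for brevity."""
    n = x.shape[0]
    A = np.zeros((2 * n, 9))
    for i in range(n):
        X = np.array([x[i, 0], x[i, 1], 1.0])        # x_i in homogeneous form
        xp_i, yp_i = xp[i, 0], xp[i, 1]              # w_i' = 1
        # Two independent rows of A_i (third row dropped since w_i' != 0)
        A[2 * i]     = np.concatenate([np.zeros(3), -X, yp_i * X])
        A[2 * i + 1] = np.concatenate([X, np.zeros(3), -xp_i * X])
    # h is the right singular vector of A with the smallest singular value
    _, _, Vt = np.linalg.svd(A)
    h = Vt[-1]
    return h.reshape(3, 3)
```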

13 Inhomogeneous solution
Since h can only be computed up to scale, pick h_j = 1, e.g. h_9 = 1, and solve for the remaining 8-vector.
Solve using Gaussian elimination (4 points) or linear least-squares (more than 4 points).
However, if h_9 = 0 this approach fails, and it gives poor results if h_9 is close to zero. Therefore, it is not recommended.
Note: h_9 = H_33 = 0 if the origin is mapped to infinity.

14 Degenerate configurations
[Figure: point configurations x_1, x_2, x_3, x_4 with three collinear points, and candidate mappings H?, H'? (case A and case B)]
Constraints: A_i h = 0 for i = 1, 2, 3, 4.
Define H* = x_4' l^T, where l is the line through the three collinear points. Then H* satisfies all four constraints, but H* is a rank-1 matrix and thus not a homography.
If H* is the unique solution, then no homography maps x_i → x_i' (case B).
If a further solution H exists, then so does αH* + βH (case A): a 2-D null-space instead of a 1-D null-space.

15 Solutions from lines, etc.
2D homographies from 2D lines: minimum of 4 lines.
3D homographies (15 dof): minimum of 5 points or 5 planes.
2D affinities (6 dof): minimum of 3 points or lines.
A conic provides 5 constraints.
Mixed configurations?

16 Cost functions
Algebraic distance
Geometric distance
Reprojection error
Comparison
Geometric interpretation
Sampson error

17 Algebraic distance
DLT minimizes the residual vector ||Ah||. The partial vector for each correspondence (x_i ↔ x_i') is the algebraic error vector ε_i = A_i h, and its norm defines the algebraic distance, so that Σ_i d_alg(x_i', H x_i)^2 = ||Ah||^2.
Not geometrically/statistically meaningful, but given good normalization it works fine and is very fast (use for initialization).
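
The underlying definitions, in Hartley and Zisserman's notation, can be written as:

```latex
d_{\mathrm{alg}}(\mathbf{x}_1,\mathbf{x}_2)^2 = a_1^2 + a_2^2
\quad\text{where}\quad
\mathbf{a} = (a_1,a_2,a_3)^{\mathsf T} = \mathbf{x}_1 \times \mathbf{x}_2,
\qquad
\sum_i d_{\mathrm{alg}}(\mathbf{x}_i', H\mathbf{x}_i)^2
  = \sum_i \|\boldsymbol{\epsilon}_i\|^2
  = \|A\,\mathbf{h}\|^2
  = \|\boldsymbol{\epsilon}\|^2 .
```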

18 Geometric distance
Notation: x = measured coordinates, x̂ = estimated coordinates, x̄ = true coordinates; d(.,.) denotes Euclidean distance in the image.
Error in one image (e.g. a calibration pattern): Σ_i d(x_i', H x̄_i)^2.
Symmetric transfer error: Σ_i d(x_i, H^-1 x_i')^2 + d(x_i', H x_i)^2.
Reprojection error: see the following slides.
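
A minimal sketch of the symmetric transfer error in Python/NumPy (the function names and array-shape conventions are illustrative, not from the slides):

```python
import numpy as np

def transfer_dist_sq(H, x, xp):
    """Squared Euclidean transfer distance d(x', Hx)^2 for one correspondence.
    x, xp are 2-vectors of inhomogeneous image coordinates."""
    Hx = H @ np.array([x[0], x[1], 1.0])
    return np.sum((Hx[:2] / Hx[2] - xp) ** 2)

def symmetric_transfer_error(H, x, xp):
    """Sum over all correspondences of d(x, H^-1 x')^2 + d(x', H x)^2."""
    Hinv = np.linalg.inv(H)
    return sum(transfer_dist_sq(H, xi, xpi) + transfer_dist_sq(Hinv, xpi, xi)
               for xi, xpi in zip(x, xp))
```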

19 Reprojection error
Estimate a homography Ĥ together with perfectly matched points x̂_i and x̂_i' (with x̂_i' = Ĥ x̂_i), and minimize Σ_i d(x_i, x̂_i)^2 + d(x_i', x̂_i')^2.

20 Comparison of geometric and algebraic distances
Error in one image: d(x_i', H x̄_i) = d_alg(x_i', H x̄_i) / ŵ_i', where ŵ_i' is the third coordinate of H x̄_i; ŵ_i' ≈ 1 is typical, but not exact, except for affinities (for which ŵ_i' = 1 always).
For affinities, DLT can therefore minimize the geometric distance.
Possibility for an iterative algorithm.

21 Geometric interpretation of reprojection error
Estimating the homography corresponds to fitting a variety ν_H to the points X_i = (x_i, y_i, x_i', y_i') in the measurement space R^4; the reprojection error is the distance from the measured X_i to the closest point on ν_H. Analogous to conic fitting.

22 Sampson error
The vector X̂ that minimizes the geometric error is the closest point on the variety ν_H to the measurement X.
The Sampson error lies between the algebraic and the geometric error: it is a first-order approximation of the variety. Writing ε for the algebraic error vector, J = ∂ε/∂X, and δ_X = X̂ - X, the linearized constraint is ε(X + δ_X) ≈ ε(X) + J δ_X = 0.
Find the vector δ_X that minimizes ||δ_X||^2 subject to J δ_X = -ε.

23 Find the vector δ_X that minimizes ||δ_X||^2 subject to J δ_X = -ε.
Use Lagrange multipliers: minimize ||δ_X||^2 - 2 λ^T (J δ_X + ε).
Setting the derivatives with respect to δ_X and λ to zero gives δ_X = J^T λ and J δ_X = -ε, hence λ = -(J J^T)^-1 ε.

24 Sampson error
The minimizing vector is δ_X = -J^T (J J^T)^-1 ε, and its squared norm is the Sampson error:
||δ_X||^2 = ε^T (J J^T)^-1 ε.

25 Sampson approximation
A few points:
–For a 2D homography, X = (x, y, x', y'); ε = A(X) h is the algebraic error vector; J = ∂ε/∂X is a 2x4 matrix, e.g. ∂ε_1/∂x = -h_4 + y' h_7.
–Similar to the algebraic error; in fact, the same as a Mahalanobis distance on ε with covariance J J^T.
–The Sampson error is independent of linear reparametrization (it cancels out between ε and J).
–Must be summed over all points.
–Close to the geometric error, but with far fewer unknowns.
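
A sketch of the Sampson error for a homography, using the two-row error vector and 2x4 Jacobian described above (Python/NumPy; the function name is illustrative):

```python
import numpy as np

def sampson_error(H, x, xp):
    """First-order (Sampson) approximation of the geometric error for one
    correspondence x <-> x' under homography H. x, xp are 2-vectors.
    Uses the two-row algebraic error eps = A_i h and its 2x4 Jacobian J
    with respect to X = (x, y, x', y')."""
    h = H.ravel()                      # h_1..h_9 as h[0]..h[8]
    X = np.array([x[0], x[1], 1.0])
    xpv, ypv = xp
    # Algebraic error vector (same two rows as in the DLT sketch above)
    e = np.array([
        -(h[3:6] @ X) + ypv * (h[6:9] @ X),
        (h[0:3] @ X) - xpv * (h[6:9] @ X),
    ])
    w = h[6:9] @ X                     # third coordinate of H x
    # Jacobian d(eps)/d(x, y, x', y')
    J = np.array([
        [-h[3] + ypv * h[6], -h[4] + ypv * h[7], 0.0, w],
        [h[0] - xpv * h[6],  h[1] - xpv * h[7], -w, 0.0],
    ])
    # Sampson error: eps^T (J J^T)^-1 eps
    return float(e @ np.linalg.solve(J @ J.T, e))
```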

26 Statistical cost function and Maximum Likelihood Estimation
The optimal cost function is related to the noise model. Assume zero-mean isotropic Gaussian noise (and that outliers have been removed).
Error in one image: the Maximum Likelihood Estimate minimizes Σ_i d(x_i', H x̄_i)^2.

27 Statistical cost function and Maximum Likelihood Estimation
The optimal cost function is related to the noise model. Assume zero-mean isotropic Gaussian noise (and that outliers have been removed).
Error in both images: the Maximum Likelihood Estimate minimizes Σ_i d(x_i, x̂_i)^2 + d(x_i', x̂_i')^2, i.e. the reprojection error.
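
In Hartley and Zisserman's notation, the two ML cost functions can be written as:

```latex
% Error in one image (true points \bar{x}_i known, e.g. a calibration pattern):
\min_{H} \;\sum_i d\!\left(\mathbf{x}_i',\, H\bar{\mathbf{x}}_i\right)^2
% Error in both images (reprojection error):
\min_{\hat{H},\,\hat{\mathbf{x}}_i,\,\hat{\mathbf{x}}_i'} \;\sum_i
  d\!\left(\mathbf{x}_i,\hat{\mathbf{x}}_i\right)^2 +
  d\!\left(\mathbf{x}_i',\hat{\mathbf{x}}_i'\right)^2
\quad \text{subject to } \hat{\mathbf{x}}_i' = \hat{H}\hat{\mathbf{x}}_i
```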

28 Mahalanobis distance
General Gaussian case: for a measurement X with covariance matrix Σ, minimize the Mahalanobis distance ||X - X̄||_Σ^2 = (X - X̄)^T Σ^-1 (X - X̄).
Error in two images (independent): ||X - X̄||_Σ^2 + ||X' - X̄'||_Σ'^2, possibly with covariances varying per point.

29 Outline
Parameter estimation
Robust estimation (RANSAC)
Chapter 3 of "Multiple View Geometry in Computer Vision" by Hartley and Zisserman

30 Robust estimation
What if the set of matches contains gross outliers?

31 RANSAC
Objective: Robust fit of a model to a data set S which contains outliers.
Algorithm:
(i) Randomly select a sample of s data points from S and instantiate the model from this subset.
(ii) Determine the set of data points S_i which are within a distance threshold t of the model. The set S_i is the consensus set of the sample and defines the inliers of S.
(iii) If the size of S_i is greater than some threshold T, re-estimate the model using all the points in S_i and terminate.
(iv) If the size of S_i is less than T, select a new subset and repeat the above.
(v) After N trials the largest consensus set S_i is selected, and the model is re-estimated using all the points in the subset S_i.
A code sketch of the consensus-set test appears below.
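
A small sketch of step (ii), the consensus-set test, reusing transfer_dist_sq from the geometric-distance sketch above (illustrative; the symmetric transfer error is one possible choice of distance):

```python
import numpy as np

def consensus_set(H, x, xp, t):
    """Indices of correspondences within distance threshold t of the model H.
    Uses the symmetric transfer error; the Sampson error would also work."""
    Hinv = np.linalg.inv(H)
    inliers = []
    for i, (xi, xpi) in enumerate(zip(x, xp)):
        d2 = transfer_dist_sq(H, xi, xpi) + transfer_dist_sq(Hinv, xpi, xi)
        if d2 < t ** 2:
            inliers.append(i)
    return inliers
```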

32 How many samples?
Choose t so that the probability that an inlier falls within the threshold is α (e.g. 0.95), or choose it empirically.
Choose N so that, with probability p, at least one random sample is free from outliers, e.g. p = 0.99.

Number of samples N for sample size s and proportion of outliers e:

  s \ e   5%   10%   20%   25%   30%   40%   50%
  2       2    3     5     6     7     11    17
  3       3    4     7     9     11    19    35
  4       3    5     9     13    17    34    72
  5       4    6     12    17    26    57    146
  6       4    7     16    24    37    97    293
  7       4    8     20    33    54    163   588
  8       5    9     26    44    78    272   1177
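
The table follows from N = log(1 - p) / log(1 - (1 - e)^s). A minimal sketch (p = 0.99 assumed, as above):

```python
import math

def num_samples(s, e, p=0.99):
    """Number of RANSAC samples N such that, with probability p, at least one
    sample of size s is free from outliers, given outlier fraction e."""
    return math.ceil(math.log(1 - p) / math.log(1 - (1 - e) ** s))

# Example: num_samples(4, 0.2) == 9, matching the table above.
```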

33 Acceptable consensus set?
Typically, terminate when the size of the consensus set reaches the number of inliers expected for the assumed proportion of outliers, e.g. T = (1 - e)n.

34 Adaptively determining the number of samples
e is often unknown a priori, so pick the worst case, e.g. 50%, and adapt if more inliers are found; e.g. 80% inliers would yield e = 0.2.
–N = ∞, sample_count = 0
–While N > sample_count, repeat:
  Choose a sample and count the number of inliers
  Set e = 1 - (number of inliers)/(total number of points)
  Recompute N from e
  Increment sample_count by 1
–Terminate
A code sketch of this adaptive loop is given below.
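
A sketch of the adaptive loop applied to homography fitting, reusing dlt_homography, consensus_set, and num_samples from the sketches above. The max_iters cap and the choice to update e only from the best consensus set so far are assumptions, not from the slide:

```python
import numpy as np

def ransac_homography(x, xp, t=1.25, p=0.99, s=4, max_iters=10000):
    """Adaptive RANSAC for a 2D homography.
    x, xp: (n, 2) arrays of putative correspondences."""
    n = len(x)
    rng = np.random.default_rng()
    best_inliers, best_H = [], None
    N, sample_count = np.inf, 0
    while sample_count < N and sample_count < max_iters:
        idx = rng.choice(n, size=s, replace=False)    # random minimal sample
        H = dlt_homography(x[idx], xp[idx])
        inliers = consensus_set(H, x, xp, t)
        if len(inliers) > len(best_inliers):
            best_inliers, best_H = inliers, H
            e = 1.0 - len(inliers) / n                # adapt outlier fraction
            N = num_samples(s, e, p) if e > 0 else 0
        sample_count += 1
    # Re-estimate from all inliers (the ML / Gold Standard step would follow)
    if best_inliers:
        best_H = dlt_homography(x[best_inliers], xp[best_inliers])
    return best_H, best_inliers
```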

35 Robust Maximum Likelihood Estimation
The previous MLE algorithm considers a fixed set of inliers.
Better: a robust cost function that reclassifies points, e.g. summing γ(d_⊥i) with γ(e) = e^2 if e^2 < t^2 (inlier) and γ(e) = t^2 otherwise (outlier).

36 Other robust algorithms
RANSAC maximizes the number of inliers.
LMedS minimizes the median error:
–No threshold
–Needs an estimate of the percentage of outliers
Not recommended: case deletion, iterative least-squares, etc.

37 Automatic computation of H
Objective: Compute the homography between two images.
Algorithm:
(i) Interest points: compute interest points in each image.
(ii) Putative correspondences: compute a set of interest point matches based on some similarity measure.
(iii) RANSAC robust estimation: repeat for N samples
  (a) Select 4 correspondences and compute H
  (b) Calculate the distance d⊥ for each putative match
  (c) Compute the number of inliers consistent with H (d⊥ < t)
  Choose the H with the most inliers.
(iv) Optimal estimation: re-estimate H from all inliers by minimizing the ML cost function with Levenberg-Marquardt.
(v) Guided matching: determine more matches using the prediction given by the computed H.
Optionally iterate the last two steps until convergence.
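
A possible sketch of step (iv), refining H over the inliers with Levenberg-Marquardt via scipy.optimize.least_squares; for brevity it minimizes the symmetric transfer error rather than the full reprojection-error ML cost:

```python
import numpy as np
from scipy.optimize import least_squares

def refine_homography(H0, x, xp):
    """Refine an initial homography H0 over inlier correspondences x <-> xp
    by minimizing the symmetric transfer error with Levenberg-Marquardt."""
    def residuals(h):
        H = h.reshape(3, 3)
        Hinv = np.linalg.inv(H)
        res = []
        for xi, xpi in zip(x, xp):
            a = H @ np.array([xi[0], xi[1], 1.0])
            b = Hinv @ np.array([xpi[0], xpi[1], 1.0])
            res.extend(a[:2] / a[2] - xpi)   # transfer error x -> x'
            res.extend(b[:2] / b[2] - xi)    # transfer error x' -> x
        return np.asarray(res)
    h = least_squares(residuals, H0.ravel(), method='lm').x
    H = h.reshape(3, 3)
    return H / H[2, 2]                        # fix the scale
```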

38 Determine putative correspondences
Compare interest points using a similarity measure:
–SAD, SSD, ZNCC on a small neighborhood
If the motion is limited, only consider interest points with similar coordinates.
More advanced approaches exist, based on invariance...
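
Minimal sketches of two of the similarity measures mentioned above (ZNCC and SSD on image patches; the function names are illustrative):

```python
import numpy as np

def zncc(patch_a, patch_b):
    """Zero-mean normalized cross-correlation between two equally sized
    patches; returns a value in [-1, 1], with 1 meaning a perfect match."""
    a = patch_a.astype(float).ravel()
    b = patch_b.astype(float).ravel()
    a -= a.mean()
    b -= b.mean()
    denom = np.linalg.norm(a) * np.linalg.norm(b)
    return float(a @ b / denom) if denom > 0 else 0.0

def ssd(patch_a, patch_b):
    """Sum of squared differences; small values indicate a good match."""
    d = patch_a.astype(float) - patch_b.astype(float)
    return float(np.sum(d * d))
```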

39 Example: robust computation
Interest points: 500 per image (640x480 images).
Putative correspondences: 268 (best match, SSD < 20, search range ±320).
Outliers: 117 (t = 1.25 pixel; 43 iterations).
Inliers: 151.
Final inliers: 262 (2 MLE-inlier cycles; d⊥ = 0.23 → d⊥ = 0.19; 10 Levenberg-Marquardt iterations).

  #inliers   1-e    adaptive N
  6          2%     20M
  10         3%     2.5M
  44         16%    6,922
  58         21%    2,291
  73         26%    911
  151        56%    43

