1 Comparative Survey on Fundamental Matrix Estimation
X. Armangué, J. Pagès, J. Salvi and J. Batlle
Computer Vision and Robotics Group, Institute of Informatics and Applications
University of Girona, Girona (Spain)
{armangue, jpages, qsalvi, jbatlle}@eia.udg.es
2 Contents:
1.- Stereo Vision
2.- The Epipolar Geometry
3.- Computing the Fundamental Matrix
  3.1.- Linear methods
  3.2.- Iterative methods
  3.3.- Robust methods
4.- Experimental Results
3 Stereo Vision: the principle of triangulation (figure: a 3D point reconstructed from two camera views).
4 Stereo Vision: getting the 3D point

% Back-project both image points into the world frame {W}
% (the homogeneous transforms are completed with a [0 0 0 1] row so they can be inverted)
T21 = [R1(:,1:3) T1; 0 0 0 1];  invT21 = inv(T21);
P2Dw1 = invT21 * [Xu1; Yu1; f1; 1];  Ocw1 = invT21(:,4);
T22 = [R2(:,1:3) T2; 0 0 0 1];  invT22 = inv(T22);
P2Dw2 = invT22 * [Xu2; Yu2; f2; 1];  Ocw2 = invT22(:,4);

% Intersect the two viewing rays (for a single correspondence)
pq = Ocw2(1:3) - Ocw1(1:3);
u  = P2Dw1(1:3) - Ocw1(1:3);
v  = P2Dw2(1:3) - Ocw2(1:3);
alpha = (pq'*v - (pq'*u)*norm(v)^2/(u'*v)) / ((u'*v) - norm(u)^2*norm(v)^2/(u'*v));
beta  = (-pq'*u + alpha*norm(u)^2) / (u'*v);
r = Ocw1(1:3) + alpha.*u;
s = Ocw2(1:3) + beta.*v;
P3Dstereo = (r + s)./2;   % midpoint of the closest segment between the rays
disterror = norm(r - s);  % distance between the rays (reconstruction error)

(Figure: optical centres Oc1 and Oc2, image points P2D1 and P2D2, rays u and v, baseline pq, and the reconstructed point P3D in the world frame {W}.)
5 Stereo Vision: Camera Pose
- 3D reconstruction: optics and internal geometry.
- Constraints: the correspondence problem.
- Active systems: neither camera position nor orientation is static.
- Epipolar geometry (figure: image planes I and I', optical centres Oc and Oc', a 3D point M projecting onto m and m', the image origins Oi and Oi', and the world frame Ow).
6 The Epipolar Geometry
The world frame Ow coincides with the second camera frame Oc'.
(Figure: image planes I and I', optical centres Oc and Oc', corresponding points m and m', epipoles e and e', epipolar lines l'm and lm', a 3D point M, and the intrinsic and extrinsic parameters relating both cameras.)
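For reference, the standard relations behind this slide (not written out in the extracted text) are, with m and m' the homogeneous projections of M in the two images:

\[ m'^{\top} F\, m = 0, \qquad l'_{m} = F\, m, \qquad l_{m'} = F^{\top} m', \qquad F = K'^{-\top} [t]_{\times} R\, K^{-1} \]

where K and K' are the intrinsic matrices and (R, t) the extrinsic parameters relating the two cameras.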
7 The Epipolar Geometry
(Figure: epipolar geometry of camera 1 and camera 2, showing the epipoles, epipolar lines and corresponding points, with zooms of two image areas.)
8 Computing the Fundamental Matrix: The Survey
Linear methods, iterative methods and robust methods.
Test image sets: Driving, INRIA, Aerial, Underwater.
9 Computing the Fundamental Matrix: Survey — overview of the methods

Method                                      Type       Optimisation     Rank-2 F
Seven points (7p)                           Linear     --               yes
Eight points (8p)                           Linear     LS or Eig.       no
Rank-2 constraint                           Linear     LS               yes
Iterative Newton-Raphson                    Iterative  LS               no
Linear iterative                            Iterative  LS               no
Non-linear minimization in parameter space  Iterative  Eig.             yes
Gradient technique                          Iterative  LS or Eig.       no
M-Estimator                                 Robust     LS or Eig.       no / yes
LMedS                                       Robust     7p / LS or Eig.  no
RANSAC                                      Robust     7p / Eig.        no

LS: Least-Squares   Eig: Eigen Analysis
10 Computing the Fundamental Matrix: Linear Methods
- Seven points: the result depends strongly on which seven points are used.
- Eight points (least-squares minimization or eigen analysis): results improve as the number of points increases; the eigen minimization is more realistic.
- Analytical method with rank-2 constraint: forces a unique epipole, but the results do not improve.
A sketch of the eight-point method with the rank-2 constraint follows.
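A minimal MATLAB sketch of the eight-point method solved by eigen analysis (via SVD), with the rank-2 constraint enforced afterwards. The function name eight_point and the 3xN inputs m1, m2 (homogeneous points with third coordinate 1) are illustrative assumptions, not the authors' implementation:

function F = eight_point(m1, m2)
  N = size(m1, 2);
  % Each correspondence gives one equation m2'*F*m1 = 0, stacked as A*f = 0
  % with f = [F11 F12 F13 F21 F22 F23 F31 F32 F33]'
  A = [m2(1,:)'.*m1(1,:)', m2(1,:)'.*m1(2,:)', m2(1,:)', ...
       m2(2,:)'.*m1(1,:)', m2(2,:)'.*m1(2,:)', m2(2,:)', ...
       m1(1,:)',           m1(2,:)',           ones(N,1)];
  % Eigen analysis: f is the eigenvector of A'*A with the smallest eigenvalue,
  % i.e. the right singular vector of A with the smallest singular value
  [~, ~, V] = svd(A);
  F = reshape(V(:, end), 3, 3)';
  % Rank-2 constraint: zero the smallest singular value of F
  [U, S, V2] = svd(F);
  S(3, 3) = 0;
  F = U * S * V2';
end

The plain least-squares variant instead fixes F33 = 1 and solves A(:,1:8)*f = -A(:,9) with the backslash operator.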
11 Linear Methods: results of the seven-points, 8-points least-squares and 8-points eigen-analysis methods under noise and outliers.
12 Computing the Fundamental Matrix: Iterative Methods
- Iterative Newton-Raphson: good results, but depends on the initial guess.
- Linear iterative method: each equation is weighted using the F of the previous step, and F is recomputed by least-squares at each iteration (a sketch follows). Improves linear least-squares considerably.
- Non-linear minimization in parameter space: forces a rank-2 F, but the discrepancy is high.
- Gradient technique: least-squares or eigen analysis; better results with eigen analysis.
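A minimal sketch of such a linear iterative scheme, reusing the A matrix from the eight-point sketch above. The gradient-based weight, the fixed number of iterations n_iter and the function name iterative_linear are illustrative assumptions:

function F = iterative_linear(m1, m2, n_iter)
  N = size(m1, 2);
  A = [m2(1,:)'.*m1(1,:)', m2(1,:)'.*m1(2,:)', m2(1,:)', ...
       m2(2,:)'.*m1(1,:)', m2(2,:)'.*m1(2,:)', m2(2,:)', ...
       m1(1,:)',           m1(2,:)',           ones(N,1)];
  f8 = A(:,1:8) \ (-A(:,9));                 % plain least-squares start (F33 fixed to 1)
  F  = reshape([f8; 1], 3, 3)';
  for k = 1:n_iter
    l1 = F  * m1;                            % epipolar lines in image 2 from the previous F
    l2 = F' * m2;                            % epipolar lines in image 1 from the previous F
    w  = sqrt(l1(1,:).^2 + l1(2,:).^2 + l2(1,:).^2 + l2(2,:).^2)';
    w  = max(w, eps);                        % avoid division by zero
    Aw = A ./ repmat(w, 1, 9);               % weight each equation by its gradient
    f8 = Aw(:,1:8) \ (-Aw(:,9));             % re-solve the weighted least-squares system
    F  = reshape([f8; 1], 3, 3)';
  end
end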
13 Iterative Methods: results of Newton-Raphson, linear minimization and forcing rank-2 under noise and outliers.
14 Computing the Fundamental Matrix: Robust Methods
M-Estimator:
- Reduces the effect of outliers by weighting the residual of each point.
- Many variants, each defined by a different weight function: least-squares, eigen analysis, Torr, etc.
- Good results in the presence of Gaussian noise in point localization; bad results in the presence of outliers.
LMedS and RANSAC (a RANSAC sketch follows this list):
- The points used to compute F are randomly selected.
- LMedS uses the median of the distances; RANSAC maximises the number of inliers.
- LMedS is more restrictive than RANSAC (it removes more points).
- Once the outliers are removed, F is recomputed.
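A minimal RANSAC sketch for F, reusing eight_point() from the earlier sketch. The sample size, the threshold thresh (in pixels), the number of trials n_trials and the function name ransac_fundamental are illustrative choices, not the values used in the survey:

function [F, inliers] = ransac_fundamental(m1, m2, n_trials, thresh)
  N = size(m1, 2);
  best = 0;  inliers = false(1, N);
  for t = 1:n_trials
    idx = randperm(N, 8);                            % random sample of correspondences
    Fs  = eight_point(m1(:,idx), m2(:,idx));
    l1  = Fs  * m1;  l2 = Fs' * m2;
    num = sum(m2 .* l1, 1);                          % m2'*F*m1 for every correspondence
    d   = num.^2 ./ (l1(1,:).^2 + l1(2,:).^2 + l2(1,:).^2 + l2(2,:).^2 + eps);
    in  = d < thresh^2;                              % gradient-weighted distance test
    if sum(in) > best
      best = sum(in);  inliers = in;                 % keep the consensus with most inliers
    end
  end
  F = eight_point(m1(:,inliers), m2(:,inliers));     % recompute F from the inliers only
end

LMedS follows the same sampling scheme but scores each candidate F with the median of the distances d instead of counting inliers.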
15 Robust Methods: Initial Matching
16 Robust Methods: M-Estimator (Torr)
17 Robust Methods: RANSAC
18 Robust Methods: results of LMedS with eigen analysis under noise and outliers.
19 Computing the Fundamental Matrix: Results
Methods implemented (compared by the mean and standard deviation of the error), grouped into linear, iterative and robust:
1.- Seven points
2.- Least-squares (LS)
3.- Orthogonal LS
4.- Rank-2 constraint
5.- Iterative linear using LS
6.- Iterative Newton-Raphson using LS
7.- Minimization in parameter space using eigen
8.- Gradient using LS
9.- Gradient using eigen
10.- M-Estimator using LS
11.- M-Estimator using eigen
12.- M-Estimator proposed by Torr
13.- LMedS using LS
14.- LMedS using eigen
15.- RANSAC using eigen
20 Computing the Fundamental Matrix: computation time of the linear, iterative and robust methods.
21 Conclusions
- Survey of fifteen methods for computing F.
- Conditions: Gaussian noise, outliers and real images.
- Linear methods: good results if the points are well located and the correspondence problem has been solved beforehand (no outliers).
- Iterative methods: can cope with noise but are inefficient in the presence of outliers.
- Robust methods: cope with both noise and outliers.
- Eigen analysis is better than least-squares.
- Rank-2 matrices are preferred if a good geometry is required.
- Better results if the data is previously normalized: points translated so that their centroid is at the origin and scaled so that the coordinates lie within [-1, 1] (a sketch follows).
Code available: http://eia.udg.es/~armangue/research.html
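A minimal sketch of that normalization, assuming m is a 3xN matrix of homogeneous image points; the function name normalize_points is an illustrative assumption. Each image gets its own transform (T1, T2), and the F estimated from the normalized points is denormalized with F = T2' * Fn * T1:

function [mn, T] = normalize_points(m)
  c = mean(m(1:2,:), 2);                       % centroid of the image points
  centred = [m(1,:) - c(1); m(2,:) - c(2)];    % translate the centroid to the origin
  s = 1 / max(abs(centred(:)));                % scale so coordinates lie in [-1, 1]
  T = [s 0 -s*c(1); 0 s -s*c(2); 0 0 1];
  mn = T * m;                                  % normalized homogeneous points
end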