Robust Estimation
Course web page: vision.cis.udel.edu/~cv
April 23, 2003, Lecture 25

Announcements
- Read the assigned Forsyth & Ponce chapters and the Hartley & Zisserman chapter on triangulation and structure computation
- HW 4 is due Monday
  - Affine rectification: choose H_A carefully for texture mapping

Outline
- RANSAC

The Problem with Outliers
- Least squares is a technique for fitting a model to data whose errors follow a Gaussian distribution
- When there are outliers, i.e. data points that are not drawn from the same distribution, the estimate can be badly biased
- Example: line fitting by regression is pulled away from the true line by a few outliers (figure from Hartley & Zisserman)
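As a small illustration (an addition, not from the original slides; the data values are made up), the NumPy sketch below fits a line by least squares with and without a single gross outlier and shows how much the estimate shifts:

```python
import numpy as np

# Points that lie almost exactly on y = 2x + 1
x = np.array([0.0, 1.0, 2.0, 3.0, 4.0, 5.0])
y = 2.0 * x + 1.0 + np.array([0.02, -0.01, 0.03, 0.0, -0.02, 0.01])

# Least-squares fit on the clean data recovers roughly (slope, intercept) = (2, 1)
slope, intercept = np.polyfit(x, y, deg=1)

# Add one gross outlier and refit: both parameters shift noticeably
x_out = np.append(x, 5.0)
y_out = np.append(y, 30.0)          # far from the line y = 2x + 1
slope_b, intercept_b = np.polyfit(x_out, y_out, deg=1)

print("clean fit:   ", slope, intercept)
print("with outlier:", slope_b, intercept_b)
```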

Robust Estimation
View estimation as a two-stage process:
- Classify data points as outliers or inliers
- Fit the model to the inliers only

RANSAC (RANdom SAmple Consensus)
1. Randomly choose the minimal subset of data points needed to fit the model (a sample)
2. Points within some distance threshold t of the model form its consensus set; the size of the consensus set is the model's support
3. Repeat for N samples; the model with the largest support is the most robust fit
   - Points within t of the best model are the inliers
   - Fit the final model to all inliers
(Figure: two samples and their supports for line fitting, from Hartley & Zisserman)
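The loop above can be written down compactly; the following Python sketch (an illustration, not the slides' code; the threshold t, sample count, and seed are caller-supplied) runs RANSAC for 2-D line fitting:

```python
import numpy as np

def fit_line(p, q):
    """Line through two points as (a, b, c) with a*x + b*y + c = 0 and a^2 + b^2 = 1."""
    (x1, y1), (x2, y2) = p, q
    a, b = y2 - y1, x1 - x2
    norm = np.hypot(a, b)
    if norm == 0:
        return None                          # degenerate sample: the two points coincide
    a, b = a / norm, b / norm
    return a, b, -(a * x1 + b * y1)

def ransac_line(points, t, n_samples, seed=0):
    """points: (N, 2) array. Returns the best line and a boolean inlier mask."""
    rng = np.random.default_rng(seed)
    best_line, best_inliers = None, np.zeros(len(points), dtype=bool)
    for _ in range(n_samples):
        i, j = rng.choice(len(points), size=2, replace=False)    # minimal sample, s = 2
        line = fit_line(points[i], points[j])
        if line is None:
            continue
        a, b, c = line
        # Perpendicular distances of all points to the candidate line
        d = np.abs(a * points[:, 0] + b * points[:, 1] + c)
        inliers = d < t                                          # consensus set
        if inliers.sum() > best_inliers.sum():                   # largest support wins
            best_line, best_inliers = line, inliers
    return best_line, best_inliers
```

The final refit to all inliers (the last step above) would then be a standard least-squares fit restricted to best_inliers.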

RANSAC: Picking the Distance Threshold t
- Usually chosen empirically
- But when the measurement error is known to be Gaussian with mean μ and variance σ²:
  - The squared point-to-model error, divided by σ², follows a χ² distribution with m degrees of freedom, where m is the DOF of the error measure (the codimension), so t² can be set from a χ² quantile
  - E.g., m = 1 for line fitting, because the error is a perpendicular distance
  - E.g., m = 2 for a point distance
- Examples, for probability α = 0.95 that a point is an inlier:

  m | Model                     | t²
  1 | Line, fundamental matrix  | 3.84 σ²
  2 | Homography, camera matrix | 5.99 σ²
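The χ² quantiles in the table can be reproduced with SciPy; a minimal sketch (added for illustration; sigma2 is a placeholder for the known noise variance):

```python
from scipy.stats import chi2

sigma2 = 1.0          # assumed noise variance sigma^2 (placeholder value)
alpha = 0.95

# t^2 = chi2_inverse_cdf(alpha, m) * sigma^2
t2_line       = chi2.ppf(alpha, df=1) * sigma2   # ~3.84 sigma^2 (line, fundamental matrix)
t2_homography = chi2.ppf(alpha, df=2) * sigma2   # ~5.99 sigma^2 (homography, camera matrix)
print(t2_line, t2_homography)
```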

RANSAC: How many samples?
- Using all possible samples is often infeasible
- Instead, pick N to ensure a probability p that at least one sample (containing s points) is all inliers:

  N = log(1 - p) / log(1 - (1 - ε)^s)

  where ε is the probability that any given point is an outlier
- Typically p = 0.99
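As a sketch, this formula is a one-line function in Python (illustrative; the function name is my own):

```python
import math

def ransac_num_samples(p, eps, s):
    """Number of samples N so that, with probability p, at least one
    sample of s points is outlier-free, given outlier fraction eps."""
    return math.ceil(math.log(1.0 - p) / math.log(1.0 - (1.0 - eps) ** s))

# Line fitting (s = 2) with 20% outliers: N = 5
print(ransac_num_samples(0.99, 0.20, 2))
```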

RANSAC: Computed N (p = 0.99)

  Sample size s | proportion of outliers ε
                |  5% | 10% | 20% | 25% | 30% | 40% |  50%
        2       |   2 |   3 |   5 |   6 |   7 |  11 |   17
        3       |   3 |   4 |   7 |   9 |  11 |  19 |   35
        4       |   3 |   5 |   9 |  13 |  17 |  34 |   72
        5       |   4 |   6 |  12 |  17 |  26 |  57 |  146
        6       |   4 |   7 |  16 |  24 |  37 |  97 |  293
        7       |   4 |   8 |  20 |  33 |  54 | 163 |  588
        8       |   5 |   9 |  26 |  44 |  78 | 272 | 1177

(N computed from the formula above with p = 0.99; adapted from Hartley & Zisserman)

Example: N for the line-fitting problem
- n = 12 points
- Minimal sample size s = 2
- 2 outliers, so ε = 1/6, roughly 20%
- So N = 5 gives us a 99% chance of getting a pure-inlier sample
  - Compared with N = 66 when trying every pair of points
(figure from Hartley & Zisserman)

RANSAC: Determining N adaptively
If the outlier fraction ε is not known in advance, it can be estimated iteratively:
1. Set N = ∞ and the outlier fraction to the worst case, e.g., ε = 0.5
2. For every sample, count the number of inliers (its support)
3. Update the outlier fraction whenever the new estimate is lower than the previous one: ε = 1 - (number of inliers) / (total number of points)
   a. Recompute N from the formula using the new ε
   b. If the number of samples checked so far exceeds the current N, stop
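A sketch of this adaptive loop in Python (illustrative; draw_sample_and_count_inliers is a hypothetical callback standing in for drawing one minimal sample and measuring its support, and for simplicity N is initialized from the worst-case ε rather than from ∞):

```python
import math

def n_from_eps(eps, s, p):
    """RANSAC sample count for outlier fraction eps, sample size s, confidence p."""
    return math.ceil(math.log(1.0 - p) / math.log(1.0 - (1.0 - eps) ** s))

def adaptive_ransac_trials(draw_sample_and_count_inliers, n_points, s, p=0.99):
    """Run samples until the adaptively updated N is reached; returns samples used.
    draw_sample_and_count_inliers(): hypothetical callback returning one sample's support."""
    eps = 0.5                              # worst-case outlier fraction to start
    N = n_from_eps(eps, s, p)              # N implied by the worst case
    samples_done = 0
    best_support = 0
    while samples_done < N:
        best_support = max(best_support, draw_sample_and_count_inliers())
        eps_new = 1.0 - best_support / n_points
        if eps_new < eps:                  # update only when the estimate improves
            eps = eps_new
            N = n_from_eps(eps, s, p)
        samples_done += 1
    return samples_done
```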

After RANSAC
- RANSAC divides the data into inliers and outliers and yields an estimate computed from the minimal sample with the greatest support
- Improve this initial estimate with ML estimation over all inliers (i.e., standard minimization)
- But the refined model may change which points count as inliers, so alternate model fitting with re-classification of points as inliers/outliers
(figure from Hartley & Zisserman)
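One way to realize this alternation for the line-fitting example (an illustrative sketch, not the slides' method; total least squares via SVD stands in for the ML fit under isotropic Gaussian noise, and it reuses the line representation from the RANSAC sketch above):

```python
import numpy as np

def refit_line_tls(points):
    """Total-least-squares line through a point set, as (a, b, c) with a*x + b*y + c = 0."""
    centroid = points.mean(axis=0)
    # The line normal is the right singular vector with the smallest singular value
    _, _, vt = np.linalg.svd(points - centroid)
    a, b = vt[-1]
    c = -(a * centroid[0] + b * centroid[1])
    return a, b, c

def refine(points, line, t, max_iters=10):
    """Alternate fitting to the current inliers with re-classification at threshold t."""
    a, b, c = line
    inliers = np.abs(a * points[:, 0] + b * points[:, 1] + c) < t
    for _ in range(max_iters):
        a, b, c = refit_line_tls(points[inliers])         # ML-style fit to current inliers
        d = np.abs(a * points[:, 0] + b * points[:, 1] + c)
        new_inliers = d < t                               # re-classify
        if np.array_equal(new_inliers, inliers):          # classification is stable: stop
            break
        inliers = new_inliers
    return (a, b, c), inliers
```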

Automatic Fundamental Matrix F Estimation
How to get correct correspondences without human intervention?
(figures from Hartley & Zisserman)

Automatic F Estimation: Feature Extraction
Find features in each image of the pair using corner detection, e.g., keep points where the minimum eigenvalue of the second-moment matrix, summed over a window, exceeds a threshold:

  [ I_x^2     I_x I_y ]
  [ I_x I_y   I_y^2   ]

where I_x and I_y are the image derivatives
(figure from Hartley & Zisserman)
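In OpenCV this criterion is available directly: goodFeaturesToTrack scores corners by the minimum eigenvalue of this matrix (Shi-Tomasi) by default. A hedged sketch, with placeholder file name and parameter values:

```python
import cv2

gray = cv2.imread("left.png", cv2.IMREAD_GRAYSCALE)   # placeholder image path

# Keep up to 500 corners whose min-eigenvalue score is at least 1% of the best one,
# spaced at least 10 pixels apart.
corners = cv2.goodFeaturesToTrack(gray, maxCorners=500,
                                  qualityLevel=0.01, minDistance=10)
corners = corners.reshape(-1, 2)       # (N, 2) array of (x, y) corner locations
```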

Automatic F Estimation: Finding Feature Matches
For each corner, take the best match scoring above a threshold within a square search window (here ±300 pixels), using SSD or normalized cross-correlation
(figure from Hartley & Zisserman)
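A sketch of this matching step using normalized cross-correlation via OpenCV's matchTemplate (illustrative; the patch size, search radius, and score threshold below are assumptions, not values from the slides):

```python
import cv2
import numpy as np

def match_corner(left, right, x, y, patch=7, search=300, thresh=0.8):
    """Find the best NCC match in `right` for the corner (x, y) in `left`."""
    h, w = left.shape
    r = patch // 2
    if not (r <= x < w - r and r <= y < h - r):
        return None
    template = left[y - r:y + r + 1, x - r:x + r + 1]

    # Search window of +/- `search` pixels around the same location in the right image
    x0, x1 = max(0, x - search), min(right.shape[1], x + search)
    y0, y1 = max(0, y - search), min(right.shape[0], y + search)
    window = right[y0:y1, x0:x1]

    scores = cv2.matchTemplate(window, template, cv2.TM_CCOEFF_NORMED)
    _, best, _, (bx, by) = cv2.minMaxLoc(scores)
    if best < thresh:
        return None                     # no match above threshold
    return (x0 + bx + r, y0 + by + r)   # matched corner location in the right image
```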

Automatic F Estimation: Initial Match Hypotheses
188 matched features in the left image, each pointing to the location of its corresponding feature in the right image
(figure from Hartley & Zisserman)

Automatic F Estimation: Applying RANSAC
- Sampling
  - Size: recall that the fundamental matrix has 7 DOF, so s = 7
    (9 entries in the 3 x 3 matrix, minus 1 for homogeneous scaling, minus 1 for the rank-2 constraint)
  - Choice:
    - Disregard degenerate configurations
    - Ensure the sampled points have a good spatial distribution over the image
- Distance measure
  - The obvious choice is the symmetric epipolar distance already defined
  - A better choice is the reprojection error, or an approximation of it
    - This involves simultaneous estimation of F and the 3-D point locations
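In practice this entire RANSAC stage is available as a single OpenCV call; the sketch below is illustrative, with synthetic correspondences generated only so the snippet is self-contained (the real inputs would be the putative matches from the previous slides):

```python
import cv2
import numpy as np

# Synthetic stand-in for the putative matches: random 3-D points seen from
# two camera positions separated by a small x-translation, with 20% corrupted.
rng = np.random.default_rng(0)
X = rng.uniform([-1, -1, 4], [1, 1, 8], size=(200, 3))          # points in front of the cameras
K = np.array([[500.0, 0, 320], [0, 500.0, 240], [0, 0, 1]])

def project(X, t):
    x = (K @ (X + t).T).T
    return (x[:, :2] / x[:, 2:]).astype(np.float32)

pts_left = project(X, np.array([0.0, 0.0, 0.0]))
pts_right = project(X, np.array([-0.2, 0.0, 0.0]))
pts_right[:40] += rng.uniform(-50, 50, size=(40, 2)).astype(np.float32)   # gross outliers

# method = FM_RANSAC, distance threshold t = 1.25 px, confidence p = 0.99
F, mask = cv2.findFundamentalMat(pts_left, pts_right, cv2.FM_RANSAC, 1.25, 0.99)
inliers = mask.ravel().astype(bool)
print("inliers:", inliers.sum(), "of", len(pts_left))
```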

Automatic F Estimation: Outliers & Inliers after RANSAC
- 407 samples used, with t = 1.25 pixels
  - RMS pixel error with the estimated F was …
- 89 outliers (ε = 0.47)
- 99 inliers
(figures from Hartley & Zisserman)