Presentation transcript:

CS 1674: Intro to Computer Vision. Fitting Models: Hough Transform & RANSAC. Prof. Adriana Kovashka, University of Pittsburgh, October 19, 2016

Plan for today
Last lecture: detecting edges.
This lecture: detecting lines (and other shapes) by finding the parameters of a line that best fits our data:
- Least squares
- Hough transform
- RANSAC

Characterizing edges
An edge is a place of rapid change in the image intensity function. [Figure: intensity function along a horizontal scanline and its first derivative, the image gradient; edges correspond to extrema of the gradient.]
Adapted from L. Lazebnik

Gradients -> edges
Primary edge detection steps:
1. Smoothing: suppress noise.
2. Edge enhancement: filter for contrast (compute gradients).
3. Edge localization: determine which local maxima from the filter output are actually edges vs. noise.
Basic step: thresholding. Choose a threshold value t; set any pixel below t to 0 (off) and any pixel greater than or equal to t to 1 (on). More advanced steps: thinning and hysteresis.
Adapted from K. Grauman
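A minimal NumPy sketch of the thresholding step (illustrative only; the image `img` and threshold `t` are assumed inputs, and the smoothing step is omitted):

```python
import numpy as np

def threshold_gradient(img, t):
    # Gradient by finite differences; np.gradient returns (d/drow, d/dcol).
    gy, gx = np.gradient(img.astype(float))
    mag = np.hypot(gx, gy)               # gradient magnitude
    return (mag >= t).astype(np.uint8)   # on (1) where magnitude >= t, else off (0)
```

Raising t keeps only the strongest edges, as in the two thresholded images that follow.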

Original image Source: K. Grauman

Thresholding gradient with a higher threshold Source: K. Grauman

Related: Line detection (fitting)
Why fit lines? Many objects are characterized by the presence of straight lines. Why aren't we done just by running edge detection?
Kristen Grauman

Difficulty of line fitting
- Noise in measured edge points and orientations (e.g., edges not collinear where they should be): how do we detect the true underlying parameters?
- Extra edge points (clutter): which points go with which line, if any?
- Only some parts of each line are detected, and some parts are missing: how do we find a line that bridges the missing evidence?
Adapted from Kristen Grauman

Fitting other objects
We want to associate a model with observed features. The model could be a line, a circle, or an arbitrary shape. [Fig from Marszalek & Schmid, 2007]
Kristen Grauman

Least squares line fitting
Data: (x1, y1), ..., (xn, yn). Line equation: yi = m xi + b. Find (m, b) to minimize the vertical residuals
E = sum_i (yi - (m xi + b))^2,
i.e., the squared difference between where the fitted line says each point lies along the y axis and where the point really is. You want a single line that "explains" all of the points in your data, but the data may be noisy!
Matlab: p = A \ y;
Adapted from Svetlana Lazebnik
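In NumPy, the equivalent of Matlab's `p = A \ y` is `np.linalg.lstsq`; a small self-contained sketch (the data values here are made up for illustration):

```python
import numpy as np

x = np.array([0.0, 1.0, 2.0, 3.0])
y = np.array([0.1, 2.1, 3.9, 6.2])

A = np.column_stack([x, np.ones_like(x)])        # rows [x_i, 1]
(m, b), *_ = np.linalg.lstsq(A, y, rcond=None)   # minimizes ||A p - y||^2
print(m, b)                                      # fitted slope and intercept
```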

Outliers affect least squares fit
[Figures: least-squares fits skewed by outlying points.]
Kristen Grauman

Outliers
Outliers can hurt the quality of our parameter estimates, e.g., an edge point that is noise or doesn't belong to the line we are fitting. Two common ways to deal with outliers, both of which boil down to voting:
- Hough transform
- RANSAC
Kristen Grauman

Dealing with outliers: Voting
Voting is a general technique where we let each feature vote for all models that are compatible with it: cycle through the features, cast votes for model parameters, and look for model parameters that receive a lot of votes. Noise and clutter features will cast votes too, but typically their votes are inconsistent with the majority of "good" features.
Adapted from Kristen Grauman

Fitting lines: Hough transform
Given points that belong to a line: what is the line? How many lines are there? Which points belong to which lines? The Hough transform is a voting technique that can be used to answer all of these questions. Main idea: record a vote for each possible line on which some edge point lies, then look for lines that get many votes.
Kristen Grauman

Finding lines in an image: Hough space
[Figure: image space with axes (x, y) vs. Hough (parameter) space with axes (m, b).]
Connection between image (x, y) and Hough (m, b) spaces: a line y = m0 x + b0 in the image corresponds to a single point (m0, b0) in Hough space.
Steve Seitz

Finding lines in an image: Hough space
What does a point (x0, y0) in image space map to in Hough space? To go from image space to Hough space, given a point (x, y), find all (m, b) such that y = mx + b. Answer: the solutions of b = -x0 m + y0, which is a line in Hough space.
Adapted from Steve Seitz

Finding lines in an image: Hough space
What are the line parameters for the line that contains both (x0, y0) and (x1, y1)? It is the intersection of the Hough-space lines b = -x0 m + y0 and b = -x1 m + y1.
Steve Seitz
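To make that intersection explicit (a step the slide leaves to the reader): setting -x0 m + y0 = -x1 m + y1 and solving gives

m = (y1 - y0) / (x1 - x0),    b = y0 - m x0,

which is exactly the slope and intercept of the image-space line through (x0, y0) and (x1, y1).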

Finding lines in an image: Hough space
How can we use this to find the most likely parameters (m, b) for the most prominent line in image space? Let each edge point in image space vote for a set of possible parameters in Hough space. Accumulate votes in a discrete set of bins; the parameters with the most votes indicate the line in image space.
Steve Seitz

Finding lines in an image: Hough space
Basic ideas:
- A line is represented as y = mx + n. Every line in the image corresponds to a point (m, n) in the parameter space.
- Every point (x, y) in the image corresponds to a line in the parameter space. (Why? Fix (x, y); then (m, n) can vary along n = -xm + y. In other words, the parameters of all lines passing through (x, y) in image space form a line in parameter space.)
- Points along a line in image space correspond to lines passing through the same point in the parameter space. (Why?)
[Figure: example image points and the corresponding accumulator counts.]
Adapted from Silvio Savarese

Parameter space representation
Problems with the (m, b) space: the parameter domains are unbounded, and vertical lines require infinite m. Alternative: a polar representation, ρ = x cos θ + y sin θ. Each point (x, y) adds a sinusoid in the (θ, ρ) parameter space.
Svetlana Lazebnik

Parameter space representation
P.V.C. Hough, Machine Analysis of Bubble Chamber Pictures, Proc. Int. Conf. High Energy Accelerators and Instrumentation, 1959. Use a polar representation for the parameter space: each line in the image is a point (θ, ρ), and each image point traces a sinusoid in Hough parameter space.
Silvio Savarese

Algorithm outline
Initialize accumulator H to all zeros.
For each feature point (x, y) in the image:
    For θ = 0 to 180:
        ρ = x cos θ + y sin θ
        H(θ, ρ) = H(θ, ρ) + 1
Find the value(s) of (θ*, ρ*) where H(θ, ρ) is a local maximum. The detected line in the image is given by ρ* = x cos θ* + y sin θ*.
Svetlana Lazebnik
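A compact Python version of this outline (a sketch, not an optimized implementation; `edges` is assumed to be a binary (H, W) edge map):

```python
import numpy as np

def hough_lines(edges, theta_res_deg=1.0, rho_res=1.0):
    h, w = edges.shape
    thetas = np.deg2rad(np.arange(0.0, 180.0, theta_res_deg))
    rho_max = np.hypot(h, w)                 # largest possible |rho|
    H = np.zeros((len(thetas), int(np.ceil(2 * rho_max / rho_res))), dtype=np.int64)
    ys, xs = np.nonzero(edges)               # feature points (x, y)
    for x, y in zip(xs, ys):
        for ti, t in enumerate(thetas):
            rho = x * np.cos(t) + y * np.sin(t)
            H[ti, int((rho + rho_max) / rho_res)] += 1   # cast one vote
    return H, thetas, rho_max

# Local maxima of H give (theta*, rho*); the detected line satisfies
# rho* = x cos(theta*) + y sin(theta*).
```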

Hough transform example Derek Hoiem http://ostatic.com/files/images/ss_hough.jpg

Impact of noise on Hough
[Figure: image-space edge coordinates (x, y) and the resulting votes in (θ, d) space.]
Silvio Savarese

Impact of noise on Hough
[Figure: noisier edge coordinates and the resulting spread-out votes.]
What difficulty does this present for an implementation?
Kristen Grauman

Impact of noise on Hough
Noisy data [figure: features and votes; cf. Forsyth & Ponce Sec. 15.1]: the peak is spread across many bins, so we need to adjust the grid size or smooth the accumulator.
Silvio Savarese

Impact of noise on Hough
[Figure: image-space edge coordinates and votes.] Here everything appears to be "noise" (random edge points), but we still see peaks in the vote space.
Kristen Grauman

Incorporating image gradients
Recall: when we detect an edge point, we also know its gradient direction. But this means that the line is uniquely determined! Modified Hough transform:
For each feature point (x, y) in the image:
    θ = gradient orientation at (x, y)
    ρ = x cos θ + y sin θ
    H(θ, ρ) = H(θ, ρ) + 1
Svetlana Lazebnik
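The same accumulator with the gradient shortcut, vectorized (a sketch; `gx`, `gy` are assumed gradient images aligned with the binary mask `edges`, and the binning matches `hough_lines` above):

```python
import numpy as np

def hough_lines_oriented(edges, gx, gy, n_theta=180, rho_res=1.0):
    h, w = edges.shape
    rho_max = np.hypot(h, w)
    H = np.zeros((n_theta, int(np.ceil(2 * rho_max / rho_res))), dtype=np.int64)
    ys, xs = np.nonzero(edges)
    theta = np.arctan2(gy[ys, xs], gx[ys, xs]) % np.pi   # orientation in [0, pi)
    rho = xs * np.cos(theta) + ys * np.sin(theta)
    ti = (theta / np.pi * n_theta).astype(int)
    ri = ((rho + rho_max) / rho_res).astype(int)
    np.add.at(H, (ti, ri), 1)    # exactly one vote per edge point
    return H
```

The inner loop over all 180 values of θ disappears: each edge point now casts a single vote.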

Hough transform for circles
A circle with radius r and center (a, b) can be described as:
x = a + r cos(θ)
y = b + r sin(θ)

Hough transform for circles
Circle: center (a, b) and radius r.
- For a fixed radius r and unknown gradient direction: each image point votes for a circle of candidate centers in (a, b) Hough space; the intersection of these circles, where most votes occur, is the center.
- For an unknown radius r and unknown gradient direction: each image point votes for a cone in (a, b, r) Hough space.
- For an unknown radius r but known gradient direction: each image point votes only along a ray in the gradient direction.
[Figures: image space vs. Hough space for each case.]
Kristen Grauman

Hough transform for circles
x = a + r cos(θ)
y = b + r sin(θ)
For every edge pixel (x, y):
    For each possible radius value r:
        For each possible gradient direction θ:   // or use the estimated gradient at (x, y)
            a = x - r cos(θ)   // column
            b = y - r sin(θ)   // row
            H[a, b, r] += 1
Modified from Kristen Grauman
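A runnable version of this pseudocode (a sketch; `edges` is a binary (H, W) array and `radii` a list of candidate radii, both assumed inputs):

```python
import numpy as np

def hough_circles(edges, radii, n_theta=100):
    h, w = edges.shape
    acc = np.zeros((h, w, len(radii)), dtype=np.int64)   # acc[b, a, r]: row, column, radius
    thetas = np.linspace(0, 2 * np.pi, n_theta, endpoint=False)
    ys, xs = np.nonzero(edges)
    for ri, r in enumerate(radii):
        for t in thetas:
            a = np.round(xs - r * np.cos(t)).astype(int)   # candidate center columns
            b = np.round(ys - r * np.sin(t)).astype(int)   # candidate center rows
            ok = (a >= 0) & (a < w) & (b >= 0) & (b < h)   # stay inside the image
            np.add.at(acc, (b[ok], a[ok], ri), 1)
    return acc   # peaks give circle centers (a, b) and the radius index
```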

Example: detecting circles with Hough
[Figures: original image, edges, votes for the penny radius, votes for the quarter radius, and the combined detections.]
Note: a different Hough transform (with separate accumulators) was used for each circle radius (quarters vs. penny).
Kristen Grauman, images from Vivek Kwatra

Example: iris detection
[Figures: gradient + threshold; Hough space (fixed radius); max detections.]
Hemerson Pistori and Eduardo Rocha Costa http://rsbweb.nih.gov/ij/plugins/hough-circles.html
Kristen Grauman

Voting: practical tips
- Minimize irrelevant tokens first (reduce noise).
- Choose a good grid / discretization. Too coarse: large votes obtained when too many different lines correspond to a single bucket. Too fine: miss lines because points that are not exactly collinear cast votes for different buckets.
- Vote for neighbors, also (smoothing in the accumulator array).
- Use the direction of the edge to reduce the number of parameters by 1.
- To read back which points voted for the "winning" peaks, keep tags on the votes.
Kristen Grauman

Generalized Hough transform (quickly)
We want to find a template defined by its reference point (center) and several distinct types of landmark points in a stable spatial configuration. [Figure: template whose triangle, circle, and diamond markers each denote some type of visual token, e.g., a feature or edge point.]
- The center or reference point is not necessarily a visible point; it's just the "origin" of the shape's coordinate system.
- The landmark points don't have to be located on the boundary; they just need to be in a stable spatial configuration w.r.t. the center.
- The types of landmark points correspond to "visual words", features with a distinct local appearance (how to identify such visual words will be discussed later in the course).
Adapted from Svetlana Lazebnik

Generalized Hough transform (quickly)
What if we want to detect arbitrary shapes? (Specifically: where is the reference point?) Intuition: [figures: model image with reference point and displacement vectors; novel image; vote space]. Now suppose those colors encode gradient directions...
Adapted from Kristen Grauman

Generalized Hough transform (quickly)
Define a model shape by its boundary points p_i and a reference point a. Offline procedure: at each boundary point, compute the displacement vector r = a - p_i, and store these vectors in a table indexed by gradient orientation θ.
Kristen Grauman [Dana H. Ballard, Generalizing the Hough Transform to Detect Arbitrary Shapes, 1980]

Generalized Hough transform (quickly)
Detection procedure: for each edge point, use its gradient orientation θ to index into the stored table, then use the retrieved r vectors to vote for the reference point. (Assuming translation is the only transformation here, i.e., orientation and scale are fixed.)
Kristen Grauman
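A sketch of both procedures under the slides' translation-only assumption (the helper `quantize` and the bin count are illustrative choices, not part of the original slides):

```python
import numpy as np
from collections import defaultdict

def quantize(theta, n_bins=36):
    # Map a gradient orientation to a table row.
    return int((theta % np.pi) / np.pi * n_bins) % n_bins

def build_r_table(boundary_pts, orientations, ref_point):
    # Offline: store displacement vectors r = a - p_i, indexed by orientation.
    table = defaultdict(list)
    for (x, y), th in zip(boundary_pts, orientations):
        table[quantize(th)].append((ref_point[0] - x, ref_point[1] - y))
    return table

def ght_vote(edge_pts, orientations, table, shape):
    # Detection: each edge point votes for reference-point locations.
    # H is indexed as H[x, y]; pass shape = (width, height).
    H = np.zeros(shape, dtype=np.int64)
    for (x, y), th in zip(edge_pts, orientations):
        for dx, dy in table[quantize(th)]:        # retrieved r vectors
            cx, cy = int(x + dx), int(y + dy)
            if 0 <= cx < shape[0] and 0 <= cy < shape[1]:
                H[cx, cy] += 1
    return H   # the peak of H is the detected reference point
```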

Generalized Hough transform (quickly)
Template representation: for each type of landmark point, store all possible displacement vectors towards the center. [Figures: model and template.]
Svetlana Lazebnik

Generalized Hough transform (quickly)
Detecting the template: for each feature in a new image, look up that feature type in the model and vote for the possible center locations associated with that type. [Figures: model and test image; the small circles represent votes for possible center locations.] Note that some features present in the template may be absent in the test image, feature positions can move around, and new "noise" or "clutter" features can be added.
Svetlana Lazebnik

Preview: Hough for object detection
Index displacements by "visual codeword". [Figure: training image; a "visual codeword" with its displacement vectors.]
B. Leibe, A. Leonardis, and B. Schiele, Combined Object Categorization and Segmentation with an Implicit Shape Model, ECCV Workshop on Statistical Learning in Computer Vision 2004
Svetlana Lazebnik

Hough transform: pros and cons
Pros:
- All points are processed independently, so it can cope with occlusion and gaps.
- Some robustness to noise: noise points are unlikely to contribute consistently to any single bin.
- Can detect multiple instances of a model in a single pass.
Cons:
- The complexity of the search for maxima increases exponentially with the number of model parameters: with 3 parameters and 10 choices for each, the search is O(10^3).
- Non-target shapes can produce spurious peaks in parameter space.
- Quantization: it can be tricky to pick a good grid size.
Adapted from Kristen Grauman

RANSAC
RANSAC = RANdom SAmple Consensus. Approach: we want to avoid the impact of outliers, so let's look for "inliers" and use only those. Intuition: if an outlier is chosen to compute the current fit, then the resulting line won't have much support from the rest of the points.
Kristen Grauman

RANSAC: General form
RANSAC loop:
1. Randomly select a seed group of s points on which to base the model estimate.
2. Fit the model to these s points.
3. Find inliers to this model (i.e., points whose distance from the line is less than t).
4. If there are d or more inliers, re-compute the estimate of the model on all of the inliers.
Repeat N times, and keep the model with the largest number of inliers.
Modified from Kristen Grauman and Svetlana Lazebnik
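A minimal line-fitting RANSAC following this loop, with s = 2 (the parameter names s, t, d, N match the slide; everything else is an illustrative sketch):

```python
import numpy as np

def ransac_line(pts, t=1.0, N=100, d=10, seed=0):
    rng = np.random.default_rng(seed)
    best_model, best_inliers = None, np.zeros(len(pts), dtype=bool)
    for _ in range(N):
        i, j = rng.choice(len(pts), size=2, replace=False)   # seed group, s = 2
        (x1, y1), (x2, y2) = pts[i], pts[j]
        a, b = y2 - y1, x1 - x2              # line a x + b y + c = 0 through the pair
        c = -(a * x1 + b * y1)
        norm = np.hypot(a, b)
        if norm == 0:
            continue                          # degenerate (coincident) sample
        dist = np.abs(pts @ np.array([a, b]) + c) / norm
        inliers = dist < t
        if inliers.sum() >= d and inliers.sum() > best_inliers.sum():
            best_model, best_inliers = (a, b, c), inliers
    # Optionally re-fit by least squares on best_inliers, per step 4 above.
    return best_model, best_inliers
```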

RANSAC (RANdom SAmple Consensus), Fischler & Bolles, 1981
Line fitting example. Algorithm:
1. Sample (randomly) the number of points required to fit the model (here # = 2).
2. Solve for the model parameters using the samples.
3. Score by the fraction of inliers within a preset threshold of the model.
Repeat 1-3 until the best model is found with high confidence.
Speaker notes: consider the standard line fitting problem in the presence of outliers. We want the best partition of the points into an inlier set and an outlier set, adjusting the parameters of the model function so as to best fit the inliers; "best" means the fitted line's residuals fall below a threshold δ.
Silvio Savarese

Review: Keypoint matching for search
1. Find a set of distinctive keypoints in the query and database images.
2. Define a region around each keypoint (window).
3. Compute a local descriptor from the region.
4. Match descriptors.
Adapted from K. Grauman, B. Leibe

Example: solving for translation
Given matched points in {A} and {B}, estimate the translation of the object.
Derek Hoiem

Example: solving for translation
Least squares solution:
1. Write down the objective function in the form Ax = b.
2. Solve using the pseudo-inverse or eigenvalue decomposition.
Derek Hoiem
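For translation the Ax = b setup is tiny; a NumPy sketch with made-up matched points:

```python
import numpy as np

ptsA = np.array([[1.0, 2.0], [3.0, 4.0], [5.0, 1.0]])   # points in image A
ptsB = ptsA + np.array([2.0, -1.0])                     # matched points in image B

# Each match contributes two rows: x_B = x_A + tx and y_B = y_A + ty.
A = np.tile(np.eye(2), (len(ptsA), 1))    # stacked [1 0; 0 1] blocks
b = (ptsB - ptsA).reshape(-1)             # stacked coordinate differences
t, *_ = np.linalg.lstsq(A, b, rcond=None)
print(t)                                  # -> [ 2. -1.], i.e., (tx, ty)
```

With only translation, the least-squares solution is simply the mean displacement of the matches.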

Example: solving for translation
Problem: outliers. RANSAC solution:
1. Sample a set of matching points (1 pair).
2. Solve for the transformation parameters.
3. Score the parameters by the number of inliers.
Repeat steps 1-3 N times.
Derek Hoiem

Example: solving for translation
Problem: outliers, multiple objects, and/or many-to-one matches. Hough transform solution:
1. Initialize a grid of parameter values.
2. Each matched pair casts a vote for consistent values.
3. Find the parameters with the most votes.
4. Solve using least squares with the inliers.
Derek Hoiem

Fitting and Matching: Summary
Fitting problems require finding any supporting evidence for a model, even within clutter and with missing features. Voting and inlier approaches, such as the Hough transform and RANSAC, make it possible to find likely model parameters without searching all combinations of features. We can use these approaches to compute robust feature alignment/matching and to match object templates.
Adapted from Kristen Grauman and Derek Hoiem

The remaining slides show another more detailed example of RANSAC, and pros/cons: Review on your own time

RANSAC for line fitting example
[Figure sequence:] Start from data with outliers, where the least-squares fit is skewed by them. Then iterate:
1. Randomly select a minimal subset of points.
2. Hypothesize a model.
3. Compute the error function.
4. Select the points consistent with the model.
5. Repeat the hypothesize-and-verify loop.
Eventually an uncontaminated sample yields a model supported by the true inliers.
Source: R. Raguram

How to choose parameters?
- Number of samples N: choose N so that, with probability p, at least one random sample is free from outliers (e.g., p = 0.99), given the outlier ratio e.
- Number of sampled points s: the minimum number needed to fit the model.
- Distance threshold δ: choose δ so that a good point with noise is likely (e.g., with probability 0.95) to fall within the threshold.
Explanation in Szeliski 6.1.4.
N as a function of sample size s and proportion of outliers e:
      e = 5%  10%  20%  25%  30%  40%   50%
s = 2     2    3    5    6    7   11    17
s = 3     3    4    7    9   11   19    35
s = 4     3    5    9   13   17   34    72
s = 5     4    6   12   17   26   57   146
s = 6     4    7   16   24   37   97   293
s = 7     4    8   20   33   54  163   588
s = 8     5    9   26   44   78  272  1177
Marc Pollefeys and Derek Hoiem
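The N column of the table comes from requiring 1 - p = (1 - (1 - e)^s)^N, i.e., N = log(1 - p) / log(1 - (1 - e)^s); a quick check in Python:

```python
import math

def ransac_iterations(s, e, p=0.99):
    # Smallest N with probability >= p of at least one outlier-free sample.
    return math.ceil(math.log(1 - p) / math.log(1 - (1 - e) ** s))

print(ransac_iterations(s=2, e=0.05))   # -> 2, matching the table
print(ransac_iterations(s=8, e=0.50))   # -> 1177
```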

RANSAC pros and cons
Pros:
- Simple and general.
- Applicable to many different problems.
- Often works well in practice.
Cons:
- Lots of parameters to tune (see previous slide).
- Doesn't work well for low inlier ratios (too many iterations, or can fail completely).
- Can't always get a good initialization of the model based on the minimum number of samples.
Common applications: image stitching, relating two views, spatial verification.
Svetlana Lazebnik