Model Fitting Computer Vision CS 143, Brown James Hays 10/03/11 Slides from Silvio Savarese, Svetlana Lazebnik, and Derek Hoiem.

Fitting: find the parameters of a model that best fit the data.
Alignment: find the parameters of the transformation that best align matched points.

Example: Computing vanishing points Slide from Silvio Savarese

Example: Estimating a homographic transformation H. Slide from Silvio Savarese

Example: Estimating the “fundamental matrix” that relates two views. Slide from Silvio Savarese

Example: Fitting a 2D shape template. Slide from Silvio Savarese

Example: fitting a 3D object model Slide from Silvio Savarese

Critical issues: noisy data Slide from Silvio Savarese

Critical issues: intra-class variability. “All models are wrong, but some are useful.” (Box and Draper, 1979) Slide from Silvio Savarese

Critical issues: outliers. Slide from Silvio Savarese

Critical issues: missing data (occlusions) Slide from Silvio Savarese

Fitting and Alignment: Design challenges
- Design a suitable goodness-of-fit measure
  - Similarity should reflect application goals
  - Encode robustness to outliers and noise
- Design an optimization method
  - Avoid local optima
  - Find the best parameters quickly
Slide from Derek Hoiem

Fitting and Alignment: Methods
Global optimization / search for parameters
- Least squares fit
- Robust least squares
- Iterative closest point (ICP)
Hypothesize and test
- Generalized Hough transform
- RANSAC
Slide from Derek Hoiem

Simple example: Fitting a line Slide from Derek Hoiem

Least squares line fitting. Data: (x_1, y_1), …, (x_n, y_n). Line equation: y_i = m x_i + b. Find (m, b) to minimize $E = \sum_{i=1}^{n} (y_i - m x_i - b)^2$. Equivalently, stack rows $[x_i \; 1]$ into a matrix A and solve $\min_p \|A p - y\|^2$ for $p = [m; b]$. Matlab: p = A \ y. Modified from S. Lazebnik
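A minimal Matlab sketch of the vertical least-squares fit above; the data points xs, ys are made-up illustrative values, not from the slides.

```matlab
% Least-squares line fit y = m*x + b (vertical distances).
xs = [0; 1; 2; 3; 4];              % example data (made up)
ys = [0.1; 1.9; 4.2; 5.8; 8.1];
A  = [xs, ones(numel(xs), 1)];     % each row is [x_i, 1]
p  = A \ ys;                       % minimizes ||A*p - ys||^2
m  = p(1);  b = p(2);
fprintf('m = %.3f, b = %.3f\n', m, b);
```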

Problem with “vertical” least squares: not rotation-invariant; fails completely for vertical lines. Slide from S. Lazebnik

Total least squares. Line: ax + by + c = 0, unit normal N = (a, b). If $a^2 + b^2 = 1$, then the distance between point (x_i, y_i) and the line is $|a x_i + b y_i + c|$. Slide modified from S. Lazebnik

Total least squares. If $a^2 + b^2 = 1$, then the distance between point (x_i, y_i) and the line ax + by + c = 0 (unit normal N = (a, b)) is $|a x_i + b y_i + c|$. Find (a, b, c) to minimize the sum of squared perpendicular distances $E = \sum_i (a x_i + b y_i + c)^2$. Slide modified from S. Lazebnik

Total least squares. Find (a, b, c) to minimize the sum of squared perpendicular distances $E = \sum_i (a x_i + b y_i + c)^2$ subject to $a^2 + b^2 = 1$. Setting the derivative with respect to c to zero shows the line passes through the centroid; the remaining problem over (a, b) is a Rayleigh quotient, and the solution is the eigenvector corresponding to the smallest eigenvalue of $A^T A$, where A stacks the centered points $(x_i - \bar{x}, y_i - \bar{y})$. Slide modified from S. Lazebnik
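A small Matlab sketch of the total least-squares fit above (recovering c through the centroid is the detail the slide leaves implicit); the data are made-up example values.

```matlab
% Total least squares: fit a*x + b*y + c = 0 minimizing perpendicular distances.
xs = [0; 1; 2; 3; 4];              % example data (made up)
ys = [0.1; 1.9; 4.2; 5.8; 8.1];
xm = mean(xs);  ym = mean(ys);
A  = [xs - xm, ys - ym];           % centered data points
[V, D]   = eig(A' * A);            % eigenvectors of A'A
[~, idx] = min(diag(D));           % pick the smallest eigenvalue
n  = V(:, idx);                    % unit normal (a, b)
a  = n(1);  b = n(2);
c  = -(a * xm + b * ym);           % line passes through the centroid
fprintf('a = %.3f, b = %.3f, c = %.3f\n', a, b, c);
```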

Recap: Two common optimization problems.
Problem: minimize $\|Ax - b\|^2$. Solution (Matlab): x = A \ b.
Problem: minimize $\|Ax\|^2$ subject to $\|x\| = 1$. Solution: x is the eigenvector of $A^T A$ with the smallest eigenvalue.

Least squares: Robustness to noise Least squares fit to the red points: Slides from Svetlana Lazebnik

Least squares: Robustness to noise Least squares fit with an outlier: Problem: squared error heavily penalizes outliers

Search / least squares conclusions
Good
- Clearly specified objective
- Optimization is easy (for least squares)
Bad
- Not appropriate for non-convex objectives; may get stuck in local minima
- Sensitive to outliers (bad matches, extra points)
- Doesn't allow you to get multiple good fits (detecting multiple objects, lines, etc.)
Slide from Derek Hoiem

Robust least squares (to deal with outliers). General approach: minimize $\sum_i \rho(u_i(x_i, \theta); \sigma)$, where $u_i(x_i, \theta)$ is the residual of the i-th point with respect to the model parameters $\theta$, and $\rho$ is a robust function with scale parameter $\sigma$. The robust function $\rho$ favors a configuration with small residuals and gives a constant penalty for large residuals. Slide from S. Savarese

Choosing the scale: Just right The effect of the outlier is minimized

Choosing the scale: Too small. The error value is almost the same for every point and the fit is very poor.

Choosing the scale: Too large Behaves much the same as least squares

Robust estimation: Details. Robust fitting is a nonlinear optimization problem that must be solved iteratively. The least squares solution can be used for initialization. Adaptive choice of scale: approximately 1.5 times the median residual (Forsyth & Ponce).
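The slides do not commit to a particular robust function, so the sketch below assumes the common choice $\rho(u; \sigma) = u^2 / (\sigma^2 + u^2)$ and solves the nonlinear problem by iteratively reweighted least squares, initialized from the ordinary least-squares fit and using the adaptive scale above. The data are made-up example values with one outlier.

```matlab
% Robust line fit via iteratively reweighted least squares (IRLS).
% Assumes rho(u; sigma) = u^2 / (sigma^2 + u^2); another robust function
% would only change the weight formula below.
xs = [0; 1; 2; 3; 4; 5];            % example data with an outlier (made up)
ys = [0.2; 1.1; 2.1; 2.9; 4.2; 30];
A  = [xs, ones(numel(xs), 1)];
p  = A \ ys;                        % least-squares initialization
for it = 1:20
    u     = ys - A * p;                        % residuals
    sigma = max(1.5 * median(abs(u)), 1e-6);   % adaptive scale: approx. 1.5 * median residual
    w     = sigma^2 ./ (sigma^2 + u.^2).^2;    % IRLS weights for this rho
    p     = (A' * diag(w) * A) \ (A' * diag(w) * ys);  % weighted least squares
end
fprintf('robust fit: m = %.3f, b = %.3f\n', p(1), p(2));
```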

Hypothesize and test
1. Propose parameters
   - Try all possible
   - Each point votes for all consistent parameters
   - Repeatedly sample enough points to solve for parameters
2. Score the given parameters
   - Number of consistent points, possibly weighted by distance
3. Choose from among the set of parameters
   - Global or local maximum of scores
4. Possibly refine parameters using inliers

Hough transform. Given a set of points, find the curve or line that explains the data points best. A line y = m x + b in image space (x, y) corresponds to a single point in (m, b) Hough space. P.V.C. Hough, Machine Analysis of Bubble Chamber Pictures, Proc. Int. Conf. High Energy Accelerators and Instrumentation, 1959. Slide from S. Savarese

Hough transform. Conversely, a single point (x, y) in image space maps to a line in (m, b) Hough space: the set of all lines passing through that point. Slide from S. Savarese

Hough transform. Issue: the parameter space [m, b] is unbounded… Use a polar representation for the parameter space instead: $x \cos\theta + y \sin\theta = \rho$, with bounded $\theta$ and $\rho$. P.V.C. Hough, Machine Analysis of Bubble Chamber Pictures, Proc. Int. Conf. High Energy Accelerators and Instrumentation, 1959. Slide from S. Savarese
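A compact Matlab sketch of the voting procedure with the polar parameterization; the points, bin sizes, and peak selection are illustrative choices, not values from the slides.

```matlab
% Hough transform for lines: rho = x*cos(theta) + y*sin(theta).
pts    = [0 0; 1 1; 2 2; 3 3; 5 1];          % example points (made up)
thetas = (0:179) * pi / 180;                 % 1-degree theta bins
rhoMax = ceil(max(sqrt(sum(pts.^2, 2))));
rhos   = -rhoMax:1:rhoMax;                   % 1-unit rho bins
acc    = zeros(numel(rhos), numel(thetas));  % accumulator array
for i = 1:size(pts, 1)
    for j = 1:numel(thetas)
        r = pts(i,1) * cos(thetas(j)) + pts(i,2) * sin(thetas(j));
        [~, k] = min(abs(rhos - r));         % nearest rho bin
        acc(k, j) = acc(k, j) + 1;           % the point votes for this (rho, theta)
    end
end
[votes, idx] = max(acc(:));                  % strongest line = accumulator peak
[k, j] = ind2sub(size(acc), idx);
fprintf('best line: rho = %.1f, theta = %.0f deg, %d votes\n', ...
        rhos(k), thetas(j) * 180 / pi, votes);
```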

Hough transform experiments on noisy data (the slide shows the detected features next to the accumulated votes). Issue: the grid (bin) size needs to be adjusted… Slide from S. Savarese

Generalized Hough transform. We want to find a template defined by its reference point (center) c and several distinct types of landmark points in a stable spatial configuration.

Generalized Hough transform. Template representation: for each type of landmark point, store all possible displacement vectors towards the center (building the model from the template).

Generalized Hough transform. Detecting the template: for each feature in a new test image, look up that feature type in the model and vote for the possible center locations associated with that type in the model (see the sketch below).
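A short Matlab sketch of this voting step; the displacement table, feature types, and positions are invented purely for illustration.

```matlab
% Generalized Hough voting: each detected feature votes for candidate
% template centers via the displacement vectors stored for its type.
dispTable = {[-3 0; -3 1], [2 -2]};        % displacements per landmark type (made up)
featType  = [1; 2; 1];                     % detected feature types (made up)
featPos   = [10 10; 15 8; 11 11];          % detected feature positions (made up)
acc = zeros(20, 20);                       % accumulator over center locations
for i = 1:numel(featType)
    d = dispTable{featType(i)};
    for k = 1:size(d, 1)
        c = round(featPos(i,:) + d(k,:));  % candidate center location
        if all(c >= 1) && all(c <= 20)
            acc(c(1), c(2)) = acc(c(1), c(2)) + 1;
        end
    end
end
[votes, idx] = max(acc(:));                % detected center = accumulator peak
[cx, cy] = ind2sub(size(acc), idx);
fprintf('most-voted center: (%d, %d) with %d votes\n', cx, cy, votes);
```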

Application in recognition. Index displacements by “visual codeword”. B. Leibe, A. Leonardis, and B. Schiele, Combined Object Categorization and Segmentation with an Implicit Shape Model, ECCV Workshop on Statistical Learning in Computer Vision, 2004. (Training image; visual codeword with displacement vectors.)

Application in recognition. Index displacements by “visual codeword”. B. Leibe, A. Leonardis, and B. Schiele, Combined Object Categorization and Segmentation with an Implicit Shape Model, ECCV Workshop on Statistical Learning in Computer Vision, 2004. (Test image.)

Hough transform conclusions
Good
- Robust to outliers: each point votes separately
- Fairly efficient (often faster than trying all sets of parameters)
- Provides multiple good fits
Bad
- Some sensitivity to noise
- Bin size trades off between noise tolerance, precision, and speed/memory; it can be hard to find the sweet spot
- Not suitable for more than a few parameters: grid size grows exponentially
Common applications
- Line fitting (also circles, ellipses, etc.)
- Object instance recognition (parameters are the affine transform)
- Object category recognition (parameters are position/scale)

RANSAC (RANdom SAmple Consensus), Fischler & Bolles '81: a learning technique to estimate the parameters of a model by random sampling of the observed data. Source: Savarese

RANSAC (RANdom SAmple Consensus), Fischler & Bolles '81. Algorithm:
1. Sample (randomly) the number of points required to fit the model
2. Solve for model parameters using the samples
3. Score by the fraction of inliers within a preset threshold of the model
Repeat 1-3 until the best model is found with high confidence

RANSAC line fitting example (minimum sample size #=2), following the same steps:
1. Sample (randomly) the 2 points required to fit the line
2. Solve for the line parameters using the samples
3. Score by the fraction of inliers within a preset threshold of the line
Repeat 1-3 until the best model is found with high confidence (see the sketch below). Illustration by Savarese
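A Matlab sketch of this line-fitting loop; the synthetic data, inlier threshold, and iteration count are illustrative choices, not values from the slides.

```matlab
% RANSAC line fit: repeatedly sample 2 points, fit a line, count inliers.
t    = (0:0.1:1)';
good = [t, 0.5*t + 0.2 + 0.01*randn(size(t))];  % points near a line (made up)
pts  = [good; rand(20, 2)];                     % plus 20 random outliers
thresh = 0.02;  nIter = 500;
bestInliers = 0;  bestLine = [0 0 0];
for it = 1:nIter
    s = pts(randperm(size(pts, 1), 2), :);      % 1. sample 2 points
    d = s(2,:) - s(1,:);
    n = [-d(2), d(1)] / norm(d);                % 2. line through them (unit normal)
    c = -n * s(1,:)';
    dist = abs(pts * n' + c);                   % point-to-line distances
    nin  = sum(dist < thresh);                  % 3. count inliers
    if nin > bestInliers
        bestInliers = nin;  bestLine = [n, c];
    end
end
fprintf('best line %.2f x + %.2f y + %.2f = 0 with %d inliers\n', ...
        bestLine(1), bestLine(2), bestLine(3), bestInliers);
```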

Choosing the parameters:
Initial number of points s: typically the minimum number needed to fit the model.
Distance threshold t: choose t so the probability that an inlier falls within it is p (e.g. 0.95); for zero-mean Gaussian noise with std. dev. σ, $t^2 = 3.84\sigma^2$.
Number of samples N: choose N so that, with probability p, at least one random sample is free from outliers (e.g. p = 0.99), given the outlier ratio e; this gives $N = \log(1 - p) / \log(1 - (1 - e)^s)$.
(The slide tabulates N for several sample sizes s and outlier ratios e = 5%, 10%, 20%, 25%, 30%, 40%, 50%.)
Source: M. Pollefeys
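The tabulated values can be regenerated directly from the formula above; this small Matlab snippet does so for a few illustrative (s, e) combinations.

```matlab
% Number of RANSAC samples N so that, with probability p, at least one
% sample of s points is outlier-free:  N = log(1-p) / log(1 - (1-e)^s).
p = 0.99;
for s = [2 3 4]                   % minimum sample sizes to compare
    for e = [0.05 0.2 0.5]        % outlier ratios
        N = ceil(log(1 - p) / log(1 - (1 - e)^s));
        fprintf('s = %d, e = %.2f  ->  N = %d\n', s, e, N);
    end
end
```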

RANSAC conclusions
Good
- Robust to outliers
- Applicable for a larger number of parameters than the Hough transform
- Parameters are easier to choose than for the Hough transform
Bad
- Computational time grows quickly with the fraction of outliers and the number of parameters
- Not good for getting multiple fits
Common applications
- Computing a homography (e.g., image stitching)
- Estimating the fundamental matrix (relating two views)

What if you want to align but have no prior matched pairs? The Hough transform and RANSAC are not applicable.
Important applications:
- Medical imaging: match brain scans or contours
- Robotics: match point clouds
Slide from Derek Hoiem

Iterative Closest Points (ICP) Algorithm
Goal: estimate the transform between two dense sets of points
1. Assign each point in {Set 1} to its nearest neighbor in {Set 2}
2. Estimate transformation parameters, e.g., by least squares or robust least squares
3. Transform the points in {Set 1} using the estimated parameters
4. Repeat steps 1-3 until the change is very small
Slide from Derek Hoiem
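A Matlab sketch of the loop above, restricted to a pure 2-D translation for brevity (step 2 could equally estimate a rotation as well); the two point sets are made-up examples.

```matlab
% Iterative closest point (translation-only variant).
set1 = [0 0; 1 0; 0 1; 1 1];                    % example point set (made up)
set2 = set1 + repmat([3 2], size(set1, 1), 1);  % set2 is set1 shifted by (3, 2)
t = [0 0];                                      % current translation estimate
for it = 1:20
    moved = set1 + repmat(t, size(set1, 1), 1);
    nn = zeros(size(set1, 1), 1);
    for i = 1:size(moved, 1)                    % 1. nearest neighbor in set2
        d2 = sum((set2 - repmat(moved(i,:), size(set2, 1), 1)).^2, 2);
        [~, nn(i)] = min(d2);
    end
    t = t + mean(set2(nn, :) - moved, 1);       % 2./3. least-squares translation update
end                                             % 4. repeat until convergence
fprintf('estimated translation: (%.2f, %.2f)\n', t(1), t(2));
```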

Example: solving for translation. Given matched points in {A} and {B} (A1–A3 matched to B1–B3 in the figure), estimate the translation of the object.

Example: solving for translation. Least squares solution for (tx, ty):
1. Write down the objective function
2. Derived solution: (a) compute the derivative, (b) compute the solution
3. Computational solution: (a) write in the form Ax = b, (b) solve using the pseudo-inverse or eigenvalue decomposition (see the sketch below)
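One way to carry out step 3 in Matlab: each match contributes the two equations $x_i^B = x_i^A + t_x$ and $y_i^B = y_i^A + t_y$, stacked into Ax = b. The matched coordinates are made-up example values.

```matlab
% Least-squares translation from matched point pairs.
A_pts = [0 0; 2 1; 3 4];                 % matched points in {A} (made up)
B_pts = [1 2; 3 3; 4 6];                 % matched points in {B} (made up)
n = size(A_pts, 1);
A = repmat(eye(2), n, 1);                % 2n x 2 stack of identity blocks
b = reshape((B_pts - A_pts)', [], 1);    % stacked (dx, dy) per match
t = A \ b;                               % pseudo-inverse (least-squares) solution
fprintf('tx = %.2f, ty = %.2f\n', t(1), t(2));
```

For pure translation this reduces to the mean of the coordinate differences, but the stacked Ax = b form generalizes to richer transformations.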

Example: solving for translation. RANSAC solution for (tx, ty):
1. Sample a set of matching points (1 pair)
2. Solve for the transformation parameters
3. Score the parameters with the number of inliers
4. Repeat steps 1-3 N times
Problem: outliers (mismatched pairs such as A4, A5 and B4, B5 in the figure)

Example: solving for translation. Hough transform solution for (tx, ty):
1. Initialize a grid of parameter values
2. Each matched pair casts a vote for consistent values
3. Find the parameters with the most votes
4. Solve using least squares with the inliers
Problem: outliers, multiple objects, and/or many-to-one matches

Example: solving for translation. ICP solution for (tx, ty):
1. Find the nearest neighbors for each point
2. Compute the transform using the matches
3. Move the points using the transform
4. Repeat steps 1-3 until convergence
Problem: no initial guesses for correspondence