RANdom SAmple Consensus


RANSAC (RANdom SAmple Consensus)
Prof. Mariolino De Cecco, Ing. Luca Baglivo, Ing. Nicolò Blasi, Ilya Afanasyev
Department of Structural Mechanical Engineering, University of Trento
Email: mariolino.dececco@ing.unitn.it
http://www.mariolinodececco.altervista.org

Regression analysis - fundamentals
Regression analysis determines the parameters of a model so that the model interprets the experimental data "as well as possible" through an algebraic input-output relationship.

Regression analysis - purpose
Determine the parameters ci, i = 1, ..., m, from repeated measurements of the quantities xi, i = 1, ..., N, and of the corresponding outputs yi, given a chosen model type, by minimizing a suitable performance index.

The set of parameters is determined by the least-squares method
In the least-squares method the performance index is the sum of the squared deviations (also called residuals):

J(c) = Σ ei^2 = Σ (yi − ŷ(xi; c))^2,  i = 1, …, N

where ei is the residual of the i-th measurement, i.e. the difference between the measured output yi and the model output ŷ(xi; c).

Linear case - the least-squares line
For a line y = a·x + b the residuals ei = yi − (a·xi + b) are linear functions of the parameters (a, b) to be determined. Hence fitting with any polynomial can be solved in an analogous way.
[Figure: N experimental data points (x, y) and the least-squares line with parameters (a, b).]

The least-squares line (continued) - problem formulation
The performance index J(a, b) = Σ (yi − a·xi − b)^2 is quadratic in the parameters, so the solution has a single minimum.

The least-squares line (continued) - solution
Setting the derivatives of J with respect to a and b to zero yields the standard closed-form solution:

a = (N Σ xi yi − Σ xi Σ yi) / (N Σ xi^2 − (Σ xi)^2)
b = (Σ yi − a Σ xi) / N

where all sums run over i = 1, …, N.
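A minimal MATLAB sketch of this closed-form fit (the data and noise level are illustrative; polyfit is used only as a cross-check):

% Least-squares line fit y = a*x + b via the normal equations
x = (0:0.5:5)';                        % example inputs
y = 2*x + 1 + 0.1*randn(size(x));      % example noisy outputs
N = length(x);
Sx = sum(x);  Sy = sum(y);
Sxy = sum(x.*y);  Sxx = sum(x.^2);
a = (N*Sxy - Sx*Sy) / (N*Sxx - Sx^2);
b = (Sy - a*Sx) / N;
p = polyfit(x, y, 1);                  % cross-check: p(1) should match a, p(2) should match b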

Fitting with outliers
What happens if we have outliers?
[Figure: a least-squares fitted line skewed by a single outlier.]

Let us formalize a more general way of fitting data with outliers. The problem can be stated in the following way: given a set of 2D data points, find the line which minimizes the sum of squared perpendicular distances (orthogonal regression), subject to the condition that none of the valid points deviates from this line by more than a threshold t. This is actually two problems: the classification of the data into inliers (valid points) and outliers, and the line fit to the inlying data. The threshold t is set according to the measurement noise (for example t = 3σ, with σ the standard deviation of the noise).
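In standard orthogonal-regression notation (a reconstruction, not verbatim from the slide): for a line a·x + b·y + c = 0 normalized so that a^2 + b^2 = 1, the perpendicular distance of a point (xi, yi) is di = |a·xi + b·yi + c|; the fit minimizes Σ di^2 over the inliers, and a point is classified as an inlier when di ≤ t.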

Fitting with outliers
The key idea is very simple. Repeat N times the following:
- select two points at random
- compute the line connecting them
- measure the support for this line as the number of points that lie within the distance threshold t
The line with the most support is deemed the robust fit. The points within the threshold distance are the inliers (and constitute the consensus set). A minimal sketch of this procedure is given after this slide.
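A minimal MATLAB sketch of this line-fitting RANSAC (the function name, the 2 x Npts layout of xy, and the parameters are illustrative assumptions):

% RANSAC line fit: xy is a 2 x Npts array, t the distance threshold, N the trials
function [bestPair, bestInliers] = ransacline(xy, t, N)
npts = size(xy, 2);
bestCount = 0;
for k = 1:N
    idx = randperm(npts, 2);            % select two distinct points at random
    p1 = xy(:, idx(1));  p2 = xy(:, idx(2));
    u = (p2 - p1) / norm(p2 - p1);      % unit direction of the connecting line
    n = [-u(2); u(1)];                  % unit normal to the line
    d = abs(n' * (xy - p1));            % perpendicular distance of every point
    inliers = find(d < t);              % consensus set for this candidate line
    if numel(inliers) > bestCount       % keep the line with the most support
        bestCount = numel(inliers);
        bestPair = [p1, p2];
        bestInliers = inliers;
    end
end
end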

The intuition is that if one of the two sampled points is an outlier, then the line through them will not gain much support.

Scoring a line by its support has the additional advantage of favouring better fits. The line (a, b) has a support of 10, whereas the line (a, d), whose sample points are close neighbours, has a support of only 4. Consequently, even though both samples contain no outliers, the line (a, b) will be selected.

As stated by Fischler and Bolles [Fischler, 1981]: "The RANSAC procedure is opposite to that of conventional smoothing techniques: rather than using as much of the data as possible to obtain an initial solution and then attempting to eliminate the invalid data points (for example by means of the Chauvenet criterion), RANSAC uses as small an initial data set as feasible and enlarges this set with consistent data when possible."

Algorithm
Objective: robust fit of a model to a data set S which contains outliers.
(i) Randomly select a sample of s data points from S and instantiate the model from this subset.
(ii) Determine the set of data points Si which are within a distance threshold t of the model. The set Si is the consensus set of the sample and defines the inliers of S.
(iii) If the size of Si (the number of inliers) is greater than some threshold T, re-estimate the model using all the points in Si and terminate.
(iv) If the size of Si is less than T, select a new subset and repeat the above.
(v) After N trials the largest consensus set Si is selected, and the model is re-estimated using all the points in the subset Si.

Algorithm revised
1. Select a sample of s data points.
2. Verify the sample data (this checks whether the sample is able to provide a reliable fit).
3. Use the s data points to estimate the model.
4. Determine the consensus set Si and its cardinality ns.
5. If ns > T, use Si to estimate the model (this drives the result towards a maximum-likelihood solution); otherwise go back to step 1.
6. Determine the new consensus set Si and its cardinality ns,new.
7. If ns,new = ns,old, stop; otherwise repeat from step 5.
A sketch of the re-estimation loop of steps 5-7 is given after this slide.
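A minimal MATLAB sketch of the re-estimation loop of steps 5-7 (fitmodel and consensus are hypothetical placeholders for the model fit and the inlier test, not functions from the original code):

% Iterative re-estimation: refit on the inliers until the consensus set stabilizes
model = fitmodel(data(:, sampleIdx));   % model from the minimal sample
Si = consensus(model, data, t);         % indices of the current inliers
nOld = numel(Si);
while true
    model = fitmodel(data(:, Si));      % least-squares fit on all current inliers
    Si = consensus(model, data, t);     % recompute the consensus set
    if numel(Si) == nOld, break; end    % stop when the inlier count stabilizes
    nOld = numel(Si);
end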

Number of samples
It is often computationally infeasible, and unnecessary, to try every possible sample. Instead the number of samples N is chosen sufficiently high to ensure, with a probability p, that at least one of the random samples of s points is free from outliers. Usually p is chosen as 0.99.
[Table: the number N of samples required to ensure, with probability p = 0.99, that at least one sample has no outliers, for a given sample size s and proportion of outliers ε.]
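The standard relation behind that table (the formula itself did not survive in the transcript) requires that the probability of drawing N consecutive contaminated samples equal 1 − p, i.e. (1 − (1 − ε)^s)^N = 1 − p, hence N = log(1 − p) / log(1 − (1 − ε)^s). A small MATLAB check with a couple of worked values:

% Required number of RANSAC samples for confidence p
p = 0.99;                                                        % required confidence
ransacN = @(s, epsOut) ceil(log(1 - p) ./ log(1 - (1 - epsOut).^s));
ransacN(2, 0.5)   % line from s = 2 points, 50% outliers -> 17 samples
ransacN(3, 0.3)   % plane from s = 3 points, 30% outliers -> 11 samples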

Number of samples - consensus set
For a given ε and p, the number of samples increases with the size of the minimal subset. It might be thought advantageous to use more than the minimal subset (three or more points in the case of a line), because a better estimate of the line would then be obtained, and the measured support would more accurately reflect the true support. However, this possible advantage in measuring support is generally outweighed by the severe increase in computational cost incurred by the increased number of samples.
How large is an acceptable consensus set? A rule of thumb is to terminate when the size of the consensus set is similar to the number of inliers believed to be in the data set, given the assumed proportion of outliers, i.e. for n data points T ≈ (1 − ε) n (for example, n = 100 points and ε = 0.2 give T ≈ 80).

RANSAC - conclusions
An advantage of RANSAC is its ability to achieve robust estimation of the model parameters, i.e., it can estimate the parameters with a high degree of accuracy even when a significant proportion of outliers is present in the data set.
A disadvantage of RANSAC is that there is no upper bound on the time it takes to compute these parameters. When an upper time bound is used (a maximum number of iterations), the solution obtained may not be the optimal one; it may not even be one that fits the data well. RANSAC produces a reasonable model only with a certain probability, which grows as more iterations are used.
Another disadvantage of RANSAC is that it requires the setting of problem-specific thresholds.

Example - plane fitting
Suppose we have acquired the 3D point cloud of a cube.
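The Points.mat data set used below is not included in the transcript; a minimal sketch to synthesize a comparable noisy cube cloud for testing (the face sampling density and noise level are illustrative assumptions):

% Synthesize a noisy point cloud on the faces of a unit cube (Npts x 3)
nPerFace = 500;
Points = [];
for face = 1:6
    ax  = ceil(face/2);                          % axis normal to this face (1, 2 or 3)
    val = mod(face, 2);                          % face lies at coordinate 0 or 1
    P = zeros(nPerFace, 3);
    P(:, setdiff(1:3, ax)) = rand(nPerFace, 2);  % spread points over the face
    P(:, ax) = val;
    Points = [Points; P];                        % accumulate the six faces
end
Points = Points + 0.005*randn(size(Points));     % add measurement noise
save('Points.mat', 'Points');                    % same variable name as the demo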

Example - main function

close all
clear all

%% LOAD AND PLOT THE DATA SET
load('Points.mat');
all_point = Points(:, 1:3);
all_point(:, 1) = -all_point(:, 1);   % Kinect convention?

% Plot all points
figure('Name', 'All points')
hold on
axis equal
plot3(Points(:,1), Points(:,2), Points(:,3), '.')

%% FIT THE PLANE WITH RANSAC
% Decimate the points
ss = length(all_point);
ii = 1 : 10 : ss;
all_pointDec = all_point(ii, :);

% RANSAC fit
[B, P, inliers] = ransacfitplane(all_pointDec', 0.01, 1);

% Plot the result
plotPlane3D(all_point', B)

Example - RANSACFITPLANE

% RANSACFITPLANE - fits plane to 3D array of points using RANSAC
%
% Usage: [B, P, inliers] = ransacfitplane(XYZ, t, feedback)
%
% This function uses the RANSAC algorithm to robustly fit a plane
% to a set of 3D data points.
%
% Arguments:
%   XYZ      - 3xNpts array of xyz coordinates to fit plane to.
%   t        - The distance threshold between data point and the plane
%              used to decide whether a point is an inlier or not.
%   feedback - Optional flag 0 or 1 to turn on RANSAC feedback
%              information.
%
% Returns:
%   B       - 4x1 array of plane coefficients in the form
%             b(1)*X + b(2)*Y + b(3)*Z + b(4) = 0.
%             The magnitude of B is 1. This plane is obtained by a least
%             squares fit to all the points that were considered to be
%             inliers, hence this plane will be slightly different to
%             that defined by P below.
%   P       - The three points in the data set that were found to
%             define a plane having the most number of inliers.
%             The three columns of P define the three points.
%   inliers - The indices of the points that were considered
%             inliers to the fitted plane.
%
% Copyright (c) 2003-2008 Peter Kovesi

Example - RANSACFITPLANE (continued)

s = 3;  % Minimum number of points needed to fit a plane.

fittingfn = @defineplane;
distfn    = @planeptdist;
degenfn   = @isdegenerate;

[P, inliers] = ransac(XYZ, fittingfn, distfn, degenfn, s, t, feedback);

% Perform least-squares fit by means of the inlying points
B = fitplane(XYZ(:, inliers));

%------------------------------------------------------------------------
% Function to define a plane given 3 data points as required by
% RANSAC. In our case we use the 3 points directly to define the plane.
function P = defineplane(X)
P = X;

Example - RANSACFITPLANE (continued)

% Function to calculate the distances between a plane and an array of
% points in the matrix X, where each column holds the x, y, z coordinates
% (X: 3 x Npts array of xyz coordinates).
% The plane is defined by a 3x3 matrix P, whose three columns define
% three points lying within the plane.
function [inliers, P] = planeptdist(P, X, t)

n = cross(P(:,2)-P(:,1), P(:,3)-P(:,1));   % Plane normal.
n = n/norm(n);                             % Make it a unit vector.

npts = size(X, 2);
d = zeros(npts, 1);   % d will be an array of distance values.

% The perpendicular distance from the plane to each point:
% d = n' * ( X - P(:,1) )   [1 x Npts]
for i = 1:3
    d = d + ( X(i,:)' - P(i,1) ) * n(i);
end

inliers = find(abs(d) < t);

% Function to determine whether a set of 3 points is in a degenerate
% configuration for fitting a plane, as required by RANSAC. In this case
% they are degenerate if they are collinear.
function r = isdegenerate(X)
% The three columns of X are the coordinates of the 3 points.
r = iscolinear(X(:,1), X(:,2), X(:,3));

Example - OUTCOME
At the end we obtain the fitted 'floor' plane.
[Figure: the acquired point cloud with the fitted floor plane highlighted.]

Ideas for homework
- Use the camera view to assess consensus; colour or curvature information could be used.
- Use the 3D and colour information together to assess consensus.
- Modify the consensus criterion by simulating the laser view of the fitted object.
A sketch of one possible modified consensus test is given below.
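As one possible starting point for the first two ideas, a hedged MATLAB sketch of a consensus test combining plane distance with colour similarity (the function name, the RGB array C, the reference colour cRef, and the colour threshold tc are all illustrative assumptions, not part of the original assignment):

% Consensus test combining geometric distance and colour similarity.
% X: 3 x Npts points, C: 3 x Npts RGB values, n/p0: unit plane normal and a
% point on the plane, t: distance threshold, cRef/tc: colour reference/threshold.
function inliers = colourconsensus(X, C, n, p0, t, cRef, tc)
d  = abs(n' * (X - p0));             % perpendicular distance to the plane
dc = sqrt(sum((C - cRef).^2, 1));    % Euclidean distance in RGB space
inliers = find(d < t & dc < tc);     % accept points close in both senses
end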
