3D Geometry for Computer Graphics

Slide 2: The plan today
- Least squares approach
- General / polynomial fitting
- Linear systems of equations
- Local polynomial surface fitting

Slide 3: Motivation
Given data points P_i = (x_i, y_i), fit a function y = f(x) that is "close" to the points. [Figure: data points and a fitted curve in the xy-plane]

Slide 4: Motivation
Local surface fitting to 3D points.

Slide 5: Line fitting
Orthogonal-offset minimization: we have already learned this (PCA). [Figure: line with orthogonal offsets to the points]

Slide 6: Line fitting
y-offset minimization for points P_i = (x_i, y_i). [Figure: line with vertical offsets to the points]

Slide 7: Line fitting
Find a line y = ax + b that minimizes
E(a, b) = Σ_i [y_i − (ax_i + b)]²
E(a, b) is quadratic in the unknown parameters a, b. Another option would be, for example, Σ_i |y_i − (ax_i + b)|, but it is not differentiable and therefore harder to minimize.

Slide 8: Line fitting – LS minimization
To find the optimal a, b we differentiate E(a, b) and set the partial derivatives to zero:
∂E/∂a = Σ_i (−2x_i)[y_i − (ax_i + b)] = 0
∂E/∂b = Σ_i (−2)[y_i − (ax_i + b)] = 0

Slide 9: Line fitting – LS minimization
We get two linear equations for a, b:
Σ_i (−2x_i)[y_i − (ax_i + b)] = 0
Σ_i (−2)[y_i − (ax_i + b)] = 0

Slide 10: Line fitting – LS minimization
Dividing by −2 and expanding:
Σ_i [x_i y_i − a x_i² − b x_i] = 0
Σ_i [y_i − a x_i − b] = 0

Slide 11: Line fitting – LS minimization
Rearranged as a 2×2 linear system in a, b:
(Σ x_i²) a + (Σ x_i) b = Σ x_i y_i
(Σ x_i) a + (Σ 1) b = Σ y_i
(note that Σ 1 = n, the number of points).

Slide 12: Line fitting – LS minimization
Solve for a, b using, e.g., Gaussian elimination.
Question: why is this solution a minimum of the error function E(a, b) = Σ_i [y_i − (ax_i + b)]²? (E is a convex quadratic in a and b, so its unique stationary point is the global minimum.)
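As a concrete illustration, here is a minimal NumPy sketch of this 2×2 solve; the data arrays xs, ys are made up for the example:

```python
import numpy as np

# Hypothetical noisy data, roughly around the line y = 2x + 1
xs = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
ys = np.array([1.1, 2.9, 5.2, 6.8, 9.1])

# Build the 2x2 system from slide 11:
# (sum x_i^2) a + (sum x_i) b = sum x_i y_i
# (sum x_i)   a + (n)       b = sum y_i
M = np.array([[np.sum(xs**2), np.sum(xs)],
              [np.sum(xs),    len(xs)  ]])
rhs = np.array([np.sum(xs * ys), np.sum(ys)])

a, b = np.linalg.solve(M, rhs)  # Gaussian elimination under the hood
print(f"fitted line: y = {a:.3f} x + {b:.3f}")
```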

Slide 13: Fitting polynomials
[Figure: polynomial curve fitted to data points in the xy-plane]

Slide 14: Fitting polynomials
Decide on the degree of the polynomial, k. We want to fit
f(x) = a_k x^k + a_{k−1} x^{k−1} + … + a_1 x + a_0
Minimize:
E(a_0, a_1, …, a_k) = Σ_i [y_i − (a_k x_i^k + a_{k−1} x_i^{k−1} + … + a_1 x_i + a_0)]²
Setting each partial derivative to zero:
∂E/∂a_m = Σ_i (−2 x_i^m)[y_i − (a_k x_i^k + a_{k−1} x_i^{k−1} + … + a_0)] = 0, for m = 0, …, k

Slide 15: Fitting polynomials
We get a linear system of k+1 equations in the k+1 variables a_0, …, a_k; a sketch of setting it up and solving it follows.
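One way to assemble and solve this system in practice is via the Vandermonde matrix and a least-squares solver; a minimal NumPy sketch, where the degree k = 3 and the synthetic data are arbitrary choices for illustration:

```python
import numpy as np

def fit_polynomial(xs, ys, k):
    """Least-squares fit of a degree-k polynomial to (xs, ys).

    Returns the coefficients [a_0, a_1, ..., a_k]."""
    # Column j of A holds x_i^j, so A @ coeffs evaluates the polynomial.
    A = np.vander(xs, k + 1, increasing=True)
    coeffs, *_ = np.linalg.lstsq(A, ys, rcond=None)
    return coeffs

xs = np.linspace(-1.0, 1.0, 50)
ys = xs**3 - xs + 0.05 * np.random.randn(50)  # synthetic noisy data
print(fit_polynomial(xs, ys, 3))
```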

Slide 16: General parametric fitting
We can use this approach to fit any function f(x) that is specified by parameters a, b, c, …, as long as the expression f(x) depends linearly on those parameters.

Slide 17: General parametric fitting
We want to fit a function f_{a,b,c,…}(x) to the data points (x_i, y_i):
- Define E(a, b, c, …) = Σ_i [y_i − f_{a,b,c,…}(x_i)]²
- Differentiate with respect to each parameter and solve the resulting linear system.

Slide 18: General parametric fitting
It can even be some crazy function, say a mix of sines, exponentials, and powers of x. Or in general:
f(x) = α_1 f_1(x) + α_2 f_2(x) + … + α_k f_k(x)
where the f_j are fixed basis functions and the α_j are the unknown parameters.
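A minimal NumPy sketch of fitting such a function, assuming a hypothetical basis of sin(x), exp(x), and a constant (any fixed basis functions would do, as long as f is linear in the α's):

```python
import numpy as np

# Hypothetical basis: f(x) = alpha_1*sin(x) + alpha_2*exp(x) + alpha_3*1.
basis = [np.sin, np.exp, lambda x: np.ones_like(x)]

xs = np.linspace(0.0, 2.0, 30)
ys = 1.5 * np.sin(xs) + 0.3 * np.exp(xs) - 2.0  # synthetic data

# A_{ij} = f_j(x_i): one column per basis function.
A = np.column_stack([f(xs) for f in basis])
alphas, *_ = np.linalg.lstsq(A, ys, rcond=None)
print(alphas)  # should be close to [1.5, 0.3, -2.0]
```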

Slide 19: Solving linear systems in the LS sense
Let's look at the problem a little differently:
- We have data points (x_i, y_i).
- We want the function f(x) to go through the points: y_i = f(x_i) for i = 1, …, n.
- Strict interpolation is in general not possible. For polynomials, n+1 points define a unique interpolation polynomial of degree n; so if we have 1000 points and want a cubic polynomial, we probably won't find one that passes through all of them.

Slide 20: Solving linear systems in the LS sense
We have an over-determined linear system (n equations in k unknowns, n > k):
f(x_1) = α_1 f_1(x_1) + α_2 f_2(x_1) + … + α_k f_k(x_1) = y_1
f(x_2) = α_1 f_1(x_2) + α_2 f_2(x_2) + … + α_k f_k(x_2) = y_2
…
f(x_n) = α_1 f_1(x_n) + α_2 f_2(x_n) + … + α_k f_k(x_n) = y_n

Slide 21: Solving linear systems in the LS sense
In matrix form:
| f_1(x_1) f_2(x_1) … f_k(x_1) | | α_1 |   | y_1 |
| f_1(x_2) f_2(x_2) … f_k(x_2) | | α_2 | = | y_2 |
|    ⋮        ⋮          ⋮     | |  ⋮  |   |  ⋮  |
| f_1(x_n) f_2(x_n) … f_k(x_n) | | α_k |   | y_n |

Slide 22: Solving linear systems in the LS sense
In matrix form: Av = y, where A_{ij} = f_j(x_i), v = (α_1, …, α_k)ᵀ and y = (y_1, …, y_n)ᵀ.

Slide 23: Solving linear systems in the LS sense
More constraints than variables, so in general no exact solution exists. We want to find an "approximate solution": the v that minimizes ‖Av − y‖².

Slide 24: Finding the LS solution
v ∈ R^k and Av ∈ R^n. As we vary v, Av varies over the linear subspace of R^n spanned by the columns of A:
Av = α_1 A_1 + α_2 A_2 + … + α_k A_k
where A_j denotes the j-th column of A.

Slide 25: Finding the LS solution
We want to find the Av closest to y. [Figure: y in R^n and its orthogonal projection onto the subspace spanned by the columns of A]

Slide 26: Finding the LS solution
The vector Av closest to y satisfies (Av − y) ⊥ {subspace spanned by A's columns}, i.e. for every column A_i:
⟨A_i, Av − y⟩ = 0
A_iᵀ (Av − y) = 0 for all i
Aᵀ (Av − y) = 0
(AᵀA) v = Aᵀ y
These are called the normal equations.

Slide 27: Finding the LS solution
We got a square, symmetric k×k system (AᵀA)v = Aᵀy. If A has full column rank (the columns of A are linearly independent), then AᵀA is invertible and the system has a unique solution.
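A practical note: in floating point, forming AᵀA squares the condition number of A, so library solvers based on QR or the SVD are usually preferred over solving the normal equations directly. A small NumPy sketch comparing the two routes on random data:

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((100, 4))   # n = 100 constraints, k = 4 unknowns
y = rng.standard_normal(100)

# Normal equations: (A^T A) v = A^T y -- fine when A^T A is well conditioned.
v_normal = np.linalg.solve(A.T @ A, A.T @ y)

# QR/SVD-based solver; more robust, since cond(A^T A) = cond(A)^2.
v_lstsq, *_ = np.linalg.lstsq(A, y, rcond=None)

print(np.allclose(v_normal, v_lstsq))  # True for well-conditioned A
```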

Slide 28: Weighted least squares
Sometimes the problem also attaches weights w_i > 0 to the constraints: minimize Σ_i w_i [y_i − f(x_i)]². With W = diag(w_1, …, w_n), the normal equations become (AᵀWA)v = AᵀWy.
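One convenient way to implement this is to scale row i of A and y by √w_i, which turns the weighted problem into an ordinary least-squares problem with the same minimizer. A minimal sketch (the helper name weighted_lstsq is ours, not from the slides):

```python
import numpy as np

def weighted_lstsq(A, y, w):
    """Minimize sum_i w_i * ((A v - y)_i)^2 by row scaling.

    Scaling row i of A and entry i of y by sqrt(w_i) gives an ordinary
    least-squares problem whose solution minimizes the weighted error."""
    s = np.sqrt(w)
    v, *_ = np.linalg.lstsq(A * s[:, None], y * s, rcond=None)
    return v
```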

Slide 29: Local surface fitting to 3D points
Motivating questions: normals? Lighting? Upsampling?

Slide 30: Local surface fitting to 3D points
Locally approximate a polynomial surface from the points.

Slide 31: Fitting a local polynomial
Fit a local polynomial around a point P, over a reference plane. [Figure: reference plane with local X, Y, Z axes around P]

Slide 32: Fitting a local polynomial surface
Compute a reference plane that fits the points close to P, and use the local basis defined by the normal to the plane: the plane spans the local x, y axes, and the normal is the local z axis.
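A common way to compute such a reference plane is PCA on the neighborhood (as in slide 5), taking the eigenvector of the covariance matrix with the smallest eigenvalue as the plane normal. A minimal sketch along these lines; the helper local_frame is hypothetical, not from the slides:

```python
import numpy as np

def local_frame(points):
    """Reference plane for a neighborhood of 3D points (rows of `points`).

    Returns (centroid, x_axis, y_axis, normal): the eigenvector of the
    covariance matrix with the smallest eigenvalue is the plane normal
    (PCA); the other two eigenvectors span the plane and serve as the
    local x and y axes."""
    centroid = points.mean(axis=0)
    cov = np.cov((points - centroid).T)       # 3x3 covariance matrix
    eigvals, eigvecs = np.linalg.eigh(cov)    # eigenvalues in ascending order
    normal = eigvecs[:, 0]                    # smallest eigenvalue -> normal
    return centroid, eigvecs[:, 2], eigvecs[:, 1], normal
```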

Slides 33–35: Fitting a local polynomial surface
Fit the quadric polynomial z = p(x, y) = ax² + bxy + cy² + dx + ey + f in the local frame. [Figures: successive views of the fitted patch over the reference plane]

Slide 36: Fitting a local polynomial surface
Again, solve the system in the LS sense:
a x_1² + b x_1 y_1 + c y_1² + d x_1 + e y_1 + f = z_1
a x_2² + b x_2 y_2 + c y_2² + d x_2 + e y_2 + f = z_2
…
a x_n² + b x_n y_n + c y_n² + d x_n + e y_n + f = z_n
Minimize Σ_i ‖z_i − p(x_i, y_i)‖²

Slide 37: Fitting a local polynomial surface
It is also possible (and better) to add weights: minimize Σ_i w_i ‖z_i − p(x_i, y_i)‖² with w_i > 0. The weights get smaller as the distance from the origin point grows.
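Putting slides 31–37 together, here is a minimal sketch of a weighted local quadric fit. The Gaussian weight w_i = exp(−(x_i² + y_i²)/(2h²)) is one common choice that decays with distance from the origin; the slides do not prescribe a particular weight function, so both it and the bandwidth h are assumptions of this example:

```python
import numpy as np

def fit_local_quadric(xs, ys, zs, h=1.0):
    """Weighted LS fit of z = ax^2 + bxy + cy^2 + dx + ey + f.

    (xs, ys, zs) are neighbor coordinates expressed in the local frame,
    with the origin at the point of interest. Gaussian weights with
    bandwidth h downweight neighbors far from the origin."""
    # Design matrix: one column per monomial of the quadric.
    A = np.column_stack([xs**2, xs*ys, ys**2, xs, ys, np.ones_like(xs)])
    w = np.exp(-(xs**2 + ys**2) / (2.0 * h**2))
    s = np.sqrt(w)
    # Row scaling turns the weighted problem into ordinary least squares.
    coeffs, *_ = np.linalg.lstsq(A * s[:, None], zs * s, rcond=None)
    return coeffs  # [a, b, c, d, e, f]
```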

See you next time