Computer Graphics Recitation 11

2 The plan today: the least squares approach – general / polynomial fitting, linear systems of equations, and local polynomial surface fitting.

3 Motivation: given data points P_i = (x_i, y_i), fit a function y = f(x) that is “close” to the points.

4 Motivation Local surface fitting to 3D points

5 Line fitting: orthogonal-offsets minimization – we have learned this already (points P_i = (x_i, y_i)).

6 Line fitting: the origin of the line is the center of mass, and the direction v is the eigenvector corresponding to the largest eigenvalue of the 2 × 2 scatter matrix S. In 2D this requires solving a quadratic characteristic equation; in higher dimensions – higher-order equations…
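
As a rough illustration (not code from the recitation), here is how this eigenvector-based fit might look in numpy; the sample points and variable names are made up:

```python
import numpy as np

def fit_line_orthogonal(points):
    """Fit a line minimizing orthogonal offsets.

    points: (n, 2) array. Returns (origin, direction).
    """
    origin = points.mean(axis=0)                # center of mass
    centered = points - origin
    scatter = centered.T @ centered             # 2x2 scatter matrix S
    eigvals, eigvecs = np.linalg.eigh(scatter)  # S is symmetric -> eigh
    direction = eigvecs[:, np.argmax(eigvals)]  # eigenvector of the largest eigenvalue
    return origin, direction

# Example usage with made-up points
pts = np.array([[0.0, 0.1], [1.0, 0.9], [2.0, 2.1], [3.0, 2.9]])
o, v = fit_line_orthogonal(pts)
print("origin:", o, "direction:", v)
```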

7 Line fitting: y-offsets minimization (points P_i = (x_i, y_i)).

8 Line fitting: find a line y = ax + b that minimizes Err(a, b) = Σ_i [y_i − (a x_i + b)]². Err is quadratic in the unknown parameters a, b. Another option would be, for example, minimizing Σ_i |y_i − (a x_i + b)| – but that is not differentiable, and harder to minimize…

9 Line fitting – LS minimization: to find the optimal a, b we differentiate Err(a, b):
∂Err/∂a = Σ_i (−2 x_i)[y_i − (a x_i + b)] = 0
∂Err/∂b = Σ_i (−2)[y_i − (a x_i + b)] = 0

10 Line fitting – LS minimization: we get two linear equations for a, b:
Σ_i (−2 x_i)[y_i − (a x_i + b)] = 0
Σ_i (−2)[y_i − (a x_i + b)] = 0

11 Line fitting – LS minimization: we get two linear equations for a, b:
Σ_i [x_i y_i − a x_i² − b x_i] = 0
Σ_i [y_i − a x_i − b] = 0

12 Line fitting – LS minimization: we get two linear equations for a, b:
(Σ_i x_i²) a + (Σ_i x_i) b = Σ_i x_i y_i
(Σ_i x_i) a + (Σ_i 1) b = Σ_i y_i

13 Line fitting – LS minimization: solve for a, b using e.g. Gaussian elimination. Question: why is this solution a minimum of the error function Err(a, b) = Σ_i [y_i − (a x_i + b)]²?
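
For illustration, the two normal equations above can be assembled and solved directly; this numpy sketch uses made-up sample data:

```python
import numpy as np

def fit_line_y_offsets(x, y):
    """Solve  (sum x_i^2) a + (sum x_i) b = sum x_i y_i
              (sum x_i)   a + (n)       b = sum y_i      for a, b."""
    n = len(x)
    M = np.array([[np.sum(x * x), np.sum(x)],
                  [np.sum(x),     n        ]])
    rhs = np.array([np.sum(x * y), np.sum(y)])
    a, b = np.linalg.solve(M, rhs)   # Gaussian elimination under the hood
    return a, b

x = np.array([0.0, 1.0, 2.0, 3.0])
y = np.array([0.1, 0.9, 2.1, 2.9])
a, b = fit_line_y_offsets(x, y)
print(f"y = {a:.3f} x + {b:.3f}")
```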

14 Fitting polynomials

15 Fitting polynomials: decide on the degree of the polynomial, k. We want to fit f(x) = a_k x^k + a_{k−1} x^{k−1} + … + a_1 x + a_0. Minimize:
Err(a_0, a_1, …, a_k) = Σ_i [y_i − (a_k x_i^k + a_{k−1} x_i^{k−1} + … + a_1 x_i + a_0)]²
∂Err/∂a_m = Σ_i (−2 x_i^m)[y_i − (a_k x_i^k + a_{k−1} x_i^{k−1} + … + a_0)] = 0

16 Fitting polynomials: we get a linear system of k+1 equations in k+1 variables.
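
In practice the same fit can be computed by building the Vandermonde matrix and solving the over-determined system in the least-squares sense; a small numpy sketch with assumed data:

```python
import numpy as np

def fit_polynomial(x, y, k):
    """Least-squares fit of a degree-k polynomial; returns [a_0, ..., a_k]."""
    A = np.vander(x, k + 1, increasing=True)     # columns: 1, x, x^2, ..., x^k
    coeffs, *_ = np.linalg.lstsq(A, y, rcond=None)
    return coeffs

x = np.linspace(0.0, 1.0, 20)
y = 1.0 - 2.0 * x + 0.5 * x**2 + 0.01 * np.random.randn(20)   # noisy quadratic
print(fit_polynomial(x, y, k=2))   # close to [1.0, -2.0, 0.5]
```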

17 General parametric fitting: we can use this approach to fit any function f(x) that is specified by parameters a, b, c, …, as long as the expression f(x) depends linearly on the parameters a, b, c, …

18 General parametric fitting: we want to fit a function f_{a,b,c,…}(x) to the data points (x_i, y_i). Define Err(a, b, c, …) = Σ_i [y_i − f_{a,b,c,…}(x_i)]² and solve the linear system given by ∂Err/∂a = 0, ∂Err/∂b = 0, …

19 General parametric fitting: it can even be some crazy function of x, as long as it is linear in the parameters – or, in general: f(x) = α_1 f_1(x) + α_2 f_2(x) + … + α_k f_k(x) for fixed basis functions f_1, …, f_k.
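
The same least-squares machinery works for any fixed set of basis functions; in this sketch the basis (constant, sin, exp) is purely a hypothetical example:

```python
import numpy as np

def fit_basis(x, y, basis):
    """Fit f(x) = sum_j alpha_j * basis[j](x) in the least-squares sense."""
    A = np.column_stack([f(x) for f in basis])
    alphas, *_ = np.linalg.lstsq(A, y, rcond=None)
    return alphas

basis = [lambda x: np.ones_like(x), np.sin, np.exp]   # assumed example basis
x = np.linspace(0.0, 2.0, 30)
y = 2.0 + 0.5 * np.sin(x) - 0.1 * np.exp(x)
print(fit_basis(x, y, basis))   # close to [2.0, 0.5, -0.1]
```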

20 Solving linear systems in the LS sense: let’s look at the problem a little differently. We have data points (x_i, y_i) and we want the function f(x) to go through the points: ∀ i = 1, …, n: y_i = f(x_i). Strict interpolation is in general not possible: for polynomials, n+1 points define a unique interpolation polynomial of degree n, so if we have 1000 points and want a cubic polynomial, we probably won’t find it…

21 Solving linear systems in the LS sense: we have an over-determined linear system of n equations in k unknowns (n > k):
f(x_1) = α_1 f_1(x_1) + α_2 f_2(x_1) + … + α_k f_k(x_1) = y_1
f(x_2) = α_1 f_1(x_2) + α_2 f_2(x_2) + … + α_k f_k(x_2) = y_2
…
f(x_n) = α_1 f_1(x_n) + α_2 f_2(x_n) + … + α_k f_k(x_n) = y_n

22 Solving linear systems in the LS sense: in matrix form, the coefficients form an n × k matrix A with entries A_ij = f_j(x_i), the unknowns form the vector v = (α_1, …, α_k)ᵀ, and the right-hand side is y = (y_1, …, y_n)ᵀ.

23 Solving linear systems in LS sense In matrix form: Av = y

24 Solving linear systems in the LS sense: there are more constraints than variables – no exact solution generally exists. We want to find an “approximate solution”: the v that minimizes ‖Av − y‖².

25 Finding the LS solution: v ∈ R^k and Av ∈ R^n. As we vary v, Av varies over the linear subspace of R^n spanned by the columns of A:
Av = [A_1 | A_2 | … | A_k] (v_1, v_2, …, v_k)ᵀ = v_1 A_1 + v_2 A_2 + … + v_k A_k

26 Finding the LS solution: we want to find the vector Av (in the subspace of R^n spanned by the columns of A) that is closest to y.

27 Finding the LS solution: the vector Av closest to y satisfies
(Av − y) ⊥ {subspace spanned by A’s columns}
∀ column A_i: ⟨A_i, Av − y⟩ = 0
∀ i: A_iᵀ (Av − y) = 0
Aᵀ(Av − y) = 0
(AᵀA) v = Aᵀ y
These are called the normal equations.

28 Finding the LS solution: we obtain a square, symmetric k × k system (AᵀA) v = Aᵀ y. If A has full rank (the columns of A are linearly independent), then AᵀA is invertible.
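
A minimal numpy sketch of the normal equations, compared against a library least-squares solve; the matrix A and vector y are arbitrary illustrative data:

```python
import numpy as np

np.random.seed(0)
A = np.random.randn(100, 4)          # n = 100 constraints, k = 4 unknowns
y = np.random.randn(100)

# Normal equations: (A^T A) v = A^T y  (requires A to have full column rank)
v_normal = np.linalg.solve(A.T @ A, A.T @ y)

# Numerically preferable in practice: SVD-based least squares
v_lstsq, *_ = np.linalg.lstsq(A, y, rcond=None)

print(np.allclose(v_normal, v_lstsq))   # True (up to round-off)
```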

29 Weighted least squares: sometimes the problem also attaches weights w_i > 0 to the constraints – minimize Σ_i w_i [y_i − f(x_i)]², which leads to the weighted normal equations (AᵀWA) v = AᵀW y with W = diag(w_1, …, w_n).
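
Assuming a diagonal weight matrix W = diag(w_1, …, w_n), the weighted normal equations can be assembled like this (a sketch, not code from the recitation):

```python
import numpy as np

def weighted_lstsq(A, y, w):
    """Minimize sum_i w_i (A v - y)_i^2 via (A^T W A) v = A^T W y."""
    W = np.diag(w)
    return np.linalg.solve(A.T @ W @ A, A.T @ W @ y)

A = np.array([[1.0, 0.0], [1.0, 1.0], [1.0, 2.0]])
y = np.array([0.0, 1.0, 2.5])
w = np.array([1.0, 1.0, 0.1])        # downweight the last, noisier constraint
print(weighted_lstsq(A, y, w))
```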

30 Local surface fitting to 3D points Normals? Lighting? Upsampling?

31 Local surface fitting to 3D points: locally approximate the points with a polynomial surface.

32 Fitting a local polynomial: fit a local polynomial around a point P, over a reference plane (with a local X, Y, Z frame).

33 Fitting a local polynomial surface: compute a reference plane that fits the points close to P, and use the local basis defined by the normal to the plane.

34 Fitting a local polynomial surface: fit the polynomial z = p(x, y) = ax² + bxy + cy² + dx + ey + f.

37 Fitting a local polynomial surface: again, solve the system in the LS sense:
a x_1² + b x_1 y_1 + c y_1² + d x_1 + e y_1 + f = z_1
a x_2² + b x_2 y_2 + c y_2² + d x_2 + e y_2 + f = z_2
…
a x_n² + b x_n y_n + c y_n² + d x_n + e y_n + f = z_n
Minimize Σ_i ‖z_i − p(x_i, y_i)‖²

38 Fitting a local polynomial surface: it is also possible (and better) to add weights: Σ_i w_i ‖z_i − p(x_i, y_i)‖², w_i > 0. The weights get smaller as the distance from the origin point P grows.
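
A sketch of the weighted quadratic fit in the local frame; the Gaussian weight with parameter h is one common choice and an assumption here, not something fixed by the slides:

```python
import numpy as np

def fit_local_quadric(x, y, z, h=1.0):
    """Weighted LS fit of z = a x^2 + b x y + c y^2 + d x + e y + f.

    (x, y, z): point coordinates in the local frame of the reference plane,
    with the origin at the query point P. Returns (a, b, c, d, e, f).
    """
    A = np.column_stack([x * x, x * y, y * y, x, y, np.ones_like(x)])
    w = np.exp(-(x * x + y * y) / (h * h))        # weights shrink with distance from P
    WA = A * w[:, None]                           # rows of A scaled by w_i, i.e. W A
    coeffs = np.linalg.solve(WA.T @ A, WA.T @ z)  # (A^T W A) v = A^T W z
    return coeffs

# Example: noisy samples of a paraboloid around the origin (made-up data)
rng = np.random.default_rng(1)
x = rng.uniform(-1.0, 1.0, 200)
y = rng.uniform(-1.0, 1.0, 200)
z = 0.5 * x**2 - 0.2 * x * y + 0.3 * y**2 + 0.01 * rng.standard_normal(200)
print(fit_local_quadric(x, y, z))   # close to [0.5, -0.2, 0.3, 0, 0, 0]
```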

39 Geometry compression using relative coordinates: we are given a mesh – its connectivity and its geometry, the (x, y, z) coordinates of each vertex.

40 Geometry compression using relative coordinates: the size of the geometry is large (compared to the connectivity), and (x, y, z) coordinates are hard to compress – they are floating-point numbers that have to be quantized, and there is no correlation to exploit.

41 Geometry compression using relative coordinates: represent each vertex with relative coordinates – the difference between the vertex and the average of its neighbours is its relative coordinate vector.

42 Geometry compression using relative coordinates: we call them δ-coordinates – each vertex minus the average of its neighbours gives its relative coordinate vector.

43 Geometry compression using relative coordinates: when the mesh is smooth, the δ-coordinates are small, and small δ-coordinates can be compressed better.

44 Geometry compression using relative coordinates: in matrix form, the δ-coordinates of all vertices are computed at once as δ(x) = Lx.

45 Geometry compression using relative coordinates: matrix form to compute the δ-coordinates: δ(x) = Lx. The same matrix L is used for y and z; L is called the Laplacian of the mesh.
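
A dense-matrix sketch of building the uniform Laplacian L from connectivity and computing the δ-coordinates; the tiny edge list is a made-up example, and a real implementation would use sparse matrices:

```python
import numpy as np

def mesh_laplacian(n, edges):
    """Uniform Laplacian L = I - D^{-1} A, so (L x)_i = x_i - average of i's neighbours."""
    A = np.zeros((n, n))
    for i, j in edges:
        A[i, j] = A[j, i] = 1.0
    deg = A.sum(axis=1)
    return np.eye(n) - A / deg[:, None]

# Hypothetical 4-vertex mesh (the edge graph of a tetrahedron)
edges = [(0, 1), (0, 2), (0, 3), (1, 2), (1, 3), (2, 3)]
L = mesh_laplacian(4, edges)

xyz = np.array([[0., 0., 0.], [1., 0., 0.], [0., 1., 0.], [0., 0., 1.]])
delta = L @ xyz            # the same L applied to the x, y and z columns
print(delta)
```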

46 Geometry compression using relative coordinates: how do we restore (x, y, z) from the δ-coordinates? We need to solve the linear system Lx = δ(x).

47 Geometry compression using relative coordinates: Lx = δ(x). But: L is singular, and δ(x) contains quantization error.

48 Geometry compression using relative coordinates: Lx = δ(x). Solution: choose some anchor vertices whose (x, y, z) positions are known (in addition to their δ).

49 Geometry compression using relative coordinates: we add the anchor vertices to our linear system – the rows of L are extended with constraint rows that pin the anchor vertices to their known positions.

50 Geometry compression using relative coordinates: now we have more equations than unknowns – solve in the least-squares sense!
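
A sketch of the anchored reconstruction: the rows Lx = δ(x) are stacked with one constraint row per anchor, and the over-determined system is solved in the least-squares sense. The anchor choice, the anchor weight, and the tiny example mesh are illustrative assumptions:

```python
import numpy as np

def reconstruct_from_deltas(L, delta, anchor_idx, anchor_pos, anchor_weight=1.0):
    """Recover vertex positions from delta-coordinates plus a few known anchors.

    Stacks  L x = delta  with constraint rows  w * x_a = w * anchor position
    and solves the over-determined system in the least-squares sense.
    """
    n = L.shape[0]
    C = np.zeros((len(anchor_idx), n))
    C[np.arange(len(anchor_idx)), anchor_idx] = anchor_weight
    A = np.vstack([L, C])
    b = np.vstack([delta, anchor_weight * np.asarray(anchor_pos)])
    x, *_ = np.linalg.lstsq(A, b, rcond=None)   # x, y, z columns solved at once
    return x

# Tiny standalone example: a 4-cycle "mesh" with made-up vertex positions
n = 4
adj = np.zeros((n, n))
for i, j in [(0, 1), (1, 2), (2, 3), (3, 0)]:
    adj[i, j] = adj[j, i] = 1.0
L = np.eye(n) - adj / adj.sum(axis=1)[:, None]            # uniform Laplacian
xyz = np.array([[0., 0., 0.], [1., 0., 0.], [1., 1., 0.], [0., 1., 0.]])
delta = L @ xyz
recon = reconstruct_from_deltas(L, delta, [0, 2], xyz[[0, 2]])
print(np.allclose(recon, xyz))   # True: the anchors remove the singularity of L
```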

See you next time