
Ordinary Least-Squares Emmanuel Iarussi Inria

Many graphics problems can be seen as finding the best set of parameters for a model, given some data: surface reconstruction, color diffusion, shape registration.

Example: estimation of gas pressure for a given temperature sample. (Plot: gas pressure vs. temperature.)

Assuming a linear relationship, each pressure measurement is modeled as a linear function of the corresponding temperature measurement: $p_i \approx x_1 t_i + x_2$. Stacking the temperature measurements (m) and pressure measurements (m) gives the matrix system

$$\begin{bmatrix} t_1 & 1 \\ \vdots & \vdots \\ t_m & 1 \end{bmatrix} \begin{bmatrix} x_1 \\ x_2 \end{bmatrix} = \begin{bmatrix} p_1 \\ \vdots \\ p_m \end{bmatrix}.$$

The relationship might not be exact: in general no single choice of parameters satisfies all m equations, so we instead match the observations "as best as possible".

Many computer graphics problems can be formulated as minimizing the sum of squares of the residuals between some features in the model and the data: $E(x) = \sum_i r_i^2$, or equivalently $E(x) = \|r\|^2$.

Matrix notation (observations): we can rewrite the residual function using linear algebra as $E(x) = \|Ax - b\|^2$.

Example: estimation of gas pressure for given temperature and altitude samples. (Plot: gas pressure vs. temperature and altitude.)

Multidimensional linear regression uses a model with n parameters: the model from before gains one term per additional predictor, $p_i \approx x_1 t_i + x_2 a_i + \dots$ Stacking the temperature measurements (m), altitude measurements (m), and pressure measurements (m) gives the system $Ax = b$, with one column of A per parameter.

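As a small illustration of this setup (a sketch, not from the slides: the function name, the data layout, and the choice of a constant third parameter are assumptions), the design matrix can be assembled in Eigen as:

#include <Eigen/Dense>

// Build the m-by-3 design matrix for the model p ~ x1*t + x2*a + x3.
// One row per observation: [t_i, a_i, 1].
Eigen::MatrixXd buildDesignMatrix(const Eigen::VectorXd& temperature,
                                  const Eigen::VectorXd& altitude) {
    const Eigen::Index m = temperature.size();
    Eigen::MatrixXd A(m, 3);
    A.col(0) = temperature;              // multiplies parameter x1
    A.col(1) = altitude;                 // multiplies parameter x2
    A.col(2) = Eigen::VectorXd::Ones(m); // constant term x3
    return A;
}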

Objective: find the x that minimizes $E(x) = \|Ax - b\|^2$. This is a convex, bowl-shaped function, so it attains its minimum where the gradient vanishes: $\nabla E(x) = 0$.

Expanding the square term:

$$E(x) = (Ax - b)^\top (Ax - b) = x^\top A^\top A x - 2\, b^\top A x + b^\top b.$$

Differentiating with respect to x (gradient):

$$\nabla E(x) = 2\, A^\top A x - 2\, A^\top b.$$

Setting the gradient to zero yields the normal equation, $A^\top A x = A^\top b$, solved in Eigen as:

x = (A.transpose()*A).ldlt().solve(A.transpose()*b);

Issue: this requires forming the matrix multiplication $A^\top A$. Solution: find an expression for the system matrix directly (next slides).

Let's go back to gradient = 0: $2\, A^\top A x - 2\, A^\top b = 0$. Differentiating again with respect to x gives the Hessian, $H = 2\, A^\top A$; the constant terms in the gradient give $c = 2\, A^\top b$. Two alternatives for solving:

x = (A.transpose()*A).ldlt().solve(A.transpose()*b);
x = H.ldlt().solve(c);
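As a minimal sketch of both alternatives on the earlier temperature/pressure example (the sample data below is invented for illustration):

#include <Eigen/Dense>
#include <iostream>

int main() {
    // Invented sample data: temperatures t_i and measured pressures p_i.
    Eigen::VectorXd t(4), p(4);
    t << 10.0, 20.0, 30.0, 40.0;
    p << 1.05, 1.12, 1.19, 1.28;

    // Model p ~ x1*t + x2: one row [t_i, 1] per observation.
    Eigen::MatrixXd A(t.size(), 2);
    A.col(0) = t;
    A.col(1) = Eigen::VectorXd::Ones(t.size());

    // Alternative 1: normal equation (A^T A) x = A^T b.
    Eigen::Vector2d x1 = (A.transpose() * A).ldlt().solve(A.transpose() * p);

    // Alternative 2: Hessian form H x = c with H = 2 A^T A, c = 2 A^T b.
    Eigen::Matrix2d H = 2.0 * (A.transpose() * A);
    Eigen::Vector2d c = 2.0 * (A.transpose() * p);
    Eigen::Vector2d x2 = H.ldlt().solve(c);

    // Both routes solve the same system, so x1 and x2 agree.
    std::cout << x1.transpose() << "\n" << x2.transpose() << std::endl;
    return 0;
}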

Example: Diffusion. Minimize the difference between neighbors: each neighboring pair (i, j) contributes a term $(x_i - x_j)^2$ to the energy, $E(x) = \sum_{(i,j)} (x_i - x_j)^2$, …

Solving with normal equation:
1. Build A: one row per difference term, with +1 in the column of $x_i$ and −1 in the column of $x_j$ for the neighboring pair (i, j), …

2. Compute $A^\top A$ and $A^\top b$.

3. Solve using: Cholesky decomposition, Conjugate Gradient, … In Eigen (e.g.):
x = (A.transpose()*A).ldlt().solve(A.transpose()*b);
x will then hold the minimizing parameters.
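Putting the three steps together for the diffusion example, here is a minimal self-contained sketch (the chain topology, the weight w, and the penalty-style constraints are assumptions for illustration):

#include <Eigen/Sparse>
#include <iostream>
#include <vector>

int main() {
    const int n = 10;              // number of unknown values x_0 .. x_{n-1}
    const double w = 100.0;        // weight for the soft constraints (assumed)
    const int rows = (n - 1) + 2;  // n-1 difference terms + 2 constraints

    std::vector<Eigen::Triplet<double>> coeffs;
    Eigen::VectorXd b = Eigen::VectorXd::Zero(rows);

    // 1. Build A. Difference terms (x_i - x_{i+1}): row [... +1 -1 ...], rhs 0.
    for (int i = 0; i < n - 1; ++i) {
        coeffs.emplace_back(i, i, 1.0);
        coeffs.emplace_back(i, i + 1, -1.0);
    }
    // Soft constraints w*x_0 = w*0 and w*x_{n-1} = w*1 as extra rows.
    coeffs.emplace_back(n - 1, 0, w);     b(n - 1) = w * 0.0;
    coeffs.emplace_back(n, n - 1, w);     b(n) = w * 1.0;

    Eigen::SparseMatrix<double> A(rows, n);
    A.setFromTriplets(coeffs.begin(), coeffs.end());

    // 2. Compute A^T A and A^T b.
    Eigen::SparseMatrix<double> AtA = A.transpose() * A;
    Eigen::VectorXd Atb = A.transpose() * b;

    // 3. Solve with a sparse Cholesky factorization.
    Eigen::SimplicialLLT<Eigen::SparseMatrix<double>> solver(AtA);
    Eigen::VectorXd x = solver.solve(Atb);

    std::cout << x.transpose() << std::endl; // values ramp from ~0 to ~1
    return 0;
}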

Solving with Hessian:
1. Gradient vector: assemble the first-order partial derivatives of the energy terms.
2. Hessian matrix: assemble the second-order partial derivatives of the energy terms.
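For concreteness, one diffusion term worked out (this follows directly from the energy defined above):

$$\begin{aligned}
E_{ij}(x) &= (x_i - x_j)^2, \\
\frac{\partial E_{ij}}{\partial x_i} &= 2(x_i - x_j), \qquad
\frac{\partial E_{ij}}{\partial x_j} = -2(x_i - x_j), \\
\frac{\partial^2 E_{ij}}{\partial x_i^2} &= \frac{\partial^2 E_{ij}}{\partial x_j^2} = 2, \qquad
\frac{\partial^2 E_{ij}}{\partial x_i \,\partial x_j} = -2.
\end{aligned}$$

Summed over all neighboring pairs, these entries fill the gradient vector and the Hessian matrix.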

Solving with Hessian, to recap: the gradient vector collects the first-order partial derivatives of the energy terms, and the Hessian matrix collects the second-order partial derivatives.

3. Build the system $Hx = c$, then add the constraints (the known, fixed values) to it, …
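One common way to incorporate a constraint $x_k = v$ here (an assumption; the slides' exact scheme is not recoverable from the transcript) is a quadratic penalty $w\,(x_k - v)^2$ with a large weight $w$, which contributes

$$H_{kk} \mathrel{+}= 2w, \qquad c_k \mathrel{+}= 2w\,v,$$

so the constrained system keeps the same $Hx = c$ form.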

4. Solve using: Cholesky decomposition, Conjugate Gradient, … In Eigen (e.g.):
x = H.ldlt().solve(c);
where H and c include the constraint contributions; x will then hold the solution.
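And the same diffusion example via the Hessian route, as a sketch under the same assumptions (chain topology, penalty constraints with weight w):

#include <Eigen/Sparse>
#include <iostream>
#include <vector>

int main() {
    const int n = 10;          // unknowns
    const double w = 100.0;    // penalty weight (assumed)

    std::vector<Eigen::Triplet<double>> entries;
    Eigen::VectorXd c = Eigen::VectorXd::Zero(n);

    // Each term (x_i - x_{i+1})^2 contributes +2 to H_ii and H_jj,
    // and -2 to H_ij and H_ji (see the worked derivatives above).
    for (int i = 0; i < n - 1; ++i) {
        entries.emplace_back(i, i, 2.0);
        entries.emplace_back(i + 1, i + 1, 2.0);
        entries.emplace_back(i, i + 1, -2.0);
        entries.emplace_back(i + 1, i, -2.0);
    }
    // Penalty constraints w*(x_0 - 0)^2 and w*(x_{n-1} - 1)^2:
    // each adds 2w to the diagonal and 2w*v to the right-hand side.
    entries.emplace_back(0, 0, 2.0 * w);            // v = 0, so c(0) unchanged
    entries.emplace_back(n - 1, n - 1, 2.0 * w);
    c(n - 1) += 2.0 * w * 1.0;                      // v = 1

    Eigen::SparseMatrix<double> H(n, n);
    H.setFromTriplets(entries.begin(), entries.end()); // duplicates are summed

    Eigen::SimplicialLLT<Eigen::SparseMatrix<double>> solver(H);
    Eigen::VectorXd x = solver.solve(c);
    std::cout << x.transpose() << std::endl; // again ramps from ~0 to ~1
    return 0;
}

Note that H is never formed as $A^\top A$ here: its entries come straight from the energy's second derivatives, which is the advantage the slides point to.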

References:
C. L. Lawson and R. J. Hanson. Solving Least Squares Problems.
Fred Pighin and J. P. Lewis. Practical Least Squares for Computer Graphics. SIGGRAPH course notes, 2007.