Numerical Methods Lecture 20
Prof. Jinbo Bi, CSE, UConn

Today's class
- Multiple Variable Linear Regression
- Polynomial Interpolation
- Lagrange Interpolation
- Newton Interpolation
Mid-term Exam 2
- Average = 83, Standard deviation = 27
- Max = 100, Min = 60
(Numerical Methods Fall 2010 Lecture 21, Prof. Lei Wang, ECE, UConn)
Multiple Variable Linear Least Squares Approximation
Multiple Linear Regression
General Linear Least Squares
- Simple linear, polynomial, and multiple linear regressions are all special cases of the general linear least-squares model.
- The model is linear in the coefficients a_i, but the basis functions z_i may themselves be highly nonlinear functions of x.
- Examples: simple linear regression (z_0 = 1, z_1 = x); polynomial regression (z_i = x^i).
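The model equation did not survive extraction; in the standard notation for general linear least squares it reads:

```latex
y = a_0 z_0 + a_1 z_1 + a_2 z_2 + \cdots + a_m z_m + e
```

where z_0, z_1, ..., z_m are m+1 basis functions and e is the residual error.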
General Linear Least Squares
- General equation in matrix form: {y} = [Z]{a} + {e}, where [Z] holds the values of the basis functions at each measured point, {a} the unknown coefficients, {y} the observed values, and {e} the residuals.
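Spelled out in the usual notation (n measurements, m+1 basis functions, with z_ij the ith basis function evaluated at the jth data point):

```latex
\{y\} = [Z]\{a\} + \{e\},
\qquad
[Z] =
\begin{bmatrix}
z_{01} & z_{11} & \cdots & z_{m1}\\
z_{02} & z_{12} & \cdots & z_{m2}\\
\vdots & \vdots & \ddots & \vdots\\
z_{0n} & z_{1n} & \cdots & z_{mn}
\end{bmatrix}
```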
General Linear Least Squares
- As usual, take partial derivatives of the sum of squared errors S_r with respect to each coefficient and set them to zero.
- This leads to the normal equations: [Z]^T [Z] {a} = [Z]^T {y}.
- Solve this linear system for {a}.
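A minimal Python/NumPy sketch of this procedure (the code and the sample data are illustrative, not from the slides): form the design matrix [Z], solve the normal equations, and cross-check against NumPy's built-in least-squares solver.

```python
import numpy as np

# Fit y ~ a0 + a1*x + a2*x^2 (polynomial regression as a general
# linear least-squares problem: basis functions z0 = 1, z1 = x, z2 = x^2).
x = np.array([0.0, 1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 7.7, 13.6, 27.2, 40.9, 61.1])   # illustrative data

Z = np.column_stack([np.ones_like(x), x, x**2])     # design matrix [Z]

# Normal equations: (Z^T Z) {a} = Z^T {y}
a = np.linalg.solve(Z.T @ Z, Z.T @ y)

# Cross-check against NumPy's least-squares solver
a_lstsq, *_ = np.linalg.lstsq(Z, y, rcond=None)
print(a)
```

For small, well-conditioned problems the two solutions agree; for larger problems `lstsq` (which uses an orthogonal factorization) is numerically safer than forming Z^T Z explicitly.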
Interpolation & Extrapolation
- Interpolation: the points to be estimated lie within the range of the observed data.
- Extrapolation: the points to be estimated lie beyond the range of the observed data (and may not be reliable).
Interpolation
Polynomial Interpolation
Linear Interpolation
- Errors are larger when the interval between data points is larger.
- A smaller interval provides a better estimate.
Linear Interpolation
- Quadratic interpolation gives a better estimate than linear interpolation when the function changes smoothly and slowly.
Direct Method
- Solve the system of equations for the polynomial coefficients.
- This is time-consuming, and the coefficient matrix may be ill-conditioned.
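A short Python sketch of the direct method (not from the slides): the system of equations is the Vandermonde system, which NumPy can build and solve directly. The ln 2 data points from the examples later in this lecture are used here.

```python
import numpy as np

# Direct method: solve V c = y for the polynomial coefficients, where V
# is the Vandermonde matrix of the data abscissas.  This works, but V
# becomes increasingly ill-conditioned as the number of points grows.
x = np.array([1.0, 4.0, 5.0])
y = np.log(x)                      # interpolate f(x) = ln x

V = np.vander(x, increasing=True)  # columns: 1, x, x^2
c = np.linalg.solve(V, y)

# Evaluate the interpolating polynomial at x = 2
p2 = sum(ci * 2.0**i for i, ci in enumerate(c))
print(p2)   # quadratic estimate of ln 2 (true value 0.693147)
```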
Newton (Divided-Difference) Interpolating Polynomials
Newton (Divided-Difference) Interpolating Polynomials
- Second divided difference: f[x_2, x_1, x_0] = (f[x_2, x_1] - f[x_1, x_0]) / (x_2 - x_0).
General Scheme for Divided-Difference Coefficients
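The general scheme builds a triangular table of divided differences whose top row gives the Newton coefficients b_0, ..., b_n. A minimal Python sketch (the function name and layout are my own, not from the slides):

```python
import numpy as np

def divided_difference_table(x, y):
    """Full divided-difference table; column j holds f[x_i, ..., x_{i+j}].
    The top row (table[0]) gives the Newton coefficients b0, b1, ..., bn."""
    n = len(x)
    table = np.zeros((n, n))
    table[:, 0] = y
    for j in range(1, n):
        for i in range(n - j):
            # Recursive definition of the jth divided difference
            table[i, j] = (table[i+1, j-1] - table[i, j-1]) / (x[i+j] - x[i])
    return table

# ln 2 data used in the examples below
x = [1.0, 4.0, 5.0]
y = [0.0, 1.386294, 1.609438]
b = divided_difference_table(x, y)[0]   # Newton coefficients b0, b1, b2
```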
Example: Linear Interpolation
- Estimate ln 2 with data points at (1, 0) and (4, 1.386294).
Example: Quadratic Interpolation
- Estimate ln 2 with data points at (1, 0), (4, 1.386294), and (5, 1.609438).
Example: Cubic Interpolation
- Estimate ln 2 with data points at (1, 0), (4, 1.386294), (5, 1.609438), and (6, 1.791759).
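The three examples above can be reproduced with a short Newton divided-difference routine (a Python sketch; the routine itself is not from the slides). Each added data point improves the estimate of ln 2 = 0.693147.

```python
import numpy as np

def newton_interp(x, y, xq):
    """Newton's interpolating polynomial: build the divided-difference
    coefficients in place, then evaluate at xq with nested multiplication."""
    x = np.asarray(x, dtype=float)
    coef = np.asarray(y, dtype=float).copy()
    n = len(x)
    for j in range(1, n):
        # coef[i] becomes the jth divided difference f[x_{i-j}, ..., x_i]
        coef[j:] = (coef[j:] - coef[j-1:-1]) / (x[j:] - x[:-j])
    result = coef[-1]
    for k in range(n - 2, -1, -1):
        result = result * (xq - x[k]) + coef[k]
    return result

xs = [1.0, 4.0, 5.0, 6.0]
ys = [0.0, 1.386294, 1.609438, 1.791759]
lin  = newton_interp(xs[:2], ys[:2], 2.0)   # linear:    0.462098
quad = newton_interp(xs[:3], ys[:3], 2.0)   # quadratic: 0.581575
cub  = newton_interp(xs, ys, 2.0)           # cubic:     0.628768
```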
Newton's Interpolating Polynomial Error
- Similar in form to the Taylor series remainder.
- Proportional to the (n+1)th finite divided difference.
- Also proportional to the distances of the data points from x.
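The error formula did not survive extraction; in the standard form it reads:

```latex
R_n = f[x, x_n, x_{n-1}, \ldots, x_0]\,(x - x_0)(x - x_1)\cdots(x - x_n)
```

which, when f and its derivatives are known, parallels the Taylor remainder:

```latex
R_n = \frac{f^{(n+1)}(\xi)}{(n+1)!}\,(x - x_0)(x - x_1)\cdots(x - x_n)
```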
Lagrange Interpolating Polynomials
- An nth-order polynomial interpolates n+1 points.
- The Lagrangian coefficient L_i(x) is 1 at x = x_i and 0 at all other data points.
- Thus f_n(x_i) = f(x_i) for all i: the Lagrangian polynomial passes through all the data points, as expected.
Lagrange Interpolating Polynomials
- Linear polynomial: interpolates 2 points.
- 2nd-order polynomial: interpolates 3 points.
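The formulas themselves did not survive extraction; in the standard notation:

```latex
f_n(x) = \sum_{i=0}^{n} L_i(x)\, f(x_i),
\qquad
L_i(x) = \prod_{\substack{j=0 \\ j \ne i}}^{n} \frac{x - x_j}{x_i - x_j}
```

For the linear case this reduces to

```latex
f_1(x) = \frac{x - x_1}{x_0 - x_1}\, f(x_0) + \frac{x - x_0}{x_1 - x_0}\, f(x_1)
```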
Example: 2nd-Order Lagrangian Interpolation
- Estimate ln 2 with data points at (1, 0), (4, 1.386294), and (5, 1.609438).
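A Python sketch of this example (the routine is my own, not from the slides); it yields the same estimate as the Newton quadratic, since both forms produce the same interpolating polynomial.

```python
def lagrange_interp(x, y, xq):
    """Evaluate the Lagrange form of the interpolating polynomial at xq."""
    total = 0.0
    n = len(x)
    for i in range(n):
        # L_i(xq): 1 at x_i, 0 at every other data point
        Li = 1.0
        for j in range(n):
            if j != i:
                Li *= (xq - x[j]) / (x[i] - x[j])
        total += y[i] * Li
    return total

est = lagrange_interp([1.0, 4.0, 5.0], [0.0, 1.386294, 1.609438], 2.0)
print(est)   # 2nd-order estimate of ln 2, about 0.581575
```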
Lagrange & Newton Interpolation
- The Lagrangian and Newton forms look different, but they produce the same polynomial when fit to the same data points.
- The Lagrangian coefficients are easier to calculate, since no divided-difference computation is required.
- However, when new data points are added, the Newton form is more efficient, since the earlier coefficients remain the same.
Next class
- Spline Interpolation
- Fourier Approximation
- HW8 due Dec 8