
Chapter 14: General Linear Least Squares and Nonlinear Regression

[Slide: linear least-squares fit $y = a_0 + a_1 x$ to the example data, with its error $S_r$ and correlation coefficient $r$; the numeric values and the data arrays x, y did not survive transcription.]

The linear fit gives a large error and poor correlation; a parabola is preferable.

Polynomial Regression
- Quadratic least squares: $y = f(x) = a_0 + a_1 x + a_2 x^2$
- Minimize the total squared error $S_r = \sum_{i=1}^{n} \left( y_i - a_0 - a_1 x_i - a_2 x_i^2 \right)^2$
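Setting $\partial S_r/\partial a_0 = \partial S_r/\partial a_1 = \partial S_r/\partial a_2 = 0$ yields the normal equations. The matrix on the original slide did not survive transcription, but the standard form is:

$$\begin{bmatrix} n & \sum x_i & \sum x_i^2 \\ \sum x_i & \sum x_i^2 & \sum x_i^3 \\ \sum x_i^2 & \sum x_i^3 & \sum x_i^4 \end{bmatrix} \begin{bmatrix} a_0 \\ a_1 \\ a_2 \end{bmatrix} = \begin{bmatrix} \sum y_i \\ \sum x_i y_i \\ \sum x_i^2 y_i \end{bmatrix}$$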

Quadratic Least Squares
- Use Cholesky decomposition to solve the symmetric system $A z = r$, or use the MATLAB backslash operator: z = A\r
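A minimal sketch of what the Quadratic_LS routine used in the sessions below might look like; the function name and the printed quantities come from the transcript, but the implementation itself is assumed:

function z = Quadratic_LS(x, y)
% Least-squares fit of y = a0 + a1*x + a2*x^2 (a sketch; the original
% course code is not available).
x = x(:); y = y(:);
n = length(x);
A = [ n          sum(x)      sum(x.^2);
      sum(x)     sum(x.^2)   sum(x.^3);
      sum(x.^2)  sum(x.^3)   sum(x.^4) ];
r = [ sum(y); sum(x.*y); sum(x.^2 .* y) ];
z = A \ r;                              % or: R = chol(A); z = R\(R'\r)
yhat = z(1) + z(2)*x + z(3)*x.^2;       % fitted values
Sr   = sum((y - yhat).^2);              % residual sum of squares
Syx  = sqrt(Sr / (n - 3));              % standard error of the estimate
St   = sum((y - mean(y)).^2);           % total sum of squares
fprintf('Syx = %g, r = %g\n', Syx, sqrt((St - Sr)/St));
end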

Standard error for second-order polynomial regression: $s_{y/x} = \sqrt{S_r/(n-3)}$, where $n$ is the number of observations and a 2nd-order polynomial has 3 coefficients (start off with $n$ degrees of freedom; an $m$th-order polynomial uses up $m+1$).

» [x,y] = example2;
» z = Quadratic_LS(x,y)

The routine tabulates x, y, the fitted values a0+a1*x+a2*x^2, and the residuals y-a0-a1*x-a2*x^2, then reports the standard error of the estimate Syx, the correlation coefficient r, and the coefficient vector z for $y = a_0 + a_1 x + a_2 x^2$. (The numeric output did not survive transcription.)

function [x,y] = example2
x = [ ];   % data values lost in transcription
y = [ ];

[Slide: plot of the quadratic least-squares fit $y = a_0 + a_1 x + a_2 x^2$ with its error $S_r$ and correlation $r$; numeric values not recoverable.]

Cubic Least Squares: $y = a_0 + a_1 x + a_2 x^2 + a_3 x^3$; the same normal-equation approach now gives a symmetric 4-by-4 system.

» [x,y] = example2;
» z = Cubic_LS(x,y)

Tabulates x, y, p(x) = a0+a1*x+a2*x^2+a3*x^3, and the residuals y-p(x), then reports Syx, the correlation coefficient r, and the coefficient vector z. (Numeric output not recoverable.)

» [x,y] = example2;
» z1 = Linear_LS(x,y);
» z2 = Quadratic_LS(x,y);
» z3 = Cubic_LS(x,y);
» x1 = min(x); x2 = max(x); xx = x1:(x2-x1)/100:x2;
» yy1 = z1(1) + z1(2)*xx;
» yy2 = z2(1) + z2(2)*xx + z2(3)*xx.^2;
» yy3 = z3(1) + z3(2)*xx + z3(3)*xx.^2 + z3(4)*xx.^3;
» H = plot(x,y,'r*',xx,yy1,'g',xx,yy2,'b',xx,yy3,'m');
» xlabel('x'); ylabel('y');
» set(H,'LineWidth',3,'MarkerSize',12);
» print -djpeg075 regres4.jpg

Linear, quadratic, and cubic least-squares fits plotted together against the data.

[Slide: the three fitted models compared: linear $y = a_0 + a_1 x$, quadratic $y = a_0 + a_1 x + a_2 x^2$, and cubic $y = a_0 + a_1 x + a_2 x^2 + a_3 x^3$; the coefficient values did not survive transcription.]

Standard error for general polynomial regression: $s_{y/x} = \sqrt{S_r/(n-(m+1))}$, where $n$ is the number of observations and $m$ the polynomial order (start off with $n$ degrees of freedom; an $m$th-order polynomial uses up $m+1$).

Multiple Linear Regression
- Dependence on more than one independent variable
- e.g., dependence of runoff volume on soil type and land cover, or dependence of aerodynamic drag on automobile shape and speed

- With two independent variables, the fit becomes a surface: find the best-fit plane $y = a_0 + a_1 x_1 + a_2 x_2$ through the data

Multiple Linear Regression
- Much like polynomial regression
- Sum of squared residuals: $S_r = \sum_{i=1}^{n} \left( y_i - a_0 - a_1 x_{1,i} - a_2 x_{2,i} \right)^2$

- Set the partial derivatives $\partial S_r/\partial a_j$ to zero and rearrange the equations; the result is a set of normal equations very similar to polynomial regression
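For two independent variables, the normal equations (reconstructed in the standard form; the slide's matrix did not survive transcription) are:

$$\begin{bmatrix} n & \sum x_{1,i} & \sum x_{2,i} \\ \sum x_{1,i} & \sum x_{1,i}^2 & \sum x_{1,i} x_{2,i} \\ \sum x_{2,i} & \sum x_{1,i} x_{2,i} & \sum x_{2,i}^2 \end{bmatrix} \begin{bmatrix} a_0 \\ a_1 \\ a_2 \end{bmatrix} = \begin{bmatrix} \sum y_i \\ \sum x_{1,i} y_i \\ \sum x_{2,i} y_i \end{bmatrix}$$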

Multiple Linear Regression
- Once again, solve by any matrix method
- Cholesky decomposition is appropriate: the coefficient matrix is symmetric and positive definite
- Very useful for fitting the power equation $y = a_0 x_1^{a_1} x_2^{a_2}$ after a logarithmic transformation
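A minimal sketch of a two-variable Multi_Linear routine consistent with the session below; the function name comes from the transcript, but the implementation details are assumed:

function z = Multi_Linear(x1, x2, y)
% Least-squares fit of y = a0 + a1*x1 + a2*x2 (a sketch; the original
% course code is not available).
x1 = x1(:); x2 = x2(:); y = y(:);
n = length(y);
A = [ n        sum(x1)       sum(x2);
      sum(x1)  sum(x1.^2)    sum(x1.*x2);
      sum(x2)  sum(x1.*x2)   sum(x2.^2) ];
b = [ sum(y); sum(x1.*y); sum(x2.*y) ];
R = chol(A);           % A is symmetric positive definite
z = R \ (R' \ b);      % forward substitution, then back substitution
yhat = z(1) + z(2)*x1 + z(3)*x2;
Sr   = sum((y - yhat).^2);
Syx  = sqrt(Sr / (n - 3));              % 3 coefficients
St   = sum((y - mean(y)).^2);
fprintf('Syx = %g, r = %g\n', Syx, sqrt((St - Sr)/St));
end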

Example: the strength of concrete depends on cure time and the cement/water ratio (or water content W/C).

» x1 = [ ];   % cure time data lost in transcription
» x2 = [ ];   % water content
» y  = [ ];   % strength
» H = plot3(x1,x2,y,'ro'); grid on; set(H,'LineWidth',5);
» H1 = xlabel('Cure Time (days)'); set(H1,'FontSize',12)
» H2 = ylabel('Water Content'); set(H2,'FontSize',12)
» H3 = zlabel('Strength (psi)'); set(H3,'FontSize',12)

Hand Calculations

Solve by Cholesky decomposition, followed by forward and back substitutions.
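Concretely, the Cholesky factorization reduces the symmetric system $A z = b$ to two triangular solves:

$$A = R^{\mathsf T} R, \qquad R^{\mathsf T} w = b \;\;\text{(forward substitution)}, \qquad R z = w \;\;\text{(back substitution)}$$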

» [x1,x2,y] = concrete;
» z = Multi_Linear(x1,x2,y)

Tabulates x1, x2, y, the fitted values a0+a1*x1+a2*x2, and the residuals y-a0-a1*x1-a2*x2, then reports Syx and the correlation coefficient r; z holds the coefficients (a0, a1, a2). (Numeric output not recoverable.)

function [x1,x2,y] = concrete
x1 = [ ];   % data values lost in transcription
x2 = [ ];
y  = [ ];

Multiple Linear Regression

» xx = 0:0.02:1; yy = 0:0.02:1; [x,y] = meshgrid(xx,yy);
» z = 2*x + 3*y + 2;
» surfc(x,y,z); grid on
» axis([ ])   % axis limits lost in transcription
» xlabel('x1'); ylabel('x2'); zlabel('y')

General Linear Least Squares
- Simple linear, polynomial, and multiple linear regressions are all special cases of the general linear least-squares model $y = a_0 z_0 + a_1 z_1 + \cdots + a_m z_m + e$
- Examples of basis functions: $z_i = 1, x, x^2, \ldots$ (polynomial regression), or nonlinear functions such as $\cos(\omega x)$ and $e^{-x}$
- The model is linear in the coefficients $a_i$, but the $z_i$ may be highly nonlinear functions of $x$

General Linear Least Squares
- General equation in matrix form: $\{Y\} = [Z]\{A\} + \{E\}$
- where $\{Y\}$ is the vector of dependent variables, $[Z]$ holds the values of the basis functions $z_j$ at the data points, $\{A\}$ is the vector of regression coefficients, and $\{E\}$ is the vector of residuals

General Linear Least Squares
- As usual, take partial derivatives of the squared-error sum $S_r$ and set them to zero
- This leads to the normal equations $[[Z]^{\mathsf T}[Z]]\{A\} = \{[Z]^{\mathsf T}\{Y\}\}$
- Solve for $\{A\}$ using Cholesky decomposition, LU decomposition, or the matrix inverse
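As an illustration (not from the slides), the whole method fits in a few MATLAB lines; the basis choice and the data here are hypothetical:

% General linear least squares with basis functions 1, x, exp(-x).
x = (0:0.5:5)';                   % hypothetical data
y = 2 + 0.5*x - 3*exp(-x);        % synthetic observations
Z = [ones(size(x)), x, exp(-x)];  % columns are z0, z1, z2 at the data points
A = (Z'*Z) \ (Z'*y);              % normal equations [Z'Z]{A} = {Z'Y}
A2 = Z \ y;                       % equivalent, and numerically preferable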

Nonlinear Regression
- The model $y_i = f(x_i; a_0, a_1, \ldots, a_m) + e_i$ is a nonlinear function of the parameters $a_0, a_1, \ldots, a_m$
- $f$ is a nonlinear function of $x$, and $(x_i, y_i)$ is one of a set of $n$ observations
- Gauss-Newton method: use a Taylor series expansion to linearize the original equation

Nonlinear Regression
- Expand $f$ in a Taylor series about the current parameter values and truncate the higher-order terms
- $j$ = the current guess; $j+1$ = the prediction (improved guess):

$$f(x_i)_{j+1} \approx f(x_i)_j + \sum_{k=0}^{m} \frac{\partial f(x_i)_j}{\partial a_k}\,\Delta a_k$$

Nonlinear Regression
- Substitute the truncated Taylor series into the original model $y_i = f(x_i) + e_i$, so the residual equation becomes linear in the corrections $\Delta a_k$
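Written out for a two-parameter model (reconstructed in standard textbook notation), the linearized equation at each data point is:

$$y_i - f(x_i)_j = \frac{\partial f(x_i)_j}{\partial a_0}\,\Delta a_0 + \frac{\partial f(x_i)_j}{\partial a_1}\,\Delta a_1 + e_i$$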

Gauss-Newton Method
- Collect all $n$ linearized equations into matrix form: $\{D\} = [Z_j]\{\Delta A\} + \{E\}$
- where $[Z_j]$ is the matrix of partial derivatives $\partial f(x_i)_j / \partial a_k$ evaluated at the current guess, $\{D\}$ holds the differences $d_i = y_i - f(x_i)_j$, and $\{\Delta A\}$ holds the parameter corrections

Gauss-Newton Method
- Use the same least-squares approach: minimize the sum of squares of the residuals $e$
- Get $\{\Delta A\}$ from the normal equations $[[Z_j]^{\mathsf T}[Z_j]]\{\Delta A\} = \{[Z_j]^{\mathsf T}\{D\}\}$
- Update $a_0, a_1, \ldots, a_m$ with $\{\Delta A\}$ and repeat the procedure until convergence is reached
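A minimal Gauss-Newton sketch for a two-parameter model. The model $f(x) = a_0 e^{-a_1 x} \cos x$ is a hypothetical damped sinusoid chosen for illustration, since the slide's actual model equation did not survive transcription:

function a = gauss_newton_sketch(x, y, a, tol, itmax)
% Gauss-Newton iteration for the assumed model
%   f(x) = a(1)*exp(-a(2)*x).*cos(x)   (hypothetical damped sinusoid)
x = x(:); y = y(:); a = a(:);
for iter = 1:itmax
    f  = a(1)*exp(-a(2)*x).*cos(x);          % model at the current guess
    Z  = [ exp(-a(2)*x).*cos(x), ...         % df/da0
          -a(1)*x.*exp(-a(2)*x).*cos(x) ];   % df/da1
    d  = y - f;                              % residual vector {D}
    da = (Z'*Z) \ (Z'*d);                    % normal equations for {dA}
    a  = a + da;                             % update the parameters
    if norm(da) < tol                        % convergence test
        fprintf('Converged after %d iterations\n', iter);
        break
    end
end
end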

Example: Damped Sinusoid

function [x,y] = mass_spring
x = [ ];   % data values lost in transcription
y = [ ];

Model the data with a two-parameter nonlinear function $f(x; a_0, a_1)$ (the model equation on the slide did not survive transcription).

Gauss-Newton Method

» [x,y] = mass_spring;
» a = gauss_newton(x,y)
Enter the initial guesses [a0,a1] = [2,3]
Enter the tolerance tol =
Enter the maximum iteration number itmax = 50
n = 21

The routine prints iter, a0, a1, da0, da1 at each iteration, then reports "Gauss-Newton method has converged" along with the final coefficient vector a. Initial guess: a0 = 2, a1 = 3; 21 data points. (The iteration history did not survive transcription.)

Converged parameters a0 and a1 (the numeric values did not survive transcription).