Chapter 13. General Least-Squares and Nonlinear Regression Gab-Byung Chae

13.1 Polynomial Regression
Some engineering data, although exhibiting a marked pattern, is poorly represented by a straight line. As discussed in Chap. 12, one method to fit such data is to use transformations. Another alternative is to fit polynomials to the data using polynomial regression.

Least Squares Regression
- Minimize some measure of the difference between the approximating function and the given data points.
- In the least-squares method, the error is measured as E = \sum_{i=1}^{n} [y_i - f(x_i)]^2.
- The minimum of E occurs when the partial derivatives of E with respect to each of the variables are 0.

Linear Least Squares Regression
- f(x) is in a linear form: f(x) = ax + b.
- The error: E = \sum_{i=1}^{n} (y_i - a x_i - b)^2.
- E is minimized when \partial E/\partial a = 0 and \partial E/\partial b = 0, which yields the normal equations
  a \sum x_i^2 + b \sum x_i = \sum x_i y_i
  a \sum x_i + b n = \sum y_i
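A minimal MATLAB sketch of solving these two normal equations (the data vectors here are hypothetical, for illustration only):

% Solve the normal equations for f(x) = a*x + b.
x = [1 2 3 4 5]';                    % hypothetical data
y = [1.2 1.9 3.1 3.9 5.2]';
n = length(x);
N = [sum(x.^2) sum(x); sum(x) n];    % normal-equation matrix
r = [sum(x.*y); sum(y)];             % right-hand side
ab = N\r;                            % ab(1) = slope a, ab(2) = intercept b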

Quadratic Least Squares Approximation
- f(x) is in a quadratic form: f(x) = a x^2 + b x + c.
- The error: E = \sum_{i=1}^{n} (y_i - a x_i^2 - b x_i - c)^2.
- E is minimized when the partial derivatives of E with respect to a, b, and c are all zero, which yields the normal equations
  a \sum x_i^4 + b \sum x_i^3 + c \sum x_i^2 = \sum x_i^2 y_i
  a \sum x_i^3 + b \sum x_i^2 + c \sum x_i = \sum x_i y_i
  a \sum x_i^2 + b \sum x_i + c n = \sum y_i

Cubic Least Squares Approximation
- f(x) is in a cubic form: f(x) = a x^3 + b x^2 + c x + d.
- The error: E = \sum_{i=1}^{n} (y_i - a x_i^3 - b x_i^2 - c x_i - d)^2.
- E is minimized when the four partial derivatives of E with respect to a, b, c, and d are all zero, which yields four normal equations.
- This case can be easily extended to an mth-order polynomial.

- Determining the coefficients of an mth-order polynomial is equivalent to solving a system of m + 1 simultaneous linear equations.
- The standard error is formulated as s_{y/x} = \sqrt{S_r / (n - (m + 1))}, because (m + 1) data-derived coefficients a_0, a_1, ..., a_m were used to compute S_r.
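A minimal MATLAB sketch of building and solving this (m+1)-by-(m+1) system, assuming x and y are given column vectors of data:

% Build and solve the normal equations for an mth-order polynomial.
% Entry N(i+1,j+1) is sum(x.^(i+j)); the right-hand side uses sum(x.^i .* y).
m = 2;                               % polynomial order (for example)
N = zeros(m+1); r = zeros(m+1, 1);
for i = 0:m
    for j = 0:m
        N(i+1, j+1) = sum(x.^(i+j));
    end
    r(i+1) = sum(x.^i .* y);
end
a = N\r;                             % a(1) = a0, a(2) = a1, ..., a(m+1) = am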

Example 13.1 Fit a second-order polynomial to the data in the first two columns of Table 13.1:
  x: 0, 1, 2, 3, 4, 5
  y: 2.1, 7.7, 13.6, 27.2, 40.9, 61.1

Solution:

>> N = [6 15 55; 15 55 225; 55 225 979];
>> r = [152.6; 585.6; 2488.8];
>> a = N\r
a =
    2.4786
    2.3593
    1.8607

- Sum of the squares of the residuals between the data points (y_i) and the mean: S_t = \sum (y_i - \bar{y})^2.
- Sum of the squares of the residuals between the data points (y_i) and the regression curve: S_r = \sum (y_i - f(x_i))^2.
- The standard error: s_{y/x} = \sqrt{S_r / (n - (m + 1))}.
- The coefficient of determination: r^2 = (S_t - S_r) / S_t.
- The correlation coefficient: r = \sqrt{r^2}.

Figure 13.2 Fit of a second-order polynomial.

13.2 Multiple Linear Regression
- An extension of linear regression: y is a linear function of two or more independent variables, e.g. y = a_0 + a_1 x_1 + a_2 x_2 + e.
- For this two-dimensional case, the regression "line" becomes a plane (Fig. 13.3).
- The sum of the squares of the residuals: S_r = \sum_{i=1}^{n} (y_i - a_0 - a_1 x_{1,i} - a_2 x_{2,i})^2.
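In MATLAB, a minimal sketch of fitting this plane by least squares (x1, x2, and y are assumed column vectors of equal length; the names are illustrative):

% Fit y = a0 + a1*x1 + a2*x2 by solving the normal equations.
Z = [ones(size(x1)) x1 x2];   % design matrix: constant, x1, x2
a = (Z'*Z)\(Z'*y);            % a = [a0; a1; a2]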

Figure 13.3 Graphical depiction of multiple linear regression where y is a linear function of x_1 and x_2.

Example 13.2 Multiple Linear Regression
- Use multiple linear regression to fit this data:
  x_1: 0, 2, 2.5, 1, 4, 7
  x_2: 0, 1, 2, 3, 6, 2
  y: 5, 10, 9, 0, 3, 27

Which gives us the normal equations
  [ 6      16.5   14 ] {a_0}   { 54    }
  [ 16.5   76.25  48 ] {a_1} = { 243.5 }
  [ 14     48     54 ] {a_2}   { 100   }
which can be solved for a_0 = 5, a_1 = 4, and a_2 = -3; that is, y = 5 + 4x_1 - 3x_2.

Extension to m dimensions
- The model extends directly to m independent variables: y = a_0 + a_1 x_1 + a_2 x_2 + ... + a_m x_m + e.
- Standard error: s_{y/x} = \sqrt{S_r / (n - (m + 1))}.
- Power equations of the form y = a_0 x_1^{a_1} x_2^{a_2} ... x_m^{a_m} can be linearized by taking logarithms, as sketched below.
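A minimal MATLAB sketch of this linearization for two variables, assuming column vectors x1, x2, and y of strictly positive data (the variable names are illustrative, not from the textbook):

% Linearize y = a0 * x1.^a1 .* x2.^a2 by taking logarithms,
% then fit the transformed model with multiple linear regression.
Z = [ones(size(x1)) log(x1) log(x2)];
c = (Z'*Z)\(Z'*log(y));                 % c = [log(a0); a1; a2]
a0 = exp(c(1)); a1 = c(2); a2 = c(3);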

13.3 General Linear Least Squares
- The general linear least-squares model: y = a_0 z_0 + a_1 z_1 + a_2 z_2 + ... + a_m z_m + e, where z_0, z_1, ..., z_m are m + 1 basis functions.
- When z_0 = 1, z_1 = x_1, z_2 = x_2, ..., z_m = x_m, we have simple or multiple linear regression.
- When z_0 = 1, z_1 = x, z_2 = x^2, ..., z_m = x^m, we have polynomial regression.

- The basis functions z_j can be highly nonlinear functions of the independent variable; the model is "linear" only in its coefficients.
- For example, a sinusoid: y = a_0 + a_1 cos(wt) + a_2 sin(wt).
- Or any other fixed functions of x, as long as the coefficients a_j enter linearly.

- Equation (13.7) can be expressed in matrix notation as {y} = [Z]{a} + {e}, where the element Z_{ij} is the value of the jth basis function computed at the ith data point: Z_{ij} = z_j(x_i).
- [Z] has n rows and m + 1 columns, where m is the number of variables in the model and n is the number of data points.

- Because n > m + 1 in most cases, [Z] is not a square matrix, so {a} cannot be obtained by simply inverting [Z].

- Instead, minimize the sum of squares S_r = \sum_{i=1}^{n} (y_i - \sum_{j=0}^{m} a_j z_j(x_i))^2 by taking its partial derivative with respect to each of the coefficients and setting the resulting equations equal to zero. This yields the normal equations [Z]^T [Z] {a} = [Z]^T {y}, as used in the sketch below and in Example 13.3.
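As an illustration that the basis functions need not be powers of x, here is a minimal MATLAB sketch that fits the sinusoid model above by solving the normal equations (t, y, and the known angular frequency w are assumed inputs, not data from the textbook):

% Fit y = a0 + a1*cos(w*t) + a2*sin(w*t) by general linear least squares.
% t and y are assumed column vectors; w is a known angular frequency.
Z = [ones(size(t)) cos(w*t) sin(w*t)];   % design matrix of basis functions
a = (Z'*Z)\(Z'*y);                       % normal equations [Z]'[Z]{a} = [Z]'{y}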

Example 13.3 Polynomial Regression with MATLAB
Repeat Example 13.1:
>> x = [0 1 2 3 4 5]';
>> y = [2.1 7.7 13.6 27.2 40.9 61.1]';
>> Z = [ones(size(x)) x x.^2]
Z =
     1     0     0
     1     1     1
     1     2     4
     1     3     9
     1     4    16
     1     5    25

>> Z'*Z
ans =
     6    15    55
    15    55   225
    55   225   979
>> a = (Z'*Z)\(Z'*y)
a =
    2.4786
    2.3593
    1.8607

>> Sr = sum((y-Z*a).^2)
Sr =
    3.7466
>> r2 = 1 - Sr/sum((y-mean(y)).^2)
r2 =
    0.9985
>> Syx = sqrt(Sr/(length(x)-length(a)))
Syx =
    1.1175

13.4 QR Factorization and the Backslash Operator
- QR factorization and singular value decomposition are beyond the scope of this book, but MATLAB uses them internally: they underlie polyfit and the backslash operator.
- Both solve the general model {y} = [Z]{a} of Eq. (13.8):
>> x = [0 1 2 3 4 5]';
>> y = [2.1 7.7 13.6 27.2 40.9 61.1]';
>> z = [ones(size(x)) x x.^2];
>> a = polyfit(x, y, 2)
>> a = z\y
- Note that polyfit returns the coefficients in descending order of power, whereas z\y returns them in the column order of z (here, ascending).
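Although the factorizations themselves are beyond the book's scope, the same coefficients can be recovered explicitly. A minimal sketch using MATLAB's economy-size QR factorization of the design matrix z defined above:

>> [Q, R] = qr(z, 0);   % economy-size QR: z = Q*R, with R square and upper triangular
>> a = R\(Q'*y)         % same least-squares coefficients as z\y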

13.5 Nonlinear Regression
- Example of a model that is nonlinear in its coefficients: f(x) = a_0 (1 - e^{-a_1 x}) + e.
- The sum of the squares: f(a_0, a_1) = \sum_{i=1}^{n} [y_i - a_0 (1 - e^{-a_1 x_i})]^2.
- Find the a_0 and a_1 that minimize the function f.
- MATLAB's fminsearch function can be used for this purpose:
  [x, fval] = fminsearch(fun, x0, options, p1, p2, ...)

Example 13.4 Nonlinear Regression with MATLAB
- Recall Example 12.4 with Table 12.1 (force versus velocity data), where the transformed (log-log) fit gave F = 0.2741 v^{1.9842}.
- This time use nonlinear regression to fit F = a_0 v^{a_1} directly. Employ initial guesses of 1 for both coefficients.

Solution: write the sum of squared residuals as an M-file, then minimize it with fminsearch:

function f = fSSR(a, xm, ym)
yp = a(1)*xm.^a(2);
f = sum((ym-yp).^2);

>> x = [10 20 30 40 50 60 70 80];
>> y = [25 70 380 550 610 1220 830 1450];
>> fminsearch(@fSSR, [1, 1], [], x, y)
ans =
    2.5384    1.4359
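The same minimization can also be written without the extra-parameter syntax by capturing x and y in an anonymous function; a minimal sketch:

>> fun = @(a) sum((y - a(1)*x.^a(2)).^2);   % sum of squared residuals in a
>> [a, SSR] = fminsearch(fun, [1, 1])       % a is approximately [2.5384 1.4359]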

Figure 13.4 Comparison of transformed and untransformed model fits for force versus velocity data from Table 12.1.