Least-Squares Regression

Presentation transcript:

Chapter 17: Least-Squares Regression. Lecture notes by Dr. Rakhmad Arief Siregar, Universiti Malaysia Perlis. Applied Numerical Method for Engineers.

Curve Fitting

Simple Statistics

Regression. (Figure: experimental data shown with a least-squares fit and a polynomial fit.)

Linear Regression. The simplest example of a least-squares approximation is fitting a straight line to a set of paired observations: y = a0 + a1x + e, where a0 and a1 are coefficients representing the intercept and the slope, and e is the error, or residual, between the model and the observations.

Linear Regression. By rearranging: e = y − a0 − a1x. The error, or residual, e is the discrepancy between the true value of y and the approximate value a0 + a1x predicted by the linear equation.

Criteria for a “Best” Fit. One criterion: minimize the sum of the residual errors, Σei = Σ(yi − a0 − a1xi), where n is the total number of points. This criterion is inadequate, because positive and negative errors can cancel.

Criteria for a “Best” Fit. Another criterion: minimize the sum of the absolute values of the residual errors, Σ|ei| = Σ|yi − a0 − a1xi|, where n is the total number of points.

Criteria for a “Best” Fit. The best strategy: minimize the sum of the squares of the residuals between the measured y and the y calculated with the linear model: Sr = Σei² = Σ(yi − a0 − a1xi)².

Best fit. Three candidate criteria:
- Minimize the sum of the residuals
- Minimize the sum of the absolute values of the residuals
- Minimize the maximum error of any individual point

Least-Squares Fit of a Straight Line. Differentiating Sr with respect to each coefficient:
∂Sr/∂a0 = −2 Σ(yi − a0 − a1xi)
∂Sr/∂a1 = −2 Σ(yi − a0 − a1xi)xi

Least-Squares Fit of a Straight Line. After several mathematical steps (setting the derivatives to zero and solving the resulting normal equations), a0 and a1 are:
a1 = (n Σxiyi − Σxi Σyi) / (n Σxi² − (Σxi)²)
a0 = ȳ − a1·x̄
where ȳ and x̄ are the means of y and x, respectively.
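The closed-form formulas above translate directly into code. The sketch below is a minimal pure-Python implementation; the data points are made up for illustration and are not from the textbook examples.

```python
# Least-squares fit of a straight line y = a0 + a1*x, using the closed-form
# formulas above. The data points are illustrative, not from the textbook.
def fit_line(x, y):
    """Return (a0, a1) minimizing the sum of squared residuals."""
    n = len(x)
    sx, sy = sum(x), sum(y)
    sxy = sum(xi * yi for xi, yi in zip(x, y))
    sxx = sum(xi * xi for xi in x)
    a1 = (n * sxy - sx * sy) / (n * sxx - sx * sx)   # slope
    a0 = sy / n - a1 * sx / n                        # intercept: a0 = ybar - a1*xbar
    return a0, a1

x = [1, 2, 3, 4, 5]
y = [0.9, 3.1, 5.0, 6.9, 9.1]    # roughly y = -1 + 2x
a0, a1 = fit_line(x, y)
print(a0, a1)                    # about -1.06 and 2.02
```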

Ex. 17.1 Fit a straight line to the x and y values in the first two columns of Table below.

Ex. 17.1 The following quantities can be computed

Ex. 17.1 a1 and a0 can be computed:

Ex. 17.1 The least-squares fit is:

Problem 17.4 Use least-squares regression to fit a straight line to:

Problem 17.4

Quantification of Error of Linear Regression. The sum of the squares of the residual errors: Sr = Σei² = Σ(yi − a0 − a1xi)².

Quantification of Error of Linear Regression. If the residuals have a similar spread along the range of the data and are normally distributed, a “standard deviation” for the regression line can be determined as sy/x = sqrt(Sr/(n − 2)), where sy/x is called the standard error of the estimate. The subscript y/x signifies that the error is for a predicted value of y corresponding to a particular value of x.

Quantification of Error of Linear Regression. The total sum of squares, St = Σ(yi − ȳ)², measures the spread of the data around the mean; Sr measures the spread of the data around the best-fit line.

Quantification of Error of Linear Regression. (Figure: a fit with small residual errors compared with one with large residual errors.)

Quantification of Error of Linear Regression. The difference between the two quantities, St − Sr, quantifies the improvement, or error reduction, obtained by describing the data in terms of a straight line rather than an average value. Normalizing the difference by St yields r² = (St − Sr)/St, where r² is the coefficient of determination and r is the correlation coefficient.
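All of these error measures are easy to compute once the line is fitted. A minimal sketch, assuming illustrative data and its least-squares coefficients a0 = −1.06, a1 = 2.02 (neither the data nor the coefficients are from the textbook):

```python
import math

# St, Sr, the standard deviations, and r^2 for a fitted straight line.
# The data and coefficients below are illustrative, not from the textbook.
def regression_stats(x, y, a0, a1):
    n = len(x)
    ybar = sum(y) / n
    St = sum((yi - ybar) ** 2 for yi in y)                      # spread about the mean
    Sr = sum((yi - a0 - a1 * xi) ** 2 for xi, yi in zip(x, y))  # spread about the line
    sy = math.sqrt(St / (n - 1))       # total standard deviation
    syx = math.sqrt(Sr / (n - 2))      # standard error of the estimate
    r2 = (St - Sr) / St                # coefficient of determination
    return sy, syx, r2

x = [1, 2, 3, 4, 5]
y = [0.9, 3.1, 5.0, 6.9, 9.1]
sy, syx, r2 = regression_stats(x, y, -1.06, 2.02)  # least-squares line for these points
print(sy, syx, r2)
```

Because sy/x is much smaller than sy here, r² is close to 1 and the linear model explains almost all of the variation.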

Ex. 17.2 Compute the total standard deviation, the standard error of the estimate and the correlation coefficient for the data in Ex. 17.1

Ex. 17.2 Solution. Compute the standard deviation sy and the standard error of estimate sy/x. The extent of the improvement is quantified: because sy/x < sy, the linear regression model has merit.

Ex. 17.2 Solution. The correlation coefficient r (with r² = 0.868) indicates that 86.8 percent of the original uncertainty has been explained by the linear model.

Linearization of Nonlinear Relationships. Linear regression provides a powerful technique for fitting a best line to data. But what about data such as those shown below, which do not follow a straight-line pattern?

Linearization of Nonlinear Relationships. Transformations can be used to express the data in a form that is compatible with linear regression. Exponential equation: y = α1·e^(β1·x). Taking the natural logarithm gives ln y = ln α1 + β1·x, a straight line with slope β1 and intercept ln α1.

Linearization of Nonlinear Relationships. Power equation: y = α2·x^β2. Taking the base-10 logarithm gives log y = log α2 + β2·log x, a straight line with slope β2 and intercept log α2.

Linearization of Nonlinear Relationships. The saturation-growth-rate equation: y = α3·x/(β3 + x). Inverting gives 1/y = 1/α3 + (β3/α3)·(1/x), a straight line with slope β3/α3 and intercept 1/α3.
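Each of these transformations reduces a nonlinear fit to the straight-line machinery already developed. A sketch for the power-equation case, with synthetic data generated from y = 2·x^1.5 (an assumed example, not the textbook's), so the fit should recover α2 = 2 and β2 = 1.5:

```python
import math

# Linearization sketch: fit y = alpha * x**beta by regressing log10(y) on log10(x).
# The data are synthetic, generated from y = 2 * x**1.5 (an assumed example).
x = [1.0, 2.0, 3.0, 4.0, 5.0]
y = [2.0 * xi ** 1.5 for xi in x]

lx = [math.log10(xi) for xi in x]
ly = [math.log10(yi) for yi in y]

n = len(lx)
sx, sy = sum(lx), sum(ly)
sxy = sum(a * b for a, b in zip(lx, ly))
sxx = sum(a * a for a in lx)
beta = (n * sxy - sx * sy) / (n * sxx - sx * sx)   # slope of log-log line -> exponent
alpha = 10 ** (sy / n - beta * sx / n)             # 10**intercept -> coefficient
print(alpha, beta)                                 # about 2.0 and 1.5
```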

Ex. 17.4. Fit the power equation y = α2·x^β2 to the data in Table 17.3 using a logarithmic transformation of the data.

Ex. 17.4. The straight-line fit in the transformed (log-log) coordinates has intercept log α2 and slope β2; transforming back recovers the power-equation coefficients.

Polynomial Regression

Polynomial Regression. The least-squares procedure can be extended to fit the data to a higher-order polynomial, for example a second-order polynomial: y = a0 + a1x + a2x² + e.

Polynomial Regression. Differentiating the sum of the squares of the residuals, Sr = Σ(yi − a0 − a1xi − a2xi²)², with respect to each unknown coefficient of the polynomial gives:
∂Sr/∂a0 = −2 Σ(yi − a0 − a1xi − a2xi²)
∂Sr/∂a1 = −2 Σ xi(yi − a0 − a1xi − a2xi²)
∂Sr/∂a2 = −2 Σ xi²(yi − a0 − a1xi − a2xi²)

Polynomial Regression. The derivatives can be set equal to zero and rearranged as the normal equations:
(n)a0 + (Σxi)a1 + (Σxi²)a2 = Σyi
(Σxi)a0 + (Σxi²)a1 + (Σxi³)a2 = Σxiyi
(Σxi²)a0 + (Σxi³)a1 + (Σxi⁴)a2 = Σxi²yi
How to solve it?

Polynomial Regression. In matrix form, the normal equations become a linear system [A]{a} = {b} in the unknown coefficients. What method can be used to solve it?

Polynomial Regression. The quadratic case can be easily extended to an m-th order polynomial: y = a0 + a1x + a2x² + … + amx^m + e. The standard error for this case is sy/x = sqrt(Sr/(n − (m + 1))), since m + 1 coefficients were estimated from the data.
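The normal equations for an m-th order polynomial can be assembled and solved with Gauss elimination. A minimal sketch with synthetic data generated from y = 1 + 2x + 3x² (not the textbook's table):

```python
# Polynomial regression sketch: build the normal equations for an m-th order
# polynomial and solve them with naive Gauss elimination (no pivoting).
# The data are synthetic, generated from y = 1 + 2x + 3x^2.
def polyfit_normal(x, y, m):
    # Normal-equation matrix A[i][j] = sum(x**(i+j)); rhs b[i] = sum(y * x**i)
    A = [[sum(xi ** (i + j) for xi in x) for j in range(m + 1)] for i in range(m + 1)]
    b = [sum(yi * xi ** i for xi, yi in zip(x, y)) for i in range(m + 1)]
    # Forward elimination
    for k in range(m):
        for i in range(k + 1, m + 1):
            f = A[i][k] / A[k][k]
            for j in range(k, m + 1):
                A[i][j] -= f * A[k][j]
            b[i] -= f * b[k]
    # Back substitution
    a = [0.0] * (m + 1)
    for i in range(m, -1, -1):
        a[i] = (b[i] - sum(A[i][j] * a[j] for j in range(i + 1, m + 1))) / A[i][i]
    return a

x = [0, 1, 2, 3, 4, 5]
y = [1 + 2 * xi + 3 * xi ** 2 for xi in x]
print(polyfit_normal(x, y, 2))   # coefficients close to [1, 2, 3]
```

For large m the normal equations become ill-conditioned, which is why library routines use orthogonal factorizations instead; for the low orders in this chapter, Gauss elimination is adequate.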

Ex. 17.5 Fit a second-order polynomial to the data in the table below:

Ex. 17.5 Solution: m=2, n=6

Ex. 17.5 Solution: The simultaneous linear equations are:

Ex. 17.5 Solution: Gauss elimination yields a0 = 2.47857, a1 = 2.35929 and a2 = 1.86071. The least-squares quadratic equation is therefore: y = 2.47857 + 2.35929x + 1.86071x².

Ex. 17.5 The standard error:

Ex. 17.5 The coefficient of determination:

Ex. 17.5 99.851% of the original uncertainty has been explained by the model.

Assignment 3 Do Problems 17.5, 17.6, 17.7, 17.10 and 17.12 Submit next week

Multiple Linear Regression. A useful extension of linear regression is the case where y is a linear function of two or more independent variables, for example y = a0 + a1x1 + a2x2 + e. For this two-dimensional case, the regression line becomes a plane.

Multiple Linear Regression. As before, the least-squares procedure determines the best coefficients by minimizing the sum of the squares of the residuals.

Multiple Linear Regression. Differentiating the sum of the squares of the residuals with respect to each unknown coefficient gives:

Multiple Linear Regression. The derivatives can be set equal to zero and rearranged in matrix form as the normal equations:
[ n       Σx1i      Σx2i    ] [a0]   [ Σyi    ]
[ Σx1i    Σx1i²     Σx1ix2i ] [a1] = [ Σx1iyi ]
[ Σx2i    Σx1ix2i   Σx2i²   ] [a2]   [ Σx2iyi ]

Multiple Linear Regression. The two-dimensional case can be easily extended to m dimensions: y = a0 + a1x1 + a2x2 + … + amxm + e. The standard error for this case is sy/x = sqrt(Sr/(n − (m + 1))).

Ex. 17.6 The following data were calculated from the equation y = 5 + 4x1 − 3x2. Use multiple linear regression to fit these data.

Ex. 17.6

Ex. 17.6 Solution

Ex. 17.6 Solution: a0 = 5, a1 = 4 and a2 = −3, recovering the original equation.
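A fit of this kind is easy to verify in code. In the sketch below the (x1, x2) points are made up, and y is generated exactly from y = 5 + 4x1 − 3x2, so the regression must recover a0 = 5, a1 = 4, a2 = −3:

```python
# Multiple linear regression sketch for y = a0 + a1*x1 + a2*x2.
# The (x1, x2) points are illustrative; y is generated from y = 5 + 4*x1 - 3*x2.
x1 = [0, 2, 2.5, 1, 4, 7]
x2 = [0, 1, 2, 3, 6, 2]
y = [5 + 4 * p - 3 * q for p, q in zip(x1, x2)]

n = len(y)
cols = [[1.0] * n, x1, x2]   # columns of the design matrix Z: [1, x1, x2]
# Normal equations [Z^T Z]{a} = {Z^T y}
A = [[sum(ci * cj for ci, cj in zip(c1, c2)) for c2 in cols] for c1 in cols]
b = [sum(ci * yi for ci, yi in zip(c, y)) for c in cols]

# Naive Gauss elimination with back substitution on the 3x3 system
for k in range(2):
    for i in range(k + 1, 3):
        f = A[i][k] / A[k][k]
        for j in range(k, 3):
            A[i][j] -= f * A[k][j]
        b[i] -= f * b[k]
a = [0.0] * 3
for i in range(2, -1, -1):
    a[i] = (b[i] - sum(A[i][j] * a[j] for j in range(i + 1, 3))) / A[i][i]
print(a)   # close to [5.0, 4.0, -3.0]
```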

Problems 17.17 Use multiple linear regression to fit the data below. Compute the coefficients, the standard error of estimate and the correlation coefficient.

Problems 17.17

Problems 17.17 Solution

Nonlinear Regression. The Gauss-Newton method is one algorithm for minimizing the sum of the squares of the residuals between data and a nonlinear equation. For convenience, the model is written as yi = f(xi; a0, a1, …, am) + ei.

Nonlinear Regression. The nonlinear model can be expanded in a Taylor series around the parameter values and truncated after the first derivatives. For example, for a two-parameter case: f(xi)j+1 ≅ f(xi)j + [∂f(xi)j/∂a0]Δa0 + [∂f(xi)j/∂a1]Δa1, where j is the initial guess and j + 1 is the prediction.

Nonlinear Regression. The model needs to be linearized by substituting the truncated Taylor series into it, which yields: yi − f(xi)j = [∂f(xi)j/∂a0]Δa0 + [∂f(xi)j/∂a1]Δa1 + ei.

Nonlinear Regression. In matrix form: {D} = [Zj]{ΔA} + {E}, where [Zj] is the matrix of partial derivatives evaluated at the current guess, {D} contains the differences between the measurements and the model values, and {ΔA} contains the changes in the parameter values.

Nonlinear Regression. Applying least-squares theory to the linearized model yields the normal equations: [ [Zj]ᵀ[Zj] ]{ΔA} = { [Zj]ᵀ{D} }. Solving the above equation for {ΔA} lets us compute the updated values a0,j+1 = a0,j + Δa0 and a1,j+1 = a1,j + Δa1, and the procedure is repeated until the solution converges.
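The iteration can be sketched for a two-parameter case. The model f(x) = a0·(1 − exp(−a1·x)) and the data below are assumptions for illustration; each pass solves the 2×2 normal equations [Zjᵀ Zj]{ΔA} = {Zjᵀ D} and updates the parameters:

```python
import math

# Gauss-Newton sketch for the two-parameter model f(x; a0, a1) = a0*(1 - exp(-a1*x)).
# (The model and the data are assumptions for illustration.) Partial derivatives:
#   df/da0 = 1 - exp(-a1*x),   df/da1 = a0*x*exp(-a1*x)
def gauss_newton(x, y, a0, a1, iters=50):
    for _ in range(iters):
        z0 = [1 - math.exp(-a1 * xi) for xi in x]          # df/da0 column of [Zj]
        z1 = [a0 * xi * math.exp(-a1 * xi) for xi in x]    # df/da1 column of [Zj]
        d = [yi - a0 * (1 - math.exp(-a1 * xi)) for xi, yi in zip(x, y)]  # {D}
        # Normal equations [Zj^T Zj]{dA} = {Zj^T D}, solved directly for 2x2
        m00 = sum(v * v for v in z0)
        m01 = sum(u * v for u, v in zip(z0, z1))
        m11 = sum(v * v for v in z1)
        r0 = sum(u * v for u, v in zip(z0, d))
        r1 = sum(u * v for u, v in zip(z1, d))
        det = m00 * m11 - m01 * m01
        da0 = (m11 * r0 - m01 * r1) / det
        da1 = (m00 * r1 - m01 * r0) / det
        a0, a1 = a0 + da0, a1 + da1                        # parameter update
    return a0, a1

# Synthetic, noise-free data from a0 = 0.8, a1 = 1.7; the iteration should
# converge to those values from the initial guesses a0 = a1 = 1.0.
xs = [0.25, 0.75, 1.25, 1.75, 2.25]
ys = [0.8 * (1 - math.exp(-1.7 * xi)) for xi in xs]
print(gauss_newton(xs, ys, 1.0, 1.0))
```

Note that Gauss-Newton can diverge for poor initial guesses; production solvers add damping (Levenberg-Marquardt) for robustness.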

Ex. 17.9