Least-Squares Regression

1 Least-Squares Regression
Chapter 17 Least-Squares Regression Lecture Notes Dr. Rakhmad Arief Siregar Universiti Malaysia Perlis Applied Numerical Method for Engineers


3-5 Curve Fitting (figures illustrating approaches to fitting a curve to data)

6 Simple Statistics

7 Regression (figure: experimental data with a least-squares fit and a polynomial fit)

8 Linear Regression The simplest example of a least-squares approximation is fitting a straight line: y = a0 + a1x + e, where a0 and a1 are coefficients representing the intercept and the slope, and e is the error, or residual, between the model and the observations.

9 Linear Regression By rearranging:
e = y - a0 - a1x
The residual e is the discrepancy between the true value of y and the approximate value a0 + a1x predicted by the line.

10 Criteria for a “Best” Fit
One criterion is to minimize the sum of the residual errors for all the available data:
Σei = Σ(yi - a0 - a1xi),  i = 1, ..., n
where n is the total number of points.

11 Criteria for a “Best” Fit
Another criterion is to minimize the sum of the absolute values of the residual errors:
Σ|ei| = Σ|yi - a0 - a1xi|,  i = 1, ..., n
where n is the total number of points.

12 Criteria for a “Best” Fit
A third strategy is to minimize the sum of the squares of the residuals between the measured y and the y calculated with the linear model:
Sr = Σei^2 = Σ(yi - a0 - a1xi)^2
This criterion yields a unique line for a given set of data.

13 Best fit Three candidate criteria for a “best” fit:
- minimizes the sum of the residuals
- minimizes the sum of the absolute values of the residuals
- minimizes the maximum error of any individual point

14 Least-Squares Fit of a Straight Line
Differentiating Sr with respect to each unknown coefficient:
∂Sr/∂a0 = -2 Σ(yi - a0 - a1xi)
∂Sr/∂a1 = -2 Σ[(yi - a0 - a1xi)xi]

15 Least-Squares Fit of a Straight Line
Setting these derivatives equal to zero and solving the resulting normal equations yields:
a1 = (n Σxiyi - Σxi Σyi) / (n Σxi^2 - (Σxi)^2)
a0 = ȳ - a1x̄
where ȳ and x̄ are the means of y and x, respectively.
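A minimal Python sketch of these formulas (not part of the original slides; the name fit_line is illustrative):

    def fit_line(x, y):
        """Return (a0, a1) for the least-squares line y = a0 + a1*x."""
        n = len(x)
        sum_x = sum(x)
        sum_y = sum(y)
        sum_xy = sum(xi * yi for xi, yi in zip(x, y))
        sum_x2 = sum(xi * xi for xi in x)
        # Slope from the normal equations, intercept from the means.
        a1 = (n * sum_xy - sum_x * sum_y) / (n * sum_x2 - sum_x ** 2)
        a0 = sum_y / n - a1 * sum_x / n
        return a0, a1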

16 Ex. 17.1 Fit a straight line to the x and y values in the first two columns of the table below:
x: 1, 2, 3, 4, 5, 6, 7
y: 0.5, 2.5, 2.0, 4.0, 3.5, 6.0, 5.5

17 Ex. 17.1 The following quantities can be computed:
n = 7, Σxi = 28, Σyi = 24, Σxiyi = 119.5, Σxi^2 = 140
x̄ = 28/7 = 4, ȳ = 24/7 = 3.428571

18 Ex. 17.1 a1 and a0 can be computed:
a1 = [7(119.5) - 28(24)] / [7(140) - (28)^2] = 0.8392857
a0 = 3.428571 - 0.8392857(4) = 0.07142857

19 Ex. 17.1 The least-squares fit is: y = 0.07142857 + 0.8392857x
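The numbers can be reproduced with the fit_line sketch above (assuming the tabulated data of Ex. 17.1):

    x = [1, 2, 3, 4, 5, 6, 7]
    y = [0.5, 2.5, 2.0, 4.0, 3.5, 6.0, 5.5]
    a0, a1 = fit_line(x, y)
    print(a0, a1)  # approximately 0.0714286 and 0.8392857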

20 Problem 17.4 Use least-squares regression to fit a straight line to:

21 Problem 17.4

22 Quantification of Error of Linear Regression
The sum of the squares of the residuals:
Sr = Σei^2 = Σ(yi - a0 - a1xi)^2

23 Quantification of Error of Linear Regression
If those criteria are met (the spread of the points around the line is of similar magnitude along the entire range of the data, and the distribution of the points about the line is normal), a “standard deviation” for the regression line can be determined as:
sy/x = sqrt(Sr / (n - 2))
where sy/x is called the standard error of the estimate. The subscript y/x signifies that the error is for a predicted value of y corresponding to a particular value of x.

24 Quantification of Error of Linear Regression
(figures: the spread of the data around the mean of the dependent variable, St = Σ(yi - ȳ)^2, versus the spread of the data around the best-fit line, Sr)

25 Quantification of Error of Linear Regression
(figures: examples of regressions with small residual errors and with large residual errors)

26 Quantification of Error of Linear Regression
The difference between the two quantities, St - Sr, quantifies the improvement or error reduction due to describing the data in terms of a straight line. The difference is normalized to St to yield:
r^2 = (St - Sr) / St
where r^2 is the coefficient of determination and r is the correlation coefficient.
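A short sketch of these error measures in Python (illustrative names, following the definitions above):

    import math

    def regression_error_stats(x, y, a0, a1):
        """Return (sy, sy_x, r2) for the line y = a0 + a1*x."""
        n = len(x)
        y_mean = sum(y) / n
        st = sum((yi - y_mean) ** 2 for yi in y)                    # spread around the mean
        sr = sum((yi - a0 - a1 * xi) ** 2 for xi, yi in zip(x, y))  # spread around the line
        sy = math.sqrt(st / (n - 1))      # total standard deviation
        sy_x = math.sqrt(sr / (n - 2))    # standard error of the estimate
        r2 = (st - sr) / st               # coefficient of determination
        return sy, sy_x, r2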

27 Ex. 17.2 Compute the total standard deviation, the standard error of the estimate, and the correlation coefficient for the data in Ex. 17.1.

28 Ex. 17.2 Solution
With St = 22.7143 and Sr = 2.9911:
Standard deviation: sy = sqrt(22.7143 / 6) = 1.9457
Standard error of estimate: sy/x = sqrt(2.9911 / 5) = 0.7735
The extent of the improvement is quantified: because sy/x < sy, the linear regression model has merit.

29 Ex. 17.2 Solution The correlation coefficient:
r^2 = (22.7143 - 2.9911) / 22.7143 = 0.868, so r = 0.932
These results indicate that 86.8 percent of the original uncertainty has been explained by the linear model.

30 Linearization of Nonlinear Relationships
Linear regression provides a powerful technique for fitting a best line to data. But what about data like that shown below, which follows a curved pattern? (figure)

31 Linearization of Nonlinear Relationships
Transformations can be used to express the data in a form that is compatible with linear regression.
Exponential equation: y = α1 e^(β1 x). Taking the natural logarithm gives ln y = ln α1 + β1 x, a straight line with slope β1 and intercept ln α1.

32 Linearization of Nonlinear Relationships
Power equation: y = α2 x^β2. Taking the base-10 logarithm gives log y = β2 log x + log α2, a straight line with slope β2 and intercept log α2.

33 Linearization of Nonlinear Relationships
The saturation-growth-rate equation: y = α3 x / (β3 + x). Inverting gives 1/y = 1/α3 + (β3/α3)(1/x), a straight line with slope β3/α3 and intercept 1/α3 when 1/y is plotted against 1/x.
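As an illustration, the power-equation transformation can be coded by reusing the fit_line sketch from the straight-line section (a sketch, not the textbook's program):

    import math

    def fit_power(x, y):
        """Fit y = alpha2 * x**beta2 via linear regression on log10-transformed data."""
        lx = [math.log10(xi) for xi in x]
        ly = [math.log10(yi) for yi in y]
        log_alpha2, beta2 = fit_line(lx, ly)  # intercept = log10(alpha2), slope = beta2
        return 10 ** log_alpha2, beta2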


35 Ex. 17.4 Fit the power equation y = α2 x^β2 to the data in Table 17.3 using a logarithmic transformation of the data.

36 Ex. 17.4 (figure: the transformed data plotted on log-log axes; the fitted straight line has slope β2 and intercept log α2, from which the parameters of the power equation are recovered)

37 Polynomial Regression

38 Polynomial Regression
The least-squares procedure can be extended to fit the data to a higher-order polynomial. For a second-order polynomial, the model is y = a0 + a1x + a2x^2 + e, and the sum of the squares of the residuals is Sr = Σ(yi - a0 - a1xi - a2xi^2)^2.

39 Polynomial Regression
Differentiating Sr with respect to each unknown coefficient of the polynomial:
∂Sr/∂a0 = -2 Σ(yi - a0 - a1xi - a2xi^2)
∂Sr/∂a1 = -2 Σ xi(yi - a0 - a1xi - a2xi^2)
∂Sr/∂a2 = -2 Σ xi^2(yi - a0 - a1xi - a2xi^2)

40 Polynomial Regression
The derivatives can be set equal to zero and rearranged as the normal equations:
n a0 + (Σxi) a1 + (Σxi^2) a2 = Σyi
(Σxi) a0 + (Σxi^2) a1 + (Σxi^3) a2 = Σxiyi
(Σxi^2) a0 + (Σxi^3) a1 + (Σxi^4) a2 = Σxi^2 yi
How to solve it?

41 Polynomial Regression
In matrix form:
[ n       Σxi     Σxi^2 ] [a0]   [ Σyi      ]
[ Σxi     Σxi^2   Σxi^3 ] [a1] = [ Σxiyi    ]
[ Σxi^2   Σxi^3   Σxi^4 ] [a2]   [ Σxi^2 yi ]
This is a set of three simultaneous linear equations. What method can be used to solve it?

42 Polynomial Regression
The second-order case can be easily extended to an m-th order polynomial as y = a0 + a1x + a2x^2 + ... + amx^m + e. The standard error for this case is formulated as sy/x = sqrt(Sr / (n - (m + 1))), since m + 1 coefficients are estimated from the n data points.
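A Python sketch of this procedure, building the normal equations and solving them by Gauss elimination as the slides suggest (names are illustrative; for high orders or ill-conditioned data a library solver would be preferable):

    def gauss_solve(A, b):
        """Solve A a = b by Gauss elimination with partial pivoting."""
        n = len(b)
        for k in range(n):
            p = max(range(k, n), key=lambda i: abs(A[i][k]))  # pivot row
            A[k], A[p] = A[p], A[k]
            b[k], b[p] = b[p], b[k]
            for i in range(k + 1, n):
                f = A[i][k] / A[k][k]
                for j in range(k, n):
                    A[i][j] -= f * A[k][j]
                b[i] -= f * b[k]
        a = [0.0] * n
        for i in range(n - 1, -1, -1):
            a[i] = (b[i] - sum(A[i][j] * a[j] for j in range(i + 1, n))) / A[i][i]
        return a

    def fit_polynomial(x, y, m):
        """Least-squares coefficients [a0, ..., am] of an m-th order polynomial."""
        # Normal-equation matrix: entry (i, j) is the sum of x**(i + j).
        A = [[float(sum(xi ** (i + j) for xi in x)) for j in range(m + 1)]
             for i in range(m + 1)]
        b = [float(sum(xi ** i * yi for xi, yi in zip(x, y))) for i in range(m + 1)]
        return gauss_solve(A, b)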

43 Ex. 17.5 Fit a second-order polynomial to the data in the table below:
x: 0, 1, 2, 3, 4, 5
y: 2.1, 7.7, 13.6, 27.2, 40.9, 61.1

44 Ex. 17.5 Solution: m = 2, n = 6, and the required sums are:
Σxi = 15, Σyi = 152.6, Σxi^2 = 55, Σxi^3 = 225, Σxi^4 = 979
Σxiyi = 585.6, Σxi^2 yi = 2488.8, x̄ = 2.5, ȳ = 25.433

45 Ex. 17.5 Solution: The simultaneous linear equations are:
6 a0 + 15 a1 + 55 a2 = 152.6
15 a0 + 55 a1 + 225 a2 = 585.6
55 a0 + 225 a1 + 979 a2 = 2488.8

46 Ex. 17.5 Solution: Using Gauss elimination yields:
a0 = 2.47857, a1 = 2.35929 and a2 = 1.86071
The least-squares quadratic equation: y = 2.47857 + 2.35929x + 1.86071x^2
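Assuming the data tabulated above, the coefficients can be checked with the fit_polynomial sketch:

    x = [0, 1, 2, 3, 4, 5]
    y = [2.1, 7.7, 13.6, 27.2, 40.9, 61.1]
    a0, a1, a2 = fit_polynomial(x, y, 2)
    print(a0, a1, a2)  # approximately 2.47857, 2.35929, 1.86071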

47 Ex. 17.5 The standard error: sy/x = sqrt(Sr / (n - (m + 1))) = sqrt(3.74657 / 3) = 1.12

48 Ex. 17.5 The coefficient of determination: r^2 = (St - Sr) / St = (2513.39 - 3.74657) / 2513.39 = 0.99851, so r = 0.99925

49 Ex. 17.5 99.851% of the original uncertainty has been explained by the model.

50 Assignment 3 Do Problems 17.5, 17.6, 17.7, 17.10 and 17.12
Submit next week

51 Multiple Linear Regression
A useful extension of linear regression is the case where y is a linear function of two or more independent variables. For the two-dimensional case considered in this section, the regression line becomes a plane: y = a0 + a1x1 + a2x2 + e.

52 Multiple Linear Regression
The least-squares procedure can be used to fit this model by minimizing the sum of the squares of the residuals: Sr = Σ(yi - a0 - a1x1,i - a2x2,i)^2.

53 Multiple Linear Regression
Differentiating Sr with respect to each unknown coefficient:
∂Sr/∂a0 = -2 Σ(yi - a0 - a1x1,i - a2x2,i)
∂Sr/∂a1 = -2 Σ x1,i(yi - a0 - a1x1,i - a2x2,i)
∂Sr/∂a2 = -2 Σ x2,i(yi - a0 - a1x1,i - a2x2,i)

54 Multiple Linear Regression
The derivatives can be set equal to zero and rearranged in matrix form as the normal equations:
[ n        Σx1,i       Σx2,i     ] [a0]   [ Σyi     ]
[ Σx1,i    Σx1,i^2     Σx1,ix2,i ] [a1] = [ Σx1,iyi ]
[ Σx2,i    Σx1,ix2,i   Σx2,i^2   ] [a2]   [ Σx2,iyi ]

55 Multiple Linear Regression
The two-dimensional case can be easily extended to m dimensions as y = a0 + a1x1 + a2x2 + ... + amxm + e. The standard error for this case is formulated as sy/x = sqrt(Sr / (n - (m + 1))).
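A sketch of multiple linear regression via the same normal-equations approach (reuses gauss_solve from the polynomial sketch; names are illustrative):

    def fit_multiple_linear(X, y):
        """Least-squares [a0, a1, ..., am] for y = a0 + a1*x1 + ... + am*xm.
        X is a list of observations, each a list [x1, ..., xm]."""
        Z = [[1.0] + list(row) for row in X]  # leading 1 carries the intercept a0
        p = len(Z[0])
        n = len(Z)
        # Normal equations: (Z^T Z) a = Z^T y.
        A = [[sum(Z[k][i] * Z[k][j] for k in range(n)) for j in range(p)]
             for i in range(p)]
        b = [sum(Z[k][i] * y[k] for k in range(n)) for i in range(p)]
        return gauss_solve(A, b)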

56 Ex. 17.6 The following data were calculated from the equation y = 5 + 4x1 - 3x2.
Use multiple linear regression to fit these data.

57 Ex. 17.6
x1: 0, 2, 2.5, 1, 4, 7
x2: 0, 1, 2, 3, 6, 2
y: 5, 10, 9, 0, 3, 27

58 Ex. 17.6 Solution
The required sums give the normal equations:
[ 6      16.5    14 ] [a0]   [ 54    ]
[ 16.5   76.25   48 ] [a1] = [ 243.5 ]
[ 14     48      54 ] [a2]   [ 100   ]

59 Ex. 17.6 Solution: Solving the normal equations gives a0 = 5, a1 = 4 and a2 = -3, recovering the generating equation y = 5 + 4x1 - 3x2.
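With the data assumed above, the fit_multiple_linear sketch reproduces these coefficients:

    X = [[0, 0], [2, 1], [2.5, 2], [1, 3], [4, 6], [7, 2]]
    y = [5, 10, 9, 0, 3, 27]
    print(fit_multiple_linear(X, y))  # approximately [5.0, 4.0, -3.0]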

60 Problems 17.17 Use multiple linear regression to fit the given data.
Compute the coefficients, the standard error of the estimate, and the correlation coefficient.

61 Problems 17.17

62 Problems 17.17 Solution

63 Nonlinear Regression The Gauss-Newton method is one algorithm for minimizing the sum of the squares of the residuals between data and a nonlinear equation:
yi = f(xi; a0, a1, ..., am) + ei
For convenience, the model is abbreviated by omitting the parameters: yi = f(xi) + ei.

64 Nonlinear Regression The nonlinear model can be expanded in a Taylor series around the parameter values and curtailed after the first derivative. For example, for a two-parameter case:
f(xi)j+1 = f(xi)j + (∂f(xi)/∂a0)Δa0 + (∂f(xi)/∂a1)Δa1
where j is the initial guess, j+1 is the prediction, Δa0 = a0,j+1 - a0,j and Δa1 = a1,j+1 - a1,j.

65 Nonlinear Regression The model is linearized by substituting the Taylor expansion into yi = f(xi) + ei. This yields:
yi - f(xi)j = (∂f(xi)/∂a0)Δa0 + (∂f(xi)/∂a1)Δa1 + ei

66 Nonlinear Regression In matrix form:
{D} = [Zj]{ΔA} + {E}
where [Zj] is the matrix of partial derivatives of the function evaluated at the initial guess j, {D} contains the differences yi - f(xi)j, {ΔA} contains the changes in the parameter values, and {E} contains the residuals.

67 Nonlinear Regression Applying least-squares theory to the matrix equation yields the normal equations:
[Zj]^T [Zj] {ΔA} = [Zj]^T {D}
Solving the above equation for {ΔA}, we can compute improved values for the parameters:
a0,j+1 = a0,j + Δa0 and a1,j+1 = a1,j + Δa1
The procedure is repeated until the solution converges.
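A minimal Gauss-Newton sketch following these equations (reuses gauss_solve; names are illustrative, and a fixed iteration count stands in for a proper convergence test):

    import math

    def gauss_newton(f, dfs, x, y, a, iters=20):
        """Fit model f(xi, a); dfs holds one partial-derivative function per parameter."""
        for _ in range(iters):
            Z = [[df(xi, a) for df in dfs] for xi in x]  # Jacobian at current a
            D = [yi - f(xi, a) for xi, yi in zip(x, y)]  # residual vector
            p = len(a)
            # Normal equations: (Z^T Z) dA = Z^T D.
            A = [[sum(Z[k][i] * Z[k][j] for k in range(len(x))) for j in range(p)]
                 for i in range(p)]
            b = [sum(Z[k][i] * D[k] for k in range(len(x))) for i in range(p)]
            dA = gauss_solve(A, b)
            a = [ai + dai for ai, dai in zip(a, dA)]
        return a

For a two-parameter saturation-type model f(x) = a0(1 - e^(-a1 x)), for example, the required functions would be:

    f = lambda xi, a: a[0] * (1 - math.exp(-a[1] * xi))
    dfs = [lambda xi, a: 1 - math.exp(-a[1] * xi),
           lambda xi, a: a[0] * xi * math.exp(-a[1] * xi)]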

68 Ex. 17.9

