Linear regression: fitting a straight line to observations
Equation for a straight line: y = a0 + a1*x + e. The difference between an observation and the line, e_i = y_i - a0 - a1*x_i, is the residual or error.
The goal in linear regression is to minimize the sum of the squared residuals, Sr = sum(e_i^2) = sum((y_i - a0 - a1*x_i)^2). To find the minimum, take derivatives with respect to a0 and a1 and set them to zero:
dSr/da0 = -2 * sum(y_i - a0 - a1*x_i) = 0
dSr/da1 = -2 * sum(x_i * (y_i - a0 - a1*x_i)) = 0
Some algebra yields the normal equations:
n*a0 + (sum x_i)*a1 = sum y_i
(sum x_i)*a0 + (sum x_i^2)*a1 = sum(x_i*y_i)
Solving these simultaneously gives the least-squares linear regression coefficients:
a1 = (n*sum(x_i*y_i) - sum(x_i)*sum(y_i)) / (n*sum(x_i^2) - (sum x_i)^2)
a0 = ybar - a1*xbar
Example: a worked numerical computation of a0 and a1 (the values did not survive extraction).
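The coefficient formulas above can be sketched directly in Python; the data points here are made up for illustration and are not the slides' example:

```python
# Least-squares fit of y = a0 + a1*x via the normal-equation formulas
def linear_fit(x, y):
    n = len(x)
    sx, sy = sum(x), sum(y)
    sxx = sum(xi * xi for xi in x)
    sxy = sum(xi * yi for xi, yi in zip(x, y))
    a1 = (n * sxy - sx * sy) / (n * sxx - sx * sx)  # slope
    a0 = sy / n - a1 * sx / n                       # intercept: ybar - a1*xbar
    return a0, a1

# Points lying exactly on y = 1 + 2x recover a0 = 1, a1 = 2
a0, a1 = linear_fit([0, 1, 2, 3], [1, 3, 5, 7])
```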
Error in linear regression: a0 and a1 are maximum likelihood estimates. The standard error of the estimate, s_y/x = sqrt(Sr / (n - 2)), quantifies the spread of the data around the regression line.
Another measure of goodness of fit is the coefficient of determination r^2 (or the correlation coefficient r): r^2 = (St - Sr) / St, where St = sum((y_i - ybar)^2) is the total sum of squares about the mean. Can also write
r = (n*sum(x_i*y_i) - sum(x_i)*sum(y_i)) / (sqrt(n*sum(x_i^2) - (sum x_i)^2) * sqrt(n*sum(y_i^2) - (sum y_i)^2))
For our example, s_y/x and r^2 follow from the same sums (the slide's numerical values did not survive extraction).
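These two goodness-of-fit measures can be sketched as a small Python function; the inputs here are illustrative, with a0 and a1 assumed to come from the normal equations:

```python
import math

# Standard error of estimate and r^2 for a fitted line y = a0 + a1*x
def fit_stats(x, y, a0, a1):
    n = len(x)
    ybar = sum(y) / n
    st = sum((yi - ybar) ** 2 for yi in y)                      # spread about the mean
    sr = sum((yi - a0 - a1 * xi) ** 2 for xi, yi in zip(x, y))  # spread about the line
    syx = math.sqrt(sr / (n - 2))                               # standard error of estimate
    r2 = (st - sr) / st                                         # coefficient of determination
    return syx, r2

# A perfect fit leaves no residual spread: syx = 0 and r^2 = 1
syx, r2 = fit_stats([0, 1, 2, 3], [1, 3, 5, 7], 1.0, 2.0)
```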
Linearization of nonlinear relationships: many nonlinear models can be transformed into a linear form and fit with linear regression. Common examples:
exponential y = alpha*e^(beta*x): ln y = ln(alpha) + beta*x
power y = alpha*x^beta: log y = log(alpha) + beta*log(x)
saturation-growth-rate y = alpha*x/(beta + x): 1/y = 1/alpha + (beta/alpha)*(1/x)
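As a sketch of the exponential case, regressing ln(y) on x recovers the model parameters; the data below are generated for illustration rather than taken from the slides:

```python
import math

# Fit y = alpha * exp(beta * x) by linear regression on ln(y) vs x
def exp_fit(x, y):
    z = [math.log(yi) for yi in y]  # ln y = ln(alpha) + beta*x, linear in x
    n = len(x)
    sx, sz = sum(x), sum(z)
    sxx = sum(xi * xi for xi in x)
    sxz = sum(xi * zi for xi, zi in zip(x, z))
    beta = (n * sxz - sx * sz) / (n * sxx - sx * sx)
    alpha = math.exp(sz / n - beta * sx / n)  # intercept is ln(alpha)
    return alpha, beta

# Data generated from y = 2*exp(0.5*x); the transform recovers both parameters
xs = [0, 1, 2, 3]
alpha, beta = exp_fit(xs, [2 * math.exp(0.5 * xi) for xi in xs])
```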
Polynomial regression extends linear regression to higher-order polynomials: y = a0 + a1*x + a2*x^2 + ... + am*x^m + e. The sum of squared residuals becomes Sr = sum((y_i - a0 - a1*x_i - ... - am*x_i^m)^2).
Take derivatives of Sr with respect to each coefficient to minimize it and set them equal to zero: dSr/da_k = -2 * sum(x_i^k * (y_i - a0 - a1*x_i - ... - am*x_i^m)) = 0 for k = 0, ..., m.
These m+1 equations can be written as a linear system in the coefficients; for m = 2:
[ n          sum x_i     sum x_i^2 ] [a0]   [ sum y_i       ]
[ sum x_i    sum x_i^2   sum x_i^3 ] [a1] = [ sum x_i*y_i   ]
[ sum x_i^2  sum x_i^3   sum x_i^4 ] [a2]   [ sum x_i^2*y_i ]
We can solve this system with any number of matrix methods, e.g. Gauss elimination, to obtain the coefficients of the best-fit curve. Example: a worked solution (the slide's numerical values did not survive extraction).
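The procedure can be sketched in Python: build the normal-equation matrix above for an m-th order polynomial and solve it with a library solver (standing in for Gauss elimination). The data are made up for illustration:

```python
import numpy as np

# Polynomial least squares: assemble and solve the normal equations
def poly_fit(x, y, m):
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    # A[j][k] = sum(x^(j+k)),  b[j] = sum(x^j * y)
    A = np.array([[np.sum(x ** (j + k)) for k in range(m + 1)]
                  for j in range(m + 1)])
    b = np.array([np.sum(x ** j * y) for j in range(m + 1)])
    return np.linalg.solve(A, b)  # coefficients a0, ..., am

# Quadratic data y = 1 - 2x + 3x^2 is recovered exactly
xs = [0, 1, 2, 3, 4]
coeffs = poly_fit(xs, [1 - 2 * xi + 3 * xi * xi for xi in xs], 2)
```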
Standard error for polynomial regression: s_y/x = sqrt(Sr / (n - (m + 1))), where n is the number of observations and m the order of the polynomial (we start with n degrees of freedom and use up m+1 of them fitting an m-th order polynomial).
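The degrees-of-freedom adjustment above amounts to a one-line computation; the residuals here are illustrative values, not from the slides:

```python
import math

# Standard error of a polynomial fit: sqrt(Sr / (n - (m + 1)))
def poly_std_error(residuals, m):
    n = len(residuals)
    sr = sum(e * e for e in residuals)  # Sr, the sum of squared residuals
    return math.sqrt(sr / (n - (m + 1)))

# Five residuals, straight-line fit (m = 1): sqrt(5 / 3)
s = poly_std_error([1, -1, 1, -1, 1], 1)
```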