Presentation transcript: Statistical Assumptions for SLR (Week 2)


Slide 1: Statistical Assumptions for SLR

Recall, the simple linear regression model is Y_i = β_0 + β_1 X_i + ε_i, where i = 1, …, n. The assumptions for the simple linear regression model are:
1) E(ε_i) = 0
2) Var(ε_i) = σ²
3) the ε_i's are uncorrelated.
These assumptions are also called the Gauss-Markov conditions. The above assumptions can also be stated in terms of the Y's (see the restatement below).
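The restatement in terms of the Y's is left elided on the slide; it follows directly from the model equation and the three conditions on the ε_i's. A minimal sketch:

```latex
% Gauss-Markov conditions restated for the responses Y_i, using
% E(eps_i) = 0, Var(eps_i) = sigma^2, and uncorrelated eps_i.
\begin{align*}
  \mathbb{E}(Y_i)              &= \beta_0 + \beta_1 X_i, \\
  \operatorname{Var}(Y_i)      &= \sigma^2, \\
  \operatorname{Cov}(Y_i, Y_j) &= 0 \quad \text{for } i \neq j .
\end{align*}
```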

Slide 2: Possible Violations of Assumptions

- The straight-line model is inappropriate for the mean of Y.
- Var(Y_i) increases with X_i (non-constant error variance).
- The linear model is not appropriate for all of the data.

A small simulation sketch of the first two violations follows.
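The slide presumably illustrates these violations with plots that are not in the transcript. As a rough, hypothetical illustration (all numbers invented, not the lecture's data), the following Python sketch simulates a curved mean function and errors whose variance grows with X, then fits a straight line anyway:

```python
# Hypothetical sketch: simulate two Gauss-Markov violations and fit a line.
import numpy as np

rng = np.random.default_rng(0)
x = np.linspace(0, 10, 100)

# Violation: the true relationship is curved, so a straight-line model is wrong.
y_curved = 2.0 + 0.5 * x**2 + rng.normal(0.0, 1.0, size=x.size)

# Violation: Var(Y_i) increases with X_i (heteroscedastic errors).
y_hetero = 2.0 + 3.0 * x + rng.normal(0.0, 0.2 + 0.5 * x, size=x.size)

# np.polyfit still returns a line in both cases; the point is that the
# Gauss-Markov conditions behind the usual inferences no longer hold.
slope_c, intercept_c = np.polyfit(x, y_curved, 1)
slope_h, intercept_h = np.polyfit(x, y_hetero, 1)
print(slope_c, intercept_c, slope_h, intercept_h)
```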

Slide 3: Properties of Least Squares Estimates

The least-squares estimates b_0 and b_1 are linear in the Y's. That is, there exist constants c_i and d_i such that b_1 = Σ c_i Y_i and b_0 = Σ d_i Y_i. Proof: Exercise.

The least squares estimates are unbiased estimators for β_0 and β_1. Proof: … (a sketch is given below).
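The weights and the unbiasedness argument are left elided on the slide; one standard choice (a sketch, pairing c_i with b_1 and d_i with b_0) is:

```latex
% Weights exhibiting b_1 and b_0 as linear functions of the Y_i's,
% and unbiasedness of b_1 under the Gauss-Markov conditions.
\begin{align*}
  b_1 &= \sum_{i=1}^{n} c_i Y_i, \quad
         c_i = \frac{X_i - \bar{X}}{S_{XX}}, \quad
         S_{XX} = \sum_{i=1}^{n} (X_i - \bar{X})^2, \\
  b_0 &= \bar{Y} - b_1 \bar{X} = \sum_{i=1}^{n} d_i Y_i, \quad
         d_i = \frac{1}{n} - \bar{X} c_i, \\
  \mathbb{E}(b_1) &= \sum_i c_i \,\mathbb{E}(Y_i)
                   = \beta_0 \sum_i c_i + \beta_1 \sum_i c_i X_i
                   = \beta_1,
   \quad\text{since } \sum_i c_i = 0,\ \sum_i c_i X_i = 1 .
\end{align*}
```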

Slide 4: Gauss-Markov Theorem

The least-squares estimates are BLUE (Best Linear Unbiased Estimators). Of all the possible linear, unbiased estimators of β_0 and β_1, the least squares estimates have the smallest variance. The variances of the least-squares estimates are … (the standard forms are sketched below).
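The elided variance expressions have the standard forms below (a sketch; they follow from writing b_0 and b_1 as linear combinations of the Y_i's, as on the previous slide):

```latex
% Sampling variances of the least-squares estimates under the
% Gauss-Markov conditions, with S_XX = sum_i (X_i - Xbar)^2.
\begin{align*}
  \operatorname{Var}(b_1) &= \frac{\sigma^2}{S_{XX}}, \\
  \operatorname{Var}(b_0) &= \sigma^2 \left( \frac{1}{n} + \frac{\bar{X}^2}{S_{XX}} \right).
\end{align*}
```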

Slide 5: Estimation of the Error Term Variance σ²

The variance σ² of the error terms ε_i needs to be estimated to obtain an indication of the variability of the probability distribution of Y. Further, a variety of inferences concerning the regression function and the prediction of Y require an estimate of σ².

Recall, for a random variable Z, the estimates of the mean and variance of Z based on n realizations of Z are the sample mean and sample variance. Similarly, σ² is estimated from the residuals; this estimate, S², is called the MSE (Mean Square Error) and it is an unbiased estimator of σ² (proof in Chapter 5). The formulas are sketched below.
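As a sketch of the elided formulas, with fitted values Ŷ_i = b_0 + b_1 X_i:

```latex
% Sample mean and variance of Z based on n realizations, and the
% analogous MSE estimate of sigma^2 in SLR (n - 2 degrees of freedom,
% since two parameters, beta_0 and beta_1, are estimated).
\begin{align*}
  \bar{Z} &= \frac{1}{n} \sum_{i=1}^{n} Z_i, \qquad
  s_Z^2 = \frac{1}{n-1} \sum_{i=1}^{n} (Z_i - \bar{Z})^2, \\
  S^2 &= \mathrm{MSE} = \frac{\mathrm{SSE}}{n-2}
       = \frac{1}{n-2} \sum_{i=1}^{n} \bigl(Y_i - \hat{Y}_i\bigr)^2 .
\end{align*}
```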

Slide 6: Normal Error Regression Model

In order to make inferences we need one more assumption about the ε_i's. We assume that the ε_i's have a Normal distribution, that is, ε_i ~ N(0, σ²). The Normality assumption implies that the errors ε_i are independent (since they are uncorrelated).

Under the Normality assumption on the errors, the least squares estimates of β_0 and β_1 are equivalent to their maximum likelihood estimators. This gives the estimates the additional nice properties of MLEs: they are consistent, sufficient, and MVUE.
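A brief sketch of why the two estimation methods coincide: under Normal errors the log-likelihood depends on β_0 and β_1 only through the sum of squared deviations, so maximizing it over β_0 and β_1 is the same as minimizing the least-squares criterion.

```latex
% Normal-error likelihood for the SLR model; maximizing over beta_0 and
% beta_1 amounts to minimizing the sum of squared errors.
\begin{align*}
  L(\beta_0, \beta_1, \sigma^2)
    &= \prod_{i=1}^{n} \frac{1}{\sqrt{2\pi\sigma^2}}
       \exp\!\left\{ -\frac{(Y_i - \beta_0 - \beta_1 X_i)^2}{2\sigma^2} \right\}, \\
  \log L &= -\frac{n}{2}\log(2\pi\sigma^2)
            - \frac{1}{2\sigma^2} \sum_{i=1}^{n} (Y_i - \beta_0 - \beta_1 X_i)^2 .
\end{align*}
```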

Slide 7: Example: Calibrating a Snow Gauge

Researchers wish to measure snow density in the mountains from gamma-ray measurements; the detector reading is called the "gain". The measuring device needs to be calibrated, which is done with polyethylene blocks of known density. We want to know what density of snow results in particular readings from the gamma-ray detector. The variables are: Y = gain, X = density. Data: 9 densities in g/cm³ and 10 measurements of gain for each.
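The lecture's snow-gauge data are not reproduced in the transcript. The following Python sketch only mimics the design (9 density levels, 10 gain readings each) with simulated numbers and made-up coefficients, to show how the least-squares fit and MSE would be computed:

```python
# Hypothetical sketch of the calibration setup: 9 density levels with 10
# gain readings each.  The data are simulated with invented coefficients;
# they are NOT the snow-gauge measurements used in the lecture.
import numpy as np

rng = np.random.default_rng(1)
densities = np.linspace(0.1, 0.7, 9)            # hypothetical densities (g/cm^3)
X = np.repeat(densities, 10)                    # 10 replicate gain readings per density
Y = 400.0 - 450.0 * X + rng.normal(0.0, 15.0, X.size)  # invented linear response

# Least-squares fit of gain (Y) on density (X), written out from the formulas.
Sxx = np.sum((X - X.mean()) ** 2)
b1 = np.sum((X - X.mean()) * (Y - Y.mean())) / Sxx
b0 = Y.mean() - b1 * X.mean()

# MSE estimate of sigma^2, with n - 2 degrees of freedom (see slide 5).
resid = Y - (b0 + b1 * X)
mse = np.sum(resid ** 2) / (X.size - 2)
print(b0, b1, mse)
```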

