ANOVA for Regression
ANOVA tests whether the regression model has any explanatory power. In the case of simple regression analysis, the ANOVA F test and the t test for b1 are equivalent.
ANOVA for Regression
MSE = SSE/(n-2)
MSR = SSR/p, where p = number of independent variables
F = MSR/MSE
ANOVA Hypothesis Test
H0: b1 = 0
Ha: b1 ≠ 0
Reject H0 if F > Fα, or if p-value < α
Regression and ANOVA
Source of variation   Sum of squares   Degrees of freedom   Mean square        F
Regression            SSR              1                    MSR = SSR/1        F = MSR/MSE
Error                 SSE              n-2                  MSE = SSE/(n-2)
Total                 SST              n-1
ANOVA and Regression
             df   SS     MS     F     Significance F
Regression   1    3364   3364   273   1.23E-15
Residual     27   333    12.3
Total        28   3697
Fα = 4.21 given α = .05, df numerator = 1, df denominator = 27
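These numbers can be checked with a short sketch in Python (scipy assumed available):

    from scipy import stats

    SSR, SSE, n, p = 3364, 333, 29, 1
    MSR, MSE = SSR / p, SSE / (n - 2)
    F = MSR / MSE                               # about 273
    F_crit = stats.f.ppf(0.95, dfn=1, dfd=27)   # about 4.21
    p_value = stats.f.sf(F, dfn=1, dfd=27)      # far below .05
    print(F, F_crit, p_value)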
Issues with Hypothesis Test Results
Correlation does NOT prove causation.
A significant test result does not show that we used the correct functional form.
Output with Temperature as Y
SUMMARY OUTPUT

Regression Statistics
Multiple R          0.953884648
R Square            0.909895922
Adjusted R Square   0.906558734
Standard Error      5.053605155
Observations        29

ANOVA
             df   SS            MS            F          Significance F
Regression   1    6963.27661    6963.27661    272.6535   1.23118E-15
Residual     27   689.5509766   25.5389251
Total        28   7652.827586

                          Coefficients   Standard Error   t Stat       P-value    Lower 95%      Upper 95%
Intercept                 67.59301867    1.358242515      49.7650588   4.24E-28   64.80613526    70.3799021
Thousands of cubic feet   -1.372438825   0.083116544      -16.512222   1.23E-15   -1.542979885   -1.20189776
Confidence Interval for Estimated Mean Value of y
xp = particular or given value of x
yp = value of the dependent variable when x = xp
E(yp) = expected value of yp, i.e. E(y | x = xp)
Confidence Interval for Estimated Mean Value of y
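For reference, the standard textbook form of this interval (t taken from a t distribution with n - 2 degrees of freedom) is:
ŷp ± t(α/2) · s · sqrt(1/n + (xp - x̄)² / Σ(xi - x̄)²), where s = sqrt(SSE/(n - 2))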
Computing b0 and b1, Example
From example of car age, price:
x      y      x-x̄    y-ȳ    (x-x̄)(y-ȳ)   (x-x̄)²
1      15     -3     3      -9            9
3      14     -1     2      -2            1
3      11     -1     -1     1             1
4      12     0      0      0             0
9      8      5      -4     -20           25
Sum    20     60     0      0      -30           36
Mean   4      12
b1 = -30/36 = -0.83
b0 = 12 - (-0.83)(4) = 15.33
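A minimal Python sketch of this calculation, using the x and y values above (numpy assumed available):

    import numpy as np

    x = np.array([1, 3, 3, 4, 9])       # car age
    y = np.array([15, 14, 11, 12, 8])   # car price

    b1 = np.sum((x - x.mean()) * (y - y.mean())) / np.sum((x - x.mean()) ** 2)   # -30/36
    b0 = y.mean() - b1 * x.mean()
    print(b1, b0)                        # about -0.833 and 15.33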
Confidence Interval of Conditional Mean
x      y      ŷ       (y-ȳ)²   (ŷ-ȳ)²     (y-ŷ)²
1      15     14.50   9        6.2        0.3
3      14     12.84   4        0.7        1.3
3      11     12.84   1        0.7        3.4
4      12     12.01   0        0.0        0.0
9      8      7.86    16       17.4       0.0
Sum=20 Sum=60         SST=30   SSR=25.0   SSE=5.0
Mean=4 Mean=12   b1 = -0.833   b0 = 15.33   Σ(x-x̄)² = 36
r² = 25/30 = .833
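Continuing the same sketch, the sums of squares and r² can be checked as:

    y_hat = b0 + b1 * x                       # fitted values
    SSR = np.sum((y_hat - y.mean()) ** 2)     # about 25.0
    SSE = np.sum((y - y_hat) ** 2)            # about 5.0
    SST = np.sum((y - y.mean()) ** 2)         # 30
    r2 = SSR / SST                            # about .833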
Confidence Interval of Conditional Mean
Confidence Interval of Conditional Mean
Given 1-α = .95 and df = 3: t.025 = 3.182
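A sketch of the interval calculation, continuing the Python example above (scipy assumed; the value xp = 3 is only an illustrative choice, not taken from the slides):

    from scipy import stats

    xp = 3                                               # hypothetical given value of x
    n = len(x)
    s = np.sqrt(SSE / (n - 2))                           # sqrt(5/3), about 1.29
    se_mean = s * np.sqrt(1 / n + (xp - x.mean()) ** 2 / np.sum((x - x.mean()) ** 2))
    t = stats.t.ppf(0.975, df=n - 2)                     # 3.182
    y_hat_p = b0 + b1 * xp
    print(y_hat_p - t * se_mean, y_hat_p + t * se_mean)  # interval for E(y | x = xp)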
Confidence Interval for Predicted Values of y
A confidence interval for a predicted (individual) value of y must take into account both the sampling error in the estimated regression line (b0 and b1) and the random deviation of individual values around the line.
Confidence Interval for Estimated Mean Value of y
Confidence Interval of Individual Value
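For reference, the standard textbook prediction interval for an individual value of y at x = xp (t again on n - 2 degrees of freedom) is:
ŷp ± t(α/2) · s · sqrt(1 + 1/n + (xp - x̄)² / Σ(xi - x̄)²)
The extra 1 under the square root carries the random deviation of an individual observation around the regression line.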
Confidence Interval of Conditional Mean
Given 1-α = .95 and df = 3: t.025 = 3.182
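A sketch of the corresponding interval for an individual value, continuing the Python example above with the same hypothetical xp = 3 (note the extra 1 under the square root):

    se_ind = s * np.sqrt(1 + 1 / n + (xp - x.mean()) ** 2 / np.sum((x - x.mean()) ** 2))
    print(y_hat_p - t * se_ind, y_hat_p + t * se_ind)   # prediction interval for y at x = xp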
Residual Plots Against x
Residual – the difference between the observed value and the predicted value
Look for:
Evidence of nonconstant variance
Evidence of a nonlinear relationship
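A minimal plotting sketch (matplotlib assumed; continues the example data above):

    import matplotlib.pyplot as plt

    residuals = y - (b0 + b1 * x)    # observed minus predicted
    plt.scatter(x, residuals)
    plt.axhline(0)
    plt.xlabel("x")
    plt.ylabel("Residual")
    plt.show()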
Regression and Outliers
Outliers can have a disproportionate effect on the estimated regression line.
               Coefficients
Intercept      36.19972
X Variable 1   -0.44381
Regression and Outliers
One solution is to estimate the model with and without the outlier (a sketch follows the questions below). Questions to ask:
Is the value an error?
Does the value reflect some unique circumstance?
Is the data point providing unique information about values outside the range of the other observations?
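A minimal sketch of the with/without comparison, using hypothetical data with a suspected outlier at a known index (not the data behind the coefficients shown above):

    import numpy as np

    xo = np.array([2.0, 4.0, 5.0, 7.0, 9.0, 30.0])     # hypothetical x; last point is the suspect
    yo = np.array([34.0, 33.0, 32.0, 31.0, 30.0, 20.0])
    outlier = 5                                         # index of the suspected outlier

    b1_all, b0_all = np.polyfit(xo, yo, 1)              # slope, intercept with every observation
    keep = np.arange(len(xo)) != outlier
    b1_wo, b0_wo = np.polyfit(xo[keep], yo[keep], 1)    # slope, intercept with the outlier removed
    print(b1_all, b1_wo)                                # compare how much the slope moves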
Chapter 15 Multiple Regression
Regression
Multiple Regression Model: y = b0 + b1x1 + b2x2 + … + bpxp + e
Multiple Regression Equation: E(y) = b0 + b1x1 + b2x2 + … + bpxp
Estimated Multiple Regression Equation: ŷ = b0 + b1x1 + b2x2 + … + bpxp
Car Data
MPG   Weight   Year   Cylinders
18    3504     70     8
15    3693     …      …
…     3436     …      …
16    3433     …      …
17    3449     …      …
…     4341     …      …
14    4354     …      …
…     4312     …      …
…     4425     …      …
…     3850     …      …
…     3563     …      …
…     3609     …      …
…
Multiple Regression, Example
Model with Weight only (R Square = 0.687):
            Coefficients   Standard Error   t Stat
Intercept   46.3           0.800            57.8
Weight      -0.00765       0.000259         -29.4

Model with Weight and Year (R Square = 0.807):
            Coefficients   Standard Error   t Stat
Intercept   -14.7          3.96             -3.71
Weight      -0.00665       0.000214         -31.0
Year        0.763          0.0490           15.5
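A sketch of how such a fit could be produced in Python (statsmodels assumed; the file name and cars DataFrame are hypothetical stand-ins for the car data above):

    import pandas as pd
    import statsmodels.formula.api as smf

    cars = pd.read_csv("auto_mpg.csv")   # hypothetical file with columns mpg, weight, year, cylinders
    model = smf.ols("mpg ~ weight + year", data=cars).fit()
    print(model.summary())               # coefficients, standard errors, t stats, R Square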
Multiple Regression, Example
Model with Weight, Year, and Cylinders (R Square = 0.807):
            Coefficients   Standard Error   t Stat
Intercept   -14.4          4.03             -3.58
Weight      -0.00652       0.000460         -14.1
Year        0.760          0.0498           15.2
Cylinders   -0.0741        0.232            -0.319

Predicted MPG for a car weighing 4000 lbs, built in 1980, with 6 cylinders:
-14.4 - .00652(4000) + .76(80) - .0741(6) = -14.4 - 26.08 + 60.8 - .4446 = 19.88
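The same prediction as a quick check in Python, plugging in the coefficients reported above:

    mpg_hat = -14.4 - 0.00652 * 4000 + 0.760 * 80 - 0.0741 * 6
    print(mpg_hat)   # about 19.88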
Multiple Regression Model
SST = SSR + SSE, where SST = Σ(yi - ȳ)², SSR = Σ(ŷi - ȳ)², and SSE = Σ(yi - ŷi)²
Multiple Coefficient of Determination
The share of the variation in the dependent variable explained by the estimated model: R² = SSR/SST
F Test for Overall Significance
H0: b1 = b2 = . . . = bp = 0
Ha: One or more of the parameters is not equal to zero
Reject H0 if F > Fα, or if p-value < α
F = MSR/MSE
ANOVA Table for Multiple Regression Model
Source       Sum of Squares   Degrees of Freedom   Mean Squares        F
Regression   SSR              p                    MSR = SSR/p         F = MSR/MSE
Error        SSE              n-p-1                MSE = SSE/(n-p-1)
Total        SST              n-1
t Test for Coefficients
H0: bi = 0
Ha: bi ≠ 0
Reject H0 if t < -tα/2 or t > tα/2, or if p-value < α
t = bi/sbi, with a t distribution with n-p-1 df
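A sketch of how the two-sided p-value for a coefficient's t statistic can be computed (scipy assumed), using the simple-regression output shown earlier (t = -16.51 on 27 df):

    from scipy import stats

    t_stat, df = -16.512, 27
    p_value = 2 * stats.t.sf(abs(t_stat), df)   # two-sided p-value, roughly 1.2e-15
    print(p_value)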
Multicollinearity
When two or more independent variables are highly correlated with one another.
When multicollinearity is severe, the estimated values of the coefficients will be unreliable.
Two rule-of-thumb guidelines for flagging multicollinearity (a sketch for computing the correlations follows this list):
If the absolute value of the correlation coefficient for two independent variables exceeds 0.7
If an independent variable is more highly correlated with another independent variable than with the dependent variable
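A minimal way to produce the correlation matrix on the next slide, reusing the hypothetical cars DataFrame from the earlier sketch (pandas assumed):

    print(cars[["mpg", "weight", "year", "cylinders"]].corr())   # pairwise correlation coefficients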
Multicollinearity
            MPG      Weight   Year     Cylinders
MPG         1
Weight      -0.829   1
Year        0.578    -0.300   1
Cylinders   -0.773   0.895    -0.344   1
Weight and Cylinders are correlated at 0.895, well above the 0.7 guideline, so keeping both in the model invites multicollinearity.