Review of Chapter 3: Multiple Linear Regression Model

Model: $Y = \beta_0 + \beta_1 X_1 + \cdots + \beta_p X_p + \epsilon$, where $\epsilon$ is the random noise term.

LS Estimators: the least squares estimates $\hat\beta_0, \hat\beta_1, \dots, \hat\beta_p$ minimize the sum of squared deviations $\sum_i \big(Y_i - \beta_0 - \beta_1 X_{i1} - \cdots - \beta_p X_{ip}\big)^2$; in matrix form, $\hat\beta = (X^\top X)^{-1} X^\top Y$.
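A minimal sketch of the LS computation, assuming NumPy and simulated data (the variable names and numbers are illustrative, not from the course notes):

```python
# Minimal sketch (NumPy, simulated data): LS estimates beta_hat = (X'X)^{-1} X'Y
# computed via the normal equations.
import numpy as np

rng = np.random.default_rng(0)
n = 30
x1, x2 = rng.normal(size=n), rng.normal(size=n)
y = 1.0 + 2.0 * x1 - 1.5 * x2 + rng.normal(scale=0.5, size=n)   # true beta = (1, 2, -1.5)

# Design matrix with a leading column of 1's for the intercept beta_0.
X = np.column_stack([np.ones(n), x1, x2])

# Solve the normal equations (X'X) beta_hat = X'y.
beta_hat = np.linalg.solve(X.T @ X, X.T @ y)
print(beta_hat)            # estimates of (beta_0, beta_1, beta_2)
```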
Fitted Values, Residuals, and the Noise Variance Estimator

Fitted values: $\hat Y_i = \hat\beta_0 + \hat\beta_1 X_{i1} + \cdots + \hat\beta_p X_{ip}$.
Residuals: $e_i = Y_i - \hat Y_i$.
Noise variance estimator: $\hat\sigma^2 = \mathrm{SSE}/(n-p-1)$.

Decompositions
Observation: $Y_i - \bar Y = (\hat Y_i - \bar Y) + (Y_i - \hat Y_i)$.
Sums of Squares: $\mathrm{SST} = \mathrm{SSR} + \mathrm{SSE}$, with $\mathrm{SST} = \sum_i (Y_i - \bar Y)^2$, $\mathrm{SSR} = \sum_i (\hat Y_i - \bar Y)^2$, $\mathrm{SSE} = \sum_i (Y_i - \hat Y_i)^2$.
Degrees of Freedom: $df(\mathrm{SST}) = df(\mathrm{SSR}) + df(\mathrm{SSE})$, i.e. $n-1 = p + (n-p-1)$.
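A minimal sketch, again assuming NumPy and simulated data, that computes the fitted values, residuals, and noise variance estimator and checks the decomposition numerically:

```python
# Minimal sketch (NumPy, simulated data): fitted values, residuals, the noise
# variance estimator, and a numerical check of SST = SSR + SSE.
import numpy as np

rng = np.random.default_rng(0)
n = 30
x1, x2 = rng.normal(size=n), rng.normal(size=n)
y = 1.0 + 2.0 * x1 - 1.5 * x2 + rng.normal(scale=0.5, size=n)

X = np.column_stack([np.ones(n), x1, x2])
beta_hat = np.linalg.solve(X.T @ X, X.T @ y)

y_hat = X @ beta_hat                      # fitted values
e = y - y_hat                             # residuals
sigma2_hat = e @ e / (n - X.shape[1])     # SSE / (n - p - 1)

SST = np.sum((y - y.mean())**2)
SSR = np.sum((y_hat - y.mean())**2)
SSE = np.sum(e**2)
print(np.isclose(SST, SSR + SSE))         # True: the decomposition holds
```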
Interpretation of the Coefficients

$\beta_j$ is the change in the mean response $E(Y)$ for a one-unit increase in $X_j$, holding all other predictors fixed.
Another Interpretation: $\beta_j$ is a partial regression coefficient; it equals the slope obtained by regressing $Y$ on $X_j$ after both have been adjusted for the remaining predictors, as the sketch below illustrates.
Assumption: the other predictors can be held fixed while $X_j$ varies (the predictors are not exactly collinear).
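A minimal sketch of the partial-regression interpretation, assuming NumPy and simulated data: the coefficient of $X_2$ from the full fit coincides with the slope of the residual-on-residual regression.

```python
# Minimal sketch (NumPy, simulated data): the coefficient of x2 in the full fit
# equals the slope from regressing Y-residuals on x2-residuals, where both are
# first adjusted for the remaining predictors (intercept and x1).
import numpy as np

rng = np.random.default_rng(0)
n = 40
x1 = rng.normal(size=n)
x2 = 0.5 * x1 + rng.normal(size=n)          # correlated predictors
y  = 1.0 + 2.0 * x1 - 1.5 * x2 + rng.normal(scale=0.5, size=n)

X_full = np.column_stack([np.ones(n), x1, x2])
b_full = np.linalg.lstsq(X_full, y, rcond=None)[0]

# Adjust y and x2 for (intercept, x1), then regress residual on residual.
X1 = np.column_stack([np.ones(n), x1])
ry  = y  - X1 @ np.linalg.lstsq(X1, y,  rcond=None)[0]
rx2 = x2 - X1 @ np.linalg.lstsq(X1, x2, rcond=None)[0]
b_partial = (rx2 @ ry) / (rx2 @ rx2)

print(np.isclose(b_full[2], b_partial))      # True
```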
Properties of the LS Estimators

1) Linearity: each $\hat\beta_j$ is a linear function of $Y_1, \dots, Y_n$.
2) Unbiasedness: $E(\hat\beta_j) = \beta_j$.
3) BLUE: the LS estimators are the Best Linear Unbiased Estimators (Gauss-Markov theorem).
4) Normality: when the noise is normally distributed, so are the LS estimators; see the summary below.
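A one-line summary of these properties in standard matrix notation (the slides may state them coefficient by coefficient; the matrix form is an assumption of this sketch):

```latex
\[
\hat{\beta} = (X^\top X)^{-1} X^\top Y, \qquad
E(\hat{\beta}) = \beta, \qquad
\operatorname{Var}(\hat{\beta}) = \sigma^2 (X^\top X)^{-1}, \qquad
\hat{\beta} \sim N\!\bigl(\beta,\ \sigma^2 (X^\top X)^{-1}\bigr) \ \text{when}\ \epsilon \sim N(0, \sigma^2 I).
\]
```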
Variances of the Estimates

$\operatorname{Var}(\hat\beta) = \sigma^2 (X^\top X)^{-1}$.

Noise Variance Estimator and Standard Errors

$\hat\sigma^2 = \mathrm{SSE}/(n-p-1)$, and the standard error of $\hat\beta_j$ is $s.e.(\hat\beta_j) = \hat\sigma \sqrt{[(X^\top X)^{-1}]_{jj}}$.
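A minimal sketch, assuming NumPy and simulated data, of the noise variance estimate and the standard errors taken from the diagonal of $\hat\sigma^2 (X^\top X)^{-1}$:

```python
# Minimal sketch (NumPy, simulated data): noise variance estimate and standard
# errors of the coefficient estimates.
import numpy as np

rng = np.random.default_rng(0)
n = 30
x1, x2 = rng.normal(size=n), rng.normal(size=n)
y = 1.0 + 2.0 * x1 - 1.5 * x2 + rng.normal(scale=0.5, size=n)

X = np.column_stack([np.ones(n), x1, x2])
beta_hat = np.linalg.solve(X.T @ X, X.T @ y)

e = y - X @ beta_hat
df_e = n - X.shape[1]                           # n - p - 1
sigma2_hat = e @ e / df_e                       # noise variance estimator
cov_beta = sigma2_hat * np.linalg.inv(X.T @ X)  # estimated Var(beta_hat)
se = np.sqrt(np.diag(cov_beta))                 # s.e.(beta_hat_j)
print(np.sqrt(sigma2_hat), se)
```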
Inferences for Individual Coefficients / t-test

Hypothesis Testing: to test $H_0: \beta_j = \beta_j^0$ vs $H_1: \beta_j \neq \beta_j^0$, use $T = (\hat\beta_j - \beta_j^0)/s.e.(\hat\beta_j)$, which follows a $t_{n-p-1}$ distribution under $H_0$; reject $H_0$ at level $\alpha$ if $|T| > t_{n-p-1,\alpha/2}$.
Confidence Interval: a $100(1-\alpha)\%$ confidence interval for $\beta_j$ is $\hat\beta_j \pm t_{n-p-1,\alpha/2}\, s.e.(\hat\beta_j)$.
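A minimal sketch, assuming NumPy, SciPy, and simulated data, of the t-statistics, two-sided p-values, and 95% confidence intervals for all coefficients:

```python
# Minimal sketch (NumPy/SciPy, simulated data): t-tests of H0: beta_j = 0 and
# 95% confidence intervals for the coefficients.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n = 30
x1, x2 = rng.normal(size=n), rng.normal(size=n)
y = 1.0 + 2.0 * x1 - 1.5 * x2 + rng.normal(scale=0.5, size=n)

X = np.column_stack([np.ones(n), x1, x2])
beta_hat = np.linalg.solve(X.T @ X, X.T @ y)
e = y - X @ beta_hat
df_e = n - X.shape[1]                                  # n - p - 1
se = np.sqrt(np.diag((e @ e / df_e) * np.linalg.inv(X.T @ X)))

t_stat = beta_hat / se                                 # T = beta_hat_j / s.e.(beta_hat_j)
p_val  = 2 * stats.t.sf(np.abs(t_stat), df_e)          # two-sided p-values
t_crit = stats.t.ppf(0.975, df_e)                      # t_{n-p-1, 0.025}
ci = np.column_stack([beta_hat - t_crit * se, beta_hat + t_crit * se])
print(t_stat, p_val, ci, sep="\n")
```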
Steps for Model Comparison / F-test

$H_0$: the reduced model (RM) is adequate  vs  $H_1$: the full model (FM) is adequate.

Step 1: Fit the FM and record SSE(FM) and its df (from the ANOVA table) and $R^2$(FM) (under the Coefficient Table).
Step 2: Fit the RM and record SSE(RM), its df, and $R^2$(RM).
Step 3: Compute the F-statistic:
$F = \dfrac{[\mathrm{SSE}(RM) - \mathrm{SSE}(FM)]/r}{\mathrm{SSE}(FM)/df(\mathrm{SSE},FM)}$, where $r = df(\mathrm{SSE},RM) - df(\mathrm{SSE},FM)$; equivalently, $F = \dfrac{[R^2(FM) - R^2(RM)]/r}{[1 - R^2(FM)]/df(\mathrm{SSE},FM)}$.
Step 4: Conclusion: reject $H_0$ if $F > F(r, df(\mathrm{SSE},FM), \alpha)$; otherwise, do not reject $H_0$.
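A minimal sketch of Steps 1-4, assuming NumPy, SciPy, and simulated data, with the reduced model dropping $x_2$ from the full model:

```python
# Minimal sketch (NumPy/SciPy, simulated data): partial F-test comparing a
# reduced model RM (intercept + x1) against the full model FM (intercept + x1 + x2).
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n = 30
x1, x2 = rng.normal(size=n), rng.normal(size=n)
y = 1.0 + 2.0 * x1 - 1.5 * x2 + rng.normal(scale=0.5, size=n)

def sse_df(X, y):
    """Fit by least squares and return (SSE, residual df)."""
    b = np.linalg.lstsq(X, y, rcond=None)[0]
    e = y - X @ b
    return e @ e, len(y) - X.shape[1]

X_fm = np.column_stack([np.ones(n), x1, x2])   # full model
X_rm = np.column_stack([np.ones(n), x1])       # reduced model

sse_f, df_f = sse_df(X_fm, y)                  # Step 1
sse_r, df_r = sse_df(X_rm, y)                  # Step 2
r = df_r - df_f                                # number of coefficients dropped under H0

F = ((sse_r - sse_f) / r) / (sse_f / df_f)     # Step 3
F_crit = stats.f.ppf(0.95, r, df_f)            # F(r, df(SSE,FM), alpha) with alpha = 0.05
p_value = stats.f.sf(F, r, df_f)               # Step 4: reject H0 if F > F_crit
print(F, F_crit, p_value)
```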
Special Case: ANOVA Table (Analysis of Variance)

Source       Sum of Squares   df        Mean Square          F-test        P-value
Regression   SSR              p         MSR = SSR/p          F = MSR/MSE
Residuals    SSE              n-p-1     MSE = SSE/(n-p-1)
Total        SST              n-1
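This is the special case of the model comparison where the RM is the intercept-only model, so $H_0: \beta_1 = \cdots = \beta_p = 0$ and $\mathrm{SSE}(RM) = \mathrm{SST}$; a sketch of the resulting overall F-statistic in the notation above:

```latex
\[
F \;=\; \frac{(\mathrm{SST}-\mathrm{SSE})/p}{\mathrm{SSE}/(n-p-1)}
  \;=\; \frac{\mathrm{MSR}}{\mathrm{MSE}}
  \;=\; \frac{R^2/p}{(1-R^2)/(n-p-1)}
  \;\sim\; F_{p,\,n-p-1} \quad \text{under } H_0 .
\]
```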
Coefficient Table

Variable    Coefficient      s.e.                  t-test
Constant    $\hat\beta_0$    s.e.($\hat\beta_0$)   $\hat\beta_0$ / s.e.($\hat\beta_0$)
$X_1$       $\hat\beta_1$    s.e.($\hat\beta_1$)   $\hat\beta_1$ / s.e.($\hat\beta_1$)
...
$X_p$       $\hat\beta_p$    s.e.($\hat\beta_p$)   $\hat\beta_p$ / s.e.($\hat\beta_p$)

Each t-test is reported with its p-value; $R^2$ and $\hat\sigma$ appear under the table.
Multiple Correlation Coefficient

$R^2 = \mathrm{SSR}/\mathrm{SST} = 1 - \mathrm{SSE}/\mathrm{SST}$ measures the percentage of the total variability in $Y$ that is explained by the fitted MLR model.
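A minimal sketch, assuming NumPy and simulated data, of computing $R^2$ from the sums of squares:

```python
# Minimal sketch (NumPy, simulated data): R^2 = 1 - SSE/SST for a fitted MLR.
import numpy as np

rng = np.random.default_rng(0)
n = 30
x1, x2 = rng.normal(size=n), rng.normal(size=n)
y = 1.0 + 2.0 * x1 - 1.5 * x2 + rng.normal(scale=0.5, size=n)

X = np.column_stack([np.ones(n), x1, x2])
beta_hat = np.linalg.solve(X.T @ X, X.T @ y)
y_hat = X @ beta_hat

SST = np.sum((y - y.mean())**2)
SSE = np.sum((y - y_hat)**2)
R_sq = 1 - SSE / SST          # equivalently SSR / SST
print(R_sq)                   # e.g. 0.95 means 95% of the variability in Y is explained
```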