1
EXTRA SUMS OF SQUARES
Extra sums of squares
Decomposition of SSR
Uses of extra sums of squares
Coefficient of partial determination
2
A different way to look at the comparison of models. An extra sum of squares measures the marginal reduction in the error sum of squares when one or several independent variables are added to the regression model, given that other independent variables are already in the model. Equivalently, it measures the marginal increase in the regression sum of squares when those variables are added.
3
Look at the difference in SSE or in SSM. Because SSM + SSE = SST, the two ways are equivalent. The models we compare are hierarchical in the sense that one includes all of the explanatory variables of the other.
4
We can compare models with different explanatory variables:
X1, X2 vs X1
X1, X2, X3, X4, X5 vs X1, X2, X3
Note that the first model in each pair includes all the Xs of the second. We will get an F test that compares the two models; we are testing the null hypothesis that the regression coefficients for the extra variables are all zero.
5
For X1, X2, X3, X4, X5 vs X1, X2, X3:
H0: β4 = β5 = 0
H1: β4 and β5 are not both 0
The degrees of freedom for the F statistic are the number of extra variables (numerator) and the error df for the model with the larger number of explanatory variables (denominator). Suppose n = 100 and we compare the models with X1, X2, X3, X4, X5 vs X1, X2, X3: the numerator df is 2 and the denominator df is n - 6 = 94.
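A sketch of how this comparison could be run in Python with statsmodels (the data and variable names here are simulated assumptions, not an example from the slides):

    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf
    from statsmodels.stats.anova import anova_lm

    rng = np.random.default_rng(0)
    n = 100
    df = pd.DataFrame(rng.normal(size=(n, 5)),
                      columns=["x1", "x2", "x3", "x4", "x5"])
    df["y"] = 2 + df["x1"] - df["x2"] + 0.5 * df["x3"] + rng.normal(size=n)

    reduced = smf.ols("y ~ x1 + x2 + x3", data=df).fit()
    full = smf.ols("y ~ x1 + x2 + x3 + x4 + x5", data=df).fit()

    # Partial F test of H0: beta4 = beta5 = 0,
    # with numerator df = 2 and denominator df = n - 6 = 94.
    print(anova_lm(reduced, full))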
6
Predict bone density using age, weight, and height; does diet add any useful information?
Predict faculty salaries using highest degree (categorical), rank (categorical), time in rank, and department (categorical); does race (or gender) add any useful information?
7
Predict GPA using 3 HS grade variables; do SAT scores add any useful information?
Predict the yield of an industrial process using temperature and pH; does the supplier of the raw material (categorical) add any useful information?
8
Suppose we have the case of two X variables. When only X1 is in the model, the total sum of squares decomposes as:
SST = SSR(X1) + SSE(X1)
When we add X2 to the model while X1 is already in it, SST becomes:
SST = SSR(X1) + SSR(X2|X1) + SSE(X1,X2)
which is equivalent to:
SST = SSR(X1,X2) + SSE(X1,X2)
Here SSR(X2|X1) = SSE(X1) - SSE(X1,X2) = SSR(X1,X2) - SSR(X1) is the extra sum of squares.
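A minimal numpy sketch (with simulated data, an assumption for illustration) verifying this decomposition numerically:

    import numpy as np

    rng = np.random.default_rng(1)
    n = 50
    x1, x2 = rng.normal(size=n), rng.normal(size=n)
    y = 1 + 2 * x1 + 3 * x2 + rng.normal(size=n)

    def sse(y, *xs):
        # Error sum of squares from the least squares fit of y on the given xs.
        X = np.column_stack([np.ones_like(y), *xs])
        resid = y - X @ np.linalg.lstsq(X, y, rcond=None)[0]
        return resid @ resid

    sst = np.sum((y - y.mean()) ** 2)
    ssr_x1 = sst - sse(y, x1)                      # SSR(X1)
    ssr_x2_given_x1 = sse(y, x1) - sse(y, x1, x2)  # SSR(X2|X1) = SSE(X1) - SSE(X1,X2)
    sse_full = sse(y, x1, x2)                      # SSE(X1,X2)
    print(np.isclose(sst, ssr_x1 + ssr_x2_given_x1 + sse_full))  # True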
10
Hence the decomposition of the regression sum of squares SSR(X1,X2) into two marginal components is:
1. SSR(X1), measuring the contribution of including X1 alone in the model, and
2. SSR(X2|X1), measuring the additional contribution when X2 is included, given that X1 is already in the model.
11
Therefore, when the regression model contains three X variables, a variety of decompositions of SSR(X1,X2,X3) can be obtained, such as:
SSR(X1,X2,X3) = SSR(X1) + SSR(X2|X1) + SSR(X3|X1,X2)
SSR(X1,X2,X3) = SSR(X2) + SSR(X3|X2) + SSR(X1|X2,X3)
SSR(X1,X2,X3) = SSR(X1) + SSR(X2,X3|X1)
12
An ANOVA table containing this decomposition of SSR is as follows:

    Source of Variation   Sum of Squares    df     Mean Squares
    Regression            SSR(X1,X2,X3)     3      MSR(X1,X2,X3)
      X1                  SSR(X1)           1      MSR(X1)
      X2|X1               SSR(X2|X1)        1      MSR(X2|X1)
      X3|X1,X2            SSR(X3|X1,X2)     1      MSR(X3|X1,X2)
    Error                 SSE(X1,X2,X3)     n-4    MSE(X1,X2,X3)
    Total                 SST               n-1
13
Test whether a single βk = 0. Suppose we have a regression with 3 independent variables. The full model is:
Yi = β0 + β1Xi1 + β2Xi2 + β3Xi3 + εi
with SSE(F) = SSE(X1,X2,X3) and df = n - 4. To test whether β2 = 0, we have the alternative model
Yi = β0 + β1Xi1 + β3Xi3 + εi
as the reduced model, with SSE(R) = SSE(X1,X3) and df = n - 3.
14
The general linear test statistic is given by
F* = [ (SSE(R) - SSE(F)) / (dfR - dfF) ] / [ SSE(F) / dfF ]
   = [ SSR(X2|X1,X3) / 1 ] / MSE(X1,X2,X3)
which is compared with the F(1, n-4) distribution.
15
Test whether several βk = 0. Suppose we have a regression with 3 independent variables. The full model is:
Yi = β0 + β1Xi1 + β2Xi2 + β3Xi3 + εi
with SSE(F) = SSE(X1,X2,X3) and df = n - 4. To test whether β2 = β3 = 0, we have the alternative model
Yi = β0 + β1Xi1 + εi
as the reduced model, with SSE(R) = SSE(X1) and df = n - 2.
16
The general linear test statistic is
F* = [ (SSE(R) - SSE(F)) / (dfR - dfF) ] / [ SSE(F) / dfF ]
   = [ SSR(X2,X3|X1) / 2 ] / MSE(X1,X2,X3)
which is compared with the F(2, n-4) distribution.
17
The full model with p-1 input variables is given by:
Y = Xβ + ε
The least squares estimator is b_F = (X'X)^(-1) X'Y, and the error sum of squares is given by:
SSE(F) = Y'Y - b_F' X'Y
The reduced model, with a single or several βk = 0, is the full model subject to the constraint Cβ = h, where C is an s×p matrix of rank s and h is a specified s×1 vector.
18
Example 1: To test whether β2 = 0 in a regression model containing 2 independent variables, take C = [0 0 1] and h = [0], so that Cβ = h states β2 = 0.
Example 2: To test whether β1 = β2 = 0 in a regression model containing 2 independent variables, take
C = [0 1 0; 0 0 1] and h = [0 0]'
so that Cβ = h states β1 = 0 and β2 = 0.
19
The least squares estimator for the reduced model is:
b_R = b_F - (X'X)^(-1) C' (C (X'X)^(-1) C')^(-1) (C b_F - h)
and the error sum of squares is given by:
SSE(R) = (Y - X b_R)'(Y - X b_R)
which has associated with it df_R = n - (p - s).
21
Test statistic:
F* = [ (SSE(R) - SSE(F)) / s ] / [ SSE(F) / (n - p) ]
where s = df_R - df_F is the number of constraints; under H0, F* follows the F(s, n-p) distribution.
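A numpy sketch of this matrix form (simulated data; the C and h here test a single coefficient, as in Example 1):

    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(2)
    n, p = 40, 3                                  # columns: intercept, X1, X2
    X = np.column_stack([np.ones(n), rng.normal(size=(n, p - 1))])
    y = X @ np.array([1.0, 2.0, 0.0]) + rng.normal(size=n)

    C = np.array([[0.0, 0.0, 1.0]])               # H0: beta2 = 0
    h = np.zeros(1)
    s = C.shape[0]                                # number of constraints

    XtX_inv = np.linalg.inv(X.T @ X)
    b_F = XtX_inv @ X.T @ y                       # full-model estimator
    b_R = b_F - XtX_inv @ C.T @ np.linalg.solve(C @ XtX_inv @ C.T, C @ b_F - h)

    sse_F = (y - X @ b_F) @ (y - X @ b_F)         # df_F = n - p
    sse_R = (y - X @ b_R) @ (y - X @ b_R)         # df_R = n - (p - s)
    F = ((sse_R - sse_F) / s) / (sse_F / (n - p))
    print(F, stats.f.sf(F, s, n - p))             # F* and its P value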
22
A study of the relation of amount of body fat (Y) to several possible explanatory, independent variables, based on a sample of 20 healthy females aged 25-34 years. The possible independent variables are triceps skinfold thickness (X1), thigh circumference (X2), and midarm circumference (X3). Underwater weighing is the alternative, direct way to measure body fat.
23
Suppose we fit the model with all X variables. The ANOVA table is as follows:

    Source    SS        df    MS        F        Pr>F
    Model     396.98    3     132.33    21.52    <.0001
    Error     98.41     16    6.15
    Total     495.39    19

And the least squares estimates of the regression coefficients:

    Var        b         s(b)       t        Pr>|t|
    int        117.08    100.068    1.17     0.2578
    skinfold   4.334     3.016      1.44     0.1699
    thigh      -2.857    2.582      -1.11    0.2849
    midarm     -2.186    1.596      -1.37    0.1896
24
The P value for F(3, 16) is <.0001, but the P values for the individual regression coefficients are 0.1699, 0.2849, and 0.1896. None of these is near our standard of 0.05. What is the explanation?
25
    Var        Type I SS    Type II SS
    skinfold   352.26       12.70
    thigh      33.16        7.52
    midarm     11.54        11.54
    Total      495.38

Fact: the Type I and Type II SS are very different. If we reorder the variables in the model statement, we will get different Type I SS but the same Type II SS.
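Both types of sums of squares are available from statsmodels' anova_lm; a sketch with simulated stand-in data (the real body fat data are not reproduced here, and these column names are assumptions):

    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf
    from statsmodels.stats.anova import anova_lm

    rng = np.random.default_rng(3)
    n = 20
    skinfold = rng.normal(25, 5, n)
    thigh = 30 + 0.8 * skinfold + rng.normal(0, 2, n)   # correlated predictors
    midarm = rng.normal(27, 3, n)
    fat = 0.5 * skinfold + 0.3 * thigh + rng.normal(0, 2, n)
    df = pd.DataFrame(dict(fat=fat, skinfold=skinfold,
                           thigh=thigh, midarm=midarm))

    fit = smf.ols("fat ~ skinfold + thigh + midarm", data=df).fit()
    print(anova_lm(fit, typ=1))  # Type I (sequential) SS: order-dependent
    print(anova_lm(fit, typ=2))  # Type II SS: unchanged if terms are reordered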
26
Rerun with skinfold as the only explanatory variable:

    Source    SS        df    MS        F        Pr>F
    Model     352.27    1     352.27    44.31    <.0001
    Error     143.12    18    7.95
    Total     495.39    19

And the least squares estimates of the regression coefficients:

    Var        b         s(b)      t       Pr>|t|
    int        -1.496
    skinfold   0.8572    0.1288    6.66    <0.0001
27
ANOVA table with SSR decomposed into SSR(X1) and SSR(X2,X3|X1):

    Source        SS        df    MS        F        Pr>F
    Model         396.98    3     132.33    21.52    <.0001
      X1          352.27    1
      X2,X3|X1    44.71     2     22.35     3.64     0.055
    Error         98.41     16    6.15
    Total         495.39    19

The F of 3.64 (P = 0.055) tests whether thigh and midarm add useful information, given that skinfold is already in the model.
28
The coefficient of multiple determination, R², measures the proportionate reduction in the variation of Y achieved by the introduction of the entire set of X variables considered in the model. A coefficient of partial determination, r², measures the marginal contribution of one X variable when all others are already included in the model.
29
Suppose we have a regression with 2 independent variables. The full model is:
Yi = β0 + β1Xi1 + β2Xi2 + εi
The coefficient of partial determination between Y and X1, given that X2 is in the model, is measured by:
r²(Y1|2) = SSR(X1|X2) / SSE(X2)
30
The coefficient of partial determination between Y and X2, given that X1 is in the model, is measured by:
r²(Y2|1) = SSR(X2|X1) / SSE(X1)
31
General case: the extension of the coefficient of partial determination to 3 or more independent variables in the model is immediate. For instance:
r²(Y1|23) = SSR(X1|X2,X3) / SSE(X2,X3)
r²(Y2|13) = SSR(X2|X1,X3) / SSE(X1,X3)
r²(Y3|12) = SSR(X3|X1,X2) / SSE(X1,X2)
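A numpy sketch computing these ratios directly from error sums of squares (simulated data, an assumption for illustration):

    import numpy as np

    rng = np.random.default_rng(4)
    n = 60
    x1, x2 = rng.normal(size=n), rng.normal(size=n)
    y = 1 + 2 * x1 + 3 * x2 + rng.normal(size=n)

    def sse(y, *xs):
        # Error sum of squares from the least squares fit of y on the given xs.
        X = np.column_stack([np.ones_like(y), *xs])
        resid = y - X @ np.linalg.lstsq(X, y, rcond=None)[0]
        return resid @ resid

    # r2(Y1|2) = SSR(X1|X2)/SSE(X2) and r2(Y2|1) = SSR(X2|X1)/SSE(X1)
    r2_y1_2 = (sse(y, x2) - sse(y, x1, x2)) / sse(y, x2)
    r2_y2_1 = (sse(y, x1) - sse(y, x1, x2)) / sse(y, x1)
    print(r2_y1_2, r2_y2_1)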
32
The coefficient of partial correlation is the square root of a coefficient of partial determination; its sign is the same as that of the corresponding regression coefficient. The coefficient of partial determination can be expressed in terms of simple or other partial correlations. For example:
r²(Y1|2) = (r(Y1) - r(Y2) r(12))² / [ (1 - r²(Y2))(1 - r²(12)) ]
where r(Y1), r(Y2), and r(12) are the simple correlations among Y, X1, and X2.
33
Standardized regression coefficients:
Can help reduce round-off errors in calculations.
Put regression coefficients in common units.
Units for the usual coefficients are units for Y divided by units for X.
34
Standardized coefficients can be obtained from the usual ones by multiplying by the ratio of the standard deviation of X to the standard deviation of Y. The interpretation is that a one-SD increase in X corresponds to a 'standardized beta' SD increase in Y.
35
Y = … + βX + …
  = … + β(sX/sY)(sY/sX)X + …
  = … + (β(sX/sY)) ((sY/sX)X) + …
  = … + (β(sX/sY)) (sY) (X/sX) + …
so the standardized coefficient β(sX/sY) is attached to the standardized variable X/sX.
36
Standardize Y and all X's (subtract the mean and divide by the standard deviation), then divide by sqrt(n-1). The regression coefficients for variables transformed in this way are the standardized regression coefficients.
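A numpy sketch (simulated data, an assumption for illustration) checking that fitting the standardized variables reproduces β(sX/sY); the constant 1/sqrt(n-1) cancels between Y and the X's, so it does not change the slopes:

    import numpy as np

    rng = np.random.default_rng(5)
    n = 80
    x1, x2 = rng.normal(2, 3, n), rng.normal(-1, 0.5, n)
    y = 4 + 1.5 * x1 - 2.0 * x2 + rng.normal(size=n)

    def slopes(y, *xs):
        # Least squares slope coefficients (intercept dropped).
        X = np.column_stack([np.ones_like(y), *xs])
        return np.linalg.lstsq(X, y, rcond=None)[0][1:]

    def z(v):
        # Standardize: subtract the mean, divide by the sample SD.
        return (v - v.mean()) / v.std(ddof=1)

    b = slopes(y, x1, x2)
    b_std = slopes(z(y), z(x1), z(x2))
    ratio = np.array([x1.std(ddof=1), x2.std(ddof=1)]) / y.std(ddof=1)
    print(np.allclose(b_std, b * ratio))  # True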
37
Reading: NKMW 8.1 to 8.3
Exercises: NKMW pages 308-312, nos. 8.3 and 8.12
Homework: NKMW pages 308-312, nos. 8.25 and 8.31