LECTURE 5: MULTIPLE REGRESSION TOPICS
– SQUARED MULTIPLE CORRELATION
– B AND BETA WEIGHTS
– HIERARCHICAL REGRESSION MODELS
– SETS OF INDEPENDENT VARIABLES
– SIGNIFICANCE TESTING SETS
– POWER
– ERROR RATES
SQUARED MULTIPLE CORRELATION
– Measure of the variance in y accounted for by the predictors
– Always increases (or stays the same) with additional predictors
– Always >= 0 in OLS
– More stable across samples than individual predictor weights (compensatory effect across samples)
Multiple regression analysis
The test of the overall hypothesis that y is unrelated to all predictors, equivalent to
H0: ρ²_y.123…p = 0
H1: ρ²_y.123…p > 0
is tested by
F = [ R²_y.123…p / p ] / [ (1 – R²_y.123…p) / (n – p – 1) ]
F = [ SS_reg / p ] / [ SS_e / (n – p – 1) ]
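A minimal sketch in Python (not part of the original slides) of this overall test, using NumPy least squares on simulated data; the sample size, coefficients, and seed are illustrative assumptions.

```python
# A minimal sketch, assuming simulated data and NumPy only; the coefficients,
# sample size, and seed are illustrative, not from the lecture.
import numpy as np

rng = np.random.default_rng(0)
n, p = 100, 3                                   # sample size, number of predictors
X = rng.normal(size=(n, p))
y = X @ np.array([0.5, 0.3, 0.0]) + rng.normal(size=n)

X1 = np.column_stack([np.ones(n), X])           # add intercept column
b = np.linalg.lstsq(X1, y, rcond=None)[0]
yhat = X1 @ b

ss_reg = np.sum((yhat - y.mean()) ** 2)
ss_err = np.sum((y - yhat) ** 2)
r2 = ss_reg / (ss_reg + ss_err)

# F = [R^2 / p] / [(1 - R^2) / (n - p - 1)] = [SS_reg / p] / [SS_e / (n - p - 1)]
F = (r2 / p) / ((1 - r2) / (n - p - 1))
print(f"R^2 = {r2:.3f}, F({p}, {n - p - 1}) = {F:.2f}")
```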
Fig. 8.4: Venn diagram for multiple regression with two predictors and one outcome measure (SSx1, SSx2, SSy, SSreg, SSe)
Fig. 8.5: Venn diagram of Type I and Type III sums-of-squares contributions of SSx1 and SSx2 to SSy (with SSe)
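The Type I versus Type III distinction can be made concrete with a short sketch on simulated data; the helper ss_model and all numeric values are illustrative assumptions, not lecture code.

```python
# A minimal sketch, assuming simulated correlated predictors; ss_model is an
# illustrative helper, not lecture code.
import numpy as np

rng = np.random.default_rng(1)
n = 200
x1 = rng.normal(size=n)
x2 = 0.6 * x1 + rng.normal(size=n)              # x1 and x2 share variance
y = 0.4 * x1 + 0.3 * x2 + rng.normal(size=n)

def ss_model(*cols):
    """Regression sum of squares for an intercept + given columns."""
    X = np.column_stack([np.ones(n)] + list(cols))
    yhat = X @ np.linalg.lstsq(X, y, rcond=None)[0]
    return np.sum((yhat - y.mean()) ** 2)

# Type I (sequential): credit x1 first, then the increment when x2 is added.
type1_x1 = ss_model(x1)
type1_x2 = ss_model(x1, x2) - ss_model(x1)

# Type III (unique): each predictor's contribution over and above the other.
type3_x1 = ss_model(x1, x2) - ss_model(x2)
type3_x2 = ss_model(x1, x2) - ss_model(x1)      # equals Type I for the last predictor entered

print(f"Type I  : x1 = {type1_x1:.1f}, x2 = {type1_x2:.1f}")
print(f"Type III: x1 = {type3_x1:.1f}, x2 = {type3_x2:.1f}")
```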
B and Beta Weights
B weights
– are t-distributed under multinormality
– give the change in y per unit change in predictor x
– "raw" or "unstandardized" coefficients
B and Beta Weights
Beta weights
– are NOT t-distributed; no correct significance test
– give the change in y in standard deviation units per standard-deviation change in predictor x
– "standardized" coefficients
– more easily interpreted and compared when predictors are on different scales
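A minimal sketch of the relationship between the two kinds of weights, assuming simulated data; it uses the standard conversion beta_j = B_j * (s_xj / s_y).

```python
# A minimal sketch, assuming simulated data; shows beta_j = B_j * (s_xj / s_y),
# the standard conversion between raw and standardized weights.
import numpy as np

rng = np.random.default_rng(2)
n = 150
X = rng.normal(scale=[2.0, 5.0], size=(n, 2))    # predictors on different scales
y = 1.5 * X[:, 0] + 0.2 * X[:, 1] + rng.normal(size=n)

X1 = np.column_stack([np.ones(n), X])
coef = np.linalg.lstsq(X1, y, rcond=None)[0]
B = coef[1:]                                     # raw (unstandardized) B weights

beta = B * X.std(axis=0, ddof=1) / y.std(ddof=1) # standardized beta weights
print("B    :", np.round(B, 3))
print("beta :", np.round(beta, 3))
```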
PATH DIAGRAM FOR REGRESSION – Beta weight form
X1 and X2 predicting Y (residual e): β1 = .5, β2 = .6, r(X1, X2) = .4
R² = β1·r_y1 + β2·r_y2 = .5(.74) + .6(.80) = .85
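A short check of the decomposition used in the diagram, assuming the standard identity R² = Σ βᵢ·r_yi with the validities implied by the betas and the predictor correlation; the numeric values are those shown above.

```python
# A minimal sketch, assuming the path-diagram values beta1 = .5, beta2 = .6, r12 = .4.
beta1, beta2, r12 = 0.5, 0.6, 0.4

# Implied validities (predictor-criterion correlations) from the path diagram
r_y1 = beta1 + beta2 * r12      # .5 + .6(.4) = .74
r_y2 = beta2 + beta1 * r12      # .6 + .5(.4) = .80

# R^2 as the sum of beta * validity products
r_squared = beta1 * r_y1 + beta2 * r_y2
print(round(r_y1, 2), round(r_y2, 2), round(r_squared, 2))   # 0.74 0.8 0.85
```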
Path diagram: DEPRESSION predicted by LOC. CON., SELF-EST, and SELF-REL; R² = .60, residual e
PATH DIAGRAM FOR REGRESSIONS – Beta weight form
Y1 predicted by X1 and X2 (residual e1): β1 = .2, β2 = .3, r(X1, X2) = .35*
Y2 (residual e2): β = .2, .5, .3; R²_y2 = .2
HIERARCHICAL REGRESSION
– Predictors entered in SETS
– First set: either causally prior variables, existing conditions, or a theoretically/empirically established structure
– Next set added to decide whether the model changes
– Mediation effect
– Independent contribution to R-square
HIERARCHICAL REGRESSION
Sample-focused procedures:
– Forward regression
– Backward regression
– Stepwise regression
Criteria may include: R-square change in the sample, error reduction
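A minimal sketch of one sample-focused procedure (forward selection driven by sample R-square change); the function names, the min_gain threshold, and the simulated data are illustrative assumptions.

```python
# A minimal sketch of forward selection by sample R^2 gain; names are illustrative.
import numpy as np

def r2(X, y):
    """R^2 for an intercept + X model."""
    X1 = np.column_stack([np.ones(len(y)), X])
    yhat = X1 @ np.linalg.lstsq(X1, y, rcond=None)[0]
    return 1 - np.sum((y - yhat) ** 2) / np.sum((y - y.mean()) ** 2)

def forward_select(X, y, min_gain=0.01):
    """Add one predictor at a time, keeping the largest R^2 gain above min_gain."""
    selected, remaining, current = [], list(range(X.shape[1])), 0.0
    while remaining:
        gains = {j: r2(X[:, selected + [j]], y) - current for j in remaining}
        best = max(gains, key=gains.get)
        if gains[best] < min_gain:
            break
        selected.append(best)
        remaining.remove(best)
        current += gains[best]
    return selected, current

rng = np.random.default_rng(3)
X = rng.normal(size=(120, 5))
y = 0.6 * X[:, 0] + 0.4 * X[:, 2] + rng.normal(size=120)
print(forward_select(X, y))      # typically selects columns 0 and 2
```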
STATISTICAL TESTING – Single additional predictor
R-square change: F-test for the increase in SS due to the single added predictor B, relative to the MS error of the complete model:
F(1, dfe) = (SS_A+B – SS_A) / MSe_AB
Equivalently, the t-test of the added predictor's coefficient: t = b_yB / se(b_yB), with t² = F(1, dfe).
(Venn diagram: predictor A and added predictor B predicting Y)
STATISTICAL TESTING – Sets of predictors
R-square change: F-test for the increase in SS due to a set B of p added predictors, relative to the MS error of the complete model:
F(p, dfe) = ( (SS_A+B – SS_A) / p ) / MSe_AB
(Venn diagram: predictor sets A and B predicting Y; B is a set of p predictors)
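A minimal sketch of this set-based R-square-change test, assuming simulated data; fit_ss, the set sizes, and the coefficients are illustrative, not lecture code.

```python
# A minimal sketch of F = [(SS_{A+B} - SS_A) / p] / MSe_{AB} for a set B of p predictors.
import numpy as np

def fit_ss(X, y):
    """Return (SS_reg, SS_err) for an intercept + X model."""
    X1 = np.column_stack([np.ones(len(y)), X])
    yhat = X1 @ np.linalg.lstsq(X1, y, rcond=None)[0]
    return np.sum((yhat - y.mean()) ** 2), np.sum((y - yhat) ** 2)

rng = np.random.default_rng(4)
n = 150
A = rng.normal(size=(n, 2))                       # set A, entered first
B = rng.normal(size=(n, 3))                       # set B, p = 3 added predictors
y = A @ np.array([0.5, 0.3]) + B @ np.array([0.4, 0.0, 0.0]) + rng.normal(size=n)

ss_reg_A, _ = fit_ss(A, y)
ss_reg_AB, ss_err_AB = fit_ss(np.column_stack([A, B]), y)

p = B.shape[1]
df_err = n - (A.shape[1] + p) - 1                 # error df for the complete model
F = ((ss_reg_AB - ss_reg_A) / p) / (ss_err_AB / df_err)
print(f"F({p}, {df_err}) = {F:.2f}")
```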
Experimentwise Error Rate
Bonferroni error rate: p_total <= p1 + p2 + p3 + …
Allocate error differentially according to theory:
– Predicted variables should have a liberal error rate for deletion (e.g., .05 to retain in the model)
– Unpredicted additional variables should have a conservative error rate for entry (e.g., .01 to add to the model)
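A small sketch of the differential alpha allocation described above; the particular alpha values are illustrative assumptions.

```python
# A minimal sketch: the experimentwise error rate is bounded by the sum of the
# per-test alphas, so liberal and conservative tests can be mixed deliberately.
predicted_alphas = [0.05, 0.05]            # liberal: theoretically predicted variables
unpredicted_alphas = [0.01, 0.01, 0.01]    # conservative: unpredicted additional variables

bound = sum(predicted_alphas) + sum(unpredicted_alphas)
print(f"Experimentwise error rate <= {bound:.2f}")   # <= 0.13
```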