
Incremental Partitioning of Variance (aka Hierarchical Regression)


1 Incremental Partitioning of Variance (aka Hierarchical Regression)
SRM 625 Applied Multiple Regression, Hutchinson

2 What is hierarchical regression?
Used with survey data or other non-experimental data (i.e., anything except experimental data)

3 Why is it used?
- To test the effects of one or more IVs after controlling for one or more other IVs
- To determine the amount of variance a particular variable (or set of variables) explains, above and beyond what another variable (or set of variables) already explains

4 How does this differ from examination of part/partial correlations in a standard regression analysis?

5 When to use hierarchical regression
- When your purpose is explanation
- When you have a solid theoretical basis for determining which variables to control and their order of entry into the equation

6 When should you not use hierarchical regression?
- When your purpose is prediction
- When you are interested in testing the relative importance of predictors
Why is this inappropriate for hierarchical regression?

7 Determining order of variable entry
- Based on theory or prior research, e.g., prior research suggests a particular variable may "cause" both Y and your IV of interest
- Based on logic, e.g., you would not control for a variable which occurs after your primary IV

8 What is wrong with the logic of this analysis?
Example: A researcher wishes to determine if certain dispositional or personality characteristics account for whether or not freshmen drop out of school. He administers several personality inventories during the 1st week of class. At the end of the freshman year he administers a survey to measure college experiences. He then conducts a hierarchical regression to determine if dispositional characteristics explain a significant proportion of variance in likelihood of dropping out, after controlling for college experiences. What is wrong with the logic of this analysis?

9 Other considerations in determining order of entry
- Typically the variable(s) of primary interest is (are) entered last
- Order should not be based on "giving variables a chance" by entering them first
- Order should not be based on examination of bivariate correlations

10 Sets of variables may be used either as control variables or as variables of primary interest
- Use sets when you have conceptually and empirically related IVs, e.g., subscales on a personality inventory
- Use sets when you have non-causally correlated exogenous variables, e.g., SES and school resources

11 [Path diagram relating the variables ses, res, mot, and ach]

12 How does one conduct a hierarchical regression analysis?
- Run the regression analysis in a series of steps
- A hierarchical regression requires a minimum of two steps
- Step 1: enter the control variable(s)
- Step 2: enter the variable(s) of primary interest
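The two steps can be sketched on simulated data with plain NumPy least squares (variable names x1, x2, x3 and the effect sizes are hypothetical; in practice a statistical package's R2-change option does the same thing):

```python
# Minimal sketch of a two-step hierarchical regression on simulated
# data. Variable names and coefficients are hypothetical.
import numpy as np

def r_squared(y, *predictors):
    """R^2 from an OLS fit of y on the given predictors plus an intercept."""
    X = np.column_stack([np.ones(len(y)), *predictors])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    return 1 - resid @ resid / np.sum((y - y.mean()) ** 2)

rng = np.random.default_rng(0)
n = 200
x1, x2 = rng.normal(size=n), rng.normal(size=n)  # control variables
x3 = 0.3 * x1 + rng.normal(size=n)               # variable of primary interest
y = 0.5 * x1 + 0.4 * x2 + 0.6 * x3 + rng.normal(size=n)

r2_step1 = r_squared(y, x1, x2)       # Step 1: controls only
r2_step2 = r_squared(y, x1, x2, x3)   # Step 2: add the focal variable
r2_change = r2_step2 - r2_step1       # increment attributed to x3
print(f"R2 step 1 = {r2_step1:.3f}, R2 step 2 = {r2_step2:.3f}, "
      f"change = {r2_change:.3f}")
```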

13 Then conduct a test of the increment (or change) in R2 at step 2
i.e., this is a test of the increase in R2 produced by the additional variable(s), above and beyond what the control variables explained
Hierarchical regression may involve > 2 steps, in which case successive improvements in R2 are tested at each step

14 Determining the increment in R2
The increment in R2 is the same as a squared part correlation, i.e., R2 change = R2 step 2 - R2 step 1, or equivalently the squared part correlation of the step-2 variable with Y (controlling the step-1 variables out of the step-2 variable only)
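This identity can be checked numerically (simulated data; names hypothetical): residualize the added predictor on the control, correlate that residual with Y, and square it — the result agrees with the R2 change:

```python
# Numerical check that the step-2 increment in R^2 equals the squared
# part (semipartial) correlation of the added predictor with Y.
import numpy as np

def ols_resid(y, X):
    """Residuals from an OLS fit of y on X plus an intercept."""
    X = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return y - X @ beta

def r_squared(y, X):
    resid = ols_resid(y, X)
    return 1 - resid @ resid / np.sum((y - y.mean()) ** 2)

rng = np.random.default_rng(1)
n = 300
x1 = rng.normal(size=n)             # control entered at step 1
x2 = 0.4 * x1 + rng.normal(size=n)  # predictor added at step 2
y = 0.5 * x1 + 0.6 * x2 + rng.normal(size=n)

r2_change = r_squared(y, np.column_stack([x1, x2])) - r_squared(y, x1)

# Part correlation: correlate y with x2 after removing x1 from x2 only
x2_given_x1 = ols_resid(x2, x1)
part_r = np.corrcoef(y, x2_given_x1)[0, 1]

print(round(r2_change, 8), round(part_r ** 2, 8))  # should agree
```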

15 The same general procedure applies when you have > 2 steps in the analysis: increment added at step 3 = R2 step 3 - R2 step 2, or equivalently the corresponding squared part correlation

16 Likewise, the increment in R2 added by a "set" of variables is tested using squared part correlations.

17 Testing the increment in R2
There are two general approaches to testing the R2 at each step; they differ in the choice of error term

18 Approach I: Using Model I Error
- Model I error uses the error term at the step being tested, so variance due to subsequent IVs remains in the error term (rather than using the full model)
- The error term and its degrees of freedom change at each step as variables are added into the equation

19 For example, for a 2-step analysis with two variables at step 1 and one variable added at step 2, you would conduct two F tests -- one for the control variables and one for the IV of primary interest.
Step 1: F = (R2 step 1 / 2) / [(1 - R2 step 1) / (N - 3)]
Step 2: F = [(R2 step 2 - R2 step 1) / 1] / [(1 - R2 step 2) / (N - 4)]

How would you more generally express the F test equations when using the Model I error approach (regardless of the number of variables at each step)?

21 Approach II: Using Model II Error
- Model II error uses the error term from the last step (i.e., the full model), after all variables have been entered
- All tests of increments in R2 are based on the same mean square error and degrees of freedom
- Analogous to ANOVA, where all IVs are tested against a single error term

22 Using the same example as on slide 19, how would you conduct the two F tests -- one for the control variables and one for the primary IV of interest -- using Model II (aka "pure") error?
Step 1: F = (R2 step 1 / 2) / [(1 - R2 step 2) / (N - 4)]
Step 2: F = [(R2 step 2 - R2 step 1) / 1] / [(1 - R2 step 2) / (N - 4)]

23 Advantages of Model II approach
- Removes systematic sources of variance from the error term
- Generally more powerful than the Model I approach, since the error term will tend to be smaller

24 Disadvantages of Model II approach
- Can be less powerful, due to the loss of error degrees of freedom, if IVs at subsequent steps add only small increments to R2 and there are numerous IVs in the overall model
- Usually requires hand calculation, since it is not available in most standard statistical packages

25 Advantages of Model I approach
- Can be more powerful if increments in R2 are trivial and the overall model contains many variables
- Easily obtained in standard statistical packages

26 Disadvantages of Model I approach
- May violate the assumption of random error if IVs left in the error term are significant
- Generally less powerful, since the error term contains variance due to subsequent IVs

27 How do you choose between the two approaches?
- If the added IVs are trivial and numerous IVs are included in the model, use the Model I approach
- If the IVs add substantial increments to R2 and/or the model does not have many IVs, use the Model II approach

28 Bottom line: take the approach which has greater power for your variables and data set

29 Other cautions and issues
The increment in R2 will depend not only upon the correlation between the variable(s) of interest and Y, but also upon how correlated the independent variables are with each other. Why?
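One way to see this is a simulated illustration (hypothetical setup): hold the added predictor's effect on Y fixed and vary its overlap with the control variable — the more the two IVs are correlated, the smaller the unique increment:

```python
# Simulated illustration: as the added predictor x2 overlaps more with
# the control x1, its unique increment to R^2 shrinks, even though its
# effect on Y is held constant.
import numpy as np

def r_squared(y, *predictors):
    X = np.column_stack([np.ones(len(y)), *predictors])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    return 1 - resid @ resid / np.sum((y - y.mean()) ** 2)

rng = np.random.default_rng(3)
n = 5000

def increment(overlap):
    """R2 change from adding x2, where corr(x1, x2) is about `overlap`."""
    x1 = rng.normal(size=n)
    x2 = overlap * x1 + np.sqrt(1 - overlap**2) * rng.normal(size=n)
    y = 0.5 * x1 + 0.5 * x2 + rng.normal(size=n)
    return r_squared(y, x1, x2) - r_squared(y, x1)

low_overlap, high_overlap = increment(0.0), increment(0.8)
print(f"increment at r=0.0: {low_overlap:.3f}, at r=0.8: {high_overlap:.3f}")
```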

30 Cautions -- cont'd
- Results of hierarchical regression should not be used to compare increments in R2 across samples
- R2 is sample specific and is affected by a number of factors, including the variability and reliability of the variables

