Incremental Partitioning of Variance (aka Hierarchical Regression)


Incremental Partitioning of Variance (aka Hierarchical Regression). SRM 625 Applied Multiple Regression, Hutchinson

What is hierarchical regression? Typically used with survey or other non-experimental data, i.e., anything except experimental data.

Why is it used? To test the effects of one or more IV's after controlling for one or more other IV's. To determine the amount of variance a particular variable (or set of variables) explains, above and beyond what another variable (or set of variables) already explains.

How does this differ from examination of part/partial correlations in a standard regression analysis?

When to use hierarchical regression: when your purpose is explanation, and when you have a solid theoretical basis for determining which variables to control and their order of entry into the equation.

When should you not use hierarchical regression? When your purpose is prediction. When you are interested in testing the relative importance of predictors. Why is this inappropriate for hierarchical regression?

Determining order of variable entry. Based on theory or prior research, e.g., prior research suggests a particular variable may "cause" both Y and your IV of interest. Based on logic, e.g., you would not control for a variable which occurs after your primary IV.

What is wrong with the logic of this analysis? Example: A researcher wishes to determine if certain dispositional or personality characteristics account for whether or not freshmen drop out of school. He administers several personality inventories during the 1st week of class. At the end of the freshman year he administers a survey to measure college experiences. He then conducts a hierarchical regression to determine if dispositional characteristics explain a significant proportion of variance in likelihood of dropping out, after controlling for college experiences.

Other considerations in determining order of entry. Typically the variable(s) of primary interest is(are) entered last. Order should not be based on "giving variables a chance" by entering them first. Order should not be based on examination of bivariate correlations.

Sets of variables may be used either as control variables or as variables of primary interest. Use sets when you have conceptually and empirically related IV's, e.g., subscales on a personality inventory. Use sets when you have non-causally correlated exogenous variables, e.g., SES and school resources.

[Path diagram relating the variables ses, res, mot, and ach]

How does one conduct a hierarchical regression analysis? Run the regression analysis in a series of steps. A hierarchical regression requires a minimum of two steps: Step 1, enter the control variable(s); Step 2, enter the variable(s) of primary interest.
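As an illustration (not part of the original slides), here is a minimal sketch in Python with statsmodels, using simulated data and hypothetical variable names borrowed from the path diagram above (ses and res as control variables, mot as the IV of primary interest, ach as Y):

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf
from statsmodels.stats.anova import anova_lm

# Simulated data, purely illustrative
rng = np.random.default_rng(0)
n = 200
ses = rng.normal(size=n)
res = 0.5 * ses + rng.normal(size=n)          # correlated exogenous controls
mot = 0.4 * ses + rng.normal(size=n)
ach = 0.3 * ses + 0.2 * res + 0.5 * mot + rng.normal(size=n)
df = pd.DataFrame({"ses": ses, "res": res, "mot": mot, "ach": ach})

# Step 1: enter the control variables only
step1 = smf.ols("ach ~ ses + res", data=df).fit()

# Step 2: add the variable of primary interest
step2 = smf.ols("ach ~ ses + res + mot", data=df).fit()

print("R2 at step 1:", round(step1.rsquared, 4))
print("R2 at step 2:", round(step2.rsquared, 4))
print("R2 change:   ", round(step2.rsquared - step1.rsquared, 4))

# F test of the R2 change (nested-model comparison; error term from the larger model)
print(anova_lm(step1, step2))
```

The last line is the test of the increment in R2 described on the following slides.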

Then conduct a test of the increment (or change) in R2 at step 2, i.e., a test of the increase in R2 produced by the additional variable(s) above and beyond what the control variables explained (in other words, how much R2 increased at step 2). Hierarchical regression may involve more than 2 steps, in which case successive improvements in R2 are tested at each step.

Determining the increment in R2. The increment in R2 is the same as a squared part (semipartial) correlation, i.e., R2 change = R2 step 2 - R2 step 1.

The same general procedure applies when you have more than 2 steps in the analysis, e.g., increment added at step 3 = R2 step 3 - R2 step 2.

Likewise, the increment in R2 added by a "set" of variables is tested using squared part correlations.
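As a quick check of this equivalence (again with simulated data and hypothetical variable names), the R2 change from adding mot should match the squared correlation between Y and the part of mot that is independent of the controls:

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Simulated data, same illustrative setup as the earlier sketch
rng = np.random.default_rng(0)
n = 200
ses = rng.normal(size=n)
res = 0.5 * ses + rng.normal(size=n)
mot = 0.4 * ses + rng.normal(size=n)
ach = 0.3 * ses + 0.2 * res + 0.5 * mot + rng.normal(size=n)
df = pd.DataFrame({"ses": ses, "res": res, "mot": mot, "ach": ach})

step1 = smf.ols("ach ~ ses + res", data=df).fit()
step2 = smf.ols("ach ~ ses + res + mot", data=df).fit()
r2_change = step2.rsquared - step1.rsquared

# Squared part (semipartial) correlation: residualize mot on the controls,
# correlate that residual with Y, and square the result
mot_resid = smf.ols("mot ~ ses + res", data=df).fit().resid
sr2 = np.corrcoef(df["ach"], mot_resid)[0, 1] ** 2

print(round(r2_change, 6), round(sr2, 6))   # the two values agree
```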

Testing the increment in R2. There are two general approaches to testing the R2 increment at each step. They differ in the choice of error term.

Approach I: Using Model I Error. Model I error uses the error term at the step being tested, so variance due to subsequent (not-yet-entered) IV's remains in the error term; it is not based on the full model. The error term and its degrees of freedom change at each step as variables are added into the equation.

For example, for a 2-step analysis with two variables at step 1 and one variable added at step 2, you would conduct two F tests -- one for the control variables and one for the IV of primary interest, as sketched below.
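One way to write these two tests (a sketch using the standard R2-change F statistic, assuming n observations; the notation here is mine):

```latex
% Step 1: test of the two control variables, error term from the step-1 model
F_{\text{step 1}} = \frac{R^2_{\text{step 1}} / 2}{\left(1 - R^2_{\text{step 1}}\right) / (n - 2 - 1)},
\qquad df = (2,\; n - 3)

% Step 2: test of the one added IV, error term from the step-2 model
F_{\text{step 2}} = \frac{\left(R^2_{\text{step 2}} - R^2_{\text{step 1}}\right) / 1}{\left(1 - R^2_{\text{step 2}}\right) / (n - 3 - 1)},
\qquad df = (1,\; n - 4)
```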

How would you more generally express the F test equations when using the Model I error approach (regardless of the number of variables at each step)?
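One common general form (a sketch, with my own notation): let k_i be the number of IV's added at step i and K_i the total number of IV's entered through step i.

```latex
F_{\text{step } i} =
  \frac{\left(R^2_{\text{step } i} - R^2_{\text{step } i-1}\right) / k_i}
       {\left(1 - R^2_{\text{step } i}\right) / (n - K_i - 1)},
\qquad df = (k_i,\; n - K_i - 1)
```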

Approach II: Using Model II Error. Model II error uses the error term from the last step (i.e., the full model) after all variables have been entered. All tests of increments in R2 are based on the same mean square error and degrees of freedom. This is analogous to the error term in ANOVA, where all IV's are tested against a single error term.

Using the same example as before, how would you conduct the two F tests -- one for the control variables and one for the primary IV of interest, using the Model II error (aka "pure" error)?
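A sketch under the same assumptions (n observations, K = 3 IV's in the full model, and R2_full denoting the R2 from the final step), with both tests sharing the full-model error term:

```latex
% Step 1: control variables, tested against the full-model ("pure") error
F_{\text{step 1}} = \frac{R^2_{\text{step 1}} / 2}{\left(1 - R^2_{\text{full}}\right) / (n - K - 1)},
\qquad df = (2,\; n - 4)

% Step 2: added IV, tested against the full-model ("pure") error
F_{\text{step 2}} = \frac{\left(R^2_{\text{step 2}} - R^2_{\text{step 1}}\right) / 1}{\left(1 - R^2_{\text{full}}\right) / (n - K - 1)},
\qquad df = (1,\; n - 4)
```

Note that in a two-step analysis the full model is the step-2 model, so the step-2 test is the same under either approach; the two approaches differ only in the test of the step-1 (control) block.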

Advantages of Model II approach: Removes systematic sources of variance from the error term. Generally more powerful than the Model I approach, since the error term will tend to be smaller.

Disadvantages of Model II approach: Can be less powerful due to loss of error degrees of freedom if IV's at subsequent steps add only small increments to R2 and there are numerous IV's in the overall model. Usually requires hand calculation, since it is not available in most standard statistical packages.
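The hand calculation is only a few lines, though. A sketch in Python with scipy (the R2 values, n, and K below are hypothetical, just to show the arithmetic):

```python
from scipy import stats

# Hypothetical inputs, purely for illustration
n, K = 100, 3                       # sample size; total IV's in the full model
r2_step1, r2_full = 0.20, 0.30      # R2 after step 1 and after the final step

ms_error = (1 - r2_full) / (n - K - 1)     # full-model ("pure") error term

# Model II F test of the step-1 block (2 control variables)
F_step1 = (r2_step1 / 2) / ms_error
p_step1 = stats.f.sf(F_step1, 2, n - K - 1)

# Model II F test of the step-2 increment (1 added IV)
F_step2 = ((r2_full - r2_step1) / 1) / ms_error
p_step2 = stats.f.sf(F_step2, 1, n - K - 1)

print(F_step1, p_step1)
print(F_step2, p_step2)
```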

Advantages of Model I approach: Can be more powerful if increments in R2 are trivial and the overall model contains many variables. Easily obtained in standard statistical packages.

Disadvantages of Model I approach: May violate the assumption of random error if IV's left in the error term are significant. Generally less powerful, since the error term contains variance due to subsequent IV's.

How do you choose between the two approaches? If the IV's add only trivial increments to R2 and numerous IV's are included in the model, use the Model I approach. If the IV's add substantial increments to R2 and/or the model does not have many IV's, use the Model II approach.

Bottom line: take the approach which has greater power for your variables and data set.

Other cautions and issues. The increment in R2 will depend not only upon the correlation between the variable(s) of interest and Y, but also upon how correlated the independent variables are with each other. Why?
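One way to see why (a sketch for the simplest case, with one control X1 and one variable of interest X2): the increment is the squared semipartial correlation, which can be written from the bivariate correlations as

```latex
\Delta R^2 = sr^2_{Y(2 \cdot 1)} = \frac{\left(r_{Y2} - r_{Y1}\, r_{12}\right)^2}{1 - r_{12}^2}
```

When the predictors are positively correlated with Y and with each other, a larger r_{12} typically shrinks the numerator, so less of X2's relationship with Y is unique and the increment tends to be smaller (although suppression effects can run in the other direction).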

Cautions, cont'd. Results of hierarchical regression should not be used to compare increments in R2 across samples. R2 is sample specific and is affected by a number of factors, including the variability and reliability (measurement consistency) of the variables.