Moderated Multiple Regression Class 23. STATS TAKE HOME EXERCISE IS DUE THURSDAY DEC. 12 Deliver to Kent’s Mailbox or Place under his door (Rm. 352)

Presentation transcript:


Regression Model for Esteem and Affect as Information

Model: Y = b0 + b1X + b2Z + b3XZ

Where Y = cry rating, X = upset, Z = esteem, XZ = esteem × upset

And:
b0 = X.XX = MEANING?
b1 = X.XX = MEANING?
b2 = X.XX = MEANING?
b3 = X.XX = MEANING?

Regression Model for Esteem and Affect as Information

Model: Y = b0 + b1X + b2Z + b3XZ

Where Y = cry rating, X = upset, Z = esteem, XZ = esteem × upset

And:
b0 = 6.53 = intercept (predicted score when upset, esteem, and upset × esteem all = 0)
b1 = -.53 = slope (influence) of upset
b2 = -.48 = slope (influence) of esteem
b3 = 0.18 = slope (influence) of the upset × esteem interaction

Plotting Outcome: Baby Cry Ratings as a Function of Listener's Upset and Listener's Self Esteem ???

Plotting Outcome: Baby Cry Ratings as a Function of Listener's Upset and Listener's Self-Esteem [graph: Y axis = cry rating, X axis = upset, separate lines for levels of self-esteem]

Plotting Interactions with Two Continuous Variables

Y = b0 + b1X + b2Z + b3XZ can be rewritten as Y = (b1 + b3Z)X + (b2Z + b0)

(b1 + b3Z) is the simple slope of Y on X at Z. It means "the effect X has on Y, conditioned by the interactive contribution of Z." Thus, when Z is one value, the X slope takes one shape; when Z is another value, the X slope takes another shape.

Plotting Simple Slopes

1. Compute the regression to obtain values of Y = b0 + b1X + b2Z + b3XZ.
2. Transform Y = b0 + b1X + b2Z + b3XZ into Y = (b1 + b3Z)X + (b2Z + b0) and insert values: Y = (? + ?Z)X + (?Z + ?)
3. Select 3 values of Z that display the simple slopes of X when Z is low, average, and high. Standard practice:
Z at one SD above the mean = ZH
Z at the mean = ZM
Z at one SD below the mean = ZL

Interpreting SPSS Regression Output (a) Regression page A1

Plotting Simple Slopes (continued)

4. Insert values for all the regression coefficients (i.e., b1, b2, b3) and the intercept (i.e., b0) from the computation (i.e., the SPSS printout).
5. Insert ZH into (b1 + b3Z)X + (b2Z + b0) to get the slope when Z is high.
Insert ZM into (b1 + b3Z)X + (b2Z + b0) to get the slope when Z is moderate.
Insert ZL into (b1 + b3Z)X + (b2Z + b0) to get the slope when Z is low.
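Steps 1–5 above can be sketched in Python. The helper functions mirror the algebra of the conversion Y = (b1 + b3Z)X + (b2Z + b0); the example plugs in the b0, b2, and b3 values from the slides, with b1 = -.53 an assumed value for illustration.

```python
# Simple-slope helpers mirroring Y = (b1 + b3*Z)X + (b2*Z + b0).

def simple_slope(b1, b3, z):
    """Slope of Y on X at conditional value z of the moderator."""
    return b1 + b3 * z

def simple_intercept(b0, b2, z):
    """Intercept of the simple regression line at z."""
    return b2 * z + b0

def conditional_values(z_mean, z_sd):
    """Standard practice: moderator at +1 SD, at the mean, and at -1 SD."""
    return {"high": z_mean + z_sd, "mean": z_mean, "low": z_mean - z_sd}

# b0, b2, b3 taken from the slides; b1 = -.53 is assumed for illustration.
b0, b1, b2, b3 = 6.53, -0.53, -0.48, 0.18
for label, z in conditional_values(3.95, 0.76).items():
    print(label, round(simple_slope(b1, b3, z), 2),
          round(simple_intercept(b0, b2, z), 2))
```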

Example of Plotting, Baby Cry Study, Part I

Y (cry rating) = b0 (rating when all predictors = zero) + b1X (effect of upset) + b2Z (effect of esteem) + b3XZ (effect of upset × esteem interaction)

Y = 6.53 - .53X - .48Z + .18XZ

Y = (b1 + b3Z)X + (b2Z + b0) [conversion for simple slopes]
Y = (-.53 + .18Z)X + (-.48Z + 6.53)

Compute ZH, ZM, ZL via "Frequencies" for esteem: 3.95 = mean, .76 = SD
ZH = (3.95 + .76) = 4.71
ZM = 3.95
ZL = (3.95 - .76) = 3.19

Slope at ZH = (-.53 + .18 * 4.71)X + ([-.48 * 4.71] + 6.53) = .32X + 4.27
Slope at ZM = (-.53 + .18 * 3.95)X + ([-.48 * 3.95] + 6.53) = .18X + 4.63
Slope at ZL = (-.53 + .18 * 3.19)X + ([-.48 * 3.19] + 6.53) = .04X + 5.00

Example of Plotting, Baby Cry Study, Part II

1. Compute the mean and SD of the main predictor ("X"), i.e., upset: mean = 2.94, SD = 1.21
2. Select values on the X axis displaying the main predictor, e.g., upset at:
Low upset = 1 SD below mean = 2.94 - 1.21 = 1.73
Medium upset = mean = 2.94
High upset = 1 SD above mean = 2.94 + 1.21 = 4.15
3. Plug these values into the ZH, ZM, ZL simple slope equations:

Simple Slope | Formula         | Low Upset (X = 1.73) | Medium Upset (X = 2.94) | High Upset (X = 4.15)
ZH           | Y = .32X + 4.27 | 4.82                 | 5.21                    | 5.60
ZM           | Y = .18X + 4.63 | 4.94                 | 5.16                    | 5.38
ZL           | Y = .04X + 5.00 | 5.07                 | 5.12                    | 5.17

4. Plot values into graph.

Graph Displaying Simple Slopes

Are the Simple Slopes Significant?

Question: Do the slopes of each of the simple effects lines (ZH, ZM, ZL) significantly differ from zero?

Procedure to test, using ZH (the slope when esteem is high) as an example:
1. Transform Z to Zcvh (cv = conditional value) by subtracting ZH from Z: Zcvh = Z - ZH = Z - 4.71. Conduct this transformation in SPSS as:
COMPUTE esthigh = esteem - 4.71.
2. Create a new interaction term specific to Zcvh, i.e., (X * Zcvh):
COMPUTE upesthi = upset*esthigh.
3. Run the regression, using the same X as before, but substituting Zcvh for Z and X * Zcvh for XZ.
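The logic of this re-centering test can be checked numerically: with the moderator re-centered at ZH, the coefficient on X in the refitted model equals the simple slope b1 + b3·ZH. A minimal sketch on simulated data (not the class dataset), using plain NumPy least squares rather than SPSS:

```python
import numpy as np

def fit(x, z, y):
    """OLS fit of y = b0 + b1*x + b2*z + b3*x*z; returns (b0, b1, b2, b3)."""
    X = np.column_stack([np.ones_like(x), x, z, x * z])
    return np.linalg.lstsq(X, y, rcond=None)[0]

rng = np.random.default_rng(0)
n = 200
x = rng.normal(3.0, 1.0, n)   # predictor (e.g., upset); values invented
z = rng.normal(4.0, 0.8, n)   # moderator (e.g., esteem); values invented
y = 6.5 - 0.5 * x - 0.5 * z + 0.2 * x * z + rng.normal(0, 1, n)

b0, b1, b2, b3 = fit(x, z, y)
z_h = z.mean() + z.std()                 # high conditional value of the moderator
b0h, b1h, b2h, b3h = fit(x, z - z_h, y)  # refit with re-centered moderator

# The x coefficient of the re-centered model is exactly the simple slope at z_h:
print(np.isclose(b1h, b1 + b3 * z_h))    # → True
```

This works because re-centering is an exact reparameterization of the same model, so the refit reproduces b1 + b3·ZH to machine precision.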

Are the Simple Slopes Significant? — Programming

COMMENT SIMPLE SLOPES FOR CLASS DEMO.
COMPUTE esthigh = esteem - 4.71.
COMPUTE estmed = esteem - 3.95.
COMPUTE estlow = esteem - 3.19.
COMPUTE upesthi = esthigh*upset.
COMPUTE upestmed = estmed*upset.
COMPUTE upestlow = estlow*upset.

REGRESSION [for the simple effect of high esteem (esthigh)]
/MISSING LISTWISE
/STATISTICS COEFF OUTS BCOV R ANOVA CHANGE
/CRITERIA=PIN(.05) POUT(.10)
/NOORIGIN
/DEPENDENT crytotl
/METHOD=ENTER upset esthigh
/METHOD=ENTER upset esthigh upesthi.

Simple Slopes Significant? — Results

NOTE: The key outcome is the B for "upset" in Model 2. If it is significant, then the simple effect of upset for the high-esteem slope is significant.

Moderated Multiple Regression with Continuous Predictor and Categorical Moderator (Aguinis, 2004)

Problem: Does caffeine lead to more arguments, but mainly for people with hostile personalities?

Criterion: Weekly arguments (continuous variable)
Predictor: Caffeinated coffee (categorical variable: 0 = decaf, 1 = caffeinated)
Moderator: Hostility (continuous variable)

Regression Models to Test Moderating Effect of Hostility on Arguments

Without interaction: Arguments = b0 (ave. arguments) + b1(coffee.type) + b2(hostility.score)

With interaction: Arguments = b0 (ave. arguments) + b1(coffee) + b2(hostility) + b3(coffee*hostility)

Coffee is categorical, therefore a "dummy variable" with values 0 or 1. These values are markers; they do not convey quantity.

Interaction term = predictor * moderator = coffee*hostility. That simple.

Conduct the regression, plotting, and simple slopes analyses the same way as when the predictor and moderator are both continuous variables.
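A minimal simulated sketch of this dummy-variable setup (NumPy rather than SPSS; the data and coefficient values are invented). It also illustrates why the procedure carries over unchanged: with a 0/1 dummy, fitting the full interaction model is equivalent to fitting each coffee group separately, so the caffeinated group's hostility slope equals b2 + b3.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 400
coffee = rng.integers(0, 2, n).astype(float)  # dummy: 0 = decaf, 1 = caffeinated
hostility = rng.normal(3.6, 1.7, n)
arguments = (0.8 + 0.7 * hostility + 0.45 * coffee * hostility
             + rng.normal(0, 1, n))

# Full moderated model: arguments = b0 + b1*coffee + b2*hostility + b3*coffee*hostility
X = np.column_stack([np.ones(n), coffee, hostility, coffee * hostility])
b0, b1, b2, b3 = np.linalg.lstsq(X, arguments, rcond=None)[0]

# Per-group fit for the caffeinated subjects only:
m = coffee == 1
Xc = np.column_stack([np.ones(m.sum()), hostility[m]])
a_caf, slope_caf = np.linalg.lstsq(Xc, arguments[m], rcond=None)[0]

print(np.isclose(slope_caf, b2 + b3))  # caffeinated slope = b2 + b3 → True
print(np.isclose(a_caf, b0 + b1))      # caffeinated intercept = b0 + b1 → True
```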

[Data set columns: Coffee, Hostility, Args., Coff.hostile]

DATASET ACTIVATE DataSet1.
COMPUTE coffee.hostile=coffee * hostile.personality.
EXECUTE.
REGRESSION
/DESCRIPTIVES MEAN STDDEV CORR SIG N
/MISSING LISTWISE
/STATISTICS COEFF OUTS R ANOVA CHANGE
/CRITERIA=PIN(.05) POUT(.10)
/NOORIGIN
/DEPENDENT arguments
/METHOD=ENTER coffee hostile.personality
/METHOD=ENTER coffee.hostile.

Plotting of Arguments due to Caffeine & Hostility

Y (arguments) = b0 (arguments when all predictors = zero) + b1X (effect of coffee) + b2Z (effect of hostility) + b3XZ (effect of coffee × hostility interaction)

Y = .84 + 0.00X + 0.74Z + 0.44XZ

Y = (b1 + b3Z)X + (b2Z + b0) [conversion for simple slopes]
Y = (0.00 + 0.44Z)X + (.74Z + .84)

Compute ZH, ZM, ZL via "Frequencies" for hostility: 3.60 = mean, 1.72 = SD
ZH = (3.60 + 1.72) = 5.32
ZM = 3.60
ZL = (3.60 - 1.72) = 1.88

Slope at ZH = (0.00 + 0.44 * 5.32)X + ([.74 * 5.32] + .84) = 2.34X + 4.78
Slope at ZM = (0.00 + 0.44 * 3.60)X + ([.74 * 3.60] + .84) = 1.58X + 3.50
Slope at ZL = (0.00 + 0.44 * 1.88)X + ([.74 * 1.88] + .84) = 0.83X + 2.23

Plotting the Dummy Variable Interaction

1. The main predictor has only 2 values, 0 and 1.
2. Select values on the X axis displaying the main predictor, i.e., coffee at:
No caffeine = 0
Caffeine = 1
3. Plug these values into the ZH, ZM, ZL simple slope equations:

Simple Slope | Formula          | No Caff. (X = 0) | Caffeinated (X = 1)
ZH           | Y = 2.34X + 4.78 | 4.78             | 7.12
ZM           | Y = 1.58X + 3.50 | 3.50             | 5.08
ZL           | Y = .83X + 2.23  | 2.23             | 3.06

4. Plot values into graph.

Graph Displaying Simple Slopes

Centering Data

Centering (subtracting the mean from each score) puts predictors on a common, zero-centered scale.

Aiken and West recommend doing it in all cases:
* Makes a zero score meaningful
* Has other benefits

Aguinis recommends doing it in some cases:
* Sometimes uncentered scores are meaningful

Procedure: upset M = 2.94, SD = 1.19; esteem M = 3.94, SD = 0.75
COMPUTE upcntr = upset - 2.94.
COMPUTE estcntr = esteem - 3.94.
upcntr M = 0, SD = 1.19; estcntr M = 0, SD = 0.75

Centering may affect the slopes of the predictor and moderator, BUT it does not affect the interaction term.
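The final claim on this slide can be verified numerically: centering the predictor and the moderator changes b1, b2, and the intercept, but leaves the interaction coefficient b3 unchanged. A sketch on simulated data (coefficient values invented, not the class data):

```python
import numpy as np

def fit(x, z, y):
    """OLS fit of y = b0 + b1*x + b2*z + b3*x*z; returns (b0, b1, b2, b3)."""
    X = np.column_stack([np.ones_like(x), x, z, x * z])
    return np.linalg.lstsq(X, y, rcond=None)[0]

rng = np.random.default_rng(2)
n = 300
upset = rng.normal(2.94, 1.19, n)
esteem = rng.normal(3.94, 0.75, n)
cry = 6.5 - 0.5 * upset - 0.5 * esteem + 0.2 * upset * esteem + rng.normal(0, 1, n)

raw = fit(upset, esteem, cry)
ctr = fit(upset - upset.mean(), esteem - esteem.mean(), cry)

print(np.isclose(raw[3], ctr[3]))  # interaction coefficient unchanged → True
print(np.isclose(raw[1], ctr[1]))  # predictor slope changes → False
```

The invariance is exact because centering is a linear reparameterization: the centered model's b1 becomes b1 + b3·mean(Z), while the coefficient on the product term stays b3.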

Requirements and Assumptions (Continued)

Independent Errors: Residuals for Sub 1 must be unrelated to residuals for Sub 2. For example, Sub 2 sees Sub 1 screaming as Sub 1 leaves the experiment; Sub 1 might influence Sub 2. If each new subject is affected by the preceding subject, this influence will reduce the independence of errors, i.e., create autocorrelation. Autocorrelation is bias due to temporal adjacency.

Assess: Durbin-Watson test. Values range from 0 to 4; "2" is ideal. Closer to 0 means positive autocorrelation; closer to 4 means negative autocorrelation.

[Example: Subs 1-6 run in sequence, seeing funny, funny, sad, sad, funny, funny movies; adjacent-residual correlations r(s1,s2) +, r(s2,s3) +, r(s3,s4) -, r(s4,s5) -, r(s5,s6) +]
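The Durbin-Watson statistic is simple to compute from the residuals: DW = Σ(e_t − e_{t−1})² / Σe_t². A sketch on simulated residuals (not SPSS output), contrasting independent errors with strongly positively autocorrelated ones:

```python
import numpy as np

def durbin_watson(resid):
    """DW = sum of squared successive differences / sum of squared residuals.
    Ranges 0-4; ~2 = no autocorrelation, near 0 = positive, near 4 = negative."""
    resid = np.asarray(resid, dtype=float)
    return np.sum(np.diff(resid) ** 2) / np.sum(resid ** 2)

rng = np.random.default_rng(3)
indep = rng.normal(size=5000)             # independent residuals: DW near 2
walk = np.cumsum(rng.normal(size=5000))   # random walk (strong positive
                                          # autocorrelation): DW near 0
print(round(durbin_watson(indep), 1))
print(round(durbin_watson(walk), 2))
```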

Durbin-Watson Test of Autocorrelation

DATASET ACTIVATE DataSet1.
REGRESSION
/DESCRIPTIVES MEAN STDDEV CORR SIG N
/MISSING LISTWISE
/STATISTICS COEFF OUTS R ANOVA CHANGE
/CRITERIA=PIN(.05) POUT(.10)
/NOORIGIN
/DEPENDENT crytotl
/METHOD=ENTER age upset
/RESIDUALS DURBIN.

Multicollinearity

Multiple regression assumes that each new predictor is a unique measure. If two predictors, A and B, are very highly correlated, then a model testing the added effect of Predictors A and B might, in effect, be testing Predictor A twice. If so, the slopes of the two variables are not orthogonal (going in different directions) but instead run parallel to each other (i.e., they are collinear).

[Graph: orthogonal vs. non-orthogonal slopes]

Mac Collinearity: A Multicollinearity Saga

Suffering negative publicity regarding the health risks of fast food, the fast food industry hires the research firm of Fryes, Berger, and Shayque (FBS) to show that there is no intrinsic harm in fast food. FBS surveys a random sample and asks:
a. To what degree are you a meat eater? (carnivore)
b. How often do you purchase fast food? (fast.food)
c. What is your health status? (health)

FBS conducts a multiple regression, entering fast.food in step 1 and carnivore in step 2.

FBS Fast Food and Carnivore Analysis

"See! See!" the FBS researchers rejoiced. "Fast food negatively predicts health in Model 1, BUT the effect of fast food on health goes away in Model 2, when being a carnivore is considered."

Not So Fast, Fast Food Flacks

Collinearity diagnostics:
1. Correlation table
2. Collinearity statistics: VIF (should be < 10) and/or tolerance (should be more than .20)
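These diagnostics can be sketched by hand: tolerance for predictor j is 1 − R² from regressing predictor j on the other predictors, and VIF is its reciprocal. A simulated example (variables invented) in which a and c are near-duplicates, so both blow past the VIF cutoff while the unrelated predictor b stays near 1:

```python
import numpy as np

def vif(X):
    """Variance inflation factor for each column of X (predictors only,
    no intercept column). VIF_j = 1 / (1 - R^2_j), where R^2_j comes from
    regressing column j on all the other columns."""
    X = np.asarray(X, dtype=float)
    vifs = []
    for j in range(X.shape[1]):
        y = X[:, j]
        others = np.delete(X, j, axis=1)
        A = np.column_stack([np.ones(len(y)), others])
        yhat = A @ np.linalg.lstsq(A, y, rcond=None)[0]
        r2 = 1 - np.sum((y - yhat) ** 2) / np.sum((y - y.mean()) ** 2)
        vifs.append(1.0 / (1.0 - r2))
    return vifs

rng = np.random.default_rng(4)
a = rng.normal(size=500)
b = rng.normal(size=500)                 # unrelated to a: VIF near 1
c = a + rng.normal(scale=0.1, size=500)  # nearly duplicates a: huge VIF

print([round(v, 1) for v in vif(np.column_stack([a, b, c]))])
```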