Continuous Moderator Variables

Presentation transcript:

Continuous Moderator Variables in Multiple Regression Analysis

What is a Moderator? A moderator is a variable that alters the relationship between two or more other variables. If the relationship between X and Y varies across levels of M, then M is a moderator. “Moderation” is nothing more than what we called “interaction” in factorial ANOVA.
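In regression terms (a general sketch, not yet the data for this example), moderation is tested by adding the product of the focal predictor and the moderator to the model:

Yhat = b0 + b1X + b2M + b3(X × M)

The simple slope of X at any given value of M is b1 + b3M, so a nonzero b3 means the X–Y slope changes as M changes.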

Misanthropy, Idealism, and Attitudes About Animals These are the same data I used to illustrate an interaction between a dichotomous variable and a continuous variable, but here idealism will not be dichotomized. The criterion variable is the score on the first subscale of the Animal Attitudes scale, the Animal Rights subscale (12 Likert-type items, Cronbach alpha = .87).

Download Moderate.dat from my data files page and Moderate.sas from my SAS programs page. Point the program to the data file and run the program.

*MODERATE.SAS;
Data AR1;
  infile '<point to data file>\Moderate.dat';
  input AR Misanth Ideal;
Proc Means stddev min max nmiss; var AR Misanth Ideal; run;
Proc Standard mean=0 std=1 out=Zs;
Data Interaction; set Zs;
  Interact = Misanth*Ideal;
Proc Corr; var AR Misanth Ideal;
Proc Reg; Model AR = Misanth Ideal interact / tol;
run; QUIT;

Center the Variables? Centering means subtracting the mean from each score, for all predictor variables that are involved in the interaction(s). This is commonly done and is believed to prevent problems with multicollinearity, among other things (see Howell). You may center the outcome variable too, but that is not necessary.
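If you did want to center (but not rescale) the predictors, a minimal SAS sketch would look like this; AR1 is the data set read in earlier, and Centered is just an assumed name for the output data set.

* Center the predictors without changing their standard deviations;
Proc Standard data=AR1 mean=0 out=Centered;
  var Misanth Ideal;
run;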

Don’t Center the Variables As recently demonstrated by Andrew Hayes, it is NOT necessary to center the predictors. It may, however, be easier to interpret the results if they are centered.

Standardize the Variables Unstandardized regression coefficients are rarely useful to the psychologist. Just standardize all of the variables to z scores, which, of course, are centered: Proc Standard mean=0 std=1 out=Zs;

Create the Interaction Term(s) & Run the Regression
Data Interaction; set Zs;
  Interact = Misanth*Ideal;
Proc Corr; var AR Misanth Ideal;
Proc Reg; Model AR = Misanth Ideal interact / tol;
run; QUIT;

Regression Output R2 = .113, p < .001. The standardized regression equation is ZAR = .303 ZMisanth + .067 ZIdeal − .146 Zinteract − .02. The interaction is significant, p = .049. What does the regression look like for low (−1), medium (0), and high (+1) values of the moderator?

Do It Yourself Obtaining the simple slopes yourself is a bit of a pain in the arse. See http://core.ecu.edu/psyc/wuenschk/PP/Moderator.pptx It is much easier to get them by using Hayes’ Process software.
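One common do-it-yourself approach (a sketch assuming the standardized data set Zs from earlier; this is not Hayes’ code) is to re-center the moderator at the level of interest and refit the model, so that the coefficient for Misanth becomes the simple slope at that level:

* Simple slope of misanthropy at idealism one SD above the mean;
Data HighIdeal; set Zs;
  Ideal_Hi = Ideal - 1;                       * zero now corresponds to +1 SD on Ideal;
  Interact_Hi = Misanth*Ideal_Hi;
run;
Proc Reg data=HighIdeal;
  Model AR = Misanth Ideal_Hi Interact_Hi;    * the Misanth coefficient is the simple slope;
run; quit;

Repeating this with Ideal + 1 gives the slope at one SD below the mean.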

Process Hayes Hayes, A. F. (2013). Introduction to mediation, moderation, and conditional process analysis. New York, NY: Guilford. Highly recommended for those into mediation, moderation, and moderated mediation (aka conditional process analysis).

Process Hayes Hayes provides SAS and SPSS macros that make it much easier to conduct these analyses. Download the macros, data files, etc. The first step is to identify the model that matches the analysis you wish to do. Run Process.sas. If the data are not already in SAS, bring them in.

Process Hayes Here I shall use Version 2.16, which is installed on the computers in our labs. Version 3.0 is available; it covers some models not covered by 2.16 and has syntax that is a bit different from that of 2.16. You must run the Process program, within SAS or SPSS, prior to invoking the macro.
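In SAS, the setup would look something like the sketch below; the file locations are placeholders, and the data-set and variable names are the ones used earlier in this presentation.

* Make the %process macro available before calling it;
%include '<path to>\process.sas';

* Bring the data into SAS if they are not already there;
Data AR1;
  infile '<path to>\Moderate.dat';
  input AR Misanth Ideal;
run;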

Moderate_Process.sas
%process (data=Zs, vars=AR Misanth Ideal, y=ar, x=Misanth, m=Ideal, model=1, jn=1, plot=1);

Model Summary
      R     R-sq        F      df1       df2        p
 0.3362   0.1130   6.3721   3.0000  150.0000   0.0004

R-square increase due to interaction(s):
Model
              coeff       se        t        p     LLCI     ULCI
constant    -0.0202   0.0773  -0.2615   0.7940  -0.1730   0.1326
IDEAL        0.0672   0.0777   0.8643   0.3888  -0.0864   0.2207
MISANTH      0.3028   0.0777   3.8990   0.0001   0.1494   0.4563
INT_1       -0.1456   0.0733  -1.9861   0.0488  -0.2905  -0.0007

R-square increase due to interaction(s):
         R2-chng        F      df1       df2        p
INT_1     0.0233   3.9446   1.0000  150.0000   0.0488

The Simple Slopes
Conditional effect of X (Misanthropy) on Y at values of the moderator(s)
  IDEAL   Effect       se        t        p     LLCI     ULCI
-1.0000   0.4485   0.1074   4.1738   0.0001   0.2361   0.6608
-0.0000   0.3028   0.0777   3.8990   0.0001   0.1494   0.4563
 1.0000   0.1572   0.1062   1.4803   0.1409  -0.0526   0.3670
Values for “Effect” here are the standardized simple slopes at three levels of standardized Idealism.
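As a quick check (a sketch using the coefficients reported above), each conditional effect is just bMISANTH + bINT_1 × Ideal:

* Reproduce the simple slopes from the regression coefficients;
Data check;
  b_Misanth = 0.3028; b_Int = -0.1456;
  do Ideal = -1, 0, 1;
    slope = b_Misanth + b_Int*Ideal;   * .4484, .3028, .1572 -- matches the table within rounding;
    output;
  end;
run;
Proc Print data=check; run;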

Johnson-Neyman Technique
Moderator value(s) defining Johnson-Neyman significance region(s)
  Value   % below   % above
 0.7788   77.2727   22.7273
Misanthropy is significantly correlated with support for animal rights when the standardized value of idealism is .7788 or lower.

Predicted Values of zar
Data for visualizing conditional effect of X on Y
 MISANTH    IDEAL     yhat
 -1.0000  -1.0000  -0.5358
 -0.0000  -1.0000  -0.0874
  1.0000  -1.0000   0.3611
 -1.0000  -0.0000  -0.3230
 -0.0000  -0.0000  -0.0202
  1.0000  -0.0000   0.2826
 -1.0000   1.0000  -0.1102
 -0.0000   1.0000   0.0469
  1.0000   1.0000   0.2041

Data for Plot of Simple Slopes
data plot;
  input Misanthropy Idealism Animal_Rights;
  cards;
-1.0000 -1.0000 -0.5358
-0.0000 -1.0000 -0.0874
1.0000 -1.0000 0.3611
-1.0000 -0.0000 -0.3230
-0.0000 -0.0000 -0.0202
1.0000 -0.0000 0.2826
-1.0000 1.0000 -0.1102
-0.0000 1.0000 0.0469
1.0000 1.0000 0.2041
;
run;

Code for Plot of Simple Slopes
proc sgplot;
  reg x = misanthropy y = Animal_Rights / group = Idealism nomarkers;
  yaxis label='Standardized Support of Animal Rights';
  xaxis label='Standardized Misanthropy';
run;

Plot of Simple Slopes

Table of Conditional Effects
Conditional effect of X on Y at values of the moderator (M)
  IDEAL   Effect       se        t        p     LLCI     ULCI
-2.0275   0.5981   0.1686   3.5483   0.0005   0.2650   0.9312
-1.0140   0.4505   0.1082   4.1651   0.0001   0.2368   0.6642
 0.7597   0.1922   0.0950   2.0220   0.0450   0.0044   0.3800
 0.7788   0.1894   0.0959   1.9759   0.0500   0.0000   0.3788
 1.0131   0.1553   0.1068   1.4533   0.1482  -0.0558   0.3664
 2.0267   0.0076   0.1669   0.0458   0.9635  -0.3221   0.3374
I have trimmed this table a lot, so it would fit on this slide.

2 Leftmost & 2 Rightmost Cols
data plot_JN;
  input Idealism Effect llci ulci;
  cards;
-2.0275 0.5981 0.2650 0.9312
-1.0140 0.4505 0.2368 0.6642
0.7788 0.1894 0.0000 0.3788
1.0131 0.1553 -0.0558 0.3664
;
run;
I have trimmed out most of the rows here to make this fit on this slide.

Code for Johnson-Neyman Plot
proc sgplot;
  series x=Idealism y=ulci / curvelabel='95% Upper Limit' linesattr=(color=red pattern=ShortDash);
  series x=Idealism y=effect / curvelabel='Point Estimate' linesattr=(color=blue pattern=Solid);
  series x=Idealism y=llci / curvelabel='95% Lower Limit' linesattr=(color=red pattern=ShortDash);
  xaxis label='Idealism';
  yaxis label='Conditional effect of misanthropy';
  refline 0 / axis=y transparency=0.5;
  refline .7788 / axis=x transparency=0.5;
run;

Johnson-Neyman Plot

Version 3 of Process
%process (data=Zs, y=ar, x=Misanth, w=Ideal, model=1, jn=1, plot=1);
With Version 3 of Process the simple slopes are obtained at the 16th, 50th, and 84th percentiles of the moderator. If the moderator were perfectly normally distributed in the sample, one standard deviation below the mean would be the 15.87th percentile, the 50th percentile would be the mean, and one standard deviation above the mean would be the 84.13th percentile.
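If you wanted to see where those percentiles actually fall for the standardized moderator in these data, a small sketch (not part of Process; the output data-set name pcts is just an assumed name) would be:

* Find the 16th, 50th, and 84th percentiles of standardized Idealism;
Proc Univariate data=Zs noprint;
  var Ideal;
  output out=pcts pctlpts=16 50 84 pctlpre=P;
run;
Proc Print data=pcts; run;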

Main Effects vs Moderation Imagine a study with these variables: level of stress (state); dose, in mg, of a new drug (nopressor) designed to reduce blood pressure; and SBP, the patient's resting systolic blood pressure minus the systolic blood pressure considered normal/healthy for a person of the subject's age and sex.

I frequently find my students writing statements (hypotheses) like this: “Dose of nopressor will moderate the effect of stress on systolic blood pressure such that those with high stress will exhibit lower blood pressure when the dose of nopressor is high. That is, nopressor will mitigate the hypertension caused by high stress.” Then I have to explain that the presence of a mitigating effect does not establish moderation.
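To establish moderation you would have to show that the stress slope changes with dose, that is, that the stress-by-dose product term is significant. A minimal sketch, using the variable names from the example above (the data-set name bp is assumed):

* Moderation is tested by the stress-by-dose product term,
  not by a main effect of dose that lowers SBP at every level of stress;
Data bp2; set bp;
  StressXDose = state*nopressor;
run;
Proc Reg data=bp2;
  Model sbp = state nopressor StressXDose;
run; quit;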

Mitigation With No Moderation

Mitigation With Moderation