Plotting Linear Main Effects Models: Interpreting 1st-order terms w/ & w/o interactions; Coding & centering… gotta? oughta?; Plotting single-predictor models.

Presentation transcript:

Plotting Linear Main Effects Models
Interpreting 1st-order terms w/ & w/o interactions
Coding & centering… gotta? oughta?
Plotting single-predictor models – Q, 2 & k
Plotting 2-predictor models – 2xQ, kxQ & QxQ

Coding & Transforming predictors for MR models
Categorical predictors will be converted to dummy codes. Quantitative predictors will be centered, usually at the mean.
Is this absolutely necessary? Not usually… Many sources, and nearly all older ones, used unit-coded & un-centered predictors and their multiplicative combinations. Much of the time it works just fine…
So, why is dummy-coding and centering a good idea?
Mathematically: 0s (as control group & mean) simplify the math & minimize collinearity complications.
Interpretively: the “controlling for” included in multiple regression weight interpretations is really “controlling for all other variables in the model at the value 0,” so having “0” be the comparison group & the mean makes b interpretations simpler and more meaningful.
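The two transformations described above take only a few lines of code. A minimal sketch with made-up data (the variable names and values are hypothetical, not from the slides):

```python
# Sketch: centering a quantitative predictor and dummy-coding a binary
# group variable before building a regression model.
age = [35.0, 42.0, 49.0, 38.0, 46.0]          # hypothetical quant predictor
mean_age = sum(age) / len(age)                # 42.0
age_cen = [x - mean_age for x in age]         # centered: mean is now 0

group = ["Cx", "Tx", "Tx", "Cx", "Tx"]        # control vs. treatment
z = [0 if g == "Cx" else 1 for g in group]    # dummy code: Cx = 0, Tx = 1
```

With these codes, "controlling at the value 0" means controlling at the comparison group and at the mean, which is what makes the b weights easy to read.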

Very important things to remember…
All 1st-order predictor regression weights have the same interpretation: the expected direction and extent of change in Y for a 1-unit increase in X, after controlling for the other variable(s) in the model at the value 0.
If that 1st-order predictor is not involved in a higher-order effect (interaction), the regression weight is interpreted as a main or unconditional effect: not being part of an interaction, the effect of that variable is the same for all values of all other variables.
If that 1st-order predictor is involved in a higher-order effect, the regression weight is interpreted as a conditional effect when the other variable(s) involved in the interaction = 0: being part of an interaction, the effect is different for different values of those other variable(s), and the interaction weight describes the direction and extent of those differences.

Plotting Single-Predictor Models
Let’s start by getting, interpreting and plotting a model with each of the different kinds of predictor we will be working with… Be sure to “get” how each regression weight represents a particular slope, height or height difference in the plot!!!

Models with a single quantitative predictor: y’ = bX + a
a: the regression constant; the expected value of y if x = 0; the height of the predictor-criterion (y-x) regression line.
b: the regression weight; the expected direction and extent of change in y for a 1-unit increase in x; the slope of the y-x regression line.

Graphing & Interpreting Models with a single quantitative predictor: y’ = bX + a
[Plot: the y-x regression line; a = height of line, b = slope of line.]

Models with a single centered quantitative predictor: y’ = bXcen + a, where Xcen = X – Xmean
a: the regression constant; the expected value of y when x = 0 (re-centered to mean = 0); the height of the y-x regression line.
b: the regression weight; the expected direction and extent of change in y for a 1-unit increase in x; the slope of the y-x regression line.
X will be Xcen in all of the following models.

Models with a single centered quantitative predictor: y’ = bXcen + a, Xcen = X – Xmean
So, how do we plot this formula? Simple: pick 2 values of xcen, substitute them into the formula to get y’ values, and plot the line defined by those two x-y points.
What x values? It doesn’t matter, so keep it simple: 0 & 1, 0 & 10, or 0 & 100, depending on the x-scale. Using +/- 2 std isn’t as simple, but tells you what x & y ranges are needed on the plot.
A couple of things to remember: “0” is the center of the x-axis (X has been centered!!!), and the x-axis should extend about +/- 2 std (to include roughly 95% of the population).

A worked example with a single centered quantitative predictor: y’ = 1.6Xcen + 30
For xcen = 0: y’ = 1.6*0 + 30 = 30
For xcen = 10: y’ = 1.6*10 + 30 = 46
If X has mean = 42 & std = 7.5:
For +2 std (xcen = 15): y’ = 1.6*15 + 30 = 54
For -2 std (xcen = -15): y’ = 1.6*(-15) + 30 = 6
Any set of x values will lead to the same plotted line! We can even substitute the original x scale back into the graph!!
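The worked example above is easy to check in code. A small sketch assuming the slide’s numbers (b = 1.6, a = 30, SD of X = 7.5, which are the values consistent with the predicted scores shown):

```python
# The slide's worked example: y' = 1.6 * x_cen + 30, where X has
# mean 42 and SD 7.5. Any pair of x values defines the same line.
def predict(x_cen, b=1.6, a=30.0):
    """Predicted y for a centered x under y' = b*x_cen + a."""
    return b * x_cen + a

predict(0)         # 30.0 (height of the line at the mean of X)
predict(10)        # 46.0
predict(2 * 7.5)   # 54.0 (+2 std)
predict(-2 * 7.5)  # 6.0  (-2 std)
```

Picking 0 & 10 or +/- 2 std gives different points but the same plotted line, which is the slide’s point.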

a +b  X cen a = ht of line b = slp of line Graphing & Interpreting Models with a single centered quantitative predictor X cen = X – X mean y’ = bX cen + a

b =  X cen a = ht of line b = slp of line a Graphing & Interpreting Models with a single centered quantitative predictor X cen = X – X mean y’ = bX cen + a

Graphing & Interpreting Models with a single centered quantitative predictor: y’ = -bXcen + a, Xcen = X – Xmean
[Plot: a line with height a and negative slope -b; a = ht of line, b = slp of line.]

Plotting & Interpreting Models with a single binary predictor coded 1-2: y’ = bZ + a, where Z = Tx1 vs. Cx (Cx = 1, Tx1 = 2)

Models with a single dummy-coded binary predictor: y’ = bZ + a
a: the regression constant; the expected value of y if z = 0 (the control group); the mean of the control group; the height of the control group.
b: the regression weight; the expected direction and extent of change in y for a 1-unit increase in z; the direction and extent of the y mean difference between the groups coded 0 & 1; the group height difference.
The way we’re going to graph this model looks strange, but will provide simplicity & clarity for more complex models.
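A sketch of the interpretation above with made-up group scores: when Z is dummy coded 0/1, the constant is the control-group mean and the weight is the group mean difference.

```python
# Sketch: dummy-coded binary predictor (Cx = 0, Tx = 1).
# Scores are made up for illustration.
cx_scores = [10.0, 12.0, 14.0]   # control group, z = 0
tx_scores = [18.0, 20.0, 22.0]   # target group,  z = 1

cx_mean = sum(cx_scores) / len(cx_scores)   # 12.0
tx_mean = sum(tx_scores) / len(tx_scores)   # 20.0

a = cx_mean              # constant = control-group mean (height of Cx)
b = tx_mean - cx_mean    # weight = group mean difference (height dif)

# The model y' = b*z + a reproduces both group means:
y_cx = b * 0 + a   # 12.0
y_tx = b * 1 + a   # 20.0
```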

Plotting & Interpreting Models with a single dummy-coded binary predictor: y’ = bZ + a; Z = Tx(1) vs. Cx(0)
[Plot: Cx point at height a, Tx point at height a + b; a = ht Cx, b = htdif Cx & Tx.]

Plotting & Interpreting Models with a single dummy-coded binary predictor: y’ = bZ + a; Z = Tx(1) vs. Cx(0)
[Plot: with b = 0, the Cx & Tx points sit at the same height a; a = ht Cx, b = htdif Cx & Tx.]

Plotting & Interpreting Models with a single dummy-coded binary predictor: y’ = -bZ + a; Z = Tx(1) vs. Cx(0)
[Plot: Cx point at height a, Tx point at height a - b; a = ht Cx, b = htdif Cx & Tx.]

Models with a single k-category predictor coded 1-3 (Control, Tx1, Tx2)
We can’t put the 1-3 coded variable into a regression as y’ = bX + a: it’s not a quantitative/interval variable.

Models with a dummy-coded k-category predictor (2 dummy codes): y’ = b1Z1 + b2Z2 + a
Group  Z1  Z2
Tx1     1    0
Tx2     0    1
Cx*     0    0
a: the regression constant; the expected value of y if Z1 & Z2 = 0 (the control group); the mean of the control group; the height of the control group.
b1: the regression weight for target group 1 vs. the comparison group; the direction and extent of the y mean difference between the comparison group and the group coded 1 on Z1; the group height difference between the comparison & target groups (3 & 1).
b2: the regression weight for target group 2 vs. the comparison group; the direction and extent of the y mean difference between the comparison group and the group coded 1 on Z2; the group height difference between the comparison & target groups (3 & 2).
Again, the way we’re going to graph this model looks strange, but will provide simplicity & clarity for more complex models.
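A sketch of these interpretations with made-up group means; the dummy codes follow the usual pattern, with the comparison group coded 0 on both.

```python
# Sketch: two dummy codes for a 3-group predictor; Cx is the comparison
# group coded (0, 0). Group means are made up for illustration.
codes = {"Cx": (0, 0), "Tx1": (1, 0), "Tx2": (0, 1)}
means = {"Cx": 12.0, "Tx1": 20.0, "Tx2": 9.0}

a  = means["Cx"]                 # constant = Cx mean (height of Cx)
b1 = means["Tx1"] - means["Cx"]  # htdif Cx & Tx1 =  8.0
b2 = means["Tx2"] - means["Cx"]  # htdif Cx & Tx2 = -3.0

def predict_group(group):
    """y' = b1*Z1 + b2*Z2 + a reproduces each group's mean."""
    z1, z2 = codes[group]
    return b1 * z1 + b2 * z2 + a
```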

Plotting & Interpreting Models with a dummy-coded k-category predictor (2 dummy codes): y’ = b1Z1 + b2Z2 + a; Z1 = Tx1 vs. Cx(0), Z2 = Tx2 vs. Cx(0)
[Plot: Cx point at height a, Tx1 at a + b1, Tx2 at a + b2; a = ht Cx, b1 = htdif Cx & Tx1, b2 = htdif Cx & Tx2.]

Plotting & Interpreting Models with a dummy-coded k-category predictor (2 dummy codes): y’ = b1Z1 + b2Z2 + a; Z1 = Tx1 vs. Cx(0), Z2 = Tx2 vs. Cx(0)
[Plot: with b1 = 0 & b2 = 0, all three points sit at height a. The dots should overlap – but that makes it hard to see… a = ht Cx, b1 = htdif Cx & Tx1, b2 = htdif Cx & Tx2.]

Plotting & Interpreting Models with a dummy-coded k-category predictor (2 dummy codes): y’ = -b1Z1 + b2Z2 + a; Z1 = Tx1 vs. Cx(0), Z2 = Tx2 vs. Cx(0)
[Plot: with b2 = 0, Cx & Tx2 sit at height a (the top 2 dots should overlap – but that makes it hard to see…) and Tx1 at a - b1; a = ht Cx, b1 = htdif Cx & Tx1, b2 = htdif Cx & Tx2.]

Plotting 2-Predictor Linear Main Effects Models
Now we get to the fun part: plotting multiple regression equations involving multiple variables. Three kinds (for now, more later)…
a centered quant variable & a dummy-coded binary variable
a centered quant variable & a dummy-coded k-category variable
2 centered quant variables
What we’re trying to do is plot these models so that we can see how both of the predictors are related to the criterion. As when we’re plotting data from a factorial design, we have to represent 3 variables (the criterion & the 2 predictors, X & Z) in a 2-dimensional plot. We’ll use the same solution: we’ll plot the relationship between one predictor and the criterion for different values of the other predictor.

a  regression constant expected value of y if X=0 (mean) and Z=0 (comparison group) mean of the control group height of control group quant-criterion regression line b 2  regression weight for dummy coded binary predictor – main effect of Z expected direction and extent of change in y for a 1-unit increase in x, after controlling for the other variable(s) in the model direction and extent of y mean difference between groups coded 0 & 1, after controlling for the other variable(s) in the model group mean/reg line height difference (when X = 0, the centered mean) Models with a centered quantitative predictor & a dummy coded binary predictor y’ = b 1 X + b 2 Z + a This is called a main effects model  there are no interaction terms. b 1  regression weight for centered quant predictor – main effect of X expected direction and extent of change in y for a 1-unit increase in x, after controlling for the other variable(s) in the model expected direction and extent of change in y for a 1-unit increase in x, for the comparison group (coded 0) slope of quant-criterion regression line for the group coded 0 (comp)

Model  y’ = b 1 X + b 2 Z + a y’ = b 1 X + b 2 *0 + a y’ = b 1 X + a y’ = b 1 X + b 2 *1 + a y’ = b 1 X + ( b 2 + a) To plot the model we need to get separate regression formulas for each Z group. We start with the multiple regression model… For the Comparison Group coded Z = 0 Substitute the 0 in for Z Simplify the formula For the Target Group coded Z = 1 Substitute the 1 in for Z Simplify the formula slope height slope height

Plotting & Interpreting Models with a centered quantitative predictor & a dummy-coded binary predictor: y’ = b1X + b2Z + a; Xcen = X – Xmean, Z = Tx(1) vs. Cx(0)
[Plot: two parallel lines; the Cx line has height a (the Cx mean) and slope b1, and the Tx line is shifted up by b2 (the Cx & Tx mean dif). Cx slp = Tx slp: no interaction.]
This is called a main effects model: no interaction, so the regression lines are parallel.

Plotting & Interpreting Models with a centered quantitative predictor & a dummy-coded binary predictor: y’ = -b1X + -b2Z + a; Xcen = X – Xmean, Z = Tx(1) vs. Cx(0)
[Plot: with b2 = 0, the Cx & Tx lines overlap, both with height a and negative slope -b1. Cx slp = Tx slp: no interaction.]
This is called a main effects model: no interaction, so the regression lines are parallel.

Plotting & Interpreting Models with a centered quantitative predictor & a dummy-coded binary predictor: y’ = b1X + b2Z + a; Xcen = X – Xmean, Z = Tx(1) vs. Cx(0)
[Plot: with b1 = 0, two flat parallel lines; Cx at height a, Tx at height a + b2.]
This is called a main effects model: no interaction, so the regression lines are parallel.

a  regression constant expected value of y if Z 1 & Z 2 = 0 (the control group) mean of the control group height of control group quant-criterion regression line b 2  regression weight for dummy coded comparison of G1 vs G3 – main effect expected direction and extent of change in y for a 1-unit increase in x direction and extent of y mean difference between groups 1 & 3 group height difference between comparison & target groups (3 & 1) (X = 0) Models with a centered quantitative predictor & a dummy coded k-category predictor y’ = b 1 X + b 2 Z 1 + b 3 Z 2 + a b 1  regression weight for centered quant predictor – main effect of X expected direction and extent of change in y for a 1-unit increase in x slope of quant-criterion regression line (for both groups) b 3  regression weight for dummy coded comparison of G2 vs. G3 – main effect expected direction and extent of change in y for a 1-unit increase in x direction and extent of y mean difference between groups coded 0 & 1 group height difference between comparison & target groups (3 & 2) (X = 0) Group Z 1 Z * 0 0 This is called a main effects model  there are no interaction terms.

To plot the model we need to get separate regression formulas for each Z group. We start with the multiple regression model:
Model: y’ = b1X + b2Z1 + b3Z2 + a
Group  Z1  Z2
Tx1     1    0
Tx2     0    1
Cx*     0    0
For the Comparison Group, coded Z1 = 0 & Z2 = 0: y’ = b1X + b2*0 + b3*0 + a, which simplifies to y’ = b1X + a (slope b1, height a).
For the Target Group coded Z1 = 1 & Z2 = 0: y’ = b1X + b2*1 + b3*0 + a, which simplifies to y’ = b1X + (b2 + a) (slope b1, height b2 + a).
For the Target Group coded Z1 = 0 & Z2 = 1: y’ = b1X + b2*0 + b3*1 + a, which simplifies to y’ = b1X + (b3 + a) (slope b1, height b3 + a).
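The same substitution in code, with made-up coefficients: each group gets a line with the common slope b1 and its own height.

```python
# Sketch: per-group lines from y' = b1*X + b2*Z1 + b3*Z2 + a.
# Coefficients are made up for illustration.
b1, b2, b3, a = 1.5, 6.0, -3.0, 20.0

codes = {"Cx": (0, 0), "Tx1": (1, 0), "Tx2": (0, 1)}

def line_for(group):
    """(slope, height) of the Y-X line for a group: parallel lines."""
    z1, z2 = codes[group]
    return b1, b2 * z1 + b3 * z2 + a

line_for("Cx")    # (1.5, 20.0)
line_for("Tx1")   # (1.5, 26.0)  height a + b2
line_for("Tx2")   # (1.5, 17.0)  height a + b3
```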

Plotting & Interpreting Models with a centered quantitative predictor & a dummy-coded k-category predictor: y’ = b1X + -b2Z1 + b3Z2 + a; Xcen = X – Xmean; Z1 = Tx1 vs. Cx(0), Z2 = Tx2 vs. Cx(0)
[Plot: three parallel lines with slope b1; Cx at height a (the Cx mean), Tx1 shifted down by b2 (the Cx & Tx1 mean dif), Tx2 shifted up by b3 (the Cx & Tx2 mean dif). Cx slp = Tx1 slp = Tx2 slp: no interaction.]
This is called a main effects model: no interaction, so the regression lines are parallel.

Plotting & Interpreting Models with a centered quantitative predictor & a dummy-coded k-category predictor: y’ = b1X + b2Z1 + b3Z2 + a; Xcen = X – Xmean; Z1 = Tx1 vs. Cx(0), Z2 = Tx2 vs. Cx(0)
[Plot: with b1 = 0 & b2 = 0, flat parallel lines; Cx & Tx1 at height a, Tx2 at height a + b3.]
This is called a main effects model: no interaction, so the regression lines are parallel.

Plotting & Interpreting Models with a centered quantitative predictor & a dummy-coded k-category predictor: y’ = -b1X + b2Z1 + b3Z2 + a; Xcen = X – Xmean; Z1 = Tx1 vs. Cx(0), Z2 = Tx2 vs. Cx(0)
[Plot: with b2 = 0, three parallel lines with negative slope -b1; Cx & Tx1 at height a, Tx2 at height a + b3.]
This is called a main effects model: no interaction, so the regression lines are parallel.

Weights for main effects models w/ a centered quantitative predictor & a dummy-coded binary or k-category predictor
Constant a: the expected value of y when the value of all predictors = 0; the height of the Y-X regression line for the comparison group.
b for a centered quantitative variable (main effect): the direction and extent of the expected change in y for a 1-unit increase in that predictor, holding the value of all other predictors constant at 0; the slope of the Y-X regression line for the comparison group.
b for a dummy-coded binary variable (main effect): the direction and extent of the expected mean difference of the Target group from the Comparison group, holding the value of all other predictors constant at 0; the difference in height of the Y-X regression lines for the comparison & target groups, which have the same slope (no interaction).
b for a dummy-coded k-group variable (main effect): the direction and extent of the expected mean difference of the Target group for that dummy code from the Comparison group, holding the value of all other predictors constant at 0; the difference in height of the Y-X regression lines for the comparison & target groups, which have the same slope (no interaction).

Models with 2 centered quantitative predictors: y’ = b1X1 + b2X2 + a
This is called a main effects model: there are no interaction terms.
Same idea: we want to plot these models so that we can see how both of the predictors are related to the criterion.
Different approach: when the second predictor was binary or had k categories, we plotted the Y-X regression line for each Z group. Now, however, we don’t have any groups; both X1 & X2 are centered quantitative variables. What we’ll do is plot the Y-X1 regression line for different values of X2.
We’ll plot 3 lines; the most common approach is to plot the Y-X1 regression line at the mean of X2, at +1 std above the mean of X2, and at -1 std below the mean of X2.

a  regression constant expected value of y if X=0 (mean) Z=0 (mean) mean criterion score of those having Z=0 (mean) height of quant-criterion regression line for those with Z=0 (mean) Models with 2 centered quantitative predictors y’ = b 1 X 1 + b 2 X 2 + a This is called a main effects model  there are no interaction terms. b 1  regression weight for X 1 centered quant predictor expected direction and extent of change in y for a 1-unit increase in X 1, after controlling for the other variable(s) in the model expected direction and extent of change in y for a 1-unit increase in X 2, for those with X 2 =0 (mean) slope of quant-criterion regression line when X 2 =0 (mean) b 2  regression weight for X 2 centered quant predictor expected direction and extent of change in y for a 1-unit increase in X 2, after controlling for the other variable(s) in the model expected direction and extent of change in y for a 1-unit increase in X 2, for those with X 1 =0 (mean) expected direction and extent of change of the height of Y-X regression line for a 1-unit change in X 2

To plot the model we need to get separate regression formulas for each chosen value of X2. Start with the multiple regression model:
Model: y’ = b1X1 + b2X2 + a
For X2 = 0 (the mean of centered X2): substitute the 0 in for X2 and simplify: y’ = b1X1 + a (slope b1, height a).
For X2 = +1 std: substitute the std value in for X2 and simplify: y’ = b1X1 + (b2*std + a) (slope b1, height b2*std + a).
For X2 = -1 std: substitute the -std value in for X2 and simplify: y’ = b1X1 + (-b2*std + a) (slope b1, height -b2*std + a).
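A sketch of these substitutions, with made-up coefficients and a hypothetical SD for X2: fixing X2 at its mean and at +/- 1 std gives three parallel Y-X1 lines whose heights differ by b2*std.

```python
# Sketch: per-line formulas from y' = b1*X1 + b2*X2 + a when X2 is
# fixed at 0 (its mean), +1 SD, and -1 SD.
# Coefficients and the SD of X2 are made up for illustration.
b1, b2, a = 1.5, 2.0, 20.0
sd_x2 = 5.0

def line_at(x2):
    """(slope, height) of the Y-X1 line for a fixed X2 value."""
    return b1, b2 * x2 + a

line_at(0)        # (1.5, 20.0)  X2 at its mean
line_at(+sd_x2)   # (1.5, 30.0)  +1 SD: shifted up by b2*sd
line_at(-sd_x2)   # (1.5, 10.0)  -1 SD: shifted down by b2*sd
```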

Plotting & Interpreting Models with 2 centered quantitative predictors: y’ = b1X1cen + b2X2cen + a; X1cen = X1 – X1mean, X2cen = X2 – X2mean
[Plot: three parallel lines over X1cen; the X2 = 0 (mean) line with height a and slope b1, the +1 std X2 line shifted up by b2*std, and the -1 std X2 line shifted down by b2*std. a = ht of the X2mean line, b1 = slp of the X2mean line, b2 = htdifs among the X2 lines. Mean slp = +1 std slp = -1 std slp: no interaction.]
This is called a main effects model: no interaction, so the regression lines are parallel.

Plotting & Interpreting Models with 2 centered quantitative predictors: y’ = b1X1cen + b2X2cen + a; X1cen = X1 – X1mean, X2cen = X2 – X2mean
[Plot: with b1 = 0, three flat parallel lines; the +1 std X2 line at height a + b2*std, the X2 = 0 (mean) line at height a, and the -1 std X2 line at height a - b2*std.]
This is called a main effects model: no interaction, so the regression lines are parallel.

Plotting & Interpreting Models with 2 centered quantitative predictors: y’ = b1X1cen + b2X2cen + a (here with negative b1 & b2); X1cen = X1 – X1mean, X2cen = X2 – X2mean
[Plot: three parallel lines with negative slope; with b2 negative, the +1 std X2 line sits below the X2 = 0 (mean) line and the -1 std X2 line above it. No interaction: the slopes are equal.]
This is called a main effects model: no interaction, so the regression lines are parallel.