
Slide 1: Psych 5510/6510, Chapter 10. Interactions and Polynomial Regression: Models with Products of Continuous Predictors. Spring, 2009.

Slide 2: Broadening the Scope
So far we have been limiting our models by ignoring the possibility that the predictor variables might interact, and by using only straight lines for our regression (i.e. ‘linear’ regression). This chapter provides an approach that allows us to add both the interaction of variables and nonlinear regression to our models.

Slide 3: Our ‘Running’ Example
Throughout this chapter we will be working with the following example: Y is the time (in minutes) taken to run a 5-kilometer race; X₁ is the age of the runner; X₂ is how many miles per week the runner ran while training for the race.

Slide 4: ‘On Your Mark’
We will begin by taking another perspective on what we have been doing so far in the text, and then use that perspective to understand interactions and nonlinear regression.

Slide 5: Time and Age
The analysis of the data leads to the following ‘simple’ relationship between Time (Y) and Age (X₁):
Model C: Ŷᵢ = β₀
Model A: Ŷᵢ = β₀ + β₁X₁ᵢ
Fitted: Ŷᵢ = 15.104 + .213X₁ᵢ, PRE = .218, F* = 21.7, p < .01

Slide 6: Simple Relationship between Time and Age (scatterplot figure)

Slide 7: Time and Miles
The simple relationship between Time (Y) and Miles of Training (X₂):
Model C: Ŷᵢ = β₀
Model A: Ŷᵢ = β₀ + β₂X₂ᵢ
Fitted: Ŷᵢ = 31.91 - .280X₂ᵢ, PRE = .535, F* = 89.6, p < .01

Slide 8: Simple Relationship between Race Time and Miles of Training (scatterplot figure)

Slide 9: Both Predictors
Now regress Y on both Age (X₁) and Miles of Training (X₂):
Model C: Ŷᵢ = β₀
Model A: Ŷᵢ = β₀ + β₁X₁ᵢ + β₂X₂ᵢ
Fitted: Ŷᵢ = 24.716 + 1.65X₁ᵢ - .258X₂ᵢ, PRE = .662, F* = 75.55, p < .01

Slide 10: ‘Get Set’
Now we will develop another way to think about multiple regression, one that re-expresses multiple regression in the form of a simple regression. We will start with Age (X₁). The simple regression of Y on X₁ has this form: Ŷᵢ = (intercept) + (slope)X₁ᵢ

Slide 11: The multiple regression model is: Ŷᵢ = 24.716 + 1.65X₁ᵢ - .258X₂ᵢ
We can make the multiple regression model fit the simple regression form:
Ŷᵢ = (intercept) + (slope)X₁ᵢ
Ŷᵢ = (24.716 - .258X₂ᵢ) + (1.65)X₁ᵢ
When X₂ = 10: Ŷᵢ = (22.136) + (1.65)X₁ᵢ
When X₂ = 30: Ŷᵢ = (16.976) + (1.65)X₁ᵢ
From this it is clear that the value of X₂ can be thought of as changing the intercept of the simple regression of Y on X₁, without changing its slope.
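To see this re-expression concretely, here is a minimal Python sketch using only the fitted coefficients reported on the slide (the raw data are not reproduced in this transcript):

```python
# Fitted additive model from the slide: Y-hat = 24.716 + 1.65*X1 - .258*X2
b0, b1, b2 = 24.716, 1.65, -0.258

def simple_in_age(x2):
    """Re-express Y-hat = b0 + b1*X1 + b2*X2 as intercept + slope*X1
    for a fixed level of training miles X2."""
    return b0 + b2 * x2, b1   # X2 moves the intercept; the slope stays b1

for x2 in (10, 30):
    intercept, slope = simple_in_age(x2)
    print(f"X2 = {x2}:  Y-hat = {intercept:.3f} + {slope}*X1")
# X2 = 10:  Y-hat = 22.136 + 1.65*X1
# X2 = 30:  Y-hat = 16.976 + 1.65*X1
```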

Slide 12: The simple relationship of Time (Y) and Age (X₁) at various levels of Training Miles (X₂) (figure)

Slide 13: Of course we can also work in the other direction, rearranging the multiple regression formula to examine the simple regression of Time (Y) on Miles of Training (X₂).

Slide 14: The multiple regression model is: Ŷᵢ = 24.716 + 1.65X₁ᵢ - .258X₂ᵢ
We can make the multiple regression model fit the simple regression form:
Ŷᵢ = (intercept) + (slope)X₂ᵢ
Ŷᵢ = (24.716 + 1.65X₁ᵢ) + (-.258)X₂ᵢ
When X₁ = 20: Ŷᵢ = (57.716) + (-.258)X₂ᵢ
When X₁ = 60: Ŷᵢ = (123.72) + (-.258)X₂ᵢ
From this it is clear that the value of X₁ can be thought of as changing the intercept of the simple regression of Y on X₂, without changing its slope.

Slide 15: The simple relationship of Time (Y) and Training Miles (X₂) at various levels of Age (X₁) (figure)

Slide 16: Additive Model
When we look at these simplified models it is clear that the effect of one variable gets added to the effect of the other, moving the line up or down the Y axis but not changing the slope. This is known as the ‘additive model’.

Slide 17: Interactions Between Predictor Variables
Let’s take a look at a non-additive model. In this case, we raise the possibility that the relationship between age (X₁) and time (Y) may differ across levels of the other predictor variable, miles of training (X₂). To say that the relationship between X₁ and Y may differ across levels of X₂ is to say that the slope of the regression line of Y on X₁ may differ across levels of X₂.

Slide 18: Non-Additive Relationship Between X₁ and X₂ (figure)
The slope of the relationship between age and time is smaller for runners who trained a lot than for those who trained less.

Slide 19: Interaction = Non-Additive
Predictor variables interact when the value of one variable influences the relationship (i.e. slope) between the other predictor variable and Y.

Slide 20: Interaction & Redundancy
Whether or not there is an interaction between two variables in predicting a third is completely independent of whether or not the two predictor variables are redundant with each other. Expunge from your mind any connection between these two issues (if it was there in the first place).

Slide 21: Adding Interaction to the Model
To add an interaction between variables to the model, simply add a new variable that is the product of the other two (i.e. create a new variable whose values are the score on X₁ times the score on X₂), then do a linear regression on that new model:
Ŷᵢ = β₀ + β₁X₁ᵢ + β₂X₂ᵢ + β₃(X₁ᵢX₂ᵢ)
Fitted: Ŷᵢ = 19.20 + .302X₁ᵢ - .076X₂ᵢ - .005(X₁ᵢX₂ᵢ)
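The slides carry out the analysis in SPSS; as an illustrative sketch of the same step in Python, assuming the data sit in a hypothetical runners.csv with columns time, age, and miles:

```python
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("runners.csv")   # hypothetical file: time, age, miles

# Either build the product column by hand...
df["age_x_miles"] = df["age"] * df["miles"]

# ...or let the formula interface do it: 'age:miles' is the product term.
interactive = smf.ols("time ~ age + miles + age:miles", data=df).fit()
print(interactive.params)   # b0, b1, b2, b3
```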

Slide 22: Testing Significance of the Interaction
Test significance as you always do, using the model comparison approach. First, to test the overall model that includes the interaction term:
Model C: Ŷᵢ = β₀
Model A: Ŷᵢ = β₀ + β₁X₁ᵢ + β₂X₂ᵢ + β₃(X₁ᵢX₂ᵢ)
H₀: β₁ = β₂ = β₃ = 0
Hₐ: at least one of those betas is not zero.

Slide 23: Testing Significance of the Interaction (cont.)
Second, to test whether adding the interaction term is worthwhile compared to a purely additive model:
Model C: Ŷᵢ = β₀ + β₁X₁ᵢ + β₂X₂ᵢ
Model A: Ŷᵢ = β₀ + β₁X₁ᵢ + β₂X₂ᵢ + β₃(X₁ᵢX₂ᵢ)
H₀: β₃ = 0
Hₐ: β₃ ≠ 0
The test of the partial regression coefficient gives you: PRE = .055, PC = 3, PA = 4, F* = 4.4, p = .039
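Continuing the hypothetical Python sketch, this model comparison is one call (anova_lm tests the nested Model C against Model A):

```python
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

df = pd.read_csv("runners.csv")   # hypothetical file, as above

additive    = smf.ols("time ~ age + miles", data=df).fit()              # Model C
interactive = smf.ols("time ~ age + miles + age:miles", data=df).fit()  # Model A

# F test of the one extra parameter (PA - PC = 1), i.e. H0: beta3 = 0
print(sm.stats.anova_lm(additive, interactive))
```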

Slide 24: Understanding the Interaction of Predictor Variables
To develop an understanding of the interaction of predictor variables, we will once again take the full model:
Ŷᵢ = β₀ + β₁X₁ᵢ + β₂X₂ᵢ + β₃(X₁ᵢX₂ᵢ)
and translate it into the form of the simple relationship of one predictor variable (X₁) and Y:
Ŷᵢ = (intercept) + (slope)X₁ᵢ

Slide 25: ‘Go’
Full model:
Ŷᵢ = β₀ + β₁X₁ᵢ + β₂X₂ᵢ + β₃(X₁ᵢX₂ᵢ)
= β₀ + β₂X₂ᵢ + β₁X₁ᵢ + β₃(X₁ᵢX₂ᵢ)
= β₀ + β₂X₂ᵢ + (β₁ + β₃X₂ᵢ)X₁ᵢ
Simple relationship of Y (time) and X₁ (age):
Ŷᵢ = (intercept) + (slope)X₁ᵢ
Ŷᵢ = (β₀ + β₂X₂ᵢ) + (β₁ + β₃X₂ᵢ)X₁ᵢ

Slide 26: Simple Relationship of Y (Time) and X₁ (Age)
Ŷᵢ = (intercept) + (slope)X₁ᵢ
Ŷᵢ = (β₀ + β₂X₂ᵢ) + (β₁ + β₃X₂ᵢ)X₁ᵢ
In examining the relationship between X₁ and Y, it is clear that the value of X₂ influences both the intercept and the slope of that relationship.

Slide 27: Simple Relationship of Time and Age (cont.)
Ŷᵢ = (intercept) + (slope)X₁ᵢ
Ŷᵢ = (β₀ + β₂X₂ᵢ) + (β₁ + β₃X₂ᵢ)X₁ᵢ
b₀ = 19.20, b₁ = .302, b₂ = -.076, b₃ = -.005
Ŷᵢ = (19.20 - .076X₂ᵢ) + (.302 - .005X₂ᵢ)X₁ᵢ
When X₂ (i.e. miles) = 10: Ŷᵢ = 18.44 + .252X₁ᵢ
When X₂ (i.e. miles) = 50: Ŷᵢ = 15.4 + .052X₁ᵢ

Slide 28: Interactive Model (figure)

Slide 29: Simple Relationship of Y (Time) and X₂ (Miles)
Full model:
Ŷᵢ = β₀ + β₁X₁ᵢ + β₂X₂ᵢ + β₃(X₁ᵢX₂ᵢ)
= β₀ + β₁X₁ᵢ + (β₂ + β₃X₁ᵢ)X₂ᵢ
Simple relationship of Y (time) and X₂ (miles):
Ŷᵢ = (intercept) + (slope)X₂ᵢ
Ŷᵢ = (β₀ + β₁X₁ᵢ) + (β₂ + β₃X₁ᵢ)X₂ᵢ

Slide 30: Simple Relationship of Time and Miles (cont.)
Ŷᵢ = (intercept) + (slope)X₂ᵢ
Ŷᵢ = (β₀ + β₁X₁ᵢ) + (β₂ + β₃X₁ᵢ)X₂ᵢ
b₀ = 19.20, b₁ = .302, b₂ = -.076, b₃ = -.005
Ŷᵢ = (19.20 + .302X₁ᵢ) + (-.076 - .005X₁ᵢ)X₂ᵢ
When X₁ (i.e. age) = 60: Ŷᵢ = 37.32 - .376X₂ᵢ
When X₁ (i.e. age) = 20: Ŷᵢ = 25.24 - .176X₂ᵢ

Slide 31: Interactive Model (figure)

Slide 32: Back to the Analysis
We’ve already looked at how you test to see if it is worthwhile to move from the additive model to the interactive model:
Model C: Ŷᵢ = β₀ + β₁X₁ᵢ + β₂X₂ᵢ
Model A: Ŷᵢ = β₀ + β₁X₁ᵢ + β₂X₂ᵢ + β₃(X₁ᵢX₂ᵢ)
H₀: β₃ = 0
Hₐ: β₃ ≠ 0
The next topic involves the interpretation of the partial regression coefficients.

Slide 33: Interpreting Partial Regression Coefficients
Additive model: Ŷᵢ = β₀ + β₁X₁ᵢ + β₂X₂ᵢ
We’ve covered this in previous chapters. The values of β₁ and β₂ are the slopes of the regression of Y on that variable when the other variable is held constant (i.e. the slope across values of the other variable). Look back at the scatterplots for the additive model: β₁ is the slope of the relationship between Y and X₁ at various values of X₂, and note that the slope doesn’t change.

Slide 34: Interpreting Partial Regression Coefficients (cont.)
Interactive model: Ŷᵢ = β₀ + β₁X₁ᵢ + β₂X₂ᵢ + β₃(X₁ᵢX₂ᵢ)
When X₁ and X₂ interact, the slope of the relationship between Y and X₁ changes across values of X₂, so what does β₁ reflect? Answer: β₁ is the slope of the relationship between Y and X₁ when X₂ = 0. Note: the slope will be different for other values of X₂. Likewise, β₂ is the slope of the relationship between Y and X₂ when X₁ = 0.

Slide 35: Interpreting β₁ and β₂ (cont.)
So β₁ is the slope of the regression of Y on X₁ when X₂ = 0; in other words, the slope of the regression of Time on Age for runners who trained 0 miles per week (even though none of our runners trained that little). β₂ is the slope of the regression of Y on X₂ when X₁ = 0; in other words, the slope of the regression of Time on Miles for runners who are 0 years old! This is not what we are interested in!

Slide 36: Better Alternative
A better alternative, for when scores of zero on our predictor variables are not of interest, is to use mean deviation scores instead (this is called ‘centering’ our data):
X′₁ᵢ = X₁ᵢ - mean of X₁
X′₂ᵢ = X₂ᵢ - mean of X₂
Then regress Y on X′₁ and X′₂:
Ŷᵢ = β₀ + β₁X′₁ᵢ + β₂X′₂ᵢ + β₃(X′₁ᵢX′₂ᵢ)
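A sketch of centering in Python, again assuming the hypothetical runners.csv:

```python
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("runners.csv")   # hypothetical file: time, age, miles

# Mean deviation ("centered") scores:
df["age_c"]   = df["age"]   - df["age"].mean()
df["miles_c"] = df["miles"] - df["miles"].mean()

centered = smf.ols("time ~ age_c + miles_c + age_c:miles_c", data=df).fit()
# b1 is now the Time-on-Age slope at average miles, b2 the Time-on-Miles
# slope at average age; b3 is the same as in the uncentered model.
print(centered.params)
```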

Slide 37: Interpreting β₁ and β₂ Now
β₁ is still the slope of the regression of Y on X₁ when X′₂ = 0, but now X′₂ = 0 when X₂ = the mean of X₂, which is much more relevant: we now have the relationship between Time and Age for runners who trained an average amount. β₂ is the slope of the regression of Y on X₂ when X′₁ = 0, but now X′₁ = 0 when X₁ = the mean of X₁; i.e., we now have the relationship between Time and Miles for runners who were at the average age of our sample.

Slide 38: Interpreting β₀
For the model Ŷᵢ = β₀ + β₁X₁ᵢ + β₂X₂ᵢ + β₃(X₁ᵢX₂ᵢ):
β₀ is the value of Y when all the predictor scores equal zero (rarely of interest).
For the model Ŷᵢ = β₀ + β₁X′₁ᵢ + β₂X′₂ᵢ + β₃(X′₁ᵢX′₂ᵢ):
β₀ = μ_Y (due to the use of mean deviation scores), and the confidence interval for β₀ is thus the confidence interval for μ_Y.

Slide 39: Interpreting β₃
Ŷᵢ = β₀ + β₁X₁ᵢ + β₂X₂ᵢ + β₃(X₁ᵢX₂ᵢ)
β₃ represents how much the slope on one variable changes as the other variable changes by 1. It is not influenced by whether you use X₁ or X′₁, or X₂ or X′₂. So β₃ would be the same in both of the following models:
Ŷᵢ = β₀ + β₁X₁ᵢ + β₂X₂ᵢ + β₃(X₁ᵢX₂ᵢ)
Ŷᵢ = β₀ + β₁X′₁ᵢ + β₂X′₂ᵢ + β₃(X′₁ᵢX′₂ᵢ)
But the values of β₀, β₁ and β₂ would be different in the two models.

Slide 40: Interpreting β₃ (cont.)
Important note: β₃ represents the interaction of X₁ and X₂ only when both of those variables are included by themselves in the model. For example, in the following model β₃ would not represent the interaction of X₁ and X₂, because β₂X₂ᵢ is not included in the model:
Ŷᵢ = β₀ + β₁X₁ᵢ + β₃(X₁ᵢX₂ᵢ)

Slide 41: Other Transformations
As we have seen, using X′ = (X - mean of X) gives us meaningful β’s, as each partial regression coefficient becomes the simple relationship of the corresponding variable when the other variable equals its mean. We can use other transformations as well. X″₁ᵢ = (X₁ᵢ - 50) allows us to look at the simple relationship between miles (X₂) and time (Y) when age (X₁) = 50.

Slide 42: Regular model: Ŷᵢ = β₀ + β₁X₁ᵢ + β₂X₂ᵢ + β₃(X₁ᵢX₂ᵢ)
Model with transformed X₁ᵢ: Ŷᵢ = β₀ + β₁X″₁ᵢ + β₂X₂ᵢ + β₃(X″₁ᵢX₂ᵢ)
Transforming the X₁ᵢ score to X″₁ᵢ will:
1. Affect the value of β₂ (it now gives the slope of the relationship between X₂ and Y when X₁ = 50).
2. Not affect β₁ (still the slope of the relationship between X₁ and Y when X₂ = 0).
3. Not affect β₃ (the coefficient of the interaction term is not affected by transformations of its components as long as all components are included in the model).

Slide 43: Power Considerations
The confidence interval formula is the same for all partial regression coefficients, whether the coefficient belongs to an interactive term or not:
bⱼ ± t · √( MSE / [ Σ(Xᵢⱼ - X̄ⱼ)² (1 - R²ⱼ) ] )
where t is the critical value for the error degrees of freedom and (1 - R²ⱼ) is the tolerance of predictor j (R²ⱼ comes from regressing Xⱼ on the other predictors).

Slide 44: Power Considerations (cont.)
Smaller confidence intervals mean more power:
1. Smaller MSE (i.e. less error in the model) means more power.
2. Larger tolerance (1 - R²) means more power.
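The computation behind those two points can be sketched as a small function (a hand-rolled version of what any regression package reports; the argument names are ours):

```python
import numpy as np

def coef_confidence_interval(b_j, mse, x_j, r2_j, t_crit):
    """Confidence interval for one partial regression coefficient b_j.

    mse    : mean squared error of Model A
    x_j    : values of predictor j (the formula is the same whether j
             is a plain predictor, a product term, or a squared term)
    r2_j   : R-squared from regressing predictor j on the other predictors
    t_crit : critical t for the chosen alpha and error df
    """
    ss_j = np.sum((x_j - x_j.mean()) ** 2)
    tolerance = 1.0 - r2_j
    se = np.sqrt(mse / (ss_j * tolerance))        # smaller MSE or larger
    return b_j - t_crit * se, b_j + t_crit * se   # tolerance -> tighter CI
```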

Slide 45: Power, Transformations, and Redundancy
Using transformed scores (e.g. mean deviations) can change the redundancy of the interaction term with its component terms, which you might expect to change the confidence intervals and thus the power. But this change in redundancy is completely counterbalanced by changes in MSE, so using transformed scores will not affect the confidence intervals or power. So...

Slide 46: The Point Being...
If your stat package won’t let you include an interaction term because it is too redundant with its component terms (i.e. its tolerance is too low), then you can try using mean deviation component terms, which will change the redundancy of the interaction term with its components without altering the confidence interval of the interaction term.
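The redundancy claim is easy to demonstrate with simulated data (the numbers below are made up for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)
x1 = rng.uniform(20, 70, 200)   # e.g. ages
x2 = rng.uniform(10, 60, 200)   # e.g. training miles

def corr(a, b):
    return np.corrcoef(a, b)[0, 1]

# The raw product term is usually highly correlated with its components...
print(corr(x1 * x2, x1), corr(x1 * x2, x2))

# ...while centering the components first typically cuts that redundancy
# dramatically, often to near zero.
x1c, x2c = x1 - x1.mean(), x2 - x2.mean()
print(corr(x1c * x2c, x1c), corr(x1c * x2c, x2c))
```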

Slide 47: Polynomial (Non-linear) Regressions
What we have learned about examining the interaction of variables also provides exactly what we need to see whether there might be non-linear relationships between the predictor variables and the criterion variable (Y).

Slide 48: Polynomial (Non-linear) Regressions (cont.)
Let’s say we suspect that the relationship between Time and Miles is not the same across all levels of Miles. In other words, adding 5 more miles per week of training when you are currently at 10 miles per week will have a different effect than adding 5 more miles when you are currently training at 50 miles per week. To say that Miles + 5 has a different effect when Miles = 10 than when Miles = 50 is to say that the slope is different at 10 than at 50.

Slide 49: X₂ Interacting With Itself
In essence we are saying that X₂ is interacting with itself.
Previous model: Ŷᵢ = β₀ + β₁X₁ᵢ + β₂X₂ᵢ + β₃(X₁ᵢX₂ᵢ)
This model (ignore X₁ and use X₂ twice): Ŷᵢ = β₀ + β₁X₂ᵢ + β₂X₂ᵢ + β₃(X₂ᵢX₂ᵢ)

Slide 50: Interaction Model
Ŷᵢ = β₀ + β₁X₂ᵢ + β₂X₂ᵢ + β₃(X₂ᵢX₂ᵢ), or,
Ŷᵢ = β₀ + β₁X₂ᵢ + β₂X₂ᵢ + β₃(X₂ᵢ²)
However, we cannot calculate the b’s because the variables that go with β₁ and β₂ are completely redundant (they are the same variable, thus tolerance = 0), so we drop one of them (which makes conceptual sense in terms of model building) and get:
Ŷᵢ = β₀ + β₁X₂ᵢ + β₂(X₂ᵢ²)
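Fitting the quadratic model uses the same trick as the interaction: add a column that is X₂ squared. A sketch with the hypothetical data frame from earlier:

```python
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("runners.csv")   # hypothetical file: time, miles, ...

# I(miles**2) tells the formula interface to square the column literally
# rather than interpret ** as formula-operator syntax.
quadratic = smf.ols("time ~ miles + I(miles**2)", data=df).fit()
print(quadratic.params)   # b0, b1 (linear), b2 (quadratic)
```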

Slide 51: In Terms of the Simple Relationship
Now let’s once again organize this into the simple relationship between Y and X₂ so we can see how it works.
Model: Ŷᵢ = β₀ + β₁X₂ᵢ + β₂X₂ᵢ²
Ŷᵢ = (intercept) + (slope)X₂ᵢ
Ŷᵢ = (β₀ - β₂X₂ᵢ²) + (β₁ + 2β₂X₂ᵢ)X₂ᵢ
Where did those terms for the intercept and slope come from? I’ll show you later; for now, take my word for it.

Slide 52: Simple Relationship (cont.)
Ŷᵢ = (intercept) + (slope)X₂ᵢ
Ŷᵢ = (β₀ - β₂X₂ᵢ²) + (β₁ + 2β₂X₂ᵢ)X₂ᵢ
Note that the value of X₂ influences both the intercept and the slope of its simple relationship with Y. Thus the relationship (i.e. the slope) between X₂ and Y changes across values of the predictor variable itself.

Slide 53: Ŷᵢ = b₀ + b₁X₂ᵢ + b₂(X₂ᵢ²)
b₀ = 37.47, b₁ = -.753, b₂ = .008
Ŷᵢ = 37.47 - .753X₂ᵢ + .008(X₂ᵢ²)
Ŷᵢ = (intercept) + (slope)X₂ᵢ
Ŷᵢ = (b₀ - b₂X₂ᵢ²) + (b₁ + 2b₂X₂ᵢ)X₂ᵢ
Ŷᵢ = (37.47 - .008X₂ᵢ²) + (-.753 + 2(.008)X₂ᵢ)X₂ᵢ
When X₂ = 0: Ŷᵢ = 37.47 + (-.753)X₂ᵢ
When X₂ = 20: Ŷᵢ = 34.22 + (-.433)X₂ᵢ (the intercept comes to 34.27 if computed from the rounded b₂ shown here)
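The slide’s arithmetic can be packaged as a small function that returns the tangent-line intercept and slope at any training level (coefficients taken from the slide):

```python
b0, b1, b2 = 37.47, -0.753, 0.008   # fitted values from the slide

def tangent_line(x2):
    """Intercept and slope of the tangent to Y-hat = b0 + b1*X2 + b2*X2**2
    at the point X2 = x2."""
    return b0 - b2 * x2 ** 2, b1 + 2 * b2 * x2

print(tangent_line(0))    # (37.47, -0.753), matching the slide
print(tangent_line(20))   # (34.27, -0.433); the slide's 34.22 reflects
                          # a less-rounded value of b2
```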

Slide 54: Nonlinear Relationship (figure)
The relationship between Time and Miles at any particular value of Miles is the line that is tangent to the curve at that point.

Slide 55: Nonlinear Relationship (cont.)
More importantly: the curve above is the regression line we are fitting to the data with the squared term in the model.

Slide 56: Interpreting β₀
Model: Ŷᵢ = β₀ + β₁X₂ᵢ + β₂X₂ᵢ²
β₀ is the predicted value of Y when X₂ = 0; in other words, it is the predicted time for a runner who runs zero miles per week. If we use mean deviation scores for X, then this would be the predicted time for a runner who runs an average number of miles per week.

Slide 57: Interpreting β₁
Model: Ŷᵢ = β₀ + β₁X₂ᵢ + β₂X₂ᵢ²
β₁ is the slope of the relationship between Y and X₂ when X₂ = 0. The slope will be different at other values of X₂.

Slide 58: Interpreting β₂
Model: Ŷᵢ = β₀ + β₁X₂ᵢ + β₂X₂ᵢ²
β₂ times 2 is how much the slope of the relationship between Y and X₂ changes when X₂ increases by 1. Why times 2?
Ŷᵢ = (intercept) + (slope)X₂ᵢ
Ŷᵢ = (β₀ - β₂X₂ᵢ²) + (β₁ + 2β₂X₂ᵢ)X₂ᵢ
When X₂ changes by 1, the slope changes by 2 times β₂. Another way of saying it: β₂ is half of how much the slope changes.

Slide 59: Interpreting β₂ (cont.)
This interpretation of the coefficient for a quadratic (or higher) term only applies if all of its component terms are included in the model.
Ŷᵢ = β₀ + β₁X₂ᵢ + β₂X₂ᵢ²: the interpretation of β₂ depends upon β₁ being there.
Ŷᵢ = β₀ + β₁X₂ᵢ + β₂X₂ᵢ² + β₃X₂ᵢ³: the interpretation of β₃ depends upon β₁ and β₂ being there.

Slide 60: Testing Significance of the Quadratic (i.e. X²) Term
Test significance as you always do, using the model comparison approach. To test the overall model that includes the quadratic term:
Model C: Ŷᵢ = β₀
Model A: Ŷᵢ = β₀ + β₁X₂ᵢ + β₂(X₂ᵢ²)
H₀: β₁ = β₂ = 0
Hₐ: at least one of those betas is not zero.

Slide 61: Testing Significance of the Quadratic Term (cont.)
To test whether adding the quadratic term is worthwhile compared to a linear model:
Model C: Ŷᵢ = β₀ + β₁X₂ᵢ
Model A: Ŷᵢ = β₀ + β₁X₂ᵢ + β₂(X₂ᵢ²)
The test of the partial regression coefficient does this for you.

Slide 62: What About the Linear Term (i.e. X)?
Model: Ŷᵢ = β₀ + β₁X₂ᵢ + β₂(X₂ᵢ²)
The t tests for the regression coefficients will tell you whether each β is significantly different from 0. What if, in the example above, β₂ is significant but β₁ is not? Should you drop β₁X₂ᵢ from your model and keep β₂(X₂ᵢ²)? No: the components of X₂² (in this case just X₂) give the analysis of X₂² its meaning. If the model included X³, we would need to include X² and X in the model for the analysis of X³ to have meaning, and so on.

Slide 63: Why?
Our goal is to move forward a step at a time in the complexity of the model. We start with what can be explained linearly, then see how much can be explained above and beyond that by including a quadratic term (i.e. the partial correlation from adding the quadratic term last to a model that contains the linear term). We lose that meaning of the quadratic partial correlation if the linear term is dropped from the model. Also note that the correlation between two powers of a variable (e.g. X and X²) tends to be very high, meaning that they are quite redundant; it is not surprising that the linear term might be non-significant when the quadratic term is in the model.

Slide 64: Mean Deviation Scores
If mean deviation scores are used (i.e. X′), then for Ŷᵢ = β₀ + β₁X′₂ᵢ + β₂(X′₂ᵢ²):
1. The coefficient for X (i.e. β₁) is the slope of the simple relationship between X and Y when X equals its mean.
2. The coefficient for the quadratic term (i.e. β₂) is not affected (as long as all of its components are included in the model).

Slide 65: General Approach for Arriving at ‘Simple’ Relationships
Being able to turn a complicated model into the simple relationship between Y and the various predictors can be a big aid in understanding how the model works. In other words, turning Ŷᵢ = β₀ + β₁X₁ᵢ + ... + β_pX_pᵢ into:
Ŷᵢ = (intercept) + (slope)X₁ᵢ
Ŷᵢ = (intercept) + (slope)X₂ᵢ
etc.

Slide 66: General Approach
We need to find what is called the ‘partial derivative’ of the model with respect to the particular variable whose simple relationship with Y we would like to examine. We will symbolize the partial derivative as: Model_pd

Slide 67: General Approach (cont.)
Then, to create the simple relationship Ŷᵢ = (intercept) + (slope)X:
1. Intercept = Model - (Model_pd)(X)
2. Slope = Model_pd
I would say stop there, but if you must know... combining the simple formula and the two pieces given above:
Ŷᵢ = (Model - (Model_pd)(X)) + (Model_pd)X
which, while correct, just looks confusing, and there is no reason to go there.

Slide 68: Example
Model we will be working with: Ŷᵢ = β₀ + β₁X₁ᵢ + β₂X₂ᵢ + β₃(X₁ᵢ²)
We want to know the simple relationship between X₁ and Y. To keep the notation simple, we will call the predictor variable of interest X, and give the other predictor variables other letters (in this case we will use Z to stand for X₂):
Ŷᵢ = β₀ + β₁X + β₂Z + β₃(X²)

Slide 69: Rules for Arriving at the Partial Derivative
1. To find the partial derivative of items that are summed together, find the partial derivative of each item and add those together.
2. The partial derivative of aXᵐ is amXᵐ⁻¹. Note that:
   a. X¹ = X, so the partial derivative of the term 3X² would be (3)(2)(X¹) = 6X.
   b. X⁰ = 1, so the partial derivative of the term 2X would be (2)(1)(X⁰) = (2)(1)(1) = 2.
3. The partial derivative of any term that doesn’t contain X is 0.

Slide 70: Solution for our Example
Model: β₀ + β₁X + β₂Z + β₃X²
Model_pd: 0 + (1)(β₁)(X⁰) + 0 + (2)(β₃)(X¹)
Model_pd: β₁ + 2β₃X
intercept = Model - (Model_pd)X
= β₀ + β₁X + β₂Z + β₃X² - (β₁ + 2β₃X)X
= β₀ + β₁X + β₂Z + β₃X² - (β₁X + 2β₃X²)
= β₀ + β₁X + β₂Z + β₃X² - β₁X - 2β₃X²
= β₀ + β₂Z - β₃X²
slope = Model_pd = β₁ + 2β₃X
Ŷᵢ = (intercept) + (slope)X
Ŷᵢ = (β₀ + β₂Z - β₃X²) + (β₁ + 2β₃X)X

Slide 71: Interpretation
Ŷᵢ = (β₀ + β₂Z - β₃X²) + (β₁ + 2β₃X)X
So what does this tell us? It tells us that at any particular value of X, the relationship between X and Y (i.e. the slope) is affected by the value of X itself, and that the intercept (which moves the regression line up or down on the Y axis) is influenced by both Z and X. This may not seem all that important, but in some complex models it might lead to a better understanding of the relationship between X and Y (to see what role the other variables, and X itself, play in that relationship).

Slide 72: Interpretation (cont.)
Ŷᵢ = (β₀ + β₂Z - β₃X²) + (β₁ + 2β₃X)X
We can also plug in specific values for X and Z, along with SPSS’s estimates of the β’s, to see the relationship between Y and X at that point. For example, if SPSS computes b₀ = 3, b₁ = 2.5, b₂ = 6.1, b₃ = 7, and we want to know the relationship between Y and X when X = 2 and Z = 5, then we have:
Ŷᵢ = (3 + 6.1(5) - 7(4)) + (2.5 + 2(7)(2))X
Ŷᵢ = 5.5 + 30.5X
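The whole procedure, including the plug-in step, can be checked symbolically; a sketch with sympy, using the hypothetical b values above:

```python
import sympy as sp

X, Z = sp.symbols("X Z")
b0, b1, b2, b3 = 3, sp.Rational(5, 2), sp.Rational(61, 10), 7   # 3, 2.5, 6.1, 7

model = b0 + b1 * X + b2 * Z + b3 * X ** 2
model_pd = sp.diff(model, X)                 # partial derivative: b1 + 2*b3*X

slope = model_pd
intercept = sp.expand(model - model_pd * X)  # simplifies to b0 + b2*Z - b3*X**2
print(intercept, "+ (", slope, ") * X")

# Plug in X = 2 and Z = 5 to reproduce the slide's line 5.5 + 30.5X:
print(intercept.subs({X: 2, Z: 5}), slope.subs(X, 2))   # 11/2 and 61/2
```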

