
1 Multiple Logistic Regression RSQUARE, LACKFIT, SELECTION, and interactions

2 Introduction Just as with linear regression, logistic regression allows you to look at the effect of multiple predictors on an outcome. Consider the following example: 15- and 16-year-old adolescents were asked if they had ever had sexual intercourse. The outcome of interest is intercourse. The predictors are race (white or black) and gender (male or female). Example from Agresti, A., Categorical Data Analysis, 2nd ed., 2002.

3 Here is a table of the data:

                  Intercourse
  Race    Gender    Yes    No
  White   Male       43   134
          Female     26   149
  Black   Male       29    23
          Female     22    36

4 Entering the Data in SAS The data set intercourse is created with the variables “white” (1 if white, 0 if black), “male” (1 if male, 0 if female), and “intercourse” (1 if yes, 0 if no). We want to examine the odds of having intercourse with race and gender as predictors. Enter the code on the next slide into SAS.

5 Creating the Data Set Intercourse
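
The slide's code image is not reproduced in this transcript. Here is a minimal sketch of a DATA step that builds the table from slide 3 in grouped form, assuming the 0/1 coding described on slide 4 plus a frequency variable count (the variable name count is an assumption, not from the original slides):

    data intercourse;
       input white male intercourse count;  /* 1=white, 1=male, 1=yes */
       datalines;
    1 1 1 43
    1 1 0 134
    1 0 1 26
    1 0 0 149
    0 1 1 29
    0 1 0 23
    0 0 1 22
    0 0 0 36
    ;
    run;

Grouped data like this can be passed to PROC LOGISTIC with a FREQ statement, so each row is counted as count observations.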

6 Multiple Logistic Regression: Main Effects First, look at the effects of race and gender with no interaction. The SAS code is similar to that for simple logistic regression; one more independent variable has been added to the MODEL statement.

7 Enter the following code into SAS: "descending" models the probability that intercourse = 1 (yes) rather than 0 (no). "rsquare" requests the R² value from SAS; it is interpreted the same way as the R² from linear regression. "lackfit" requests the Hosmer and Lemeshow Goodness-of-Fit Test, which tells you whether the model you have created is a good fit for the data.
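
The code itself appears only as an image in the original slides; here is a sketch consistent with the options described above, assuming the grouped data set and count variable from the earlier sketch:

    proc logistic data=intercourse descending;
       model intercourse = white male / rsquare lackfit;
       freq count;  /* each row represents 'count' adolescents */
    run;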

8 SAS Output: R²

9 Interpreting the R² value The R² value is 0.9907. This means that 99.07% of the variability in our outcome (intercourse) is explained by including gender and race in our model.

10 PROC LOGISTIC Output

11 Interpreting Output Notice that the race and gender terms are both statistically significant (p < 0.0001 and p = 0.0040, respectively). The logistic regression model is: log(odds) = β0 + β1(white) + β2(male) = -0.4555 - 1.3135(white) + 0.6478(male). The odds of having intercourse are 73.1% (1 - 0.269) lower for whites than for blacks, since exp(-1.3135) = 0.269. The odds of having intercourse are 1.911 times greater for males than for females, since exp(0.6478) = 1.911.

12 Suppose you wanted to know the odds of intercourse for black males versus white females:
log(odds) for black males = β0 + β1(0) + β2(1) = β0 + β2
log(odds) for white females = β0 + β1(1) + β2(0) = β0 + β1
log(OR) = (β0 + β2) - (β0 + β1) = β2 - β1 = 0.6478 - (-1.3135) = 1.9613
OR = exp(1.9613) = 7.11
Black males have 7.11 times greater odds of having intercourse than white females.
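
PROC LOGISTIC can also compute this odds ratio directly with its CONTRAST statement; a sketch on the same main-effects model (the contrast coefficients -1 and 1 pick out β2 - β1, and the ESTIMATE=EXP option exponentiates it):

    proc logistic data=intercourse descending;
       model intercourse = white male;
       freq count;
       /* exp(beta2 - beta1): odds for black males vs. white females */
       contrast 'black male vs white female' white -1 male 1 / estimate=exp;
    run;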

13 Hosmer and Lemeshow GOF Test

14 Interpreting the H-L GOF Test The Hosmer and Lemeshow Goodness-of-Fit Test tests the hypotheses H0: the model is a good fit, vs. Ha: the model is NOT a good fit. With this test, we want to FAIL to reject the null hypothesis, because that means our model is a good fit (this is different from most of the hypothesis testing you have seen). Look for a p-value > 0.10 in the H-L GOF test; this indicates the model is a good fit. In this case, the p-value = 0.2419, so we do NOT reject the null hypothesis, and we conclude the model is a good fit.

15 Let’s consider an interaction between race and gender: We have added a third term to the model: the interaction between race and gender (“white*male”). We did not need to create this variable in the data set.
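
The slide's code is not reproduced; here is a sketch of the interaction model, assuming the same grouped data set. PROC LOGISTIC accepts the crossed term white*male directly in the MODEL statement, so no new variable is needed:

    proc logistic data=intercourse descending;
       model intercourse = white male white*male / rsquare lackfit;
       freq count;
    run;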

16 The new R² value is 0.9908, which is barely higher than the R² from the model with only the main effects. Adding the interaction did not help explain more of the variability in the outcome.

17 Logistic Regression Output

18 The interaction is not significant (p = 0.8092). We probably will not want to include it in our model. If it were significant, the model would be: log(odds) = β0 + β1(white) + β2(male) + β3(white*male) = -0.4925 - 1.2534(white) + 0.7243(male) - 0.1151(white*male)

19 H-L Goodness-of-Fit

20 The p-value of the Hosmer and Lemeshow GOF Test is 0.2439, which is not much greater than that of the previous model without the interaction. Therefore, we conclude the model with just race and gender, without the interaction, is sufficient.

21 Model Selection in SAS Often, if you have multiple predictors and interactions in your model, SAS can systematically select significant predictors using forward selection, backward selection, or stepwise selection. In forward selection, SAS starts with no predictors in the model. It then adds the predictor with the smallest p-value, then adds the predictor with the smallest p-value among the remaining variables, and continues until no more predictors have p-values less than 0.05. In backward selection, SAS starts with all of the predictors in the model and eliminates the non-significant predictors one at a time, refitting the model between each elimination. It stops once all the predictors remaining in the model are statistically significant. Stepwise selection combines the two: predictors are added as in forward selection, but a predictor already in the model can be removed later if it loses significance.

22 Forward Selection in SAS We will let SAS select a model for us out of the three predictors: white, male, white*male. Type the following code into SAS:
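
The slide's code is an image in the original; here is a sketch using the SELECTION= option of the MODEL statement, assuming the grouped data set from earlier (the entry significance level, SLENTRY, defaults to 0.05 in PROC LOGISTIC):

    proc logistic data=intercourse descending;
       model intercourse = white male white*male
             / selection=forward lackfit;
       freq count;
    run;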

23 Output from Forward Selection: “white” is added to the model

24 “male” is added to the model

25 No more predictors are found to be statistically significant

26 The Final Model: the selected model contains white and male only, the same main-effects model as before: log(odds) = -0.4555 - 1.3135(white) + 0.6478(male)

27 Hosmer and Lemeshow GOF Test: The model is a good fit

28 You are now familiar with multiple logistic regression and model selection in SAS. If given multiple predictors, you have the tools to find an appropriate model that explains the outcome of interest.

