1
Regression analysis: linear and logistic
2
Linear correlation and linear regression
3
Example: class data
7
New concept: Covariance
8
Interpreting Covariance
Covariance between two random variables:
cov(X,Y) > 0: X and Y tend to move in the same direction
cov(X,Y) < 0: X and Y tend to move in opposite directions
cov(X,Y) = 0: X and Y have no linear relationship (they are uncorrelated, though not necessarily independent)
9
Correlation coefficient
Pearson’s Correlation Coefficient is standardized covariance (unitless): r = cov(X,Y) / [SD(X) × SD(Y)]
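A quick numpy sketch (not from the slides; the paired measurements are made up for illustration) shows how standardizing the covariance yields r:

```python
import numpy as np

# Made-up paired measurements, purely for illustration
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 3.9, 6.2, 8.0, 9.8])

# Sample covariance: average product of deviations from the means
cov_xy = np.sum((x - x.mean()) * (y - y.mean())) / (len(x) - 1)

# Pearson's r divides out both standard deviations, which removes
# the units and bounds the result to [-1, 1]
r = cov_xy / (x.std(ddof=1) * y.std(ddof=1))
```

Because these x and y values are nearly perfectly linear, r comes out close to +1.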
10
Correlation
Measures the relative strength of the linear relationship between two variables
Unit-less
Ranges between –1 and 1
The closer to –1, the stronger the negative linear relationship
The closer to +1, the stronger the positive linear relationship
The closer to 0, the weaker any linear relationship
11
Scatter Plots of Data with Various Correlation Coefficients
[Six scatter plots of Y against X, illustrating r = –1, r = –.6, r = 0, r = +1, r = +.3, and r = 0]
** Next 4 slides from “Statistics for Managers,” 4th Edition, Prentice-Hall 2004
12
Linear Correlation
[Scatter plots contrasting linear relationships with curvilinear relationships]
13
Linear Correlation
[Scatter plots contrasting strong relationships with weak relationships]
14
Linear Correlation
[Scatter plot showing no relationship between Y and X]
15
Review Problem 1
What’s a good guess for the Pearson’s correlation coefficient (r) for this scatter plot?
–1.0
+1.0
–.5
–.1
17
Linear regression In correlation, the two variables are treated as equals. In regression, one variable is considered independent (=predictor) variable (X) and the other the dependent (=outcome) variable Y.
18
What is “Linear”? Remember this: Y = mX + B? (m is the slope, B the intercept)
19
What’s Slope? A slope of 2 means that every 1-unit change in X yields a 2-unit change in Y.
20
Simple linear regression
The linear regression model: Hours of exercise/week = intercept + slope × (MS writing enjoyment score)
21
Simple linear regression
Wake Time = intercept + slope × (Hours of exercise/week)
Every additional hour of weekly exercise costs you about 12 minutes of sleep in the morning.
22
EXAMPLE The distribution of baby weights at Stanford ~ N(3400, 360000): mean 3400 grams, variance 360,000 (SD = 600 grams)
Your “Best guess” at a random baby’s weight, given no information about the baby, is what? 3400 grams But, what if you have relevant information? Can you make a better guess?
23
Predictor variable X=gestation time
Assume that babies that gestate for longer are born heavier, all other things being equal. Pretend (at least for the purposes of this example) that this relationship is linear. Example: suppose a one-week increase in gestation, on average, leads to a 100-gram increase in birth-weight
24
Y depends on X Y=birth- weight (g) X=gestation time (weeks)
The best-fit line is chosen such that the sum of the squared (why squared?) vertical distances of the points (Yi’s) from the line is minimized: Σ(Yi – Ŷi)²
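A minimal least-squares sketch (not from the slides; the gestation/weight numbers are hypothetical, chosen to be roughly consistent with the example's 100 grams/week slope):

```python
import numpy as np

# Hypothetical gestation times (weeks) and birth weights (grams)
weeks = np.array([28.0, 30.0, 32.0, 34.0, 36.0, 38.0, 40.0])
grams = np.array([2850.0, 3010.0, 3180.0, 3420.0, 3600.0, 3790.0, 4020.0])

# Closed-form least-squares solution: the slope b and intercept a
# that minimize sum((Y_i - (a + b * X_i)) ** 2)
b = np.sum((weeks - weeks.mean()) * (grams - grams.mean())) / np.sum((weeks - weeks.mean()) ** 2)
a = grams.mean() - b * weeks.mean()

residuals = grams - (a + b * weeks)  # least-squares residuals sum to zero
```

The fitted slope here lands near 100 grams per extra week of gestation, and matches what `np.polyfit(weeks, grams, 1)` returns.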
25
Prediction A new baby is born that had gestated for just 30 weeks. What’s your best guess at the birth-weight? Are you still best off guessing 3400? NO!
26
At 30 weeks…
[Scatter plot of Y = birth weight (g) vs. X = gestation time (weeks), highlighting Y ≈ 3000 at X = 30]
27
At 30 weeks…
[Scatter plot marking the point (x, y) = (30, 3000): birth weight 3000 g at gestation time 30 weeks]
28
At 30 weeks… The babies that gestate for 30 weeks appear to center around a weight of 3000 grams. In Math-Speak… E(Y|X=30 weeks) = 3000 grams (note the conditional expectation)
29
But…
Yi = 3000 + random error_i
Note that not every Y-value (Yi) sits on the line. There’s variability. In fact, babies that gestate for 30 weeks have birth-weights that center at 3000 grams, but vary around 3000 with some variance σ². Approximately what distribution do birth-weights follow? Normal. Y|X=30 weeks ~ N(3000, σ²)
30
And, if X=20, 30, or 40…
[Scatter plot of Y = birth weight (g) vs. X = gestation time (weeks)]
31
If X=20, 30, or 40…
Y|X=40 weeks ~ N(4000, σ²)
Y|X=30 weeks ~ N(3000, σ²)
Y|X=20 weeks ~ N(2000, σ²)
[Plot of Y = baby weights (g) vs. X = gestation times (weeks), showing the conditional distributions at X = 20, 30, and 40]
32
The standard error of Y given X (Sy/x) is the average variability around the regression line at any given value of X. It is assumed to be equal at all values of X.
[Plot of Y = baby weights (g) vs. X = gestation times (weeks), with equal Sy/x spread at X = 20, 30, and 40]
33
Linear Regression Model
Y’s are modeled…
Yi = α + β*Xi + random error_i
α + β*Xi is fixed: exactly on the line
The random error follows a normal distribution
34
Review Problem 2 Using the regression equation:
E(Y|X) = 100 grams/week × X weeks
What is the expected weight of a baby born at 22 weeks?
2000g
2100g
2200g
2300g
2400g
36
Review Problem 3 Our model predicts that:
All babies born at 22 weeks will weigh 2200 grams. Babies born at 22 weeks will have a mean weight of 2200 grams with some variation. Both of the above. None of the above.
38
Assumptions (or the fine print)
Linear regression assumes that…
1. The relationship between X and Y is linear
2. Y is distributed normally at each value of X
3. The variance of Y at every value of X is the same (homogeneity of variances)
39
Non-homogeneous variance
[Scatter plot of Y = birth weight (100g) vs. X = gestation time (weeks), showing unequal spread across X]
40
Residual
Residual = observed value – predicted value
At 33.5 weeks gestation, the predicted baby weight is 3350 grams. This baby was actually 3380 grams, so his residual is +30 grams.
41
Review Problem 4 A medical journal article reported the following linear regression equation: Cholesterol = *(age past 40) Based on this model, what is the expected cholesterol for a 60 year old? 150 370 230 190 200
43
Review Problem 5 If a particular 60 year old in your study sample had a cholesterol of 250, what is his/her residual? +50 -50 +60 -60
45
A t-test is linear regression!
In our class the average drinking in the Democrats (politics 6-10, n=15) was 3.2 drinks/week; in Republicans (n=5), this value was 0.4 drinks/week. We can evaluate these data with a t-test (assuming alcohol consumption is normally distributed):
46
As a linear regression…
alcohol = intercept + slope × (1=Republican; 0=not)
The fitted intercept is the non-Republican mean (3.2 drinks/week) and the fitted slope is the difference in group means (0.4 – 3.2 = –2.8): exactly the quantities the t-test compares.
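A small numpy sketch of this equivalence (the individual drink counts below are hypothetical, constructed only to match the group means of 3.2 and 0.4 from the slide):

```python
import numpy as np

# Hypothetical drinks/week chosen to match the slide's group means
democrat = np.array([1.2, 2.2, 3.2, 4.2, 5.2])      # mean 3.2
republican = np.array([0.0, 0.2, 0.4, 0.6, 0.8])    # mean 0.4

y = np.concatenate([democrat, republican])
dummy = np.concatenate([np.zeros(5), np.ones(5)])   # 1 = Republican

# Regress alcohol on the 0/1 dummy
slope, intercept = np.polyfit(dummy, y, 1)
# intercept = reference-group mean; slope = difference in group means,
# which is exactly the quantity the two-sample t-test examines
```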
47
ANOVA is linear regression!
A categorical variable with more than two groups, e.g. groups 1, 2, and 3 (mutually exclusive):
Y = α (= value for group 1) + β1*(1 if in group 2) + β2*(1 if in group 3)
This is called “dummy coding”: multiple binary variables are created to represent being in each category (or not) of a categorical variable.
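A sketch of dummy coding with made-up data for three groups; the fitted coefficients recover the reference-group mean and the two mean differences:

```python
import numpy as np

# Hypothetical outcome for three mutually exclusive groups
y = np.array([5.0, 6.0, 7.0, 9.0, 10.0, 11.0, 1.0, 2.0, 3.0])
group = np.array([1, 1, 1, 2, 2, 2, 3, 3, 3])

# Dummy coding with group 1 as the reference category:
# one 0/1 column for each of the other groups
X = np.column_stack([
    np.ones(len(y)),              # intercept = mean of group 1
    (group == 2).astype(float),   # 1 if in group 2
    (group == 3).astype(float),   # 1 if in group 3
])

beta, *_ = np.linalg.lstsq(X, y, rcond=None)
# beta[0] is the group-1 mean; beta[1] and beta[2] are the
# differences of groups 2 and 3 from that reference mean
```

Here the group means are 6, 10, and 2, so the fit returns roughly (6, +4, -4).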
48
Multiple Linear Regression
More than one predictor…
Y = α + β1*X + β2*W + β3*Z
Each regression coefficient is the amount of change in the outcome variable that would be expected per one-unit change of the predictor, if all other variables in the model were held constant.
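A simulated sketch (the true coefficients 3.0, 2.0, and -1.5 are arbitrary demo values, not from the slides) showing least squares recovering each predictor's holding-the-others-constant effect:

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated outcome built from two predictors plus a little noise
n = 200
x = rng.normal(size=n)
w = rng.normal(size=n)
y = 3.0 + 2.0 * x - 1.5 * w + rng.normal(scale=0.1, size=n)

# Design matrix: intercept column plus one column per predictor
X = np.column_stack([np.ones(n), x, w])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
# Each fitted coefficient estimates the change in y per one-unit
# change in that predictor, holding the other predictor constant
```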
49
Functions of multivariate analysis:
Control for confounders Test for interactions between predictors (effect modification) Improve predictions
50
Review Problem 6 A medical journal article reported the following linear regression equation: Cholesterol = *(age past 40) + 10*(gender: 1=male, 0=female) Based on this model, what is the expected cholesterol for a 60 year-old man? 150 370 230 190 200
52
Table 3. Relationship of Combinations of Macronutrients to BP (SBP and DBP) for Men, Years 1 Through 6 of MRFIT: Multiple Linear Regression Analyses

Linear Regression Coefficient (z Score)
Variable                               SBP       DBP
Model 1
  Total protein, % kcal                (-1.10)   (-3.17)
  Cholesterol, mg/1000 kcal            (2.46)    (3.51)
  Saturated fatty acids, % kcal        (1.45)    (2.86)
  Polyunsaturated fatty acids, % kcal  (0.24)    (-1.22)
  Starch, % kcal                       (4.98)    (4.34)
  Other simple carbohydrates, % kcal   (1.35)    (0.04)
Model 2
  Total protein, % kcal                (-1.10)   (-2.77)
  Cholesterol, mg/1000 kcal            (2.14)    (3.19)
  Saturated fatty acids, % kcal        (1.73)    (4.08)
  Polyunsaturated fatty acids, % kcal  (0.08)    (-1.07)
  Starch, % kcal                       (4.65)    (4.35)

Models controlled for baseline age, race (black, nonblack), education, smoking, serum cholesterol. Circulation Nov 15;94(10):
53
In math terms:
SBP = intercept – 0.0346 × (% protein) + β_age × (Age) + …

Linear Regression Coefficient (z Score), Total protein, % kcal: SBP (-1.10), DBP (-3.17)
Translation: controlled for other variables in the model (as well as baseline age, race, etc.), every 1% increase in the percent of calories coming from protein correlates with a 0.0346 mmHg decrease in systolic BP. (NS)
Also (from a separate model), every 1% increase in the percent of calories coming from protein correlates with a decrease in diastolic BP. (significant)
DBP = intercept + β_protein × (% protein) + β_age × (Age) + …
54
Other types of multivariate regression
Multiple linear regression is for normally distributed outcomes Logistic regression is for binary outcomes Cox proportional hazards regression is used when time-to-event is the outcome
55
Overfitting
In multivariate modeling, you can get highly significant but meaningless results if you put too many predictors in the model. The model is fit perfectly to the quirks of your particular sample, but has no predictive ability in a new sample.
Example (hypothetical): In a randomized trial of an intervention to speed bone healing after fracture, researchers built a multivariate regression model to predict time to recovery in a subset of women (n=12). An automatic selection procedure came up with a model containing age, weight, use of oral contraceptives, and treatment status; the predictors were all highly significant and the model had a nearly perfect R-square of 99.5%. This is likely an example of overfitting: the researchers have fit a model to exactly their particular sample of data, but it will likely have no predictive ability in a new sample.
Rule of thumb: You need at least 10 subjects for each additional predictor variable in the multivariate regression model.
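This can be demonstrated by simulation (a sketch with made-up data, not the slides' example): with 12 subjects and an outcome that is pure noise, piling on random predictors drives R-square toward 1 anyway.

```python
import numpy as np

rng = np.random.default_rng(42)

# 12 "subjects" whose outcome is pure noise: there is nothing to predict
n = 12
y = rng.normal(size=n)

def r_squared(num_noise_predictors):
    """R-square from regressing y on purely random predictors."""
    X = np.column_stack([np.ones(n), rng.normal(size=(n, num_noise_predictors))])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    return 1.0 - (resid @ resid) / ((y - y.mean()) @ (y - y.mean()))

r2_few = r_squared(2)          # a couple of noise predictors: R-square stays modest on average
r2_saturated = r_squared(11)   # 11 predictors + intercept = 12 parameters for 12 subjects
# With as many parameters as subjects the fit is essentially perfect
# (R-square = 1) even though every predictor is meaningless
```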
56
Overfitting
Pure noise variables still produce good R2 values if the model is overfitted. [Figure: the distribution of R2 values from a series of simulated regression models containing only noise variables. Figure 1 from: Babyak MA. What You See May Not Be What You Get: A Brief, Nontechnical Introduction to Overfitting in Regression-Type Models. Psychosomatic Medicine 66 (2004).]
57
Overfitting example, class data…
PREDICTORS OF EXERCISE HOURS PER WEEK (multivariate model):

Variable      Beta    p-value
Intercept
Coffee
Wakeup
engSAT
mathSAT
writingLove           <.0001
Sleep

R-Square =
N=20, 7 parameters in the model!
58
Univariate models…

Variable      Beta      p-value
Coffee        0.05916   0.3990
Wakeup
MathSAT
EngSAT
Sleep
WritingLove
59
Logistic Regression
60
Example: Political party and alcohol…
This association could also be analyzed with logistic regression: Republican (yes/no) becomes the binary outcome. Alcohol (continuous) becomes the predictor.
61
Example: Political party and drinking…
62
The logistic model…
ln(p/(1-p)) = α + β1*X
The left-hand side is the logit function: the log odds of the outcome.
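In code, the logit and its inverse (a minimal sketch, not part of the original slides):

```python
import math

# The logit maps a probability p in (0, 1) onto the whole real line
def logit(p):
    return math.log(p / (1.0 - p))

# Its inverse, the logistic function, maps any real number back to (0, 1)
def inv_logit(x):
    return 1.0 / (1.0 + math.exp(-x))

# p = 0.5 means even odds (p / (1 - p) = 1), so its log odds are 0
even = logit(0.5)
```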
63
The Logit Model
Logit function (log odds) for individual i:
ln(odds) = α + β1x1 + β2x2 + β3x3 + β4x4 …
where α reflects the baseline odds (the log odds when all risk factors are zero) and β1x1 + β2x2 + … is the linear function of risk factors for individual i.
64
Review question 7
If X = .50, what is the logit (= log odds) of X?
.50
1.0
2.0
–.50
66
Example: political party and drinking…
Model: Log odds of being a Republican (outcome) = intercept + β × weekly drinks (predictor). Fit the data in logistic regression using a computer…
67
Fitted logistic model:
“Log Odds” of being a Republican = intercept + slope * (d/wk)
The slope for drinking can be directly translated into an odds ratio: OR = e^slope
Interpretation: every 1 drink more per week decreases your odds of being a Republican by 85% (95% CI is to 1.003)
68
The Logit Model
Logit function (log odds) for individual i:
ln(odds) = α + β1x1 + β2x2 + β3x3 + β4x4 …
where α reflects the baseline odds (the log odds when all risk factors are zero) and β1x1 + β2x2 + … is the linear function of risk factors for individual i.
69
To get back to OR’s… exponentiate the coefficient: OR = e^β
70
“Adjusted” Odds Ratio Interpretation
71
Adjusted odds ratio, continuous predictor
72
Practical Interpretation
The odds of disease increase multiplicatively by e^β for every one-unit increase in the exposure, controlling for other variables in the model.
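A tiny numeric sketch (the coefficient below is a made-up value, roughly log(1.5), not from any fitted model in the slides):

```python
import math

# Hypothetical fitted log odds ratio per one-unit increase in exposure
beta = 0.4055  # made-up value, approximately log(1.5)

odds_ratio = math.exp(beta)          # odds multiply by ~1.5 per unit
two_unit_or = math.exp(2.0 * beta)   # two units: odds multiply by OR**2

# The effect compounds multiplicatively rather than adding
```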
73
Practice interpreting the table from a case-control study:
74
Practice interpreting the table from a paper:
75
Review problem 8 In a cross-sectional study of heart disease in middle-aged men and women, 10% of men in the sample had prevalent heart disease compared with only 5% of women. After adjusting for age in multivariate logistic regression, the odds ratio for heart disease comparing males to females was 1.1 (95% confidence interval: 0.80—1.42). What conclusion can you draw? Being male increases your risk of heart disease. Age is a confounder of the relationship between gender and heart disease. There is a statistically significant association between gender and heart disease. The study had insufficient power to detect an effect.
77
Homework
Continue reading textbook
Problem Set 8
Journal article