Multiple Linear Regression: Partial Regression Coefficients
b_i is an Unstandardized Partial Slope

To obtain b1 in Ŷ = a + b1X1 + b2X2:
Predict Y from X2.
Predict X1 from X2.
Predict the residuals (Y − Ŷ) from the residuals (X1 − X̂1). That is, predict the part of Y that is not related to X2 from the part of X1 that is not related to X2.
The slope of this residual regression is b1 in Ŷ = a + b1X1 + b2X2.
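The residual-regression route above can be checked numerically. The following is a minimal Python/NumPy sketch on invented simulated data (the variable names and the data-generating model are illustrative, not from the lecture): regressing the X2-residuals of Y on the X2-residuals of X1 reproduces b1 from the full two-predictor regression.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200
x2 = rng.normal(size=n)
x1 = 0.6 * x2 + rng.normal(size=n)                  # predictors are correlated
y = 2.0 + 1.5 * x1 - 0.8 * x2 + rng.normal(size=n)  # hypothetical outcome

def ols(X, y):
    """Least-squares coefficients of y on X (intercept prepended)."""
    X = np.column_stack([np.ones(len(y)), X])
    return np.linalg.lstsq(X, y, rcond=None)[0]

def resid(y, x):
    """Part of y not related to x: residuals from a simple regression."""
    a, b = ols(x, y)
    return y - a - b * x

# Full model: Yhat = a + b1*X1 + b2*X2
a, b1, b2 = ols(np.column_stack([x1, x2]), y)

# Partial route: predict (Y minus Yhat|X2) from (X1 minus X1hat|X2)
b1_resid = ols(resid(x1, x2), resid(y, x2))[1]

print(b1, b1_resid)    # the two slopes agree
```

The agreement is exact (up to floating-point error), not approximate: the residual regression is algebraically the same computation as the partial slope.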
b_i is the average change in Y per one-unit change in X_i, with all other predictor variables held constant.
β_i is a Standardized Partial Slope

Predict Z_Y from Z2.
Predict Z1 from Z2.
Predict the residuals of Z_Y from the residuals of Z1.
The slope of the resulting regression is β1. β1 is the number of standard deviations that Y changes per standard-deviation change in X1 after we have removed the effect of X2 from both X1 and Y.
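A standardized slope can also be obtained by rescaling the raw slope, since β_i = b_i(s_Xi / s_Y). A short Python/NumPy sketch with invented data (not from the lecture) showing that regressing z scores gives the same β1 as rescaling b1:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200
x2 = rng.normal(size=n)
x1 = 0.6 * x2 + rng.normal(size=n)
y = 2.0 + 1.5 * x1 - 0.8 * x2 + rng.normal(size=n)

z = lambda v: (v - v.mean()) / v.std()   # convert to z scores

def fit(cols, y):
    """Least-squares coefficients (intercept first)."""
    X = np.column_stack([np.ones(len(y))] + cols)
    return np.linalg.lstsq(X, y, rcond=None)[0]

# Raw-score slopes b1, b2 and standardized slopes beta1, beta2
_, b1, b2 = fit([x1, x2], y)
_, beta1, beta2 = fit([z(x1), z(x2)], z(y))

# The standardized slope is the raw slope rescaled by the SD ratio
print(beta1, b1 * x1.std() / y.std())    # equal
```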
R²

R² can be interpreted like a simple r²: as a proportion of variance explained.
Unless R² = 1, the variance of the predicted Y scores is less than the variance of the actual Y scores. What is this phenomenon called?
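The shrinkage in the variance of the predicted scores is exact: for least-squares regression with an intercept, var(Ŷ)/var(Y) = R². A small Python/NumPy sketch with invented data (illustrative only):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200
x = rng.normal(size=n)
y = 1.0 + 0.8 * x + rng.normal(size=n)   # R^2 well below 1

X = np.column_stack([np.ones(n), x])
coef = np.linalg.lstsq(X, y, rcond=None)[0]
yhat = X @ coef

r2 = 1 - ((y - yhat) ** 2).sum() / ((y - y.mean()) ** 2).sum()

# Predictions are pulled toward the mean: their variance is R^2 of y's
print(yhat.var() / y.var(), r2)          # the two values are equal
```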
Regression Towards the Mean
Squared Correlation Coefficients
Squared Semipartial Correlation

sr_i² is the proportion of all the variance in Y that is associated with one predictor but not with any of the other predictors. Equivalently, it is the decrease in R² that results from removing that predictor from the model.
sr_i

Predict X1 from X2. sr_i is the simple correlation between ALL of Y and the part of X1 that is not related to any of the other predictors.
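Both descriptions of sr² give the same number, which can be verified numerically. A Python/NumPy sketch on invented data (illustrative only): sr² computed as the drop in R² when X1 is removed equals the squared correlation between all of Y and the X2-free part of X1.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200
x2 = rng.normal(size=n)
x1 = 0.6 * x2 + rng.normal(size=n)
y = 2.0 + 1.5 * x1 - 0.8 * x2 + rng.normal(size=n)

def r2(X, y):
    """R^2 of y regressed on the columns of X (with intercept)."""
    X = np.column_stack([np.ones(len(y)), X])
    coef = np.linalg.lstsq(X, y, rcond=None)[0]
    e = y - X @ coef
    return 1 - (e ** 2).sum() / ((y - y.mean()) ** 2).sum()

# sr^2 as a drop in R^2: full model minus the model without X1
sr2_drop = r2(np.column_stack([x1, x2]), y) - r2(x2, y)

# sr^2 as a correlation: ALL of Y with the X2-free part of X1
a, b = np.linalg.lstsq(np.column_stack([np.ones(n), x2]), x1, rcond=None)[0]
x1_resid = x1 - a - b * x2
sr2_corr = np.corrcoef(y, x1_resid)[0, 1] ** 2

print(sr2_drop, sr2_corr)    # identical
```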
Squared Partial Correlation

Of the variance in Y that is not associated with any of the other predictors, what proportion is associated with the variance in X_i?
sr² Related to pr²

pr_i² = sr_i² / (1 − R²_reduced), where R²_reduced is the R² for the model with X_i removed.
pr_i

Predict Y from X2 and predict X1 from X2. pr_i is the r between Y partialled for all the other predictors and X_i partialled for all the other predictors.
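The two routes to pr² can likewise be checked: correlating the two sets of residuals directly gives the same value as rescaling sr² by 1 − R²_reduced. A Python/NumPy sketch on invented data (illustrative only):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200
x2 = rng.normal(size=n)
x1 = 0.6 * x2 + rng.normal(size=n)
y = 2.0 + 1.5 * x1 - 0.8 * x2 + rng.normal(size=n)

def resid(y, x):
    """Residuals of y after a simple regression on x."""
    X = np.column_stack([np.ones(len(y)), x])
    return y - X @ np.linalg.lstsq(X, y, rcond=None)[0]

def r2(X, y):
    """R^2 of y regressed on the columns of X (with intercept)."""
    X = np.column_stack([np.ones(len(y)), X])
    e = y - X @ np.linalg.lstsq(X, y, rcond=None)[0]
    return 1 - (e ** 2).sum() / ((y - y.mean()) ** 2).sum()

# pr^2 directly: correlate Y and X1 after both are partialled for X2
pr2_direct = np.corrcoef(resid(y, x2), resid(x1, x2))[0, 1] ** 2

# pr^2 from sr^2: sr^2 / (1 - R^2 of the model without X1)
sr2 = r2(np.column_stack([x1, x2]), y) - r2(x2, y)
pr2_from_sr2 = sr2 / (1 - r2(x2, y))

print(pr2_direct, pr2_from_sr2)    # identical
```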
Commonality Analysis

One can estimate the size of the redundant area C. See my document Commonality Analysis.
A Demonstration

Partial.sas – run this SAS program to obtain an illustration of the partial nature of the coefficients obtained in a multiple regression analysis.
More Details

See Multiple R² and Partial Correlation/Regression Coefficients.
Relative Weights Analysis

Partial regression coefficients exclude variance that is shared among predictors, so it is possible to have a large R² even though none of the predictors has a substantial partial coefficient. There are now methods by which one can partition R² into pseudo-orthogonal portions, each portion representing the relative contribution of one predictor variable.
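One such partitioning is Johnson's (2000) relative weights method: replace the predictors with their closest orthogonal counterparts (via a singular value decomposition), partition R² through those orthogonal variables, and allocate each share back to the original predictors. The sketch below, on invented data, is a minimal NumPy implementation of that idea; the key property it demonstrates is that the raw weights sum to R².

```python
import numpy as np

rng = np.random.default_rng(1)
n = 500
x2 = rng.normal(size=n)
x1 = 0.7 * x2 + rng.normal(size=n)       # predictors deliberately redundant
x3 = 0.3 * x1 + rng.normal(size=n)
y = x1 + 0.5 * x2 + 0.2 * x3 + rng.normal(size=n)

def z(a):
    return (a - a.mean(0)) / a.std(0)

X = z(np.column_stack([x1, x2, x3]))
yz = z(y)
p = X.shape[1]

# Closest orthogonal counterpart of X: X = P D Q', take Z = P Q'
P, d, Qt = np.linalg.svd(X, full_matrices=False)
Z = P @ Qt

# Correlations of each orthogonal variable with the predictors and with y
lam = np.array([[np.corrcoef(Z[:, k], X[:, j])[0, 1] for j in range(p)]
                for k in range(p)])
beta = np.array([np.corrcoef(Z[:, k], yz)[0, 1] for k in range(p)])

raw = (lam ** 2).T @ beta ** 2           # raw relative weights, one per predictor
rescaled = raw / raw.sum()               # rescaled weights sum to 1 (100%)

# R^2 of the ordinary multiple regression, for comparison
coef = np.linalg.lstsq(X, yz, rcond=None)[0]
r2 = 1 - ((yz - X @ coef) ** 2).sum() / (yz ** 2).sum()

print(raw, raw.sum(), r2)                # raw weights sum to R^2
```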
Proportions of Variance

Predictor    r²      sr²     Raw Relative Weight   Rescaled Relative Weight
Teach        .646*   .183*   .344*                 .456
Knowledge    .465*   .071*   .238*                 .316
Exam         .355*   .004    .124*                 .164
Grade        .090*   .007    .027                  .035
Enroll       .057    .010    .022                  .029
If the predictors were orthogonal, the sum of the r² values would equal R², and the r² values would be identical to the sr² values. Here the sum of the sr² values is .275 and R² = .755, so .755 − .275 = .480: 48% of the variance in Overall is excluded from the squared semipartials due to redundancy among the predictors.
Notice That

The sum of the raw relative weights is .755, the value of R². The sum of the rescaled relative weights is 100%. The sr² for Exam is not significant, but its raw relative weight is significant.