1  G89.2229 Lect 6M
G89.2229 Multiple Regression, Week 6 (Monday)
» Comparing two coefficients within a regression equation
» Analysis of sets of variables: partitioning the sums of squares
» Polynomial curve fitting
2  G89.2229 Lect 6M
Comparing Two Regression Coefficients
Suppose we have
» Y = b₀ + b₁X₁ + b₂X₂ + e
Suppose one theory states that b₁ = b₂, but another theory says that they should differ. How do we carry out the test? Create a Wald statistic:
» In the numerator, the difference between the two estimated coefficients, b̂₁ - b̂₂
» In the denominator, the standard error of the numerator
Recall for two variables W and Z:
  V(k₁W + k₂Z) = k₁²σ_W² + k₂²σ_Z² + 2k₁k₂σ_WZ
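A minimal sketch of this Wald test in NumPy, assuming a design matrix X whose columns are an intercept, X₁, and X₂ (the function name and variable layout are illustrative, not from the lecture):

```python
# Sketch of the Wald test for H0: b1 = b2, using plain NumPy.
# Assumes X has columns [1, X1, X2] and y is the outcome vector.
import numpy as np

def wald_b1_eq_b2(X, y):
    n, p = X.shape
    XtX_inv = np.linalg.inv(X.T @ X)
    b = XtX_inv @ X.T @ y                  # OLS estimates (b0, b1, b2)
    e = y - X @ b                          # residuals
    mse = e @ e / (n - p)                  # residual variance estimate
    cov_b = mse * XtX_inv                  # estimated Var(b-hat)
    # Numerator: b1 - b2; denominator uses the variance rule with k1 = 1, k2 = -1:
    # V(b1 - b2) = V(b1) + V(b2) - 2*Cov(b1, b2)
    diff = b[1] - b[2]
    se_diff = np.sqrt(cov_b[1, 1] + cov_b[2, 2] - 2 * cov_b[1, 2])
    return diff / se_diff                  # compare to a t distribution on n - p df
```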
3  G89.2229 Lect 6M
Decomposition of Regression and Residual Variance
Step 1: Estimate regression coefficients using OLS and compute predicted (fitted) values of Y (Ŷ).
Step 2: Estimate the Regression Sums of Squares as SSR = Σ(Ŷ - Ȳ)², MSR = SSR/df
Step 3: Estimate the Residual Sums of Squares as SSE = Σe², MSE = SSE/df
Under H₀, MSR/MSE is distributed as central F on (q, n-q-1) df

Source      df       SS           MS
Regression  q        Σ(Ŷ - Ȳ)²    SSR/q
Residual    n-q-1    Σe²          SSE/(n-q-1)
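A short sketch of this decomposition, assuming the design matrix includes a column of ones plus q predictors (variable names are my own, not from the course materials):

```python
# Sketch of the SSR/SSE decomposition and the overall F test.
# Assumes X is an n x (q+1) design matrix with an intercept column.
import numpy as np
from scipy import stats   # only used for the F-distribution p-value

def anova_table(X, y):
    n, p = X.shape
    q = p - 1                                   # number of predictors
    b = np.linalg.lstsq(X, y, rcond=None)[0]    # OLS coefficients
    y_hat = X @ b                               # fitted values
    ssr = np.sum((y_hat - y.mean()) ** 2)       # regression SS on q df
    sse = np.sum((y - y_hat) ** 2)              # residual SS on n-q-1 df
    msr, mse = ssr / q, sse / (n - q - 1)
    F = msr / mse                               # ~ F(q, n-q-1) under H0
    return {"SSR": ssr, "SSE": sse, "MSR": msr, "MSE": mse,
            "F": F, "p": stats.f.sf(F, q, n - q - 1)}
```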
4  G89.2229 Lect 6M
Test of Incremental R² Due to X_q
Hierarchical Regression
» Fit the reference model with X₁, X₂, ..., X_(q-1)
  Determine its Regression Sums of Squares
  This determines the R² of the reference model
» Fit the expanded model with X_q added to the reference model
  Determine the increase in Regression Sums of Squares (SS_q)
  » on 1 df for the single predictor X_q
  Determines the R² increment ("semipartial squared correlation")
» Determine Sums of Squares and Mean Squares for the residual from the expanded model
  » MSE is the mean square for the residual, on (n-q-1) degrees of freedom
» Under the null hypothesis H₀: B_q = 0, MS_q is simply fitted random variation, and MS_q/MSE ~ F[1, n-q-1]
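A sketch of the hierarchical (incremental-SS) F test under the same assumptions as above; X_ref is a reference design matrix (intercept included) and x_q is the added predictor (names are illustrative):

```python
# Sketch of the 1-df incremental F test for adding x_q to a reference model.
import numpy as np
from scipy import stats

def incremental_F(X_ref, x_q, y):
    n = len(y)
    X_full = np.column_stack([X_ref, x_q])
    q = X_full.shape[1] - 1                     # predictors in the expanded model

    def regression_ss(X):
        b = np.linalg.lstsq(X, y, rcond=None)[0]
        y_hat = X @ b
        return np.sum((y_hat - y.mean()) ** 2), y_hat

    ssr_ref, _ = regression_ss(X_ref)
    ssr_full, y_hat_full = regression_ss(X_full)
    ss_q = ssr_full - ssr_ref                   # increment on 1 df
    mse = np.sum((y - y_hat_full) ** 2) / (n - q - 1)   # residual MS, expanded model
    F = ss_q / mse                              # ~ F[1, n-q-1] under H0: B_q = 0
    return F, stats.f.sf(F, 1, n - q - 1)
```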
5  G89.2229 Lect 6M
Example: Predicting Anger on Day 29 with Day 28 Measures
» Does Anger on day 28 improve the fit of Anger on day 29 after four other moods have been included in the model?
» Do two emotional support variables on day 28 improve the fit of Anger on day 29 after five moods have been included?
6  G89.2229 Lect 6M
Numerical Results
7  G89.2229 Lect 6M
What to do when the relation of Y to X is not linear?
E.g., confidence regarding an opinion as a function of expressed attitude level:
» The stronger the attitude, the more confident respondents tend to be
» Relation is not monotonic
Alliance between therapist and patient as a function of time in treatment:
» Alliance is initially high but decreases as the hard work of therapy begins
» Alliance recovers for successful therapy
» Relationship is not monotonic
[Figure: sketch of a nonlinear relation of Y to X]
8  G89.2229 Lect 6M
Other Examples of Nonlinear Relations
Developmental growth as a function of exposure to environmental toxins such as lead
» Low levels of exposure may not reveal deficits
Relation of productivity to impact
9  G89.2229 Lect 6M
Modeling Nonlinear Patterns
It is sometimes possible to transform the outcome so that it can be fit by a truly linear function
» Typically works for monotonic nonlinear relationships
» Often a nonlinear shape is accompanied by heteroscedasticity
» Box-Cox transformations provide a large class of alternatives
Call the possible transformations h(Y), e.g.
» h(Y) = Y
» h(Y) = ln(Y)
» h(Y) = (Y^a - 1)/a
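A minimal sketch of the Box-Cox family named on this slide, written as a small helper function (the function name is hypothetical); ln(Y) is the limiting case as a goes to 0, and Y must be positive:

```python
# Sketch of the Box-Cox transformation h(Y) = (Y**a - 1) / a, with ln(Y) at a = 0.
import numpy as np

def box_cox(y, a):
    y = np.asarray(y, dtype=float)   # requires y > 0
    if a == 0:
        return np.log(y)
    return (y ** a - 1.0) / a

# e.g. box_cox(y, 0.5) behaves like a square-root transform,
#      box_cox(y, 0)   gives ln(Y)
```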
10  G89.2229 Lect 6M
Polynomial models provide fitting flexibility
Consider the polynomial equation:
» Y = b₀ + b₁X + b₂X² + ... + b_pX^p + e
  b₁ is the (partial) linear effect
  b₂ is the (partial) quadratic effect
  b₃ is the (partial) cubic effect, and so on
» Each term adds the ability to fit certain patterns of variation.
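One way to fit such a model is to build the powers of X as columns of the design matrix and run ordinary least squares; a brief sketch (function and variable names are my own):

```python
# Sketch: fit Y = b0 + b1*X + b2*X^2 + ... + bp*X^p by OLS.
import numpy as np

def fit_polynomial(x, y, degree=2):
    x = np.asarray(x, dtype=float)
    X = np.column_stack([x ** k for k in range(degree + 1)])   # columns 1, X, X^2, ...
    b = np.linalg.lstsq(X, y, rcond=None)[0]                   # b0, b1, ..., bp
    return b, X @ b                                            # coefficients, fitted values
```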
11  G89.2229 Lect 6M
Application of polynomials
Polynomial equations are very flexible in fitting a variety of curves
» Often we need only consider quadratic or cubic level equations
» Sometimes theoretical considerations tell us how many terms we need
» Many researchers use polynomial equations simply to be descriptive
When reviewing research that uses polynomials, ask whether they are being used as structural or descriptive models.
12  G89.2229 Lect 6M
Constructing polynomial fits
Two approaches for constructing polynomial fits
» Simply create squared, cubed versions of X
» Center first: create squared, cubed versions of (X - C), e.g. X_c = (X - X̄)
  X_c and X_c² will have little or no correlation
Both approaches yield identical fits.
Centered polynomials are easier to interpret.
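A small simulated illustration (hypothetical data, for intuition only) of why centering helps: the raw predictor and its square are nearly collinear, while the centered versions are nearly uncorrelated when X is roughly symmetric:

```python
# Sketch: correlation of X with X^2 versus centered X_c with X_c^2.
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(loc=50, scale=10, size=200)   # hypothetical raw predictor
x_c = x - x.mean()                           # centered predictor

print(np.corrcoef(x, x ** 2)[0, 1])          # close to 1 for an all-positive X
print(np.corrcoef(x_c, x_c ** 2)[0, 1])      # close to 0
```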
13  G89.2229 Lect 6M
Interpreting polynomial regression
Suppose we have the model
» Y = b₀ + b₁X₁ + b₂X₂ + e
» b₁ is interpreted as the effect of X₁ when X₂ is adjusted
Suppose X₁ = W, X₂ = W²
» What does it mean to "hold constant" X₂ in this context?
When the zero point is interpretable
» The linear term is the slope at point 0
» The quadratic term is the acceleration at point 0
» The cubic term is the change in acceleration at point 0
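A short derivation (my own illustration, consistent with the slide's reading of the coefficients) showing why the lower-order terms describe the curve at the zero point; note the second derivative at 0 is 2b₂, so b₂ governs the acceleration there up to that constant:

```latex
% For Y = b_0 + b_1 W + b_2 W^2 + b_3 W^3:
\frac{dY}{dW} = b_1 + 2 b_2 W + 3 b_3 W^2
  \quad\Rightarrow\quad \left.\frac{dY}{dW}\right|_{W=0} = b_1
\frac{d^2 Y}{dW^2} = 2 b_2 + 6 b_3 W
  \quad\Rightarrow\quad \left.\frac{d^2 Y}{dW^2}\right|_{W=0} = 2 b_2
```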