G Lect 6W: Multiple Regression, Week 6 (Wednesday)
Topics: Polynomial example; Orthogonal polynomials; Statistical power for regression
Constructing polynomial fits
Two approaches for constructing polynomial fits:
»Simply create squared and cubed versions of X
»Center first: create squared and cubed versions of (X - C), typically Xc = (X - X̄); Xc and Xc² will have little or no correlation
Both approaches yield identical fits, but centered polynomials are easier to interpret.
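The correlation claim is easy to verify numerically. The sketch below (in Python/numpy rather than the SPSS used later in the lecture, and with made-up data) compares the correlation between X and X² for raw versus centered scores:

```python
import numpy as np

# Hypothetical predictor values (e.g., credits taken in a minor, 0-12)
rng = np.random.default_rng(0)
x = rng.uniform(0, 12, size=200)

# Approach 1: raw polynomial term -- X and X^2 are strongly correlated
r_raw = np.corrcoef(x, x**2)[0, 1]

# Approach 2: center first, then square -- correlation nearly vanishes
xc = x - x.mean()
r_centered = np.corrcoef(xc, xc**2)[0, 1]

print(r_raw, r_centered)
```

With roughly symmetric X, the centered correlation is close to zero while the raw correlation is near one, which is why centered terms are easier to work with.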
Example from Cohen et al.: interest in a minor subject as a function of credits taken in the minor
Interpreting polynomial regression
Suppose we have the model
»Y = b0 + b1*X1 + b2*X2 + e
»b1 is interpreted as the effect of X1 when X2 is adjusted
Now suppose X1 = W and X2 = W². What does it mean to "hold constant" X2 in this context?
When the zero point is interpretable:
»The linear term is the slope at point 0
»The quadratic term is the acceleration at point 0
»The cubic term is the change in acceleration at point 0
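The "slope at 0 / acceleration at 0" reading corresponds to the derivatives of the fitted polynomial evaluated at W = 0. A minimal check, with arbitrary made-up coefficients (the factors 2 and 6 come from differentiating W² and W³):

```python
import numpy as np

# Hypothetical coefficients for Y-hat = b0 + b1*W + b2*W^2 + b3*W^3
b0, b1, b2, b3 = 2.0, 1.5, -0.8, 0.3

# np.poly1d takes coefficients highest degree first
p = np.poly1d([b3, b2, b1, b0])

slope_at_0 = p.deriv(1)(0.0)   # first derivative at W = 0: equals b1
accel_at_0 = p.deriv(2)(0.0)   # second derivative at W = 0: equals 2*b2
jerk_at_0 = p.deriv(3)(0.0)    # third derivative at W = 0: equals 6*b3

print(slope_at_0, accel_at_0, jerk_at_0)
```

So b1 is the instantaneous slope at the zero point, and b2 and b3 describe curvature there (up to the constants 2 and 6), which is why centering X at a meaningful value pays off.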
Orthogonal Polynomials
In experiments, one might have three or four levels of treatment with equal spacing:
»0, 1, 2
»0, 1, 2, 3
These levels can be used with polynomial models to fit
»Linear, quadratic, or cubic trends
»We would simply construct squared and cubed forms.
Making polynomials orthogonal
Over these levels the raw linear, quadratic, and cubic terms all increase together, so they are highly correlated; the quadratic curve looks much like the cubic one. Orthogonal polynomials eliminate this redundancy hierarchically:
»The constant is removed from the linear trend
»The constant and linear are removed from the quadratic
»The constant, linear, and quadratic are removed from the cubic.
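This hierarchical removal is exactly what a QR decomposition of the polynomial design matrix does: each column is orthogonalized against everything to its left. A sketch for four equally spaced levels (numpy here, not the lecture's SPSS routine):

```python
import numpy as np

# Four equally spaced treatment levels
x = np.array([0.0, 1.0, 2.0, 3.0])

# Raw polynomial design: constant, linear, quadratic, cubic
X = np.column_stack([x**0, x, x**2, x**3])

# QR orthogonalizes each column against all lower-order columns
Q, R = np.linalg.qr(X)

# The columns of Q are mutually orthogonal (orthonormal, in fact)
G = Q.T @ Q
print(np.allclose(G, np.eye(4)))  # True

# Rescaled, the linear column matches the textbook contrast (-3, -1, 1, 3)
lin = Q[:, 1] / Q[3, 1] * 3
print(lin)
```

Rescaling the remaining columns the same way recovers the familiar tabled quadratic contrast (1, -1, -1, 1) and cubic contrast (-1, 3, -3, 1) up to sign.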
Analysis with Orthogonal Polynomials
If we substitute orthogonal polynomials for the usual linear, squared, and cubed terms, we
»Recover the same polynomial fit
»Obtain effects that are useful in determining the polynomial order
Even when cubic effects are included, with orthogonal effects:
»The linear term is the average linear effect
»The quadratic term is adjusted for the linear, but not for the cubic
»The cubic term is adjusted for all lower-order terms
The regression coefficients, however, are difficult to interpret.
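That the two parameterizations recover the same fit can be checked directly: the raw and orthogonalized design matrices span the same column space, so the predicted values coincide. A small simulation with invented data:

```python
import numpy as np

rng = np.random.default_rng(1)
x = np.repeat([0.0, 1.0, 2.0, 3.0], 25)        # 4 levels, 25 cases each
y = 2 + 0.5 * x - 0.3 * x**2 + rng.normal(0, 1, x.size)

# Raw polynomial design and its orthogonal counterpart
X_raw = np.column_stack([np.ones_like(x), x, x**2, x**3])
Q, _ = np.linalg.qr(X_raw)

b_raw, *_ = np.linalg.lstsq(X_raw, y, rcond=None)
b_orth, *_ = np.linalg.lstsq(Q, y, rcond=None)

# Identical predicted values, hence identical fit and R^2
same_fit = np.allclose(X_raw @ b_raw, Q @ b_orth)
print(same_fit)  # True
```

The coefficients b_raw and b_orth differ (and the orthogonal ones are on an unintuitive scale), but the fitted curve, and therefore R², is identical.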
Computing Orthogonal Polynomials
One can copy values from Cohen et al. or other tables:
»Substitute the orthogonal versions for the original polynomial values
One can also use the SPSS MATRIX routine to implement a special transformation:
»Read the polynomial data into MATRIX.
»Use the program provided, which essentially does four things:
Computes the polynomial sums of squares and cross-products
Finds a Cholesky factor
Inverts the Cholesky factor
Transforms the polynomial values to be orthogonal
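The same four steps can be sketched in Python/numpy (standing in for the SPSS MATRIX program below; the 100 cases and levels are invented). Note numpy's cholesky returns a lower-triangular factor L with X'X = LL', so we transpose to get the upper-triangular factor C with X'X = C'C that the transformation needs:

```python
import numpy as np

# Polynomial design with a leading constant column, as in XFULL
x = np.linspace(0.0, 3.0, 100)
XFULL = np.column_stack([np.ones(100), x, x**2, x**3])

# Step 1: sums of squares and cross-products
XX = XFULL.T @ XFULL

# Step 2: Cholesky factor C with XX = C'C (upper triangular)
C = np.linalg.cholesky(XX).T

# Step 3: invert the Cholesky factor
ICHOL = np.linalg.inv(C)

# Step 4: transform the polynomial values to be orthogonal
XORTH = XFULL @ ICHOL

# The transformed columns are orthonormal: XORTH'XORTH = I
print(np.allclose(XORTH.T @ XORTH, np.eye(4)))  # True
```

The algebra behind step 4: XORTH'XORTH = C'⁻¹(X'X)C⁻¹ = C'⁻¹C'CC⁻¹ = I.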
The MATRIX program

MATRIX.
GET X /VARIABLES = X, XSQ, XCUB.
COMPUTE XFULL={MAKE(100,1,1),X}.
COMPUTE XX=T(XFULL)*XFULL.
COMPUTE XCHOL=CHOL(XX).
PRINT XCHOL.
COMPUTE ICHOL=INV(XCHOL).
PRINT ICHOL.
COMPUTE XORTH=XFULL*ICHOL.
SAVE XORTH /OUTFILE=* /VARIABLES= ORTH0 ORTH1 ORTH2 ORTH3.
END MATRIX.
MATCH FILES /FILE=* /FILE='C:\My Documents\Pat\Courses\G Regression\Examples\Reg06W.sav'.
EXECUTE.
The transformed variables