Measurement Error Models
- Bias due to measurement error
- Adjusting for bias with structural equation models
- Examples
- Alternative models that remind us of the limits of non-experimental data
Measurement Error Models
- Suppose that T and V are random variables that are correlated (correlation ρ_TV) in the population. For example, T might be current depressed mood and V might be the level of support sought in the current day.
- Suppose we cannot measure T and V directly, but instead have fallible measures X and Y:
  X = T + e_x
  Y = V + e_y
- The reliability coefficients R_XX and R_YY tell us what proportion of the variance of X and of Y is due, respectively, to the variance of T and V.
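A minimal simulation sketch of this error model (the variances here are illustrative choices, not values from the lecture):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 100_000

# True score T and independent measurement error e_x
T = rng.normal(0, 1.0, n)        # Var(T) = 1.0
e_x = rng.normal(0, 0.8, n)      # Var(e_x) = 0.64
X = T + e_x                      # fallible measure

# Reliability: proportion of Var(X) due to Var(T)
r_xx = np.var(T) / np.var(X)     # population value: 1 / 1.64 ≈ .61
print(round(r_xx, 3))
```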
Numerical Example
- Suppose Y_1 = V + E_1, with Var(V) = 1.00 and reliability R_11 = Var(V)/Var(Y_1) = .53.
- We normally estimate the reliability by getting replicate measures of Y and looking at their correlation pattern:
  - Test-retest
  - Internal consistency
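A sketch of the test-retest idea: with two parallel replicates, their correlation estimates the reliability. The error variance below is back-calculated from the slide's Var(V) = 1.00 and R_11 = .53; the seed and sample size are my choices.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 100_000

V = rng.normal(0, 1.0, n)                   # Var(V) = 1.00
# Error SD chosen so that Var(V)/Var(Y1) = .53
e_sd = np.sqrt(1.00 / 0.53 - 1.00)
Y1 = V + rng.normal(0, e_sd, n)             # first administration
Y2 = V + rng.normal(0, e_sd, n)             # retest replicate

# The correlation of parallel measures estimates R_11
print(round(np.corrcoef(Y1, Y2)[0, 1], 3))  # ≈ .53
```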
Correlations of Fallible Scores
- Let's suppose E[X] = E[Y] = E[T] = E[V] = 0. This allows us to forget about means in Var() and Cov().
- Cov(X,Y) = E[XY] = E[(T + e_x)(V + e_y)]
  = E[TV] + E[T e_y] + E[V e_x] + E[e_x e_y]
  = E[TV] = σ_TV
- The three cross terms vanish because the errors are assumed independent of the true scores and of each other, so the fallible measures have the same covariance as the true scores.
Attenuation Results
- Correlations between fallible variables are smaller in magnitude than the correlations between the corresponding error-free variables.
- If we can estimate R_XX and R_YY, we can estimate the attenuation effect, as the derivation below shows.
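The attenuation formula follows from the covariance result on the previous slide, together with Var(X) = Var(T)/R_XX and Var(Y) = Var(V)/R_YY; a short derivation, written out here for completeness:

```latex
\operatorname{Corr}(X,Y)
  = \frac{\operatorname{Cov}(X,Y)}{\sqrt{\operatorname{Var}(X)\,\operatorname{Var}(Y)}}
  = \frac{\operatorname{Cov}(T,V)}{\sqrt{\dfrac{\operatorname{Var}(T)}{R_{XX}}\cdot\dfrac{\operatorname{Var}(V)}{R_{YY}}}}
  = \operatorname{Corr}(T,V)\,\sqrt{R_{XX}\,R_{YY}}
```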
Example
- Suppose X = T + E_1 with R_XX = .53, and Y = V + E_2 with R_YY = .39.
- If the true-score correlation Corr(T,V) = .58, then Corr(X,Y) = .58 * (.53 * .39)^.5 = .26.
- Notice that the square root of the product of the reliabilities can be thought of as a geometric mean of the two reliabilities.
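A quick check of the slide's arithmetic, plus the reverse "disattenuation" correction (dividing instead of multiplying; the function names are mine):

```python
import numpy as np

def attenuate(corr_true, rxx, ryy):
    """Observed correlation implied by the true correlation and reliabilities."""
    return corr_true * np.sqrt(rxx * ryy)

def disattenuate(corr_obs, rxx, ryy):
    """Correction for attenuation: estimated true-score correlation."""
    return corr_obs / np.sqrt(rxx * ryy)

print(round(attenuate(0.58, 0.53, 0.39), 2))     # 0.26
print(round(disattenuate(0.26, 0.53, 0.39), 2))  # ≈ 0.57
```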
Attenuation in Regression
- Suppose we were interested in V = B_0 + B_1*T + r_V, but we could only observe Y = b_0 + b_1*X + r_Y.
- What is the relation of b_1 to B_1? (See the sketch below.)
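Under the classical assumptions above (errors uncorrelated with the true scores and with each other), the standard answer is that the slope is attenuated by the reliability of the predictor; a sketch of the algebra:

```latex
b_1 = \frac{\operatorname{Cov}(X,Y)}{\operatorname{Var}(X)}
    = \frac{\operatorname{Cov}(T,V)}{\operatorname{Var}(T)/R_{XX}}
    = R_{XX}\,B_1,
\qquad\text{where } B_1 = \frac{\operatorname{Cov}(T,V)}{\operatorname{Var}(T)}.
```

Note that error in the outcome Y does not bias the slope; it only inflates the residual variance. The attenuation comes entirely from error in the predictor X.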
Attenuation in Multiple Regression
- Measurement error produces bias in multiple regression estimates.
- The bias is not always in one direction: error attenuates correlations, but partial regression coefficients are incompletely adjusted, often leading to estimates that are too large.
- Correcting for bias using reliability estimates can be risky, because reliability is often underestimated.
Numerical Example: Regressed Change
- Suppose there is a relation known to exist:
  T2 = .6*T1 + .1*X + r
  T might be distress and X might be level of environmental stress.
- Suppose T1 and X are correlated .32.
- Suppose, however, that T1 and T2 are measured with about 50% measurement error. Call the fallible measures Y1 and Y2.
- In a simulation with N = 400, the estimated coefficient on Y1 was .225 and the estimated coefficient on X was larger than .1: the effect of Y1 is too small and the effect of X is too big. A sketch of such a simulation follows below.
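A minimal simulation sketch of this setup. The random seed, the residual SD of the structural equation, and the exact error variances are my assumptions; the lecture stated only the structural coefficients, the .32 correlation, and the roughly 50% error.

```python
import numpy as np

rng = np.random.default_rng(3)
n = 400

# True scores with Corr(T1, X) = .32 (both standardized)
cov = [[1.0, 0.32], [0.32, 1.0]]
T1, X = rng.multivariate_normal([0.0, 0.0], cov, size=n).T

# The relation known to hold for the true scores
T2 = 0.6 * T1 + 0.1 * X + rng.normal(0, 0.5, n)

# Fallible measures with ~50% measurement error:
# error variance equal to the true-score variance (reliability ≈ .5)
Y1 = T1 + rng.normal(0, np.std(T1), n)
Y2 = T2 + rng.normal(0, np.std(T2), n)

# OLS of Y2 on Y1 and X (X is treated as error-free)
design = np.column_stack([np.ones(n), Y1, X])
b = np.linalg.lstsq(design, Y2, rcond=None)[0]
print({"intercept": b[0].round(3), "Y1": b[1].round(3), "X": b[2].round(3)})
# Typically: the Y1 coefficient falls well below .6 and the
# X coefficient rises above .1, as the slide describes.
```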
Adjusting Bias Through Latent Variables and SEM
- If it is possible to obtain several replicate measures of key variables, it may be possible to adjust for bias due to measurement error.
- If the replicate measures were perfect replications ("parallel measures"), the error models for Y_a, Y_b, Y_c would be:
  Y_a = F1 + E_a
  Y_b = F1 + E_b
  Y_c = F1 + E_c
Adjusting Bias Through Latent Variables and SEM
- A more flexible error model is the one-factor CFA model:
  Y_a = F1 + E_a
  Y_b = λ_b F1 + E_b
  Y_c = λ_c F1 + E_c
  where the lambdas are weights that adjust for the possible scale or importance of each replicate measure.
- Not only can this model be used to estimate the lambda weights, it allows us to envision F1 as a variable in a system of regression equations.
Estimating Latent Variable Models
- Inferences about latent variables are made by looking at the structure of the observed covariance matrix.
- If the latent variable model is correct, it creates a pattern in the covariance matrix. By fitting that pattern to the observed covariance matrix, we can obtain estimates.
- For the CFA model: Var(Y) = Λ Var(F) Λ' + Var(E)
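A numeric illustration of this covariance structure for three indicators (the loadings and variances are illustrative values, not from the lecture):

```python
import numpy as np

lam = np.array([[1.0], [0.8], [1.2]])  # loadings; lambda_a fixed at 1
var_f = 1.0                            # Var(F1)
theta = np.diag([0.5, 0.7, 0.4])       # error variances Var(E)

# Implied covariance matrix: Var(Y) = Lambda Var(F) Lambda' + Var(E)
sigma = lam @ lam.T * var_f + theta
print(sigma.round(2))
```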
A simple covariance structure
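For the one-factor model with three indicators, writing φ = Var(F1) and θ_a, θ_b, θ_c for the error variances, the implied structure is (a sketch based on the CFA equations above):

```latex
\operatorname{Var}(Y) =
\begin{pmatrix}
\lambda_a^2\,\phi + \theta_a & \lambda_a\lambda_b\,\phi & \lambda_a\lambda_c\,\phi \\
\lambda_a\lambda_b\,\phi & \lambda_b^2\,\phi + \theta_b & \lambda_b\lambda_c\,\phi \\
\lambda_a\lambda_c\,\phi & \lambda_b\lambda_c\,\phi & \lambda_c^2\,\phi + \theta_c
\end{pmatrix}
```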
Regressed Change SEM
- This model fits covariances from five observed variables. There are 10 correlations among the five variables and seven paths to be estimated, leaving 3 degrees of freedom.
- (Path diagram: latent variables F1 and F2; observed variables include Y1a, Y1b, Y2b, and X.) A code sketch follows below.
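A sketch of how such a model could be specified with the semopy package. The dataset and the variable names are hypothetical (including the second time-2 indicator, here called Y2a); the lecture itself did not show code.

```python
import pandas as pd
import semopy

# Hypothetical dataset with columns Y1a, Y1b, Y2a, Y2b, X
data = pd.read_csv("regressed_change.csv")

# F1 and F2 are the latent true scores at times 1 and 2;
# F2 is regressed on F1 and on the observed covariate X.
desc = """
F1 =~ Y1a + Y1b
F2 =~ Y2a + Y2b
F2 ~ F1 + X
"""

model = semopy.Model(desc)
model.fit(data)
print(model.inspect())  # loadings, structural paths, variances
```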