G89.2247 Lecture 9: Measurement Error Models


Slide 1: Measurement Error Models
- Bias due to measurement error
- Adjusting for bias with structural equation models
- Examples
- Alternative models that remind us of the limits of non-experimental data

Slide 2: Measurement Error Models
Suppose that T and V are random variables that are correlated (with correlation ρ_TV) in the population.
- For example, T might be current depressed mood and V might be the level of support sought on the current day.
Suppose we cannot measure T and V directly, but instead have fallible measures X and Y:
- X = T + e_x
- Y = V + e_y
The reliability coefficients R_XX and R_YY tell us what proportion of the variance of X and of Y is due, respectively, to the variance of T and of V.

Slide 3: Numerical Example
Suppose Y_1 = V + E_1, with
- Var(V) = 1.00
- R_11 = Var(V)/Var(Y_1) = .53, so Var(Y_1) = 1.00/.53 ≈ 1.89
We normally estimate the reliability by getting replicate measures of Y and looking at their correlation pattern:
- Test-retest
- Internal consistency

Slide 4: Correlations of Fallible Scores
Let's suppose E[X] = E[Y] = E[T] = E[V] = 0.
- This allows us to forget about means in Var() and Cov().
Because the errors e_x and e_y are independent of the true scores and of each other, the cross terms drop out:
- Cov(X,Y) = E[XY] = E[(T + e_x)(V + e_y)]
- = E[TV] + E[T·e_y] + E[V·e_x] + E[e_x·e_y]
- = E[TV] = σ_TV
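The algebra above can be checked by simulation. The sketch below (the numbers and seed are illustrative, not from the lecture) draws correlated true scores T and V, adds independent errors, and confirms that Cov(X,Y) estimates the same quantity as Cov(T,V):

```python
import numpy as np

# Illustrative check (not from the lecture): with independent errors,
# Cov(X, Y) has the same expected value as Cov(T, V).
rng = np.random.default_rng(0)
n = 200_000
cov_true = np.array([[1.0, 0.5],
                     [0.5, 1.0]])          # Var(T) = Var(V) = 1, Cov(T, V) = 0.5
T, V = rng.multivariate_normal([0.0, 0.0], cov_true, size=n).T
X = T + rng.normal(0.0, 1.0, n)            # X = T + e_x
Y = V + rng.normal(0.0, 1.0, n)            # Y = V + e_y
print(round(np.cov(T, V)[0, 1], 2))        # close to 0.5
print(round(np.cov(X, Y)[0, 1], 2))        # also close to 0.5: error adds no covariance
```

Although the covariance is unchanged, the correlation shrinks, because Var(X) and Var(Y) are inflated by the error variances.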

Slide 5: Attenuation Results
Correlations between fallible variables are smaller than correlations between the corresponding error-free variables:
- Corr(X,Y) = ρ_TV × (R_XX × R_YY)^.5
If we can estimate R_XX and R_YY, we can estimate the attenuation effect.

Slide 6: Example
Suppose X = T + E_1 with R_XX = .53.
Suppose also Y = V + E_2 with R_YY = .39.
If Corr(T,V) = .58, then
- Corr(X,Y) = .58 × (.53 × .39)^.5 = .26
Notice that the square root of the product of the reliabilities can be thought of as a geometric mean of the two reliabilities.
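The slide's arithmetic can be verified directly (a minimal check; the variable names are mine):

```python
import math

# Attenuation formula: corr(X, Y) = corr(T, V) * sqrt(R_XX * R_YY)
r_true = 0.58            # correlation of the error-free variables
R_xx, R_yy = 0.53, 0.39  # reliabilities of the fallible measures
r_obs = r_true * math.sqrt(R_xx * R_yy)
print(round(r_obs, 2))   # -> 0.26, matching the slide
```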

Slide 7: Attenuation in Regression
Suppose we were interested in
- V = B_0 + B_1·T + r_v
but could only observe the fallible-measure regression
- Y = b_0 + b_1·X + r_y
What is the relation of b_1 to B_1?
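For the one-predictor case, the classical result is that the slope is attenuated by the reliability of the predictor: b_1 converges to R_XX·B_1. A simulation sketch (all values are my own illustrative choices, not the lecture's) shows this:

```python
import numpy as np

# Sketch (illustrative values): with independent error only in the predictor,
# the observed slope b1 converges to B1 * R_XX.
rng = np.random.default_rng(1)
n = 100_000
B1 = 0.8
T = rng.normal(0.0, 1.0, n)               # true predictor, Var(T) = 1
V = B1 * T + rng.normal(0.0, 1.0, n)      # outcome
X = T + rng.normal(0.0, 1.0, n)           # fallible predictor, R_XX = 0.5
b1 = np.cov(X, V)[0, 1] / np.var(X)       # OLS slope of V on X
print(round(b1, 2))                       # close to 0.8 * 0.5 = 0.4
```

Note that error in the outcome (Y = V + e_y) would not bias the slope; it only inflates the residual variance.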

Slide 8: Attenuation in Multiple Regression
Measurement error produces bias in regression estimates in multiple regression.
- The bias is not always attenuation toward zero.
- Error attenuates correlations, but partial regression coefficients are incompletely adjusted for covariates measured with error, often leading to estimates that are too large.
Correcting for bias using reliability estimates can be risky:
- Reliability is often underestimated.

Slide 9: Numerical Example: Regressed Change
Suppose there is a relation known to exist:
- T2 = .6·T1 + .1·X + r
- T might be distress and X might be level of environmental stress.
Suppose T1 and X are correlated .32.
Suppose, however, that T1 and T2 are measured with about 50% measurement error.
- Call the fallible measures Y1 and Y2.
- In a simulation with N = 400, the estimated coefficient for Y1 was .225 and the estimated coefficient for X was larger than .1.
- The effect of Y1 is too small and the effect of X is too big.
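The simulation can be sketched as follows. The seed, sample size, and error variances are my assumptions, so the estimates will not match the slide's .225 exactly; a large n is used here to make the asymptotic bias visible rather than the slide's N = 400 sampling variability:

```python
import numpy as np

# Hedged replication sketch of the slide's simulation; exact numbers differ.
rng = np.random.default_rng(2)
n = 400_000
cov = np.array([[1.0, 0.32],
                [0.32, 1.0]])                     # Corr(T1, X) = .32
T1, X = rng.multivariate_normal([0.0, 0.0], cov, size=n).T
T2 = 0.6 * T1 + 0.1 * X + rng.normal(0.0, 1.0, n)
# About 50% measurement error: error variance equals true-score variance
Y1 = T1 + rng.normal(0.0, 1.0, n)
Y2 = T2 + rng.normal(0.0, np.sqrt(T2.var()), n)
Z = np.column_stack([np.ones(n), Y1, X])          # design matrix with intercept
b = np.linalg.lstsq(Z, Y2, rcond=None)[0]
print(round(b[1], 2), round(b[2], 2))             # Y1 effect well below .6; X effect above .1
```

The bias in the Y1 coefficient comes from attenuation; the inflation of the X coefficient arises because the fallible Y1 only partially adjusts for T1.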

Slide 10: Adjusting Bias Through Latent Variables and SEM
If it is possible to obtain several replicate measures of key variables, it may be possible to adjust for bias due to measurement error.
If the replicate measures were perfect replications ("parallel measures"), the error models for Y_a, Y_b, Y_c would be:
- Y_a = F1 + E_a
- Y_b = F1 + E_b
- Y_c = F1 + E_c

Slide 11: Adjusting Bias Through Latent Variables and SEM
A more flexible error model is the one-factor CFA model:
- Y_a = F1 + E_a
- Y_b = λ_b·F1 + E_b
- Y_c = λ_c·F1 + E_c
where the lambdas are weights that adjust for the possible difference in scale or importance of each replicate measure (fixing the first loading to 1 sets the scale of F1).
Not only can this model be used to estimate the lambda weights, it also allows us to envision F1 as a variable in a system of regression equations.

Slide 12: Estimating Latent Variable Models
Inferences about latent variables are made by looking at the structure of the observed covariance matrix.
- If the latent variable model is correct, it implies a pattern in the covariance matrix.
- By fitting that pattern to the observed covariance matrix, we can obtain estimates.
For the CFA model:
- Var(Y) = Λ·Var(F)·Λ′ + Var(E)
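The implied covariance matrix can be computed directly for a three-indicator, one-factor model. The loadings and error variances below are illustrative choices, not values from the lecture:

```python
import numpy as np

# Implied covariances for a one-factor CFA: Var(Y) = Lambda Phi Lambda' + Theta
lam = np.array([[1.0], [0.9], [0.8]])   # loadings; first fixed to 1 to set the scale
phi = np.array([[1.0]])                 # Var(F1)
theta = np.diag([0.5, 0.6, 0.7])        # error variances Var(E_a), Var(E_b), Var(E_c)
sigma = lam @ phi @ lam.T + theta       # implied Var(Y)
print(np.round(sigma, 2))
```

Each off-diagonal element is a product of two loadings times Var(F1); this is the pattern that the fitting procedure matches against the observed covariance matrix.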

Slide 13: A Simple Covariance Structure

Slide 14: Regressed Change SEM
This model fits covariances from five observed variables. There are 10 correlations among the five variables and seven paths to be estimated, leaving 3 degrees of freedom.
[Path diagram: observed variables Y_1a, Y_1b, Y_2b, X; latent variables F1, F2]
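The degrees-of-freedom count above is simple bookkeeping:

```python
# Degrees of freedom: unique correlations among the observed variables
# minus the number of free paths to be estimated.
p = 5                                # observed variables
moments = p * (p - 1) // 2           # 10 unique correlations
free_paths = 7
df = moments - free_paths
print(moments, free_paths, df)       # 10 7 3
```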