Measurement Bias Detection Through Factor Analysis Barendse, M. T., Oort, F. J., Werner, C. S., Ligtvoet, R., & Schermelleh-Engel, K.



Defining measurement bias
Measurement bias is a violation of measurement invariance: the distribution of the observed scores X given the trait T additionally depends on a violator variable V.
If V is a grouping variable, then multigroup factor analysis (MGFA) is suitable.
Bias in the intercepts: uniform bias. Bias in the factor loadings: non-uniform bias (the bias varies with t).

Restricted Factor Analysis (RFA)
Advantages of RFA over MGFA:
V can be continuous or discrete, observed or latent.
Measurement bias can be investigated with multiple violators simultaneously.
More precise parameter estimates and larger power.
Disadvantage of RFA:
Not suited for detecting nonuniform bias, which requires an interaction term (T × V).

Approaches for non-uniform bias
RFA with latent moderated structural equations (LMS): a simulation study (with a categorical V) showed performance at least as good as MGFA.
RFA with random regression coefficients in structural equation modeling (RSP): performance unknown.

This paper…
Compares three methods: MGFA, RFA with LMS, and RFA with RSP.
Types of measurement bias: uniform and nonuniform.
Types of violator: dichotomous and continuous.

Data generation (RFA)
True model: x = u + Λt + e
Uniform bias: x = u + Λt + bv + e
Nonuniform bias: x = u + Λt + bv + c(t × v) + e
t and v are bivariate standard normally distributed with correlation r; e is standard normally distributed; the intercept vector u is a null vector.
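The data-generating model on this slide can be sketched in a few lines of NumPy. The sample size, the number of items, and the common loading value (0.8) are illustrative assumptions, not values taken from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)

def generate_data(n=400, n_items=10, r=0.5, b1=0.3, c1=0.3):
    """Generate item scores x = u + lambda*t + b*v + c*(t*v) + e,
    with bias parameters (b1, c1) placed only on item 1."""
    # t (trait) and v (violator): bivariate standard normal, correlation r
    cov = np.array([[1.0, r], [r, 1.0]])
    tv = rng.multivariate_normal([0.0, 0.0], cov, size=n)
    t, v = tv[:, 0], tv[:, 1]
    u = np.zeros(n_items)             # null intercept vector
    lam = np.full(n_items, 0.8)       # assumed common loading (illustrative)
    b = np.zeros(n_items); b[0] = b1  # uniform bias on item 1 only
    c = np.zeros(n_items); c[0] = c1  # nonuniform bias on item 1 only
    e = rng.standard_normal((n, n_items))  # standard normal residuals
    x = u + np.outer(t, lam) + np.outer(v, b) + np.outer(t * v, c) + e
    return x, t, v
```

Setting `b1` and `c1` to the values in the next slide reproduces the four bias conditions (no bias, uniform, nonuniform, mixed).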

Simulation Design
For a continuous V:
Type of bias (only on item 1): no bias (b = c = 0), uniform bias (b = 0.3, c = 0), nonuniform bias (b = 0, c = 0.3), mixed bias (b = c = 0.3).
Relationship between T and V: independent (r = 0) or dependent (r = 0.5).

Simulation Design
For a dichotomous V: v = -1 for group 1 and v = +1 for group 2.
The model can then be rewritten with group-specific parameters: intercepts u - b and u + b, and loadings Λ - c and Λ + c.
Relationship between T and V: the correlation varies!
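The group-specific rewriting can be checked with a tiny numeric sketch; the parameter values below are illustrative, not from the paper:

```python
# With a dichotomous violator coded v = -1 / +1, the bias parameters fold
# into group-specific intercepts and loadings.
u, lam, b, c = 0.0, 0.8, 0.3, 0.2   # illustrative values

for v in (-1, +1):
    intercept = u + b * v   # u - b in group 1, u + b in group 2
    loading = lam + c * v   # lam - c in group 1, lam + c in group 2
    print(f"v = {v:+d}: intercept = {intercept:+.1f}, loading = {loading:.1f}")
```

This is why a dichotomous V makes MGFA directly applicable: uniform bias shows up as a group difference in intercepts, nonuniform bias as a group difference in loadings.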

The MGFA method
When V is dichotomous, apply regular MGFA.
When V is continuous, dichotomize the sample on V to form two groups.
Use a chi-square difference test with df = 2: constraints on the intercepts test uniform bias; constraints on the factor loadings test nonuniform bias.
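The chi-square difference test on this slide can be sketched as follows. The survival function is written in pure Python (valid for even df, which covers the df = 2 case used here), so no external dependencies are assumed:

```python
import math

def chi2_sf(x, df):
    """Chi-square survival function P(X > x) for even df, via the
    closed-form Erlang tail: exp(-x/2) * sum_{i<k} (x/2)^i / i!."""
    k = df // 2
    assert df == 2 * k, "this helper supports even df only"
    term, total = 1.0, 1.0
    for i in range(1, k):
        term *= (x / 2) / i
        total += term
    return math.exp(-x / 2) * total

def chisq_diff_test(chisq_constrained, df_constrained,
                    chisq_free, df_free, alpha=0.05):
    """Difference test between nested models: the constrained model
    (equal intercepts and loadings for the tested item) vs. the free one."""
    delta = chisq_constrained - chisq_free
    df_diff = df_constrained - df_free
    p = chi2_sf(delta, df_diff)
    return delta, df_diff, p < alpha
```

For df = 2 the critical value at alpha = .05 is about 5.99, since chi2_sf(x, 2) reduces to exp(-x/2).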

The RFA/LMS method
V is modeled as a latent variable with a single indicator: fix the residual variance (at 0.01) and fix the factor loading.
Three-factor model: T, V, and the interaction T × V.
Robust ML estimation; chi-square difference test with Satorra-Bentler correction: the loadings on V test uniform bias; the loadings on T × V test nonuniform bias.
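The Satorra-Bentler corrected difference test mentioned here is commonly computed with the scaled-difference formula of Satorra and Bentler (2001), e.g. as documented for Mplus; whether the paper used exactly this variant is an assumption. A minimal sketch:

```python
def sb_scaled_chisq_diff(T0, df0, c0, T1, df1, c1):
    """Satorra-Bentler scaled chi-square difference test.

    T0, df0, c0: robust chi-square, degrees of freedom, and scaling
    correction factor of the constrained (nested) model;
    T1, df1, c1: the same quantities for the free model.
    Returns the scaled difference statistic and its df.
    """
    cd = (df0 * c0 - df1 * c1) / (df0 - df1)  # difference scaling factor
    TRd = (T0 * c0 - T1 * c1) / cd            # scaled difference statistic
    return TRd, df0 - df1
```

With scaling factors of 1.0 (no correction needed), the statistic reduces to the plain chi-square difference.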

The RFA/RSP method
The interaction term is handled by replacing the fixed loading on T with a random slope that depends on V.
Robust ML estimation; chi-square difference test with Satorra-Bentler correction: the effect of V on the intercept tests uniform bias; the effect of V on the random slope tests nonuniform bias.

Single & iterative procedures
Single-run procedure: test each item once.
Iterative procedure:
1) Locate the item with the largest chi-square difference.
2) Free the constraints on the intercepts and factor loadings for this item and test the remaining items.
3) Locate the item with the largest chi-square difference.
4) …
5) Stop when no significant results remain or when half of the items have been flagged as biased.
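The iterative procedure above can be sketched as a loop; `chisq_diff` is a hypothetical placeholder that would refit the model with constraints freed for the items already flagged and return the chi-square difference for one remaining item:

```python
CRITICAL = 5.99  # chi-square critical value, df = 2, alpha = .05

def iterative_detection(items, chisq_diff, critical=CRITICAL):
    """Iteratively flag biased items until no significant chi-square
    difference remains or half of the items have been flagged."""
    flagged = set()
    while len(flagged) < len(items) // 2:
        remaining = [i for i in items if i not in flagged]
        diffs = {i: chisq_diff(i, flagged) for i in remaining}
        worst = max(diffs, key=diffs.get)   # largest chi-square difference
        if diffs[worst] < critical:         # no significant bias left: stop
            break
        flagged.add(worst)                  # free this item's intercept/loading
    return flagged
```

The stopping rule (half the items at most) matches step 5 on the slide; the single-run procedure would instead call `chisq_diff` once per item with no freeing.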

Results of MGFA – single run
Shown in Table 2. Conclusions:
1. Detection is better with a dichotomous V than with a continuous V.
2. Nonuniform bias is more difficult to detect than uniform bias.
3. Type I error rates are inflated.

Results of MGFA – iterative run
Shown in Table 3. Conclusions:
1. The iterative procedure yields power close to that of the single run.
2. The iterative procedure yields better-controlled Type I error rates.

Results of RFA/LMS & RFA/RSP – single run
Shown in Table 4 and Table 5. Conclusions:
1. LMS and RSP produce almost equivalent results.
2. Both yield larger power than MGFA with a continuous V.
3. Both yield more severely inflated Type I error rates.

Results of RFA/LMS & RFA/RSP – iterative run
Shown in Table 6. Conclusions:
1. Power is close to that of the single run.
2. Type I error rates are improved.

Results of estimation bias – MGFA
Shown in Table 7. Conclusions:
1. Bias in the parameter estimates is small.
2. Bias in the standard deviations is non-negligible.
3. Bias in the estimates is smaller for a dichotomous V (with dependent T and V).

Results of estimation bias – RFA
Shown in Tables 8 & 9. Conclusions:
1. Similar results for LMS and RSP.
2. Small bias in the parameter estimates.
3. Non-negligible bias in the standard deviations.
4. Smaller standard errors than MGFA.
5. Smaller bias in the estimates than MGFA when T and V are dependent and V is continuous.

Discussion Nonconvergence occurs with RFA/LMS

Non-convergence Summary: