Bayesian Evaluation of Informative Hypotheses in SEM using Mplus
Rens van de Schoot
a.g.j.vandeschoot@uu.nl
rensvandeschoot.wordpress.com
Informative hypotheses
Null hypothesis testing
It is difficult to evaluate specific expectations using classical null hypothesis testing:
- We are not always interested in the null hypothesis
- 'Accepting' the alternative hypothesis does not answer the research question
- There is no direct relation between the hypothesis test and the expectation
- Evaluation often falls back on visual inspection
- This can yield contradictory results
Null hypothesis testing
- Theory
- Expectations
- Testing:
  - H0: nothing is going on, vs.
  - H1: something is going on, but we do not know what (the catch-all hypothesis)
Evaluating Informative Hypotheses
- Theory
- Expectations
- Evaluating the informative hypotheses directly:
  - Ha: theory/expectation 1, vs.
  - Hb: theory/expectation 2, vs.
  - Hc: theory/expectation 3, etc.
Informative Hypotheses
Hypothesized order constraints between statistical parameters:
- Order constraints: for example, β1 > β2 > β3
- Statistical parameters: means, regression coefficients, etc.
Why?
- Direct support for your expectation
- Gain in power
- Van de Schoot & Strohmeier (2011). Testing informative hypotheses in SEM increases power. International Journal of Behavioral Development, 35(2), 180-190.
Default Bayes factors
Bayes factors for informative hypotheses
As shown by Klugkist et al. (2005, Psychological Methods, 10, 477-493), the Bayes factor (BF) of an informative hypothesis Hi versus the unconstrained hypothesis Hunc can be written as

BF(i vs unc) = f_i / c_i

where f_i can be interpreted as a measure of model fit and c_i as a measure of model complexity of Hi.
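One step the Results slides use without stating it explicitly: two informative hypotheses can also be compared with each other, by taking the ratio of their Bayes factors against the common unconstrained hypothesis. In LaTeX notation (a restatement of the formula above, not an extra result):

\[ BF_{i,\mathrm{unc}} = \frac{f_i}{c_i}, \qquad BF_{2,1} = \frac{BF_{2,\mathrm{unc}}}{BF_{1,\mathrm{unc}}} = \frac{f_2 / c_2}{f_1 / c_1} \]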
Bayes factors for informative hypotheses
Model complexity, c_i:
- Can be computed before observing any data
- Determined by the number of restrictions imposed on the parameters
- The more restrictions, the lower c_i
Bayes factors for informative hypotheses
Model fit, f_i:
- Computed after observing the data
- Quantifies how well the estimates agree with the restrictions imposed (in practice, the proportion of posterior draws that satisfy the constraints)
Bayes factors for informative hypotheses
Bayesian evaluation of informative hypotheses in SEM using Mplus:
- Van de Schoot, Hoijtink, Hallquist, & Boelen (in press). Bayesian evaluation of inequality-constrained hypotheses in SEM models using Mplus. Structural Equation Modeling.
- Van de Schoot, Verhoeven, & Hoijtink (under review). Bayesian evaluation of informative hypotheses in SEM using Mplus: A black bear story.
Example: Depression
Data
Four groups:
(1) females with a high score on negative coping strategies (n = 1429)
(2) females with a low score on negative coping strategies (n = 1532)
(3) males with a high score on negative coping strategies (n = 1545)
(4) males with a low score on negative coping strategies (n = 1072)
Model
[Path diagram: depression at Time 2 regressed on life events and depression at Time 1]
Expectations
"We expected that life events at Time 1 are a stronger predictor of depression at Time 2 for girls who have a negative coping strategy than for girls with a less negative coping strategy, and that the same holds for boys. Moreover, we expected that this relation is stronger for girls with a negative coping style compared to boys with a negative coping style, and that the same holds for girls with a less negative coping style compared to boys with a less negative coping style."
Expectations
- Hi1: (β1 > β2) & (β3 > β4)
- Hi2: β1 > (β2, β3) > β4
where βg is the effect of life events at Time 1 on depression at Time 2 in group g (groups 1 to 4 as defined above).
Model
[Path diagram with the group-specific coefficients β1 through β4 for the four groups]
Bayes Factor
[Formula slide: BF(i vs unc) = f_i / c_i]
Step-by-step
To compute the Bayes factor we need estimates of f_i and c_i.
- Step 1. Formulate an inequality-constrained hypothesis.
- Step 2. Compute c_i. For simple order-restricted hypotheses this can be done by hand, in four sub-steps (2a to 2d):
Step-by-step
- Step 2a. Count the parameters in the inequality-constrained hypothesis: in our example, 4 (β1, β2, β3, β4).
- Step 2b. Order these parameters in all possible ways: there are 4! = 4 x 3 x 2 x 1 = 24 ways of ordering four parameters.
Step-by-step
- Step 2c. Count the orderings that are in line with each informative hypothesis:
  - For Hi1: (β1 > β2) & (β3 > β4), there are 6 possibilities
  - For Hi2: β1 > (β2, β3) > β4, there are 2 possibilities
Step-by-step
- Step 2d. Divide the count from step 2c by the total from step 2b:
  - c_i1 = 6/24 = 0.25
  - c_i2 = 2/24 = 0.0833
- Note that Hi2 is the most specific hypothesis and therefore receives the smallest complexity value (verified by brute force in the sketch below).
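The counting in steps 2a to 2d can be checked by brute force. A minimal sketch in base R (not from the original slides; it only reproduces the 6/24 and 2/24 counts):

# Enumerate all rank assignments of (b1, b2, b3, b4); higher rank = larger coefficient.
ranks <- expand.grid(b1 = 1:4, b2 = 1:4, b3 = 1:4, b4 = 1:4)
ranks <- ranks[apply(ranks, 1, function(r) length(unique(r)) == 4), ]  # keep the 4! = 24 permutations

h1 <- with(ranks, b1 > b2 & b3 > b4)                     # Hi1: (b1 > b2) & (b3 > b4)
h2 <- with(ranks, b1 > b2 & b1 > b3 & b2 > b4 & b3 > b4) # Hi2: b1 > (b2, b3) > b4

c_i1 <- sum(h1) / nrow(ranks)  # 6/24 = 0.25
c_i2 <- sum(h2) / nrow(ranks)  # 2/24 = 0.0833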
Step-by-step
- Step 3. Run the model in Mplus:
Mplus syntax
DATA: FILE = data.dat;
VARIABLE: NAMES ARE lif1 depr1 depr2 groups;
MISSING ARE ALL (-9999);
CLASSES ARE g(4);
KNOWNCLASS IS g(groups = 1 groups = 2 groups = 3 groups = 4);
Mplus syntax
ANALYSIS: TYPE IS MIXTURE;
ESTIMATOR = BAYES;
PROCESSORS = 32;
Mplus syntax
MODEL: %OVERALL%
depr2 ON lif1;   ! depression at Time 2 regressed on life events at Time 1
depr2 ON depr1;  ! depression at Time 2 regressed on depression at Time 1
lif1 WITH depr1; ! covariance between the Time 1 variables
[lif1 depr1 depr2]; ! means and intercepts
lif1 depr1 depr2;   ! variances and residual variances
Mplus syntax
!save the parameter estimates for each iteration:
SAVEDATA: BPARAMETERS ARE c:/Bayesian_results.dat;
Using MplusAutomation
R syntax
To install and load MplusAutomation:
R: install.packages("MplusAutomation")
R: library(MplusAutomation)
Set the working directory containing the Mplus output:
R: setwd("c:/mplus_output")
R syntax
Read the saved Bayesian parameter draws from the Mplus output:
R: btest <- getSavedata_Bparams("output.out")
Compute f_1:
R: testBParamCompoundConstraint(btest,
     "(STDYX_.G.1...DEPR2.ON.LIF_1 > STDYX_.G.2...DEPR2.ON.LIF_1) &
      (STDYX_.G.3...DEPR2.ON.LIF_1 > STDYX_.G.4...DEPR2.ON.LIF_1)")
R syntax
Compute f_2:
R: testBParamCompoundConstraint(btest,
     "(STDYX_.G.1...DEPR2.ON.LIF_1 > STDYX_.G.2...DEPR2.ON.LIF_1) &
      (STDYX_.G.3...DEPR2.ON.LIF_1 > STDYX_.G.4...DEPR2.ON.LIF_1) &
      (STDYX_.G.1...DEPR2.ON.LIF_1 > STDYX_.G.3...DEPR2.ON.LIF_1) &
      (STDYX_.G.2...DEPR2.ON.LIF_1 > STDYX_.G.4...DEPR2.ON.LIF_1)")
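For intuition: testBParamCompoundConstraint evaluates the constraint in each saved MCMC iteration, and the resulting proportion of iterations satisfying it is the estimate of f_i. The same quantity can be computed by hand. A sketch (assuming, purely for illustration, that btest can be coerced to a data frame with one row per iteration and the column names used above):

R: draws <- as.data.frame(btest)  # assumed layout: one row per MCMC draw
R: f2 <- mean(draws$STDYX_.G.1...DEPR2.ON.LIF_1 > draws$STDYX_.G.2...DEPR2.ON.LIF_1 &
              draws$STDYX_.G.3...DEPR2.ON.LIF_1 > draws$STDYX_.G.4...DEPR2.ON.LIF_1 &
              draws$STDYX_.G.1...DEPR2.ON.LIF_1 > draws$STDYX_.G.3...DEPR2.ON.LIF_1 &
              draws$STDYX_.G.2...DEPR2.ON.LIF_1 > draws$STDYX_.G.4...DEPR2.ON.LIF_1)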
Results
- f_i1 = .7573, c_i1 = 0.25
- f_i2 = .5146, c_i2 = 0.0833
Results
- BF(1 vs unc) = .7573 / .25 = 3.03
- BF(2 vs unc) = .5146 / .0833 = 6.18
- BF(2 vs 1) = 6.18 / 3.03 = 2.04
So the data support Hi2 about twice as strongly as Hi1, and both are supported over the unconstrained hypothesis.
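The same arithmetic in a few lines of R (a sketch using the f_i and c_i values reported above):

R: f  <- c(h1 = 0.7573, h2 = 0.5146)     # fit estimates from testBParamCompoundConstraint
R: cc <- c(h1 = 6/24, h2 = 2/24)         # complexities computed by hand (0.25 and 0.0833)
R: bf_unc <- f / cc                      # 3.03 and 6.18, vs. the unconstrained hypothesis
R: bf_21 <- bf_unc["h2"] / bf_unc["h1"]  # about 2.04: Hi2 vs. Hi1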
Conclusions
- An excellent tool to include prior knowledge, if available
- Direct support for your expectations!
- Gain in power