DCM – the theory. Bayesian inference, DCM examples, choosing the best model, group analysis.


1 DCM – the theory

2 Bayesian inference; DCM examples; choosing the best model; group analysis

3 Bayesian inference. Classical inference tests a null hypothesis: is the effect significantly different from zero? In SPM terms, is the observed activation due to the effect of the regressor rather than random noise? Bayesian inference instead gives the probability that the activation exceeds a set threshold, given the data. This is derived from the posterior probability (calculated using Bayes' rule), and there are no false positives in the classical sense, so no correction for multiple comparisons is needed.

4 Bayes' rule. If A and B are two separate but possibly dependent random events, then: the probability of A and B occurring together is P[A,B]; the conditional probability of A, given that B occurs, is P[A|B]; the conditional probability of B, given that A occurs, is P[B|A]. These are related by

P[A,B] = P[A|B] P[B] = P[B|A] P[A] (1)

Dividing the right-hand pair of expressions by P[B] gives Bayes' rule:

P[A|B] = P[B|A] P[A] / P[B] (2)

In probabilistic inference, we try to estimate the most probable underlying model for a random process, based on observed data. If A represents a given set of model parameters and B represents the set of observed data values, then: P[A] is the prior probability of the model A (in the absence of any evidence); P[B] is the probability of the evidence B; P[B|A] is the likelihood that the evidence B was produced, given that the model was A; and P[A|B] is the posterior probability of the model being A, given that the evidence is B. In short:

Posterior probability ∝ Likelihood × Prior probability
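Equation (2) can be checked with a quick numerical sketch; the probabilities below are invented for illustration, not taken from the slides.

```python
# Worked example of Bayes' rule: P[A|B] = P[B|A] * P[A] / P[B].
# All numbers are illustrative.

def posterior(likelihood, prior, evidence):
    """Posterior probability of model A given data B."""
    return likelihood * prior / evidence

# Suppose model A has prior P[A] = 0.3, the data B arise with
# likelihood P[B|A] = 0.8 under A, and the overall evidence is P[B] = 0.5:
p_a_given_b = posterior(likelihood=0.8, prior=0.3, evidence=0.5)
print(p_a_given_b)  # 0.48
```

Note that the posterior scales with both the likelihood and the prior, which is exactly the proportionality stated above.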

5 Bayes' rule in DCM. The likelihood is derived from the error and the confounds (e.g. drift). The priors are empirical (haemodynamic parameters) and non-empirical (e.g. shrinkage priors, temporal scaling). The posterior probability is calculated for each effect, and the probability that it exceeds a set threshold is expressed as a percentage.
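Under a Gaussian approximation to the posterior, that percentage is just the upper tail mass beyond the threshold. A minimal sketch (not SPM code; the mean and standard deviation below are invented):

```python
import math

# Probability that an effect exceeds a set threshold, given a
# Gaussian posterior with mean m and standard deviation s.
def exceedance_prob(m, s, threshold=0.0):
    """P(effect > threshold) under a Normal(m, s^2) posterior."""
    z = (threshold - m) / s
    return 0.5 * math.erfc(z / math.sqrt(2.0))

# e.g. a coupling parameter with posterior mean .37 and std .2:
p = exceedance_prob(0.37, 0.2)      # probability the effect is > 0
print(f"{100 * p:.0f}%")
```

This is how a point estimate like ".37" can be reported together with a percentage such as "(97%)".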

6 An example: SPM{F} map showing three regions, A1, WA and A2.

7 Regions A1, WA, A2; inputs: stimulus (perturbation) u1 and set (context) u2.

8 Regions A1, WA, A2 with inputs u1 (stimulus/perturbation) and u2 (set/context); full intrinsic connectivity: a.

9 Regions A1, WA, A2 with inputs u1 and u2; full intrinsic connectivity: a; u1 activates A1: c.

10 Regions A1, WA, A2 with inputs u1 and u2; full intrinsic connectivity: a; u1 may modulate self-connections → induced connectivities: b1; u1 activates A1: c.

11 Regions A1, WA, A2 with inputs u1 and u2; full intrinsic connectivity: a; u1 may modulate self-connections → induced connectivities: b1; u2 may modulate anything → induced connectivities: b2; u1 activates A1: c.
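The a, b and c effects built up over the last few slides combine in the bilinear neuronal state equation dx/dt = (A + u1·B1 + u2·B2)x + Cu. A minimal simulation sketch (the matrix values, step size and two-region setup are invented for illustration, not taken from the slides):

```python
# Euler integration of the bilinear DCM state equation:
#   dx/dt = (A + sum_k u_k * B_k) x + C u
# A holds the intrinsic connections (a), B_k the input-induced
# modulations (b1, b2), and C the direct extrinsic influences (c).

def dcm_step(x, u, A, B, C, dt=0.01):
    """One Euler step of the bilinear neuronal state equation."""
    n = len(x)
    dx = [0.0] * n
    for i in range(n):
        for j in range(n):
            # effective connectivity = intrinsic + input-induced modulation
            eff = A[i][j] + sum(u[k] * B[k][i][j] for k in range(len(u)))
            dx[i] += eff * x[j]
        for k in range(len(u)):
            dx[i] += C[i][k] * u[k]   # direct driving input
    return [x[i] + dt * dx[i] for i in range(n)]

# Two regions, two inputs; negative self-connections keep the system stable.
A = [[-1.0, 0.0], [0.4, -1.0]]
B = [[[0.0, 0.0], [0.3, 0.0]],    # B1: u1 strengthens the region-1 -> region-2 path
     [[0.0, 0.0], [0.0, 0.0]]]    # B2: no modulation by u2 in this sketch
C = [[1.0, 0.0], [0.0, 0.0]]      # u1 drives region 1 directly
x = [0.0, 0.0]
for _ in range(100):
    x = dcm_step(x, [1.0, 0.0], A, B, C)
```

With u1 switched on, activity builds in region 1 through C and propagates to region 2 through the modulated connection, which is the mechanism the next slides estimate from data.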

12 Estimated model for regions A1, WA, A2 with inputs u1, u2; parameter estimates (posterior probability that each exceeds threshold in brackets): .92 (100%), .38 (94%), .47 (98%), .37 (91%), -.62 (99%), -.51 (99%), .37 (100%).

13 Regions A1, WA, A2; inputs u1, u2. Intrinsic connectivity a: .92 (100%), .38 (94%), .47 (98%).

14 Regions A1, WA, A2; inputs u1, u2. Intrinsic connectivity a: .92 (100%), .38 (94%), .47 (98%); extrinsic influence c: .37 (100%).

15 Regions A1, WA, A2; inputs u1, u2. Intrinsic connectivity a: .92 (100%), .38 (94%), .47 (98%); extrinsic influence c: .37 (100%); connectivity induced by u1, b1: -.62 (99%), -.51 (99%).

16 Regions A1, WA, A2; inputs u1, u2. Intrinsic connectivity a: .92 (100%), .38 (94%), .47 (98%); extrinsic influence c: .37 (100%); connectivity induced by u1, b1: -.62 (99%), -.51 (99%) (saturation).

17 Regions A1, WA, A2; inputs u1, u2. Intrinsic connectivity a: .92 (100%), .38 (94%), .47 (98%); extrinsic influence c: .37 (100%); connectivity induced by u1, b1: -.62 (99%), -.51 (99%) (saturation); connectivity induced by u2, b2: .37 (91%).

18 Regions A1, WA, A2; inputs u1, u2. Intrinsic connectivity a: .92 (100%), .38 (94%), .47 (98%); extrinsic influence c: .37 (100%); connectivity induced by u1, b1: -.62 (99%), -.51 (99%) (saturation); connectivity induced by u2, b2: .37 (91%) (adaptation).

19 Another example. Design: moving dots (u1), attention (u2).

20 Another example. Design: moving dots (u1), attention (u2). SPM analysis: V1, V5, SPC, IFG.

21 Another example. Design: moving dots (u1), attention (u2). SPM analysis: V1, V5, SPC, IFG. Literature: V5 is motion-sensitive.

22 Another example. Design: moving dots (u1), attention (u2). SPM analysis: V1, V5, SPC, IFG. Literature: V5 is motion-sensitive. Previous connectivity analyses: SPC modulates V5, IFG modulates SPC.

23 Another example. Design: moving dots (u1), attention (u2). SPM analysis: V1, V5, SPC, IFG. Literature: V5 is motion-sensitive. Previous connectivity analyses: SPC modulates V5, IFG modulates SPC. → Constraints: intrinsic connectivity V1 → V5 → SPC → IFG; u1 drives V1; u2 modulates connections among V1, V5, SPC, IFG; u3 (motion) modulates connections among V1, V5, SPC, IFG.

24 Another example. Design: moving dots (u1), attention (u2). SPM analysis: V1, V5, SPC, IFG. Literature: V5 is motion-sensitive. Previous connectivity analyses: SPC modulates V5, IFG modulates SPC. → Constraints: intrinsic connectivity V1 → V5 → SPC → IFG; u1 (photic) drives V1; u2 modulates connections among V1, V5, SPC, IFG; u3 (motion) modulates connections among V1, V5, SPC, IFG.
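One way to read these constraints is as DCM structure matrices (1 = connection allowed, 0 = absent). A sketch in Python; the region ordering and the exact modulation targets (attention on SPC → V5 and IFG → SPC, motion on V1 → V5) are assumptions inferred from the stated previous analyses, not given explicitly on the slide:

```python
# Convention: entry [i][j] is the connection from region j to region i.
regions = ["V1", "V5", "SPC", "IFG"]

# Intrinsic connectivity a: serial hierarchy V1 -> V5 -> SPC -> IFG
A = [[0, 0, 0, 0],
     [1, 0, 0, 0],   # V1 -> V5
     [0, 1, 0, 0],   # V5 -> SPC
     [0, 0, 1, 0]]   # SPC -> IFG

# Direct (extrinsic) inputs c: photic input u1 drives V1.
# Columns: u1 (photic), u2 (attention), u3 (motion).
C = [[1, 0, 0],
     [0, 0, 0],
     [0, 0, 0],
     [0, 0, 0]]

# Modulatory effects b (assumed targets): attention (u2) on
# SPC -> V5 and IFG -> SPC; motion (u3) on V1 -> V5.
B2 = [[0, 0, 0, 0],
      [0, 0, 1, 0],  # SPC modulates V5
      [0, 0, 0, 1],  # IFG modulates SPC
      [0, 0, 0, 0]]
B3 = [[0, 0, 0, 0],
      [1, 0, 0, 0],  # motion modulates V1 -> V5
      [0, 0, 0, 0],
      [0, 0, 0, 0]]
```

Specifying which entries are allowed to be non-zero is exactly how the constraints restrict the model before fitting.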

25 Another example: estimated parameters. [Diagram: V1, V5, SPC, IFG with inputs photic (u1), attention (u2) and motion (u3); estimates (posterior probabilities in brackets): .82 (100%), .42 (100%), .37 (90%), .69 (100%), .47 (100%), .65 (100%), .52 (98%), .56 (99%).]

26 Comparison of models. Model 1: attentional modulation of V1 → V5. Model 2: attentional modulation of SPC → V5. Model 3: attentional modulation of both V1 → V5 and SPC → V5. [Diagrams: the three fitted models with their parameter estimates.] Bayesian model selection: model 1 is better than model 2, and models 1 and 3 are equal. → Decision for model 1: in this instance, attention primarily modulates V1 → V5.

27 Comparison of models uses Bayesian inference again: the evidence for each model depends on both its goodness of fit and its complexity, so the selected model is the one with the best trade-off between the two.
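Once each model's (log) evidence is available, two models are compared through their Bayes factor. A minimal sketch; the log-evidence values are invented for illustration:

```python
import math

# Bayes factor BF12 = p(y|m1) / p(y|m2), computed from log evidences.
# The log evidence already penalises complexity, so a higher value
# does not simply mean a better fit.
def bayes_factor(log_ev_1, log_ev_2):
    """Bayes factor in favour of model 1 over model 2."""
    return math.exp(log_ev_1 - log_ev_2)

bf = bayes_factor(-120.0, -123.0)   # e^3, roughly 20
```

By the usual convention, a Bayes factor around 3 counts as positive evidence and around 20 as strong evidence for model 1; a value near 1 means the models are essentially equal, as for models 1 and 3 on the previous slide.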

28 Inference about DCM parameters: group analysis. In analogy to random-effects analyses in SPM, 2nd-level analyses can be applied to DCM parameters: fit identical models separately for each subject, select the bilinear parameters of interest, then test them with a one-sample t-test (is the parameter > 0?), a paired t-test (is parameter 1 > parameter 2?), or a repeated-measures ANOVA (e.g. when there are multiple sessions per subject).

29 "Laughing is a celebration of the good, and it's also how we deal with the bad. Laughing, like crying, is a good way of eliminating toxins from the body. Since the mind and body are connected, you use an amazing amount of muscles when you laugh." http://www.balloonhat.com/

