1 Modeling the fMRI Signal via Hierarchical Clustered Hidden Process Models
Stefan Niculescu, Tom Mitchell, R. Bharat Rao
Siemens Medical Solutions and Carnegie Mellon University
November 2007
2 Outline
Introduction
Hidden Process Models (HPMs)
Hierarchical Clustered HPMs
Experiments
Summary / Future Work
3 …
4
5 Motivation / Related Work
Challenges:
– High dimensionality (thousands of voxels)
– Few examples per task (tens)
– Variability in subjects' brains
Discriminative models (SVMs, kNN, ANNs): decent accuracy, but do not necessarily model the fMRI signal properly
Generative models:
– Hidden Process Models (HPMs): do not take into account similarities among voxels
– Hierarchical Clustered HPMs: extensions of standard HPMs that automatically identify clusters of "similar" voxels to improve estimation
7 Starplus Dataset
Collected by Just et al.
Trial: read a sentence, view a picture, answer whether the sentence describes the picture
40 trials, 32 time slices each (2 per second)
– picture presented first in half of the trials, sentence first in the other half
Three possible objects: star, dollar, plus
8 It is true that the star is above the plus?
9
10 Picture stimulus: + --- *
11
12 Hidden Process Models
One observation (trial): the observed signal in each voxel is modeled as the sum of the responses of the hidden processes plus Gaussian noise
N different trials; all trials and all processes have equal length T
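The generative assumption behind an HPM can be sketched in Python. This is a toy illustration, not the authors' code: the bump-shaped `process_signature` helper, the onsets, durations, and amplitudes are all hypothetical stand-ins for the learned response signatures.

```python
import numpy as np

T = 32  # time points per trial (Starplus: 32 slices at 2 per second)
rng = np.random.default_rng(0)

def process_signature(onset, duration, T):
    """A toy hemodynamic-style response: a smooth bump starting at `onset`."""
    h = np.zeros(T)
    t = np.arange(duration)
    h[onset:onset + duration] = np.sin(np.pi * t / (duration - 1))
    return h

# Two hidden processes, e.g. "Sentence" and "Picture", with assumed onsets.
sentence = process_signature(onset=0, duration=12, T=T)
picture = process_signature(onset=16, duration=12, T=T)

def simulate_trial(amp_sentence, amp_picture, sigma=0.1):
    """Observed signal = sum of scaled process responses + Gaussian noise."""
    return (amp_sentence * sentence + amp_picture * picture
            + rng.normal(0.0, sigma, T))

y = simulate_trial(1.0, 0.7)  # one simulated trial for one voxel
```

In a real HPM the process signatures themselves are estimated from the data; here they are fixed only so the additive-plus-noise structure is visible.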
14 Parameter Sharing in HPMs
Voxels in a cluster share activity of a similar shape but with different per-voxel amplitudes
15 Parameter Sharing in HPMs ~ Maximum Likelihood Estimation ~
l′(P, C) is quadratic in (P, C): each squared term is linear in P and linear in C
A maximizer of l is also a maximizer of l′
16 Parameter Sharing in HPMs ~ Maximum Likelihood Estimation ~
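The estimation idea above, that each subproblem is linear so the quadratic objective can be maximized coordinate-wise, can be sketched with alternating least squares. This is a minimal toy, not the authors' implementation: one shared shape per cluster, per-voxel amplitudes, synthetic rank-one data, and all sizes are assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)
T, V, N = 32, 10, 40  # time points, voxels in one cluster, trials

# Synthetic ground truth: one shared shape, different per-voxel amplitudes.
true_shape = np.sin(np.pi * np.arange(T) / (T - 1))
true_amp = rng.uniform(0.5, 2.0, V)
data = (true_amp[None, :, None] * true_shape[None, None, :]
        + rng.normal(0.0, 0.05, (N, V, T)))  # shape (N, V, T)

# Alternating least squares: with one factor fixed, the squared error is
# quadratic in the other, so each update is a closed-form linear solve.
shape = rng.normal(size=T)
amp = np.ones(V)
for _ in range(50):
    # Fix amplitudes, solve for the shared shape.
    num = (amp[None, :, None] * data).sum(axis=(0, 1))
    shape = num / (N * (amp ** 2).sum())
    # Fix the shape, solve for the per-voxel amplitudes.
    amp = (data * shape[None, None, :]).sum(axis=(0, 2)) \
        / (N * (shape ** 2).sum())
```

Note the usual scale indeterminacy (shape times c, amplitudes divided by c); only the product per voxel is identified, which is all the model needs.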
17 Parameter Sharing in HPMs ~ Hierarchical Partitioning Algorithm ~
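One way such a top-down partitioning can work is sketched below. This is a toy stand-in, not the algorithm from the talk: it splits candidate clusters by index halves rather than spatially, and it uses a rank-1 residual plus a fixed complexity penalty (`penalty=0.5`, an arbitrary choice) in place of the likelihood-based test.

```python
import numpy as np

rng = np.random.default_rng(2)

def cluster_cost(data):
    """Residual when all voxels in a cluster share one response shape.
    data: (trials, voxels, T); average over trials, then rank-1 fit by SVD."""
    mean = data.mean(axis=0)  # (voxels, T)
    u, s, vt = np.linalg.svd(mean, full_matrices=False)
    rank1 = s[0] * np.outer(u[:, 0], vt[0])
    return ((mean - rank1) ** 2).sum()

def partition(data, voxels, penalty=0.5):
    """Recursively split a voxel cluster while splitting reduces
    penalized cost; the penalty discourages overfitting small clusters."""
    if len(voxels) < 2:
        return [voxels]
    half = len(voxels) // 2
    left, right = voxels[:half], voxels[half:]
    whole = cluster_cost(data[:, voxels])
    split = cluster_cost(data[:, left]) + cluster_cost(data[:, right])
    if split + penalty < whole:
        return (partition(data, left, penalty)
                + partition(data, right, penalty))
    return [voxels]

# Toy data: two groups of 5 voxels with clearly different response shapes.
T, N = 32, 40
t = np.arange(T)
shape_a = np.sin(np.pi * t / (T - 1))
shape_b = np.cos(np.pi * t / (T - 1))
data = np.stack([np.concatenate([np.outer(np.ones(5), shape_a),
                                 np.outer(np.ones(5), shape_b)])
                 + rng.normal(0.0, 0.05, (10, T)) for _ in range(N)])

clusters = partition(data, list(range(10)))
```

On this synthetic input the recursion splits once and then stops, recovering the two underlying voxel groups; on real data the stopping test would compare data likelihoods as in the talk.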
19 Parameter Sharing in HPMs ~ Experiments ~
We compare three models based on average (per-trial) likelihood:
– StHPM: standard, per-voxel HPM
– ShHPM: one HPM for all voxels in an ROI (24 ROIs total)
– HieHPM: hierarchical HPM
Effect of training set size (6 to 40 examples) in CALC:
– ShHPM is biased here: better than StHPM at small sample sizes, worse at 40 examples
– HieHPM is the best: it can represent both models; 106 times better data likelihood than StHPM at 40 examples; StHPM needs 2.9 times more examples to catch up
20 Parameter Sharing in HPMs ~ Experiments ~
Performance over the whole brain (40 examples):
– HieHPM is the best: 1792 times better data likelihood than StHPM; better than StHPM in 23/24 ROIs; better than ShHPM in 12/24 ROIs and equal in 11/24
– ShHPM is second best: 464 times better data likelihood than StHPM; better than StHPM in 18/24 ROIs; it is biased, but it makes sense to share whole ROIs that are not involved in the cognitive task
21 Learned Voxel Clusters
In the whole brain: ~300 clusters, ~15 voxels per cluster
In CALC: ~60 clusters, ~5 voxels per cluster
22 Sentence Process in CALC
24 Summary
– Extended previous non-temporal models for describing the fMRI signal
– Introduced a new method for clustering voxels with similar activity, based on Hidden Process Models
– Showed how taking advantage of domain knowledge improves over standard temporal models
25 Research Opportunities
– Evaluate this algorithm on multiple studies/subjects
– Take advantage of data from multiple subjects to learn better clusters
– Accommodate uncertainty in the onset times of the hidden processes
– Explore soft clustering
26 Questions?
27 THE END