1
The geometry of random fields in astrophysics and brain mapping
Keith Worsley, Farzan Rohani, McGill; Nicholas Chamandy, McGill and Google; Jonathan Taylor, Stanford and Université de Montréal; Jin Cao, Lucent; Arnaud Charil, Montreal Neurological Institute; Frédéric Gosselin, Université de Montréal; Philippe Schyns, Fraser Smith, Glasgow
2
Astrophysics
3
Sloan Digital Sky Survey, data release 6, Aug. ‘07
4
fMRI data: 120 scans, 3 scans each of hot, rest, warm, rest, …
[Figure: first scan of the fMRI data; T statistic image (−5 to 5) for the hot − warm effect; voxel time courses vs. time in seconds, showing a highly significant effect (T = 6.59), no significant effect (T = −0.74), and drift.]
T = (hot − warm effect) / S.d. ~ t110 if no effect
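A minimal sketch of this T statistic at a single voxel, on simulated data with a simplified design (the talk's full model also includes drift terms, which is how it reaches 110 df):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n = 120                                     # 120 scans, one every 3 seconds
hot = np.tile([1, 1, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0], n // 12)  # 3 scans of hot ...
warm = np.roll(hot, 6)                      # ... then 3 of warm, separated by rest
X = np.column_stack([np.ones(n), hot, warm])
Y = 0.5 * hot + rng.standard_normal(n)      # simulated voxel time course

b, *_ = np.linalg.lstsq(X, Y, rcond=None)   # least-squares fit of the linear model
df = n - X.shape[1]
sigma2 = np.sum((Y - X @ b) ** 2) / df      # residual variance
c = np.array([0.0, 1.0, -1.0])              # hot - warm contrast
T = (c @ b) / np.sqrt(sigma2 * c @ np.linalg.inv(X.T @ X) @ c)
print(T, 2 * stats.t.sf(abs(T), df))        # T ~ t_df if no effect
```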
5
Linear model regressors
[Figure: alternating hot and warm stimuli separated by rest (9 seconds each), and the hemodynamic response function, plotted against time in seconds.]
Hemodynamic response function (HRF): difference of two gamma densities.
Regressors = stimuli * HRF, sampled every 3 seconds.
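A sketch of building such regressors; the gamma shape parameters and the 0.35 undershoot weight below are illustrative assumptions, not the talk's exact HRF:

```python
import numpy as np
from scipy.stats import gamma

dt = 0.1                                          # fine time grid, seconds
t = np.arange(0, 32, dt)
hrf = gamma.pdf(t, a=6) - 0.35 * gamma.pdf(t, a=16)  # difference of two gammas
hrf /= np.abs(hrf).sum()

cycle = np.repeat([1, 0, 0, 0], int(9 / dt))      # hot, rest, warm, rest (9 s each)
hot_stim = np.tile(cycle, 10)
warm_stim = np.roll(hot_stim, int(18 / dt))       # warm block starts 18 s later
regressors = [np.convolve(s, hrf)[:s.size][::int(3 / dt)]  # sample every 3 s
              for s in (hot_stim, warm_stim)]
```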
6
Brain imaging: detect sparse regions of “activation”
Construct a test statistic “image” for detecting activation. Activated regions: test statistic > threshold. Choose the threshold to control the false positive rate at, say, 0.05, i.e. P(max test statistic > threshold) = 0.05. Bonferroni??? Too conservative … False discovery rate??? Not appropriate …
7
Detecting sparse cone alternatives
Test statistics are usually functions of Gaussian fields, e.g. T or F statistics. Let’s take a challenging example: a random field of chi-bar statistics for detecting a sparse cone alternative. Application: detecting fMRI activation in the presence of unknown latency of the hemodynamic response function (HRF). Linear model with two regressors: the HRF shifted by −2 and +2 seconds. Fit by non-negative least squares; equivalent to a cone alternative.
8
Chi-bar
Example test statistic: $\bar\chi = \max_{0 \le \theta \le \pi/2} (Z_1 \cos\theta + Z_2 \sin\theta)$, where $Z_1, Z_2 \sim N(0,1)$.
Excursion sets $X_t = \{s : \bar\chi \ge t\}$; rejection regions $R_t = \{Z : \bar\chi \ge t\}$, for threshold $t$.
[Figure: the null and the cone alternative in the $(Z_1, Z_2)$ plane, and the search region $S$ in $(s_1, s_2)$.]
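A sketch of this statistic and its non-negative least-squares equivalence, on simulated data with two hypothetical orthonormal regressors (with orthonormal columns, NNLS gives $\|X\hat b\| = \bar\chi$ whenever the max over the cone, including its apex at 0, is attained inside it):

```python
import numpy as np
from scipy.optimize import nnls

def chibar(z1, z2, ngrid=10001):
    # max of Z1*cos(theta) + Z2*sin(theta) over the cone 0 <= theta <= pi/2
    theta = np.linspace(0, np.pi / 2, ngrid)
    return max(0.0, np.max(z1 * np.cos(theta) + z2 * np.sin(theta)))

rng = np.random.default_rng(1)
X, _ = np.linalg.qr(rng.standard_normal((120, 2)))   # two orthonormal regressors
Y = rng.standard_normal(120)
z1, z2 = X.T @ Y                                     # the two Z statistics
b, _ = nnls(X, Y)                                    # fit Y = X b with b >= 0
print(chibar(z1, z2), np.linalg.norm(X @ b))         # agree, up to grid error
```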
9
Euler characteristic heuristic again
[Figure: excursion sets $X_t$ inside the search region $S$, and observed vs. expected Euler characteristic plotted against threshold $t$.]
EC = #blobs − #holes.
Heuristic: $P(\max_{s \in S} \bar\chi \ge t) \approx E(\mathrm{EC}(S \cap X_t))$.
Expected Euler characteristic: $E(\mathrm{EC}(S \cap X_t)) = \sum_{d=0}^{D} \mathcal{L}_d(S)\,\rho_d(t)$ — EXACT!
10
Lipschitz-Killing curvature $\mathcal{L}_d(S)$ and EC density $\rho_d(t)$
$E(\mathrm{EC}(S \cap X_t)) = \sum_{d=0}^{D} \mathcal{L}_d(S)\,\rho_d(t)$, with $\lambda = \mathrm{Sd}(\partial Z/\partial s)$.
Steiner-Weyl Tube Formula (1930): put a tube of radius $r$ about the search region $\lambda S$. Find its volume, expand as a power series in $r$, and pull off the coefficients:
$|\mathrm{Tube}(\lambda S, r)| = \sum_{d=0}^{D} \frac{\pi^{d/2} r^d}{\Gamma(d/2 + 1)}\,\mathcal{L}_{D-d}(\lambda S)$.
Morse Theory Approach (1995): the EC density $\rho_d(t)$ is an expectation involving $Z$ and its first two derivatives at a critical point. For $Z$ a Gaussian random field, $\rho_d(t) = \left(-\frac{1}{\sqrt{2\pi}}\frac{\partial}{\partial t}\right)^{d} P(Z \ge t)$. For $Z$ a chi-bar random field???
11
Lipschitz-Killing curvature $\mathcal{L}_d(S)$
Steiner-Weyl Volume of Tubes Formula (1930):
$\mathrm{Area}(\mathrm{Tube}(\lambda S, r)) = \sum_{d=0}^{D} \frac{\pi^{d/2} r^d}{\Gamma(d/2 + 1)}\,\mathcal{L}_{D-d}(\lambda S)$, where $\lambda = \mathrm{Sd}(\partial Z/\partial s) = \sqrt{4 \log 2}\,/\,\mathrm{FWHM}$ for a Gaussian smoothing filter.
Lipschitz-Killing curvatures are just “intrinsic volumes” or “Minkowski functionals” in the (Riemannian) metric of the variance of the derivative of the process.
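A quick numerical check of the 2D Steiner-Weyl formula for the unit square $S$ (so $\mathcal{L}_0 = 1$, $\mathcal{L}_1 = 2$ = half the perimeter, $\mathcal{L}_2 = 1$ = area, and the tube area is $\mathcal{L}_2 + 2r\,\mathcal{L}_1 + \pi r^2\,\mathcal{L}_0$), by Monte Carlo:

```python
import numpy as np

rng = np.random.default_rng(1)

def tube_area(r, n=200_000):
    # Monte Carlo: area of the bounding box times the fraction of uniform
    # points within distance r of the unit square S = [0,1]^2
    pts = rng.uniform(-r, 1 + r, size=(n, 2))
    dist = np.linalg.norm(np.clip(pts, 0, 1) - pts, axis=1)
    return (1 + 2 * r) ** 2 * np.mean(dist <= r)

for r in [0.1, 0.25, 0.5]:
    steiner = 1 + 2 * r * 2 + np.pi * r**2   # L2 + 2r*L1 + pi*r^2*L0
    print(r, round(tube_area(r), 3), round(steiner, 3))
```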
12
Lipschitz-Killing curvature $\mathcal{L}_d(S)$ of a union of triangles
Measure each edge in the λ metric: edge length × λ, where λ = Sd(∂Z/∂s).
Lipschitz-Killing curvatures of a triangle: $\mathcal{L}_0 = 1$, $\mathcal{L}_1$ = half its perimeter, $\mathcal{L}_2$ = its area.
Lipschitz-Killing curvatures of a union of triangles $S$: $\mathcal{L}_0(S)$ = #points − #edges + #triangles; $\mathcal{L}_1(S)$ = half the total boundary edge length; $\mathcal{L}_2(S)$ = total area.
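A sketch of these formulas for a triangulated region, with hypothetical `verts`/`tris` inputs whose edge lengths are already in the λ metric:

```python
import numpy as np

def lkc(verts, tris):
    # verts: (p, 2) coordinates; tris: (m, 3) vertex indices per triangle
    count = {}
    for a, b, c in tris:
        for e in [(a, b), (b, c), (a, c)]:
            e = tuple(sorted(e))
            count[e] = count.get(e, 0) + 1
    L0 = len(np.unique(tris)) - len(count) + len(tris)   # points - edges + triangles
    L1 = 0.5 * sum(np.linalg.norm(verts[i] - verts[j])   # boundary edges appear
                   for (i, j), k in count.items() if k == 1)  # in only one triangle
    L2 = 0.0
    for a, b, c in tris:
        u, v = verts[b] - verts[a], verts[c] - verts[a]
        L2 += 0.5 * abs(u[0] * v[1] - u[1] * v[0])       # triangle area
    return L0, L1, L2

# Unit square split into two triangles: expect L0 = 1, L1 = 2, L2 = 1.
verts = np.array([[0, 0], [1, 0], [1, 1], [0, 1.0]])
print(lkc(verts, np.array([[0, 1, 2], [0, 2, 3]])))
```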
13
Non-isotropic data? Use Riemannian metric of Var(∇Z)
As before: Lipschitz-Killing curvatures of triangles and of their union, but with each edge length now measured in the Riemannian metric of Var(∇Z) rather than scaled by a single λ.
14
Estimating Lipschitz-Killing curvature $\mathcal{L}_d(S)$
We need independent & identically distributed random fields, e.g. residuals from a linear model … Replace the coordinates of each point $s \in S \subset \mathbb{R}^2$ by the normalized residuals $Z/\|Z\|$, where $Z = (Z_1, \ldots, Z_n)$. Then compute the Lipschitz-Killing curvatures of the triangles and of their union exactly as before, using edge lengths in this embedding. Taylor & Worsley, JASA (2007)
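A sketch of that estimator, mirroring the earlier `lkc` sketch; hypothetical input `resids` is an (n, p) array of i.i.d. residual fields at the p triangulated points, and triangle areas in $\mathbb{R}^n$ come from Heron's formula:

```python
import numpy as np

def lkc_from_residuals(resids, tris):
    U = resids / np.linalg.norm(resids, axis=0)   # normalized residual vectors
    count, length = {}, {}
    for a, b, c in tris:
        for i, j in [(a, b), (b, c), (a, c)]:
            e = tuple(sorted((i, j)))
            count[e] = count.get(e, 0) + 1
            length[e] = np.linalg.norm(U[:, e[0]] - U[:, e[1]])  # embedded edge
    L0 = len(np.unique(tris)) - len(count) + len(tris)
    L1 = 0.5 * sum(length[e] for e, k in count.items() if k == 1)
    L2 = 0.0
    for a, b, c in tris:
        x, y, z = (length[tuple(sorted(e))] for e in [(a, b), (b, c), (a, c)])
        s = 0.5 * (x + y + z)                     # Heron's formula for the area
        L2 += np.sqrt(max(s * (s - x) * (s - y) * (s - z), 0.0))
    return L0, L1, L2
```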
15
Beautiful symmetry
$E(\mathrm{EC}(S \cap X_t)) = \sum_{d=0}^{D} \mathcal{L}_d(S)\,\rho_d(t)$, with $\lambda = \mathrm{Sd}(\partial Z/\partial s)$.
Steiner-Weyl Tube Formula (1930): put a tube of radius $r$ about the search region $\lambda S$. Find its volume, expand as a power series in $r$, and pull off the Lipschitz-Killing curvatures:
$|\mathrm{Tube}(\lambda S, r)| = \sum_{d=0}^{D} \frac{\pi^{d/2} r^d}{\Gamma(d/2 + 1)}\,\mathcal{L}_{D-d}(\lambda S)$.
Taylor Gaussian Tube Formula (2003): put a tube of radius $r$ about the rejection region $R_t$ ($Z_1, Z_2 \sim N(0,1)$, the threshold moving from $t$ to $t - r$). Find its probability, expand as a power series in $r$, and pull off the EC densities:
$P(\mathrm{Tube}(R_t, r)) = \sum_{d=0}^{\infty} \frac{(2\pi)^{d/2} r^d}{d!}\,\rho_d(t)$.
16
EC density $\rho_d(t)$ of the chi-bar statistic
Taylor’s Gaussian Tube Formula (2003): put a tube of radius $r$ about the rejection region $R_t$ in the $(Z_1, Z_2)$ plane and expand its probability,
$P(Z_1, Z_2 \in \mathrm{Tube}(R_t, r)) = \sum_{d=0}^{\infty} \frac{(2\pi)^{d/2} r^d}{d!}\,\rho_d(t)$,
then pull off the EC densities $\rho_d(t)$ of $\bar\chi$.
Taylor & Worsley, Annals of Statistics, submitted (2007)
17
General cone alternatives
Let $Z = (Z_1, \ldots, Z_n) \sim N(\mu, I_n)$. We wish to test $H_0: \mu = 0$ against the alternative that $\mu$ lies in a convex cone. [The slide’s general formula could not be recovered from the transcript.]
18
Proof, n=3:
19
Gaussian random field in 3D
Let $Z(s) \sim N(0,1)$ be an isotropic Gaussian random field, $s \in S \subset \mathbb{R}^3$, with $\lambda = \mathrm{Sd}(\partial Z/\partial s) = \sqrt{4 \log 2}\,/\,\mathrm{FWHM}$ of the smoothing filter. Then
$E(\mathrm{EC}(S \cap X_t)) = \sum_{d=0}^{3} \mathcal{L}_d(S)\,\rho_d(t)$,
with Lipschitz-Killing curvatures of $S$: $\mathcal{L}_0(S) = \mathrm{EC}(S)$, $\mathcal{L}_1(S) = 2\,\mathrm{caliper\ diameter}(S)\,\lambda$, $\mathcal{L}_2(S) = \tfrac12\,\mathrm{surface\ area}(S)\,\lambda^2$, $\mathcal{L}_3(S) = \mathrm{volume}(S)\,\lambda^3$,
and EC densities of $Z$: $\rho_0(t) = P(Z \ge t)$, $\rho_1(t) = (2\pi)^{-1} e^{-t^2/2}$, $\rho_2(t) = (2\pi)^{-3/2}\, t\, e^{-t^2/2}$, $\rho_3(t) = (2\pi)^{-2}\,(t^2 - 1)\, e^{-t^2/2}$.
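A sketch that evaluates this expected EC for a hypothetical box-shaped search region and solves for the P = 0.05 threshold:

```python
import numpy as np
from scipy.stats import norm
from scipy.optimize import brentq

def expected_ec(t, dims_mm=(128, 128, 64), fwhm_mm=10.0):
    lam = np.sqrt(4 * np.log(2)) / fwhm_mm
    a, b, c = np.asarray(dims_mm) * lam
    L = [1.0, a + b + c, a * b + b * c + a * c, a * b * c]  # box intrinsic volumes
    e = np.exp(-t**2 / 2)
    rho = [norm.sf(t),                        # EC densities of a Gaussian field
           e / (2 * np.pi),
           e * t / (2 * np.pi) ** 1.5,
           e * (t**2 - 1) / (2 * np.pi) ** 2]
    return sum(Ld * rd for Ld, rd in zip(L, rho))

# Threshold with P(max Z >= t) ~ E(EC) = 0.05:
t05 = brentq(lambda t: expected_ec(t) - 0.05, 2.0, 10.0)
print(round(t05, 2))
```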
20
The accuracy of the EC heuristic
Let $Z(s) \sim N(0,1)$ be an isotropic Gaussian random field, $s \in S \subset \mathbb{R}^3$, with $\lambda^2 = \mathrm{Var}(\partial Z/\partial s)$. Then
$P\big(\max_{s \in S} Z(s) \ge t\big) = E(\mathrm{EC}(S \cap X_t)) + O(e^{-\alpha t^2/2})$ for some $\alpha > 1$,
with the same Lipschitz-Killing curvatures $\mathcal{L}_d(S)$ and EC densities $\rho_d(t)$ as before. The expected EC gives all the polynomial terms in the expansion for the P-value.
21
What is ‘bubbles’?
22
Nature (2005)
23
Subject is shown one of 40 faces chosen at random …
Happy Sad Fearful Neutral
24
… but face is only revealed through random ‘bubbles’
First trial: “Sad” expression. 75 random bubble centres are smoothed by a Gaussian “bubble”; this is what the subject sees. Subject is asked the expression and responds “Neutral”: incorrect.
25
Your turn … Trial 2 Subject response: “Fearful” CORRECT
26
Your turn … Trial 3 Subject response: “Happy” INCORRECT (Fearful)
27
Your turn … Trial 4 Subject response: “Happy” CORRECT
28
Your turn … Trial 5 Subject response: “Fearful” CORRECT
29
Your turn … Trial 6 Subject response: “Sad” CORRECT
30
Your turn … Trial 7 Subject response: “Happy” CORRECT
31
Your turn … Trial 8 Subject response: “Neutral” CORRECT
32
Your turn … Trial 9 Subject response: “Happy” CORRECT
33
Your turn … Trial 3000 Subject response: “Happy” INCORRECT (Fearful)
34
E.g. Fearful (3000/4=750 trials):
Bubbles analysis. Sum the bubble masks over all 750 trials and over the correct trials. Proportion of correct bubbles = (sum of correct bubbles) / (sum of all bubbles). Threshold at proportion of correct trials = 0.68, scale to [0,1], and use this as a bubble mask.
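A minimal sketch of this step, with hypothetical array names (`masks`: trials × height × width bubble masks; `correct`: 0/1 per trial):

```python
import numpy as np

def proportion_correct(masks, correct, thresh=0.68):
    prop = masks[correct == 1].sum(axis=0) / masks.sum(axis=0)
    bubble_mask = prop >= thresh                              # threshold at 0.68
    scaled = (prop - prop.min()) / (prop.max() - prop.min())  # scale to [0, 1]
    return prop, bubble_mask, scaled
```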
35
Results
[Figure: bubble masks overlaid on the average face, one panel per expression: Happy, Sad, Fearful, Neutral.]
But are these features real or just noise? Need statistics …
36
Statistical analysis
Correlate bubbles with response (correct = 1, incorrect = 0), separately for each expression. Equivalent to a 2-sample Z statistic for correct vs. incorrect bubbles, e.g. Fearful. The resulting Z ~ N(0,1) statistic image is very similar to the proportion of correct bubbles.
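A sketch of the per-pixel 2-sample Z statistic, using the same hypothetical arrays as above:

```python
import numpy as np

def bubbles_z(masks, correct):
    # per-pixel 2-sample Z: mean bubble mask on correct vs. incorrect trials
    a, b = masks[correct == 1], masks[correct == 0]
    diff = a.mean(axis=0) - b.mean(axis=0)
    se = np.sqrt(a.var(axis=0, ddof=1) / len(a) + b.var(axis=0, ddof=1) / len(b))
    return diff / se                       # ~ N(0,1) at each pixel under H0
```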
37
Results
Thresholded at Z = 1.64 (P = 0.05, uncorrected). Multiple comparisons correction? Need random field theory …
[Figure: thresholded Z ~ N(0,1) statistic images over the average face: Happy, Sad, Fearful, Neutral.]
38
Euler Characteristic = #blobs - #holes
[Figure: excursion set {Z > threshold} for the neutral face, with its EC at a range of thresholds.]
Heuristic: at high thresholds t the holes disappear, EC ~ 1 or 0, so E(EC) ~ P(max Z > t). There is an exact expression for E(EC) for all thresholds, and the approximation E(EC) ~ P(max Z > t) is extremely accurate.
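A sketch computing the EC of a binary excursion image directly, as #vertices − #edges + #faces of the filled pixel lattice (4-connectivity):

```python
import numpy as np

def euler_char(excursion):
    b = excursion.astype(bool)
    v = b.sum()                                   # vertices: suprathreshold pixels
    e = (b[1:, :] & b[:-1, :]).sum() + (b[:, 1:] & b[:, :-1]).sum()  # edges
    f = (b[1:, 1:] & b[1:, :-1] & b[:-1, 1:] & b[:-1, :-1]).sum()    # unit squares
    return int(v - e + f)

# One blob with one hole: EC = 1 - 1 = 0.
img = np.ones((5, 5), bool)
img[2, 2] = False
print(euler_char(img))
```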
39
The result
If $Z(s) \sim N(0,1)$ is an isotropic Gaussian random field, $s \in S \subset \mathbb{R}^D$, with $\lambda = \sqrt{4 \log 2}\,/\,\mathrm{FWHM}$ of the smoothing filter, then
$P\big(\max_{s \in S} Z(s) \ge t\big) \approx E(\mathrm{EC}(S \cap X_t)) = \sum_{d=0}^{D} \mathcal{L}_d(S)\,\rho_d(t)$,
with the Lipschitz-Killing curvatures of $S$ and the EC densities of $Z$ as before.
40
Results, corrected for search
Random field theory threshold: Z = 3.92 (P = 0.05). Bonferroni threshold: Z = 4.87 (P = 0.05), which finds nothing.
[Figure: corrected-threshold Z ~ N(0,1) statistic images over the average face: Happy, Sad, Fearful, Neutral.]
41
Bubbles task in fMRI scanner
Correlate bubbles with BOLD at every voxel: calculate Z for each pair (bubble pixel, fMRI voxel), giving a 5D “image” of Z statistics (2D of bubble pixels × 3D of fMRI voxels).
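A sketch of that cross-correlation, with hypothetical flattened arrays (`bubbles`: trials × pixels, `bold`: trials × voxels), Fisher-transformed to an approximate Z:

```python
import numpy as np

def crosscorr_z(bubbles, bold):
    n = bubbles.shape[0]
    X = (bubbles - bubbles.mean(0)) / bubbles.std(0)   # standardize over trials
    Y = (bold - bold.mean(0)) / bold.std(0)
    C = X.T @ Y / n                            # (pixels, voxels) correlations
    return np.sqrt(n - 3) * np.arctanh(C)      # Fisher z, approx N(0,1) under H0
```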
42
Thresholding? Cross correlation random field
Correlation between 2 fields at 2 different locations, searched over all pairs of locations, one in S, one in T:
$P\big(\max_{s \in S,\, t \in T} C(s,t) \ge c\big) \approx E(\mathrm{EC}) = \sum_{i}\sum_{j} \mathcal{L}_i(S)\,\mathcal{L}_j(T)\,\rho_{i,j}(c)$.
Bubbles data: P = 0.05, n = 3000, c = 0.113, T = 6.22.
Cao & Worsley, Annals of Applied Probability (1999)
43
MS lesions and cortical thickness
Idea: MS lesions interrupt neuronal signals, causing thinning in downstream cortex. Data: n = 425 mild MS patients. Lesion density, smoothed 10mm; cortical thickness, smoothed 20mm. Find connectivity, i.e. find voxels in 3D and nodes in 2D with high correlation(lesion density, cortical thickness). Look for high negative correlations … Threshold: P = 0.05, c = 0.300, T = 6.48.
44
n=425 subjects, correlation = -0.568
[Figure: scatterplot of average lesion volume against average cortical thickness across subjects.]
45
Discussion: modeling
The random response is Y = 1 (correct) or 0 (incorrect), or Y = fMRI.
The regressors are Xj = bubble mask at pixel j, j = 1, …, 240×380 = 91200 (!)
Logistic regression or ordinary regression: logit(E(Y)) or E(Y) = b0 + X1b1 + … + X91200b91200. But there are only n = 3000 observations (trials) …
Instead, since the regressors are independent, fit them one at a time: logit(E(Y)) or E(Y) = b0 + Xjbj (see the sketch below).
However, the regressors (bubbles) are random with a simple known distribution, so turn the problem around and condition on Y: E(Xj) = c0 + Ycj.
Equivalent to conditional logistic regression (Cox, 1962), which gives exact inference for bj conditional on sufficient statistics for b0. Cox also suggested using saddle-point approximations to improve the accuracy of inference …
Interactions? logit(E(Y)) or E(Y) = b0 + X1b1 + … + X91200b91200 + X1X2b1,2 + …
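A sketch of the one-at-a-time fit, with hypothetical arrays (`masks`: trials × flattened pixels, `y`: 0/1 responses); the conditional two-sample form above is the fast equivalent:

```python
import numpy as np
import statsmodels.api as sm

def one_at_a_time(masks, y):
    # per-pixel logistic regression: logit(E(Y)) = b0 + Xj * bj
    z = np.empty(masks.shape[1])
    for j in range(masks.shape[1]):
        fit = sm.Logit(y, sm.add_constant(masks[:, j])).fit(disp=0)
        z[j] = fit.tvalues[1]              # Wald Z for bj
    return z
```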
46
Three methods so far
The set-up: S is a subset of a D-dimensional lattice (e.g. pixels); Z(s) ~ N(0,1) at most points s in S; Z(s) ~ N(μ(s),1), μ(s) > 0, at a sparse set of points; Z(s1), Z(s2) are spatially correlated. To control the false positive rate at ≤ α we want a good approximation to α = P(max_S Z(s) ≥ t):
Bonferroni (1936)
Random field theory (1970’s)
Discrete local maxima (2005, 2007)
47
Simulations (99999)
[Figure: P value vs. FWHM (Full Width at Half Maximum) of the smoothing filter, comparing Bonferroni, random field theory, and discrete local maxima against 99999 simulations of Z(s).]
48
Discrete local maxima
Bonferroni applied to the events {Z(s) ≥ t and Z(s) is a discrete local maximum}, i.e. {Z(s) ≥ t and neighbouring Z’s ≤ Z(s)}: conservative. If Z(s) is stationary, with Cor(Z(s1), Z(s2)) = ρ(s1 − s2), all we need is P{Z(s) ≥ t and neighbouring Z’s ≤ Z(s)}, a (2D+1)-variate integral:
Z(s−1) ≤ Z(s) ≥ Z(s1) and Z(s−2) ≤ Z(s) ≥ Z(s2), the lattice neighbours of s along each axis.
49
“Markovian” trick
If ρ is “separable”, s = (x, y): ρ((x, y)) = ρ((x, 0)) × ρ((0, y)), e.g. the Gaussian spatial correlation function ρ((x, y)) = exp(−½(x² + y²)/w²). Then Z(s) has a “Markovian” property: conditional on the central Z(s), Z’s on different axes are independent: Z(s±1) ⊥ Z(s±2) | Z(s). So condition on Z(s) = z, find P{neighbour Z’s ≤ z | Z(s) = z} = ∏_d P{Z(s±d) ≤ z | Z(s) = z}, then take expectations over Z(s) = z. This cuts the (2D+1)-variate integral down to a bivariate integral (see the sketch below).
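A sketch of that computation for the separable Gaussian correlation above, where an adjacent-voxel correlation ρ₁ implies Cor(Z(s−1), Z(s+1)) = ρ₁⁴ along each axis; Bonferroni is then applied over voxels to the discrete-local-maximum events:

```python
import numpy as np
from scipy.stats import norm, multivariate_normal

def p_dlm(t, rho1, n_voxels, D=2, zmax=10.0, nz=500):
    rho2 = rho1**4                        # Cor(Z(s-1), Z(s+1)) for Gaussian rho
    z = np.linspace(t, zmax, nz)
    q = np.empty(nz)
    for k, zk in enumerate(z):
        # given Z(s) = zk, the two neighbours on one axis are bivariate normal
        mean = [rho1 * zk, rho1 * zk]
        cov = [[1 - rho1**2, rho2 - rho1**2],
               [rho2 - rho1**2, 1 - rho1**2]]
        q[k] = multivariate_normal(mean, cov).cdf([zk, zk])
    integrand = norm.pdf(z) * q**D        # axes independent given Z(s)
    p_one = np.sum((integrand[:-1] + integrand[1:]) / 2) * (z[1] - z[0])
    return min(1.0, n_voxels * p_one)     # Bonferroni over the DLM events

print(p_dlm(t=4.0, rho1=0.9, n_voxels=64 * 64))
```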
50
The result
The result only involves the correlation $\rho_j$ between adjacent $Z$’s along each axis, $j = 1, \ldots, D$. With $\phi(z)$ the standard Gaussian density,
$P\{Z(s) \ge t \text{ and } Z(s) \text{ is a discrete local maximum}\} = \int_t^{\infty} \phi(z) \prod_{j=1}^{D} P\{Z(s - e_j) \le z,\ Z(s + e_j) \le z \mid Z(s) = z\}\, dz$,
each factor a bivariate Gaussian probability.
51
Comparison
Bonferroni (1936): conservative; accurate if spatial correlation is low; simple.
Discrete local maxima (2005, 2007): accurate for all ranges of spatial correlation; a bit messy; only easy for stationary, separable Gaussian data on rectilinear lattices; even if not separable, always seems to be conservative (but no proof!).
Random field theory (1970’s): an approximation based on assuming S is continuous; accurate if spatial correlation is high; elegant; easily extended to non-Gaussian, non-isotropic random fields.