Random Field Theory Ciaran S Hill & Christian Lambert Methods for Dummies 2008.

1 Random Field Theory Ciaran S Hill & Christian Lambert Methods for Dummies 2008

2 Overview
PART ONE
– Statistics of a Voxel
– Multiple Comparisons and Bonferroni correction
PART TWO
– Spatial Smoothing
– Random Field Theory

3 PART I

4 [SPM analysis pipeline: image data → realignment & motion correction → normalisation (anatomical reference) → smoothing (kernel) → General Linear Model (design matrix) → model fitting → parameter estimates → statistic image → Statistical Parametric Map → Thresholding & Random Field Theory → corrected thresholds & p-values]

5 A Voxel “A volume element: a unit of graphical information that defines a point in 3D space.” A voxel consists of: 1. a location, 2. a value. Usually we divide the brain into 20,000+ voxels.

6 Statistics of a voxel Determine whether the value of a single specified voxel is significant. Create a null hypothesis. Compare our voxel’s value to a null distribution: “the distribution we would expect if there is no effect”.

7 Statistics of a voxel NULL hypothesis, H0: activation is zero. Choose α = 0.025, where α = p(t > threshold | H0). Observed t-value = 2.42. The p-value is the probability of getting a value of t at least as extreme as 2.42 from the t-distribution (= 0.01). As p < α, we reject the null hypothesis.
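The slide's numbers can be reproduced with `scipy.stats`. The degrees of freedom below are an assumed value (not stated on the slide), chosen so that t = 2.42 gives p ≈ 0.01:

```python
from scipy import stats

t_value = 2.42
df = 40                        # assumed degrees of freedom; not given on the slide
p = stats.t.sf(t_value, df)    # one-tailed p(t > 2.42 | H0)
alpha = 0.025
reject = p < alpha             # p ≈ 0.01 < 0.025, so we reject H0
```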

8 Statistics of a voxel Test each voxel in the whole brain volume to see whether it is significantly different. A statistically different voxel should give us “localizing power”. Because we have so many voxels we will make many errors: with 1,000 voxels and α = 0.01 we expect 10 false positives. This creates the multiple comparison problem.

9 Statistics of a voxel If we do not know where in the brain the effect we are searching for occurs, we look at all the voxels. Family-wise hypothesis: “the whole family of voxels arose by chance”. If we think the voxels in the brain as a whole are unlikely to have arisen from a null distribution, then we reject the null hypothesis.

10 Statistics of a voxel Choose a family-wise error (FWE) rate, equivalent to an alpha threshold: the risk of error we are prepared to accept. This is the probability that one or more of the voxels we see have arisen by chance from the null distribution. If more than 10 in 1,000 survive, there is probably a statistical difference somewhere in the brain, but we can’t trust our localization. How do we test our family-wise hypothesis and choose our FWE rate?

11 Thresholding Height thresholding: this gives us localizing power.

12 Thresholding
– Low threshold (t > 0.5): poor specificity (risk of false positives), good power
– Medium threshold (t > 3.5)
– High threshold (t > 5.5): good specificity, poor power (risk of false negatives)

13 Carlo E. Bonferroni

14 Bonferroni correction A method of setting a threshold above which results are unlikely to have arisen by chance. If we would trust a p-value of 0.05 for one hypothesis, then for n hypotheses we should use α/n. For example, if one hypothesis requires 0.05 then two should require 0.05/2 = 0.025. This is more conservative, ensuring we maintain the overall error probability.

15 Bonferroni correction α(bonferroni) = α / n

16 Thresholding [Figure: error signal (noise), pure signal, and what we actually get (a bit of both)]

17 Thresholding Too many false positives outside our blob

18 Bonferroni correction For an individual voxel: probability of a result > threshold = α; probability of a result < threshold = 1 − α (α = chosen probability threshold, e.g. 0.01)

19 Bonferroni correction For the whole family of voxels: probability that all results are > threshold = α^n; probability that all results are < threshold = (1 − α)^n. FWE (the probability that 1 or more values will be greater than threshold) = 1 − (1 − α)^n, and as alpha is so small this ≈ α × n, so α = FWE / n.
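A quick numeric check of the formulas above, with illustrative values of α and n (not taken from the slides):

```python
alpha = 0.0001   # per-voxel threshold (illustrative)
n = 100          # number of independent tests (illustrative)

fwe_exact = 1 - (1 - alpha) ** n   # P(one or more false positives)
fwe_approx = alpha * n             # small-alpha approximation

# inverting the approximation gives the Bonferroni rule: alpha = FWE / n
alpha_bonf = 0.05 / 1000
```

The exact and approximate FWE agree closely whenever α × n is small, which is the regime the slide assumes.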

20 Bonferroni correction Is this useful for imaging data? 100,000 voxels = 100,000 t-values. If we choose FWE = 0.05, then using Bonferroni correction α = 0.05/100,000 = 0.0000005. The corresponding t-value is 5.77, so any statistic above this threshold is significant. This is for a p-value of 0.05 corrected for the multiple comparisons. Controls type I error BUT increases type II error.
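A sketch of this calculation; the degrees of freedom are an assumed value, since the slide's t = 5.77 depends on the df of the actual study:

```python
from scipy import stats

n_voxels = 100_000
fwe = 0.05
alpha_corrected = fwe / n_voxels             # 0.05 / 100,000 = 5e-7

df = 40                                      # assumed degrees of freedom
t_thresh = stats.t.isf(alpha_corrected, df)  # Bonferroni-corrected t threshold
z_thresh = stats.norm.isf(alpha_corrected)   # Gaussian equivalent, ≈ 4.89
```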

21 Thresholding FWE Use of ‘corrected’ p-value, α = 0.1. Too conservative for functional imaging.

22 How can we make our threshold less conservative without creating too many false positives?

23 Spatial Correlation The Bonferroni correction doesn’t consider spatial correlation. Voxels are not independent, because of:
– the process of data acquisition
– physiological signal
– spatial preprocessing applied before analysis
– corrections for movement
So there are fewer independent observations than voxels.

24 Spatial Correlation This is a 100 by 100 square full of voxels There are 10,000 tests (voxels) with a 5% family wise error rate Bonferroni correction gives us a threshold of: 0.05/10000 = 0.000005 This corresponds to a z score of 4.42

25 Spatial Correlation If we average the contents of 10 × 10 boxes (simple smoothing), our correction falls to 0.05/100 = 0.0005, corresponding to a z-score of 3.29. We still have 10,000 z-scores but only 100 independent observations. Still too conservative: the problem is that we are treating variables that are spatially correlated as if they were independent. How do we know how many independent variables there really are?
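The two z-score thresholds quoted on these slides follow directly from the Gaussian inverse survival function:

```python
from scipy import stats

z_10000 = stats.norm.isf(0.05 / 10_000)  # Bonferroni over 10,000 voxels -> ~4.42
z_100 = stats.norm.isf(0.05 / 100)       # after averaging into 10x10 blocks -> ~3.29
```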

26 Smoothing We can impose spatial correlation in a controlled way with smoothing. It increases the signal-to-noise ratio, enables averaging across subjects, and allows the use of Gaussian Random Field Theory for thresholding.

27 PART II

28 TOPOLOGY – STATISTICS – PROBABILITY

29 Leonhard Euler (1707-1783) Leonhard Paul Euler: Swiss mathematician who worked in Germany & Russia. Prolific mathematician: 80 volumes of work covering almost every area of mathematics (geometry, calculus, number theory, physics). Of interest, he wrote the “most beautiful formula ever”: e^(iπ) + 1 = 0. Laid the foundations of topology (Euler’s 1736 paper on the Seven Bridges of Königsberg). EULER CHARACTERISTIC

30 Euler Characteristic I Polyhedron, e.g. a cube: we can subdivide the object into vertices, edges and faces. Euler observed that for all convex polyhedra: V − E + F = 2. We can generalise this formula by including P (the number of polyhedra): V − E + F − P = EC. This is a property of the topological space: the alternating sum over dimensions (0d − 1d + 2d − 3d + 4d …) = EC. EC is 1 for ALL SOLID POLYHEDRA.

[Examples:
V = 8, E = 12, F = 6, P = 1: 8 − 12 + 6 − 1 = 1
V = 16, E = 28, F = 16, P = 3: 16 − 28 + 16 − 3 = 1
V = 16, E = 32, F = 24, P = 8, HP = 1: 16 − 32 + 24 − 8 + 1 = 1]
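The first two counts above can be checked with a one-line alternating sum (the third example adds a fourth-dimensional term, which is omitted here for simplicity):

```python
def euler_characteristic(V, E, F, P):
    """Alternating sum of element counts: vertices - edges + faces - polyhedra."""
    return V - E + F - P

cube = euler_characteristic(8, 12, 6, 1)      # solid cube
joined = euler_characteristic(16, 28, 16, 3)  # the slide's second example
```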

32 Euler Characteristic II i) Holes: each hole through an object reduces its EC by 1: EC = 0, EC = −1, EC = −2

33 ii) Hollows: a hollow inside an object increases its EC (a hollow sphere has EC = 2). iii) Set of disconnected polyhedra: calculate the individual ECs and sum them, e.g. 8 tori (EC = 0) + 2 hollow ‘spheres’ (EC = 4) + 2 solids (EC = 2) gives an EC of the set = 6.

34 iv) EC of a three-dimensional set: often we look at EXCURSION SETS. Define a fixed threshold, define the objects that exceed that density, and calculate the EC for them. Simplified: EC = Maxima − Saddles + Minima. VERY dependent on threshold – RANDOM FIELD THEORY.
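A minimal sketch of an excursion set, assuming a simulated 2-D field (white noise smoothed with a Gaussian kernel). At high thresholds the EC is approximately just the number of connected suprathreshold blobs, and the count is strongly threshold-dependent:

```python
import numpy as np
from scipy import ndimage

# simulate a smooth 2-D statistic image: smoothed white noise, rescaled
rng = np.random.default_rng(0)
field = ndimage.gaussian_filter(rng.standard_normal((100, 100)), sigma=3)
field = (field - field.mean()) / field.std()

# excursion sets at increasing thresholds: count connected components
counts = [ndimage.label(field > u)[1] for u in (1.0, 2.0, 3.0)]
```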

35

36 Gaussian Curves Standard normal distribution (probability density function): mean = 0, standard deviation = 1. FWHM: Full Width at Half its Maximum height. Uses: weighting function, data smoothing (Gaussian kernels), Brownian motion → Gaussian random field.

37 Data Smoothing Smoothing: the process by which data points are averaged with their neighbours in a series, attempting to maximise the signal-to-noise ratio. Kernel: defines the shape of the function used to take the average of the neighbouring points. Each pixel’s new value is set to a weighted average of that pixel’s neighbourhood: the original pixel’s value receives the heaviest weight, and neighbouring pixels receive smaller weights as their distance to the original pixel increases. This results in a blur that preserves boundaries and edges better than other, more uniform blurring filters. Visualisation of a Gaussian kernel: value 1 in the centre (c), 0s everywhere else; the effect of the kernel is seen in (d).
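The slide's kernel visualisation (a single 1 surrounded by 0s) can be sketched directly; the width is specified as a FWHM, which relates to the Gaussian σ by FWHM = 2·√(2·ln 2)·σ ≈ 2.355·σ:

```python
import math
import numpy as np
from scipy import ndimage

def fwhm_to_sigma(fwhm):
    # FWHM = 2 * sqrt(2 * ln 2) * sigma, so invert to get sigma
    return fwhm / (2 * math.sqrt(2 * math.log(2)))

image = np.zeros((21, 21))
image[10, 10] = 1.0   # value 1 in the centre, 0s everywhere else
smoothed = ndimage.gaussian_filter(image, sigma=fwhm_to_sigma(4.0))
```

After smoothing, the image shows the kernel itself: the weighted average spreads the central value over its neighbours while preserving the total.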

38 [Gaussian smoothing with kernels of increasing width: none, 1 pixel, 2 pixels, 3 pixels. Portrait: Johann Carl Friedrich Gauss]

39 Number of Resolution Elements (RESELS), R A block of values (e.g. pixels) that is the same size as the FWHM. In 3D, a cube of voxels of size (FWHM in x) by (FWHM in y) by (FWHM in z). RFT requires FWHM > 3 voxels. Example: 27 voxels = 1 RESEL. Typical applied smoothing: single-subject fMRI: 6mm, PET: 12mm; multi-subject fMRI: 8-12mm, PET: 16mm.
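The resel count is just the search volume divided into FWHM-sized blocks, as in this small sketch:

```python
def resel_count(n_voxels, fwhm_voxels):
    """Number of 3-D resolution elements: voxels per (FWHM)^3 block."""
    return n_voxels / fwhm_voxels ** 3

r = resel_count(27, 3)   # 27 voxels at FWHM = 3 voxels -> 1 RESEL
```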

40 Stationary Gaussian Random Field Studied by Robert Adler as his PhD thesis (1976); he published the book The Geometry of Random Fields in 1981. There are several types of random field (Gaussian, non-Gaussian, Markov, Poisson…). Deals with the behaviour of a stochastic process over a specified dimension D (often D = 3, but can be higher). To create a stationary Gaussian field: create a lattice of independent Gaussian observations (i.e. white noise), each with mean = 0, SD = 1 (standard normal distribution), then take the weighted average using a Gaussian kernel.
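The construction described above can be sketched in a few lines: a lattice of iid standard-normal values, a Gaussian-kernel weighted average, then a rescale back to mean 0, SD 1 (the grid size and kernel width below are illustrative choices):

```python
import numpy as np
from scipy import ndimage

rng = np.random.default_rng(42)
white = rng.standard_normal((256, 256))           # lattice of iid N(0, 1) values
field = ndimage.gaussian_filter(white, sigma=4)   # Gaussian-kernel weighted average
field = (field - field.mean()) / field.std()      # rescale back to mean 0, SD 1
```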

41 [White noise lattice and smoothed field, both with mean = 0, SD = 1]

42 The link between the topology of an excursion set and the local maxima was published by Hasofer (Adler’s supervisor) in 1978. As the threshold increases, the holes in the excursion set disappear until each component of the excursion set contains just one local maximum: EC = number of local maxima (at high thresholds). Just below the global maximum EC = 1; just above, EC = 0. At high thresholds the expected EC approximates the probability that the global maximum exceeds the threshold.

43 [Excursion sets at three thresholds: EC = 1, EC = 4, EC = 15]

44 Mathematical method for generating the threshold (t): 2 formulas. Required values (whole brain*):
– Volume: 1,064 cc
– Surface area: 1,077 cm²
– Caliper diameter = 0.1 cm
– EC = 2 (ventricles)
– FWHM, from which the roughness of the field is calculated: √(4·log_e 2) / FWHM
* Or region of interest (see later results)

45 Given E(EC) = 0.05, just solve for t…
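A sketch of "solving for t", using the standard closed form for the expected EC of a 3-D Gaussian field above threshold t (as in Worsley's work and the Brett, Penny & Kiebel chapter cited later); the resel count R below is an assumed value for illustration, not one from the slides:

```python
import math
from scipy.optimize import brentq

R = 100.0   # resel count -- an assumed value for illustration

def expected_ec(t):
    # expected EC of the excursion set of a 3-D Gaussian field above t
    return (R * (4 * math.log(2)) ** 1.5 * (t ** 2 - 1)
            * math.exp(-t ** 2 / 2) / (2 * math.pi) ** 2)

# find the threshold where the expected EC (≈ FWE rate) equals 0.05
t_crit = brentq(lambda t: expected_ec(t) - 0.05, 2.0, 6.0)
```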

46 Volume of interest: [Graph of threshold against search volume, showing the EC, diameter, surface area and volume terms; FWHM = 20mm] (1) Threshold depends on search volume. (2) Surface area makes a large contribution.

47 [Figure: two signals.] “Voxel level”: threshold (t) corresponding to p(local maxima) < 0.05. “Cluster level”: threshold (t) corresponding to probability p of a ‘cluster’ of activation of size n.

48 If the cluster-level threshold is raised, specificity increases at the cost of sensitivity. At a low threshold we can still ask: at what threshold (t) do we see a cluster (C) of size ≥ N voxels with p < 0.05? Given cluster C with N voxels and cluster c with n voxels: as N > n at t, C is significant.

49 Random Field Theory Assumptions Images need to follow a Gaussian distribution. Constructed statistics need to be sufficiently smooth. If the underlying images are smooth, the constructed statistics are smooth.

50 Link to the General Linear Model The random fields are the component fields: Y = Xw + E, e = E/σ. This is because under the null hypothesis there is no contribution from Xw to the observed data Y; hence E should explain all of the data. If it does not, there is a statistically significant contribution from the Xw term. We can only estimate the component fields, using estimates of w and σ. To apply RFT we need the RESEL count, which requires smoothness estimates.
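A minimal sketch of estimating the component fields under the null, assuming simulated data (scan count, voxel count and design are illustrative, not from the slides):

```python
import numpy as np

rng = np.random.default_rng(1)
n_scans, n_voxels = 20, 50
X = np.column_stack([np.ones(n_scans), rng.standard_normal(n_scans)])  # design matrix
Y = rng.standard_normal((n_scans, n_voxels))  # null data: Xw contributes nothing

w_hat, *_ = np.linalg.lstsq(X, Y, rcond=None)  # parameter estimates
residuals = Y - X @ w_hat                      # estimated error fields E
sigma_hat = np.sqrt((residuals ** 2).sum(axis=0) / (n_scans - X.shape[1]))
e = residuals / sigma_hat                      # estimated component fields e = E / sigma
```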

51 [Diagram: data matrix (scans × voxels) = design matrix × parameters + errors. The parameter estimates and the estimated variance give the residuals; each row of the residual matrix is an estimated component field.]

52 Example Data:

53

54 Beyond Random Field Theory False Discovery Rate (FDR) Whenever one performs multiple tests, the FDR is the proportion of false positives among the tests for which the null hypothesis is rejected. This quantity gets at the essence of what one wants to control in multiple comparisons. The procedure is conservative if the voxels are positively dependent.

55 Take all p-values and order them: p(1) ≤ p(2) ≤ … ≤ p(N). To control the FDR at q, find the largest value i so that: p(i) ≤ (i/N)·q. Designate i_max = R, then threshold at p(R), i.e. reject all the voxels that contributed p(1) … p(R).
References: Developments in Random Field Theory. K.J. Worsley. Chapter 15 in Human Brain Function (2nd Ed), K. Friston, J. Ashburner, W. Penny. Thresholding of Statistical Maps in Functional Neuroimaging Using the False Discovery Rate. C. Genovese et al. NeuroImage 15, 870–878 (2002).
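The steps above (the Benjamini–Hochberg procedure) can be sketched directly; the p-values below are made up for illustration:

```python
import numpy as np

def fdr_threshold(pvals, q):
    """Benjamini-Hochberg: threshold at the largest i with p(i) <= (i/N) * q."""
    p = np.sort(np.asarray(pvals, dtype=float))
    N = len(p)
    below = p <= np.arange(1, N + 1) / N * q
    if not below.any():
        return None                        # no voxel survives
    return p[np.nonzero(below)[0].max()]   # threshold at p(R)

pv = [0.001, 0.01, 0.02, 0.2, 0.5]         # illustrative p-values
thresh = fdr_threshold(pv, q=0.05)         # the first three survive
```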

56 Conclusion
– We should not use uncorrected p-values: uncorrected = high false-positive rate, Bonferroni = high false-negative rate
– We can use Random Field Theory (RFT) to ‘correct’ p-values; it combines probability, topology and extreme-value statistics
– RFT requires FWHM > 3 voxels
– We only need to correct for the volume of interest
– Cluster-level inference can detect changes missed by voxel-level inference
– We have only covered the basics of RFT, which has broad-ranging applications

57

58 Acknowledgements
– Will Penny’s slides: http://www.fil.ion.ucl.ac.uk/spm/course/slides05/ppt/infer.ppt#324,1,Random Field Theory
– Jean-Etienne’s slides: http://www.poirrier.be/~jean-etienne/presentations/rft/spm-rft-slides-poirrier06.pdf
– Previous MfD slides: http://www.fil.ion.ucl.ac.uk/~jchumb/MfDweb.htm
– Brett, Penny & Kiebel. An Introduction to Random Field Theory. Chapter 14 in Human Brain Mapping. http://www.icn.ucl.ac.uk/courses/MATLAB-Tutorials/spm_icn_2008/ParamThresh.ppt
– The Geometry of Random Fields, Robert Adler (1981), at http://www.dleex.com/read/?4204
– Geometry of Random Images, Keith Worsley, Chance, 1996; 9(1):27-39
– A Unified Statistical Approach for Determining Significant Signals in Images of Cerebral Activation, Worsley et al. Human Brain Mapping, 4:58-73
– Medical Image Analysis: http://www.stat.wisc.edu/~mchung/teaching/MIA/lectures/MIA.lecture10.random.field.feb.21.2007.pdf
– Calculating the Topology of Large-Scale Structure, R. J. Massey: http://www.astro.caltech.edu/~rjm/thesis/Massey_MSci_LSSTopology.pdf
– Topological Characterization of Porous Media, Hans-Jörg Vogel, Morphology of Condensed Matter, 2002, Volume 600: 75-92

59 ….Any Questions?

