M/EEG Study Design and Pre-Processing of Data
Methods for Dummies: M/EEG Study Design and Pre-Processing of Data. Marwa Zein & Yuki Shimura, with expert Dr. Thomas Parr. Hi everyone, today Marwa and I will talk about MEG and EEG experimental design and pre-processing of data.
Steps to conduct a scientific experiment
1) Theoretical question 2) Working hypotheses. Before jumping into the experimental design for a particular topic, let's review the basics of a scientific experiment. Usually, we start with a theoretical question; for example, many of us at Max Planck study a question such as:
Steps to conduct a scientific experiment
1) Theoretical question 2) Working hypotheses. "How does the brain encode information?" We then formulate a hypothesis based on the literature. So far we know that when one learns an association between an environment and other aspects of the situation, high-gamma rhythms are observed during the post-learning period in the hippocampus and rhinal cortex,
Steps to conduct a scientific experiment
1) Theoretical question 2) Working hypotheses. How does the brain encode information? Gamma: awake, focused attention. And high-gamma in the rhinal cortex is positively correlated with subsequent memory performance. Based on this, we could hypothesize that the more high-gamma power in the hippocampal area increases after encoding a new environment, the faster the participant will learn a new location in the subsequent training set, once the environment has become familiar.
Steps to conduct a scientific experiment
1) Theoretical question 2) Working hypotheses. How does the brain encode information? Hypothesis: gamma power in the hippocampal area ↑ → learning speed of the participant ↑.
Steps to conduct a scientific experiment
1) Theoretical question 2) Working hypotheses 3) Choose an experimental paradigm (relevant methods) 4) Measuring 5) Analysis 6) Interpretation of results. We then choose an experimental paradigm, whether block or event-related design, which we covered in the fMRI section of the course. Say we choose a block design and conduct an MEG experiment; the rest follows: analyses and interpretation of results.
Exploratory Research
That was a very linear, hypothesis-based approach. But when we don't have a solid foundation of previous research upon which to build our next study, there is a high chance of a false positive, wherein we incorrectly reject Fisher's null hypothesis (which states there is no experimental effect) and conclude that some association exists between two entities. There is discourse about such issues in brain science, like the paper highlighted here.
Exploratory Research
The paper suggests several tips to avoid false positives:
Keep potential dimensions for results broad: run a large battery of tasks and record using multiple modalities, to maximize the room for exploration.
Be wary of spurious results: the nature of this research produces large numbers of tests, so the risk of a false positive is high.
Set your "researcher's degrees of freedom" prior to testing: predesignate sample sizes, criteria for trial rejection, and the independent variables to test before conducting experiments, so that the experiment has a practical, pre-committed implementation.
Most importantly, exploratory research produces hypotheses; we need to follow them up with more rigorous, formal testing.
Tips for Statistical Analysis
Bayesian statistics can provide the probability of an effect being of a certain size, but require a prior probability distribution. Bayesian statistics are an alternative way to perform statistics that helps avoid p-hacking. Based on model comparison, trying to find the model that best explains the data, this method allows us to test as many participants as we want: the more data we acquire, the better we can tell which model explains the data best (and there is software called JASP you can use for conducting Bayesian statistics). The paper by Etz explains the concept very well.
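To make the model-comparison idea concrete, here is a minimal sketch (not from the talk; simulated data and made-up variable names) of the BIC approximation to a Bayes factor for a simple regression effect:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n = 40
x = rng.normal(size=n)                  # e.g., gamma power per participant
y = 0.5 * x + rng.normal(size=n)        # e.g., learning speed

# Residual sums of squares for the null (intercept-only) and alternative models
rss_null = np.sum((y - y.mean()) ** 2)
fit = stats.linregress(x, y)
rss_alt = np.sum((y - (fit.intercept + fit.slope * x)) ** 2)

# BIC = n*log(RSS/n) + k*log(n), with k free parameters per model
bic_null = n * np.log(rss_null / n) + 1 * np.log(n)
bic_alt = n * np.log(rss_alt / n) + 2 * np.log(n)

# BIC approximation to the Bayes factor in favour of the alternative model
bf10 = np.exp((bic_null - bic_alt) / 2)
print(f"BF10 ~ {bf10:.1f} (evidence for an effect of x on y)")
```

A BF10 well above 1 favours the model with the effect. JASP computes Bayes factors with proper default priors; the BIC shortcut here is only a rough approximation.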
Tips for Statistical Analysis
Sequential analysis: a statistical analysis where the sample size is not fixed in advance. Instead, data are evaluated as they are collected, and further sampling is stopped according to a pre-defined stopping rule as soon as significant results are observed. We can use sequential analyses to adjust p-values as we increase the number of subjects, when we have no prior data for a power analysis; there is a package available in SPSS, and the paper explains the math behind it. Sequential analysis not only avoids collecting too many subjects, which is economical, but also avoids p-hacking, because we adjust the p-value for each cohort of data collection.
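To illustrate the stopping-rule logic, here is a simulation sketch (not the SPSS package mentioned above; the per-look alpha of 0.0158 is the standard Pocock boundary for five looks at an overall alpha of .05, and the data stream is simulated):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
POCOCK_ALPHA = 0.0158      # nominal alpha per look, 5 looks, overall alpha = .05
BATCH, MAX_LOOKS = 10, 5   # collect 10 participants per look, at most 5 looks

data = []
for look in range(1, MAX_LOOKS + 1):
    data.extend(rng.normal(loc=0.4, scale=1.0, size=BATCH))  # simulated d = 0.4
    t, p = stats.ttest_1samp(data, popmean=0.0)
    print(f"look {look}: n = {len(data)}, p = {p:.4f}")
    if p < POCOCK_ALPHA:
        print("stop: boundary crossed, effect declared significant")
        break
else:
    print("stop: maximum sample size reached without crossing the boundary")
```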
Temporal and spatial resolution of neuroimaging techniques
Now we will move on to what it takes to design MEG and EEG experiments. As Magda discussed last week, different neuroimaging techniques have different temporal and spatial resolutions.
Temporal and spatial resolution of neuroimaging techniques
MEG and EEG have excellent temporal resolution compared with other methods used in humans, such as fMRI, but poorer spatial resolution, although spatial localization, even of deep structures, can now be improved through source reconstruction, especially in MEG.
Choice of Methods: MEG vs. EEG
Assuming we want temporal resolution, we choose MEG/EEG. We covered the basics of these signals and how the techniques differ in a previous session, so I won't go into too much detail here. Comparing the overall pros and cons, the preparation time for MEG is much shorter than for EEG: MEG does not require direct contact between the sensors and the skin, whereas EEG requires gel. Unlike the EEG signal, the MEG signal is little distorted by the scalp, but MEG is not portable.
Recommendations and Caveats
Minimum number of trials: a trade-off between a minimum that depends on the physiology of the brain regions involved, and a maximum that depends on how long participants can perform the task at the required performance level while maintaining a stable head position, avoiding eye blinks, etc. Say we have chosen our method; I will now cover several important things to consider when designing an experiment. There is no firm rule of thumb for the number of trials in MEG; some say at least twenty trials per block of stimuli, but it largely depends on the kind of effect you are looking for.
Recommendations and Caveats
Inter-stimulus intervals (ISIs): by jittering the ISI over an interval of ~1/f, one can suppress, in the average, non-phase-locked oscillations at frequency f and above. It is often advisable to introduce a random element into the inter-stimulus intervals of event-related paradigms, because jittering ISIs reduces effects of expectancy on brain responses: anticipating the upcoming stimulus is known to modulate brain activity.
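As a small sketch of ISI jittering (made-up parameter values): draw each ISI uniformly from a window of width 1/f around a base interval, so that oscillations at f and above have roughly uniform phase across trials and cancel in the average.

```python
import numpy as np

rng = np.random.default_rng(42)

def jittered_isis(n_trials, base_isi=1.5, f_cut=10.0):
    """Draw ISIs uniformly over a window of width 1/f_cut centred on base_isi.

    Oscillations at f_cut and above then have ~uniform phase across trials,
    so they average out in the evoked response.
    """
    width = 1.0 / f_cut                      # e.g., 0.1 s for a 10 Hz cutoff
    return base_isi + rng.uniform(-width / 2, width / 2, size=n_trials)

isis = jittered_isis(n_trials=200)
print(isis.min(), isis.max())                # ISIs span roughly 1.45 to 1.55 s
```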
Recommendations and Caveats
Application of filters and baseline correction: data filtering is a conceptually simple though powerful technique for extracting signals within a predefined frequency band of interest. Applying a filter to the data presupposes that the signals of interest will be mostly preserved, while other frequency components, supposedly of no interest, are attenuated.
Analyses of M/EEG signal: Evoked potentials
Lastly for experimental design, I will talk about two examples of M/EEG analyses. The first is evoked potentials.
Evoked potentials: Average over a LOT of trials to get a signal
For instance, participants see many trials with a face appearing on the screen. Each trial alone does not show anything special, but if you average over all the trials (going from the trial-by-trial signal to the average), you get a component called the N170, a negative deflection at 170 ms that is typical of face processing.
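A minimal simulation sketch of this averaging logic (invented amplitudes and noise levels): a deflection invisible on single trials emerges cleanly in the average, because the noise shrinks with the square root of the number of trials.

```python
import numpy as np

rng = np.random.default_rng(7)
fs = 500                                   # sampling rate in Hz
t = np.arange(-0.2, 0.6, 1 / fs)           # epoch from -200 ms to +600 ms
n_trials = 300

# Simulated N170: a negative Gaussian deflection peaking at 170 ms
erp = -2e-6 * np.exp(-((t - 0.17) ** 2) / (2 * 0.02 ** 2))

# Each trial = ERP + noise much larger than the ERP itself
trials = erp + rng.normal(scale=10e-6, size=(n_trials, t.size))

average = trials.mean(axis=0)              # averaging suppresses noise by sqrt(N)
peak_ms = t[np.argmin(average)] * 1000
print(f"negative peak at ~{peak_ms:.0f} ms")
```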
Interest: compare different types of trials
If you show participants cars on some trials and faces on others, you can see that the N170 is much larger for faces than for cars. The interest of evoked potentials is to compare different conditions, to see whether specific conditions trigger specific, or larger, components. Georges et al. 2005
Analyses of M/EEG signal: Time-frequency analyses
The second example is time-frequency analysis.
Brain rhythms
Raw EEG. Gamma: awake, focused attention. Alpha: awake, relaxed, diminished attention. Delta: sleep. Another powerful analysis of EEG/MEG data is to look at brain rhythms. In a raw EEG recording, you can see different oscillations. These are typically associated with different functions: gamma with being awake and focused, alpha with being relaxed with diminished attention (our fear when we scan subjects, as it means they are falling asleep), and delta with sleep.
Brain rhythms: Oscillatory signals
We quantify these in terms of how many oscillations are observed in one second (Hz).
Brain rhythms
To properly look at the data, we need to apply models that quantify the rhythms, for example the Fourier transform (simple oscillatory functions such as sines and cosines). A spectral analysis is like a GLM: it quantifies the power of each frequency by fitting simple oscillatory functions.
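As a concrete sketch (simulated signal, made-up parameters), Welch's method estimates the power at each frequency via the FFT, which is exactly this fitting of sines and cosines:

```python
import numpy as np
from scipy import signal

rng = np.random.default_rng(3)
fs = 250                                        # sampling rate in Hz
t = np.arange(0, 30, 1 / fs)                    # 30 s of data

# Simulated EEG: a 10 Hz alpha rhythm buried in broadband noise
eeg = 2.0 * np.sin(2 * np.pi * 10 * t) + rng.normal(scale=3.0, size=t.size)

# Welch's method: average FFT power over overlapping 2 s windows
freqs, psd = signal.welch(eeg, fs=fs, nperseg=2 * fs)
print(f"peak at {freqs[np.argmax(psd)]:.1f} Hz")  # ~10 Hz
```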
Brain rhythms
This is what a time-frequency plot looks like after this type of analysis: here you can see an increase in gamma power related to attention and awareness during the awake phase.
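A sketch of how such a map can be computed (simulated data; a short-time Fourier transform is used here, though wavelet methods are also common): power is estimated in sliding windows, yielding a time-by-frequency matrix.

```python
import numpy as np
from scipy import signal

rng = np.random.default_rng(4)
fs = 250
t = np.arange(0, 20, 1 / fs)

# Simulated recording: a 40 Hz gamma burst appears in the second half
eeg = rng.normal(scale=1.0, size=t.size)
eeg[t >= 10] += 1.5 * np.sin(2 * np.pi * 40 * t[t >= 10])

# Sliding-window spectrogram: power as a function of time and frequency
freqs, times, sxx = signal.spectrogram(eeg, fs=fs, nperseg=fs)
gamma = sxx[(freqs >= 35) & (freqs <= 45)].mean(axis=0)
print("gamma power, first vs second half:",
      gamma[times < 10].mean(), gamma[times >= 10].mean())
```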
Brain rhythms
Donner et al., 2009. Here you can see how there is a suppression in the lower beta band before a motor action, while there is an increase in the gamma band, and this is lateralized: it happens in the hemisphere contralateral to the hand used to execute the action.
Analyses of EEG signal: Model/regression-based analyses
Example: Facial expressions of threat
El Zein, Wyart, Grèzes, 2015. Stimuli: faces expressing fearful, angry, or neutral emotional expressions, with direct or averted gaze making the expression relevant or irrelevant; morphed faces provide graded evidence for anger or for fear.
Decision variable: emotion strength
Take advantage of the parametric modulation of emotion strength, from neutral faces to increasing evidence for anger or for fear.
Neural encoding of emotional expressions
Neural 'encoding' approach: linear regression of model variables against EEG signals. Variable of interest: emotion strength x (7 morph levels).
Results at electrode Pz: 500 ms after stimulus, t(23) = 9.6, p < 0.001; 5 ms before stimulus, t(23) = −0.6, p > 0.5. In the other direction?
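A minimal sketch of this encoding analysis on simulated single-electrode data (effect size and timing are invented): regress EEG amplitude on emotion strength at every time point and keep the t-statistic of the slope.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(5)
n_trials, n_times = 500, 200
times = np.linspace(-0.1, 0.7, n_times)            # seconds from stimulus onset

# Trial-wise regressor: emotion strength, 7 morph levels
strength = rng.integers(1, 8, size=n_trials).astype(float)

# Simulated EEG at one electrode: strength is encoded only around 500 ms
eeg = rng.normal(size=(n_trials, n_times))
window = (times > 0.45) & (times < 0.55)
eeg[:, window] += 0.3 * strength[:, None]

# Per-timepoint linear regression of EEG amplitude on emotion strength;
# t-statistic for the slope is slope / standard error of the slope
fits = [stats.linregress(strength, eeg[:, i]) for i in range(n_times)]
t_values = np.array([f.slope / f.stderr for f in fits])
print(f"max |t| at {times[np.argmax(np.abs(t_values))] * 1000:.0f} ms")
```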
Neural encoding of emotional expressions
Encoding strength across the scalp, 500 ms after stimulus.
Neural encoding of emotional expressions
Does gaze direction influence emotion processing? The time course of encoding is informative about stimulus processing (here, 500 ms after stimulus onset).
Example: Facial expressions of threat
Main effect of emotion: at 100 ms and 170 ms. Main effect of intensity: after 400 ms (LPP).
Source localization / Source reconstruction
A mathematical model that takes into account the different orientations of the gyri and the propagation characteristics of the different tissues.
Pre-processing of signal: Steps
Resampling (typical recordings at 1000 Hz / 600 Hz)
Re-referencing (if EEG)
Epoching (around conditions of interest: stimulus onset, response onset)
Removing baseline (for example, the 200 ms before stimulus onset)
Software to use: EEGLAB, FieldTrip, SPM, Brainstorm. (A sketch of these steps follows below.)
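In MNE-Python, for example, these steps map onto a few calls; this is a minimal sketch, assuming a raw FIF file with a stimulus channel (the file name and event code are placeholders):

```python
import mne

# Load a raw recording (placeholder file name)
raw = mne.io.read_raw_fif("subject01_raw.fif", preload=True)

raw.resample(250)                          # downsample from e.g. 1000 Hz
raw.set_eeg_reference("average")           # re-reference (EEG only)

# Epoch around the condition of interest (event code 1 is a placeholder),
# with a -200 ms to 0 ms baseline subtracted from each epoch
events = mne.find_events(raw)              # assumes a stim channel is present
epochs = mne.Epochs(raw, events, event_id=1, tmin=-0.2, tmax=0.8,
                    baseline=(-0.2, 0.0), preload=True)
evoked = epochs.average()                  # evoked response for this condition
```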
Pre-processing of signal: Filtering
Filtering excludes specific frequencies:
Low-pass filter (~30 Hz): removes high-frequency noise, e.g., muscle activity
High-pass filter: reduces slow drifts, e.g., perspiration artefacts
Notch filter (50/60 Hz): removes electrical line noise
Band-pass filter: passes the signal only within a specific frequency range
Too much filtering can alter the data and lose temporal precision, so we really want to eliminate as much noise as possible before collecting our data. (A filtering sketch follows below.)
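A minimal SciPy sketch of these filters (the 0.1 Hz high-pass cutoff is an assumed, common choice; zero-phase filtering with filtfilt avoids shifting components in time):

```python
import numpy as np
from scipy import signal

fs = 500                                    # sampling rate in Hz

def bandpass(data, low=0.1, high=30.0):
    """High-pass at `low` (slow drifts) and low-pass at `high` (muscle noise)."""
    sos = signal.butter(4, [low, high], btype="bandpass", fs=fs, output="sos")
    return signal.sosfiltfilt(sos, data)    # zero-phase: no temporal shift

def notch(data, line_freq=50.0, q=30.0):
    """Remove electrical line noise at 50 Hz (60 Hz in some countries)."""
    b, a = signal.iirnotch(line_freq, q, fs=fs)
    return signal.filtfilt(b, a, data)

eeg = np.random.default_rng(6).normal(size=10 * fs)   # placeholder signal
clean = notch(bandpass(eeg))
```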
Pre-processing of signal: Artefacts
Remove trials with artefacts, such as blinks and alpha activity.
Pre-processing of signal: Artefacts
Examples in the recording: blinks; a bad electrode.
Pre-processing of signal: Artefacts
Examples in the recording: heart artefact; muscle artefact.
Pre-processing of signal: Artefact Detection and Removal
Inspect data and reject artefacts for each participant.
Independent Component Analysis (ICA) removes artefacts; useful for blink-related activity (see the sketch below).
Record EOG to remove eye-movement artefacts from the signal.
Manual vs. automatic rejection?
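In MNE-Python, for instance, blink removal with ICA can look like the following sketch; it assumes the `raw` object from the earlier sketch and an EOG channel in the recording, and the component selection should always be verified by visual inspection.

```python
from mne.preprocessing import ICA

# Fit ICA on high-pass-filtered data (slow drifts hurt the decomposition)
ica = ICA(n_components=20, random_state=97)
ica.fit(raw.copy().filter(l_freq=1.0, h_freq=None))

# Find components that correlate with the EOG channel (blinks, eye movements)
eog_indices, eog_scores = ica.find_bads_eog(raw)
ica.exclude = eog_indices

# Reconstruct the signal without the excluded components
ica.apply(raw)
```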
Technical considerations
Synchronisation of recording equipment and stimulus presentation is required to ensure everything is precisely time-locked (a photodiode can be used to fix delays)
Event-related studies need ISIs for baseline calculations; baseline length for time-frequency analysis; presentation times/jitters
Sampling frequency should be ~2x the highest frequency of interest
Source localisation?
Choice of reference electrodes (EEG)

Sampling. We are converting an analog signal to a digital representation. According to the Nyquist criterion, for lossless digitization the sampling rate should be at least twice the maximum frequency of interest; in practice, several times higher is often better. What is your computational budget, and how much data can you handle? This matters if you are interested in high-frequency bands in the EEG.

Event-related experiments. The minimum duration between two stimuli defines the maximum epoch length you can analyse after the stimulus. Design the experiment so that each epoch always includes the entire evoked response, plus an additional segment that can serve as the baseline for the following epoch. Remember that the baseline of some epochs may contain motor and somatosensory components (if you incorporate button presses, for instance). For data processing it is always better to have longer ISIs, but this means increasing the duration of the experiment or decreasing the number of repetitions, so it is a trade-off to consider.

Source localisation. Signal-processing techniques are applied to estimate the current sources inside the brain that best fit the EEG data. The accuracy with which a source can be located is affected by a number of factors, including head-modelling errors, source-modelling errors, and EEG noise (instrumental or biological). The fields of each brain region propagate in three dimensions, in a dipolar pattern that depends on the orientation of the cortical sources. Activity recorded at any head-surface sensor reflects a summation of all active sources in the brain, superposed as a function of their distance, orientation, and the resistivity of the underlying tissues. Realistic source analysis of EEG potentials therefore requires objective biophysical models that incorporate the exact positions of the sensors as well as the properties of head and brain anatomy, so that appropriate inverse techniques can map surface potentials to cortical sources. Spatial sampling may be suboptimal with conventional electrode montages (fewer than 128 channels).

Referencing. In EEG, voltages recorded at each electrode are relative to voltages recorded at other electrodes. Theoretically the reference could be anywhere, but it needs to be chosen carefully, because any activity at the reference electrode will be reflected in the activity at all other electrodes. Often the mastoids are chosen as references: while close to the other electrodes, they record less signal from the brain. However, the very fact of being close to the brain means the mastoid signal does contain some neural signal, so the mastoids are not ideal references. For high-density EEG (100+ electrodes), the average of activity at all electrodes is often chosen as the reference. As a rule of thumb, the reference electrode should not be placed close to an electrode where you expect your main effects. For example, Cz is often used as a reference, but it should not be if you expect task-related activity centred on that electrode. It is also not advisable to reference your data to an electrode over one hemisphere, as this can introduce a laterality bias into your data.
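As a small numeric illustration of average re-referencing (a sketch on a made-up channels-by-samples array): subtracting the instantaneous mean across electrodes makes each channel's voltage relative to the common average.

```python
import numpy as np

rng = np.random.default_rng(8)
n_channels, n_samples = 64, 5000
eeg = rng.normal(size=(n_channels, n_samples))   # placeholder recording

# Average reference: subtract the mean over channels at every time point
eeg_avg_ref = eeg - eeg.mean(axis=0, keepdims=True)

# After re-referencing, the channels sum to zero at each sample
assert np.allclose(eeg_avg_ref.sum(axis=0), 0.0)
```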
Important things to remember for study design
A sufficient number of trials (per condition) for reliable statistics, versus the length of the study (participants get tired, and the setting is uncomfortable)
A sufficient number of participants, planned from the start (power analysis, as sketched below, or Bayesian statistics)
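A minimal sketch of such a power analysis (the effect size d = 0.5 is an assumption; uses statsmodels):

```python
import math
from statsmodels.stats.power import TTestPower

# How many participants are needed to detect an assumed within-subject
# effect of d = 0.5 with a one-sample t-test at 80% power, alpha = .05?
n = TTestPower().solve_power(effect_size=0.5, alpha=0.05, power=0.8)
print(f"required participants: {math.ceil(n)}")   # about 34
```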
Compare and Contrast Tutorials
Different disciplines, different departments, and different PIs will likely state their own best practices, which can result in conflicting advice. If you are careful and thorough with your design and data collection, you will have the freedom to choose the preprocessing pipeline that makes the most sense for you and your project. Compare software packages; they have different defaults: EEGLAB, FieldTrip, SPM, Brainstorm.