General Linear Model — Lúcia Garrido and Marieke Schölvinck, ICN

Observed data
After preprocessing, Y is a matrix of BOLD signals: each column represents a single voxel sampled at successive time points (signal intensity over time).

Univariate analysis
The GLM works in two steps: it performs an analysis of variance separately at each voxel (univariate), and then computes a t-statistic from the results of that analysis for each voxel.

Example
X can contain values quantifying an experimental variable.

Parameters & error
The fitted line is a 'model' of the data: y = βx + c + ε (here, slope β = 0.23 and intercept c = 54.5).
β is the slope of the line relating x to y: 'how much of x is needed to approximate y?'
ε is the residual error: the deviations of the data from the line, assumed to be independently, identically and normally distributed.
The best estimate of β is the one that minimises ε.
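
To make this concrete, here is a minimal MATLAB sketch (with invented numbers, not data from the slides) that fits a straight line by least squares and extracts the residuals:

% Simulate noisy data around a line with slope 0.23 and intercept 54.5
x = (1:20)';
y = 0.23*x + 54.5 + randn(20,1);

X = [x ones(20,1)];   % design matrix: one column for x, one for the intercept c
b = X \ y;            % least-squares estimates [beta; c]
e = y - X*b;          % residual error: deviations of the data from the fitted line

fprintf('estimated slope = %.2f, intercept = %.2f\n', b(1), b(2));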

Multiple regression
Simple regression has a single predictor; multiple regression has more than one predictor/regressor (and therefore more than one β):
y = β1*x1 + β2*x2 + c + ε

Matrix formulation
Write out the equation for each observation of the variable Y, from 1 to J:
Y1 = X11 β1 + … + X1l βl + … + X1L βL + ε1
Yj = Xj1 β1 + … + Xjl βl + … + XjL βL + εj
YJ = XJ1 β1 + … + XJl βl + … + XJL βL + εJ
These simultaneous equations can be turned into matrix form to give a single equation:
[Y1; …; Yj; …; YJ] = [X11 … X1l … X1L; …; Xj1 … Xjl … XjL; …; XJ1 … XJl … XJL] · [β1; …; βl; …; βL] + [ε1; …; εj; …; εJ]
Y = X β + ε
Observed data = Design matrix × Parameters + Residuals/Error
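
A small MATLAB sketch of the same stacking (all numbers invented for illustration): J separate equations become one matrix product.

J = 100; L = 3;                 % J observations, L regressors
X = [randn(J,2) ones(J,1)];     % design matrix; last column is the constant term
beta = [2; -1; 5];              % 'true' parameters
epsilon = 0.5*randn(J,1);       % independent, identically distributed Gaussian error
Y = X*beta + epsilon;           % Y = X*beta + epsilon replaces J separate equations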

GLM and fMRI: Y = Xβ + ε
Observed data: Y is the BOLD signal at various time points at a single voxel.
Design matrix: X contains the components (regressors) which explain the observed data, i.e. the BOLD time series for that voxel.
Parameters: β defines the contribution of each component of the design matrix to the value of Y; estimated so as to minimise the error ε (least sum of squares).
Error: ε is the difference between the observed data, Y, and that predicted by the model, Xβ.

Design matrix
The matrix holds the values of X; different columns correspond to different predictors (e.g. x1, x2 and a constant c).

Parameter estimation
Residuals: e = Y − Ŷ = Y − Xβ
Residual sum of squares: S = Σj ej² = eᵀe = (Y − Xβ)ᵀ(Y − Xβ)
The least-squares estimates are the parameter values which minimise the residual sum of squares: take the derivative and solve ∂S/∂β = 0, giving
β = (XᵀX)⁻¹ XᵀY   (provided XᵀX is invertible)
Matlab magic: >> B = inv(X' * X) * X' * Y   % or, more robustly, B = X \ Y
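
A self-contained sketch of the estimation step (simulated data; in MATLAB the backslash operator is the idiomatic way to obtain the same least-squares solution):

% Simulated single-voxel data, purely for illustration
J = 100;
X = [randn(J,2) ones(J,1)];          % design matrix: 2 regressors + constant
Y = X*[2; -1; 5] + 0.5*randn(J,1);   % observed data

beta = inv(X'*X) * X' * Y;           % literal (X'X)^-1 X'Y from the slide
beta = X \ Y;                        % equivalent, numerically preferable
e = Y - X*beta;                      % residuals
S = e'*e;                            % residual sum of squares, minimised by beta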

Statistical inference
A β value is estimated for each column of the design matrix.
Test whether the slope is significantly different from zero (the null hypothesis): t-statistic = β / standard error of β.
With many βs, specific questions are asked via contrasts (the subject of another talk…), using t-tests or F-tests depending on the nature of the question.
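
A hedged single-voxel sketch of how a t-statistic on one β could be computed. The contrast vector c and the simulated data are assumptions for illustration, and tcdf requires the Statistics Toolbox:

J = 100;
X = [randn(J,2) ones(J,1)];              % design matrix
Y = X*[2; -1; 5] + 0.5*randn(J,1);       % observed data
beta = X \ Y;                            % parameter estimates
e = Y - X*beta;                          % residuals

c = [1 0 0]';                            % contrast: test the first regressor
df = J - rank(X);                        % residual degrees of freedom
sigma2 = (e'*e) / df;                    % error variance estimate
se = sqrt(sigma2 * c' * inv(X'*X) * c);  % standard error of c'*beta
t = (c'*beta) / se;                      % t-statistic with df degrees of freedom
p = 1 - tcdf(t, df);                     % one-tailed p-value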

Continuous predictors
X can contain values quantifying an experimental variable.

Binary predictors
X can contain values distinguishing experimental conditions.

Covariates vs. conditions
Covariates: parametric modulation of an independent variable, e.g. task difficulty rated 1 to 6.
Conditions: 'dummy' codes identify the different levels of an experimental factor, e.g. integers 0 or 1 for 'off' or 'on'.
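
A tiny sketch of what the two kinds of regressor look like as design-matrix columns (values invented for illustration):

difficulty = [3 1 6 2 5 4]';           % covariate: task difficulty 1-6, one value per scan
on_off     = [1 0 0 1 1 0]';           % dummy code: 1 = condition 'on', 0 = 'off'
X = [difficulty on_off ones(6,1)];     % design matrix with a constant (baseline) column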

Ways to improve your model: modelling haemodynamics
The brain does not just switch on and off! Reshape (convolve) the regressors to resemble the haemodynamic response function (HRF), turning the original stimulus function into a convolved regressor.
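
A minimal sketch of the convolution step. The single-gamma HRF below is a simple stand-in; in SPM the canonical HRF (e.g. spm_hrf) would normally be used. The block timings and TR are invented for illustration.

TR = 2;                               % repetition time in seconds (assumed)
t = (0:TR:30)';                       % time points covering the HRF
hrf = (t.^5 .* exp(-t)) / gamma(6);   % simple single-gamma HRF, peaking around 5 s
hrf = hrf / sum(hrf);                 % normalise to unit area

boxcar = zeros(100,1);                % 100 scans
boxcar([11:20 51:60]) = 1;            % two 'on' blocks of 10 scans each

reg = conv(boxcar, hrf);              % convolve the stimulus function with the HRF
reg = reg(1:100);                     % trim back to the length of the scan
plot([boxcar reg]);                   % compare original and convolved regressors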

Ways to improve your model: model everything
It is important to model all known variables, even if they are not experimentally interesting, e.g. head movement, block and subject effects. This minimises the residual error variance and gives better statistics. The effects of interest are the regressors you are actually interested in; the others (e.g. subject effects, global activity or movement) are nuisance regressors.
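
As a sketch, nuisance variables simply become extra columns of the design matrix (the movement parameters below are random placeholders, purely for illustration):

nScans = 100;
task   = randn(nScans,1);                % effect of interest (e.g. a convolved regressor)
motion = randn(nScans,6);                % placeholder for 6 realignment parameters
X = [task motion ones(nScans,1)];        % interest + nuisance + constant columns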

fMRI characteristics which may increase error
Variable gain and scanner drift: variations of signal amplitude from volume to volume and between scanning sessions; addressed by proportional and grand-mean scaling of the data, and by high-pass filtering in the design matrix.
Serial temporal correlations (e.g. breathing, heartbeat): activity at one time point correlates with nearby time points, so the error term must be adjusted.
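
As one example, high-pass filtering can be implemented by adding low-frequency discrete cosine regressors to the design matrix. The sketch below uses a simplified (unnormalised) cosine basis and SPM's default 128 s cut-off, with an assumed TR of 2 s:

nScans = 100; TR = 2; cutoff = 128;           % cut-off period in seconds
K = fix(2 * nScans * TR / cutoff);            % number of drift components to model
n = (0:nScans-1)';
drift = cos(pi * (n + 0.5) * (1:K) / nScans); % one low-frequency cosine per column
X = [randn(nScans,1) ones(nScans,1) drift];   % placeholder task + constant + drift regressors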

Summary
The General Linear Model allows you to find the parameters, β, which provide the best fit to your data, Y.
The optimal parameter estimates are found by minimising the sum of squared differences between the predicted model and the observed data.
The design matrix in SPM contains the information about the factors, X, which may explain the observed data.
Once we have obtained the βs at each voxel, we can use them in various statistical tests.

Thanks to… previous MfD talks: Elliot Freeman (2005), Davina Bristow and Beatriz Calvo (2004).

Summary: Y = Xβ + ε
Observed data: SPM uses a mass-univariate approach, that is, each voxel is treated as a separate column vector of data; Y is the BOLD signal at various time points at a single voxel.
Design matrix: several components which explain the observed data, i.e. the BOLD time series for the voxel: timing information (onset vectors, Oj^m, and duration vectors, Dj^m), the HRF, h^m, which describes the shape of the expected BOLD response over time, and other regressors, e.g. realignment parameters.
Parameters: define the contribution of each component of the design matrix to the value of Y; estimated so as to minimise the error ε (least sum of squares).
Error: the difference between the observed data, Y, and that predicted by the model, Xβ; not assumed to be spherical in fMRI.