Spectral Methods in EEG Analysis Steven L. Bressler Cognitive Neurodynamics Laboratory Center for Complex Systems & Brain Sciences Department of Psychology.


Spectral Methods in EEG Analysis Steven L. Bressler Cognitive Neurodynamics Laboratory Center for Complex Systems & Brain Sciences Department of Psychology Florida Atlantic University

Overview
- Fourier Analysis
- Spectral Analysis of the EEG
- Cross-Correlation Analysis
- Spectral Coherence
- Parametric Spectral Analysis
- AutoRegressive Modeling
- Spectral Analysis by AR Modeling
- Spectral Granger Causality
- BSMART

Fourier Analysis Joseph Fourier (1768-1830) was a French mathematician who is credited with first introducing the representation of a mathematical function as a sum of trigonometric basis functions.

Spectral Analysis of the EEG (A) 16-second EEG segment from a sleeping rat. (B-E) Corresponding power time series in the delta, theta, alpha, and beta frequency bands. Log power spectrum of the sleeping rat EEG.
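The band power time series and log power spectrum shown on this slide can be computed nonparametrically with Welch's method. A minimal Python sketch on a synthetic signal (the rat EEG data are not available here; the 200 Hz sampling rate and the 6 Hz theta rhythm are illustrative assumptions):

```python
import numpy as np
from scipy import signal

rng = np.random.default_rng(0)
fs = 200.0                    # assumed sampling rate (Hz)
t = np.arange(0, 16, 1 / fs)  # a 16-second segment, as in the figure
# Synthetic stand-in for the EEG: a 6 Hz theta rhythm buried in white noise
x = np.sin(2 * np.pi * 6 * t) + rng.standard_normal(t.size)

# Welch estimate of the power spectral density
f, pxx = signal.welch(x, fs=fs, nperseg=512)
peak_freq = f[np.argmax(pxx)]  # should land near the 6 Hz rhythm

# Band power, as in panels B-E (conventional band edges)
theta_power = pxx[(f >= 4) & (f < 8)].sum()
beta_power = pxx[(f >= 12) & (f < 30)].sum()
```

On this signal the theta band carries far more power than the beta band, mirroring the dominance of slow rhythms in the sleeping-rat example.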

Correlation Analysis Awake mouse respiratory wave and somatosensory EEG time series. Auto-correlations (left) and cross-correlations (right) between respiratory and EEG time series. Note: Cross-correlation is a function of time lag.
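The lag dependence noted here can be illustrated by recovering a known delay between two series. A short NumPy sketch; the 5-sample delay and noise level are arbitrary choices for the illustration:

```python
import numpy as np

rng = np.random.default_rng(1)
n, true_lag = 1000, 5
x = rng.standard_normal(n)                               # reference series
y = np.roll(x, true_lag) + 0.1 * rng.standard_normal(n)  # x delayed by 5 samples, plus noise

# Full cross-correlation; output index n-1 corresponds to zero lag
xc = np.correlate(y, x, mode="full")
best_lag = int(np.argmax(xc)) - (n - 1)  # lag at which y best matches x
```

The peak of the cross-correlation function sits at the true delay, which is exactly the information a flat (lag-zero-only) correlation would miss.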

Spectral Coherence
Spectral coherence between two time series, x(t) and y(t), is the modulus-squared of the cross-spectral density divided by the product of the auto-spectral densities of x(t) and y(t):
Coh^2(f) = |S_xy(f)|^2 / (S_xx(f) S_yy(f))
1. The coherence is the spectral equivalent of the cross-correlation function.
2. The spectral coherence is bounded by 0 and 1 at all frequencies: 0 ≤ Coh^2(f) ≤ 1.
3. If x and y are independent processes, then the cross-spectral density is 0 at all frequencies, and so is the coherence.
4. If x and y are identical processes, then the auto-spectral densities are equal to each other and to the cross-spectral density, and the coherence is 1 at all frequencies.
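These properties can be checked numerically with SciPy's nonparametric coherence estimator. A sketch under illustrative assumptions (200 Hz sampling, a shared 20 Hz rhythm, independent noise in each channel):

```python
import numpy as np
from scipy import signal

rng = np.random.default_rng(2)
fs = 200.0
t = np.arange(0, 60, 1 / fs)
common = np.sin(2 * np.pi * 20 * t)     # 20 Hz rhythm shared by both channels
x = common + rng.standard_normal(t.size)
y = common + rng.standard_normal(t.size)

# Magnitude-squared coherence, averaged over Welch segments
f, cxy = signal.coherence(x, y, fs=fs, nperseg=256)
coh_at_20 = cxy[np.argmin(np.abs(f - 20))]  # shared-rhythm frequency
coh_at_60 = cxy[np.argmin(np.abs(f - 60))]  # only independent noise here
```

As in the motor-cortex example that follows, coherence is high (near one) only in the narrow band where the two channels share activity and low elsewhere.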

Spectral Coherence The coherence measures the interdependence of processes x and y: it reflects the distribution across frequency of activity that is common to x and y. Example: Two EEGs (e.g. from left and right motor cortices) may be largely independent of each other, but synchronized at times within a narrow frequency range (e.g. during coordinated bilateral movement). In this example, x and y are only interdependent in this frequency range, so the coherence is high (near one) at these frequencies and low (near zero) at all other frequencies.

The Coherence Spectrum is Related to the Distribution of Relative Phase

Coherence and Relative Phase Distribution
Case I: the relative phase Φ_Y - Φ_X has a narrow distribution. The resultant vector sum is large, so the coherence is high.
Case II: the relative phase Φ_Y - Φ_X has a wide distribution. The resultant vector sum is small, so the coherence is low.

Experimental Example: High Coherence vs. Low Coherence

Parametric Spectral Analysis EEG time series are stochastic, i.e. they can be represented as a sequence of related random variables. Statistical spectral analysis treats EEGs as time series data generated by stationary stochastic (random) processes. Unlike nonparametric spectral analysis, which computes spectra directly from time series data by Fourier analysis, parametric spectral analysis derives spectral quantities from a statistical model of the time series. In the model, the EEG at one time is expressed by statistical relations with the EEG from past times. The parametric model is typically autoregressive, meaning that each time series value is modeled as a weighted sum of past values (the weights being considered as the parameters of the model). Parametric modeling allows a precise time-frequency representation of the EEG (Ding et al. 2000; Nalatore & Rangarajan 2009). It also serves as a theoretically sound basis for directed spectral analysis (Ding et al. 2006; Bressler & Seth 2011).

AMVAR Spectral Coherence Profile

The AutoRegressive Model
X_t = a_1 X_{t-1} + a_2 X_{t-2} + a_3 X_{t-3} + … + a_m X_{t-m} + ε_t
where X is a zero-mean stationary stochastic process, the a_i are model coefficients, m is the model order, and ε_t is the white-noise residual error process.
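The model above can be sketched in Python: simulate a stationary AR(2) process and recover its coefficients by least squares (the coefficient values are hypothetical choices for the illustration):

```python
import numpy as np

rng = np.random.default_rng(3)
a_true = np.array([0.6, -0.3])  # hypothetical AR(2) coefficients (stationary)
n = 5000
eps = rng.standard_normal(n)    # white-noise residual process
x = np.zeros(n)
for t in range(2, n):
    x[t] = a_true[0] * x[t - 1] + a_true[1] * x[t - 2] + eps[t]

# Least-squares estimate of the coefficients from the data
X = np.column_stack([x[1:-1], x[:-2]])  # lagged regressors [x_{t-1}, x_{t-2}]
a_hat, *_ = np.linalg.lstsq(X, x[2:], rcond=None)
```

With enough data the estimated weights converge on the generating coefficients, which is what makes the fitted model a usable stand-in for the process in spectral estimation.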

The MultiVariate AutoRegressive (MVAR) Model
X_t = A_1 X_{t-1} + … + A_m X_{t-m} + E_t
Written out for channel i:
x_{i,t} = a_{i,1,1} x_{1,t-1} + a_{i,1,2} x_{1,t-2} + … + a_{i,1,m} x_{1,t-m}
        + a_{i,2,1} x_{2,t-1} + a_{i,2,2} x_{2,t-2} + … + a_{i,2,m} x_{2,t-m}
        + …
        + a_{i,p,1} x_{p,t-1} + a_{i,p,2} x_{p,t-2} + … + a_{i,p,m} x_{p,t-m} + e_{i,t}
where X_t = [x_{1,t}, x_{2,t}, …, x_{p,t}]^T are the p data channels, m is the model order, the A_k are p x p coefficient matrices, and E_t is the white-noise residual error process vector.

MVAR Modeling of Event-Related Neural Time Series
Repeated trials are treated as realizations of a stationary stochastic process. The A_k are obtained by solving the multivariate Yule-Walker equations (of size mp^2), using the Levinson-Wiggins-Robinson algorithm, as implemented by Morf et al. (1978). The model order is determined by parametric testing.
Morf M, Vieira A, Lee D, Kailath T (1978) Recursive multichannel maximum entropy spectral estimation. IEEE Trans Geoscience Electronics 16.

Spectral Analysis by MVAR Modeling
The Spectral Matrix is defined as:
S(f) = <X(f) X*(f)> = H(f) Σ H*(f)
where * denotes matrix transposition and complex conjugation; Σ is the covariance matrix of E_t; and H(f) = (I - A_1 e^{-2πif} - … - A_m e^{-2πifm})^{-1} is the transfer function of the system.
The Power Spectrum of channel k is S_kk(f), which is the k-th diagonal element of the spectral matrix.
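The construction of S(f) from the model coefficients can be sketched in a few lines of NumPy. The helper name `spectral_matrix` and the bivariate VAR(1) coefficients are hypothetical; frequencies are normalized (fs = 1):

```python
import numpy as np

def spectral_matrix(A, Sigma, f, fs=1.0):
    """S(f) = H(f) Sigma H*(f) for the MVAR model X_t = sum_k A[k] X_{t-k} + E_t."""
    p = Sigma.shape[0]
    Af = np.eye(p, dtype=complex)
    for k, Ak in enumerate(A, start=1):
        Af -= Ak * np.exp(-2j * np.pi * f * k / fs)  # I - sum_k A_k e^{-2*pi*i*f*k}
    H = np.linalg.inv(Af)                            # transfer function H(f)
    return H @ Sigma @ H.conj().T

# Hypothetical bivariate VAR(1): channel 1 drives channel 2
A = [np.array([[0.5, 0.0],
               [0.4, 0.3]])]
Sigma = np.eye(2)            # residual covariance
S = spectral_matrix(A, Sigma, f=0.1)
# Coherence of the two channels at this frequency, from the spectral matrix
coh = np.abs(S[0, 1]) ** 2 / (S[0, 0].real * S[1, 1].real)
```

The diagonal entries give each channel's power spectrum, and the result is Hermitian with coherence bounded by 0 and 1, as the theory requires.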

Identification of EEG Oscillatory Activity from AR Power Spectra

Coherence Analysis by MVAR Modeling
The (squared) Coherence Spectrum of channels k and l is also derived from the spectral matrix, as the squared magnitude of the cross-spectrum normalized by the two auto-spectra:
C_kl(f) = |S_kl(f)|^2 / (S_kk(f) S_ll(f))

Statistical Causality For two simultaneous time series, one series is called causal to the other if we can better predict the second series by incorporating knowledge of the first one (Wiener, The Theory of Prediction, 1956).

Granger Causality
Granger (1969) implemented the idea of causality in terms of autoregressive models. Let x_1, x_2, …, x_t and y_1, y_2, …, y_t represent two time series. Granger compared two linear models:
x_t = a_1 x_{t-1} + … + a_m x_{t-m} + ε_t
and
x_t = b_1 x_{t-1} + … + b_m x_{t-m} + c_1 y_{t-1} + … + c_m y_{t-m} + η_t

Granger Causality
If the prediction error variance of the second model is significantly less than that of the first, then, in some suitable statistical sense, we can say that the y time series has a causal influence on the x time series.
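The comparison of the two models can be sketched numerically: simulate two coupled AR(1) series in which y drives x, fit the restricted (x's past only) and unrestricted (x's and y's past) models by least squares, and compare residual variances. The coupling coefficients are hypothetical, and Python stands in for the talk's Matlab/C tooling:

```python
import numpy as np

rng = np.random.default_rng(4)
n = 4000
x, y = np.zeros(n), np.zeros(n)
e = rng.standard_normal((n, 2))
for t in range(1, n):
    y[t] = 0.6 * y[t - 1] + e[t, 0]
    x[t] = 0.5 * x[t - 1] + 0.4 * y[t - 1] + e[t, 1]  # y's past drives x

def resid_var(target, regressors):
    """Residual variance of a least-squares linear prediction."""
    beta, *_ = np.linalg.lstsq(regressors, target, rcond=None)
    return (target - regressors @ beta).var()

v_restricted = resid_var(x[1:], np.column_stack([x[:-1]]))           # x's past only
v_unrestricted = resid_var(x[1:], np.column_stack([x[:-1], y[:-1]])) # add y's past
F_y_to_x = np.log(v_restricted / v_unrestricted)  # Geweke's time-domain measure
```

Because y's past genuinely improves the prediction of x here, the unrestricted model has the smaller residual variance and the causality measure comes out positive.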

Granger Causal Spectrum
Geweke (1982) found a spectral representation of the time-domain Granger causality (F_{y→x}). The Granger Causal Spectrum from y to x is:
I_{y→x}(f) = ln [ S_xx(f) / ( S_xx(f) - (Σ_yy - Σ_xy^2/Σ_xx) |H_xy(f)|^2 ) ]
where Σ is the covariance matrix of the residual errors, H is the transfer function of the system, and S_xx is the auto-spectrum of x.

Experimental Example of Granger Causal Spectra

BSMART: A Matlab/C Toolbox for Analyzing Brain Circuits
BSMART, an acronym for Brain-System for Multivariate AutoRegressive Timeseries, is an open-source, downloadable software package for analyzing brain circuits. BSMART grew out of a collaborative research effort between Dr. Hualou Liang at Drexel University, Dr. Steven Bressler at Florida Atlantic University, and Dr. Mingzhou Ding at the University of Florida.
BSMART can be applied to a wide variety of neuroelectromagnetic phenomena, including EEG, MEG, and fMRI data. A unique feature of the BSMART package is Granger causality, which can be used to assess causal influences and directions of driving among multiple neural signals.
The backbone of the BSMART project is MultiVariate AutoRegressive (MVAR) analysis. The MVAR model provides a wealth of spectral quantities, such as auto power, partial power, coherence, partial coherence, multiple coherence, and Granger causality. The approach has been fruitfully used to characterize, with high spatial, temporal, and frequency resolution, functional relations within large-scale brain networks.
BSMART is described in: Cui J, Xu L, Bressler SL, Ding M, Liang H. BSMART: a Matlab/C toolbox for analysis of multichannel neural time series. Neural Networks, Special Issue on Neuroinformatics, 21.
Download BSMART at