A Tensorial Approach to Access Cognitive Workload related to Mental Arithmetic from EEG Functional Connectivity Estimates
S.I. Dimitriadis, Yu Sun, K. Kwok, N.A. Laskaris, A. Bezerianos

Presentation transcript:

A Tensorial Approach to Access Cognitive Workload related to Mental Arithmetic from EEG Functional Connectivity Estimates
S.I. Dimitriadis, Yu Sun, K. Kwok, N.A. Laskaris, A. Bezerianos
AIIA-Lab, Informatics Dept., Aristotle University of Thessaloniki, Greece
SINAPSE, National University of Singapore
Temasek Laboratories, National University of Singapore

Outline
Introduction: functional connectivity has demonstrated its potential for decoding various brain states, but the high dimensionality of the pairwise functional connectivity limits its usefulness in some real-time applications.
Methodology: the brain is a complex network; Functional Connectivity Graphs (FCGs) have traditionally been treated as vectors, but an FCG can also be treated as a tensor (Tensor Subspace Analysis).
Results
Conclusions

EEG & Connectomics: from multichannel recordings to connectivity graphs via functional/effective connectivity measures.

Complexity in the human brain (EEG & connectomics).

Motivation and problem statement
The association of functional connectivity patterns with particular cognitive tasks has been a hot topic in neuroscience. Studies of functional connectivity have demonstrated its potential for decoding various brain states (e.g., mind reading), but the high dimensionality of the pairwise functional connectivity limits its usefulness in some real-time applications.
We used Tensor Subspace Analysis (TSA) to reduce the initially high dimensionality of the pairwise couplings in the original functional connectivity network, which
1. significantly decreases the computational cost, and
2. facilitates the differentiation of brain states.

Outline of our methodology: the FCG matrix can be treated either as a vector or as a tensor.

The Linear Dimensionality Reduction Problem in Tensor Space
Tensor Subspace Analysis (TSA) is fundamentally based on Locality Preserving Projections. An FCG matrix $X$ can be thought of as a 2nd-order tensor (or 2-tensor) in the tensor space $\mathbb{R}^{n_1} \otimes \mathbb{R}^{n_2}$.
The generic problem of linear dimensionality reduction in the second-order tensor space is the following: given a set of tensors (i.e., matrices) $X_1, \dots, X_m \in \mathbb{R}^{n_1} \otimes \mathbb{R}^{n_2}$, find two transformation matrices $U$ of size $n_1 \times l_1$ and $V$ of size $n_2 \times l_2$ that map these tensors to a set of tensors $Y_i = U^T X_i V \in \mathbb{R}^{l_1} \otimes \mathbb{R}^{l_2}$ such that $Y_i$ "represents" $X_i$, where $l_1 < n_1$ and $l_2 < n_2$.
The method is of particular interest in the special case where the data lie on a nonlinear sub-manifold $M$ embedded in $\mathbb{R}^{n_1} \otimes \mathbb{R}^{n_2}$ (a sketch of the projection follows).
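To make the mapping concrete, here is a minimal numpy sketch of the projection $Y = U^T X V$ (numpy is our choice here; the slides name no implementation). The sizes, 12 x 12 graphs reduced to 3 x 3, echo the trimmed FCGs and the "number of dimensions = 3" setting used later, and the random $U$, $V$ merely stand in for the matrices TSA would learn:

```python
import numpy as np

rng = np.random.default_rng(0)
n1 = n2 = 12                 # original (trimmed) FCG dimensions
l1 = l2 = 3                  # reduced dimensions, l1 < n1, l2 < n2
X = rng.random((n1, n2))     # one functional connectivity graph
U = rng.random((n1, l1))     # transformation matrices: random stand-ins
V = rng.random((n2, l2))     # for what TSA would actually learn
Y = U.T @ X @ V              # second-order projection, shape (3, 3)
```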

Optimal Linear Embedding
The domain of FCGs is most probably a nonlinear sub-manifold $M$ embedded in the tensor space. Adopting TSA, we estimate geometrical and topological properties of the sub-manifold from "scattered data" lying on this unknown sub-manifold, and we consider the particular question of finding a linear subspace approximation to the sub-manifold in the sense of local isometry.
TSA is fundamentally based on Locality Preserving Projections (LPP):
1. Given $m$ data points sampled from the FCG sub-manifold, build a nearest-neighbor graph $G$ to model the local geometrical structure of $M$.
2. Let $S$ be the weight matrix of $G$. A possible definition of $S_{ij}$ is
$$ S_{ij} = \begin{cases} \exp(-\lVert X_i - X_j \rVert^2 / t), & \text{if } X_j \text{ is among the nearest neighbors of } X_i \\ 0, & \text{otherwise,} \end{cases} $$
where the exponential weighting is the so-called heat kernel, which is intimately related to the manifold structure, and $\lVert \cdot \rVert$ is the Frobenius norm of a matrix (a sketch follows).
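A short sketch of steps 1-2, assuming Frobenius distances between FCG matrices and a k-nearest-neighbor rule; k and the heat-kernel width t are free parameters, not values taken from the slides:

```python
import numpy as np

def heat_kernel_weights(X, k=5, t=1.0):
    """Weight matrix S of a k-NN graph over a stack of FCG matrices.

    X: array of shape (m, n, n) holding m connectivity graphs.
    S[i, j] = exp(-||X_i - X_j||_F^2 / t) if j is among the k nearest
    neighbors of i (Frobenius distance), 0 otherwise; then symmetrized.
    """
    m = X.shape[0]
    diff = X[:, None] - X[None]            # pairwise differences (m, m, n, n)
    d2 = np.sum(diff ** 2, axis=(2, 3))    # squared Frobenius distances
    S = np.zeros((m, m))
    for i in range(m):
        nn = np.argsort(d2[i])[1:k + 1]    # skip self (distance 0)
        S[i, nn] = np.exp(-d2[i, nn] / t)
    return np.maximum(S, S.T)              # symmetrize
```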

Optimal Linear Embedding (continued)
In the case of supervised learning (when classification labels are available), the label information can easily be incorporated into the graph, e.g., by connecting only data points that share the same label.
Let $U \in \mathbb{R}^{n_1 \times l_1}$ and $V \in \mathbb{R}^{n_2 \times l_2}$ be the transformation matrices. A reasonable transformation respecting the graph structure can be obtained by minimizing the objective function
$$ \min_{U, V} \sum_{ij} \lVert U^T X_i V - U^T X_j V \rVert^2 \, S_{ij}, $$
which incurs a heavy penalty if neighboring points $X_i$ and $X_j$ are mapped far apart.
With $D$ the diagonal matrix $D_{ii} = \sum_j S_{ij}$, and defining $D_V = \sum_i D_{ii} X_i V V^T X_i^T$, $S_V = \sum_{ij} S_{ij} X_i V V^T X_j^T$ (and $D_U$, $S_U$ analogously with $X_i^T U U^T$), the optimization problem reduces to the pair of generalized eigenvector problems
$$ (D_V - S_V)\, u = \lambda D_V\, u \quad (4) $$
$$ (D_U - S_U)\, v = \lambda D_U\, v \quad (5) $$
Once $U$ is obtained, $V$ can be updated by solving (5), and vice versa; thus the optimal $U$ and $V$ are obtained by iteratively computing the generalized eigenvectors of (4) and (5).
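The alternating eigenproblems above translate into a compact Python sketch; the small ridge term is an implementation detail not in the slides, added so the generalized eigensolver stays well posed when $D_V$ or $D_U$ is near-singular:

```python
import numpy as np
from scipy.linalg import eigh

def tsa_fit(X, S, l1, l2, n_iter=10):
    """Alternate between problems (4) and (5) to estimate U and V.

    X: (m, n1, n2) stack of FCG matrices; S: (m, m) graph weight matrix.
    """
    m, n1, n2 = X.shape
    d = S.sum(axis=1)                        # diagonal entries of D
    U = np.eye(n1)[:, :l1]                   # initialize the projections
    V = np.eye(n2)[:, :l2]
    for _ in range(n_iter):
        XV = X @ V                           # (m, n1, l2)
        D_V = sum(d[i] * XV[i] @ XV[i].T for i in range(m))
        S_V = sum(S[i, j] * XV[i] @ XV[j].T
                  for i in range(m) for j in range(m) if S[i, j])
        _, vecs = eigh(D_V - S_V, D_V + 1e-8 * np.eye(n1))
        U = vecs[:, :l1]                     # smallest generalized eigenvalues
        XU = np.transpose(X, (0, 2, 1)) @ U  # (m, n2, l1)
        D_U = sum(d[i] * XU[i] @ XU[i].T for i in range(m))
        S_U = sum(S[i, j] * XU[i] @ XU[j].T
                  for i in range(m) for j in range(m) if S[i, j])
        _, vecs = eigh(D_U - S_U, D_U + 1e-8 * np.eye(n2))
        V = vecs[:, :l2]
    return U, V
```

Per-trial features are then the flattened projections $U^T X_i V$.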

Data acquisition: mental task (summation)
1 subject
64 EEG electrodes, plus horizontal and vertical EOG channels
More than 40 trials for each condition
5 different difficulty levels (e.g., 3 + 4, ...)

Preprocessing
Artifacts were removed with ICA (Independent Component Analysis), and signals were filtered within the frequency range of 0.5 to 45 Hz (from the δ to the γ band). The data from level 1 (addition of 1-digit numbers, 187 trials) and level 5 (addition of 3-digit numbers, 59 trials) were used for the presentation of TSA.
Initial FCG derivation and standard network analysis
Functional Connectivity Graphs (FCGs) were constructed using a phase-coupling estimator, the Phase Locking Value (PLV). We then computed the local efficiency (LE) of each node $i$, defined as
$$ LE(i) = \frac{1}{k_i (k_i - 1)} \sum_{\substack{j, h \in N_i \\ j \neq h}} \frac{1}{d_{jh}}, $$
where $k_i$ is the total number of spatial neighbors (first-level neighbors) of the current node, $N_i$ is the set of those neighbors, and $d_{jh}$ is the shortest path length between every possible pair $(j, h)$ in the neighborhood of the current node.
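A minimal sketch of the PLV computation for one band-pass-filtered trial, using the Hilbert transform for the instantaneous phase (one common choice; the slides do not specify the implementation):

```python
import numpy as np
from scipy.signal import hilbert

def plv_matrix(trial):
    """PLV between all channel pairs of one trial.

    trial: array of shape (n_channels, n_samples), band-pass filtered.
    Returns a symmetric (n_channels, n_channels) matrix: the FCG.
    """
    phases = np.angle(hilbert(trial, axis=1))   # instantaneous phases
    n = trial.shape[0]
    fcg = np.eye(n)
    for i in range(n):
        for j in range(i + 1, n):
            dphi = phases[i] - phases[j]
            fcg[i, j] = fcg[j, i] = np.abs(np.mean(np.exp(1j * dphi)))
    return fcg
```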

Brain topographies and statistical differences of LE between L1 and L5
(Figures: brain topographies of nodal LE, and trial-averaged nodal LE measurements across the various frequency bands for the PO7 and PO8 sensors.)

Trimmed FCGs
We reduced the input for the tensor analysis from FCGs of dimension N x N to sub-graphs of size N' x N', with N' = 12 sensors corresponding to the spatial neighborhood of the PO7 and PO8 sensors. The neighborhood contains 12 · 11 / 2 = 66 connections, i.e., 66 / 2016 ≈ 3% of the total number of possible connections in the original 64-sensor graph (a sketch of the trimming follows).
(Figure: distribution of PLV values over the parieto-occipital (PO) brain regions and the topographically thresholded functional connections.)
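Trimming itself is plain sub-matrix indexing. The sensor indices below are hypothetical placeholders; the actual 12-sensor PO7/PO8 neighborhood must be looked up in the 64-channel montage of the recording:

```python
import numpy as np

PO_IDX = np.arange(48, 60)   # hypothetical indices of the 12 PO sensors

def trim_fcg(fcg, idx=PO_IDX):
    """Extract the N' x N' neighborhood sub-graph from an N x N FCG."""
    return fcg[np.ix_(idx, idx)]
```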

Machine learning validation
1. The TSA algorithm, followed by a k-nearest-neighbor classifier (k = 3), was tested on trial-based connectivity data from all frequency bands.
2. A 10-fold cross-validation scheme was used (90% of the trials for training, 10% for testing).
3. Table 1 summarizes the average classification rates derived after applying the above cross-validation scheme 100 times, comparing TSA+kNN against LDA+kNN (Linear Discriminant Analysis), used as a baseline feature-extraction algorithm (a sketch of the validation loop follows).

Table 1. Average classification rates per frequency band:

            Bilateral PO      Left PO         Right PO
            TSA    LDA        TSA    LDA      TSA    LDA
(rows: frequency bands; the numeric entries were not preserved in this transcript)

The selected options for TSA were: weight mode = heat kernel, neighbor mode = 1, supervised learning, number of dimensions = 3.
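A sketch of the validation loop with scikit-learn, assuming the TSA projections have already been flattened into one feature vector per trial (in a fully faithful protocol, U and V would be re-fit on each training fold):

```python
import numpy as np
from sklearn.model_selection import StratifiedKFold
from sklearn.neighbors import KNeighborsClassifier

def repeated_cv_accuracy(features, labels, n_repeats=100):
    """Mean accuracy of a 3-NN classifier over repeated 10-fold CV."""
    clf = KNeighborsClassifier(n_neighbors=3)
    scores = []
    for rep in range(n_repeats):
        cv = StratifiedKFold(n_splits=10, shuffle=True, random_state=rep)
        for train, test in cv.split(features, labels):
            clf.fit(features[train], labels[train])
            scores.append(clf.score(features[test], labels[test]))
    return float(np.mean(scores))
```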

Conclusions
We introduced the tensorial approach to brain connectivity analysis.
The introduced methodological approach can find application in many other situations where a brain state has to be decoded from functional connectivity structure.
After extensive validation, it might prove an extremely useful tool for on-line applications such as:
human-machine interaction, brain decoding, and mind reading.