Chapter 15: Classification of Time-Embedded EEG Using Short-Time Principal Component Analysis, by Nguyen Duc Thang, 5/2009
Outline
Part one: Introduction; Principal Component Analysis (PCA); Signal Fraction Analysis (SFA); EEG signal representation; Short-time PCA
Part two: Classifier; Experimental setups, results, and analysis
Introduction: the system pipeline consists of feature extraction (PCA, SFA, short-time PCA) followed by classification (LDA, SVM).
Projection: a data point x is projected onto a basis vector w1; more generally, projecting onto d basis vectors w1, ..., wd reduces the dimension of x.
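In symbols (a sketch using the notation of these slides, with x of dimension D):

```latex
y = W x, \qquad
W = \begin{bmatrix} w_1^{\mathsf{T}} \\ \vdots \\ w_d^{\mathsf{T}} \end{bmatrix}
  \in \mathbb{R}^{d \times D}, \qquad d < D
```

Each coordinate y_i = w_i^T x measures how far x extends along the basis vector w_i.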
Principal Component Analysis (PCA). Motivation: reduce the dimension while losing as little information as possible. The question is how to choose the projection matrix W = [w1, w2, ..., wd]^T.
Principal Component Analysis: because the distance of each (centered) data point from the origin is fixed, minimizing the projection error is equivalent to maximizing the variance of the projections.
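The equivalence can be seen from a one-line decomposition (a sketch assuming the data are centered and the rows of W are orthonormal):

```latex
\|x_i\|^2 \;=\; \underbrace{\|W x_i\|^2}_{\text{variance captured}}
          \;+\; \underbrace{\|x_i - W^{\mathsf{T}} W x_i\|^2}_{\text{projection error}}
```

Summing over the samples, the left-hand side is fixed, so minimizing the total projection error is the same as maximizing the variance of the projected data.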
Principal Component Analysis: each w_i is an eigenvector of the covariance matrix C_x. Among the D eigenvectors of C_x, keep the d < D eigenvectors with the largest eigenvalues. W = [w1, w2, ..., wd]^T is the projection matrix, which reduces the dimension from D to d.
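A minimal numpy sketch of this step (the function name and array shapes are my own choices, not from the slides):

```python
import numpy as np

def pca_projection_matrix(X, d):
    """X: (N, D) matrix of samples, one per row.
    Returns the (d, D) projection matrix W = [w1, ..., wd]^T."""
    Xc = X - X.mean(axis=0)               # center the data
    C = np.cov(Xc, rowvar=False)          # D x D covariance matrix C_x
    eigvals, eigvecs = np.linalg.eigh(C)  # eigenvalues in ascending order
    top = np.argsort(eigvals)[::-1][:d]   # indices of the d largest eigenvalues
    return eigvecs[:, top].T              # shape (d, D)

# Reducing the dimension D -> d of one sample x: y = W @ x
# For a whole data set: Y = Xc @ W.T, giving an (N, d) feature matrix.
```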
Signal Fraction Analysis (SFA)
Signal Fraction Analysis. Assumption: the source signals are uncorrelated. The algorithm estimates a projection (unmixing) matrix W_SFA under this assumption.
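The slide does not spell the algorithm out, so the following is only a hedged sketch of one common signal-fraction formulation: the noise covariance is estimated from first differences of the time-ordered samples, and the projection directions come from a generalized eigenvalue problem. The differencing step in particular is my assumption, not taken from the slides.

```python
import numpy as np
from scipy.linalg import eigh

def sfa_projection_matrix(X, d):
    """X: (N, D) time-ordered samples, one per row. Returns a (d, D) matrix W_SFA."""
    Xc = X - X.mean(axis=0)
    C_total = np.cov(Xc, rowvar=False)       # covariance of the observed data
    dX = np.diff(Xc, axis=0)                 # first differences as a noise proxy (assumption)
    C_noise = np.cov(dX, rowvar=False)
    C_noise += 1e-8 * np.trace(C_noise) / X.shape[1] * np.eye(X.shape[1])  # regularize
    # Directions maximizing the signal-to-noise fraction (generalized eigenproblem)
    vals, vecs = eigh(C_total, C_noise)
    top = np.argsort(vals)[::-1][:d]
    return vecs[:, top].T
```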
Results
Comparison between SFA and ICA, measured by the correlation between the estimated sources and the ground truth: SFA is suitable for small sample sizes and offers fast computation; ICA is suitable for large sample sizes.
Extracting basis vectors by SFA: comparison of the projections W_SFA x and W_PCA x.
Pipeline recap: feature extraction, then classification.
EEG signal representation (feature extraction): raw features vs. time-embedded features. With r EEG channels, each time-embedded sample stacks l+1 consecutive time steps of all channels, capturing more temporal information.
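A small sketch of the time-embedding step, assuming the raw recording is a (T, r) array and each embedded sample stacks the current and previous l time steps (the exact ordering inside a sample is my assumption):

```python
import numpy as np

def time_embed(eeg, l):
    """eeg: (T, r) array of T time samples from r EEG channels.
    Returns a (T - l, r*(l+1)) matrix: row t stacks samples t, t-1, ..., t-l."""
    T, r = eeg.shape
    rows = [eeg[t - l:t + 1][::-1].ravel() for t in range(l, T)]
    return np.asarray(rows)               # embedded dimension D = r*(l+1)
```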
Extracting PCA features: the training data in the embedded space consist of N samples of dimension D = r(l+1). PCA yields d basis vectors, which form the d x D projection matrix W_PCA. The product W_PCA X maps the time-embedded features to d-dimensional PCA features.
Extracting SFA features: the same procedure with SFA in place of PCA. The d basis vectors form the d x D projection matrix W_SFA, and W_SFA X gives the d-dimensional SFA features of the time-embedded data.
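Applying a learned projection matrix to the time-embedded data then looks like the following sketch; here W stands for either W_PCA or W_SFA, obtained for instance as in the earlier sketches:

```python
import numpy as np

def project_features(X, W):
    """X: (N, D) time-embedded samples; W: (d, D) projection matrix
    (W_PCA or W_SFA). Returns the (N, d) features fed to the classifier."""
    return X @ W.T
```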
The shortcomings of conventional PCA: a single global projection line does not fit the data well when the number of samples is large.
Short-time PCA approach: apply PCA on short durations (windows) of the signal.
Extracting short-time PCA features: take a window of h consecutive time-embedded feature vectors (each of dimension D), apply PCA within the window to obtain n basis vectors of dimension D, and stack them to form a short-time PCA feature of size D x n.
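A sketch of the short-time PCA feature extraction, assuming a sliding window that advances one sample at a time (the step size is not stated on the slide):

```python
import numpy as np

def short_time_pca_features(X, h, n):
    """X: (N, D) time-embedded features, rows ordered in time.
    For each window of h consecutive rows, run PCA and keep its n leading
    basis vectors (each of length D), stacked into one feature of size D*n.
    Requires n <= min(h, D)."""
    N, D = X.shape
    feats = []
    for start in range(N - h + 1):
        win = X[start:start + h]
        win = win - win.mean(axis=0)                   # center within the window
        _, _, Vt = np.linalg.svd(win, full_matrices=False)
        feats.append(Vt[:n].ravel())                   # n principal directions, stacked
    return np.asarray(feats)                           # shape (N - h + 1, D * n)
```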
Next (Part two): Classifier; experimental setups, results, and analysis.