Chapter 15: Classification of Time-Embedded EEG Using Short-Time Principal Component Analysis
by Nguyen Duc Thang, 5/2009

Outline
Part one
- Introduction
- Principal Component Analysis (PCA)
- Signal Fraction Analysis (SFA)
- EEG signal representation
- Short-time PCA
Part two
- Classifier
- Experimental setups, results, and analysis

Introduction
The EEG classification pipeline has two stages: feature extraction (PCA, SFA, short-time PCA) followed by classification (LDA, SVM).


Projection
[Figure: a data point x and its projection onto a direction w1.]

[Figure: projecting x onto d basis vectors w1, ..., wd reduces its dimension.]

Principal Component Analysis (PCA)
Motivation: reduce the dimension with minimum information loss. Which projection matrix W should we choose?
[Figure: candidate projection directions w through the origin O.]

Principal Component Analysis
For a fixed data set, minimizing the projection errors is equivalent to maximizing the variance of the projected points.
[Figure: projections h_i of the data points onto a direction w; the sum of squared point lengths is constant.]
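To see why (a standard one-line derivation; the h_i in the slide's figure correspond to the projections w^T x_i):

```latex
% For centered data x_i and a unit direction w, the squared projection
% error decomposes (Pythagoras) as
\sum_i \left\| x_i - (w^{\top} x_i)\, w \right\|^2
    \;=\; \sum_i \|x_i\|^2 \;-\; \sum_i (w^{\top} x_i)^2 .
% The first sum is constant in w, so minimizing the error is the same
% as maximizing \sum_i (w^{\top} x_i)^2 = (N-1)\, w^{\top} C_x\, w,
% the variance of the projected points.
```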

Principal Component Analysis
- Each w_i is an eigenvector of the covariance matrix C_x.
- Among the D eigenvectors of C_x, keep the d < D with the largest eigenvalues.
- W = [w_1, w_2, ..., w_d]^T is the projection matrix; it reduces the dimension from D to d.
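As a concrete illustration of these three steps, here is a minimal NumPy sketch (the function name pca_projection and the example shapes are ours, not from the slides):

```python
import numpy as np

def pca_projection(X, d):
    """Return the d x D PCA projection matrix W for data X (D x N)."""
    # Center each dimension (row) across the N samples.
    Xc = X - X.mean(axis=1, keepdims=True)
    # Covariance matrix C_x (D x D).
    C = Xc @ Xc.T / (X.shape[1] - 1)
    # eigh returns eigenvalues (and matching eigenvectors) in ascending order.
    eigvals, eigvecs = np.linalg.eigh(C)
    # Keep the d eigenvectors with the largest eigenvalues, as rows of W.
    W = eigvecs[:, ::-1][:, :d].T  # shape (d, D)
    return W

# Usage: reduce D = 8 dimensions to d = 3.
X = np.random.randn(8, 100)   # D x N data matrix
W = pca_projection(X, d=3)
Y = W @ X                     # d x N PCA features
```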


Signal Fraction Analysis (SFA)

Signal Fraction Analysis
Assumption: the source signals are uncorrelated.
[Figure: the SFA algorithm.]
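The slides leave the algorithm itself to a figure. One common formulation consistent with the stated assumption is a maximum-noise-fraction-style generalized eigenproblem in which noise is estimated from first differences; the sketch below implements that assumed formulation and is not necessarily the exact algorithm from the chapter:

```python
import numpy as np
from scipy.linalg import eigh  # generalized symmetric eigenproblem

def sfa_projection(X, d):
    """Assumed SFA: maximize the signal fraction w'C_x w / w'C_n w,
    with the noise covariance C_n estimated from first differences."""
    Xc = X - X.mean(axis=1, keepdims=True)
    N = Xc.shape[1]
    C_x = Xc @ Xc.T / (N - 1)        # data covariance
    Dx = np.diff(Xc, axis=1)         # first differences as a noise proxy
    C_n = Dx @ Dx.T / (N - 2)        # noise covariance estimate
    # Solve C_x w = lambda C_n w; eigh returns ascending eigenvalues.
    vals, vecs = eigh(C_x, C_n)
    W = vecs[:, ::-1][:, :d].T       # d directions with highest signal fraction
    return W
```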

Results
[Figure: SFA results on the test signals.]

Comparison between SFA and ICA
Correlation between estimated sources and ground truths:
- SFA: suitable for small sample sizes; fast computation.
- ICA: suitable for large sample sizes.

Extracting basis vectors with SFA
[Figure: the same signal x projected with W_SFA and with W_PCA.]


[Pipeline diagram: feature extraction, then classification; this part covers feature extraction.]

EEG signal representation (feature extraction)
A raw feature is a single sample of the r EEG channels. A time-embedded feature stacks l+1 consecutive samples of all r channels, so it carries more temporal information.
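A minimal sketch of the time-embedding step, assuming the embedded vector at time t stacks samples t through t+l of all r channels (the function name time_embed is ours):

```python
import numpy as np

def time_embed(eeg, l):
    """Time-embed an r x T EEG array: column t stacks samples
    t, ..., t+l of all r channels, giving a D x N matrix with
    D = r * (l + 1) and N = T - l."""
    r, T = eeg.shape
    cols = [eeg[:, t:t + l + 1].reshape(-1) for t in range(T - l)]
    return np.stack(cols, axis=1)

# Example: 8 channels, 1000 samples, lag l = 4 -> shape (40, 996).
X = time_embed(np.random.randn(8, 1000), l=4)
```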

Extracting PCA features
The training data X in the embedded space has dimension D = r(l+1) and N samples. PCA yields d basis vectors that form the (d x D) projection matrix W_PCA; the PCA features are W_PCA X.

Extracting SFA features
Likewise, SFA applied to the same D x N time-embedded training data yields d basis vectors that form the (d x D) projection matrix W_SFA; the SFA features are W_SFA X.
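Putting the pieces together, a usage sketch reusing the time_embed, pca_projection, and sfa_projection functions defined above (train_eeg, l, and d are hypothetical inputs):

```python
# train_eeg: r x T raw training signal (hypothetical variable).
X = time_embed(train_eeg, l)     # D x N, with D = r * (l + 1)
W_pca = pca_projection(X, d)     # d x D projection matrix
W_sfa = sfa_projection(X, d)     # d x D projection matrix
F_pca = W_pca @ X                # d x N PCA features
F_sfa = W_sfa @ X                # d x N SFA features
```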


The shortcomings of conventional PCA
[Figure: a single projection line fit to the whole data set.]
A single global projection is not good for a large number of samples.

Short-time PCA approach
Apply PCA on short durations of the signal instead of the whole recording.

Extracting short-time PCA features
Slide a window of h time-embedded feature vectors (each of dimension D) over the data. Within each window, PCA yields n basis vectors; stacking them gives one D x n short-time PCA feature per window.
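A minimal sketch of this windowed extraction, assuming non-overlapping windows (the slides do not specify the window step; the function name is ours):

```python
import numpy as np

def short_time_pca_features(X, h, n):
    """Windowed PCA over time-embedded data X (D x N): for each
    window of h columns, keep the n leading basis vectors and
    stack them into one feature vector of length D*n."""
    D, N = X.shape
    feats = []
    for start in range(0, N - h + 1, h):        # non-overlapping windows
        Xw = X[:, start:start + h]
        Xc = Xw - Xw.mean(axis=1, keepdims=True)
        C = Xc @ Xc.T / (h - 1)
        vals, vecs = np.linalg.eigh(C)
        B = vecs[:, ::-1][:, :n]                # D x n leading basis vectors
        feats.append(B.reshape(-1, order="F"))  # column-stack to length D*n
    return np.stack(feats, axis=1)              # (D*n) x number_of_windows
```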

Next
Part one (covered above): Introduction; PCA; SFA; EEG signal representation; Short-time PCA.
Part two: Classifier; Experimental setups, results, and analysis.