Next, this study employed an SVM to classify the emotion label of each EEG segment. The basic idea is to project the input data onto a higher-dimensional feature space via a kernel function, in which the data are more easily separated than in the original feature space.

Depending on the input data, the iterative learning process of the SVM eventually converges to optimal hyperplanes with maximal margins between the classes. These hyperplanes serve as the decision boundaries for distinguishing the different data clusters. This study used the LIBSVM software to build the SVM classifier and employed a radial basis function (RBF) kernel to nonlinearly map the data onto a higher-dimensional space.
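To make this step concrete, the following is a minimal sketch of RBF-kernel SVM training using scikit-learn's SVC class, which wraps LIBSVM. The feature matrix, labels, and hyperparameter values below are illustrative placeholders, not the study's actual data or settings.

```python
# Minimal sketch: RBF-kernel SVM via scikit-learn's SVC (a LIBSVM wrapper).
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(0)
X = rng.standard_normal((480, 32))   # placeholder: ~480 EEG feature vectors
y = rng.integers(0, 4, size=480)     # placeholder: four emotion labels (0-3)

# The RBF kernel implicitly maps the data onto a higher-dimensional space;
# C and gamma would normally be tuned rather than left at defaults.
clf = SVC(kernel="rbf", C=1.0, gamma="scale")
clf.fit(X, y)
```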

In the experiments, the number of sample points from each subject was around 480 (16 30-s EEG segments × around 30 points per segment), derived from artifact-free EEG signals. A 10-fold cross-validation scheme, repeated ten times with randomization, was applied to each subject's dataset in order to increase the reliability of the recognition results.

In a 10-fold cross-validation, the whole EEG dataset was divided into ten subsets. The MLP and SVM were trained on nine subsets of feature vectors, while the remaining subset was used for testing. This procedure was repeated ten times so that each subset served once as the testing data. The entire 10-fold cross-validation was then repeated ten times with different subset splits.
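A hedged sketch of this repeated 10-fold scheme, using scikit-learn's standard cross-validation utilities together with the placeholder clf, X, and y from the previous snippet:

```python
# Ten repetitions of randomized 10-fold cross-validation (100 fits in total).
from sklearn.model_selection import RepeatedStratifiedKFold, cross_val_score

cv = RepeatedStratifiedKFold(n_splits=10, n_repeats=10, random_state=0)
scores = cross_val_score(clf, X, y, cv=cv)   # one accuracy value per fold
print(scores.mean(), scores.std())
```

Whether the study stratified the folds by class label is an assumption here; a plain RepeatedKFold would drop that constraint while keeping the same 10×10 structure.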

Accuracy was evaluated as the ratio of the number of correctly classified samples to the total number of samples. After the validation process, the average subject-dependent performance using different feature types as inputs was evaluated.

C. FEATURE SELECTION

Feature selection is a necessary step before performing any data classification or clustering. Its objective is to extract a subset of features by removing redundant features while retaining the informative ones. Further, since a 32-channel EEG module was used to acquire the brain activity, feature selection also tested the feasibility of using fewer electrodes for practical applications.

Thus, feature selection seems particularly important not only to improve computational efficiency but also to expand the applicability of EEG-based human-centered systems to real-world applications. This study adopted the F-score index, a statistical measure of the ratio of between-class to within-class variance, to sort the features in descending order of their power to discriminate between different EEG patterns, i.e., the larger the F-score, the greater the discrimination power.

The F-score of the ith feature is defined as follows [30]:

$$F(i)=\frac{\sum_{l=1}^{g}\left(\bar{x}_{l,i}-\bar{x}_{i}\right)^{2}}{\sum_{l=1}^{g}\frac{1}{n_{l}-1}\sum_{k=1}^{n_{l}}\left(x_{l,k,i}-\bar{x}_{l,i}\right)^{2}}$$

where $\bar{x}_{i}$ and $\bar{x}_{l,i}$ are the averages of the ith feature over the entire data set and over the class-$l$ data set ($l = 1 \sim g$, with $g = 4$ for the four emotion labels), respectively, $x_{l,k,i}$ is the ith feature of the kth instance of class $l$, and $n_{l}$ is the number of instances of class $l$.
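The following is a small sketch of this criterion, assuming X and y are the placeholder feature matrix and labels from the earlier snippets; the helper name f_score is introduced here for illustration only.

```python
# F-score per feature: between-class variance over pooled within-class variance.
import numpy as np

def f_score(X, y):
    overall_mean = X.mean(axis=0)
    numer = np.zeros(X.shape[1])   # sum of squared class-mean deviations
    denom = np.zeros(X.shape[1])   # pooled within-class variances
    for l in np.unique(y):
        Xl = X[y == l]
        class_mean = Xl.mean(axis=0)
        numer += (class_mean - overall_mean) ** 2
        denom += ((Xl - class_mean) ** 2).sum(axis=0) / (len(Xl) - 1)
    return numer / denom

ranking = np.argsort(f_score(X, y))[::-1]   # descending discrimination power
```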