EEG-BASED EMOTION RECOGNITION IN MUSIC LEARNING


I INTRODUCTION

Ongoing brain activity can be recorded as EEG to discover links between emotional states and brain activity. A machine-learning algorithm was used to categorize EEG dynamics according to subjects' self-reported emotional states during music listening.

An SVM was employed to classify four emotional states (joy, anger, sadness, and pleasure), achieving an average classification accuracy of 82.29% ± 3.06% across 26 subjects.
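For illustration, a per-subject, cross-validated SVM of the kind described above could be organized as in the following minimal Python/scikit-learn sketch. It assumes per-trial feature vectors have already been extracted; the function names (subject_accuracy, across_subject_summary), the RBF kernel settings, and the fold count are hypothetical choices, not the study's actual implementation.

```python
# Minimal sketch: per-subject cross-validated SVM accuracy for four emotion
# classes, summarized as mean +/- std across subjects. Data loading and
# feature extraction are assumed to happen elsewhere.
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

EMOTIONS = ["joy", "anger", "sadness", "pleasure"]  # four target classes

def subject_accuracy(features, labels, folds=5):
    """Cross-validated accuracy for a single subject.

    features : (n_trials, n_features) array of per-trial EEG features
    labels   : (n_trials,) array of integer emotion labels in 0..3
    """
    clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0, gamma="scale"))
    return cross_val_score(clf, features, labels, cv=folds).mean()

def across_subject_summary(per_subject_data):
    """per_subject_data: list of (features, labels) pairs, one per subject."""
    accs = [subject_accuracy(X, y) for X, y in per_subject_data]
    return np.mean(accs), np.std(accs)  # e.g., reported as mean % +/- std %
```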

The objective of this study is to systematically uncover the association between EEG dynamics and emotions by 1) searching for emotion-specific features of the EEG and 2) testing the efficacy of different classifiers.
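As a rough illustration of objective 1, emotion-specific EEG features are commonly taken to be spectral band powers per channel; the sketch below computes such features with Welch's method in Python/SciPy. The band edges, Welch parameters, and the default sampling rate are assumptions for illustration, not the study's exact settings. Objective 2 could then be explored by swapping the SVC in the earlier sketch for other estimators (for example k-nearest neighbors) under the same cross-validation.

```python
# Sketch of band-power feature extraction for one EEG trial.
import numpy as np
from scipy.signal import welch

# Conventional EEG bands in Hz (illustrative edges).
BANDS = {"delta": (1, 4), "theta": (4, 8), "alpha": (8, 13),
         "beta": (13, 30), "gamma": (30, 45)}

def band_power_features(eeg, fs=500.0):
    """eeg: (n_channels, n_samples) array -> 1-D feature vector.

    Returns the mean Welch power in each band for every channel,
    concatenated band by band.
    """
    freqs, psd = welch(eeg, fs=fs, nperseg=int(fs * 2), axis=-1)
    feats = []
    for lo, hi in BANDS.values():
        idx = (freqs >= lo) & (freqs < hi)
        feats.append(psd[:, idx].mean(axis=-1))  # (n_channels,) per band
    return np.concatenate(feats)
```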

II DATA COLLECTION AND EXPERIMENT PROCEDURE

* EEG data were collected from 26 healthy subjects (16 males and 10 females; age 24.4 ± 2.53 years) during music listening.
* A 32-channel EEG module (Neuroscan, Inc.) arranged according to the international 10-20 system was used.
* All leads were referenced to linked mastoids (the average of A1 and A2), and a ground electrode was placed on the forehead.
* The sampling rate and filter bandwidth were set to 500 Hz and 1-100 Hz, respectively.
* An additional 60 Hz notch filter was employed to remove power-line interference.
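A hedged preprocessing sketch consistent with the recording setup above (linked-mastoid re-referencing, a 60 Hz notch, and band-pass filtering) is given below in Python/SciPy. The filter orders, the notch quality factor, and the way channels are indexed are illustrative assumptions, not the acquisition or analysis pipeline actually used.

```python
# Sketch: re-reference to linked mastoids, remove power-line noise, band-pass.
import numpy as np
from scipy.signal import butter, filtfilt, iirnotch

FS = 500.0  # sampling rate in Hz, from the recording description above

def preprocess(eeg, a1_idx, a2_idx, band=(1.0, 100.0)):
    """eeg: (n_channels, n_samples); a1_idx / a2_idx: rows holding A1 and A2."""
    # 1) Re-reference every channel to the linked mastoids (mean of A1 and A2).
    ref = 0.5 * (eeg[a1_idx] + eeg[a2_idx])
    eeg = eeg - ref

    # 2) 60 Hz notch filter to suppress power-line interference.
    b_notch, a_notch = iirnotch(w0=60.0, Q=30.0, fs=FS)
    eeg = filtfilt(b_notch, a_notch, eeg, axis=-1)

    # 3) Zero-phase Butterworth band-pass over the recording bandwidth.
    b_bp, a_bp = butter(4, band, btype="bandpass", fs=FS)
    return filtfilt(b_bp, a_bp, eeg, axis=-1)
```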