Emotion Recognition from Electromyography and Skin Conductance


Arturo Nakasone (University of Tokyo), Helmut Prendinger (National Institute of Informatics, Tokyo), Mitsuru Ishizuka (University of Tokyo)

1. Emotion Recognition - Introduction

Embodied Conversational Agents (ECAs) are being developed to enhance communication between humans and computer applications in a natural way. In this context, emotions are considered one of the key components for increasing the believability of ECAs. By analyzing emotional state inputs, ECAs can adapt their behavior, allowing users to experience the interaction in a more sensible way. Applications such as affective gaming already use emotion recognition through physiological signal analysis to control several aspects of the gaming experience.

2. Objective of Research

Develop a real-time Emotion Recognition Component (ERC) based on the analysis of two physiological signals:
- Electromyography (EMG)
- Skin conductance (SC)

3. Experimental Gaming Environment

The ERC was integrated into a game in which the user plays the card game "Skip-Bo" against the ECA Max. The emotion perceived by the ERC allows Max to adapt his own emotional behavior, expressed through his facial expressions and game play.

4. Relation between Emotions and Physiological Signals

In his research, P.J. Lang claimed that emotions can be characterized in terms of judged valence (pleasant or unpleasant) and arousal (calm or aroused). The relation between physiological signals and arousal/valence arises from the activation of the autonomic nervous system when emotions are elicited. Skin conductance and electromyography were chosen because of their high reliability:
- Skin conductance determines the arousal level through a linear relation.
- Electromyography has been shown to correlate with negatively valenced emotions.

5. Issues in Real-Time Assessment of Physiological Data

Baseline values were calculated by inducing an initial relaxation period of approximately 3 minutes on the subject; these values were used for comparison purposes. Proper detection of emotional activity required sampling every 50 milliseconds and using a 5-second window of data values.

6. Architecture of Emotion Recognition Component

Initialization parameters control the data sampling rates, data file storage, and queue sizes for retrieved values. The Device Layer retrieves the data from the ProComp Infiniti unit and stores them in separate queues, one per sensor attached to the unit. Prompted by the ECA Max, the means of the values currently stored in the queues are calculated and compared to the baselines in order to detect meaningful changes in the valence/arousal space. Meaningful changes in EMG and/or SC are then categorized into discrete levels in the Categorization Layer.

7. Emotion Resolution through Bayesian Networks

[Figure: Bayesian network relating EMG and SC signal values to emotions such as "happy", "surprised", "angry", and "reproach".]

In our network, the value of the categorized skin conductance signal is used to determine arousal directly. Since the value of the categorized electromyography signal cannot completely determine the sign of the valence component, a non-physiological node was introduced to discriminate this value based on the current outcome of the game (i.e., the game status). Probability values were set according to the psychophysiology literature.

8. Conclusions and Future Work

Emotions are key components in the development of truly believable ECAs. Even if people do not perceive them as humans, some suspension of disbelief is possible when emotions come into play, and empathic behavior contributes to a better interaction in terms of user experience. In some cases, the use of only two signals may not be enough to handle the emotion recognition process properly. Therefore, other kinds of information, such as gaze and pupil dilation, will be included in our ERC to further enhance the emotion recognition network.
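The real-time assessment described in sections 5 and 6 — per-sensor queues, samples every 50 ms, a 5-second window, and comparison against a relaxation-period baseline — can be sketched as follows. This is a minimal illustration, not the authors' implementation; the class name and the change threshold are assumptions.

```python
from collections import deque

class SignalMonitor:
    """Sliding-window monitor for one physiological channel (e.g. EMG or SC).

    Samples arrive every 50 ms, so a 5-second window holds 100 values.
    The baseline is the mean recorded during the initial ~3-minute
    relaxation period described in section 5.
    """

    WINDOW_SIZE = 100  # 5 s / 50 ms

    def __init__(self, baseline: float):
        self.baseline = baseline
        self.window = deque(maxlen=self.WINDOW_SIZE)  # oldest samples drop out

    def add_sample(self, value: float) -> None:
        self.window.append(value)

    def deviation(self) -> float:
        """Mean of the current window relative to the baseline."""
        if not self.window:
            return 0.0
        return sum(self.window) / len(self.window) - self.baseline

    def meaningful_change(self, threshold: float = 0.1) -> bool:
        # The threshold is a placeholder; the poster does not give a value.
        return abs(self.deviation()) > threshold
```

In practice one monitor would run per sensor queue, and the ECA Max would call `deviation()` on demand, matching the "prompted by the ECA" design in section 6.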
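The Categorization Layer of section 6 bins baseline-relative changes into discrete levels before they reach the Bayesian network. A sketch of such a binning step is below; the cut-off values are illustrative assumptions, since the poster does not publish its thresholds.

```python
def categorize(deviation: float, small: float = 0.1, large: float = 0.5) -> str:
    """Bin a baseline-relative deviation into a discrete signal level.

    `small` and `large` are hypothetical cut-offs, not values from the poster.
    """
    magnitude = abs(deviation)
    if magnitude >= large:
        return "high"
    if magnitude >= small:
        return "medium"
    return "low"
```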
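Section 7 states that categorized SC determines arousal directly, while categorized EMG plus the game-status node determines the sign of valence. The deterministic sketch below shows that decision structure only — the actual component uses a Bayesian network with probabilities taken from the psychophysiology literature, and the level-to-label mappings here are assumptions.

```python
def resolve_affect(sc_level: str, emg_level: str, game_status: str) -> tuple:
    """Map categorized signal levels to a point in the valence/arousal space.

    sc_level, emg_level: "low" | "medium" | "high" (from the Categorization Layer)
    game_status: "winning" | "losing" (the non-physiological tie-breaker node)
    """
    # SC determines arousal directly (section 7).
    arousal = "calm" if sc_level == "low" else "aroused"
    # High EMG correlates with negative valence (section 4); otherwise the
    # game-status node discriminates the sign of valence.
    if emg_level == "high":
        valence = "negative"
    else:
        valence = "positive" if game_status == "winning" else "negative"
    return (valence, arousal)
```

In the full network these quadrants correspond to emotion labels such as "happy", "surprised", "angry", and "reproach".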