Emotion Recognition from Electromyography and Skin Conductance
Arturo Nakasone (University of Tokyo), Helmut Prendinger (National Institute of Informatics, Tokyo), Mitsuru Ishizuka (University of Tokyo)

1. Emotion Recognition - Introduction

Embodied Conversational Agents (ECAs) are being developed to make communication between humans and computer applications more natural. In this context, emotions are considered one of the key components for increasing the believability of ECAs. By analyzing emotional state inputs, ECAs can adapt their behavior, allowing users to experience the interaction in a more natural way. Applications such as affective gaming are already making use of emotion recognition through physiological signal analysis in order to control several aspects of the gaming experience.

1. In his research, P.J. Lang claimed that emotions can be characterized in terms of judged valence (pleasant or unpleasant) and arousal (calm or aroused).
2. The relation between physiological signals and arousal/valence arises from the activation of the autonomic nervous system when emotions are elicited.

2. Objective of Research

Develop a real-time Emotion Recognition Component (ERC) based on the analysis of two physiological signals:
– Electromyography (EMG)
– Skin conductance (SC)

3. Experimental Gaming Environment

a) The ERC was integrated into a game in which the user plays the card game "Skip-Bo" against the ECA Max.
b) The emotion perceived by the ERC allows Max to adapt his own emotional behavior, expressed through his facial expressions and game play.

4. Relation between Emotions and Physiological Signals

Skin conductance (SC) and electromyography (EMG) were chosen because of their high reliability:
– Skin conductance determines the arousal level through a linear relation.
– Electromyography has been shown to correlate with negatively valenced emotions.

5. Issues in Real Time Assessment of Physiological Data

1. Baseline values were calculated by inducing an initial relaxation period of approximately 3 minutes on the subject; these values were used for comparison.
2. Proper detection of emotional activity required sampling every 50 milliseconds and using a 5-second window of data values.

6. Architecture of Emotion Recognition Component

1. Initialization parameters control the data sampling rates, data file storage, and queue sizes for retrieved values.
2. The Device Layer retrieves the data from the ProComp Infiniti unit and stores them in separate queues, one for each sensor attached to the unit.
3. When prompted by the ECA Max, the means of the values currently stored in the queues are calculated and compared to the baselines in order to detect meaningful changes in the valence/arousal space.

(A sketch of this sampling, queuing, and baseline logic is given after Section 7.)

7. Emotion Resolution through Bayesian Networks

1. Meaningful changes in EMG and/or SC are categorized into discrete levels in the Categorization Layer.
2. In our network, the value of the categorized skin conductance signal determines arousal directly.
3. Since the value of the categorized electromyography signal cannot completely determine the sign of the valence component, a non-physiological node was introduced to discriminate this value based on the current outcome of the game (i.e., the game status).
4. Probability values were set according to the psychophysiology literature.

(Results figure: EMG and SC signal values for the emotions "angry", "happy", "surprised", and "reproach".)
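The sampling and queuing behavior described in Sections 5 and 6 can be summarized in a short sketch. This is a minimal illustration under stated assumptions, not the authors' implementation: access to the ProComp Infiniti unit is replaced by a placeholder read_sensor() function, and the names SensorQueue, calibrate(), and deviation() are introduced here for illustration only. Only the 50 ms sampling period, the 5-second window, and the 3-minute relaxation baseline are taken from the poster.

```python
# Minimal sketch (Python) of the windowing and baseline logic in Sections 5-6.
# read_sensor() stands in for the Device Layer's access to the ProComp Infiniti
# unit; in the real component this would be a driver call, not random noise.
from collections import deque
import random
import statistics

SAMPLE_PERIOD_S = 0.05                           # one sample every 50 ms
WINDOW_S = 5                                     # 5-second window of data values
WINDOW_SIZE = round(WINDOW_S / SAMPLE_PERIOD_S)  # 100 samples per window


def read_sensor(channel: str) -> float:
    """Placeholder read of one sensor channel (EMG or SC)."""
    return random.gauss(0.0, 1.0)


class SensorQueue:
    """One bounded queue per attached sensor, as in the Device Layer."""

    def __init__(self, channel: str, maxlen: int = WINDOW_SIZE):
        self.channel = channel
        self.values = deque(maxlen=maxlen)
        self.baseline = 0.0

    def calibrate(self, n_baseline_samples: int) -> None:
        """Baseline = mean over an initial relaxation period (approx. 3 minutes)."""
        samples = [read_sensor(self.channel) for _ in range(n_baseline_samples)]
        self.baseline = statistics.mean(samples)

    def sample(self) -> None:
        """Append the newest reading; the deque drops values older than the window."""
        self.values.append(read_sensor(self.channel))

    def deviation(self) -> float:
        """Mean of the current window compared against the baseline."""
        return statistics.mean(self.values) - self.baseline


emg, sc = SensorQueue("EMG"), SensorQueue("SC")
for queue in (emg, sc):
    queue.calibrate(n_baseline_samples=3 * 60 * 20)  # 3 minutes at 20 Hz

# One polling cycle, as when the ECA Max prompts the component for a reading.
for _ in range(WINDOW_SIZE):
    emg.sample()
    sc.sample()
print("EMG deviation:", emg.deviation(), "SC deviation:", sc.deviation())
```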
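The resolution step of Section 7 can be sketched in the same spirit. The poster states that the Bayesian network's probability values come from the psychophysiology literature but does not reproduce them, so every number below is a placeholder, the discrete levels ("low"/"high", "winning"/"losing") are assumptions, and the final quadrant-to-label mapping is only illustrative; a faithful implementation would perform proper inference over the network rather than the simple threshold checks used here.

```python
# Hand-rolled sketch of the emotion-resolution step in Section 7. All probability
# values and category levels are placeholders, not the authors' actual tables.

# Arousal is read directly from the categorized skin conductance (SC) level.
P_AROUSED_GIVEN_SC = {"low": 0.2, "medium": 0.5, "high": 0.9}

# EMG alone cannot fix the sign of valence, so the game status acts as the
# discriminating non-physiological node.
P_NEGATIVE_GIVEN_EMG_AND_GAME = {
    ("high", "losing"): 0.9,
    ("high", "winning"): 0.6,
    ("low", "losing"): 0.5,
    ("low", "winning"): 0.1,
}


def resolve_emotion(sc_level: str, emg_level: str, game_status: str) -> str:
    """Map categorized signal levels plus game status to a coarse emotion label."""
    p_aroused = P_AROUSED_GIVEN_SC[sc_level]
    p_negative = P_NEGATIVE_GIVEN_EMG_AND_GAME[(emg_level, game_status)]
    aroused = p_aroused > 0.5
    negative = p_negative > 0.5
    # Illustrative quadrant-to-label mapping; the poster's results figure shows
    # "angry", "happy", "surprised", and "reproach", but the exact mapping is not given.
    if aroused and negative:
        return "angry"
    if aroused and not negative:
        return "happy"
    if not aroused and negative:
        return "reproach"
    return "relaxed"


print(resolve_emotion(sc_level="high", emg_level="high", game_status="losing"))  # -> angry
```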
8. Conclusions and Future Work

1. Emotions are key components in the development of truly believable ECAs. Even if people do not perceive ECAs as human, some suspension of disbelief is possible when emotions come into play.
2. Empathic behavior contributes to a better interaction in terms of user experience.
3. In some cases, two signals alone may not be enough to handle the emotion recognition process properly. Therefore, other kinds of information, such as gaze and pupil dilation, will be included in the ERC to further enhance the emotion recognition network.