A Survey of Autonomous Human Affect Detection Methods for Social Robots Engaged in Natural HRI
Derek McColl, Alexander Hong, Naoaki Hatakeyama, Goldie Nejat, Beno Benhabib

This is a survey paper on human-robot interaction (HRI), so I'll explain some of the main concepts behind HRI and its evaluation. Then I'll go through some examples of research from the paper that I found interesting.

Human-Robot Interaction (HRI)
Engaging effectively in bidirectional communication requires responding appropriately to social cues and human affect.
Social intelligence allows a robot to relate to, understand, interact with, and share information with people in real-world human-centered environments.
Human-affect detection modalities:
- Facial expressions
- Body language
- Voice
- Physiological signals
- Combinations of the above
HRI encompasses not only how robots engage humans but also how humans respond to the interactions.

Social HRI
Social HRI is a subset of the field in which robots interact using natural human communication modalities:
- Speech
- Body language
- Facial expressions
This allows humans to interact with robots without prior training. Ultimately, less work is required of the user and tasks can be completed more efficiently.

Affect Categorization Models

Affect Categorization Models

Categorical models:
- A finite number of discrete states, each representing an affect category.
- Darwin's 6 states: happiness, surprise, fear, disgust, anger, and sadness.
- Tomkins' 9 states: joy, interest, surprise, anger, fear, shame, disgust, dissmell, and anguish.
- Clearly distinguish one affect from the others, but cannot classify any affect not included in the model.

Dimensional models:
- Continuous models that use feature vectors to represent affective expressions in multi-dimensional spaces, allowing varying intensities of affect to be represented.
- Can encompass all possible affective states and their variations, but are weaker at distinguishing one prominent affect from another, so affects can be confused.
- The most common is the two-dimensional circumplex model (typically valence and arousal).
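To make the two model families concrete, here is a minimal sketch of how a continuous circumplex estimate (valence, arousal) can be mapped back onto discrete categories. The prototype coordinates are illustrative assumptions for this sketch, not values from the survey.

```python
import math

# Illustrative prototype positions of discrete affect categories in the
# valence-arousal circumplex (coordinates are assumptions, not survey values).
PROTOTYPES = {
    "happiness": (0.8, 0.5),
    "surprise":  (0.2, 0.9),
    "fear":      (-0.6, 0.8),
    "anger":     (-0.7, 0.6),
    "disgust":   (-0.7, 0.1),
    "sadness":   (-0.7, -0.5),
    "neutral":   (0.0, 0.0),
}

def nearest_category(valence: float, arousal: float) -> str:
    """Map a continuous (valence, arousal) estimate to the closest
    discrete category by Euclidean distance in the circumplex plane."""
    return min(
        PROTOTYPES,
        key=lambda c: math.dist((valence, arousal), PROTOTYPES[c]),
    )

print(nearest_category(0.6, 0.4))    # -> "happiness"
print(nearest_category(-0.5, -0.4))  # -> "sadness"
```

This also illustrates the trade-off on the slide: the continuous point carries intensity information, but collapsing it to the nearest category discards it.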

HRI Scenarios

HRI Scenarios

Collaborative HRI:
- The robot and person work together to complete a common task.
- The robot must be able to identify affective states and avoid misinterpretations to improve team performance.

Assistive HRI:
- Aids people through physical, social, and/or cognitive assistance.
- Promotes effective engagement in interactions aimed at improving health or wellbeing, for example assisting the elderly or children with autism.

HRI Scenarios

Mimicry HRI:
- The robot or person imitates the verbal or nonverbal behaviors of the other; this can be used to encourage coordination and cooperation.
- For example, a robot can imitate the facial expressions of a human user.
- Sony's AIBO robotic dog uses LEDs and moves its body, head, and ears to imitate human affect.

Multi-Purpose HRI:
- Bidirectional communication for various applications, encompassing all of the prior forms of interaction.

Selected Research

Facial Expression Based Affect Recognition
Face-to-face communication is fundamental; facial expressions are used for social motives and directly influence the behaviors of others in social settings.
Emulating empathy requires the robot to recognize affective states, display its own affect, and take the user's perspective.
Challenges:
- Conditions such as lighting and distance
- Real-time computational requirements
- Processing spontaneous and dynamic expressions and states
- Physical and technology constraints
Models: AdaBoost, multi-layer perceptrons, SVMs, neural networks, dynamic Bayesian networks

Example: iCat
- Children played chess against the iCat robot; the child's affect was classified as positive, negative, or neutral.
- A variety of features were fed into SVMs to estimate the probability of one of three valence levels.
- An empathetic strategy was employed when the player was not experiencing positive valence.
- A reinforcement technique was used to adapt the robot's behavior to the player.
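As a rough illustration of the pipeline described above (facial feature vectors fed to an SVM that estimates the probability of three valence levels), here is a hedged sketch using scikit-learn. The feature dimensionality and the random training data are placeholders, not the actual iCat features or model.

```python
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

# Placeholder training data: each row stands in for a facial-feature vector
# (e.g. distances/angles between tracked landmarks); labels are three
# valence levels. A real system would extract these from video frames.
X_train = np.random.rand(300, 20)
y_train = np.random.choice(["negative", "neutral", "positive"], size=300)

# probability=True lets the SVM output class probabilities, mirroring the
# "probability of one of three valence levels" idea on the slide.
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", probability=True))
clf.fit(X_train, y_train)

new_frame_features = np.random.rand(1, 20)
probs = clf.predict_proba(new_frame_features)[0]
for label, p in zip(clf.classes_, probs):
    print(f"{label}: {p:.2f}")
```

An empathetic strategy could then be triggered whenever the "positive" probability falls below a chosen threshold.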

Body Language Based Affect Recognition
Body language such as arm positions, body orientation, and trunk relaxation provides information about human affect.
Challenges:
- Identifying affective states from body language is still an open area in psychology and human behavior research.
- Most of today's research focuses on hand and arm gestures.
Models: SVM, KNN

Example: Mimicry HRI with NAO
- NAO was used to engage children in movement imitation games.
- A camera was used to detect the child's movements; the robot then determined the energy and acceleration of the movement.
- The researchers also investigated how a user's body language and mood changed during mimicry.
- NAO was able to distinguish happy, angry, sad, and polite waves based on an acceleration-frequency matrix of the gesture.
https://www.youtube.com/watch?v=2laujomh0JY
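The wave-classification idea (happy, angry, sad, and polite waves distinguished from an acceleration-frequency description of the gesture) could look roughly like the KNN sketch below. The 4x8 matrix shape and the synthetic data are assumptions for illustration only.

```python
import numpy as np
from sklearn.neighbors import KNeighborsClassifier

# Each gesture is summarized here as a flattened acceleration-frequency
# matrix (rows: acceleration bins, columns: frequency bins). The shape and
# the random data are illustrative assumptions, not the NAO study's features.
def featurize(accel_freq_matrix: np.ndarray) -> np.ndarray:
    return accel_freq_matrix.reshape(-1)

rng = np.random.default_rng(0)
labels = ["happy", "angry", "sad", "polite"]
X = np.stack([featurize(rng.random((4, 8))) for _ in range(200)])
y = rng.choice(labels, size=200)

knn = KNeighborsClassifier(n_neighbors=5)
knn.fit(X, y)

unseen_gesture = featurize(rng.random((4, 8)))
print(knn.predict([unseen_gesture])[0])  # predicted wave style
```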

Affect-aware robots mainly played a collaborative role, playing the game with the user, or were involved in behavioral mimicry. https://www.youtube.com/watch?v=JokcWOIXt-c

Voice Based Affect Recognition
Vocal displays of affect are determined by both physiological changes and social display rules.
Changes in frequency, articulation, and high-frequency energy can be used to determine affective states.
Models include Naïve Bayes and HMMs.

Example: Collaborative HRI with KaMERo
- The KaMERo robot was designed to engage individuals in games of '20 questions' to identify an object the user was thinking of.
- It used an onboard microphone to detect emotions and to determine its own emotional display.
- Voice signal features were used to train one HMM per emotion.
- Participants enjoyed the interaction more when the robot displayed emotional behavior.

Example: Multi-Purpose HRI with Mung
- The Mung robot was developed to detect affective states and interact with humans by asking questions.
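The "one HMM per emotion, pick the best-scoring model" pattern mentioned above can be sketched as follows, assuming the hmmlearn library and synthetic acoustic feature sequences in place of real per-frame pitch, energy, or MFCC features.

```python
import numpy as np
from hmmlearn import hmm

# One HMM per emotion, trained on sequences of acoustic feature vectors.
# The data below is synthetic; a real system would extract features from
# recorded speech frames.
rng = np.random.default_rng(1)
emotions = ["neutral", "joy", "sadness", "anger"]

def fake_sequences(n_seqs=20, seq_len=50, n_feats=6):
    seqs = [rng.normal(size=(seq_len, n_feats)) for _ in range(n_seqs)]
    return np.concatenate(seqs), [seq_len] * n_seqs

models = {}
for emotion in emotions:
    X, lengths = fake_sequences()
    model = hmm.GaussianHMM(n_components=3, covariance_type="diag", n_iter=20)
    model.fit(X, lengths)          # Baum-Welch training on this emotion's data
    models[emotion] = model

# Classify a new utterance by the emotion model with the highest log-likelihood.
utterance, _ = fake_sequences(n_seqs=1)
scores = {e: m.score(utterance) for e, m in models.items()}
print(max(scores, key=scores.get))
```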

A small round robot, Mung, was developed to detect the affective states of neutral, joy, sadness, and anger https://www.youtube.com/watch?v=RSc-Qwe98O8&t=3s

Physiological Signals
These methods usually use ECG, EMG, or skin sensors to measure heart rate, facial muscle activity, and skin conductance level.
Models include HMMs, SVMs, NNs, and other learning techniques.

Example: Collaborative HRI with Trilobot
- The Trilobot robot was developed for collaboration during a navigation and exploration task, such as assisting an astronaut.
- A person's anxiety level is detected using ECG, skin conductance, and muscle movement; the robot detects anxiety, asks whether assistance is needed, and alerts other people when necessary.
- HMMs were used to classify a person's valence using heart rate, skin conductance, and the activity of a muscle above the eyebrow.
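One common way to turn raw physiological streams into classifier inputs is windowed statistics. The sketch below is an assumption about that preprocessing step (window length and the chosen statistics are illustrative), not the Trilobot system's actual features.

```python
import numpy as np

def window_features(signal: np.ndarray, fs: float, win_s: float = 5.0):
    """Slice a 1-D physiological signal (e.g. skin conductance) into
    non-overlapping windows and compute simple per-window statistics.
    The window length and statistics are illustrative assumptions."""
    win = int(win_s * fs)
    n = len(signal) // win
    feats = []
    for i in range(n):
        seg = signal[i * win:(i + 1) * win]
        feats.append([seg.mean(), seg.std(), seg.max() - seg.min()])
    return np.array(feats)

# Synthetic 60 s skin-conductance trace sampled at 32 Hz.
rng = np.random.default_rng(2)
scl = np.cumsum(rng.normal(scale=0.01, size=60 * 32)) + 2.0

features = window_features(scl, fs=32)
print(features.shape)  # one feature row per 5 s window, fed to an HMM or SVM
```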

Automated affect detection would promote effective engagement in interactions aimed at improving a person’s health and wellbeing, e.g. interventions for children with autism and for the elderly https://www.youtube.com/watch?v=cFACKqWu9Nk

Multimodal
Multimodal approaches use more sensors than the other modalities: 2D cameras, microphones, ECG, EMG.
They have the advantage that when one modality is unavailable due to disturbances or occlusion, the other modalities remain available.
However, multimodal data is more difficult to acquire and process, making it challenging for learning algorithms.

Example: Mimicry HRI with Muecas
- The Muecas robot was used for speech recognition and facial expression recognition.
- Muecas was able to imitate human affect and respond in real time with facial expressions and a head pose.
- Facial expression recognition was accomplished through Gabor filtering and DBN classification.
- Speech rate was calculated using a Fourier transform.
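A simple way to realize the robustness argument above is late fusion of per-modality class probabilities, with a graceful fallback when a modality is occluded or too noisy. This sketch is generic and not the Muecas pipeline; the labels, weights, and example probabilities are assumptions.

```python
import numpy as np

LABELS = ["neutral", "joy", "sadness", "anger"]

def fuse(face_probs=None, voice_probs=None, weights=(0.6, 0.4)):
    """Weighted late fusion of per-modality class probabilities.
    If a modality is unavailable (None), its weight is redistributed,
    so the system degrades gracefully under occlusion or noise."""
    modalities = [(face_probs, weights[0]), (voice_probs, weights[1])]
    available = [(np.asarray(p), w) for p, w in modalities if p is not None]
    if not available:
        raise ValueError("no modality available")
    total_w = sum(w for _, w in available)
    fused = sum(p * (w / total_w) for p, w in available)
    return LABELS[int(np.argmax(fused))]

face = [0.1, 0.7, 0.1, 0.1]    # facial-expression classifier output
voice = [0.2, 0.3, 0.1, 0.4]   # speech classifier output
print(fuse(face, voice))       # both modalities available
print(fuse(None, voice))       # face occluded: falls back to voice only
```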

Recent and Future work

Future Work
- Investigating a variety of affect categorization models to respond to the large number of states displayed by humans.
- Working on 3D recognition, since current methods work well only when the human is facing the camera.
- Finding combinations of sensors and features that yield the highest recognition rates.
- Body language is very specific to culture and context; further research will work to integrate this aspect into body language recognition.
- Long-term studies will bring us closer to an age in which social robots are integrated into everyday life.

What is interesting about HRI?
HRI research doesn't simply focus on whether robots give 'right or wrong' responses to human behavior; it also examines human behavior toward robots and how that behavior is influenced over the course of the interaction.
There is a wide variety of use cases, including healthcare, partner robots, and navigation and exploration.