A Survey of Autonomous Human Affect Detection Methods for Social Robots Engaged in Natural HRI
Derek McColl, Alexander Hong, Naoaki Hatakeyama, Goldie Nejat, Beno Benhabib
This is a survey paper on Human-Robot Interaction (HRI), so I'll be explaining some of the main concepts behind HRI and its evaluation. Then, I'll go through some examples of research I culled from the paper which I found interesting.
Human-Robot Interactions (HRI)
Engaging effectively in bi-directional communication requires responding appropriately to social cues and human affect.
Social intelligence allows a robot to relate to, understand, interact with, and share information with people in real-world, human-centered environments.
Human-affect detection can draw on facial expressions, body language, voice, physiological signals, or a combination of the above.
HRI encompasses not only how robots engage humans but also how humans respond to the interactions.
Social HRI
Social HRI is a subset of the field where robots interact using natural human communication modalities: speech, body language, and facial expressions.
This allows humans to interact with robots without prior training; ultimately, less work is required of the user and tasks can be completed more efficiently.
Affect Categorization Models
Categorical models: a finite number of discrete states, each representing an affect category. Examples include Darwin's 6 states (happiness, surprise, fear, disgust, anger, and sadness) and Tomkins' 9 states (joy, interest, surprise, anger, fear, shame, disgust, dissmell, and anguish). These models clearly distinguish one affect from the others but lack the ability to classify any affect not included in the model.
Dimensional models: continuous models that use feature vectors to represent affective expressions in multi-dimensional spaces, allowing varying intensities of affect to be represented. These models can encompass all possible affective states and their variations, but lack the strength to distinguish one prominent affect from another, and affects can often be confused. The most common is the two-dimensional circumplex model (valence and arousal); a small sketch of this representation follows.
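As a concrete illustration of the dimensional view, here is a minimal sketch that maps a point in a two-dimensional valence-arousal circumplex to a coarse affect label. The quadrant labels and the [-1, 1] value ranges are illustrative assumptions, not values taken from the survey.

```python
# A minimal sketch: map a (valence, arousal) point to a coarse affect label.
# Quadrant labels and the [-1, 1] ranges are illustrative assumptions.

def circumplex_quadrant(valence: float, arousal: float) -> str:
    """Map a (valence, arousal) point in [-1, 1] x [-1, 1] to a coarse affect label."""
    if valence >= 0 and arousal >= 0:
        return "excited/happy"   # positive valence, high arousal
    if valence >= 0:
        return "calm/content"    # positive valence, low arousal
    if arousal >= 0:
        return "angry/afraid"    # negative valence, high arousal
    return "sad/bored"           # negative valence, low arousal

print(circumplex_quadrant(0.7, 0.4))    # excited/happy
print(circumplex_quadrant(-0.3, -0.6))  # sad/bored
```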
HRI Scenarios
Collaborative HRI: the robot and person work together to complete a common task. The robot must be able to identify the person's affective state and avoid misinterpretations to improve team performance.
Assistive HRI: the robot aids people through physical, social, and/or cognitive assistance, promoting effective engagement in interactions aimed at improving health or wellbeing, for example assisting the elderly or children with autism.
Mimicry HRI: the robot or person imitates verbal or nonverbal behaviors of the other, which can be used to encourage coordination and cooperation. For example, a robot can imitate the facial expressions of a human user; Sony's AIBO robotic dog uses LEDs and moves its body, head, and ears to imitate human affect.
Multi-Purpose HRI: bidirectional communication for various applications, encompassing all of the prior forms of interaction.
Selected Research
Facial Recognition
Face-to-face communication is fundamental; facial expressions are used for social motives and directly influence the behaviors of others in social settings.
Emulating empathy requires a robot to recognize affective states, display its own affect, and express perspective intentions.
Challenges: conditions such as lighting and distance, real-time computational requirements, processing spontaneous and dynamic expressions and states, and physical and technology constraints.
Models: AdaBoost, multi-layer perceptrons, SVMs, NNs, dynamic Bayesian networks.
iCat: children played chess with the robot cat, and the child's affect was classified as positive, negative, or neutral. A variety of features was fed into SVMs to estimate the probability of one of three valence levels (see the sketch below). An empathetic strategy was employed when the player was not experiencing positive valence, and a reinforcement technique was used to adapt the robot's behavior to the player.
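Below is a minimal sketch of the kind of probability-estimating SVM described for iCat, assuming facial features have already been extracted per frame; the feature count, labels, and random data are placeholders, not the actual iCat pipeline.

```python
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

# Hypothetical per-frame facial feature vectors and valence labels.
rng = np.random.default_rng(0)
X_train = rng.random((90, 12))  # 90 frames, 12 facial features each
y_train = rng.choice(["negative", "neutral", "positive"], size=90)

# Probability-calibrated SVM over standardized features.
clf = make_pipeline(StandardScaler(), SVC(probability=True))
clf.fit(X_train, y_train)

# Estimate the probability of each of the three valence levels for a new frame.
probs = clf.predict_proba(rng.random((1, 12)))[0]
for label, p in zip(clf.classes_, probs):
    print(f"{label}: {p:.2f}")
```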
Body Language Based Affect Recognition
Body language such as arm positions, body orientation, and trunk relaxation provides information about human affect.
Challenges: identifying affective states from body language is still an open area in psychology and human behavior, and most of today's research is focused on hand and arm gestures.
Models: SVM, KNN.
Mimicry HRI: NAO was used to engage children in movement imitation games. A camera was used to detect the child's movements, and the robot then determined the energy and acceleration of the movement. Researchers also investigated how the body language and mood of a user changed during mimicry. NAO was able to distinguish happy, angry, sad, and polite waves based on an acceleration-frequency matrix of the gesture (see the sketch below).
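The sketch below illustrates one way an "acceleration-frequency" style gesture descriptor could be built and classified with KNN; the band-energy features, labels, and synthetic data are assumptions for illustration, not the NAO study's exact method.

```python
import numpy as np
from sklearn.neighbors import KNeighborsClassifier

def acceleration_frequency_features(accel: np.ndarray, n_bands: int = 8) -> np.ndarray:
    """Summarize a 1-D acceleration signal by its mean energy in a few frequency bands."""
    spectrum = np.abs(np.fft.rfft(accel))
    bands = np.array_split(spectrum, n_bands)
    return np.array([band.mean() for band in bands])

# Hypothetical labelled wave gestures (happy, angry, sad, polite).
rng = np.random.default_rng(0)
X = np.stack([acceleration_frequency_features(rng.standard_normal(128)) for _ in range(40)])
y = rng.choice(["happy", "angry", "sad", "polite"], size=40)

# Nearest-neighbour classification over the band-energy descriptors.
knn = KNeighborsClassifier(n_neighbors=3).fit(X, y)
print(knn.predict(X[:1]))  # classify one gesture
```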
Affect-aware robots mainly played a collaborative role by playing the game with the user, or they were involved in behavioral mimicry.
Voice Based Affect
Vocal displays of affect are shaped by both physiological changes and social display rules. Changes in frequency, articulation, and high-frequency energy can be used to determine affective states.
Models include Naïve Bayes and HMMs.
Collaborative HRI: the KaMERo robot was designed to engage individuals in games of '20 questions' to identify the object the user was thinking of. It used an onboard microphone to detect emotions and to determine its own emotional display; participants enjoyed the interaction more when the robot displayed emotional behavior. Voice signal features were trained with HMMs for different emotions (a sketch of this one-model-per-emotion idea appears below).
Multi-Purpose HRI: the Mung robot was developed to detect affective states and interact with humans by asking questions.
A small round robot, Mung, was developed to detect the affective states of neutral, joy, sadness, and anger
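Here is a minimal sketch of the one-HMM-per-emotion approach mentioned for the voice-based systems, using the four states listed for Mung as example labels; the frame-level features (pitch and energy) and the synthetic data are assumptions, not the feature sets actually used.

```python
import numpy as np
from hmmlearn.hmm import GaussianHMM

rng = np.random.default_rng(0)
emotions = ["neutral", "joy", "sadness", "anger"]

# Hypothetical training data: (n_frames, 2) feature sequences of [pitch, energy] per frame,
# shifted per emotion only so the toy models become distinguishable.
train = {e: rng.standard_normal((200, 2)) + i for i, e in enumerate(emotions)}

# Train one Gaussian HMM per emotion on that emotion's feature frames.
models = {e: GaussianHMM(n_components=3, n_iter=20).fit(X) for e, X in train.items()}

# Classify a new utterance by whichever model assigns the highest log-likelihood.
utterance = rng.standard_normal((60, 2)) + 1.0
scores = {e: m.score(utterance) for e, m in models.items()}
print(max(scores, key=scores.get))
```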
Physiological Signals
ECG, EMG, or skin sensors are usually used to identify heart rate, facial muscle activity, and skin conductance level.
Models include HMMs, SVMs, NNs, and other learning techniques.
Collaborative HRI: the Trilobot robot was developed for collaboration during a navigation and exploration task, such as assisting an astronaut. A person's anxiety level was detected using ECG, skin conductance, and muscle movement; the robot would detect anxiety levels, ask if assistance was needed, and then alert other people when necessary. HMMs were used to classify a person's valence using heart rate, skin conductance, and muscle activity above an eyebrow (a sketch of this kind of windowed feature extraction follows).
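Below is a minimal sketch of turning raw physiological streams into per-window features that an HMM or SVM could consume for valence or anxiety estimation; the channel names, window length, and synthetic data are assumptions, not the Trilobot study's parameters.

```python
import numpy as np

def window_features(signal: np.ndarray, win: int = 250) -> np.ndarray:
    """Mean and standard deviation of each non-overlapping window of a 1-D signal."""
    n = len(signal) // win
    windows = signal[: n * win].reshape(n, win)
    return np.column_stack([windows.mean(axis=1), windows.std(axis=1)])

# Hypothetical streams sampled at the same rate: heart rate, skin conductance, eyebrow EMG.
rng = np.random.default_rng(0)
streams = {name: rng.standard_normal(5000) for name in ["heart_rate", "sc_level", "emg_brow"]}

# Concatenate per-channel window statistics into one feature vector per window,
# which could then be fed to an HMM or SVM for valence/anxiety estimation.
features = np.hstack([window_features(s) for s in streams.values()])
print(features.shape)  # (n_windows, 6)
```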
Automated affect detection would promote effective engagement in interactions aimed at improving a person’s health and wellbeing, e.g. interventions for children with autism and for the elderly
Multimodal
Multimodal affect recognition uses more sensors than the other modalities: 2D cameras, microphones, ECG, EMG. It has the advantage that when one modality is unavailable due to disturbances or occlusion, other modes remain available. However, multimodal data is more difficult to acquire and process, making it challenging for learning algorithms.
Mimicry HRI: the Muecas robot was used for speech recognition and facial expression recognition. Muecas was able to imitate human affect and respond in real time with facial expressions and a head pose. Facial expression recognition was accomplished through Gabor filtering and DBN classification (a sketch of Gabor-filter feature extraction follows), and speech rate was calculated using a Fourier transform.
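Below is a minimal sketch of Gabor-filter feature extraction from a grayscale face crop, the kind of front end that could precede a DBN (or other) classifier; the filter-bank parameters, mean/std pooling, and placeholder image are assumptions, not the Muecas configuration.

```python
import cv2
import numpy as np

# Placeholder grayscale face crop (in practice, a detected and cropped face region).
face = np.random.randint(0, 256, (64, 64), dtype=np.uint8)

features = []
for theta in np.arange(0, np.pi, np.pi / 4):   # 4 orientations
    for lambd in (4.0, 8.0):                   # 2 wavelengths
        # Arguments: kernel size, sigma, orientation, wavelength, spatial aspect ratio.
        kernel = cv2.getGaborKernel((9, 9), 2.0, theta, lambd, 0.5)
        response = cv2.filter2D(face, cv2.CV_32F, kernel)
        features.extend([response.mean(), response.std()])  # pool each response map

print(len(features))  # 4 orientations x 2 wavelengths x 2 statistics = 16 features
```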
Recent and Future Work
Future Work
Investigation of a variety of affect categorization models to respond to the large number of states displayed by humans.
Work on 3D recognition, since current methods work well only when the human is directly facing the camera.
Combinations of sensors and features for the highest recognition rates.
Body language is very specific to culture and context; further research will work to integrate this aspect into body language recognition.
Long-term studies will get us closer to an age where social robots are integrated into everyday life.
What is interesting about HRI?
HRI research doesn't simply focus on whether robots give 'right or wrong' responses to human behavior; it also examines human behavior with respect to robots and how that behavior is influenced throughout the interaction with the robot. There is a wide variety of use cases, including healthcare, partner robots, and navigation and exploration.