Emotion and Sociable Humanoid Robots (Cynthia Breazeal) Yumeng Liao.


Usually, robots are designed as autonomous entities that perform dangerous tasks in toxic environments. Even in less dangerous tasks, human-robot interaction is minimal.

HCI research by Reeves and Nass shows that “a social interface may be a truly universal interface.” Humanoid robots can communicate using the human social interface (facial expression, posture, etc.).

Embodied system: a human interacts with a robot or an animated avatar. Social cues enhance the robot's message, and the robot must in turn sense the human's embodied social cues.

Embodied conversational agents: Rea, a real-estate agent in a virtual world; Steve, a virtual-reality trainer.

Human-friendly robots: Sage, a mobile museum tour guide; Sony AIBO, a children's toy.

Expressive-face robots: Japanese robots with human-like faces.

Kismet: an expressive anthropomorphic robot that engages in natural, expressive face-to-face interaction. 3 DoF control gaze, 3 DoF control head orientation, and 15 DoF move the facial features (eyelids, eyebrows, lips, ears).

One narrow-field-of-view camera sits in each eye, two wide-field-of-view cameras sit between the eyes, and there is one microphone on each ear; the person interacting with Kismet also wears a microphone.

Kismet is meant to learn from natural and intuitive interaction with humans.

Kismet's behavior aims to satisfy three drives: social, stimulation, and fatigue. Each drive is homeostatic: with no stimulation, a drive's intensity increases until it is satiated, and the goal is to maintain its intensity within a bounded range (outside that range the robot is overwhelmed or under-stimulated).
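The homeostatic drive idea above can be sketched in code. This is a minimal illustrative model, not Kismet's actual implementation: the class name, bounds, and drift rate are all assumptions chosen to show how a drive grows without stimulation and gets clamped to a homeostatic range.

```python
# Hypothetical sketch of a Kismet-style homeostatic drive (names and
# constants are illustrative assumptions, not from the paper).

class Drive:
    """A drive whose intensity drifts upward without stimulation and
    must be kept inside a homeostatic range."""

    def __init__(self, name, low=-1.0, high=1.0, drift=0.1):
        self.name = name
        self.low, self.high = low, high   # homeostatic bounds
        self.drift = drift                # growth rate with no stimulation
        self.intensity = 0.0

    def update(self, stimulation=0.0):
        # Stimulation satiates the drive; otherwise it drifts upward.
        self.intensity += self.drift - stimulation
        self.intensity = max(self.low, min(self.high, self.intensity))
        return self.state()

    def state(self):
        if self.intensity >= self.high:
            return "under-stimulated"   # drive grew unchecked
        if self.intensity <= self.low:
            return "overwhelmed"        # too much stimulation
        return "homeostatic"

drives = [Drive(n) for n in ("social", "stimulation", "fatigue")]
for _ in range(15):                      # no interaction for 15 ticks
    states = [d.update(stimulation=0.0) for d in drives]
print(states)  # all three drives end up "under-stimulated"
```

In a behavior system like Kismet's, the drive furthest from its homeostatic range would then bias which behaviors and expressions get selected.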

Theory of basic emotions: emotions are primary and endowed by evolution (anger, disgust, fear, joy, sorrow, surprise). Each serves a biological or social function in context and becomes more nuanced through development; together they form a relevance-detection and response-preparation system.

Russell's Pleasure-Arousal Space
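Russell's space places affective states on two continuous axes, pleasure (valence) and arousal, with discrete emotion labels as regions of the plane. A minimal sketch of the idea, assuming rough illustrative coordinates (these are not Kismet's actual values):

```python
# Illustrative sketch of Russell's pleasure (valence) - arousal space:
# each basic emotion is placed at an assumed 2-D coordinate, and an
# affective state is labeled by its nearest emotion.
import math

EMOTIONS = {
    "joy":      ( 0.8,  0.5),
    "surprise": ( 0.2,  0.9),
    "anger":    (-0.6,  0.8),
    "fear":     (-0.7,  0.6),
    "disgust":  (-0.7,  0.2),
    "sorrow":   (-0.6, -0.5),
    "calm":     ( 0.5, -0.6),
}

def nearest_emotion(pleasure, arousal):
    """Return the emotion label closest to a (pleasure, arousal) point."""
    point = (pleasure, arousal)
    return min(EMOTIONS, key=lambda e: math.dist(EMOTIONS[e], point))

print(nearest_emotion(0.7, 0.4))    # pleasant, moderately aroused -> "joy"
print(nearest_emotion(-0.5, -0.4))  # unpleasant, low arousal -> "sorrow"
```

The continuous axes are what let an expressive face blend smoothly between emotions rather than jumping between discrete categories.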

Human perception of Kismet (questionnaire): matching face alone to emotion, 47-83% correct; matching face and posture to emotion, 57-86% correct.

Real-time interaction gives the exchange its dynamic quality. Kismet recognizes tone of voice (praise, prohibition, attention bids, soothing), which supports emotional feedback cycles and affective mirroring.

Conclusion: for compelling and engaging interactions, the robot must do the right thing, at the right time, in the right manner!