Peeping into the Human World

Sociable Robots: Peeping into the Human World

An Infant's Advantages
- Non-hostile environment
- Actively benevolent, empathic caregiver
- Co-exists with a mature version of itself

Baby Scheme
- Physical form can evoke a nurturing response
- Caregiver exaggerates voice and gestures

Kismet – a Baby Robot

Requirements
- Robot needs to perceive human state: computer vision, speech processing
- Human needs to perceive robot state: animatronics, speech generation
- Closed-loop interaction requires both directions

Readable locus of attention
- Attention can be deduced from behavior
- Or it can be expressed more directly

Expressing locus of attention
- Animatronic expression of the locus of attention
- Small distance between the wide and narrow field-of-view cameras simplifies the mapping between the two
- Central location of the wide camera lets the head be oriented accurately, independent of the distance to an object
- Allows coarse open-loop control of eye direction from the wide camera, which improves gaze stability

Computing locus of attention
- "Cyclopean" wide-view camera plus a stereo pair in the eyes
- A relatively stable view is useful for computer vision tasks; this is a deviation from the human arrangement, where the retinotopic view is quite unstable
- A narrow field of view is still needed for precise foveation
- A wide field-of-view camera could be added to the eyes (the solution used on Cog), but this was aesthetically tricky with the cameras of the time (and remains so for wide field of view at reasonable cost), so the wide camera is placed on the head
- Auxiliary benefit: the head camera is not affected at all by eye movement, which makes for a more stable system
- Open-loop control of the eyes based on head-fixed imaging, refined by closed-loop control based on eye-fixed imaging
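
As a rough, hedged illustration of that last bullet, here is a minimal sketch of the two-stage gaze control in Python; the camera geometry, gains, and the `move_eyes` / `detect_target_offset` hooks are hypothetical stand-ins, not Kismet's actual control code.

```python
# Sketch: ballistic (open-loop) saccade computed from the head-fixed wide camera,
# then closed-loop refinement using the target's offset in the eye-fixed foveal image.
# All constants and helper hooks are illustrative assumptions.

def pixel_to_gaze_angles(px, py, image_size=(320, 240), fov_deg=(110.0, 90.0)):
    """Convert a wide-camera pixel to approximate pan/tilt angles in degrees."""
    w, h = image_size
    pan = (px - w / 2.0) / w * fov_deg[0]
    tilt = (py - h / 2.0) / h * fov_deg[1]
    return pan, tilt

def saccade_and_fixate(target_px, move_eyes, detect_target_offset,
                       gain=0.3, max_steps=20, tol_deg=0.5):
    # Open-loop phase: one ballistic move based on head-fixed imaging.
    pan, tilt = pixel_to_gaze_angles(*target_px)
    move_eyes(pan, tilt)
    # Closed-loop phase: servo out the residual error seen by the foveal camera.
    for _ in range(max_steps):
        err_pan, err_tilt = detect_target_offset()  # degrees off foveal center
        if abs(err_pan) < tol_deg and abs(err_tilt) < tol_deg:
            break
        pan += gain * err_pan
        tilt += gain * err_tilt
        move_eyes(pan, tilt)
    return pan, tilt
```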

Computing locus of attention
- Primary information flow: visual input (cyclopean camera) → pre-attentive filters → saliency map → eye movement and pursuit
- The behavior system exerts a modulatory influence on this flow through perceptual biases
- The biases do not have to be exact, since the final decision is made attentively
- Analogous to pop-out, pre-attentive processes in humans: massively parallel, relatively simple operations that filter and prioritize the scene
- Gives coarse behavioral influence over attention
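
The primary flow above amounts to a gain-weighted combination of pre-attentive feature maps; the sketch below illustrates that idea in Python/NumPy. The feature names, gain values, and habituation term are illustrative assumptions, not Kismet's actual attention code.

```python
import numpy as np

def attention_target(feature_maps, gains, habituation=None):
    """Combine pre-attentive feature maps into a saliency map and pick a target.

    feature_maps: dict of name -> 2D array in [0, 1] (e.g. color, skin, motion)
    gains:        dict of name -> float, modulated by the behavior system
    habituation:  optional 2D array subtracted so the current winner does not persist forever
    """
    saliency = np.zeros_like(next(iter(feature_maps.values())), dtype=float)
    for name, fmap in feature_maps.items():
        saliency += gains.get(name, 0.0) * fmap
    if habituation is not None:
        saliency -= habituation
    y, x = np.unravel_index(np.argmax(saliency), saliency.shape)
    return (x, y), saliency

# Usage: the behavior system only adjusts the gains; the filters themselves stay fixed.
maps = {"color": np.random.rand(48, 64),
        "skin": np.random.rand(48, 64),
        "motion": np.random.rand(48, 64)}
target, smap = attention_target(maps, gains={"color": 0.2, "skin": 1.0, "motion": 0.5})
```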

Looking Preference
- Internal influences bias how salience is measured
- "Seek toy": low skin gain, high saturated-color gain; looking time 28% face, 72% block
- "Seek face": high skin gain, low color-saliency gain; looking time 80% face, 20% block
- The robot is not a slave to its environment; it prefers behaviorally relevant stimuli
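
One way to make those internal influences concrete is a table of gain presets selected by the active behavior and fed into an attention function like the one sketched above; the numeric values here are invented and only echo the low/high pattern on the slide.

```python
# Invented gain presets; only the relative low/high pattern mirrors the slide.
GAIN_PRESETS = {
    "seek_toy":  {"skin": 0.1, "color": 1.0, "motion": 0.5},  # low skin, high saturated-color
    "seek_face": {"skin": 1.0, "color": 0.1, "motion": 0.5},  # high skin, low color saliency
}

def gains_for(active_behavior, default=0.5):
    """Return per-feature gains for the current behavior, with a neutral fallback."""
    preset = GAIN_PRESETS.get(active_behavior, {})
    return {name: preset.get(name, default) for name in ("skin", "color", "motion")}

# e.g. attention_target(maps, gains_for("seek_face")) will tend to fixate the face region.
```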

Example (video)

Example (video)
- Robot's search is task-specific, yet still opportunistic when appropriate
- Visual behavior conveys degree of commitment
- Gaze direction and expression convey interest

Interpersonal Distance
- Comfortable interaction distance

Examples (video)
- "Back off, buster!" / "Come hither, friend"
- Robot backs away if a person comes too close; this cues the person to back away too (social amplification)
- Robot makes itself salient to call a person closer if they are too far away
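
A minimal sketch of that distance regulation, assuming made-up thresholds and response names; the slide gives only the qualitative rules.

```python
# Illustrative thresholds in meters; not Kismet's actual calibration.
TOO_CLOSE_M = 0.4
TOO_FAR_M = 1.5

def regulate_distance(person_distance_m):
    """Pick a social response that nudges the person toward a comfortable range."""
    if person_distance_m < TOO_CLOSE_M:
        # Withdrawing also cues the person to back off ("social amplification").
        return "withdraw_and_look_annoyed"
    if person_distance_m > TOO_FAR_M:
        # Becoming more salient (perking up, calling) draws the person closer.
        return "perk_up_and_call"
    return "engage_normally"
```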

Negotiating object showing
- Comfortable interaction speed
- Too fast: irritation response
- Too fast and too close: threat response
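
The speed negotiation can be sketched the same way, as thresholds on how fast and how near an object is being presented; the numbers below are placeholders, since the slide only states the qualitative behavior.

```python
# Placeholder limits; the slide specifies only the qualitative rules.
MAX_COMFORTABLE_SPEED_MPS = 0.5   # approach speed toward the robot
THREAT_DISTANCE_M = 0.3

def object_showing_response(approach_speed_mps, distance_m):
    """Respond to how an object is being presented to the robot."""
    if approach_speed_mps > MAX_COMFORTABLE_SPEED_MPS:
        if distance_m < THREAT_DISTANCE_M:
            return "threat_response"      # too fast and too close
        return "irritation_response"      # too fast
    return "attend_to_object"
```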

Example (video)

Facial expressions

Facial Expressions (Russell; Smith & Scott)
- Emotions can be mapped to affect dimensions (Russell)
- Facial postures are related in a systematic way to these affective dimensions (Smith & Scott)
[Figure: Russell's circumplex, with emotion labels such as surprise, elation, anger, happiness, frustration, sadness, contentment, calm, boredom, and sleepiness arranged around the arousal and pleasure-displeasure axes]

Generating Posture, Expressions
[Figure: affect space spanned by arousal (high-low), valence (positive-negative), and stance (open-closed), with basis expressions such as anger, fear, tiredness, and contentment placed at its extremes]
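
One common way to realize this is to anchor a few basis postures at points in the (arousal, valence, stance) space and blend them by proximity to the current affective state. The basis poses, anchor points, and blending rule below are illustrative assumptions, not Kismet's actual expression engine.

```python
import numpy as np

# Basis facial postures as (brow, lids, lips, ears) actuator values in [-1, 1],
# each anchored at a point in (arousal, valence, stance) space. Values are illustrative.
BASIS = {
    "anger":   {"point": ( 1.0, -1.0, -1.0), "pose": np.array([-0.8,  0.2, -0.6, -0.7])},
    "fear":    {"point": ( 1.0, -1.0,  1.0), "pose": np.array([ 0.7,  0.9, -0.3, -0.9])},
    "content": {"point": (-0.5,  1.0,  1.0), "pose": np.array([ 0.1, -0.3,  0.5,  0.3])},
    "tired":   {"point": (-1.0, -0.3,  0.0), "pose": np.array([-0.2, -0.8, -0.2, -0.4])},
    "neutral": {"point": ( 0.0,  0.0,  0.0), "pose": np.zeros(4)},
}

def blend_expression(arousal, valence, stance, sharpness=2.0):
    """Interpolate basis postures, weighting each by inverse distance in affect space."""
    state = np.array([arousal, valence, stance])
    weights, poses = [], []
    for basis in BASIS.values():
        d = np.linalg.norm(state - np.array(basis["point"]))
        weights.append(1.0 / (d ** sharpness + 1e-6))
        poses.append(basis["pose"])
    weights = np.array(weights) / np.sum(weights)
    return np.sum(weights[:, None] * np.array(poses), axis=0)

# e.g. high arousal, negative valence, closed stance -> a pose near the "anger" basis
pose = blend_expression(0.9, -0.8, -0.9)
```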

Example Facial Expression

With Posture (video)

Emotive Voice Quality

Synthesized Emotive Speech

With Vocalizations (video)

But… Is there more to life than being really, really, really, ridiculously good-looking?

Affective Intent (video)

Fernald's Results
- Four cross-cultural contours of infant-directed speech
- Exaggerated prosody matched to the infant's innate responses
[Figure: pitch f0 (kHz) versus time (ms) contours for four categories: approval ("That's a good bo-o-y!"), prohibition ("No no baby."), attention ("Can you get it?"), and comfort ("MMMM Oh, honey.")]

Evidence for Fernald-like Contours
- Approval, prohibition, attention, soothing

Performing Recognition
[Figure: classes plotted in a feature space of pitch mean vs. energy variance, with prohibition & high-energy neutral, attention & approval, and soothing & low-energy neutral occupying separate regions]
- Breazeal & Aryananda, Humanoids 2000
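
As a toy illustration of separating classes in that two-feature space, the sketch below uses a nearest-class-mean rule over (pitch mean, energy variance). The prototype values are invented placeholders, and the published system used a more elaborate classifier.

```python
import math

# Invented class prototypes in (pitch mean in Hz, energy variance) space;
# only the qualitative layout follows the slide's regions.
PROTOTYPES = {
    "prohibition": (180.0, 9.0),
    "approval":    (380.0, 6.0),
    "attention":   (360.0, 4.0),
    "soothing":    (250.0, 1.0),
    "neutral":     (220.0, 2.5),
}

def classify_affective_intent(pitch_mean_hz, energy_variance, scale=(100.0, 3.0)):
    """Nearest-class-mean over normalized (pitch mean, energy variance) features."""
    best, best_d = None, float("inf")
    for label, (p, e) in PROTOTYPES.items():
        d = math.hypot((pitch_mean_hz - p) / scale[0],
                       (energy_variance - e) / scale[1])
        if d < best_d:
            best, best_d = label, d
    return best

# e.g. a low-pitched, forceful utterance comes out as "prohibition"
print(classify_affective_intent(pitch_mean_hz=170.0, energy_variance=8.5))
```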

Examples (video)

Turn-Taking
- Cornerstone of human-style communication, learning, and instruction
- Four phases of the turn cycle: relinquish floor, listen to speaker, reacquire floor, speak
- Integrates visual behavior & attention, facial expression & animation, body posture, vocalization & lip synchronization
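
The four-phase turn cycle maps naturally onto a small state machine; in the sketch below the cue inputs (`speaker_started`, `speaker_finished`, `ready_to_speak`, `utterance_done`) are hypothetical hooks into perception and speech, not Kismet's actual implementation.

```python
from enum import Enum, auto

class TurnPhase(Enum):
    RELINQUISH_FLOOR = auto()
    LISTEN = auto()
    REACQUIRE_FLOOR = auto()
    SPEAK = auto()

def next_phase(phase, speaker_started, speaker_finished, ready_to_speak, utterance_done):
    """Advance the turn cycle based on boolean cues from perception and speech."""
    if phase is TurnPhase.RELINQUISH_FLOOR and speaker_started:
        return TurnPhase.LISTEN            # yield gaze/posture, then attend to the speaker
    if phase is TurnPhase.LISTEN and speaker_finished:
        return TurnPhase.REACQUIRE_FLOOR   # lean in, make eye contact, prepare to speak
    if phase is TurnPhase.REACQUIRE_FLOOR and ready_to_speak:
        return TurnPhase.SPEAK             # vocalize with lip synchronization
    if phase is TurnPhase.SPEAK and utterance_done:
        return TurnPhase.RELINQUISH_FLOOR  # look away briefly, invite a reply
    return phase
```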

Example (video)

Conclusions
- Robots can partake in "infant-caregiver" interactions
- These interactions are rich with scaffolding acts
- Prerequisite for socially situated learning
- Kismet's really, really, ridiculously detailed web-pages: http://www.ai.mit.edu/projects/kismet/