Social Constraints on Animate Vision
Cynthia Breazeal, Aaron Edsinger, Paul Fitzpatrick, Brian Scassellati
MIT AI Lab


2 Humanoids2000 Social constraints
 Robots create expectations through their physical form – particularly humanoid robots
 But with careful use, these expectations can facilitate smooth, intuitive interaction
–Provide a natural “vocabulary” to make the robot’s behavior and state readable by a human
–Provide natural frameworks for negotiating a change in each other’s behavior and (through readability) knowing when you have succeeded
–These elements have their own internal logic and constraints which, if violated, lead to confusion

3 Humanoids2000 Visually-mediated social elements
 Readable locus of attention
 Negotiation of the locus of attention
 Readable degree of engagement
 Negotiation of interpersonal distance
 Negotiation of object showing
 Negotiation of turn-taking timing

4 Humanoids2000 Readable locus of attention
 Attention can be deduced from behavior
 Or can be expressed more directly

5 Humanoids2000 Kismet – a readable robot
[Head diagram: eye tilt; left and right eye pan; camera with wide field of view; camera with narrow field of view]
 Designed to evoke infant-level social interactions
–Eibl-Eibesfeldt “baby scheme”
–physical size, stature
 But not exactly a human infant
–a caricature that is readable
 Naturally elicits scaffolding acts characteristic of parent-infant scenarios
–directing attention
–affective feedback, reinforcement
–simplified behavior, to make the perceptual task easier
–slowing down, going at the infant’s pace

6 Humanoids2000 Visually-mediated social elements
 Readable locus of attention
 Negotiation of the locus of attention
 Readable degree of engagement
 Negotiation of interpersonal distance
 Negotiation of object showing
 Negotiation of turn-taking timing

7 Humanoids2000 Negotiating the locus of attention
 For object-centered activities, attention is fundamental
 There are natural strategies people use to direct attention
 The robot’s attention must be receptive to these influences, but also serve the robot’s own agenda
[Video stills: one person’s strategies; another’s strategies]

8 Humanoids2000 External influences on attention
 Attention is allocated according to salience
 Salience can be manipulated by shaking an object, bringing it closer, moving it in front of the robot’s current locus of attention, object choice, hiding distractors, …
[Attention diagram: current input passes through pre-attentive filters (skin tone, color, motion, habituation), weighted by behavioral relevance, to produce a saliency map]
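As a rough sketch of this weighting scheme (hypothetical names and values, not Kismet’s actual code), salience can be modeled as a gain-weighted sum of pre-attentive feature responses, with attention going to the most salient region:

```python
# Illustrative sketch: combine pre-attentive feature responses into a
# single salience score per region, weighted by behaviorally relevant
# gains. All names and numbers are hypothetical.

def salience(features, gains):
    """Weighted sum of pre-attentive feature responses for one region."""
    return sum(gains[name] * value for name, value in features.items())

def most_salient(regions, gains):
    """Pick the region with the highest combined salience."""
    return max(regions, key=lambda r: salience(regions[r], gains))

# Feature responses (0..1) for two candidate regions in the visual field.
regions = {
    "toy":  {"skin_tone": 0.1, "color": 0.9, "motion": 0.6, "habituation": 0.0},
    "hand": {"skin_tone": 0.9, "color": 0.2, "motion": 0.3, "habituation": 0.0},
}

# Default gains; habituation suppresses stimuli the robot has tired of.
gains = {"skin_tone": 1.0, "color": 1.0, "motion": 1.0, "habituation": -1.0}

print(most_salient(regions, gains))  # the brightly colored, moving toy wins
```

Shaking an object or bringing it closer raises its motion and size responses, which is why those manipulations reliably capture the robot’s attention.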

9 Humanoids2000 Tuned to natural cues

stimulus category        stimulus             presentations   average time (s)
color and movement       yellow dinosaur      8               8.5
                         multi-colored block  8               6.5
                         green cylinder       8               6.0
movement only            black & white cow    8               5.0
skin tone and movement   pink cup             8               6.5
                         hand                 8               5.0
                         face                 8               3.0
Overall                                       56              5.8

Commonly used cues: motion across centerline, shaking, bringing object close.
Commonly read cues: change in visual behavior, face reaction, body posture.
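The Overall row follows from the per-stimulus figures: seven stimuli at eight presentations each, and the mean of the average looking times:

```python
# Check of the table's "Overall" row: 7 stimuli x 8 presentations = 56,
# and the mean of the per-stimulus average looking times is about 5.8 s.
times = [8.5, 6.5, 6.0, 5.0, 6.5, 5.0, 3.0]  # per-stimulus averages (s)
presentations = 7 * 8
overall = sum(times) / len(times)
print(presentations, round(overall, 1))  # 56 5.8
```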

10 Humanoids2000 Can shape an interaction
 The robot’s attention can be manipulated repeatedly
 So a caregiver can shape an interaction into the form of an object-centered game, or a teaching session

11 Humanoids2000 Internal influences on attention
 Internal influences bias how salience is measured
 The robot is not a slave to its environment
 “Seek face” – high skin-tone gain, low color-saliency gain: looking time 72% face, 28% block
 “Seek toy” – low skin-tone gain, high saturated-color gain: looking time 28% face, 72% block
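A minimal sketch of this biasing, with hypothetical gain presets: the same scene yields a different attention target under a “seek face” versus a “seek toy” motivation:

```python
# Sketch of motivation-dependent gains re-weighting salience. The scene,
# preset names, and numbers are hypothetical illustrations.

def salience(features, gains):
    return sum(gains[k] * v for k, v in features.items())

scene = {
    "face":  {"skin_tone": 0.9, "color": 0.2},
    "block": {"skin_tone": 0.1, "color": 0.9},
}

presets = {
    "seek_face": {"skin_tone": 2.0, "color": 0.5},  # favor skin tone
    "seek_toy":  {"skin_tone": 0.5, "color": 2.0},  # favor saturated color
}

for behavior, gains in presets.items():
    target = max(scene, key=lambda r: salience(scene[r], gains))
    print(behavior, "->", target)
```

Running this prints `seek_face -> face` and `seek_toy -> block`: the environment proposes, but the robot’s internal state disposes.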

12 Humanoids2000 Maintaining visual attention
 Want attention to be persistent enough to permit coherent behavior
 Must be able to maintain fixation on an object, when behaviorally appropriate
 Attention system interacts closely with the tracker to support this robustly
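One common way to get such persistence (an illustrative assumption, not necessarily Kismet’s mechanism) is hysteresis: the currently tracked target receives a salience bonus, so attention switches only when a rival stimulus is clearly stronger:

```python
# Sketch of attention persistence via hysteresis: the tracked target gets
# a bonus so attention does not flicker between near-equal stimuli.

TRACK_BONUS = 0.3  # hypothetical hysteresis margin

def next_target(current, saliences):
    boosted = dict(saliences)
    if current in boosted:
        boosted[current] += TRACK_BONUS  # favor the current fixation
    return max(boosted, key=boosted.get)

# A slightly brighter distractor does not steal attention...
print(next_target("ball", {"ball": 1.0, "cup": 1.1}))  # ball
# ...but a clearly stronger one does.
print(next_target("ball", {"ball": 1.0, "cup": 1.5}))  # cup
```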

13 Humanoids2000 Visually-mediated social elements
 Readable locus of attention
 Negotiation of the locus of attention
 Readable degree of engagement
 Negotiation of interpersonal distance
 Negotiation of object showing
 Negotiation of turn-taking timing

14 Humanoids2000 Readable degree of engagement
 Visual behavior conveys degree of commitment
–fleeting glances
–smooth pursuit
–full body orientation
 Gaze direction, facial expression, and body posture convey the robot’s interest

15 Humanoids2000 Visually-mediated social elements
 Readable locus of attention
 Negotiation of the locus of attention
 Readable degree of engagement
 Negotiation of interpersonal distance
 Negotiation of object showing
 Negotiation of turn-taking timing

16 Humanoids2000 Negotiating interpersonal distance
 Robot establishes a “personal space” through expressive cues
 Tunes interaction to suit its vision capabilities
[Distance diagram: comfortable interaction distance; too close – withdrawal response, person backs off; too far – calling behavior, person draws closer; beyond sensor range]

17 Humanoids2000 Negotiating interpersonal distance
 Robot backs away if person comes too close (“Back off, buster!”)
 Cues the person to back away too – social amplification
 Robot makes itself salient to call a person closer if too far away (“Come hither, friend”)
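The distance policy above can be sketched as a simple threshold controller; the metre values are hypothetical, since Kismet’s actual ranges were tuned to its cameras:

```python
# Sketch of distance regulation: an expressive response chosen from the
# person's estimated distance. Thresholds are hypothetical placeholders.

TOO_CLOSE = 0.4   # metres (hypothetical)
TOO_FAR   = 2.0   # metres (hypothetical)

def distance_response(d):
    if d < TOO_CLOSE:
        return "withdraw"   # back away; cues the person to back off too
    if d > TOO_FAR:
        return "call"       # make itself salient to draw the person in
    return "engage"         # comfortable interaction distance

print(distance_response(0.2))  # withdraw
print(distance_response(1.0))  # engage
print(distance_response(3.0))  # call
```

The withdrawal is the “social amplification” on the slide: the robot’s small physical retreat is read by the person as a request, who then moves much further than the robot did.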

18 Humanoids2000 Visually-mediated social elements
 Readable locus of attention
 Negotiation of the locus of attention
 Readable degree of engagement
 Negotiation of interpersonal distance
 Negotiation of object showing
 Negotiation of turn-taking timing

19 Humanoids2000 Negotiating object showing
 Robot conveys preferences about how objects are presented to it through irritation and threat responses
 Again, tunes interaction to suit its limited vision
 Also serves a protective role
[Diagram: comfortable interaction speed; too fast – irritation response; too fast and too close – threat response]

20 Humanoids2000 Negotiating object showing
 Robot “shuts out” a close, fast-moving object – threat response
 Robot backs away if an object is too close – withdrawal, startle
 Robot cranes forward as an expression of interest
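The two object-showing slides amount to a policy over presentation speed and distance; a sketch with hypothetical thresholds and units:

```python
# Sketch of the object-showing policy: pick an expressive response from
# how fast and how close the object is presented. Thresholds are
# hypothetical, not Kismet's tuned values.

MAX_SPEED = 1.0   # m/s (hypothetical comfortable interaction speed)
MIN_DIST  = 0.3   # metres (hypothetical)

def showing_response(speed, distance):
    if speed > MAX_SPEED and distance < MIN_DIST:
        return "threat"      # shut out the close, fast-moving stimulus
    if speed > MAX_SPEED:
        return "irritation"  # too fast
    if distance < MIN_DIST:
        return "withdraw"    # too close: back away, startle
    return "interest"        # crane forward

print(showing_response(0.2, 0.6))  # interest
print(showing_response(1.5, 0.6))  # irritation
print(showing_response(1.5, 0.1))  # threat
```

Note how the same mechanism that keeps stimuli within the vision system’s comfort zone doubles as the protective role mentioned on the slide.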

21 Humanoids2000 Visually-mediated social elements
 Readable locus of attention
 Negotiation of the locus of attention
 Readable degree of engagement
 Negotiation of interpersonal distance
 Negotiation of object showing
 Negotiation of turn-taking timing

22 Humanoids2000 Turn-Taking
 Cornerstone of human-style communication, learning, and instruction
 Four phases of the turn cycle
–relinquish floor
–listen to speaker
–reacquire floor
–speak
 Integrates
–visual behavior & attention
–facial expression & animation
–body posture
–vocalization & lip synchronization
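The four-phase cycle can be written as a tiny state machine; the phase names follow the slide, while the strictly cyclic stepping is a simplifying assumption:

```python
# Sketch of the turn cycle as a cyclic state machine over the four
# phases named on the slide.

PHASES = ["relinquish_floor", "listen", "reacquire_floor", "speak"]

def next_phase(phase):
    """Advance to the next phase of the turn cycle, wrapping around."""
    return PHASES[(PHASES.index(phase) + 1) % len(PHASES)]

phase = "speak"
for _ in range(4):          # one full turn cycle
    phase = next_phase(phase)
    print(phase)
```

In the real system each transition would be gated by the envelope displays listed above (gaze shifts, posture, lip movement) rather than advancing unconditionally.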

23 Humanoids2000 Examples of turn-taking
 Turn-taking is fine-grained regulation of the human’s behavior
 Uses envelope displays, facial expressions, shifts of gaze and body posture
 Tightly coupled dynamic of contingent responses to the other
[Video stills: Kismet and Adrian; Kismet and Rick]

24 Humanoids2000 Evaluation of performance
 Naive subjects, ranging in age from 25 to 28
–all young professionals
–no prior experience with Kismet
–video recorded
 Turn-taking performance
–82% “clean” turn transitions
–11% interruptions
–7% delays followed by prompting
 Significant flow disturbances
–tend to occur in clusters
–6.5% of the time, but rate diminishes
 Evidence for entrainment
–shorter phrases
–wait longer for response
–read turn-taking cues
–0.5–1.5 seconds between turns
[Table: per-subject time stamps (min:sec) of flow disturbances and seconds between disturbances]

25 Humanoids2000 Conclusion
 Active vision involves choosing a robot’s pose to facilitate visual perception
 Focus has traditionally been on the immediate physical consequences of pose
 For an anthropomorphic head, active vision strategies can be “read” by a human and assigned an intent, which may then be completed beyond the robot’s immediate physical capabilities
 The robot’s actions have communicative value, to which the human responds