The Next Generation of Robots?

The Next Generation of Robots?
Rodney Brooks and Una-May O’Reilly

Our Objectives
- How can biology inform robotic competence?
- How can aspects of human development and social behavior inform robotic competence?

Our Approach
- Exploit the advantages of the robot’s physical embodiment
- Integrate multiple sensory and motor systems to provide robust and stable behavioral constraints
- Capitalize on social cues from an instructor
- Build adaptive systems with a developmental progression to limit complexity

Our Humanoid Platforms
Cog and Kismet

Biological Inspiration for Cog
- Cog has simulated musculature in its arms
- Cog has an implementation of a human model of visual search and attention
- Cog employs context-based attention, and internal state influences its actions
- Cog uses a naïve model of physics to distinguish animate from inanimate objects (a sketch of this test follows)
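The animate/inanimate test in the last bullet can be sketched as a trajectory classifier: an object that repeatedly changes its own velocity, with no contact event to explain it, is treated as self-propelled. A minimal sketch, assuming 2-D tracked positions at a fixed sample rate; the function name and thresholds are illustrative, not Cog’s actual implementation:

```python
import numpy as np

def is_animate(trajectory, dt=1.0 / 30, accel_threshold=0.5):
    """Label a tracked object animate if it often accelerates on its own.

    trajectory: (N, 2) array of positions sampled every dt seconds.
    """
    positions = np.asarray(trajectory, dtype=float)
    velocities = np.diff(positions, axis=0) / dt        # (N-1, 2)
    accelerations = np.diff(velocities, axis=0) / dt    # (N-2, 2)
    # Fraction of samples where the object changed velocity more than
    # passive, contact-driven physics would predict.
    magnitudes = np.linalg.norm(accelerations, axis=1)
    self_propelled_fraction = float(np.mean(magnitudes > accel_threshold))
    return self_propelled_fraction > 0.3                # illustrative cutoff
```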

Social Inspiration for Cog
- A theory of mind
- A theory of body
- Mimicry

Human <—> Robot
[Diagram: the perceptual channels between human and robot: cameras with wide and narrow fields of view, microphones, gaze direction, facial features, head orientation, and a speech synthesizer, along with the head’s axes of rotation (neck pan, neck tilt, neck lean, eye tilt, and left/right eye pan).]

Levels of Control
[Diagram: a four-layer control hierarchy. Social Level: robot responds to human, human responds to robot. Behavior Level: perceptual feedback, current goal. Skills Level: coordination between motor modalities, current primitive(s). Primitives Level.]
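The four layers read naturally as nested control loops, each constraining the one below: the social level picks goals, behaviors map goals to skills, skills sequence primitives, and primitives drive motors. A minimal sketch of that nesting (all class, method, goal, and primitive names here are invented for illustration):

```python
class PrimitivesLevel:
    """Lowest level: individual motor primitives (e.g., a saccade or lean)."""
    def execute(self, primitive):
        print(f"executing primitive: {primitive}")

class SkillsLevel:
    """Coordinates motor modalities; a skill is a sequence of primitives."""
    def __init__(self, primitives):
        self.primitives = primitives
    def perform(self, skill):
        for primitive in skill:
            self.primitives.execute(primitive)

class BehaviorLevel:
    """Selects a current goal from perceptual feedback, maps it to a skill."""
    GOAL_TO_SKILL = {"engage": ["orient-eyes", "lift-brows"],
                     "withdraw": ["lean-back", "close-eyes"]}
    def __init__(self, skills):
        self.skills = skills
    def pursue(self, goal):
        self.skills.perform(self.GOAL_TO_SKILL[goal])

class SocialLevel:
    """Top level: robot responds to human, human responds to robot."""
    def __init__(self, behavior):
        self.behavior = behavior
    def step(self, percept):
        goal = "engage" if percept == "person-near" else "withdraw"
        self.behavior.pursue(goal)

robot = SocialLevel(BehaviorLevel(SkillsLevel(PrimitivesLevel())))
robot.step("person-near")
```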

Kismet’s Competencies
- Direct visual attention
- Recognize socially communicated reinforcement
- Communicate internal state to a human
- Regulation of social interaction

No One in Charge
[Diagram: Kismet’s distributed computational architecture. Eleven 400-500 MHz PCs run QNX (vision), Linux (speech recognition), and NT (speech synthesis and vocal affect recognition); four Motorola 68332 microcontrollers run L, a multi-threaded Lisp, for higher-level perception, motivation, behavior, motor-skill integration, and face control. Vision subsystems (attention system, motion filter, eye finder, tracker, distance to target, skin and color detection) connect to motor control and audio/speech communications over CORBA and dual-port RAM. Peripherals: cameras; eye, neck, and jaw motors; ear, eyebrow, eyelid, and lip motors; microphone; speakers.]

Visual Attention
[Diagram: the frame grabber feeds skin-tone, color, motion, and habituation feature maps; each map is scaled by a weight (w) and summed into the attention map. Top-down, task-driven influences adjust the weights, and the winning region drives eye motor control, with inhibit/reset signals to the habituation map.]
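The structure in the diagram is a weighted-sum saliency model with top-down gains: each feature detector produces a 2-D map, the motivation system scales the maps, and the peak of the summed map becomes the gaze target. A minimal sketch under those assumptions (map names, shapes, and weight values are illustrative):

```python
import numpy as np

def attention_map(feature_maps, weights):
    """Combine per-feature maps into one attention map.

    feature_maps: dict of name -> 2-D array (all the same shape), e.g. the
    skin-tone, color, motion, and habituation maps from the frame grabber.
    weights: dict of name -> gain w; task-driven influences raise or lower
    these gains to bias attention toward relevant features.
    """
    return sum(weights[name] * feature_maps[name] for name in feature_maps)

def attention_target(att_map):
    """The most salient location, which drives eye motor control."""
    return np.unravel_index(np.argmax(att_map), att_map.shape)

# Illustrative use: motion weighted up while searching for moving toys.
maps = {n: np.random.rand(48, 64) for n in ("skin", "color", "motion", "habituation")}
maps["habituation"] *= -1            # habituation suppresses stale regions
w = {"skin": 1.0, "color": 0.5, "motion": 2.0, "habituation": 1.0}
row, col = attention_target(attention_map(maps, w))
```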

Visual Search

Social Constraints
[Diagram: regulation of social interaction. Distance: too close triggers a withdrawal response (the person backs off); too far triggers calling behavior (the person draws closer); beyond sensor range there is no interaction. Speed: too fast triggers an irritation response; too fast and too close triggers a threat response. Between these extremes lie a comfortable interaction distance and speed.]
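This regulation can be read as a simple policy over two measured quantities, distance and approach speed. A minimal sketch; the numeric thresholds are illustrative stand-ins for the comfortable-interaction envelope, not measured values:

```python
def social_response(distance_m, speed_m_s):
    """Map a person's distance and approach speed to Kismet-style responses."""
    TOO_CLOSE, TOO_FAR, SENSOR_RANGE = 0.5, 2.0, 4.0   # meters (illustrative)
    TOO_FAST = 1.0                                      # m/s (illustrative)
    if distance_m > SENSOR_RANGE:
        return "idle"                      # beyond sensor range
    if speed_m_s > TOO_FAST and distance_m < TOO_CLOSE:
        return "threat response"           # too fast and too close
    if speed_m_s > TOO_FAST:
        return "irritation response"       # too fast
    if distance_m < TOO_CLOSE:
        return "withdrawal response"       # too close: person backs off
    if distance_m > TOO_FAR:
        return "calling behavior"          # too far: person draws closer
    return "comfortable interaction"
```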

Cross Cultural Affect
[Figure: evidence for four contours in Kismet-directed speech; pitch f0 (kHz) plotted against time (ms) for approval (“That’s a good bo-o-y!”), prohibition (“No no baby.”), attention (“Can you get it?”), and comfort (“MMMM Oh, honey.”).]

Affect Recognizer
[Diagram: a multi-stage classifier over pitch mean and energy variance. Stage one separates soothing and low-intensity neutral from everything else; the low-energy branch then splits soothing from low-intensity neutral, and the high-energy branch splits approval & attention from prohibition and high-intensity neutral, yielding the output classes approval, attention, soothing, prohibition, and neutral.]
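The recognizer’s stage-wise structure can be sketched as a small decision procedure over the two features on the diagram’s axes, pitch mean and energy variance. The thresholds, and the reduction to exactly two features, are illustrative simplifications of the actual classifier:

```python
def classify_affect(pitch_mean_hz, energy_variance):
    """Stage-wise affect classification over prosodic features."""
    LOW_ENERGY = 0.2                 # illustrative threshold values
    HIGH_PITCH, MID_PITCH = 350.0, 250.0
    if energy_variance < LOW_ENERGY:
        # Stage one: soothing & low-intensity neutral vs. everything else.
        return "soothing" if pitch_mean_hz < MID_PITCH else "low-intensity neutral"
    if pitch_mean_hz > HIGH_PITCH:
        return "approval/attention"  # exaggerated rise-fall contours
    if pitch_mean_hz < MID_PITCH:
        return "prohibition"         # low pitch, high energy
    return "high-intensity neutral"
```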

Naive Subjects
- 5 female subjects: 4 naive subjects, 1 caregiver
- Four contours and neutral speech: praise, prohibition, attention, soothing
- Multiple languages: French, German, Indonesian, English, Russian
- Driven by a human

Facial Expressions
[Figure: a circumplex of affect with arousal on the vertical axis (sleep to excitement) and valence on the horizontal axis (displeasure to pleasure), neutral at the center, and surrounding states including stress, anger, fear, frustration, depression, sadness, boredom, fatigue, sleepiness, calm, relaxation, contentment, happiness, elation, and surprise.]

Facial Postures in Affect Space
[Figure: Kismet’s three-dimensional affect space spanned by arousal (low to high), valence (negative to positive), and stance (open to closed), with facial postures such as fear, surprise, anger, disgust, stern, tired, unhappy, content, and accepting located within it.]
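A common way to realize such a space, and plausibly what the slide depicts, is to place basis postures at known coordinates and blend them by proximity to the current (arousal, valence, stance) point, so the face varies continuously. A minimal sketch; the coordinates, posture set, and blending rule are assumptions, not Kismet’s published parameters:

```python
import numpy as np

# Basis postures at assumed (arousal, valence, stance) coordinates.
BASIS_POSTURES = {
    "anger":    np.array([ 1.0, -1.0, -1.0]),  # high arousal, negative, closed
    "fear":     np.array([ 1.0, -1.0,  1.0]),  # high arousal, negative, open
    "content":  np.array([-0.5,  1.0,  0.5]),
    "tired":    np.array([-1.0, -0.5,  0.0]),
    "surprise": np.array([ 1.0,  0.5,  1.0]),
}

def blend_weights(arousal, valence, stance):
    """Weight each basis posture by proximity to the current affect point.

    The face is rendered as the weighted mix of the basis postures, so
    expression changes smoothly as the affective state drifts.
    """
    point = np.array([arousal, valence, stance])
    raw = {name: 1.0 / (np.linalg.norm(point - loc) + 1e-6)
           for name, loc in BASIS_POSTURES.items()}
    total = sum(raw.values())
    return {name: w / total for name, w in raw.items()}
```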

Face, Voice, Posture

Turn-Taking / Proto-Dialog
- Naïve subjects told to “talk to the robot”
- Engage in turn taking
- No understanding (on either side) of content

Implemented Model of Visual Search and Attention
[Diagram: color, motion, skin, and habituation maps, each with a weight (w), sum into an activation map; the motivation system sets the weights, and the motor system acts on the activation map.]
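The habituation map is what keeps the searcher moving: salience at the attended location decays while everywhere else recovers, so a region that has held the eyes eventually loses to a fresh target. A minimal sketch of such a decay-and-recover map (the rates are illustrative):

```python
import numpy as np

class HabituationMap:
    """Negative map that suppresses salience where gaze has lingered."""
    def __init__(self, shape, suppress=0.1, recovery=0.02):
        self.map = np.zeros(shape)          # values held in [-1, 0]
        self.suppress = suppress
        self.recovery = recovery

    def update(self, attended_rc):
        # Everywhere slowly recovers toward zero (no suppression)...
        self.map = np.minimum(self.map + self.recovery, 0.0)
        # ...while the currently attended location habituates further.
        self.map[attended_rc] -= self.suppress
        np.clip(self.map, -1.0, 0.0, out=self.map)
        return self.map   # summed (with its weight w) into the activation map
```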

Hardware – Cog’s Arms
- 6 DOF in each arm
- Series elastic actuators
- Force control
- Spring law (a sketch of the control loop follows)
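A series elastic actuator places a spring between the motor and the load, so measuring the spring’s deflection turns force sensing into position sensing, and force control becomes a servo loop on deflection (the spring law F = k * deflection). A minimal sketch of that loop; the gains and interface are illustrative, not the Cog arm’s actual controller:

```python
class SeriesElasticActuator:
    """Force control via a spring in series with the motor (Hooke's law).

    Output force F = k * (motor_angle - load_angle); commanding a force
    therefore means servoing the spring deflection to desired_force / k.
    """
    def __init__(self, spring_k, gain_p=5.0):
        self.k = spring_k          # spring stiffness (N*m/rad), illustrative
        self.gain_p = gain_p       # proportional gain on force error

    def measured_force(self, motor_angle, load_angle):
        return self.k * (motor_angle - load_angle)   # spring law

    def control(self, desired_force, motor_angle, load_angle):
        """Return a motor velocity command that drives force error to zero."""
        error = desired_force - self.measured_force(motor_angle, load_angle)
        return self.gain_p * error / self.k
```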

Hardware – Cog’s Head
- 7 degrees of freedom
- Human speed and range of motion

Visual and Inertial Sensors
- 3-axis inertial sensor
- Peripheral and foveal camera views

Computational System
- Designed for real-time responses
- Network of 24 PCs ranging from 200-800 MHz
- QNX real-time operating system
- Implementation shown today consists of ~26 QNX processes, ~75 QNX threads