
 Masanao Toda in the 1960s:
› Intelligence is NOT about solving one isolated task
› We will not learn much about intelligence by testing systems in artificial lab environments
 Intelligence is about:
› dealing with the real-world environment (multiple tasks, unpredictability)
 Complete systems have to be studied: "Systems that have to act and perform tasks autonomously in the real world" (Toda, 1982)

 Example: Fungus Eaters – creatures exploring planets looking for uranium ore
› Requirements – a complete system has to be:
 Embodied (a physical system)
 Autonomous
 Self-sufficient
 Situated (uses its sensors to learn about the environment)
 Outline:
1. Real worlds vs. virtual worlds
2. Properties of complete agents
3. Main part: 8 design principles

In the real world:
1. Acquisition of information takes time
2. Acquired information is:
› partial
› error-prone
› not divisible into discrete states
3. An agent always has several things to do simultaneously
4. The real world changes all the time in highly unpredictable ways, so the agent is forced to act whether it is prepared or not
 The real world is challenging and "messy": it puts several constraints on an agent

 They follow from the agent's embodied nature. A complete agent:
1. is subject to the laws of physics, e.g. gravity
2. generates sensory stimulation through interaction with the real world
3. affects its environment
4. is a complex dynamical system which, when it interacts with the environment, tends to settle into attractor states
5. performs morphological computations, i.e. certain processes are performed by the body itself, without using the neural control system (the brain)
Note! Systems that are not complete hardly ever possess all of these properties.

 An example of how cognition might emerge from the simple, basic actions of walking or running
 Observations:
› The number of stable gaits for any given system is limited
› Gaits are "attractor states" that the robot falls into, determined by its own properties (e.g. speed) and those of the environment
 Basin of attraction – the set of starting conditions that end up in the same state
› Some gaits are more stable than others (they have a larger basin of attraction)
› More complex systems are characterized by a higher number of attractor states, e.g. salamander vs. puppy
A complete agent is a dynamical system and its behaviours can be viewed as attractors.
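The attractor idea can be sketched with a minimal dynamical system. The system below is an illustration chosen for this note, not taken from the slides: dx/dt = x − x³ has two stable attractor states at x = −1 and x = +1, and which one the state settles into depends only on which basin of attraction the initial condition starts in.

```python
def simulate(x0, steps=2000, dt=0.01):
    """Euler-integrate dx/dt = x - x**3, which has stable
    attractors at x = -1 and x = +1; the basins of attraction
    are separated by the unstable point x = 0."""
    x = x0
    for _ in range(steps):
        x += dt * (x - x**3)
    return x

# Initial conditions in different basins settle into different attractors.
print(round(simulate(0.2), 3))   # → 1.0
print(round(simulate(-0.7), 3))  # → -1.0
```

Like a robot falling into a gait, the system "chooses" its final state not by explicit control but by where its dynamics start relative to the basins.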

 Where do we start when we would like to design an agent?
› The real world is "messy", so it is hard to define neat "design principles"
› They are rather a set of heuristics providing guidance on how to build an agent!
 Let's go to the 8 design principles...

 When designing an agent we need to:
› define its ecological niche
› define its desired behaviors and tasks
› design the agent
Example: Sony AIBO vs. Mars Sojourner
 Note:
› Robot behaviors can only be designed indirectly, since they emerge from the agent-environment interaction
› Scaffolding – the way in which agents structure their environments to simplify the desired task, e.g. road signs replace geographical knowledge

When designing agents we must think about the complete agent behaving in the real world.
 This principle is in contrast with the "divide and conquer" rule:
› Artifacts may arise when treating problems in isolation, e.g. segmentation in computer vision
› The human brain is not composed of separate modules, e.g. Hubel and Wiesel's edge-detection cells are also involved in other activities
› In designing agents we need to deal with complete sensory-motor loops, e.g. when grasping a cup
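A complete sensory-motor loop, with no separate perception module in between, can be sketched in the style of a Braitenberg vehicle. The wiring and numbers below are an illustrative toy, not from the slides: two light sensors drive two wheel speeds directly, with crossed connections.

```python
def braitenberg_step(left_light, right_light, gain=1.0):
    """Crossed excitatory wiring: the left sensor drives the right
    wheel and vice versa, so the vehicle turns toward the light.
    Sensing and acting form one closed loop; no intermediate
    'world model' is ever built."""
    left_wheel = gain * right_light
    right_wheel = gain * left_light
    return left_wheel, right_wheel

# Light is stronger on the left: the right wheel spins faster,
# so the vehicle turns left, toward the light source.
lw, rw = braitenberg_step(left_light=0.9, right_light=0.2)
print(rw > lw)  # → True
```

Seemingly goal-directed behavior (light seeking) emerges from the loop itself, not from any module that "recognizes" light sources first.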

The more an agent exploits the properties of its ecological niche and its interaction with the environment, the simpler and "cheaper" it will be.
 Examples:
› Passive Dynamic Walker
 Leg movements are entirely passive, driven only by gravity in a pendulum-like manner
 Very narrow niche – only slopes of certain angles
› "Denise"
 Additional motors + control systems – a somewhat wider niche
› Insect walking:
 Insects use the interaction with the environment to walk – pushing one leg forward pushes the whole body, and thus the other legs, forward too.
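The passive-walker idea – leg motion driven by gravity rather than by motors – can be sketched as a damped pendulum "leg". All constants below (length, damping, starting angle) are illustrative assumptions, not parameters of any actual walker.

```python
import math

def passive_swing(theta0, steps=10000, dt=0.001,
                  g=9.81, length=0.9, damping=0.3):
    """Semi-implicit Euler integration of a damped pendulum:
    theta'' = -(g/length)*sin(theta) - damping*theta'.
    There is no motor torque anywhere: the swing is produced
    entirely by gravity acting on the body's morphology."""
    theta, omega = theta0, 0.0
    for _ in range(steps):
        alpha = -(g / length) * math.sin(theta) - damping * omega
        omega += dt * alpha
        theta += dt * omega
    return theta

# Released at 0.4 rad with no actuation, the leg swings and
# settles passively toward the hanging equilibrium.
print(abs(passive_swing(0.4)) < 0.2)  # → True
```

The "control" here is done by physics: the same behavior implemented with motors and a controller would be strictly more expensive, which is exactly the trade-off the principle describes.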

 Human-Computer Interaction
 Robotics
"In the vision of the future, humans will be surrounded by intelligent systems (interfaces and robots) that are sensitive and responsive to the presence of different emotions and behaviour in a seamless way."

 Main focus: understanding certain human emotions and behaviors
 Outline:
› What is communicated, how, and why
› Challenges; building a system
› State of the field

 Types of messages:
› Affective states (fear, joy, stress)
› Emblems
› Manipulators
› Illustrators
› Regulators
 All of them carry information, but there is a lack of consensus regarding their specificity and universality
 Six basic emotions:
› happiness, anger, sadness, surprise, disgust & fear
 Additional "socially motivated" emotions:
› interest, boredom, empathy, etc.

 Cues: audio, visual, tactile
 Vision:
› Most informative: (1) face & body, (2) face, (3) body
› Association between posture and emotion:
 e.g. a static body suggests anger or sadness
 Speech:
› Not the words themselves (!)
› Nonlinguistic messages:
 important for humans
 hard for scientists to discretize reliably
 Physiological signals:
› Very accurate: pupillary diameter, heart rate, temperature, respiration velocity
› Require direct tactile contact -> novel non-intrusive methods are needed

 Behavioral signals usually convey more than one type of message
› e.g. squinted eyes: sensitivity to light, or an eye blink
 Context is crucial to interpret a signal:
› place, task, who expresses the signal, the other people involved

 Challenges:
› Fusion of modalities depends on context
› Initialization
› Robustness
› Speed
› Training
 The paper advocates a "pragmatic approach":
› User-centered approach
› System training:
 A large number of mixed emotions -> unsupervised learning
 Learning in a real environment ("complete system")
 Importance of fusing different cues – a system based on facial expressions, body gestures, and nonlinguistic vocalizations
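Context-dependent fusion of cues can be sketched as simple late fusion: each modality outputs an emotion probability distribution, and the distributions are combined with reliability weights that depend on the situation. The labels, scores, and weights below are all made up for illustration.

```python
def fuse(modality_probs, weights):
    """Late fusion: combine per-modality emotion probability
    distributions with context-dependent reliability weights,
    then renormalise the result."""
    labels = modality_probs[next(iter(modality_probs))].keys()
    fused = {label: sum(weights[m] * probs[label]
                        for m, probs in modality_probs.items())
             for label in labels}
    total = sum(fused.values())
    return {label: score / total for label, score in fused.items()}

# Hypothetical per-modality classifier outputs.
cues = {
    "face":  {"happiness": 0.7, "anger": 0.3},
    "voice": {"happiness": 0.4, "anger": 0.6},
    "body":  {"happiness": 0.6, "anger": 0.4},
}
# Context matters: in a noisy room, the voice cue gets a lower weight.
result = fuse(cues, {"face": 0.5, "voice": 0.2, "body": 0.3})
print(max(result, key=result.get))  # → happiness
```

The weights are where context enters: the same three classifier outputs can yield a different fused decision in a dark room (face down-weighted) than in a noisy one (voice down-weighted).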

 Techniques focus on:
› FACE: face detection and recognition, eye-gaze tracking (Tobii), facial expression analysis (Noldus)
› BODY: body detection and tracking, hand tracking, recognition of postures, gestures (Panasonic) and activity
› VOCAL NONLINGUISTIC SIGNALS: based on auditory features such as pitch, intensity, and speech rate; recognition of nonlinguistic vocalizations like laughs, cries, coughs, etc.
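Two of the auditory features mentioned above, intensity and pitch, can be computed with very simple estimators. The sketch below uses a synthetic tone rather than real speech, and the zero-crossing pitch estimate is a deliberately crude stand-in for the autocorrelation or cepstral methods a real system would use.

```python
import math

def rms_intensity(frame):
    """Root-mean-square energy of one audio frame."""
    return math.sqrt(sum(s * s for s in frame) / len(frame))

def zero_crossing_pitch(frame, sample_rate):
    """Crude pitch estimate from the zero-crossing rate:
    a pure tone crosses zero twice per cycle."""
    crossings = sum(1 for a, b in zip(frame, frame[1:]) if a * b < 0)
    return crossings * sample_rate / (2 * len(frame))

# Synthetic 200 Hz tone, one second at 8 kHz, amplitude 0.5.
sr = 8000
tone = [0.5 * math.sin(2 * math.pi * 200 * t / sr) for t in range(sr)]
print(round(rms_intensity(tone), 3))            # RMS of the tone
print(round(zero_crossing_pitch(tone, sr)))     # close to 200 Hz
```

Frame-level features like these, tracked over time, are what feeds the recognition of pitch contours, speech rate, and nonlinguistic events such as laughs.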

 More about the state of the field:
› Context sensing (Who? Where? What? How? When?)
› Understanding human social signaling
 Conclusions:
› Great progress during the last two decades
› Driven by: face recognition, video surveillance, the gaming industry
› Different parts of the field are still detached...