Embodied Machines
Artificial vs. Embodied Intelligence
–Artificial Intelligence (AI)
–Natural Language Processing (NLP)
  Goal: write programs that understand and identify grammatical patterns
  Assign conventional meanings to words
  Context (word environment) can be looked at to some extent to disambiguate meaning
  Meanings are lists of associations and relations
–Associations are human-programmed ontologies

Embodied Machines
Embodied Intelligence
–Artificial life
–Self-organizing intelligence
Meaning is situated in experience:
–Organisms structure the world to suit their needs
–Organisms perceive the world via a body
Language emerges through self-organization out of local interactions of language users.
A living ecology is a better metaphor for a cognitive system than a computer program.
Artificial systems need human-like cognitive capabilities to be effective users of human language.

Embodied Machines
Self-organization
–Bootstrapping
  Means of learning
  Drives/goals/tasks
  Experience in the world
  Interaction with others
Intelligence
–Language evolves both in the individual and in the community through negotiation

Embodied Machines
Talking Heads Experiments (Luc Steels)
–Create simple robots with
  Perceptual systems
  Language production and listening capabilities
  Learning capability
–Put robots in environments containing objects of interest and other robots to talk to
–Give robots a task requiring speaker/hearer interaction: the Guessing Game
–Goal: observe how learning takes place
  Potential to modify the environment in various ways
  Change participants, stimuli, etc.

Embodied Machines
“Building colonies of physical autonomous robots roaming the world in search of stimulating environments and rich interactions with other robots is not feasible today. So how can we ever test seriously situated and socially embedded approaches to cognition?”
Teleporting
–Human hardware and software are not distinct
–Talking Heads have distinct ‘heads’ and bodies
–‘Heads’ can be loaded into different bodies
–Physical bodies can be located anywhere in the world

Embodied Machines
Robot bodies
–Physical bodies located somewhere in the world, in real space
Virtual agent
–Software structure (memory, lexicon, grammar)
Real agent
–Exists when a virtual agent is loaded into a physical robot body
–Real agents can only interact when they are instantiated in the same physical environment
  No long-distance communication

Embodied Machines
Robot body
–Camera on pan/tilt motors
–Loudspeaker for output
–Microphone for input
–Computer
For experimenters
–Television screen showing the camera feed
–Computer screen showing the computer’s output

Embodied Machines
Environment
–Whiteboard containing basic shapes of various sizes and colors

Embodied Machines
Agent’s Brain Architecture
–Perceptual layer: sensory system, visual & auditory
–Conceptual layer: categorization/ontology – no initial values
–Lexical layer: words – no initial values
–Syntactic layer: word order – no initial values
–Pragmatic layer: scripts for language games
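One way to picture these layers in code is as a single agent object whose layers all start empty except for the game script. This is a minimal sketch; the class and field names are illustrative, not taken from Steels’ implementation.

```python
from dataclasses import dataclass, field

@dataclass
class TalkingHeadAgent:
    """Hypothetical container mirroring the five layers; names are illustrative."""
    percepts: list = field(default_factory=list)    # perceptual layer: visual & auditory features
    prototypes: dict = field(default_factory=dict)  # conceptual layer: categories, initially empty
    lexicon: object = None                          # lexical layer: word-category links, initially empty
    grammar: dict = field(default_factory=dict)     # syntactic layer: word order, initially empty
    game_script: str = "guessing_game"              # pragmatic layer: which language game to play
```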

Embodied Machines
Perceptual layer
–Visual system
  Camera
  Segmentation programs
–Easy environment: basic shapes, clear boundaries
–Auditory system
  Microphone
  Auditory signal processing

Embodied Machines
Conceptual layer
–Categorization/ontology – no initial values
–World is a collection of objects (shapes on the whiteboard)
–Robots want to build a set of meanings
–Meaning is a region represented by a prototype
  A particular color, area and location
–The category of every object is the region represented by its nearest prototype
–An object is discriminated if its category is different from all the others in the context

Embodied Machines
ROBOT’S PROTOTYPES: a = (0.15, 0.25), b = (0.35, 0.3)
CONTEXT: A = (0.1, 0.3), B = (0.3, 0.3), C = (0.25, 0.15)
–A is categorised as a
–B is categorised as b
–C is categorised as b
–A is discriminated; B and C are not
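A minimal sketch of nearest-prototype categorization and the discrimination test, assuming Euclidean distance over a two-dimensional feature space; the coordinates below are illustrative stand-ins rather than the exact values plotted in the slide’s figure.

```python
import math

# Illustrative values (hypothetical, not read off the slide's figure):
# each point is (feature_1, feature_2), e.g. horizontal and vertical position.
prototypes = {"a": (0.15, 0.25), "b": (0.35, 0.30)}
context    = {"A": (0.10, 0.30), "B": (0.30, 0.30), "C": (0.40, 0.20)}

def categorize(features, prototypes):
    """The category of an object is its nearest prototype."""
    return min(prototypes, key=lambda name: math.dist(features, prototypes[name]))

def is_discriminated(topic, context, prototypes):
    """A topic is discriminated if no other object in the context shares its category."""
    topic_cat = categorize(context[topic], prototypes)
    return all(categorize(feats, prototypes) != topic_cat
               for name, feats in context.items() if name != topic)

for name, feats in context.items():
    print(name, "->", categorize(feats, prototypes),
          "| discriminated:", is_discriminated(name, context, prototypes))
# With these values: A -> a and is discriminated; B and C both -> b, so neither is.
```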

Embodied Machines
–Lexical layer
  Words initially created randomly
  Associated with categories
  Word–category association strengthened through use
–Pragmatic layer
  Scripts for the guessing game
  Provides the robot’s raison d’être
  Drive module
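A minimal sketch of such a lexical layer, assuming words are random syllable strings and each word–category link carries a strength score; the syllable set, initial score, and update size are all assumptions.

```python
import random

SYLLABLES = ["wa", "bo", "lu", "ka", "ti", "re", "mo", "fi"]

class Lexicon:
    """Word <-> category associations with adjustable strengths (illustrative)."""
    def __init__(self):
        self.assoc = {}  # (word, category) -> strength in [0, 1]

    def invent_word(self, category, n_syllables=2):
        """Speaker has no word for a category: create a random word for it."""
        word = "".join(random.choice(SYLLABLES) for _ in range(n_syllables))
        self.assoc[(word, category)] = 0.5
        return word

    def adopt(self, word, category):
        """Hearer heard an unknown word and saw the target: store a new association."""
        self.assoc.setdefault((word, category), 0.5)

    def best_word(self, category):
        """Speaker lookup: strongest word for a category, or None."""
        candidates = [(s, w) for (w, c), s in self.assoc.items() if c == category]
        return max(candidates)[1] if candidates else None

    def best_category(self, word):
        """Hearer lookup: strongest category for a word, or None."""
        candidates = [(s, c) for (w, c), s in self.assoc.items() if w == word]
        return max(candidates)[1] if candidates else None

    def reinforce(self, word, category, delta=0.1):
        """Strengthen (or, with a negative delta, weaken) a word-category link."""
        key = (word, category)
        self.assoc[key] = min(1.0, max(0.0, self.assoc.get(key, 0.5) + delta))
```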

Embodied Machines
The Guessing Game
–Speaker’s role:
  Speaker agent randomly searches the environment, locates an area of interest (context)
  Focuses hearer’s attention on the same context
  Chooses an object in the context (topic)
  Describes the object to the hearer (e.g. “Red square”, “Red one”)

Embodied Machines
The Guessing Game
–Hearer’s role:
  Hearer tries to guess what the speaker is referring to
  Indicates guess by pointing at the topic (focusing)
  Game succeeds if the hearer guesses right
  Associations between word and category strengthened
–If the hearer guesses wrong
  Speaker points to the topic as well
  Speaker and hearer adjust the strength of association between lexical item and category
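Putting the pieces together, one round of the game might look like the sketch below, reusing the hypothetical categorize and Lexicon helpers from the earlier sketches; the scoring constants and the repair step (sketched after the failure slides further down) are assumptions, not Steels’ exact procedure.

```python
def play_round(speaker, hearer, context, topic):
    """One guessing game between two agents (illustrative; builds on earlier sketches)."""
    # Speaker categorizes the topic and looks up (or invents) a word for it.
    topic_cat = categorize(context[topic], speaker.prototypes)
    word = speaker.lexicon.best_word(topic_cat)
    if word is None:
        word = speaker.lexicon.invent_word(topic_cat)   # repair: speaker had no word

    # Hearer interprets the word and points at its best guess
    # (picking the first matching object is a simplification).
    hearer_cat = hearer.lexicon.best_category(word)
    guess = None
    if hearer_cat is not None:
        matches = [name for name, feats in context.items()
                   if categorize(feats, hearer.prototypes) == hearer_cat]
        guess = matches[0] if matches else None

    if guess == topic:                                   # success
        speaker.lexicon.reinforce(word, topic_cat, +0.1)
        hearer.lexicon.reinforce(word, hearer_cat, +0.1)
        return True

    # Failure: speaker points at the real topic and both agents repair
    # (see the repair sketch after the failure slides below).
    repair(speaker, hearer, word, topic_cat, context, topic)
    return False
```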

Embodied Machines
Why the Guessing Game can fail:
–Speaker has no word for the object of interest
–Hearer does not have the word
–Hearer has the word but has assigned it to some other concept
–Speaker and hearer have different vantage points

Embodied Machines
If the speaker has no word for the object of interest:
–Speaker creates a word
–Speaker and hearer strengthen the association between the new word and the target

Embodied Machines
If the hearer doesn’t have the word:
–Speaker points to the target
–Hearer creates an association between the new word and the target
–Speaker reduces the strength of the association between the word and the target

Embodied Machines
If the hearer has the word, but it refers to a different concept:
–Hearer points to the (wrong) target
–Game fails
–Speaker points to the correct target
–Hearer creates an association between the word and the new target
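These repair strategies can be collected into the repair step used in the game sketch above. A minimal sketch, again with assumed score adjustments rather than Steels’ actual parameters:

```python
def repair(speaker, hearer, word, topic_cat, context, topic):
    """Adjust both lexicons after a failed game (illustrative; follows the slides above)."""
    # The speaker points at the real topic, so the hearer can see what was meant.
    hearer_topic_cat = categorize(context[topic], hearer.prototypes)

    if hearer.lexicon.best_category(word) is None:
        # Hearer did not know the word at all: adopt it for the pointed-at target.
        hearer.lexicon.adopt(word, hearer_topic_cat)
    else:
        # Hearer knew the word but for another concept: weaken the old reading
        # and add an association with the newly pointed-at target.
        hearer.lexicon.reinforce(word, hearer.lexicon.best_category(word), -0.1)
        hearer.lexicon.adopt(word, hearer_topic_cat)

    # On failure the speaker also weakens its own use of the word.
    speaker.lexicon.reinforce(word, topic_cat, -0.1)
```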

Embodied Machines
Carving up reality
–No a priori categories are given to agents
–Agents can perceive edges → contours → shapes, plus color, luminance, and location of centers
Possible categorization strategies:
–“High thing”
–“Left thing”
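A minimal sketch of what single-channel categories like “high thing” and “left thing” could look like, assuming each segmented object exposes normalized horizontal and vertical position channels in [0, 1]; the thresholds, field names, and example object are illustrative.

```python
# Assumed conventions: hpos 0 = far left, 1 = far right; vpos 0 = bottom, 1 = top.

def is_high_thing(obj):
    """'High thing': categorize on the vertical-position channel alone."""
    return obj["vpos"] > 0.5

def is_left_thing(obj):
    """'Left thing': categorize on the horizontal-position channel alone."""
    return obj["hpos"] < 0.5

triangle = {"hpos": 0.2, "vpos": 0.8}   # hypothetical object read off the whiteboard
print(is_high_thing(triangle), is_left_thing(triangle))   # True True
```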

Embodied Machines
–Correlates in biological cognition
  Cotton, thistle, flax
  –Human: clothing source (cotton, flax) / weed (thistle)
  –Boll weevil: food (cotton) / weeds (thistle & flax)
  Absolute vs. relative reckoning systems
  Young woman / old woman

Embodied Machines
–Speaker: categorizes the topic as VPOS, says ‘lu’
–Hearer: categorizes ‘lu’ as HPOS, says ‘lu?’ (which lu are you talking about?)
–Speaker: points to the target
–Hearer: categorizes the topic as VPOS

Embodied Machines
Speaker and hearer have different vantage points
–Assume agents both have a Left/Right distinction
–Left and Right have a body-based vantage point
  (Figure: speaker S and hearer H view the scene from different positions)

Embodied Machines
–Correlates with metaphorical vantage points
  Sandwich in a garbage can
  –Food or garbage?
  –Depends on life circumstances, personal tolerances, etc.
  16-year-old murderer
  –Child or adult?
  –Depends on the purposes of the categorizer

Embodied Machines
–Ambiguity in the system
  Arises when an agent has already associated a category with a word
  Speaker introduces a new word for the same category
  Negotiation takes place
  Across the population, forms become strengthened or pruned
  Ambiguity can be maintained
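One common way to model this population-level negotiation (taken here as an assumption about the mechanism, with illustrative constants) is lateral inhibition with pruning on top of the hypothetical Lexicon sketched earlier: a successful game strengthens the winning word–category link, weakens competing words for the same category, and drops links whose strength collapses.

```python
PRUNE_THRESHOLD = 0.05   # illustrative cut-off, not a value from the experiments

def negotiate_success(lexicon, word, category, delta=0.1):
    """After a successful game: reinforce the winner, inhibit competing synonyms,
    and prune associations whose strength has collapsed (illustrative)."""
    lexicon.reinforce(word, category, +delta)
    # Lateral inhibition: weaken other words this agent knows for the same category.
    for (w, c) in list(lexicon.assoc):
        if c == category and w != word:
            lexicon.reinforce(w, c, -delta)
    # Pruning: drop associations that are no longer viable.
    for key, strength in list(lexicon.assoc.items()):
        if strength < PRUNE_THRESHOLD:
            del lexicon.assoc[key]
```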