Interactive Task Learning: Language Processing for Rosie
John E. Laird and Peter Lindes, University of Michigan


Interactive Task Learning (Shiwali Mohan, James Kirk, Aaron Mininger)
An agent that:
- learns new task specifications: objects, features, relations, goals and subgoals, possible actions (physical and conceptual), situational constraints on behavior, a policy for behavior, and when the task is appropriate;
- uses natural interaction: language, gestures, sketches, demonstrations;
- comprehends a task description and uses its cognitive and physical capabilities to perform the task;
- learns fast (from small numbers of experiences);
- learns a native representation (assimilated knowledge, fast execution).

Current Research in Interactive Task Learning in Soar: Rosie
Rosie learns novel tasks using language (and goal demonstration):
- Games and puzzles: James Kirk
- Mobile robot tasks: Aaron Mininger

Processing for Task Learning
1. Perceive Environment
2. Parse Language in Context
3. Construct Task Representation
4. Interpret Task Representation
5. Search for Solution
6. Act in the World

Extract Internal Representation of Objects in the World
Perception, Semantic Memory → Working Memory
- Use learned classifiers (visual memory, sensor input)
- Extract relevant properties and relations
- Extract learned mappings to words from semantic memory
Resulting working-memory structure:
  (w1 ^object (y1 ^type block ^color yellow ^size small)
              (r1 ^type block ^color red ^size medium)
              ...
      ^relation (x1 ^type on1 ^arg1 y1 ^arg2 r1)
              ...
      ^property (p1 ^name clear ^object y1))
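The extraction step can be sketched in code. This is a minimal illustration, not Rosie's implementation: `build_working_memory`, the feature names, and the hard-coded relation and property are all hypothetical stand-ins for what the spatial-visual system and learned classifiers would produce.

```python
def build_working_memory(detections):
    """Map raw classifier output into symbolic objects, relations, properties."""
    wm = {"object": {}, "relation": [], "property": []}
    for obj_id, features in detections.items():
        # Keep only the task-relevant symbolic properties.
        wm["object"][obj_id] = {k: features[k]
                                for k in ("type", "color", "size")}
    # A real system derives relations and properties from the spatial-visual
    # system; here they are asserted directly for illustration.
    wm["relation"].append({"type": "on", "arg1": "y1", "arg2": "r1"})
    wm["property"].append({"name": "clear", "object": "y1"})
    return wm

detections = {
    "y1": {"type": "block", "color": "yellow", "size": "small", "px": 204},
    "r1": {"type": "block", "color": "red", "size": "medium", "px": 512},
}
wm = build_working_memory(detections)
```

The resulting dictionary mirrors the three-part working-memory structure above: objects with symbolic features, relations between object identifiers, and derived properties.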

Parsing Language
Perception, Working, Procedural, Semantic → Working Memory
Example dialogue:
› The name of the game is tower-of-hanoi.
› Ok, please teach me the actions and goals of the game.
› You can move a clear block onto a clear object that is larger than the block.
› I don't know the concept clear.
› If an object is not below an object, then it is clear.
› Ok, I now understand the concept clear.
› The goal is that a large block is on the right location and a medium block is on the large block and a small block is on the medium block.
Resulting semantic structure in working memory, built by deliberate reasoning (procedural memory):
  (a1 ^action move22 ^modifier can
      ^arg1 (^type block ^prop clear)
      ^arg2 (^type object ^prop clear ^rel (larger x1)))
Our goal is sufficient, efficient language processing.
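The word-by-word construction of a semantic structure like the one above can be illustrated with a toy incremental parser. Everything here (the tiny lexicon, the frame layout, `parse_incremental`) is invented for illustration; the real system performs this with deliberate reasoning over procedural and semantic memory.

```python
NOUNS = {"block", "object"}
ADJECTIVES = {"clear", "red", "large"}

def parse_incremental(sentence):
    """Toy word-by-word parse of a 'You can move ...' sentence into a frame."""
    frame = {"modifier": None, "action": None, "args": []}
    props = []
    for word in sentence.lower().rstrip(".").split():
        if word == "can":
            frame["modifier"] = "can"
        elif word == "move":
            frame["action"] = "move"
        elif word in ADJECTIVES:
            props.append(word)          # hold properties until a noun arrives
        elif word in NOUNS:
            frame["args"].append({"type": word, "prop": list(props)})
            props = []
    return frame

frame = parse_incremental("You can move a clear block onto a clear object.")
```

Each word updates the frame as it arrives, which is the essence of incremental, repair-based parsing: there is no separate whole-sentence parse tree before meaning is built.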

Construct Task Representation
Working, Procedural → Working, Semantic (& Procedural)
Inputs in working memory (from parsing):
- Goal name: (n1 ^message object-description ^arg1 (^id of ^arg1 (^id name ^arg1 game)) ^predicate tower-of-hanoi)
- Goal description: (g1 ^message object-description ^arg1 (^id goal) ^subclause ...)
- Action description: (a1 ^action move22 ^modifier can ^arg1 (^type block ^prop clear) ^arg2 (^type object ^prop clear ^rel (larger x1)))
Result, built by deliberate reasoning (procedural memory) in working memory and semantic memory:
  (w1 ^game (g1 ^name tower-of-hanoi
            ^struct (a1 ^goal (g1 ...)
                     ^operator (c1 ^name stack ^arg (C11 ...) (C12 ...)))))
Learning (chunking) converts deliberate processing to reactive processing.

Interpret and Operationalize Task Representation
Working, Procedural → Working, Procedural
Environment representation:
  (w1 ^object (o1 ^type block ^color yellow ^size small)
              (o2 ^type block ^color red ^size medium)
              ...
      ^relation (r1 ^type on ^arg1 o1 ^arg2 o2)
              ...
      ^property (p1 ^name clear ^object o1))
Task representation:
  (w1 ^game (g1 ^name tower-of-hanoi
            ^struct (a1 ^goal (g1 ...)
                     ^operator (c1 ^name stack ^arg (C11 ...) (C12 ...)))))
Combining the two through deliberate reasoning (procedural memory) yields an operator instance:
  (o1 ^name stack ^arg1 block1 ^arg2 block3)
Chunking converts deliberate processing to reactive processing (20x speedup).
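Operationalizing an operator amounts to matching its argument constraints against the objects currently in working memory. The following is a hedged sketch under assumed data shapes; `instantiate` and the constraint sets for "stack" are illustrative, not Rosie's representation.

```python
def instantiate(operator_args, objects):
    """Return, per operator argument, the object ids satisfying its constraints."""
    bindings = []
    for constraints in operator_args:
        matches = [oid for oid, feats in objects.items()
                   if all(feats.get(k) == v for k, v in constraints.items())]
        bindings.append(matches)
    return bindings

objects = {
    "o1": {"type": "block", "color": "yellow", "size": "small"},
    "o2": {"type": "block", "color": "red", "size": "medium"},
}
# Illustrative constraints for a "stack" operator's two arguments.
args = [{"type": "block", "size": "small"},
        {"type": "block", "size": "medium"}]
```

Once this deliberate matching has been performed, chunking can compile it into rules that propose the instantiated operator directly, which is the source of the speedup the slide describes.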

Search for a Solution
Working, Procedural → Working, Procedural
- If search fails, the agent asks for instruction.
- Search is hierarchical, so it can succeed at the abstract level but fail in primitive execution, and ask for help there.
- Deliberate reasoning (procedural memory with working memory).
- Chunking converts the search results into rules that implement a policy to select actions directly.
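The fail-then-ask control structure can be sketched as a search routine with an instruction fallback. This is a minimal, assumed-shape illustration (`solve`, `act_or_ask`, and the toy domain are hypothetical), not the hierarchical search Rosie actually uses.

```python
def solve(state, goal_test, actions, depth=3):
    """Depth-limited search; returns an action list, or None on failure."""
    if goal_test(state):
        return []
    if depth == 0:
        return None
    for name, apply_fn in actions.items():
        nxt = apply_fn(state)          # None means the action is inapplicable
        if nxt is None:
            continue
        rest = solve(nxt, goal_test, actions, depth - 1)
        if rest is not None:
            return [name] + rest
    return None

def act_or_ask(state, goal_test, actions):
    """Search for a plan; on failure, fall back to asking the instructor."""
    plan = solve(state, goal_test, actions)
    return plan if plan is not None else "ask-instructor-for-help"

# Toy domain: count from 0 to 2 with a single increment action.
actions = {"inc": lambda s: s + 1 if s < 2 else None}
```

In the hierarchical case the same fallback applies per level: an abstract plan can be found while a primitive step fails, triggering a request for help only at that step.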

[Architecture diagram: Perception supplies word-category mappings to Parsing, which connects to Interaction and to verb, noun, and preposition learning. Procedural Memory holds parsing, action knowledge, primitive actions, mapping knowledge, constructions, and innate mappings. Semantic Memory holds preposition-spatial relation mappings, verb-operator mappings, noun/adjective-perceptual symbol mappings, fixed locations, primitive verb-operator mappings, and innate words. Episodic Memory holds the agent's experiences. Working Memory connects to the Spatial Visual System (spatial primitives) and holds action memory structures and the learned task structure (goals and operators) produced by task learning.]

Language Processing Goals
- A flexible, extendable parser for interactive task learning, inspired by human-level processing
- Grounds understanding in the environment (when possible)
- Uses word-by-word, incremental, repair-based parsing
  - Inspired by NL-Soar
  - Extended to constructions, word-retrieval ambiguity resolution, and real-world referent grounding
  - Incorporates syntactic, semantic, and pragmatic processing
- Uses Embodied Construction Grammar (ECG)
  - A theory of complex language usage and of the connections between form (syntax) and meaning (semantics and pragmatics)
  - Syntax and semantics are associated with words, phrases, and constructions

Two Approaches
Laird: all language-specific knowledge (syntax, semantics) starts in semantic memory
- Pro: a (vague) story for how this information could be learned from experience
- Pro: in production; used by Rosie for all language processing
- Con: not linguistically sound (cognitive linguistics)
Lindes: all language-specific knowledge (syntax, semantics) starts in procedural memory
- Pro: linguistically sound (cognitive linguistics)
- Pro: a compiler from the Embodied Construction Grammar formalism into Soar rules: English and Spanish!
- Con: no good story for how it can be learned
- Con: not yet in production

Parser Properties
- Referring expressions are grounded in the environment; context is used for referent resolution of objects.
  "Move the red block behind the blue block to the right of the green block."
- Creates internal hypothetical objects and supports anaphoric references.
  "If a location is next to a clear location but it is not diagonal with the clear location then it is adjacent to the clear location."
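Grounding a referring expression like "the red block to the right of the green block" can be sketched as filtering candidates by properties and then by a relation to another grounded object. The function `ground` and the data shapes are assumptions for illustration, not the parser's internals.

```python
def ground(description, objects, relations):
    """Return ids of objects matching the description's properties and relation."""
    candidates = [oid for oid, f in objects.items()
                  if all(f.get(k) == v for k, v in description["props"].items())]
    rel = description.get("relation")
    if rel:
        # Recursively ground the anchor object, then keep candidates
        # that stand in the stated relation to some anchor.
        anchors = ground(rel["target"], objects, relations)
        candidates = [c for c in candidates
                      if any((rel["type"], c, a) in relations for a in anchors)]
    return candidates

objects = {
    "b1": {"type": "block", "color": "red"},
    "b2": {"type": "block", "color": "red"},
    "b3": {"type": "block", "color": "green"},
}
relations = {("right-of", "b2", "b3")}   # b2 is to the right of b3
ref = {"props": {"type": "block", "color": "red"},
       "relation": {"type": "right-of",
                    "target": {"props": {"color": "green"}}}}
```

With two red blocks in view, the relation is what disambiguates the reference, which is exactly the role context plays in the parser's referent resolution.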

Example Sentences, Last Year
1. Red is a color.
2. The large one is red.
3. This is a big triangle.
4. Store the green block.
5. What is inside the pantry?
6. It is on the big green block.
7. Move the green block to the left of the large green block to the pantry.
8. Stack the red triangle, the medium block, and the large block.
9. Move forward until you see a doorway.

Example Sentences, This Year
1. If a card is on the deck and it is not below another card then it is the top card.
2. Put it on this.
3. The goal is that all the missionaries and cannibals are on the right side of the river.
4. You can move a person that is on the current bank and another person that is on the current bank and the boat onto the opposite bank.
5. If the locations between a clear location and a captured location are occupied then you can place a piece on the clear location.
6. The goal is that all locations are covered and the number of captured locations is more than the number of occupied locations.

Sentence Processing in LUCIA
[Diagram: ECG grammar files pass through a translator to produce grammar rules, which combine with hand-coded infrastructure rules inside the Comprehender of a Soar agent. The Comprehender takes input words (e.g. "Pick up the green sphere on the stove."), consults a world model and ontology, and produces action messages that drive Rosie operations.]

The Comprehension Engine
[Diagram: the Comprehender maps words to action messages through a cycle of operators: comprehend-word, lexical-access, match-construction, ground-x, lookup-x, attach-x, resolve-pronoun, and comprehend-word-done. ECG-generated grammar rules and hand-coded infrastructure rules drive the cycle.]

Lucia Example
"Pick up the green sphere." is comprehended word by word:
1. Pick   2. up   3. the   4. green   5. sphere.
The Comprehender outputs: ActOnIt, object: large-green-sphere1

Stage 1: "Pick"
comprehend-word fires; lexical-access retrieves the lexical item PICK and its PickVerb construction; match-construction and lookup-action build an Action Descriptor.

Stage 2: "up"
lexical-access retrieves UP; match-construction compositionally combines PickVerb and UP into a PickUp construction, extending the Action Descriptor.

Stage 3: "the"
lexical-access retrieves THE; no larger construction matches yet.

Stage 4: "green"
lexical-access retrieves GREEN; lookup-property grounds it to the perceptual property green (grounded in the environment).

Stage 5: "sphere."
lexical-access retrieves SPHERE; match-construction builds a referring expression (RefExpr); ground-reference resolves it to the entity block sphere1 (green1, large1); a further match-construction completes the Transitive Command, yielding the action message ActOnIt with object large-green-sphere1.
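The five stages above can be summarized as a toy comprehend loop: each word triggers lexical access, construction matching combines items, and the finished utterance is resolved into an action message. The lexicon, `comprehend`, and the output shape are all invented for illustration; LUCIA implements this cycle with Soar rules generated from ECG.

```python
LEXICON = {"pick": "PICK", "up": "UP", "the": "THE",
           "green": "GREEN", "sphere": "SPHERE"}

def comprehend(sentence):
    """Word-by-word cycle: lexical-access, then construction matching."""
    stack = []
    for word in sentence.rstrip(".").lower().split():
        stack.append(LEXICON[word])            # lexical-access
        if stack[-2:] == ["PICK", "UP"]:       # match-construction
            stack[-2:] = ["PICK-UP"]           # compose verb + particle
    # Resolve the referring expression: VERB DET [props...] NOUN
    verb, _det, *props, noun = stack
    return {"action": verb,
            "object": {"type": noun.lower(),
                       "props": [p.lower() for p in props]}}

msg = comprehend("Pick up the green sphere.")
```

The key property the sketch preserves is incrementality: constructions compose as soon as their parts are present ("Pick" + "up" at stage 2), rather than after the full sentence is read.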

Future Work
- Continue to extend coverage of syntax, …
- "New" parser:
  - Language knowledge in semantic memory (Laird)
  - Linguistically sound; better ontology, … (Lindes)
  - Compiled from ECG (Lindes)
  - Take advantage of spreading activation to aid retrieval of ambiguous words and constructions (S. Jones)