Impact of Working Memory Activation on Agent Design
John Laird, University of Michigan


What's the Problem?
How do you write a Soar agent so that it can survive the automatic removal of working memory elements caused by activation decay? Many problems arise from violating Soar's closed-world assumption: Soar assumes that if something isn't in working memory it is not true, whereas it may simply not have been retrieved from semantic memory yet.

Working Memory Activation & Semantic Memory: Reminders
1. Decay-based removal only applies to WMEs with long-term identifiers, so you don't have to worry about arbitrary structures disappearing.
2. The complete object is removed from working memory, not individual WMEs (removing individual WMEs would violate the closed-world assumption even more!).
3. Always use mirroring with semantic memory: it keeps WM and SMEM in sync without deliberate saves, so if something is removed from WM, it is still in SMEM.
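For concreteness, here is a minimal configuration sketch of the setup these reminders assume: working-memory activation with decay-based removal, plus semantic memory with mirroring. The parameter names are recalled from the Soar command line of this era and are assumptions; check them against the manual for your version.

# Turn on working-memory activation and decay-based removal
# (exact parameter and value names may differ by Soar version)
wma --set activation on
wma --set decay-rate 0.5
wma --set forgetting on

# Turn on semantic memory and mirroring of WM changes into SMEM
smem --set learning on
smem --set mirroring on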

Mirroring WMEs
When a WME is added to an LTI (long-term identifier), that WME is automatically added to semantic memory. This eliminates the need for most deliberate saves.
For a graph structure (such as a topological map):
1. Deliberately save the root to SMEM (see the sketch below).
2. All incremental substructure is then automatically added to SMEM.
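A sketch of step 1: a deliberate save of the map root through the standard ^smem.command ^store interface. The operator and attribute names (save-map, ^topological-map) are hypothetical. Once the root is stored and mirroring is on, substructure added under it is picked up automatically (step 2).

sp {robot*apply*save-map-root
   (state <s> ^operator.name save-map          # hypothetical operator
              ^topological-map <map>           # hypothetical root of the map graph
              ^smem.command <cmd>)
-->
   (<cmd> ^store <map>)}                       # deliberate save of the root object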

Example From Robot World
[Figure: a map with Room 1, Room 2, Room 3, Door 11, Door 22, Gateway 10, and Gateway 13, shown alongside the corresponding working-memory structures, e.g. a wall (^id 33 ^type wall ^x 34 ^y 44 ^dir south), a gateway (^id 10 ^type gateway ^x 35 ^y 64 ^to 11), a doorway (^id 11 ^type doorway ...), and a room (^id 2 ^type room ...).]

Desired Behavior
[Figure slides: robot-world screenshots illustrating the desired behavior.]

Problem #1: Retrieval
How does an agent maintain what it needs in working memory?
1. Proactive retrieval: maintain a "cloud" of possibly relevant structures.
2. Reactive retrieval: retrieve structures when the agent doesn't know what to do.

Proactive
[Figure slides: robot-world screenshots illustrating proactive retrieval.]

Proactive Retrieval
1. Manually identify the structures and substructures that are necessary for making decisions in the current state.
   - For example: if trying to reach the storage room, and the storage room is next to the current room, go to the gateway that leads to the storage room.
2. If those structures are ever missing, select an operator to retrieve them.
   - If the next room is missing, then retrieve it:

sp {xxx
   (state <s> ^name substate-name ^current-location.next <next>)
   (<next> -^id)
-->
   (<s> ^operator <o> + = >)
   (<o> ^lti-retrieve <next>)}

   - If the retrieval operator is not made best (>), the agent might select a task operator based on incomplete information.
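The proposal above only marks what to retrieve. A task-independent apply rule can then issue the actual retrieval through the standard ^smem.command ^retrieve interface. This is a sketch assuming the ^lti-retrieve convention used in the rule above; the same apply rule also serves the reactive proposal on the next slide.

sp {apply*lti-retrieve
   (state <s> ^operator <o> ^smem.command <cmd>)
   (<o> ^lti-retrieve <lti>)
-->
   (<cmd> ^retrieve <lti>)}      # pulls the object's augmentations back into working memory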

Representation Lesson
Objects must have a common attribute (^id) with unique values so the agent can check whether a structure is in working memory. Otherwise it will keep trying to retrieve an object even after the retrieval has succeeded.

Reactive Retrieval
1. If the agent can't make progress without some piece of information, retrieve it when a state no-change impasse occurs.
   - If trying to move to a gateway without information on where it is, the agent will impasse when it tries to select operators to move to it:

sp {xxx
   (state <s> ^impasse no-change ^attribute state ^superstate <ss>)
   (<ss> ^name go-to-gateway ^destination-gateway <gw>)
   (<gw> -^id)
-->
   (<s> ^operator <o> + =)
   (<o> ^lti-retrieve <gw>)}

[Figure slides: robot-world screenshots contrasting proactive and reactive retrieval.]

Discussion
Reactive retrieval is more elegant:
- Retrieval is done only when the agent needs it.
- One can imagine task-independent rules that retrieve missing structures.
- But it relies on a complete failure to make progress.
I don't believe it is possible to design an agent so that it needs only reactive retrieval, but that is beyond the scope of this talk.

Problem #2: Incremental Creation (Avoiding Duplicate Creation)
1. When creating a representation of a new room in working memory (mirrored into SMEM), operators must be used to create the structures, because the room structure (walls, gateways) must be persistent and because there may be a variable amount of substructure.
2. Ambiguity arises when a wall structure has been removed: the agent doesn't know whether the missing wall is in SMEM (it can't test the id). Should a new wall be created, or should the existing wall be retrieved?
3. Solution: record on the room structure the id of each substructure when it is created (^wall-id 34, ^gateway-id 45, ...), and only create a new wall if its id has not already been recorded.

Rules for Recording Walls

sp {robot*propose*record-smem-new-wall
   (state <s> ^name robot ^current-location <loc> ^io.input-link.area <area>)
   (<loc> ^id <id> -^wall-id <wid>)
   (<area> ^id <id> ^wall <w>)
   (<w> ^id <wid>)
-->
   (<s> ^operator <o> + =)
   (<o> ^name record-smem-new-wall ^wall <w>)}

sp {apply*record-smem-current-area-wall
   (state <s> ^operator <o> ^current-location <loc>)
   (<loc> ^id <id>)
   (<o> ^name record-smem-new-wall ^wall <w>)
   (<w> ^x <x> ^y <y> ^id <wid> ^direction <dir>)
-->
   (<loc> ^wall <new-wall> ^wall-id <wid>)
   (<new-wall> ^type wall ^x <x> ^y <y> ^id <wid> ^direction <dir>)}

Problem #3
If an internal subtask takes too long to perform, the agent may continually forget and never complete it:
1. During extended internal planning, the agent can forget the original structures.
2. Re-retrieving those structures might blow away intermediate results.
Approaches:
1. Do retrievals in substates (sometimes difficult).
2. Cheat: add rules whose only purpose is to keep structures active (redundant elaboration rules), as sketched below.
3. Use chunking, which eliminates the need for many intermediate structures.
4. Use different search methods (progressive deepening!).
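A sketch of the "cheat" in approach 2: a redundant elaboration rule whose only job is to keep re-testing the structures the planner still needs, so that they receive activation boosts whenever it fires. All names here (plan-route, ^room-of-interest, ^keeping-active) are hypothetical.

sp {elaborate*keep-active*planning-structures
   (state <s> ^name plan-route ^room-of-interest <r>)   # hypothetical planning state
   (<r> ^id <id>)
-->
   (<s> ^keeping-active <id>)}   # i-supported; firing re-references <r> and <id>, boosting their activation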

Problem #4: Negation (Closed World)
Negations don't help activation:
- If the only tests of an attribute are negations, it will decay and be removed.
- Structures that prevent action do not receive activation.
Don't use a lone negation on an object:
- (<obj> -^attribute value)
- You can't tell whether the object doesn't have that value or whether the object has decayed.
- Always test the id as well as the negation (see the rule sketch below):
- (<obj> ^id <id> -^attribute value)
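A sketch of the safe pattern in a full rule, with hypothetical names (open-door, ^destination-door, ^locked); the positive ^id test distinguishes "the door really isn't locked" from "the door's structure has decayed out of working memory."

sp {robot*propose*open-door
   (state <s> ^name robot ^destination-door <d>)
   (<d> ^id <id> -^locked true)      # positive ^id test guards the negation
-->
   (<s> ^operator <o> +)
   (<o> ^name open-door ^door <d>)}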

Problem #5: Hypothetical Reasoning
Hypothetical reasoning could overwrite the contents of SMEM if you use mirroring and aren't careful:
- Changes made to SMEM structures during look-ahead will cause them to be changed in SMEM.
- You must copy SMEM structures if they will be modified (see the sketch below). Copying is standard in look-ahead, but mirroring might add new twists.
- No experience with this yet.
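A sketch of the copy step, with hypothetical names: before look-ahead modifies a room, its attributes are copied onto a new short-term identifier so the changes never propagate to the mirrored LTI. A real implementation would also need to copy deeper substructure (one rule per level, or a deliberate copy operator).

sp {evaluate*apply*copy-room-for-lookahead
   (state <s> ^operator.name copy-room-for-lookahead   # hypothetical operator
              ^room-to-modify <r>)
   (<r> ^id <id> ^type <type>)
-->
   (<s> ^room-copy <c>)
   (<c> ^id <id> ^type <type> ^copy-of <r>)}            # edits during look-ahead go on <c>, not <r>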

Preliminary Data on Activation in Robot Task
[Figure slides showing preliminary results at different decay rates.]
The decay rate determines how fast activation decays:
- 3 = very slow; few if any removals, similar to no decay.
- 6 = fast; the agent sometimes has difficulty maintaining everything it needs for the problem.

Nuggets and Coal
- Writing your agent takes more care: you must include special attributes (id) on SMEM objects.
- This creates some pressure toward flat representations.
- Following a few basic rules worked for me.