Integrated Episodic and Semantic Memory in Robotics Steve Furtwangler, with Robert Marinier, Jacob Crossman.



Introduction

The robotics domain has some unique challenges. This talk covers general patterns and issues we encountered working in robotics. Specifically, I will talk about:
- Measuring similarity in semantic memory
- Using episodic and semantic memory together
- Required or prohibited query conditions
- Recreation of state

Using Episodic Memory for Partial Matches

The agent creates a statistical model of its world, and the statistics are stored in semantic memory: long-term identifiers are created for each thing we are modeling, and statistics are kept on these identifiers. Sometimes the agent needs to find similar things, but semantic memory doesn't support partial matches, so we decided to leverage episodic memory to do this instead. Example: if the agent has little or no statistical data for this exact situation, it can ask whether it was ever in a situation like this one; if so, it looks up the statistical data for that situation in semantic memory.
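The fallback described above can be sketched as follows. This is a conceptual illustration, not Soar's actual API: the store, the feature-overlap scoring, and all function names are assumptions standing in for semantic memory's exact match and episodic memory's partial match.

```python
# Exact semantic-memory lookup, falling back to a best-partial-match
# search (the role episodic memory plays in the talk). Situations are
# dicts of feature -> value; statistics are keyed by the full feature set.

def lookup_stats(situation, semantic_store):
    """Exact lookup: statistics keyed by the complete feature set."""
    return semantic_store.get(frozenset(situation.items()))

def best_partial_match(situation, semantic_store):
    """Score stored situations by shared feature/value pairs and
    return the statistics of the closest one."""
    cue = set(situation.items())
    best = max(semantic_store, key=lambda k: len(cue & k), default=None)
    return semantic_store.get(best)

def get_stats(situation, semantic_store):
    return lookup_stats(situation, semantic_store) \
        or best_partial_match(situation, semantic_store)

store = {
    frozenset({("terrain", "road"), ("visibility", "High")}): {"success": 12, "failure": 2},
    frozenset({("terrain", "trees"), ("visibility", "Low")}): {"success": 3, "failure": 9},
}

# No exact entry for this situation, but it shares "terrain: road"
# with the first stored situation, so its statistics are reused.
stats = get_stats({"terrain": "road", "visibility": "Med"}, store)
```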

Episode Representation - Unique Cues - Cell

[Diagram: a cell C1 on state S1 has invariable attributes (x 1, y 2) and variable attributes (High, Low, road, trees). A cue Q1 is built from only the invariable attributes (x 1, y 2) and maps to an id.] Find the LTI with that cue, or create a new one if it is not found.
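The find-or-create step can be sketched like this. It is a minimal illustration, assuming (as on the slide) that a cell's cue is its invariable attributes; the class and method names are hypothetical, not Soar's API.

```python
# Find-or-create for long-term identifiers: the cue is the set of a
# cell's invariable attributes (here x and y). The same cue always
# resolves to the same LTI; an unseen cue mints a new one.

class LTIStore:
    def __init__(self):
        self._by_cue = {}
        self._next_id = 1

    def find_or_create(self, **invariable_attrs):
        cue = frozenset(invariable_attrs.items())
        if cue not in self._by_cue:
            self._by_cue[cue] = self._next_id   # new long-term identifier
            self._next_id += 1
        return self._by_cue[cue]

store = LTIStore()
a = store.find_or_create(x=1, y=2)
b = store.find_or_create(x=1, y=2)   # same cue -> same LTI
c = store.find_or_create(x=3, y=4)   # new cue  -> new LTI
```

Variable attributes (terrain, visibility) deliberately stay out of the cue, so the cell keeps one identity even as its observed features change.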

Episode Representation - Unique Cues - Path

[Diagram: state S1 has a path P1 containing cells C1, C2, … with attributes such as High, Low, road, trees.] Paths may have complex, deep working memory structures. We create a one-level-deep cue for a path, using the unique ids of its cells and the order of the cells.

Measuring Similarity

[Table: cues vs. results with match scores, over discretized values V.Low, Low, Med, High, V.High and coarser values M.Low, M.High.] With a single discretized feature, a cue of (feature: Low) scores 0 against results of Med or High, even though Med is adjacent: one dimension doesn't capture similarity. Adding a second, coarser dimension helps: with feature2 mapping Low and Med to M.Low and High to M.High, a cue of (feature1 Low, feature2 M.Low) scores 1 against (feature1 Med, feature2 M.Low) but still 0 against (feature1 High, feature2 M.High).
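The two-dimension trick can be made concrete with a small sketch. The bucket names follow the slide (Low/Med/High plus M.Low/M.High); the overlap-counting score is an assumption standing in for episodic memory's match scoring.

```python
# Pair each fine-grained bucket with a coarser one so that adjacent
# values share a feature and therefore get a nonzero match score.

COARSE = {"Low": "M.Low", "Med": "M.Low", "High": "M.High"}

def features(level):
    """A value contributes two features: its fine and coarse buckets."""
    return {("feature1", level), ("feature2", COARSE[level])}

def match_score(cue_level, result_level):
    """Count shared features between cue and result."""
    return len(features(cue_level) & features(result_level))

match_score("Low", "Low")    # both dimensions match
match_score("Low", "Med")    # coarse dimension still matches
match_score("Low", "High")   # no overlap at all
```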

Episodic and Semantic Memory Conflicts

The objects in memory are identified in semantic memory, and some of the attributes on these objects (statistics) change over time. These long-term identifiers are referenced on the top-state, so they show up in episodic memory. However, when episodic memory recreates an episode, it recreates the attributes and values that the LTI had at the time.

Example of Problem

[Diagram: a query Q1 retrieves an old result; the retrieved object carries the old ^value alongside the LTI's current ^value.] Problem: the agent cannot distinguish the value in the episode from the current value; the attribute "value" becomes a multi-valued attribute.

Solution: Long-Term Identifier Usage Pattern

[Diagram: an episodic memory cue (Q1: cell C1 with road, Med, Low, trees) and its result recreate the cell with its id; that id is then used as a semantic memory cue (Q2), and the semantic memory result holds the statistics (e.g. success 2, failure 12).] Statistics are not stored with episodes. Conceptually, there are two kinds of LTIs.
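The usage pattern can be sketched as two separate stores. This is an illustrative data-structure sketch, not Soar's representation; the dict layout and function name are assumptions.

```python
# Episodes reference only a stable identity LTI; the mutable statistics
# live in a separate semantic-memory record keyed by that id. Recreating
# an old episode therefore never resurrects stale statistics.

semantic_stats = {2: {"success": 2, "failure": 12}}   # id -> statistics

# A retrieved episode contains the cell's invariable structure and its
# id, but no statistics.
episode = {"cell": {"x": 1, "y": 2, "id": 2}}

def stats_for(retrieved_episode):
    # Second step of the pattern: use the id from the episodic result
    # as a semantic-memory cue to fetch the current statistics.
    return semantic_stats[retrieved_episode["cell"]["id"]]
```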

Required (or Prohibited) Query Conditions

Queries to episodic memory often have two different kinds of conditions:
- Things that have to exist in the episode (or must not exist); these tend to decide whether the episode is even relevant
- Things which are optional, but should be as similar as possible

Example: query for a similar situation where the agent decided to go right, in order to reason about what might happen if it turns right now. The result is a situation like the current situation, only the agent went left. Going left has to be prohibited, until the agent gets a memory of going right. This leads to a common pattern…

Solution: Episodic Memory Loop Pattern

[Flow diagram:] Construct Query → Retrieve Episode → Filter Episode → Continue? If so, Annotate Input and loop back to Construct Query; otherwise, Map Result.
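The loop pattern can be sketched in a few lines. This is a minimal illustration, assuming a hypothetical `retrieve()` callback that returns the best-matching episode while excluding already-rejected ones (standing in for annotating the query input); it is not Soar code.

```python
# Construct query / retrieve / filter loop: keep rejecting episodes that
# fail the hard (required/prohibited) conditions until one passes or
# nothing remains.

def epmem_loop(retrieve, acceptable, max_tries=10):
    rejected = set()
    for _ in range(max_tries):
        episode = retrieve(exclude=rejected)   # construct query + retrieve
        if episode is None:
            return None                        # nothing left to try
        if acceptable(episode):                # filter episode
            return episode                     # map result
        rejected.add(episode["id"])            # annotate input, loop again
    return None

# Toy store: the best graded match went left; we need one that went right.
episodes = [{"id": 1, "action": "left"}, {"id": 2, "action": "right"}]

def retrieve(exclude):
    remaining = [e for e in episodes if e["id"] not in exclude]
    return remaining[0] if remaining else None

best = epmem_loop(retrieve, lambda e: e["action"] == "right")
```

Each rejected episode costs a full retrieval, which is exactly the overhead the next slide complains about.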

Time Spent Recreating State

We often create episodic memory queries to answer a specific question: "When I was last in this location, what time of day was it?" Retrieving the episode creates a lot of WMEs to recreate the whole state: "Last time you were at this location, it was a Tuesday, it was raining, your fuel was at 90%… yada yada yada… oh, and it was 5:35pm." The time to recreate a state is based, in part, on the size of that state. We often look at one small piece of the result and throw it all away, causing all of those WMEs to be immediately removed. The filtering loop may cause this to happen many times.

Nuggets
- Reduced instances of repeated failure: the agent doesn't do the same dumb thing twice
- Constructed a model of the environment/plans: accuracy of estimations improves with experience
- Incorporated models of similar environments/plans: the agent came to useful conclusions for new (untested) plans

Wish List (Coal)
- Partial matching for semantic memory: using episodic memory to achieve this is a hack
- Metric/custom comparison functions: necessary for queries about similarity in space, or to weight features
- Safeguards for episodic memory retrievals of long-term identifiers: to reason about what an LTI looked like in the past, as opposed to now
- Require/prohibit queries in episodic and semantic memory: would eliminate the epmem loop pattern used to filter out bad results
- Ability to specify a sub-section of state to retrieve from episodic memory: we often only care about a few key WMEs, and reconstructing the entire state takes time