REM-II: A model of the formation and use of episodic memory and semantic knowledge. Shane T. Mueller, Ph.D., Indiana University / Klein Associates/ARA; Rich Shiffrin, Indiana University, Memory, Attention & Perception Lab.

Overview
Prior knowledge helps us interpret the world, storing meaningful episodic information. Sparse but durable information accrues to form knowledge. (Slide diagram: Events, Episodic Memories, Knowledge.)

Past Approaches: Episodic Memory and Semantic Knowledge
Episodic memory: SAM, Minerva, TODAM; REM, BCDMEM.
Semantic memory / knowledge structure: Collins & Quillian / Collins & Loftus, Rosch; HAM, ACT-R; Rumelhart & McClelland; attractor network models, ART.
Producing representations of real words: Webster, Roget, CYC, WordNet, word association norms; LSA, HAL, the Topics model (corpus analysis).

Goals of REM-II
Begin with a reasonably sophisticated model of episodic memory.
Specify how episodes accrue to form knowledge.
Specify how knowledge is used to encode new episodes.
Model the iterative process by which knowledge is formed and used.

Overview of the REM-II Model
1) A feature-based representational system.
2) Knowledge accumulates as feature co-occurrences.
3) Episodic memory is encoded from knowledge.
4) Memory search is a Bayesian process.
5) Knowledge differentiates by sharing features with co-occurring concepts.

(1) Episodic Memory Representation
Features are any sufficiently unitized memory structure.
Value-strength traces enable coding of both a feature's value and its strength, and represent impoverished episodic traces.
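A minimal sketch of what a value-strength trace might look like in code; the Feature class and its field names are illustrative assumptions, not part of the original model specification.

```python
# Hypothetical sketch of a value-strength episodic trace (illustrative only).
from dataclasses import dataclass

@dataclass
class Feature:
    value: int       # discrete feature value (0 here means "not stored")
    strength: float  # how strongly the feature was encoded

# An episodic trace is an ordered list of feature slots, most of them
# impoverished (weakly encoded or not stored at all).
trace = [Feature(value=3, strength=0.8),
         Feature(value=0, strength=0.0),   # feature not encoded
         Feature(value=7, strength=0.2)]
```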

(2) Feature Representations in Semantic Knowledge
A concept corresponds to an accumulation of feature co-occurrences across episodes.
Knowledge matrix: a set of multiple conditional representations (each row gives the distribution of features that co-occur with a given feature).

(2) Forming the LTM Co-occurrence Matrix from Episodes
Any episodic trace can produce a co-occurrence matrix (as in the slide's example).
Knowledge accrues by incorporating the co-occurrence matrix from each individual episode into the current knowledge structure, as sketched below.
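A rough sketch of this accrual step, assuming the knowledge matrix is stored as nested dictionaries of co-occurrence weights; the function names, the simple pairwise counting, and the learning-rate parameter are assumptions for illustration.

```python
# Illustrative sketch: accrue knowledge as feature co-occurrence counts.
from collections import defaultdict

def cooccurrence_from_trace(trace_features):
    """Count how often each pair of feature values co-occurs in one episode."""
    counts = defaultdict(lambda: defaultdict(float))
    for i, fi in enumerate(trace_features):
        for j, fj in enumerate(trace_features):
            if i != j:
                counts[fi][fj] += 1.0
    return counts

def accrue(knowledge, episode_counts, rate=0.1):
    """Fold one episode's co-occurrence counts into the knowledge matrix."""
    for fi, row in episode_counts.items():
        for fj, n in row.items():
            knowledge[fi][fj] += rate * n
    return knowledge

knowledge = defaultdict(lambda: defaultdict(float))
accrue(knowledge, cooccurrence_from_trace([3, 7, 7, 12]))
```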

(2) LTM Co-occurrence Matrix
Over time, a complex matrix representation forms. In the slide's example, the word has two primary concepts, one stronger than the other. (Primary meanings could be extracted using factor analysis.)

(3) Encoding Episodes from the Knowledge Matrix
Generic encoding: pick a row, pick a feature from it, pick a new row based on what has been sampled, and repeat.
Biased encoding: pick a row cued by another trace, pick a feature from that row of the current matrix, and repeat.
A sketch of the generic procedure follows.
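A hedged sketch of the generic encoding loop, reusing the nested-dictionary knowledge matrix from the earlier sketch; generic_encode and its sampling details are illustrative rather than the authors' algorithm.

```python
# Generic encoding: repeatedly pick a row, sample a feature from that row's
# conditional distribution, then condition the next pick on what was sampled.
import random

def generic_encode(knowledge, n_features=5):
    """knowledge: dict mapping feature -> {co-occurring feature: weight}."""
    row = random.choice(list(knowledge.keys()))
    trace = []
    for _ in range(n_features):
        cond = knowledge.get(row, {})
        if not cond:                      # no conditional information for this row
            break
        feats = list(cond.keys())
        weights = list(cond.values())
        feat = random.choices(feats, weights=weights, k=1)[0]
        trace.append(feat)
        row = feat                        # next pick is conditioned on the sample
    return trace
```

Biased encoding would differ only in how each row is chosen: it is cued by the features of another trace rather than by what has just been sampled.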

(4) Formation of the Model Mental Lexicon
How do representations with multiple aspects form?
Principle: concepts that appear together grow more similar (a Hebbian-like principle).
Each word starts with a 'unique' representation. During encoding, the model keeps track of the local feature context, so words are encoded in a way biased by that context, and knowledge is re-stored along with some features from the local context.

Semantic Spaces Demonstration
The model experiences an environment in which words from Set A and Set B do not occur together, while polysemous words 1 and 2 occur with both A and B but not with each other.
Local context is formed by selecting features from the recent past that were unlikely to have occurred by chance.
In the demo: form a matrix representation for each word, compute the similarity between each pair of obtained matrices, and perform 2-D MDS to visualize the space over time (see the sketch below).
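A sketch of the visualization step, assuming each word's accumulated matrix is a NumPy array; cosine similarity and scikit-learn's MDS are stand-ins for whatever similarity measure and scaling routine the original simulation used.

```python
# Pairwise similarity between word matrices, projected to 2-D with MDS.
import numpy as np
from sklearn.manifold import MDS

def matrix_similarity(m1, m2):
    """Cosine similarity between two flattened word matrices."""
    a, b = m1.ravel(), m2.ravel()
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12))

def mds_coords(word_matrices):
    """word_matrices: dict mapping word -> 2-D NumPy array."""
    words = list(word_matrices)
    n = len(words)
    dissim = np.zeros((n, n))
    for i in range(n):
        for j in range(n):
            dissim[i, j] = 1.0 - matrix_similarity(word_matrices[words[i]],
                                                   word_matrices[words[j]])
    coords = MDS(n_components=2, dissimilarity='precomputed',
                 random_state=0).fit_transform(dissim)
    return dict(zip(words, coords))
```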

Results: Bank Demonstration
Each row of the figure depicts the composite representation of a word; words grow more similar with experience.

MDS Visualization of encoded episodes

Model for Episodic Memory Tasks
The previous simulation shows that semantic spaces can emerge within the matrix representation, and these semantic spaces may produce the typical frequency effects seen in episodic recognition memory.
The system must be extended to form a model of episodic memory tasks (e.g., list memory). Three main steps: form the model lexicon, encode episodes, and compute likelihoods.

Step 1: Form the Model Mental Lexicon
Real words occur with very different frequencies: a few common words account for most of the tokens, while many rare words each occur only occasionally (a Zipf-like distribution).
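A small sketch of such an environment, assuming a Zipf-like 1/rank frequency distribution; any heavy-tailed distribution would make the same point.

```python
# Generate a token stream in which a few word types dominate and most are rare.
import numpy as np

def zipf_token_stream(n_types=1000, n_tokens=50_000, exponent=1.0, seed=0):
    rng = np.random.default_rng(seed)
    ranks = np.arange(1, n_types + 1, dtype=float)
    probs = ranks ** (-float(exponent))
    probs /= probs.sum()
    return rng.choice(n_types, size=n_tokens, p=probs)  # word ids; 0 is most frequent

tokens = zipf_token_stream()
```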

Step 1: Form the Model Lexicon
Each word is initialized with a unique representation. During lexicon formation, observed tokens are encoded by consulting knowledge, features from the local semantic context are added to the relational features of the trace, and the new augmented trace is added to the word's current semantic matrix (a sketch of this loop follows).
In this process, things that co-occur gradually begin to share relational features: high-frequency words grow more similar to one another, while low-frequency words remain relatively unique.
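A simplified sketch of this loop; form_lexicon, the caller-supplied encode_token function, and the fixed-size context window are assumptions made for illustration, not the authors' implementation.

```python
# Lexicon formation: encode each token from current knowledge, augment it with
# features from the recent local context, and fold it back into that word's matrix.
import random
from collections import defaultdict, deque

def form_lexicon(token_stream, encode_token, window=5, n_context=2):
    """encode_token(word_matrix, word) -> list of feature values (caller-supplied)."""
    lexicon = defaultdict(lambda: defaultdict(lambda: defaultdict(float)))
    context = deque(maxlen=window)            # features seen in the recent past
    for word in token_stream:
        trace = encode_token(lexicon[word], word)
        extra = random.sample(list(context), min(n_context, len(context)))
        trace = list(trace) + extra           # add relational features from context
        for fi in trace:                      # fold the augmented trace back in
            for fj in trace:
                if fi != fj:
                    lexicon[word][fi][fj] += 1.0
        context.extend(trace)
    return lexicon
```

Because words that co-occur draw context features from the same recent pool, their matrices gradually come to share relational features, which is what makes high-frequency words grow similar.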

Step 2: Encode Episodic Traces
Episodic traces are encoded from the knowledge network, and traces are stored for all words on the list.
At test, a probe is encoded as another episodic trace. The response depends on a likelihood calculation (Step 3): the probability of each stored trace having been produced by that probe.

Step 3: Calculate Likelihood and Make a Response
Essence: compute the probability that a stored trace was produced by the probe (as opposed to coming from somewhere else) by comparing values feature by feature.
Matching features: λ = [c · p(cat, feat) + (1 − c) · p_br(cat, feat)] / p_br(cat, feat)
Mismatching features: λ = [c · 0 + (1 − c) · p_br(cat, feat)] / p_br(cat, feat) = 1 − c
Here c is the probability that a feature was copied correctly into the trace, p(cat, feat) is the probability the probe's knowledge assigns to the stored value, and p_br(cat, feat) is its base-rate probability.
If the combined likelihood is high enough, make a positive response.
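A sketch of this decision rule in code, following the likelihood-ratio form above; the variable names (p_probe, base_rate) and the multiplicative combination over features are assumptions in the spirit of REM-style models rather than a transcription of the authors' implementation.

```python
# Per-feature likelihood ratios combined into an odds value for one stored trace.
def feature_likelihood(match, p_probe, base_rate, c=0.7):
    """match: does the stored value equal the probe's value?
    p_probe: probability the probe's knowledge assigns to the stored value.
    base_rate: chance probability of that value. c: prob. of correct copying."""
    if match:
        return (c * p_probe + (1 - c) * base_rate) / base_rate
    return 1 - c   # (c*0 + (1-c)*base_rate) / base_rate

def trace_odds(comparisons, c=0.7):
    """comparisons: list of (match, p_probe, base_rate) for the stored features."""
    odds = 1.0
    for match, p_probe, base_rate in comparisons:
        odds *= feature_likelihood(match, p_probe, base_rate, c)
    return odds

def respond_old(comparisons, criterion=1.0):
    return trace_odds(comparisons) > criterion
```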

Recognition Memory Demonstration
Task: present a 40-item study list composed of high- and low-frequency words. Later, present 20 old words and 20 new words; the participant says "Old" or "New" to each.
Typical finding: low-frequency words are remembered better, with higher hit rates and lower false-alarm rates.
In the model, lexicon formation makes high-frequency words more similar to one another.
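A toy scoring sketch for this demonstration; decide(word, studied) is a hypothetical stand-in for encoding the probe and applying the Step 3 likelihood rule, and the frequency classes are just labels carried along with each probe.

```python
# Tally hit and false-alarm rates separately for high- and low-frequency probes.
def score_recognition(probes, decide):
    """probes: iterable of (word, freq_class, was_studied); decide returns True for 'Old'."""
    counts = {}
    for word, freq, studied in probes:
        d = counts.setdefault(freq, {'hit': 0, 'old': 0, 'fa': 0, 'new': 0})
        said_old = decide(word, studied)
        if studied:
            d['old'] += 1
            d['hit'] += int(said_old)
        else:
            d['new'] += 1
            d['fa'] += int(said_old)
    return {freq: {'hit_rate': d['hit'] / max(d['old'], 1),
                   'fa_rate': d['fa'] / max(d['new'], 1)}
            for freq, d in counts.items()}
```

The mirror pattern would appear as a higher hit rate and a lower false-alarm rate for the low-frequency class.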

Mirror Frequency Effect

Model Summary
Knowledge is represented as a feature-based co-occurrence matrix, and concepts that co-occur grow more similar by sharing features. Together these mechanisms form semantic spaces, capture the multiple senses and meanings of a concept, account for the frequency effect in recognition memory, and allow encoding biases to be explored.
The model unites previously independent approaches to memory and knowledge, bringing new insights to both.

Applications
Semantic spaces; frequency effects in episodic memory; biases in encoding; priming; the Implicit Association Test (consistent correlations in the environment embed multiple connotations in concepts); text comprehension and disambiguation; the Whorfian hypothesis; corpus analysis.