False Memory Phenomena
James Raftery
Psy/Orf 322, Spring 2004

Can we trust our memory?
Victim testimony
"Repressed" abuse in therapy

What is the issue?
Memory as a constructive process
A coherent memory is assembled from:
–Bits of actual memory
–Context
–Repetition

False Memory Experiment
Quantitative measurement
–Inherent difficulty: simple, measurable tasks produce little error
Reproductive vs. reconstructive recall?
–Henry Roediger and Kathleen McDermott's experiments

Method
Lists of words built around a "critical lure"
–Bidirectional associations between the list words and the lure
42-item lists read aloud
Subjects wrote down as many words as they could remember, without 'guessing'
At the end, subjects judged whether they had seen certain words on the list

Experiment Results
Critical lure falsely recalled 55% of the time (higher than in Experiment 1)
The recall condition enhanced the effect
Subjects reported physical memories of the lures

Implications
False memory is quantifiable
All memory is constructive, not reproductive – even rote recall
Effects were substantial in a lab setting; in a traumatic situation they are likely even more powerful

Parallel Distributed Processing (PDP)
Experiences are represented in the mind by their several aspects (within descriptive categories)
Different experiences share aspects
The connection between two aspects is strengthened when both are activated, as in the sketch below
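
A minimal Hebbian sketch of this storage rule. The aspect count, learning rate, and example experiences are illustrative assumptions, not taken from the slides: each experience is a binary vector over aspects, and the link between two aspects grows whenever both are active.

```python
# Hebbian sketch of PDP-style storage (illustrative assumptions throughout).
import numpy as np

N_ASPECTS = 8        # size of the aspect vocabulary (assumed)
LEARNING_RATE = 0.1  # strengthening per co-activation (assumed)

weights = np.zeros((N_ASPECTS, N_ASPECTS))

def store(experience):
    """Strengthen w[i, j] whenever aspects i and j are active together."""
    global weights
    weights += LEARNING_RATE * np.outer(experience, experience)
    np.fill_diagonal(weights, 0.0)  # aspects do not connect to themselves

# Two experiences that share aspects 2 and 3:
store(np.array([1, 1, 1, 1, 0, 0, 0, 0], dtype=float))
store(np.array([0, 0, 1, 1, 1, 1, 0, 0], dtype=float))
```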

Dynamic Nature of PDP Memory
Later events alter the connections
When one aspect is used as a trigger for a memory, it activates all the other aspects it is connected to
In the absence of one strong connection, several weak connections can assert themselves, giving rise to confusion or generalization, as the retrieval sketch below shows
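
Continuing the sketch above (same `weights` and `N_ASPECTS`), retrieval can be modeled as spreading activation from a cue vector through the learned connections. Cueing a shared aspect weakly activates parts of both stored experiences at once, which is the blending the slide describes; the propagation rule here is an assumed, simplified one.

```python
def recall(cue, steps=1):
    """Spread activation from a cue through the learned connections."""
    activation = cue.copy()
    for _ in range(steps):
        activation = activation + weights @ activation
    return activation

# Cueing only aspect 2, which both stored experiences share, weakly
# activates aspects of *both* experiences -- a blend that reads as
# generalization (or confusion) rather than a single crisp memory.
cue = np.zeros(N_ASPECTS)
cue[2] = 1.0
print(recall(cue))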

False Memory Simulator
Purpose: to replicate Roediger and McDermott's results with a neural network
All words in the list are related to each other
Words primarily activate themselves, but to some extent also activate similar words
If a word (the critical lure) reaches a certain threshold of excitation, it registers as having been on the list

Implementation
Each word has a weight with every other word on the list
–Weights are stored in a matrix
–A strong weight associates a word with itself
–Putting a vector of input words through the matrix determines the net excitation of every word
–If a word's excitation reaches the threshold, the network outputs it as a recalled word, as sketched below
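
A minimal sketch of this recall network, under assumptions the slides leave open: the association strength (0.25), the threshold (1.0), and the six-word vocabulary are invented for illustration, with the words drawn from Roediger and McDermott's well-known "sleep" list.

```python
# Sketch of the word-list recall network (parameters are assumptions).
import numpy as np

words = ["bed", "rest", "awake", "tired", "dream", "sleep"]
n = len(words)                   # "sleep" is the critical lure, never studied

W = np.full((n, n), 0.25)        # weak association between related words (assumed)
np.fill_diagonal(W, 1.0)         # strong weight between a word and itself

studied = np.array([1, 1, 1, 1, 1, 0], dtype=float)  # input vector of list words

excitation = W @ studied         # net excitation of every word
THRESHOLD = 1.0                  # recall threshold (assumed)

recalled = [w for w, x in zip(words, excitation) if x >= THRESHOLD]
print(recalled)                  # includes "sleep", which was never presented
```

Each studied word contributes a small excitation to the lure, so the lure's summed input (5 × 0.25 = 1.25) crosses the threshold even though it was never presented, qualitatively reproducing the false recall result.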