Chapter 8: Recurrent associative networks and episodic memory

Chapter 8: Recurrent associative networks and episodic memory (Fundamentals of Computational Neuroscience, Dec 09)

Memory classification scheme (Squire)

Auto-associative memory and the hippocampus. David Marr: "Simple memory: a theory for archicortex" (1971)

Point attractor neural network (ANN)
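
As an illustration (not part of the original slides), the following Python sketch shows a point attractor network of the Hopfield type: random binary patterns are imprinted with a Hebbian (outer-product) rule, and a corrupted cue relaxes to the stored fixed point under asynchronous threshold updates. Network size, pattern count, and noise level are arbitrary choices.

import numpy as np

rng = np.random.default_rng(0)
N, P = 100, 5                              # nodes and stored patterns (arbitrary sizes)
xi = rng.choice([-1, 1], size=(P, N))      # random +/-1 patterns

# Hebbian (outer-product) imprinting with zero self-coupling
W = (xi.T @ xi).astype(float) / N
np.fill_diagonal(W, 0.0)

# cue: stored pattern 0 with 20% of its bits flipped
s = xi[0].copy()
s[rng.choice(N, size=N // 5, replace=False)] *= -1

# asynchronous threshold updates until no node changes
for _ in range(50):
    changed = False
    for i in rng.permutation(N):
        new = 1 if W[i] @ s >= 0 else -1
        if new != s[i]:
            s[i], changed = new, True
    if not changed:
        break

print("overlap with the stored pattern:", s @ xi[0] / N)   # ~1.0 means recalled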

ann_cont.m
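
ann_cont.m refers to a MATLAB script whose code is not included in this transcript. As a stand-in, here is a hedged Python sketch of the kind of continuous-time simulation such a script demonstrates: leaky-integrator dynamics tau du/dt = -u + W tanh(beta u) with Hebbian weights, integrated with a simple Euler scheme from a noisy cue. The time constant, step size, and gain beta are arbitrary choices, not the book's parameters.

import numpy as np

rng = np.random.default_rng(1)
N, P = 100, 3
xi = rng.choice([-1, 1], size=(P, N))
W = (xi.T @ xi).astype(float) / N                 # Hebbian weights
np.fill_diagonal(W, 0.0)

tau, dt, beta = 10.0, 0.1, 4.0                    # time constant, Euler step, gain
u = 0.1 * xi[0] + 0.3 * rng.standard_normal(N)    # noisy cue as the initial state

for _ in range(int(200.0 / dt)):                  # integrate tau du/dt = -u + W tanh(beta u)
    r = np.tanh(beta * u)                         # firing rates
    u = u + dt / tau * (-u + W @ r)

print("final overlap with pattern 0:", np.tanh(beta * u) @ xi[0] / N)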

ann_fixpoint.m
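
ann_fixpoint.m is likewise a MATLAB script that is not reproduced in the transcript. A hedged Python sketch of the fixed-point analysis behind it: iterate the mean-field equation m = tanh(beta m) for several gains beta; non-trivial memory states (m != 0) exist only for beta > 1, which is the bifurcation such an analysis exposes.

import numpy as np

def fixed_point(beta, m0=0.5, iters=2000):
    """Iterate m <- tanh(beta * m) to a self-consistent overlap."""
    m = m0
    for _ in range(iters):
        m = np.tanh(beta * m)
    return m

for beta in (0.5, 1.0, 1.5, 2.0, 4.0):
    print(f"beta = {beta:3.1f}  ->  stable overlap m = {fixed_point(beta):.4f}")
# for beta <= 1 the only fixed point is m = 0 (convergence is slow right at beta = 1);
# for beta > 1 a pair of non-zero memory states m = +/- m* appears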

Memory breakdown
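
To see how memory breaks down under overloading, a hedged numerical sketch: store increasingly many random patterns in the Hopfield-style network above and measure the retrieval overlap for an intact cue. Retrieval collapses once the load P/N approaches the critical capacity of this model (around 0.14). Network size, loads, and sweep count are arbitrary.

import numpy as np

rng = np.random.default_rng(2)
N = 200

def retrieval_overlap(P, sweeps=5):
    """Store P random patterns, cue with an intact pattern, return the final overlap."""
    xi = rng.choice([-1, 1], size=(P, N))
    W = (xi.T @ xi).astype(float) / N
    np.fill_diagonal(W, 0.0)
    s = xi[0].copy()                 # even a perfect cue degrades once crosstalk is too strong
    for _ in range(sweeps):          # asynchronous threshold updates
        for i in rng.permutation(N):
            s[i] = 1 if W[i] @ s >= 0 else -1
    return s @ xi[0] / N

for P in (5, 10, 20, 30, 40, 60):
    print(f"load P/N = {P/N:.2f}   retrieval overlap = {retrieval_overlap(P):.2f}")
# overlaps stay close to 1 at low loads and degrade sharply once the load
# approaches the critical capacity of this model (alpha_c around 0.14)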

Probabilistic RNN
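
A hedged sketch of the probabilistic version of the network: each node fires +1 with a sigmoidal probability of its net input (Glauber dynamics), with the inverse temperature beta setting the noise level; beta -> infinity recovers the deterministic threshold network. All values chosen here are arbitrary.

import numpy as np

rng = np.random.default_rng(3)
N, P, beta = 200, 5, 4.0                   # beta = inverse 'temperature' (arbitrary)
xi = rng.choice([-1, 1], size=(P, N))
W = (xi.T @ xi).astype(float) / N
np.fill_diagonal(W, 0.0)

s = xi[0].copy()
for _ in range(20):                        # stochastic (Glauber) sweeps
    for i in rng.permutation(N):
        p_plus = 1.0 / (1.0 + np.exp(-2.0 * beta * (W[i] @ s)))   # P(s_i = +1)
        s[i] = 1 if rng.random() < p_plus else -1

print("noisy-retrieval overlap:", s @ xi[0] / N)   # below 1 because of stochastic flips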

Phase diagram

Noisy weights and diluted attractor networks
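
A hedged sketch of dilution: randomly delete a growing fraction of the Hebbian synapses and test recall from a corrupted cue. The overlap degrades only gradually, the graceful degradation these networks are known for. Sizes and dilution levels are arbitrary.

import numpy as np

rng = np.random.default_rng(6)
N, P = 200, 5
xi = rng.choice([-1, 1], size=(P, N))
W_full = (xi.T @ xi).astype(float) / N
np.fill_diagonal(W_full, 0.0)

def recall_overlap(W, sweeps=5):
    """Recall pattern 0 from a 20%-corrupted cue with asynchronous updates."""
    s = xi[0].copy()
    s[rng.choice(N, size=N // 5, replace=False)] *= -1
    for _ in range(sweeps):
        for i in rng.permutation(N):
            s[i] = 1 if W[i] @ s >= 0 else -1
    return s @ xi[0] / N

for dilution in (0.0, 0.25, 0.5, 0.75):
    W = W_full * (rng.random((N, N)) >= dilution)   # delete a random fraction of synapses
    print(f"synapses removed: {dilution:.0%}   recall overlap = {recall_overlap(W):.2f}")
# recall degrades only gradually as more and more connections are removed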

How to minimize interference between patterns?

Expansion re-coding (e.g. dentate gyrus)
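
A hedged sketch of expansion re-coding in the spirit of the dentate gyrus analogy: project dense inputs through random weights into a much larger layer and keep only the k most active units (a sparse, high-dimensional code), which makes similar inputs more separable. Layer sizes and the sparseness level are arbitrary choices.

import numpy as np

rng = np.random.default_rng(4)
n_in, n_out, k = 50, 500, 25               # expand 50 -> 500 units, keep 25 active (5%)
M = rng.standard_normal((n_out, n_in))     # random expansion weights

def expand(x):
    """Random projection followed by a k-winners-take-all threshold."""
    h = M @ x
    y = np.zeros(n_out)
    y[np.argsort(h)[-k:]] = 1.0            # only the k most excited units fire
    return y

def similarity(u, v):
    return (u @ v) / (np.linalg.norm(u) * np.linalg.norm(v))

a = rng.choice([-1.0, 1.0], size=n_in)     # two highly similar dense inputs ...
b = a.copy()
b[:5] *= -1

print("input similarity   :", similarity(a, b))
print("expanded similarity:", similarity(expand(a), expand(b)))   # typically clearly lower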

Sparse patterns
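
A hedged sketch of storing sparse 0/1 patterns with the covariance form of the Hebbian rule, w_ij proportional to sum_mu (xi_i^mu - a)(xi_j^mu - a), where a is the mean activity; recall then needs an explicit firing threshold. The activity level, threshold, and sizes here are arbitrary and hand-tuned for the demonstration.

import numpy as np

rng = np.random.default_rng(5)
N, P, a = 200, 10, 0.1                     # a = mean activity (10% of units active)
xi = np.zeros((P, N))
for mu in range(P):                        # sparse 0/1 patterns with exactly a*N active units
    xi[mu, rng.choice(N, size=int(a * N), replace=False)] = 1.0

# covariance rule for sparse patterns
W = (xi - a).T @ (xi - a) / (N * a * (1 - a))
np.fill_diagonal(W, 0.0)

# recall pattern 0 from a cue with some active units missing
s = xi[0].copy()
s[rng.choice(np.flatnonzero(s), size=5, replace=False)] = 0.0
theta = 0.5                                # firing threshold (hand-tuned)
for _ in range(10):
    s = (W @ s > theta).astype(float)

print("fraction of units recalled correctly:", np.mean(s == xi[0]))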

Three nodes with positive and negative coupling: Lorenz attractor
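
For the chaotic example, a hedged sketch integrating the Lorenz system with its standard parameters (sigma = 10, rho = 28, beta = 8/3), which the slide relates to a small network of three coupled nodes; the Euler step size and run length are arbitrary.

import numpy as np

# Lorenz system: dx/dt = sigma (y - x), dy/dt = x (rho - z) - y, dz/dt = x y - beta z
sigma, rho, beta = 10.0, 28.0, 8.0 / 3.0
dt, steps = 0.01, 10000

state = np.array([1.0, 1.0, 1.0])
trajectory = np.empty((steps, 3))
for t in range(steps):
    x, y, z = state
    state = state + dt * np.array([sigma * (y - x),
                                   x * (rho - z) - y,
                                   x * y - beta * z])
    trajectory[t] = state

# the trajectory stays bounded but never settles into a fixed point or simple cycle
print("range of x over the run:", trajectory[:, 0].min(), trajectory[:, 0].max())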

Cohen-Grossberg theorem
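
For reference, the Cohen-Grossberg (1983) theorem covers dynamics of the form dx_i/dt = a_i(x_i) [ b_i(x_i) - sum_j c_ij d_j(x_j) ] with symmetric coupling c_ij = c_ji, non-negative a_i, and monotonically non-decreasing d_j. Under these conditions the following function (written here in LaTeX) is non-increasing along trajectories, so the network can only relax to point attractors:

E = -\sum_i \int^{x_i} b_i(\xi)\, d_i'(\xi)\, \mathrm{d}\xi
    + \frac{1}{2} \sum_{j,k} c_{jk}\, d_j(x_j)\, d_k(x_k),
\qquad
\frac{dE}{dt} = -\sum_i a_i(x_i)\, d_i'(x_i)
    \Big( b_i(x_i) - \sum_j c_{ij}\, d_j(x_j) \Big)^{2} \le 0 .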

Recurrent networks with non-symmetric weights
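
With non-symmetric weights no such energy function is guaranteed and the dynamics need not converge to a fixed point. As an illustration (not from the slides), a hedged two-node Python sketch with self-excitation and antisymmetric cross-coupling that settles into a sustained oscillation instead of a point attractor; all parameters are arbitrary.

import numpy as np

W = np.array([[1.5,  2.0],
              [-2.0, 1.5]])                # non-symmetric coupling: w_12 = -w_21
tau, dt = 10.0, 0.1
u = np.array([0.5, 0.0])

x1 = []
for _ in range(5000):                      # leaky-integrator dynamics tau du/dt = -u + W tanh(u)
    u = u + dt / tau * (-u + W @ np.tanh(u))
    x1.append(u[0])

x1 = np.array(x1[2000:])                   # discard the initial transient
print(f"node-1 activity keeps oscillating between {x1.min():.2f} and {x1.max():.2f}")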

Further readings