Week 5 NETWORKS of NEURONS and ASSOCIATIVE MEMORY Wulfram Gerstner EPFL, Lausanne, Switzerland 5.1 Introduction - networks of neurons - systems for computing



Biological Modeling of Neural Networks, Week 5: NETWORKS of NEURONS and ASSOCIATIVE MEMORY
Wulfram Gerstner, EPFL, Lausanne, Switzerland
5.1 Introduction - networks of neurons - systems for computing - associative memory
5.2 Classification by similarity
5.3 Detour: Magnetic Materials
5.4 Hopfield Model
5.5 Learning of Associations
5.6 Storage Capacity
Reading for week 5: NEURONAL DYNAMICS, Ch., Cambridge Univ. Press

5.1 Memory
Examples: a president, the first day at EPFL, an apple. Our memory has multiple aspects: recent and far-back; events, places, facts, concepts.

5.1 Systems for computing and information processing
Computer: von Neumann architecture - 1 CPU (about 10^9 transistors), with separate memory and input.
Brain: distributed architecture - about 10^10 processing elements (neurons), with no separation of processing and memory.

5.1 Systems for computing and information processing
One cubic millimeter of cortex contains more than 10^4 neurons and about 3 km of "wire" (axons and dendrites).

5.1 Systems for computing and information processing
Brain: distributed architecture - about 10^10 neurons with roughly 10^4 connections per neuron; no separation of processing and memory. One cubic millimeter of cortex contains more than 10^4 neurons and about 3 km of wire.

5.1 Associations, associative memory
Read this text NOW!
I find it rea*l* amazin* t*at y*u ar* abl* to re*d t*is tex* despit* th* fac* *hat more t*an t*ent* perc*n* of t** char*cte*s a*e mis*ing. *his mean* t*at you* brai* i* abl* ** fill in missin* info*matio*

5.1 Associations, associative memory
Noisy word: pattern completion / word recognition. Given the input brai*, compare with a list of stored words (atom, brave, brain, brass) and output the closest one: brain. Your brain fills in missing information: "associative memory".
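The word-completion idea on this slide can be sketched in a few lines of code. This is my own illustration (the word list follows the slide; the scoring rule, treating `*` as a wildcard, is an assumption):

```python
# Associative word completion: return the stored word that best
# matches a noisy cue in which '*' marks a missing character.
WORDS = ["brain", "atom", "brave", "brass"]

def similarity(noisy, word):
    """Count positions where the cue agrees with the word ('*' matches anything)."""
    if len(noisy) != len(word):
        return -1  # different length: not a candidate
    return sum(a == b or a == '*' for a, b in zip(noisy, word))

def complete(noisy):
    """Output the closest stored word (the 'prototype')."""
    return max(WORDS, key=lambda w: similarity(noisy, w))

print(complete("brai*"))  # -> brain
```

This "compare with all prototypes, output the closest" loop is exactly the computation the Hopfield network later performs collectively, without an explicit stored list.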


5.2 Classification by similarity: pattern recognition
Classification: comparison with prototypes (e.g. the letters A, B, T, Z). A noisy image is assigned to one of the prototypes.

5.2 Classification by similarity: pattern recognition
Classification by closest prototype: the noisy image is assigned to the prototype it is most similar to (blackboard).

5.2 Pattern recognition and pattern completion
Aim: understand associative memory. Associative memory / collective computation - brain-style computation: noisy image to full image; partial word to full word.

Quiz 5.1: Connectivity
A typical neuron in the brain makes connections
- to 6-20 neighbors
- to neurons nearby
- to more than 1000 neurons nearby
- to more than 1000 neurons nearby or far away.
In a typical crystal in nature, each atom interacts
- with 6-20 neighbors
- with atoms nearby
- with more than 1000 atoms nearby
- with more than 1000 atoms nearby or far away.


5.3 Detour: magnetism (figure: bar magnet with S and N poles)

5.3 Detour: magnetism
Noisy magnet vs. pure magnet.

5.3 Detour: magnetism
Elementary magnet: spin S_i = +1 or S_i = -1 (blackboard: example).
Dynamics: each spin aligns with the sum over all interactions with i.
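The "dynamics" referred to on the slide is the standard spin update, the same rule used for the Hopfield model later in the lecture:

```latex
S_i(t+1) = \operatorname{sgn}\Big( \sum_j w_{ij}\, S_j(t) \Big)
```

where the sum runs over all elements j interacting with i, and w_ij encodes the interaction strength.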

5.3 Detour: magnetism
Elementary magnet: S_i = +1 or S_i = -1 (blackboard).
Ferromagnet: w_ij = +1; anti-ferromagnet: w_ij = -1.
Dynamics: sum over all interactions with i.

5.3 Magnetism and memory patterns
Elementary pixel: S_i = +1 or S_i = -1. The weights w_ij = +1 or w_ij = -1 are chosen so that the stored pattern is stable (blackboard).
Dynamics: sum over all interactions with i.
Hopfield model: several patterns - next section.
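A minimal simulation of one stored pattern (my own sketch, not the lecture's code): weights w_ij = p_i p_j and sign dynamics, with N = 9 "pixels" as in the exercise below.

```python
import numpy as np

# One stored prototype p; Hebbian-style weights w_ij = p_i * p_j.
rng = np.random.default_rng(0)
N = 9
p = rng.choice([-1, 1], size=N)   # prototype of 9 pixels (+1 / -1)
W = np.outer(p, p)                # w_ij = p_i p_j
np.fill_diagonal(W, 0)            # no self-interaction

# Noisy cue: flip two pixels, then run the sign dynamics
# S_i(t+1) = sgn(sum_j w_ij S_j(t)).
S = p.copy()
S[:2] *= -1
for _ in range(5):
    S = np.sign(W @ S)

print(np.array_equal(S, p))  # -> True: the pattern is restored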

Exercise 1: Associative memory (1 pattern)
Elementary pixel: S_i = +1 or S_i = -1; weights w_ij = +1; 9 neurons.
- Define appropriate weights.
- What happens if one neuron is wrong?
- What happens if n neurons are wrong?
Dynamics: sum over all interactions with i. (Exercises: later. Next lecture at 10h15.)


5.4 Hopfield Model of Associative Memory
Hopfield model: prototypes p^1, p^2, ...
Interactions: w_ij is a sum over all prototypes (blackboard: products p_i p_j).
Dynamics: sum over all interactions with i.

5.4 Hopfield Model of Associative Memory (J. Hopfield, 1982)
DEMO: random patterns, fully connected network.
Interactions: sum over all prototypes; dynamics: all interactions with i.
This rule works very well for random patterns; it does not work well for correlated patterns.
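The demo can be reproduced with a short script. This is a sketch under my own choice of sizes (N = 200 neurons, M = 10 random patterns, 30 corrupted pixels); the rule itself, w_ij = (1/N) sum_mu p_i^mu p_j^mu, is the Hopfield (1982) rule from the slide.

```python
import numpy as np

rng = np.random.default_rng(1)
N, M = 200, 10
P = rng.choice([-1, 1], size=(M, N))  # M random prototypes, rows of P
W = (P.T @ P) / N                     # Hebbian weights: sum over prototypes
np.fill_diagonal(W, 0)                # no self-interaction

# Noisy cue: pattern 3 with 30 of 200 pixels flipped.
cue = P[3].copy()
flip = rng.choice(N, size=30, replace=False)
cue[flip] *= -1

# Run the sign dynamics (ties broken toward +1 to avoid zeros).
S = cue
for _ in range(10):
    S = np.where(W @ S >= 0, 1, -1)

overlaps = (P @ S) / N               # overlap of the final state with each prototype
print(int(np.argmax(overlaps)))      # should print 3: pattern 3 is retrieved
```

At this load (M/N = 0.05, well below capacity) the corrupted cue relaxes back to the stored pattern; with correlated patterns the cross-terms in W no longer average out and retrieval degrades.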

5.4 Hopfield Model of Associative Memory
Overlap (blackboard).
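The overlap developed on the blackboard is, in the standard notation for the Hopfield model, the normalized similarity between the current network state and pattern mu:

```latex
m^{\mu}(t) = \frac{1}{N} \sum_{i=1}^{N} p_i^{\mu}\, S_i(t)
```

It ranges from -1 to +1 and equals +1 exactly when the state S(t) matches pattern p^mu pixel by pixel.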

5.4 Hopfield Model of Associative Memory
The Hopfield model finds the closest prototype, i.e. the one with maximal overlap (similarity). Interacting neurons perform the computation without a CPU and without an explicit memory unit.


Exercise 2.1 (now), then start with 2.2
Assume 4 patterns. At time t = 0 there is overlap with pattern 3 and no overlap with the other patterns. Discuss the temporal evolution of the overlaps (assume that the patterns are orthogonal). Dynamics: sum over all interactions with i. (Next lecture at 11h15.)


5.5 Learning of Associations
Where do the connections come from? Hebbian learning (presynaptic neuron j, postsynaptic neuron i):
"When an axon of cell j repeatedly or persistently takes part in firing cell i, then j's efficiency as one of the cells firing i is increased." (Hebb, 1949)
- local rule
- simultaneously active (correlations)

5.5 Hebbian Learning of Associations

5.5 Hebbian Learning of Associations: item memorized.

5.5 Hebbian Learning: Associative Recall
Recall from partial information: the item is recalled.

5.5 Learned concepts
Activity of neurons in the human brain. Image: Neuronal Dynamics, Gerstner et al., Cambridge Univ. Press (2014); adapted from Quiroga et al. (2005), Nature 435.

5.5 Associative Recall
Tell me the object shape for the following list of 5 items. Tell me the color for the following list of 5 items. Be as fast as possible (timed).

5.5 Associative Recall
Tell me the color of the ink, as fast as possible, for the following list of 5 items: Red, Blue, Yellow, Green, Red.
Stroop effect: slow response - it is hard to work against natural associations.

5.5 Associative Recall
Hierarchical organization of associative memory: animals - birds, fish, ...
Name as fast as possible an example of a bird: swan (or goose or raven or ...). Write down the first letter: s for swan, r for raven, ...

5.5 Associative Recall
Name as fast as possible an example of a: tool, color, musical instrument, fruit - hammer, red, violin, apple.


5.6 Learning of several prototypes
Interactions (1): w_ij is a sum over all prototypes p^1, p^2, ...
Dynamics: all interactions with i.
Question: how many prototypes can be stored?

5.6 Storage capacity: how many prototypes can be stored?
Interactions (1): sum over all prototypes; dynamics (2); random patterns.
Minimal condition: each pattern is a fixed point of the dynamics.
- Assume we start directly in one pattern (say pattern mu).
- The pattern must stay.
Attention: retrieval requires more than this (pattern completion). (Blackboard.)

Q: How many prototypes can be stored?
A: If too many prototypes are stored, errors (wrong pixels) show up. The number of prototypes M that can be stored is proportional to the number of neurons N; the memory load is M/N. The interference from the other patterns acts as Gaussian noise, so retrieval is error-free only if the load M/N is small enough (blackboard).
Image: Neuronal Dynamics, Gerstner et al., Cambridge Univ. Press (2014).
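The blackboard calculation behind this slide is the usual signal-to-noise argument; the following sketch states the standard result (it is not transcribed from the slide). Starting in pattern nu, the input to neuron i splits into a signal and a noise term:

```latex
h_i = \sum_j w_{ij}\, p_j^{\nu}
    = \frac{1}{N} \sum_j \sum_{\mu} p_i^{\mu} p_j^{\mu} p_j^{\nu}
    = \underbrace{p_i^{\nu}}_{\text{signal}}
    \;+\; \underbrace{\frac{1}{N} \sum_j \sum_{\mu \neq \nu} p_i^{\mu} p_j^{\mu} p_j^{\nu}}_{\text{noise}}
```

For M random patterns the noise is a sum of roughly M N independent terms of size 1/N, hence approximately Gaussian with variance M/N. A pixel flips when the noise overwhelms the signal of size 1, which happens with probability (1/2) erfc(sqrt(N/(2M))): this is why the ratio M/N controls the error rate, as Exercise 3 works out.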

Exercise 3 (now): Associative memory
Q: How many prototypes can be stored? Random patterns - random walk.
a) Show the relation to the erf function: importance of p/N.
b) Network of 1000 neurons: how many patterns if we allow at most 1 wrong pixel?
c) Network of N neurons: at most 1 per mille wrong pixels?
Interactions (1), dynamics (2), prototypes p^1, p^2. (End of lecture; exercise + computer exercise: 12:00.)

The end