4. Associators and Synaptic Plasticity
Lecture Notes on Brain and Computation
Byoung-Tak Zhang
Biointelligence Laboratory, School of Computer Science and Engineering
Graduate Programs in Cognitive Science, Brain Science and Bioinformatics
Brain-Mind-Behavior Concentration Program
Seoul National University
This material is available online at
Fundamentals of Computational Neuroscience, T. P. Trappenberg, 2010.

Outline
- Associative memory and Hebbian learning
- The physiology and biophysics of synaptic plasticity
- Mathematical formulation of Hebbian plasticity
- Synaptic scaling and weight distributions

Associative memory and Hebbian learning
Finding the general principles of brain development is one of the major scientific quests in neuroscience. Not all characteristics of the brain can be specified by a genetic code:
- The number of genes would certainly be too small to specify all the details of the brain's networks.
- It is also advantageous that not all brain functions are specified genetically, so that the organism can adapt to particular circumstances in its environment.
Adapting synaptic efficiencies (a learning algorithm) is an important adaptation mechanism that is thought to form the basis of building associations.

Hebbian learning
Donald O. Hebb, The Organization of Behavior (1949):
"When an axon of cell A is near enough to excite a cell B and repeatedly or persistently takes part in firing it, some growth process or metabolic change takes place in one or both cells such that A's efficiency, as one of the cells firing B, is increased."
- Brain mechanisms and how they can be related to behavior
- Cell assemblies
- The details of synaptic plasticity
- Experimental results and evidence for Hebbian learning

Associations (1)
Computer memory:
- Information is stored in magnetic or other physical form.
- A memory address is used for recall.
- Natural systems cannot work with such demanding precision.
Human memory:
- Recalls vivid memories of events from small details.
- Learns associations and triggers memories based on related information.
- Partial information can be sufficient to recall a memory.
Associative memory is the basis for many cognitive functions.

Associations (2)
Fig. 4.1 A simplified neuron that receives a large number of inputs r_i^in.
Fig. 4.2 A network of associative nodes.

Associations (3)
Fig. 4.3 Examples of an associative node that is trained on two feature vectors with a Hebbian-type learning algorithm that increases the synaptic strength by δw = 0.1 each time a presynaptic spike occurs in the same temporal window as a postsynaptic spike (Eq. 4.1).
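As a concrete illustration of the caption above, here is a minimal Python sketch of a threshold associative node trained with the δw = 0.1 Hebbian increment. The threshold value, initial weights, and input patterns are illustrative assumptions, not values from the figure.

```python
import numpy as np

def node_output(w, r_in, theta=1.5):
    """Binary threshold node: fire if the weighted sum of inputs
    exceeds the threshold theta (theta = 1.5 is an assumed value)."""
    return 1 if np.dot(w, r_in) > theta else 0

def hebb_step(w, r_in, r_out, dw=0.1):
    """Hebbian-type update from the Fig. 4.3 caption: increase a synapse
    by dw = 0.1 whenever its presynaptic input is active together with
    a postsynaptic spike."""
    return w + dw * r_out * np.asarray(r_in, dtype=float)

# First two inputs: UCS (initially strong); last two: CS (initially silent).
w = np.array([1.0, 1.0, 0.0, 0.0])
pattern = np.array([1, 1, 1, 1])   # UCS and CS presented together
for _ in range(10):
    w = hebb_step(w, pattern, node_output(w, pattern))

print(node_output(w, np.array([0, 0, 1, 1])))  # 1: CS alone now fires the node
```

After training, the CS synapses have grown enough that the CS alone drives the node, which is the associative (conditioning) effect described on the next slide.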

Hebbian learning in the conditioning framework (1)
The mechanisms of an associator:
- The first stimulus, the odour of the hamburger, is already effective in eliciting a response of the neuron before learning: the unconditioned stimulus (UCS). This is based on the random initial weight distribution.
- For the second input, the visual image of the hamburger, the response of the neuron changes during learning: the conditioned stimulus (CS).
The input vector to the associator is a mixture of UCS and CS, as illustrated in Fig. 4.4A.

Hebbian learning in the conditioning framework (2)
Fig. 4.4 Different models of associative nodes resembling the principal architecture found in biological nervous systems.

Features of associators and Hebbian learning
Pattern completion and generalization:
- Recall from partial input.
- The output node responds to all patterns with a certain similarity to the trained pattern.
Prototypes and extraction of central tendencies:
- The ability to extract central tendencies from a set of examples.
- Noise reduction.
Graceful degradation:
- The loss of some components of the system should not make the system fail completely (fault tolerance).

The physiology and biophysics of synaptic plasticity
The synaptic machinery can change in several ways:
- The number of release sites
- The probability of neurotransmitter release
- The number of transmitter receptors
- The conductance and kinetics of ion channels

Typical plasticity experiments (1)
The experiment by Bliss and Lømo (1973) in the hippocampus:
- LTP: a high-frequency stimulus is applied, and the amplitude of the extracellular field potential (EFP) increases (Fig. 4.5A).
- LTD: a low-frequency stimulus is applied, and the amplitude of the EFP decreases (Fig. 4.5B).
The experiment by Fine and Enoki in hippocampal slices of rats: a high-frequency stimulus is applied and the amplitude of the EPSP increases (Fig. 4.6A).

Typical plasticity experiments (2)
Fig. 4.5

Short-term plasticity
Fig. 4.6

Spike timing-dependent plasticity
Relative changes of EPSC amplitudes (and therefore weights) for different values of the time difference between pre- and postsynaptic spikes:
- Asymmetrical Hebbian plasticity (Fig. 4.7B-C)
- Symmetrical Hebbian plasticity (Fig. 4.7D-E)
Fig. 4.7 Several examples of the schematic dependence of synaptic efficiencies on the temporal relations between pre- and postsynaptic spikes.

The calcium hypothesis and modeling chemical pathways
Fig. 4.8

Mathematical formulation of Hebbian plasticity
Synaptic plasticity is modeled as a change of weight values: the weights are not static but change over time. The variation of the weight values after a time step Δt can be written in discrete fashion as
    w_ij(t + Δt) = w_ij(t) + Δw_ij(t)    (4.2)
The weight changes can depend on various factors:
- Activity-dependent synaptic plasticity: the change depends on the firing times of the pre- and postsynaptic neurons.
- The strength of a synapse can vary only within some interval.

Spike timing-dependent plasticity rules
- General form
- Kernel function
- Additive rule
- Multiplicative rule
A sketch of these components is given below.
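A common textbook form of these rules, sketched with illustrative parameters (the exponential window shape and the values of A+, A-, τ+, τ- below are assumptions, not taken from the slides):

```python
import numpy as np

def stdp_kernel(dt, A_plus=0.01, A_minus=0.011, tau_plus=20.0, tau_minus=20.0):
    """Asymmetric exponential STDP window (parameters illustrative).
    dt = t_post - t_pre in ms: positive dt -> potentiation (LTP),
    negative dt -> depression (LTD)."""
    return np.where(dt > 0,
                    A_plus * np.exp(-dt / tau_plus),
                    -A_minus * np.exp(dt / tau_minus))

def additive_update(w, dt, w_min=0.0, w_max=1.0):
    """Additive rule: the weight change is independent of the current
    weight; hard bounds clip w to [w_min, w_max]."""
    return np.clip(w + stdp_kernel(dt), w_min, w_max)

def multiplicative_update(w, dt, w_min=0.0, w_max=1.0):
    """Multiplicative rule: the change is scaled by the distance to the
    bound it moves toward, giving soft bounds."""
    dw = stdp_kernel(dt)
    return w + np.where(dw > 0, dw * (w_max - w), dw * (w - w_min))
```

The additive rule tends to push weights toward the boundaries of the allowed interval (compare the bimodal weight distribution in the STDP simulation slides later), while the multiplicative rule slows changes near the bounds.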

Hebbian learning in population rate models (1)
Hebbian rules can also be formulated for the average behavior of neurons or cell assemblies (population rates).
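A minimal sketch of the basic rate-based Hebbian rule, Δw_ij = ε r_i^out r_j^in, assuming this standard form with a small learning rate ε (values illustrative):

```python
import numpy as np

def hebb_rate_update(W, r_out, r_in, eps=0.01):
    """Basic Hebbian rule for population rates:
    delta w_ij = eps * r_i_out * r_j_in."""
    return W + eps * np.outer(r_out, r_in)

# Example: one update step for a model with 3 output and 4 input populations.
W = np.zeros((3, 4))
W = hebb_rate_update(W, r_out=np.array([1.0, 0.5, 0.0]),
                     r_in=np.array([0.2, 0.8, 0.0, 1.0]))
```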

Hebbian learning in population rate models (2)
Fig. 4.11

Negative weights and crossing synapses
For single neurons, synaptic learning rules should not change the sign of weight values: a synapse stays either excitatory or inhibitory. For population nodes, which summarize mixed excitatory and inhibitory connections, weight values are allowed to cross between domains (change sign).

Synaptic scaling and weight distribution
Repeated application of additive association rules results in runaway (unstable) synaptic values. Real neural systems need some balance and competition between synapses to allow stable learning.

Examples of STDP with spiking neurons (1)
Asymmetric Hebbian rules for spiking neurons.
Fig. (A) Firing rate (decreasing curve) and C_V, the coefficient of variation (increasing and fluctuating curve), of an IF-neuron driven by 1000 excitatory Poisson spike trains while the synaptic efficiencies are changed according to an additive Hebbian rule with asymmetric Gaussian plasticity windows. (B) Distribution of weight values after 5 minutes of simulated training time (similar to the distribution after 3 minutes). The weights were limited to a bounded interval; the distribution has two maxima, one at each boundary of the allowed interval.

Examples of STDP with spiking neurons (2)
The firing time of the IF-neuron is mainly determined by the average input current. This statement can be tested by measuring the cross-correlation function between input and output spike trains.
Fig. Average cross-correlation function between presynaptic Poisson spike trains and the postsynaptic spike train (averaged over all presynaptic spike trains) in a simulation of an IF-neuron with 1000 input channels. The spike trains that lead to the results shown by stars were generated with each weight fixed to a constant value; these cross-correlations are consistent with zero when considered within the variance indicated by the error bars. The squares represent results from simulations of the IF-neuron driven by the same presynaptic spike trains as before, but with the weight matrix after Hebbian learning shown in the previous figure. Some presynaptic spike trains caused postsynaptic spiking, with a positive peak in the average cross-correlation function when the presynaptic spikes precede the postsynaptic spike. No error bars are shown for this curve, for clarity.

Weight distributions in rate models
After training on N_p random patterns with the Hebbian covariance rule (Eq. 4.11), starting from w_ij(0) = 0, rate models of recurrent networks have Gaussian-distributed weight components.
Fig. Normalized histograms of weight values from simulations of a simplified neuron (sigma node) simulating average firing rates, after training with the basic Hebbian learning rule (Eq. 4.11) on exponentially distributed random patterns. A fit of a Gaussian distribution to the data is shown as a solid line.
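A sketch of a Hebbian covariance rule of the kind referenced as Eq. 4.11; the exact form in the book may differ, so treat the function below as an assumption:

```python
import numpy as np

def covariance_update(W, r_out, r_in, r_out_mean, r_in_mean, eps=0.01):
    """Hebbian covariance rule: correlate deviations from the mean rates.
    Because uncorrelated fluctuations average out over many random
    patterns, the resulting weight components are approximately
    Gaussian distributed around zero."""
    return W + eps * np.outer(r_out - r_out_mean, r_in - r_in_mean)
```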

Competitive synaptic scaling and weight decay (1)
The overall synaptic efficiencies depend on the average postsynaptic firing rate. This is crucial for keeping neurons in the regime of high variability and keeping them sensitive for information processing in the nervous system. Many experiments have demonstrated that synaptic efficiencies are scaled by the average postsynaptic activity, and the threshold at which LTP is induced can depend on the time-averaged recent activity of the neuron. Two common model mechanisms:
- Weight normalization
- Weight decay

Competitive synaptic scaling and weight decay (2)
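A sketch of how these two mechanisms can be written; the multiplicative scaling form, the target rate, and the rate constants below are illustrative assumptions rather than the book's exact equations:

```python
import numpy as np

def synaptic_scaling(w, r_avg, r_target=5.0, beta=0.01):
    """Multiplicatively scale all incoming weights so that the
    time-averaged postsynaptic rate r_avg drifts toward a target rate
    (homeostatic regulation): scaling up when the neuron fires too
    little, down when it fires too much."""
    return w * (1.0 + beta * (r_target - r_avg) / r_target)

def weight_decay(w, eps_decay=0.001):
    """Simple weight decay term that competes with Hebbian growth and
    prevents runaway weights."""
    return w - eps_decay * w
```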

Oja's rule and principal component analysis (PCA)
Oja's rule (introduced in the previous section) also efficiently implements an algorithm called PCA. The aim of PCA is to reduce the dimensionality of high-dimensional data.
Fig. 4.15, Fig. 4.16
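A minimal sketch of Oja's rule, Δw = ε r^out (r^in − r^out w), which keeps the weight vector bounded and converges to the first principal component of the input data; the synthetic data and learning rate are illustrative:

```python
import numpy as np

rng = np.random.default_rng(0)
# Zero-mean 2D data with one dominant direction (illustrative).
X = rng.normal(size=(5000, 2)) @ np.array([[3.0, 0.0], [0.0, 0.5]])

w = rng.normal(size=2)
eps = 0.001
for x in X:
    y = w @ x                   # linear output r_out
    w += eps * y * (x - y * w)  # Oja's rule: Hebb term plus built-in decay
# w converges (up to sign) to the unit-norm leading eigenvector of the
# data covariance, i.e. the first principal component; here roughly [±1, 0].
print(w / np.linalg.norm(w))
```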