7. Associators and synaptic plasticity


7. Associators and synaptic plasticity
Fundamentals of Computational Neuroscience, T. P. Trappenberg, 2002.
Lecture Notes on Brain and Computation
Byoung-Tak Zhang, Biointelligence Laboratory
School of Computer Science and Engineering
Graduate Programs in Cognitive Science, Brain Science and Bioinformatics
Brain-Mind-Behavior Concentration Program
Seoul National University
E-mail: btzhang@bi.snu.ac.kr
This material is available online at http://bi.snu.ac.kr/

Outline
7.1 Associative memory and Hebbian learning
7.2 An example of learning associations
7.3 The biochemical basis of synaptic plasticity
7.4 The temporal structure of Hebbian plasticity: LTP and LTD
7.5 Mathematical formulation of Hebbian plasticity
7.6 Weight distributions
7.7 Neuronal response variability, gain control, and scaling
7.8 Features of associators and Hebbian learning

7.1 Associative memory and Hebbian learning
Finding the general principles of brain development is one of the major scientific quests in neuroscience.
Not all characteristics of the brain can be specified by the genetic code:
- The number of genes would certainly be too small to specify all the details of the brain networks
- It is also advantageous that not all brain functions are specified genetically, as this allows adaptation to particular circumstances in the environment
An important adaptation mechanism, thought to form the basis of building associations, is the adaptation of synaptic efficiencies (a learning algorithm).

7.1.1 Synaptic plasticity
Synaptic plasticity is a major key to adaptive mechanisms in the brain.
Artificial neural networks abstract synaptic plasticity:
- Their learning rules are often not biologically realistic
- They learn entirely from experience, as if genetic coding were of minimal importance in brain development
Synaptic plasticity gives a neural network its mechanisms for:
- Self-organization
- Associative abilities

7.1.2 Hebbian learning
Donald O. Hebb, The Organization of Behavior:
“When an axon of cell A is near enough to excite a cell B and repeatedly or persistently takes part in firing it, some growth process or metabolic change takes place in one or both cells such that A’s efficiency, as one of the cells firing B, is increased.”
Hebb’s concerns: brain mechanisms and how they can be related to behavior; cell assemblies.
The details of synaptic plasticity have since been filled in by experimental results and evidence; learning rules of this type are called Hebbian learning.

7.1.3 Associations
Computer memory:
- Information is stored in magnetic or other physical form
- Recall requires the exact memory address
- Natural systems cannot work with such demanding precision
Human memory:
- Recalls vivid memories of events from small details
- Learns associations, so that related information triggers memories
- Partial information can be sufficient to recall memories
Associative memory is the basis for many cognitive functions.

7.1.4 The associative node
7.1.5 The associative network
Fig. 7.1 Associative node and network architecture. (A) A simplified neuron that receives a large number of inputs $r_i^{\mathrm{in}}$. The synaptic efficiency is denoted by $w_i$. The output of the neuron, $r^{\mathrm{out}}$, depends on the particular input stimulus. (B) A network of associative nodes. Each component of the input vector, $r_i^{\mathrm{in}}$, is distributed to each neuron in the network. However, the effect of the input can be different for each neuron, as each individual synapse can have a different efficiency value $w_{ij}$, where $j$ labels the neuron in the network.

7.2 An example of learning associations
(A) The activation of the associative node is given by

$r^{\mathrm{out}} = \Theta\left(\sum_i w_i r_i^{\mathrm{in}} - \theta\right)$, with threshold $\theta = 1.5$  (7.1)

Fig. 7.2 Examples of an associative node that is trained on two feature vectors with a Hebbian-type learning algorithm that increases the synaptic strength by $\delta w = 0.1$ each time a presynaptic spike occurs in the same temporal window as a postsynaptic spike. A minimal sketch of this procedure follows.
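The following Python sketch illustrates this training loop. Only $\delta w = 0.1$ and the threshold of 1.5 come from the slide; the binary feature vectors, the UCS/CS split of the input, and the initial weights are illustrative assumptions.

```python
import numpy as np

theta = 1.5      # firing threshold (eq. 7.1), from the slide
delta_w = 0.1    # weight increment per coincident pre/post spike, from the slide

def output(w, r_in):
    """Binary threshold node: fires when the weighted input sum exceeds theta."""
    return 1 if w @ r_in > theta else 0

def hebb_update(w, r_in, r_out):
    """Hebbian step: strengthen synapses whose presynaptic input is active
    in the same temporal window as a postsynaptic spike."""
    return w + delta_w * r_in * r_out

# Assumed stimuli: the UCS component alone drives the node from the start,
# the CS component initially does not.
ucs = np.array([1, 1, 0, 0])
cs  = np.array([0, 0, 1, 1])
w   = np.array([1.0, 1.0, 0.1, 0.1])   # assumed initial weights

for step in range(20):
    pattern = ucs + cs                  # paired presentation during training
    r_out = output(w, pattern)
    w = hebb_update(w, pattern, r_out)

print("CS alone now drives the node:", output(w, cs) == 1)
```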

7.2.1 Hebbian learning in the conditioning framework
The mechanisms of an associative neuron:
- The first stimulus is already effective in eliciting a response of the neuron before learning, based on the random initial weight distribution: the unconditioned stimulus (UCS)
- For the second input, the response of the neuron changes during learning: the conditioned stimulus (CS)
Fig. 7.3 Different models of associative nodes resembling the principal architecture found in biological nervous systems, such as (A) cortical neurons in mammalian cortex.

7.2.2 Alternative plasticity schemes
Fig. 7.3 Different models of associative nodes resembling the principal architecture found in biological nervous systems, such as (B) Purkinje cells in the cerebellum, which receive strong input from climbing fibers through many hundreds or thousands of synapses. In contrast, the model shown in (C), which utilizes specific input to a presynaptic terminal as is known to exist in invertebrate systems, would have to supply the UCS to all synapses simultaneously in order to achieve the same kind of result as in the previous two models. Such architectures are unlikely to play an important role in cortical processing.

7.2.3 Issues around synaptic plasticity
Storing information with associative learning means:
- Imprinting an event-response pattern
- Recalling the response from partial information about the event
Synaptic plasticity is thought to be the underlying principle behind associative memory, but the learning rules must be formulated more precisely:
- With synaptic potentiation alone, the synaptic efficiencies would become too large, so that the response of the node becomes less specific to the input pattern
- Synaptic depression is therefore needed as well

7.3 The biochemical basis of synaptic plasticity
Activity-dependent synaptic plasticity requires the co-activation of pre- and postsynaptic neurons.
Back-firing (back-propagating action potentials) provides the basis for signaling the postsynaptic state to the synapse.
NMDA receptors:
- Open when the postsynaptic membrane becomes depolarized
- Allow influx of calcium ions
- The excess of intracellular calcium can thus indicate the co-activation of pre- and postsynaptic activity
Long-lasting synaptic changes, such as those underlying lifelong memories, involve the phosphorylation of proteins.

7.4 The temporal structure of Hebbian plasticity: LTP and LTD
7.4.1 Experimental example of Hebbian plasticity
Results of experiments with varying pre- and postsynaptic conditions. EPSC: excitatory postsynaptic current.
Fig. 7.4 (A) Relative EPSC amplitudes between glutamatergic neurons in hippocampal slices. A strong postsynaptic stimulation was introduced at t = 0 for 1 minute, inducing spiking of the postsynaptic neuron. The postsynaptic firing was induced in relation to the onset of an EPSC that resulted from the stimulation of a presynaptic neuron at 1 Hz. The squares mark the results when the postsynaptic firing times followed the onset of EPSCs within a short time window of 5 ms; the enhancement of synaptic efficiencies demonstrates LTP. The circles mark the results when the postsynaptic neuron fired 5 ms before the onset of the EPSC; the reduction of synaptic efficiencies demonstrates LTD.

7.4.2 LTP and LTD
Long-term potentiation (LTP): amplification of synaptic efficiency.
Long-term depression (LTD): reduction of synaptic efficiency.
Whether such synaptic changes can persist for the lifetime of an organism is unknown.
Such forms of synaptic plasticity support the basic model of association:
- LTP can reinforce an associative response to a presynaptic firing pattern that is temporally linked to postsynaptic firing
- LTD can facilitate the unlearning of presynaptic input that is not consistent with postsynaptic firing
Together they form the basis of the mechanisms of associative memories.

7.4.3 Time window of Hebbian plasticity
The crucial temporal relation between pre- and postsynaptic spikes is probed by varying the time between them.
Fig. 7.4 (B) The relative changes in EPSC amplitudes are shown for various time windows between the onset of an EPSC induced by presynaptic firing and the time of induction of spikes in the postsynaptic neuron.

7.4.4 Variations of temporal Hebbian plasticity
Hebbian plasticity occurs in asymmetric and symmetric forms.
Fig. 7.5 Several examples of the schematic dependence of synaptic efficiencies on the temporal relations between pre- and postsynaptic spikes.

7.4.5 Dependence of synaptic changes on initial strength
Does the size of synaptic changes depend on the current strength of a synapse?
- The absolute change of synaptic efficiency in LTD is proportional to the initial synaptic efficiency, i.e. LTD is multiplicative:
  $\Delta w^{-} \propto -w$  (7.2)
- The relative changes of EPSC amplitudes for LTP are largest for small initial EPSC amplitudes, i.e. LTP is approximately additive:
  $\Delta w^{+} \approx \mathrm{const.}$  (7.3)
Fig. 7.6 Dependence of LTP and LTD on the magnitude of the EPSCs before synaptic plasticity is induced.

7.5 Mathematical formulation of Hebbian plasticity
Synaptic plasticity is described by a change of weight values: the weights are not static but change over time. The variation of the weight values after a time step $\Delta t$ can be written in a discrete fashion as

$w_{ij}(t + \Delta t) = w_{ij}(t) + \Delta w_{ij}(t)$  (7.4)

The weight changes depend on various factors; in activity-dependent synaptic plasticity they depend on the firing times of the pre- and postsynaptic neurons. The strength of a synapse can vary only within some interval.

7.5.1 Hebbian learning with spiking neurons
The synaptic change for a pair of pre- and postsynaptic spikes (LTP: “+”, LTD: “−”) can be written as

$\Delta w_{ij}^{\pm} = \pm f_{\pm}(w_{ij}) \, K(t_i^{\mathrm{post}} - t_j^{\mathrm{pre}}) \, \Theta(\pm(t_i^{\mathrm{post}} - t_j^{\mathrm{pre}}))$  (7.5)

with a kernel function of exponential form,

$K(x) = e^{-|x|/\tau}$  (7.6)

and the threshold function $\Theta$ restricting LTP and LTD to the correct domains.
Choices for the amplitude factor $f_{\pm}$:
1. Additive rule with absorbing boundaries at $w_{\min}$ and $w_{\max}$,
   $f_{+}(w) = a_{+}, \quad f_{-}(w) = a_{-}$  (7.7, 7.8)
2. Multiplicative rule with a more graded nonlinearity when approaching the boundaries,
   $f_{+}(w) = a_{+}(w_{\max} - w), \quad f_{-}(w) = a_{-}(w - w_{\min})$  (7.9)
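A minimal sketch of this spike-pair rule with both amplitude-factor choices. All parameter values (a_plus, a_minus, tau, the weight bounds) are illustrative assumptions.

```python
import numpy as np

tau, a_plus, a_minus = 20.0, 0.01, 0.012   # time constant (ms), amplitudes (assumed)
w_min, w_max = 0.0, 1.0                    # allowed weight interval (assumed)

def stdp_dw(w, t_post, t_pre, rule="additive"):
    """Weight change for a single pre/post spike pair.
    t_post - t_pre > 0 (pre before post) -> LTP; < 0 -> LTD."""
    dt = t_post - t_pre
    kernel = np.exp(-abs(dt) / tau)          # exponential kernel, eq. (7.6)
    if rule == "additive":                   # constant amplitudes, eqs. (7.7, 7.8)
        f_plus, f_minus = a_plus, a_minus
    else:                                    # soft (multiplicative) bounds, eq. (7.9)
        f_plus  = a_plus * (w_max - w)
        f_minus = a_minus * (w - w_min)
    dw = f_plus * kernel if dt > 0 else -f_minus * kernel
    # absorbing boundaries keep the weight inside the allowed interval
    return np.clip(w + dw, w_min, w_max) - w

w = 0.5
print(stdp_dw(w, t_post=12.0, t_pre=10.0))   # pre leads post: LTP (> 0)
print(stdp_dw(w, t_post=10.0, t_pre=12.0))   # post leads pre: LTD (< 0)
```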

7.5.2 Hebbian learning in rate models
Rate models describe the average behavior of neurons or cell assemblies and cannot incorporate spike timing; plasticity then depends on the average correlation of pre- and postsynaptic firing. A general Hebbian plasticity rule is

$\Delta w_{ij} = f_1 (r_i - f_2)(r_j - f_3) - f_4 w_{ij}$  (7.10)

where $r_i$ is the firing rate of the postsynaptic node $i$, $r_j$ the firing rate of the presynaptic node $j$, $f_1$ a learning rate, $f_2$ and $f_3$ plasticity thresholds, and $f_4$ a weight decay. Without the decay term, and with the thresholds set to the mean rates, the Hebbian rule reads

$\Delta w_{ij} = \epsilon (r_i - \langle r_i \rangle)(r_j - \langle r_j \rangle)$  (7.11)

so that the average change of synaptic weights is proportional to the covariance of the pre- and postsynaptic firing (cross-correlation function),

$\langle \Delta w_{ij} \rangle \propto \mathrm{cov}(r_i, r_j) = \langle r_i r_j \rangle - \langle r_i \rangle \langle r_j \rangle$  (7.12)
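A compact sketch of the rate-based rule (7.10): weights grow only when pre- and postsynaptic rates co-fluctuate above their plasticity thresholds, and the decay term keeps them bounded. All parameter values and names (eps, theta_post, theta_pre, decay) are illustrative assumptions.

```python
import numpy as np

eps, theta_post, theta_pre, decay = 0.01, 0.5, 0.5, 0.001   # assumed parameters

def rate_hebb(W, r_post, r_pre):
    """dW ~ f1*(r_i - f2)*(r_j - f3) - f4*W, applied to a full weight matrix."""
    return W + eps * np.outer(r_post - theta_post, r_pre - theta_pre) - decay * W

W = np.zeros((3, 4))                     # 3 postsynaptic, 4 presynaptic nodes
rng = np.random.default_rng(0)
for _ in range(100):
    r_pre = rng.random(4)                # presynaptic rates for this pattern
    r_post = rng.random(3)               # postsynaptic rates for this pattern
    W = rate_hebb(W, r_post, r_pre)
```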

7.6 Weight distributions
Synaptic efficiencies change continuously as long as learning rules are applied.
Problem:
- Rapid weight changes can lead to instabilities in the system, since the neuron would have to adapt to them
- A neuron should roughly maintain its mean firing rate
Solution:
- The overall weight distribution stays relatively constant
The behavior of Hebbian models depends on the form of the weight distribution.
Fig. 7.7 Distribution of fluorescence intensities of synapses from a spinal neuron labeled with fluorescent antibodies, which can be regarded as an estimate of the synaptic efficiencies.

7.6.1 Example of weight distribution in a rate model
Rate models of recurrent networks trained with the Hebbian training rule (7.11) on random patterns develop a Gaussian distribution of weight components.
Fig. 7.8 Normalized histograms of weight values from simulations of a simplified neuron (sigma node) simulating average firing rates, after training with the basic Hebbian learning rule (7.11) on exponentially distributed random patterns. A fit of a Gaussian distribution to the data is shown as a solid line.

7.6.2 Change of synaptic characteristics
Dale’s principle: a neuron makes either excitatory or inhibitory synapses; the synapses of a presynaptic neuron cannot change their specific characteristics.
In the simulations above we did not restrict the synapses to be either inhibitory or excitatory: weight values could cross the boundary between positive and negative values, which is physiologically unrealistic.
However, simulations with such constraints produce similar results for the distribution of the weight matrix components.
It is therefore common to relax this biological detail (Dale’s principle) in simulations.

7.6.3 Examples with spiking neurons
Asymmetric Hebbian rules for spiking neurons (see the sketch below).
Fig. 7.9 (A) Average firing rate (decreasing curve) and Cv, the coefficient of variation (increasing and fluctuating curve), of an IF-neuron driven by 1000 excitatory Poisson spike trains while the synaptic efficiencies change according to an additive Hebbian rule with asymmetric Gaussian plasticity windows. (B) Distribution of weight values after 5 minutes of simulated training time (which is similar to the distribution after 3 minutes). The weights were limited to the range 0-0.015. The distribution has two maxima, one at each boundary of the allowed interval.
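The following sketch reproduces the spirit of this simulation: a leaky IF-neuron driven by Poisson spike trains whose bounded synapses follow an additive, trace-based STDP rule. Exponential plasticity windows are used here instead of the figure's asymmetric Gaussian windows, and all parameter values are illustrative assumptions. The competition induced by the additive rule drives the weight histogram toward two peaks, one at each boundary.

```python
import numpy as np

rng = np.random.default_rng(0)
n_in, dt, T = 1000, 1.0, 60_000          # inputs, time step (ms), duration (ms)
rate = 10 / 1000.0                       # 10 Hz Poisson rate per input channel
tau_m, v_th = 20.0, 1.0                  # membrane time constant, threshold
tau_stdp, a_plus, a_minus = 20.0, 0.005, 0.006   # STDP parameters (assumed)
w = rng.uniform(0, 0.015, n_in)          # weights bounded as in Fig. 7.9

v = 0.0
x_pre = np.zeros(n_in)                   # presynaptic eligibility traces
x_post = 0.0                             # postsynaptic trace
for _ in range(int(T / dt)):
    spikes = rng.random(n_in) < rate * dt          # Poisson input spikes
    v += dt / tau_m * (-v) + w @ spikes            # leaky integration
    x_pre *= np.exp(-dt / tau_stdp); x_pre[spikes] += 1.0
    x_post *= np.exp(-dt / tau_stdp)
    w[spikes] -= a_minus * x_post                  # pre after post -> LTD
    if v >= v_th:                                  # postsynaptic spike
        v = 0.0
        x_post += 1.0
        w += a_plus * x_pre                        # pre before post -> LTP
    w = np.clip(w, 0.0, 0.015)                     # absorbing boundaries

# np.histogram(w) now shows a bimodal distribution with peaks at both bounds.
```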

7.7 Neuronal response variability, gain control, and scaling
7.7.1 Variability and gain control
The firing time of the IF-neuron is mainly determined by the average input current. This statement can be measured with the cross-correlation function between pre- and postsynaptic spike trains,

$C(\Delta t) = \langle (r^{\mathrm{pre}}(t) - \langle r^{\mathrm{pre}} \rangle)(r^{\mathrm{post}}(t + \Delta t) - \langle r^{\mathrm{post}} \rangle) \rangle_t$  (7.13)

Fig. 7.10 Average cross-correlation function between presynaptic Poisson spike trains and the postsynaptic spike train (averaged over all presynaptic spike trains) in a simulation of an IF-neuron with 1000 input channels. The spike trains that lead to the results shown by stars were generated with each weight fixed to the value 0.015; the cross-correlations are consistent with zero when considered within the variance indicated by the error bars. The squares represent the results from simulations of the IF-neuron driven by the same presynaptic spike trains as before, but with the weight matrix after Hebbian learning shown in Fig. 7.9. Some presynaptic spike trains caused postsynaptic spiking, with a positive peak in the average cross-correlation function when the presynaptic spikes precede the postsynaptic spike. No error bars are shown for this curve for clarity.

7.7.2 Synaptic scaling
The overall synaptic efficiencies depend on the average postsynaptic firing rate. This dependence is crucial to keep neurons in the regime of high variability, and thus sensitive for information processing in the nervous system.
Many experiments have demonstrated that:
- Synaptic efficiencies are scaled by the average postsynaptic activity
- The threshold at which LTP is induced can depend on the time-averaged recent activity of the neuron
Related model mechanisms are weight normalization and weight decay; a minimal sketch of multiplicative scaling follows.
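A minimal sketch of multiplicative synaptic scaling, assuming a hypothetical target rate and scaling gain (r_target, gain): all synapses of a neuron are scaled up when it is too quiet and down when it is too active, preserving the relative weights learned by Hebbian plasticity.

```python
import numpy as np

r_target, gain = 5.0, 0.01   # target rate (Hz) and scaling gain (assumed)

def scale_weights(w, r_avg):
    """Multiplicative scaling: nudge the whole weight vector so the neuron's
    time-averaged rate drifts toward the target, keeping relative weights."""
    return w * (1.0 + gain * (r_target - r_avg) / r_target)

w = np.random.rand(100)
w = scale_weights(w, r_avg=8.0)   # neuron too active: all weights shrink slightly
```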

7.7.3 Oja’s rule and principal components
Weight normalization through heterosynaptic depression (Oja’s rule):

$\Delta w_i = \epsilon \, r^{\mathrm{out}} (r_i^{\mathrm{in}} - r^{\mathrm{out}} w_i)$  (7.14)

Fig. 7.11 Simulation of a linear node trained with Oja’s rule on training examples (indicated by the dots) drawn from a two-dimensional probability distribution with mean zero. The weight vector, with initial conditions indicated by the cross, converges to the weight vector (thick arrow) which has length |w| = 1 and points in the direction of the first principal component.
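A sketch of the simulation in Fig. 7.11: a linear node trained with Oja's rule (7.14) on zero-mean two-dimensional data. The data covariance matrix is an illustrative assumption; the weight vector converges to unit length and aligns with the first principal component.

```python
import numpy as np

rng = np.random.default_rng(1)
cov = np.array([[3.0, 1.5], [1.5, 1.0]])             # assumed data covariance
data = rng.multivariate_normal([0, 0], cov, size=5000)

eps = 0.01
w = rng.normal(size=2)                               # random initial weights
for r_in in data:
    r_out = w @ r_in                                 # linear node output
    w += eps * r_out * (r_in - r_out * w)            # Oja's rule, eq. (7.14)

# Compare with the leading eigenvector of the covariance matrix
eigvals, eigvecs = np.linalg.eigh(cov)
pc1 = eigvecs[:, -1]
print("|w| =", np.linalg.norm(w))                    # approximately 1
print("alignment with PC1:", abs(w @ pc1))           # approximately 1
```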

7.7.4 Short-term synaptic plasticity and neuronal gain control
Cortical neurons typically respond to a constant input current with a transient response whose firing rate decreases over time.
Short-term synaptic depression (STD) has manifold computational consequences:
- It allows a neuron to respond strongly to input that has not influenced it recently and therefore carries a strong novelty value
- Rapid spike trains that would exhaust the neuron are weakened
A minimal model of STD is sketched below.
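A minimal sketch of STD in the spirit of a Tsodyks-Markram-style resource model (not taken from the slides; parameter values are assumptions): each presynaptic spike consumes a fraction U of the available synaptic resources x, which recover with time constant tau_rec.

```python
U, tau_rec, dt = 0.4, 300.0, 1.0   # release fraction, recovery (ms), step (ms)

def run_std(spike_times, T=1000):
    """Return the effective synaptic response for each spike in the train."""
    x, psc = 1.0, []
    spikes = set(spike_times)
    for t in range(int(T / dt)):
        x += dt * (1.0 - x) / tau_rec      # resources recover toward 1
        if t in spikes:
            psc.append(U * x)              # response scales with resources left
            x -= U * x                     # spike consumes resources
    return psc

# A rapid spike train is progressively weakened...
print(run_std(range(0, 200, 20)))
# ...while a spike after a long silent period regains its full, novel strength.
print(run_std([0, 500]))
```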

7.8 Features of associators and Hebbian learning
Pattern completion and generalization:
- Recall from partial input
- The output node responds to all patterns with a certain similarity to the trained pattern
Prototypes and extraction of central tendencies:
- The ability to extract central tendencies from noisy examples
- Noise reduction
Graceful degradation:
- The loss of some components of the system should not make the system fail completely (fault tolerance)

7.8.4 Biologically faithful learning rules
The associative Hebbian learning rules are biologically faithful models:
Unsupervised:
- No specific learning signal is required; the rule is self-organizing (in contrast to supervised or reinforcement learning)
Local:
- Only presynaptic and postsynaptic observables are required to change the synaptic weight values
- This benefits from true parallel distributed processing
Online:
- The learning rule does not require storage of firing patterns or network parameters

Conclusion
What is associative memory?
- Biochemical mechanisms of synaptic plasticity
Hebbian learning rule:
- Synaptic plasticity
- Temporal structure of Hebbian plasticity: LTP and LTD
Weight distributions
Gain control, synaptic scaling, Oja’s rule and PCA
Associators and Hebbian learning:
- The Hebbian learning rule is a biologically faithful learning rule: unsupervised, local, online