3. Simplified Neuron and Population Models
Lecture Notes on Brain and Computation
Byoung-Tak Zhang, Biointelligence Laboratory, School of Computer Science and Engineering, Graduate Programs in Cognitive Science, Brain Science and Bioinformatics, Brain-Mind-Behavior Concentration Program, Seoul National University
This material is available online at
Fundamentals of Computational Neuroscience, T. P. Trappenberg, 2002.

Outline
- Basic spiking neuron and population models
- Spike-time variability
- The neural code and the firing rate hypothesis
- Population dynamics
- Networks with non-classical synapses

Basic spiking neurons
Conductance-based models are too computationally expensive for simulating large networks. The integrate-and-fire (IF) neuron model rests on the following observations:
- The form of the spike generated by a neuron is very stereotyped.
- The precise form of the spike does not carry information; what matters is the occurrence of spikes.
- The timing of spikes is relevant for information transmission.
- The detailed ion-channel dynamics can therefore be neglected.

The leaky integrate-and-fire neuron
Notation: membrane potential u(t), membrane time constant τ_m, input current I(t), synaptic efficacy w_j, firing times t_j^f of the presynaptic neuron at synapse j, firing time t^f of the postsynaptic neuron, firing threshold θ, reset membrane potential u_res. The model is summarized by Eqs. 3.1-3.4:
- Subthreshold dynamics: τ_m du/dt = −u(t) + RI(t)
- Synaptic input as a weighted sum of presynaptic spike contributions: I(t) = Σ_j w_j Σ_f α(t − t_j^f)
- Firing condition: a spike is emitted at time t^f when u(t^f) = θ
- Reset: after a spike the membrane potential is set to u_res; an absolute refractory time can be modelled by holding the potential at this value.
Fig. 3.1 Schematic illustration of a leaky integrate-and-fire neuron. This neuron model integrates (sums) the external input, with each channel weighted by a corresponding synaptic weighting factor w_i, and produces an output spike if the membrane potential reaches the firing threshold.
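A minimal numerical sketch of the model above, assuming the parameter values used later in the chapter (threshold θ = 10, reset u_res = 0, τ_m = 10 ms); an Euler integration with a spike-and-reset rule, driven here by a constant current for simplicity.

```python
import numpy as np

def lif_sim(RI, T=100.0, dt=0.1, tau_m=10.0, theta=10.0, u_res=0.0, t_ref=0.0):
    """Euler integration of tau_m du/dt = -u + RI with threshold, reset, and refractory time."""
    steps = int(T / dt)
    u = np.zeros(steps)          # membrane potential trace
    spikes = []                  # firing times t^f
    ref_until = -1.0             # end of the absolute refractory period
    for i in range(1, steps):
        t = i * dt
        if t < ref_until:        # hold the potential at the reset value during refractoriness
            u[i] = u_res
            continue
        u[i] = u[i-1] + dt / tau_m * (-u[i-1] + RI)
        if u[i] >= theta:        # firing condition u(t^f) = theta
            spikes.append(t)
            u[i] = u_res         # reset to u_res
            ref_until = t + t_ref
    return u, spikes

# Constant input below threshold (RI = 8): no spikes; above threshold (RI = 12): regular firing.
for RI in (8.0, 12.0):
    _, spikes = lif_sim(RI)
    print(f"RI = {RI}: {len(spikes)} spikes in 100 ms")
```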

Response of IF neurons to constant input current (1)
- Without input, the membrane equation reduces to a simple homogeneous differential equation whose solution is an exponential decay of the potential from its initial value, u(t) = u(0) e^(−t/τ_m); a very short input pulse can be modelled by the initial condition u(t=0) = 1.
- An IF neuron driven by a constant input current that is low enough to prevent firing settles after some transient time: the membrane potential does not change any more, and setting du/dt = 0 gives the equilibrium membrane potential u = RI_ext.
- For all times after the constant current I_ext = const is applied, the solution of the differential equation combines the relaxation towards this equilibrium with the exponential decay of the initial value u(t=0). (Eqs. 3.5-3.9)
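A short check of the constant-input solution described above, comparing the closed-form expression with a forward-Euler integration; the parameter values (RI = 8, u(0) = 1, τ_m = 10 ms) follow the examples in the text.

```python
import numpy as np

tau_m, RI, u0 = 10.0, 8.0, 1.0          # subthreshold input and initial condition u(0)
t = np.arange(0.0, 60.0, 0.1)

# Closed-form solution of tau_m du/dt = -u + RI with u(0) = u0:
# exponential decay of the initial value plus relaxation to the equilibrium u = RI.
u_analytic = RI * (1.0 - np.exp(-t / tau_m)) + u0 * np.exp(-t / tau_m)

# Forward-Euler solution of the same equation for comparison.
u_num = np.empty_like(t)
u_num[0] = u0
for i in range(1, len(t)):
    u_num[i] = u_num[i-1] + (t[i] - t[i-1]) / tau_m * (-u_num[i-1] + RI)

print("max |analytic - numeric| =", np.max(np.abs(u_analytic - u_num)))  # small for dt = 0.1
print("equilibrium value        =", u_analytic[-1])                      # approaches RI = 8
```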

Response of IF neurons to constant input current (2)
Fig. 3.2 Simulated spike trains and membrane potential of a leaky integrate-and-fire neuron. The threshold is set at 10 and indicated as a dashed line. (A) Constant input current of strength RI = 8, which is too small to elicit a spike. (B) Constant input current of strength RI = 12, strong enough to elicit spikes at regular intervals. Note that the form of the spike itself is not included in the figure; the membrane potential is simply reset, and the occurrence of a spike is indicated by a dot in the upper plot.

Activation function
The firing time t^f is given by the time at which the membrane potential reaches the firing threshold (Eq. 3.10). The activation or gain function is defined as the inverse of t^f, i.e. the firing rate, including an absolute refractory time t_ref (Eq. 3.11). This function quickly approaches an asymptotically linear behaviour, so a threshold-linear function is often used to approximate the gain function of IF neurons.
Fig. 3.3 Gain function of a leaky integrate-and-fire neuron for several values of the reset potential u_res and refractory time t_ref.
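A sketch of the gain function obtained by inverting the time-to-threshold of the leaky IF neuron under constant input; the closed form used here, t^f = τ_m ln((RI − u_res)/(RI − θ)), follows from the constant-input solution above, and the parameter values are illustrative assumptions.

```python
import numpy as np

def lif_gain(RI, tau_m=10.0, theta=10.0, u_res=0.0, t_ref=0.0):
    """Firing rate (in spikes per ms) of a leaky IF neuron for constant input RI."""
    RI = np.asarray(RI, dtype=float)
    rate = np.zeros_like(RI)
    supra = RI > theta                                   # below threshold the neuron stays silent
    t_f = tau_m * np.log((RI[supra] - u_res) / (RI[supra] - theta))  # time to reach threshold
    rate[supra] = 1.0 / (t_ref + t_f)                    # inverse interspike interval
    return rate

RI = np.linspace(0.0, 30.0, 7)
print("RI   :", RI)
print("rate :", np.round(lif_gain(RI, u_res=0.0, t_ref=1.0), 3))
# For large RI the curve approaches a straight line, motivating the
# threshold-linear approximation r ~ max(0, beta * (RI - theta)) mentioned in the text.
```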

The Izhikevich neuron (1)
A model that is computationally efficient while still being able to capture a large variety of the subthreshold dynamics of the membrane potential. It consists of
- a two-variable system for the subthreshold dynamics, and
- a firing and reset condition.
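A minimal sketch of the model in its standard published form (Izhikevich, 2003): dv/dt = 0.04v² + 5v + 140 − u + I, du/dt = a(bv − u), with reset v ← c, u ← u + d when v reaches 30 mV. The regular-spiking parameters and the input value are illustrative assumptions.

```python
import numpy as np

def izhikevich(I_ext, T=200.0, dt=0.25, a=0.02, b=0.2, c=-65.0, d=8.0):
    """Izhikevich (2003) two-variable neuron; a, b, c, d set to regular-spiking values."""
    steps = int(T / dt)
    v, u = -65.0, b * -65.0        # membrane potential and recovery variable
    spikes = []
    for i in range(steps):
        t = i * dt
        # subthreshold dynamics
        v += dt * (0.04 * v * v + 5.0 * v + 140.0 - u + I_ext)
        u += dt * a * (b * v - u)
        # firing and reset condition
        if v >= 30.0:
            spikes.append(t)
            v = c
            u += d
    return spikes

print("spike times (ms):", np.round(izhikevich(I_ext=10.0), 1))
```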

The Izhikevich neuron (2)

Spike time variability
Neurons in the brain do not fire regularly but appear extremely noisy:
- Relatively inactive neurons emit spikes at low frequencies that are very irregular.
- Even high-frequency responses to relevant stimuli are often not very regular.
The coefficient of variation of the interspike intervals, C_v = σ/μ (3.18), quantifies this irregularity:
- C_v ≈ 0.5-1 for regularly spiking neurons in V1 and MT.
- Spike trains are often well approximated by a Poisson process, for which C_v = 1.
Fig. 3.5 Normalized histogram of interspike intervals (ISIs). (A) Data from recordings of one cortical cell (Brodmann's area 46) that fired without task-relevant characteristics at an average firing rate of about 15 spikes/s. The coefficient of variation of the spike trains is C_v ≈ (B) Simulated data from a Poisson-distributed spike train in which a Gaussian refractory time has been included. The solid line represents the probability density function of the exponential distribution, scaled to fit the normalized histogram of the spike train. Note that the discrepancy for small interspike intervals is due to the inclusion of a refractory time.
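A small sketch computing C_v = σ/μ for a Poisson spike train and for the same train with an added, roughly Gaussian refractory time; the rate and refractory parameters are illustrative assumptions chosen to mirror Fig. 3.5.

```python
import numpy as np

rng = np.random.default_rng(0)

def cv(isi):
    """Coefficient of variation C_v = sigma / mu of interspike intervals (Eq. 3.18)."""
    return np.std(isi) / np.mean(isi)

# Poisson spike train at 15 spikes/s: ISIs are exponentially distributed, so C_v = 1.
isi_poisson = rng.exponential(scale=1.0 / 15.0, size=10000)
print("Poisson             C_v =", round(cv(isi_poisson), 3))

# Adding an (approximately Gaussian) refractory time regularizes the train and lowers C_v,
# which is the effect visible for small ISIs in Fig. 3.5B.
t_ref = np.clip(rng.normal(loc=0.010, scale=0.003, size=isi_poisson.size), 0.0, None)
print("Poisson + refractory C_v =", round(cv(isi_poisson + t_ref), 3))
```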

Biological irregularities
Biological networks do not have the regularity of engineering-like designs such as the IF neuron. We therefore consider irregularities from different sources in the biological nervous system:
- the external input to the neuron, and
- structural irregularities.
A statistical approach is used to capture these.

Noise models for IF-neurons
Noise can be introduced into the neuron model in three ways (Fig. 3.6, Eqs. 3.22-3.24):
- stochastic threshold: the firing threshold fluctuates,
- random reset: the reset value after a spike is drawn from a distribution,
- noisy integration: an additive stochastic term enters the membrane dynamics.
The stochastic process describing the neuron then depends on appropriate choices for the random variables η(1), η(2), and η(3).
Fig. 3.6 Three different noise models of IF neurons.
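A sketch of the third variant, noisy integration, with the other two noted in comments; the noise amplitude and the diffusive form of the stochastic term are assumptions, not the chapter's exact equations.

```python
import numpy as np

rng = np.random.default_rng(1)

def noisy_lif(RI, sigma, T=1000.0, dt=0.1, tau_m=10.0, theta=10.0, u_res=0.0):
    """Leaky IF neuron with additive noise in the integration ('noisy integration' model).
    The other two noise models of Fig. 3.6 would instead jitter the threshold
    (theta + eta at every step) or the reset value (u_res + eta after every spike)."""
    u, spikes = u_res, []
    for i in range(int(T / dt)):
        noise = sigma * np.sqrt(dt) * rng.standard_normal()   # diffusive noise term
        u += dt / tau_m * (-u + RI) + noise
        if u >= theta:
            spikes.append(i * dt)
            u = u_res
    return np.asarray(spikes)

spikes = noisy_lif(RI=12.0, sigma=2.0)
isi = np.diff(spikes)
print("mean ISI =", round(isi.mean(), 2), "ms,  C_v =", round(isi.std() / isi.mean(), 2))
```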

Simulating variability of real neurons (1)
The appropriate choice of the random process, its probability distribution, and its time scale cannot be given in general; it has to be fitted to experimental data. A convenient way to introduce noise into the IF model is through noisy input: by the central limit theorem the summed input fluctuations are approximately Gaussian, and the resulting interspike intervals are well described by a lognormal distribution (Eqs. 3.25, 3.26).
Fig. 3.7 Simulated interspike interval (ISI) distribution of a leaky IF-neuron with threshold 10 and time constant τ_m = 10. The underlying spike train was generated with noisy input around the mean value RI = 12; the fluctuations were therefore distributed with a standard normal distribution. The resulting ISI histogram is well approximated by a lognormal distribution (solid line). The coefficient of variation of the simulated spike train is C_v ≈ 0.43.

Simulating variability of real neurons (2)
Simulation of an IF-neuron that has no internal noise but is driven by 500 independent incoming Poisson spike trains.
Fig. 3.8 Simulation of an IF-neuron that has no internal noise but is driven by 500 independent incoming spike trains with a corrected Poisson distribution. (A) The sums of the EPSPs, simulated by an α-function for each incoming spike, with amplitude w = 0.5 for the upper curve and w = 0.25 for the lower curve. The firing threshold of the neuron is indicated by the dashed line. The ISI histograms from the corresponding simulations are plotted in (B) for the neuron with EPSP amplitude w = 0.5 and in (C) for the neuron with EPSP amplitude w = 0.25.
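A simplified sketch in the spirit of Fig. 3.8: many Poisson inputs, EPSPs modelled by α-functions, and a noise-free leaky IF neuron. The input rate (10 spikes/s per input), the synaptic time constant, and the pooling of all inputs into a single count process are assumptions made for brevity; the figure itself uses 500 individual (refractoriness-corrected) trains.

```python
import numpy as np

rng = np.random.default_rng(2)

T, dt = 1000.0, 0.1                      # simulation length and step in ms
t = np.arange(0.0, T, dt)
n_inputs, rate = 500, 10.0               # 500 Poisson inputs at an assumed 10 spikes/s each

# Summed presynaptic spike count per time bin, pooled across all inputs.
counts = rng.poisson(lam=n_inputs * rate * dt * 1e-3, size=t.size).astype(float)

# EPSP kernel: alpha function (t/tau_s) * exp(1 - t/tau_s), peak value 1 at t = tau_s.
tau_s = 2.0
kt = np.arange(0.0, 10 * tau_s, dt)
alpha = (kt / tau_s) * np.exp(1.0 - kt / tau_s)

def drive_neuron(w, theta=10.0, tau_m=10.0):
    """Leaky IF neuron driven by the summed alpha-function EPSPs with amplitude w."""
    I = w * np.convolve(counts, alpha)[: t.size]          # total synaptic input
    u, spikes = 0.0, []
    for i in range(t.size):
        u += dt / tau_m * (-u + I[i])
        if u >= theta:
            spikes.append(t[i])
            u = 0.0
    return np.asarray(spikes)

# Compare the two EPSP amplitudes of Fig. 3.8: w = 0.5 drives the neuron above threshold,
# while w = 0.25 leaves it mostly (or entirely) below threshold.
for w in (0.5, 0.25):
    s = drive_neuron(w)
    isi = np.diff(s)
    if isi.size:
        print(f"w = {w}: rate = {1000 * s.size / T:.1f} Hz, C_v = {isi.std() / isi.mean():.2f}")
    else:
        print(f"w = {w}: no spikes")
```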

The activation function depends on input
The activation (gain) function of a neuron depends on the variations in the input spike train. The average firing rate of a stochastic IF-neuron can be computed analytically [Tuckwell, 1988] (Eqs. 3.27, 3.28). With low input noise σ the gain function shows a sharp transition at threshold; with high σ it is smoothed and effectively linearized.
Fig. 3.9 The gain function of an IF-neuron driven by an external current drawn from a normal distribution with mean μ = RI and variance σ. The reset potential was set to u_res = 5 and the firing threshold of the IF-neuron to 10. The three curves correspond to three different variance parameters σ.
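Rather than evaluating the analytical expression, the same effect can be seen by estimating the gain function from simulation for several noise levels; a rough sketch assuming θ = 10 and u_res = 5 as in Fig. 3.9, with illustrative σ values.

```python
import numpy as np

rng = np.random.default_rng(3)

def sim_rate(RI, sigma, T=2000.0, dt=0.1, tau_m=10.0, theta=10.0, u_res=5.0):
    """Firing rate (spikes/s) of a noisy leaky IF neuron, estimated by simulation."""
    u, n_spikes = u_res, 0
    for _ in range(int(T / dt)):
        u += dt / tau_m * (-u + RI) + sigma * np.sqrt(dt) * rng.standard_normal()
        if u >= theta:
            n_spikes += 1
            u = u_res
    return 1000.0 * n_spikes / T

RIs = np.arange(6.0, 16.0, 2.0)
for sigma in (0.5, 2.0, 5.0):   # low sigma: sharp onset at threshold; high sigma: smoothed, roughly linear
    rates = [round(sim_rate(RI, sigma), 1) for RI in RIs]
    print(f"sigma = {sigma}: rates (Hz) =", rates)
```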

The neural code and the firing rate hypothesis (1)
The firing rate of sensory neurons increases considerably in a short time interval following the presentation of an effective stimulus to the recorded neurons.
- Example: the stretch receptor on the frog muscle (Fig. 3.10).

The neural code and the firing rate hypothesis (2)
- The tuning curve of simple cells (Fig. 3.11).
Other parts of spike patterns can also convey information (sec. ).

Correlation codes and coincidence detectors (1)
Information can be carried by the co-occurrence of spikes in two neurons even when there is no significant variation of the firing rate in either of them.

Correlation codes and coincidence detectors (2)
Temporal proximity (coincidence) of spikes
- can make a difference in the information processing of the brain, and
- can be detected by leaky integrator neurons, as illustrated in the sketch below.
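A toy illustration of the second point: a leaky integrator with a short membrane time constant reaches threshold only when two input spikes arrive within roughly the EPSP decay time. The weights, time constant, and threshold are illustrative assumptions.

```python
import numpy as np

def coincidence_response(dt_spikes, w=6.0, tau_m=2.0, dt=0.05, T=30.0):
    """Peak membrane potential of a leaky integrator receiving two EPSP-like pulses
    separated by dt_spikes milliseconds."""
    spike_times = [5.0, 5.0 + dt_spikes]
    u, u_max, t = 0.0, 0.0, 0.0
    while t < T:
        I = sum(w for ts in spike_times if abs(t - ts) < dt / 2)  # delta-like input pulses
        u += dt / tau_m * (-u) + I                                # leaky integration of the pulses
        u_max = max(u_max, u)
        t += dt
    return u_max

theta = 10.0
for delta in (0.5, 2.0, 10.0):
    peak = coincidence_response(delta)
    print(f"spike separation {delta:4.1f} ms -> peak potential {peak:5.2f}",
          "(fires)" if peak >= theta else "(silent)")
```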

How accurate is spike timing?
It is a widely held belief that neural spiking is not very reliable and that there is a lot of variability in neuronal responses (Fig. 3.13A). Nevertheless, populations of neurons can rapidly convey information in a neural network (Fig. 3.13B).

Population dynamics: modelling the average behaviour of neurons
Many models in computational neuroscience, in particular on a cognitive level, are based on descriptions that do not take the individual spikes of neurons into account but instead describe the average activity of neurons or neuronal populations.

Firing rates and population averages (1)
The firing rate of a single neuron can be estimated by convolving its spike train with a kernel function (window), for example
- a rectangular window, or
- a Gaussian window.
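A small sketch of kernel-based rate estimation: a spike train is convolved with a normalized rectangular or Gaussian window. The modulated test rate and the window widths are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(4)

T, dt = 1000.0, 1.0                                         # ms, 1-ms bins
t = np.arange(0.0, T, dt)
rate_true = 20.0 + 15.0 * np.sin(2 * np.pi * t / 500.0)     # slowly modulated rate (spikes/s)
spikes = (rng.random(t.size) < rate_true * dt * 1e-3).astype(float)  # Bernoulli spike train

def rate_estimate(spikes, kernel):
    """Convolve the binary spike train with a normalized kernel (window) to estimate r(t)."""
    kernel = kernel / (kernel.sum() * dt * 1e-3)            # normalize so the output is in spikes/s
    return np.convolve(spikes, kernel, mode="same")

# Rectangular window of 100 ms and Gaussian window with sigma = 30 ms.
rect = np.ones(100)
tg = np.arange(-100.0, 100.0 + dt, dt)
gauss = np.exp(-tg**2 / (2 * 30.0**2))

for name, k in (("rectangular", rect), ("Gaussian", gauss)):
    est = rate_estimate(spikes, k)
    print(f"{name:12s} window: mean estimated rate = {est.mean():.1f} spikes/s")
```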

Firing rates and population averages (2)
The average population activity can be estimated analogously, by averaging spike counts over a population of neurons within a small time window.

Population dynamics for slowly varying input
The dynamics of the population activity A(t) can be written as τ dA/dt = −A(t) + g(RI(t)), where τ is the membrane time constant and g is the population activation function; this form can be derived from Eq. 3.41. In the stationary state (dA/dt = 0) the activity is given by the activation function applied to the input, A = g(RI).
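A minimal sketch of this rate equation, integrated with the Euler method; the sigmoidal form of g and its parameters are assumptions made to keep the example self-contained.

```python
import numpy as np

def g(x, beta=1.0, theta=5.0):
    """Assumed sigmoidal population activation (gain) function."""
    return 1.0 / (1.0 + np.exp(-beta * (x - theta)))

tau, dt = 10.0, 0.1          # ms
RI = 8.0                     # slowly varying (here constant) input
A = 0.0
for _ in range(int(200.0 / dt)):
    A += dt / tau * (-A + g(RI))       # population dynamics tau dA/dt = -A + g(RI)

print("A after 200 ms    =", round(A, 4))
print("fixed point g(RI) =", round(g(RI), 4))   # stationary state dA/dt = 0  =>  A = g(RI)
```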

Rapid response of populations
Very short time constants, much shorter than typical membrane time constants, have to be used when the population equation above is employed to approximate the dynamics of the population response to rapidly varying inputs.

Common activation functions
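The slide's table is not reproduced here; as an assumption-labelled selection, the sketch below shows three gain functions commonly used in rate and population models (step, threshold-linear, sigmoid).

```python
import numpy as np

# A few activation (gain) functions commonly used in rate/population models.
def step(x, theta=0.0):                          # all-or-none response above threshold
    return (x > theta).astype(float)

def threshold_linear(x, theta=0.0, beta=1.0):    # rectified linear; approximates the IF gain function
    return beta * np.maximum(0.0, x - theta)

def sigmoid(x, theta=0.0, beta=1.0):             # smooth, saturating response
    return 1.0 / (1.0 + np.exp(-beta * (x - theta)))

x = np.linspace(-2.0, 4.0, 7)
print("x               :", x)
print("step            :", step(x))
print("threshold-linear:", threshold_linear(x))
print("sigmoid         :", np.round(sigmoid(x), 3))
```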

Networks with non-classical synapses: the sigma-pi node
So far we have assumed additive (linear) characteristics of synaptic currents. However, single neurons also show non-linear interactions between different inputs.

Logical AND sigma-pi nodes
Only if two spikes are present within a short time interval, on the order of the decay time of EPSPs, can a postsynaptic spike be generated. In the population model, the probability of having two spikes of two different presynaptic neurons in the same interval is proportional to the product of the two individual probabilities. The activation of node i is therefore a weighted sum over products of presynaptic rates, as in the sketch below.
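A sketch of the sigma-pi activation as a weighted sum over pairwise products of input rates; the three-index weight layout w_ijk and the random example values are assumptions for illustration.

```python
import numpy as np

def sigma_pi_activation(w, r):
    """Activation of a sigma-pi node: h_i = sum_{j,k} w_ijk * r_j * r_k,
    i.e. a weighted sum over *products* of pairs of presynaptic rates."""
    # w has shape (n_nodes, n_inputs, n_inputs); r has shape (n_inputs,)
    return np.einsum('ijk,j,k->i', w, r, r)

rng = np.random.default_rng(5)
n_nodes, n_inputs = 3, 4
w = rng.random((n_nodes, n_inputs, n_inputs))
r = rng.random(n_inputs)

print("sigma-pi activations:", np.round(sigma_pi_activation(w, r), 3))
# The multiplicative terms r_j * r_k implement the logical-AND behaviour:
# a pair contributes only when both presynaptic rates (spike probabilities) are non-zero.
```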

Divisive inhibition
Shunting inhibition scales down the response to excitatory input rather than subtracting from it (Fig. 3.19A).
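One common rate-level caricature of shunting (divisive) inhibition divides the excitatory drive by a term that grows with the inhibitory input; the specific form and parameters below are assumptions, not the book's equation.

```python
import numpy as np

def divisive_response(exc, inh, w_e=1.0, w_i=1.0):
    """Rate-level caricature of shunting (divisive) inhibition: the inhibitory input
    scales down the gain rather than subtracting from the excitatory drive."""
    return np.maximum(0.0, w_e * exc) / (1.0 + w_i * inh)

exc = 10.0
for inh in (0.0, 1.0, 4.0):
    print(f"inhibition = {inh}: response = {divisive_response(exc, inh):.2f}")
```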

Further sources of modulatory effects between synaptic inputs
- NMDA synapse: the blockade of NMDA receptors is removed if the membrane potential is raised by an EPSP from another, non-NMDA synapse in its proximity (Fig. 3.19B).
- Afferent modulation: direct influence of specific afferents on the release of neurotransmitters by presynaptic terminals (Fig. 3.18C).