Biological Modeling of Neural Networks, Week 8 – Noisy output models: Escape rate and soft threshold. Wulfram Gerstner, EPFL, Lausanne, Switzerland.

Presentation transcript:

Biological Modeling of Neural Networks, Week 8 – Noisy output models: Escape rate and soft threshold. Wulfram Gerstner, EPFL, Lausanne, Switzerland.
Week 8, parts 4-6: Noisy Output – Escape Rate and Soft Threshold
8.1 Variation of membrane potential - white noise approximation
8.2 Autocorrelation of Poisson
8.3 Noisy integrate-and-fire - superthreshold and subthreshold
8.4 Escape noise - stochastic intensity
8.5 Renewal models
8.6 Comparison of noise models

Neuronal Dynamics – Review: Sources of Variability
- Intrinsic noise (ion channels, Na+ and K+): finite number of channels, finite temperature → small contribution!
- Network noise (background activity): spike arrival from other neurons, beyond the control of the experimentalist → big contribution!
Noise models?

Noise models:
- escape process (stochastic intensity, escape rate)
- stochastic spike arrival (diffusive noise, noisy integration)
[Figure: membrane potential u(t) vs. time t for the two noise models]
Relation between the two models: later. Now: escape noise!

Neuronal Dynamics – 8.4 Escape noise
Escape process: spikes are generated stochastically with an escape rate that depends on the momentary membrane potential u(t).
[Figure: trajectory u(t) vs. t; escape rate as a function of u]
Example: leaky integrate-and-fire model.
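The escape-rate formula itself appeared only as an image on the slide; a standard choice (a sketch, with illustrative parameter names \tau_0, \Delta u, \vartheta) is the exponential, soft-threshold escape rate

\rho(t) \;=\; f\big(u(t)\big) \;=\; \frac{1}{\tau_0}\,\exp\!\left(\frac{u(t)-\vartheta}{\Delta u}\right),

so that spiking becomes likely when u(t) approaches the soft threshold \vartheta, with \Delta u controlling how sharp the threshold is.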

Neuronal Dynamics – 8.4 Stochastic intensity: examples
[Figure: escape rate as a function of u; trajectory u(t) vs. t]
Escape rate = stochastic intensity of a point process.

Neuronal Dynamics – 8.4 Mean waiting time
[Figure: step in the input current I(t); membrane potential u(t) responds; mean waiting time after the switch ≈ 1 ms in the example]
Question: after a switch in the input, what is the mean waiting time until the next spike? (Blackboard, math detour.)
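The blackboard calculation is not reproduced in the transcript; a minimal version, assuming the membrane potential (and hence the escape rate \rho) is constant after the switch: the survivor function is S(t) = e^{-\rho t}, so the mean waiting time is

\langle T \rangle \;=\; \int_0^\infty S(t)\,dt \;=\; \int_0^\infty e^{-\rho t}\,dt \;=\; \frac{1}{\rho},

i.e. the inverse of the escape rate (compare the quiz below).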

Neuronal Dynamics – 8.4 Escape noise / stochastic intensity
[Figure: membrane potential u(t) vs. time t]
Escape rate = stochastic intensity of a point process.

Neuronal Dynamics – Quiz 8.4: Escape rate/stochastic intensity in neuron models
[ ] The escape rate of a neuron model has units of one over time.
[ ] The stochastic intensity of a point process has units of one over time.
[ ] For large voltages, the escape rate of a neuron model always saturates at some finite value.
[ ] After a step in the membrane potential, the mean waiting time until a spike is fired is proportional to the escape rate.
[ ] After a step in the membrane potential, the mean waiting time until a spike is fired is equal to the inverse of the escape rate.
[ ] The stochastic intensity of a leaky integrate-and-fire model with reset depends only on the external input current, not on the time of the last reset.
[ ] The stochastic intensity of a leaky integrate-and-fire model with reset depends on the external input current AND on the time of the last reset.

Biological Modeling of Neural Networks, Week 8 – Variability and Noise: Autocorrelation. Wulfram Gerstner, EPFL, Lausanne, Switzerland.
Week 8, part 5: Renewal models
8.1 Variation of membrane potential - white noise approximation
8.2 Autocorrelation of Poisson
8.3 Noisy integrate-and-fire - superthreshold and subthreshold
8.4 Escape noise - stochastic intensity
8.5 Renewal models

Neuronal Dynamics – 8.5 Interspike Intervals
Example: nonlinear integrate-and-fire model driven by the deterministic part of the input; the noisy part of the input (intrinsic noise) is absorbed into the escape rate.
Example: exponential stochastic intensity.
[Figure: escape rate as a function of time t]

Neuronal Dynamics – 8.5 Interspike Interval distribution
[Figure: escape process; membrane potential u(t), escape rate, and survivor function as functions of the time t since the last spike t̂]

Neuronal Dynamics – 8.5 Interspike Intervals
Escape process: from the escape rate to the survivor function and the interval distribution.
[Figure A: membrane potential u(t), escape rate, survivor function, and interval distribution vs. time t; escape rate as a function of u]
Examples now.
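The defining relations were shown as equations on the slide; in standard renewal notation, with \rho(t \mid \hat t) the escape rate (hazard) at time t given the last spike at \hat t:

S(t \mid \hat t) \;=\; \exp\!\left(-\int_{\hat t}^{t} \rho(t' \mid \hat t)\,dt'\right) \qquad \text{(survivor function: no spike in } (\hat t, t]\text{)}

P(t \mid \hat t) \;=\; \rho(t \mid \hat t)\,S(t \mid \hat t) \;=\; -\frac{d}{dt}\,S(t \mid \hat t) \qquad \text{(interval distribution)}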

Neuronal Dynamics – 8.5 Renewal theory
Example: integrate-and-fire model with reset, constant input.
[Figure: escape rate, survivor function (starting at 1), and interval distribution as functions of the time since the last spike]
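As a sanity check, in the limiting case of a constant escape rate \rho_0 (a Poisson neuron without refractoriness) the formulas above reduce to

S(t \mid \hat t) = e^{-\rho_0 (t-\hat t)}, \qquad P(t \mid \hat t) = \rho_0\, e^{-\rho_0 (t-\hat t)}, \qquad \langle T \rangle = 1/\rho_0 .

With reset and constant input, the escape rate \rho(t \mid \hat t) = f\big(u(t \mid \hat t)\big) is low just after the reset and grows as u recovers towards its steady state, which suppresses very short intervals in P(t \mid \hat t).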

Neuronal Dynamics – 8.5 Time-dependent renewal theory
Example: integrate-and-fire model with reset, time-dependent input.
[Figure: escape rate, survivor function (starting at 1), and interval distribution as functions of the time since the last spike]

Neuronal Dynamics – Homework assignment
Neuron with relative refractoriness, constant input.
[Figure: escape rate, survivor function (starting at 1), and interval distribution]

Neuronal Dynamics – 8.5 Firing probability in discrete time
Probability to survive one time step; probability to fire in one time step.
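The formulas were shown as images; with time step \Delta t and escape rate \rho(t_k) in bin k they take the standard form

P_{\mathrm{survive}} = \exp\!\big(-\rho(t_k)\,\Delta t\big), \qquad P_F(t_k) = 1 - \exp\!\big(-\rho(t_k)\,\Delta t\big) \;\approx\; \rho(t_k)\,\Delta t \quad \text{for } \rho\,\Delta t \ll 1 .

This is also all that is needed to simulate a leaky integrate-and-fire neuron with escape noise. A minimal Python sketch, assuming an exponential escape rate; the function name and all parameter values are illustrative, not taken from the slides:

import numpy as np

def simulate_escape_lif(I, dt=1e-4, tau_m=10e-3, R=1.0,
                        u_rest=-65e-3, u_reset=-70e-3,
                        theta=-50e-3, delta_u=2e-3, rho0=250.0,
                        rng=None):
    """Leaky integrate-and-fire with escape noise (soft threshold).
    I: array of input currents, one value per time step (units such that R*I is in volts).
    Returns the voltage trace and the spike times in seconds."""
    rng = np.random.default_rng() if rng is None else rng
    u = u_rest
    u_trace, spikes = np.empty(len(I)), []
    for k, I_k in enumerate(I):
        # deterministic leaky integration (forward Euler)
        u += dt / tau_m * (-(u - u_rest) + R * I_k)
        # exponential escape rate and discrete-time firing probability
        rho = rho0 * np.exp((u - theta) / delta_u)
        if rng.random() < 1.0 - np.exp(-rho * dt):
            spikes.append(k * dt)
            u = u_reset          # reset after the spike
        u_trace[k] = u
    return u_trace, np.array(spikes)

# usage: constant drive of 20 mV (R*I) for one second
u, spk = simulate_escape_lif(20e-3 * np.ones(10000))
print(len(spk), 'spikes in 1 s')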

Neuronal Dynamics – 8.5 Escape noise: experiments
[Figure: escape rate extracted from experimental data]
Jolivet et al., J. Comput. Neurosci., 2006

Neuronal Dynamics – 8.5 Renewal process, firing probability
Escape noise = stochastic intensity
- renewal theory: hazard function, survivor function, interval distribution
- time-dependent renewal theory
- discrete-time firing probability
- link to experiments → basis for modern methods of neuron model fitting
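The transcript does not spell out the link to model fitting; the key quantity (standard for point processes with stochastic intensity \rho(t), not shown explicitly on the slide) is the log-likelihood of an observed spike train t^{(1)},\dots,t^{(n)} in [0,T],

\log L \;=\; \sum_{f=1}^{n} \log \rho\big(t^{(f)}\big) \;-\; \int_0^T \rho(t)\,dt ,

which can be maximized with respect to the parameters of the neuron model (e.g. a generalized linear model / SRM with exponential escape rate).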

Biological Modeling of Neural Networks, Week 8 – Noisy input models: Barrage of spike arrivals. Wulfram Gerstner, EPFL, Lausanne, Switzerland.
Week 8, part 6: Comparison of noise models
8.1 Variation of membrane potential - white noise approximation
8.2 Autocorrelation of Poisson
8.3 Noisy integrate-and-fire - superthreshold and subthreshold
8.4 Escape noise - stochastic intensity
8.5 Renewal models
8.6 Comparison of noise models

Neuronal Dynamics – 8.6 Comparison of noise models
- escape process (fast noise): escape rate
- stochastic spike arrival (diffusive noise): noisy integration
[Figure: membrane potential u(t) vs. time t for the two noise models]

Assumption: stochastic spiking with Poisson spike arrival at a given rate.
Mean and autocorrelation of the filtered signal: the mean and the autocorrelation of the output follow from the autocorrelation of the input and the filter.
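The formulas themselves were displayed as images; for a homogeneous Poisson spike train S(t) = \sum_f \delta(t - t^f) of rate \nu filtered by a causal kernel f, i.e. x(t) = \int_0^\infty f(s)\,S(t-s)\,ds, the standard results (Campbell's theorem) are

\langle x \rangle = \nu \int_0^\infty f(s)\,ds, \qquad \langle x(t)\,x(t+\tau)\rangle - \langle x\rangle^2 = \nu \int_0^\infty f(s)\,f(s+|\tau|)\,ds ,

consistent with the Poisson autocorrelation \langle S(t)\,S(t+\tau)\rangle = \nu^2 + \nu\,\delta(\tau) from part 8.2.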

Stochastic spike arrival: excitation with total rate R_e and inhibition with total rate R_i; each spike arrival triggers a synaptic current pulse (EPSC or IPSC) that moves the membrane potential u.
Diffusive noise (stochastic spike arrival) → Langevin equation, Ornstein-Uhlenbeck process. (Blackboard.)
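The Langevin equation derived on the blackboard is not in the transcript; in the diffusion approximation it takes the Ornstein-Uhlenbeck form (the scaling of the noise term below is one common convention, not taken from the slide)

\tau_m \frac{du}{dt} = -\big(u - u_{\mathrm{rest}}\big) + R\,I^{\mathrm{det}}(t) + \sigma\sqrt{\tau_m}\,\xi(t), \qquad \langle \xi(t)\rangle = 0,\quad \langle \xi(t)\,\xi(t')\rangle = \delta(t-t'),

where the mean drive and the noise amplitude \sigma are set by the rates R_e, R_i and the amplitudes of the excitatory and inhibitory postsynaptic potentials.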

Diffusive noise (stochastic spike arrival). Math argument: no threshold; the trajectory starts at a known value.

Diffusive noise (stochastic spike arrival) Math argument
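With the convention above and a trajectory started at u(\hat t) = u_r under constant input I_0, the Ornstein-Uhlenbeck process gives (a standard result, consistent with the Gaussian densities on the following slides)

\langle u(t)\rangle = u_\infty + (u_r - u_\infty)\,e^{-(t-\hat t)/\tau_m}, \qquad u_\infty = u_{\mathrm{rest}} + R I_0,

\mathrm{Var}\big[u(t)\big] = \frac{\sigma^2}{2}\Big(1 - e^{-2(t-\hat t)/\tau_m}\Big),

and u(t) is Gaussian at every time t, because it is a linearly filtered version of Gaussian white noise.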

Neuronal Dynamics – 8.6 Diffusive noise / stochastic arrival
A) No threshold, stationary input: noisy integration gives a Gaussian membrane potential density.
[Figure: Gaussian membrane potential density p(u) for constant input rates, no threshold]

Neuronal Dynamics – 8.6 Diffusive noise / stochastic arrival
B) No threshold, oscillatory input: the membrane potential density p(u(t)) is Gaussian at each time t.
[Figure: Gaussian membrane potential density at time t]

Neuronal Dynamics – 8.6 Diffusive noise / stochastic arrival
C) With threshold and reset, stationary input.
[Figure: membrane potential density p(u)]

Neuronal Dynamics – 8.6 Diffusive noise / stochastic arrival
Superthreshold vs. subthreshold regime: in the subthreshold regime the membrane potential distribution p(u) is nearly Gaussian.
Image: Gerstner et al. (2013), Cambridge Univ. Press; see also König et al. (1996).

Neuronal Dynamics – 8.6 Comparison of noise models
A) escape process (fast noise): escape rate
C) stochastic spike arrival (diffusive noise): noisy integration; noise white (fast noise) or synaptic (slow noise) (Brunel et al., 2001)
Stationary input: mean ISI and mean firing rate from the first-passage-time problem (Siegert 1951).
[Figure: membrane potential u(t), interval distribution, survivor function, and escape rate vs. time t]
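The first-passage-time result was shown as an image; for the Ornstein-Uhlenbeck convention used above, with threshold \vartheta, reset u_r and stationary mean u_\infty, the classic Siegert formula for the mean interspike interval under stationary input is

\langle T \rangle = \tau_m \sqrt{\pi} \int_{(u_r - u_\infty)/\sigma}^{(\vartheta - u_\infty)/\sigma} e^{x^2}\,\big(1 + \operatorname{erf}(x)\big)\,dx ,

and the mean firing rate is its inverse (with an absolute refractory period added to \langle T \rangle, if one is included).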

Neuronal Dynamics – 8.6 Comparison of noise models
Diffusive noise: distribution of the membrane potential; mean interspike interval FOR CONSTANT INPUT; the time-dependent case is difficult.
Escape noise: time-dependent interval distribution.

Noise models: from diffusive noise to escape rates.
[Figure: stochastic spike arrival (diffusive noise, noisy integration) mapped onto an effective escape rate]

Comparison: diffusive noise vs. escape rates.
[Figure: subthreshold potential and probability of the first spike, diffusive vs. escape noise]
Plesser and Gerstner (2000)

Neuronal Dynamics – 8.6 Comparison of noise models
Diffusive noise: represents stochastic spike arrival; easy to simulate; hard to calculate.
Escape noise: represents internal noise; easy to simulate; easy to calculate; approximates diffusive noise; basis of modern model-fitting methods.

Neuronal Dynamics – Quiz 8.4
A. Consider a leaky integrate-and-fire model with diffusive noise:
[ ] The membrane potential distribution is always Gaussian.
[ ] The membrane potential distribution is Gaussian for any time-dependent input.
[ ] The membrane potential distribution is approximately Gaussian for any time-dependent input, as long as the mean trajectory stays ‘far’ away from the firing threshold.
[ ] The membrane potential distribution is Gaussian for stationary input in the absence of a threshold.
[ ] The membrane potential distribution is always Gaussian for constant input and fixed noise level.
B. Consider a leaky integrate-and-fire model with diffusive noise for time-dependent input. The figure from an earlier slide shows that
[ ] The interspike interval distribution is maximal where the deterministic reference trajectory is closest to the threshold.
[ ] The interspike interval distribution vanishes for very long intervals if the deterministic reference trajectory has stayed close to the threshold before - even if at long intervals it is very close to the threshold.
[ ] If there are several peaks in the interspike interval distribution, peak n is always of smaller amplitude than peak n-1.
[ ] I would have ticked the same boxes (in the list of three options above) for a leaky integrate-and-fire model with escape noise.