Effects of noisy adaptation on neural spiking statistics
Tilo Schwalger, Benjamin Lindner, Karin Fisch, Jan Benda
Max Planck Institute for the Physics of Complex Systems, Dresden / LMU, Munich
Stochastic Models in Neuroscience, Marseille 2010

Neurons show spike-frequency adaptation
Intrinsic negative feedback (Gabbiani & Krapp, J Neurophysiol 2006)

Outline
What is the source of noise?
● Strategy and introduction of the model
● ISI statistics for two cases:
   i. “deterministic adaptation” + stochastic receptor current
   ii. “stochastic adaptation” + deterministic receptor current
● “mixed case”

What is the source of variability? – The strategy
● Introduce variability through stochastic ion currents (channel noise), e.g. when modelling signal transduction in auditory receptor cells:
   ● receptor current
   ● sodium current
   ● potassium (delayed-rectifier) current
   ● adaptation current (M-type)
● Study separately the effect of each specific channel noise on the interspike interval (ISI) statistics:
   ● ISI density, firing rate, coefficient of variation (CV), skewness, excess, ...
   ● ISI serial correlation coefficient
● Compare the different predictions with experimental data
● Here: stochastic adaptation current vs. stochastic receptor current in a simplified model
[Figure: ISI probability density vs. interspike interval (ms)]
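
As a concrete reference for the statistics listed above, here is a minimal sketch in Python of how they can be estimated from a recorded spike train; the function name and dictionary keys are illustrative choices, not part of the talk.

```python
import numpy as np

def isi_statistics(spike_times):
    """Estimate the ISI statistics listed on the slide from sorted spike times."""
    isi = np.diff(np.asarray(spike_times))        # interspike intervals
    mean, std = isi.mean(), isi.std()
    z = (isi - mean) / std                        # standardized intervals
    return {
        "rate": 1.0 / mean,                       # firing rate
        "cv": std / mean,                         # coefficient of variation
        "skewness": np.mean(z ** 3),              # third standardized moment
        "excess": np.mean(z ** 4) - 3.0,          # excess kurtosis
        "rho_1": np.corrcoef(isi[:-1], isi[1:])[0, 1],  # serial correlation at lag 1
    }
```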

M-type adaptation current
● Two-state channel model (voltage-gated)
● Adaptation current carried by a finite number of channels: gating variable W = (# open M-channels) / (# M-channels)
● Voltage-gated potassium current
● Slow kinetics
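
A minimal sketch, assuming a population of independent two-state channels with constant opening/closing rates, of the finite-size channel noise this slide refers to; channel numbers and rate constants below are illustrative placeholders, not values from the talk.

```python
import numpy as np

rng = np.random.default_rng(0)

def m_channel_fraction(n_channels=100, alpha=0.02, beta=0.05, dt=0.1, steps=20000):
    """Fraction of open channels in a two-state population (opening rate alpha,
    closing rate beta, per ms); fluctuations shrink as n_channels grows."""
    n_open = 0
    w = np.empty(steps)
    for t in range(steps):
        opened = rng.binomial(n_channels - n_open, alpha * dt)   # closed -> open
        closed = rng.binomial(n_open, beta * dt)                 # open -> closed
        n_open += opened - closed
        w[t] = n_open / n_channels
    return w
```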

Perfect integrate-and-fire model
● Stochastic adaptation current (M-channel activation: fraction of open M-channels W)
● White noise: fast receptor channels
● Membrane potential V integrates up to a threshold and is then reset
[Figure: membrane potential V, fraction of open M-channels W, and resulting firing rate vs. time (ms)]
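
For orientation, a simulation sketch of a perfect integrate-and-fire neuron of this general structure. Here the adaptation variable decays deterministically between spikes and jumps at each spike, and the fast receptor channels are lumped into white noise; replacing the adaptation variable with the finite-channel fraction from the sketch above would give the stochastic-adaptation case. All parameter names and values are illustrative assumptions, not taken from the talk.

```python
import numpy as np

rng = np.random.default_rng(1)

def simulate_pif(mu=1.0, D=0.1, tau_a=100.0, delta_a=0.2, g_a=1.0,
                 v_th=1.0, dt=0.01, t_max=5000.0):
    """Perfect integrate-and-fire neuron with adaptation and white current noise.

    dV/dt = mu - g_a*a + sqrt(2D)*xi(t); at threshold v_th: spike, V -> 0, a -> a + delta_a;
    between spikes the adaptation variable decays, da/dt = -a/tau_a.
    Returns the spike times (ms).
    """
    v, a = 0.0, 0.0
    spikes = []
    noise_amp = np.sqrt(2.0 * D * dt)
    for i in range(int(t_max / dt)):
        v += (mu - g_a * a) * dt + noise_amp * rng.standard_normal()
        a -= a / tau_a * dt
        if v >= v_th:
            spikes.append(i * dt)
            v = 0.0
            a += delta_a
    return np.array(spikes)
```

Feeding the resulting spike times into isi_statistics from the earlier sketch yields the kind of ISI summary statistics the talk compares.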

Diffusion approximation of channel noise
● Adaptation is split into a deterministic part and a noise part
● Additive-noise approximation
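
A hedged sketch of what such a diffusion approximation looks like for a two-state channel population with opening rate α(V), closing rate β(V) and N_a channels; the specific parametrisation used in the talk may differ:

```latex
% Fraction of open channels w(t) in the diffusion (Langevin) approximation:
\frac{dw}{dt} = \alpha(V)\,(1-w) - \beta(V)\,w
  + \sqrt{\frac{\alpha(V)\,(1-w) + \beta(V)\,w}{N_a}}\;\xi(t),
\qquad \langle \xi(t)\,\xi(t')\rangle = \delta(t-t').
% Splitting w = w_det + eta and evaluating the noise amplitude along the deterministic
% solution w_det gives the additive-noise approximation referred to on the slide.
```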

Two limit cases
● Deterministic adaptation
● Stochastic adaptation

Deterministic adaptation (N_a → ∞)
● Mean adaptation approximation: the adaptation current is replaced by its mean, giving a reduced effective input current
● Valid when the adaptation time constant is much larger than the mean ISI
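
In the notation of the simulation sketch above (illustrative symbols, not necessarily those of the talk), the mean adaptation approximation amounts to:

```latex
\langle a \rangle \approx r\,\Delta_a\,\tau_a
\quad\Longrightarrow\quad
\mu_{\rm eff} = \mu - g_a\,r\,\Delta_a\,\tau_a,
\qquad
r = \frac{\mu_{\rm eff}}{v_T} = \frac{\mu}{v_T + g_a\,\Delta_a\,\tau_a},
```

valid when τ_a is much larger than the mean ISI, so that a(t) stays close to its mean.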

Deterministic adaptation (N_a → ∞)
● Inverse Gaussian ISI density (Gerstein & Mandelbrot, 1964)
● Negative ISI correlations
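
For reference, the first-passage-time density of a perfect integrate-and-fire neuron with effective drift μ_eff, white-noise intensity D and threshold v_T is the inverse Gaussian of Gerstein & Mandelbrot (symbols as in the sketches above):

```latex
p(T) = \frac{v_T}{\sqrt{4\pi D\,T^{3}}}\,
  \exp\!\left[-\frac{(v_T-\mu_{\rm eff}\,T)^{2}}{4\,D\,T}\right],
\qquad T>0,
```

i.e. an inverse Gaussian with mean v_T/μ_eff and CV = sqrt(2D/(μ_eff v_T)).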

Stochastic adaptation current acts as colored noise
● Slow modulation of the instantaneous firing rate due to the slow noise process η(t)
● Averaging over the fast dynamics
→ Colored-noise-driven PIF neuron
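
A sketch of the resulting effective model in generic symbols (an assumption about the parametrisation; the talk's own notation may differ): a PIF neuron driven by an Ornstein-Uhlenbeck process whose correlation time is set by the slow adaptation kinetics,

```latex
\dot{V} = \mu_{\rm eff} + \eta(t), \qquad
\tau_a\,\dot{\eta} = -\eta + \sqrt{2\sigma^{2}\tau_a}\;\xi(t), \qquad
V \ge v_T:\ \text{spike},\ V \to 0,
```

with white noise ξ(t), so that ⟨η(t)η(t')⟩ = σ² e^{−|t−t'|/τ_a} is slow on the time-scale of the mean ISI.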

Colored noise model captures ISI density

ISI density (small-noise theory): B. Lindner, Phys Rev E 2004

Stochastic adaptation yields positive ISI correlations (B. Lindner, Phys Rev E 2004)

ISI density – comparison
● Deterministic adaptation vs. stochastic adaptation, same CV
● Inverse Gaussian fails!

How to discriminate from an inverse Gaussian density?
● Skewness
● Excess (kurtosis)
● Define: inverse Gaussian distribution (see the reference formulas below)
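
For reference, standard properties of the inverse Gaussian distribution (not specific to the talk), written with mean ISI μ_T and shape parameter λ:

```latex
p_{\rm IG}(T) = \sqrt{\frac{\lambda}{2\pi T^{3}}}\,
  \exp\!\left[-\frac{\lambda\,(T-\mu_T)^{2}}{2\,\mu_T^{2}\,T}\right],
\qquad
C_V = \sqrt{\mu_T/\lambda}, \quad
\text{skewness} = 3\,C_V, \quad
\text{excess} = 15\,C_V^{2}.
```

Hence the rescaled skewness (skewness / 3C_V) and rescaled excess (excess / 15C_V²) both equal 1 for an inverse Gaussian; deviations from 1 indicate a different ISI density.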

Separation of deterministic and stochastic adaptation
[Figure: rescaled skewness and rescaled excess]

Dependence on the adaptation time-scale
[Figure: rescaled excess and serial correlation coefficient at lag 1]

Skewness and excess of ISIs – theory
● PIF model driven by colored noise
● Fokker-Planck equation for p(v,η,t) + fire-and-reset rule
● Initial condition
● ISI density
● Variable transformation → Fokker-Planck equation
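
A sketch of this Fokker-Planck equation in the generic notation used above (an assumption; the talk's own variables may differ):

```latex
\partial_t\, p(v,\eta,t) =
  -\,\partial_v\!\big[(\mu_{\rm eff}+\eta)\,p\big]
  + \frac{1}{\tau_a}\,\partial_\eta\!\big[\eta\,p\big]
  + \frac{\sigma^{2}}{\tau_a}\,\partial_\eta^{2}\, p ,
```

with the threshold v = v_T treated as an absorbing boundary whose outflow defines the ISI density and is re-inserted at the reset potential (fire-and-reset rule), and an initial condition placing all probability at the reset value of v at t = 0.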

Skewness and excess of ISIs – theory
● Small-noise expansion

Mixed case – fast and slow fluctuations
● D > 0 fixed, vary the number of channels

Simulation of a Hodgkin-Huxley-type model with M-current
● Modified Traub-Miles model (Ermentrout, 2000)
[Figure: mixed case, serial correlations]

Summary
● Introduced an integrate-and-fire model with a stochastic adaptation current (channel noise)
● Deterministic adaptation current + white current noise: inverse Gaussian ISI distribution and negative serial correlations
● Stochastic adaptation current, no white noise: peaked ISI distribution and positive serial correlations
● The results might be useful for determining the dominant source of noise

Acknowledgements
Karin Fisch (LMU, Munich), Benjamin Lindner (MPIPKS, Dresden), Jan Benda (LMU, Munich)

Variability depends on sound intensity Karin Fisch & Jan Benda, LMU