Philosophy of the spike
Romain Brette, École Normale Supérieure, Paris

The question: Is neural computation based on spikes or on firing rates? The goal of this talk is to understand the question!

Three statements I have heard:
1. “Both rate and spike timing are important for coding, so the truth is in between.”
2. “Neural responses are variable in vivo, therefore neural codes can only be based on rates.”
3. “A stochastic spike-based theory is nothing else than a rate-based theory, only at a finer timescale.”

“Both rate and spike timing are important for coding, so the truth is in between”

The « golden mean » (Aristotle): between two extreme positions, an intermediate one must be true. This is also known as « the golden mean fallacy ». Extreme position A: there is a God. Extreme position B: there is no God. Conclusion: there is half a God! Are rate-based and spike-based views really two extreme positions of the same nature?

Of spikes and rates. A spike is a well-defined timed event, the basis of neural interaction. A rate is an abstract concept defined on spikes: e.g. a temporal average, a spatial average (defined in a large-N limit), or a probabilistic expectation. The rate-based postulate: this concept/approximation captures everything relevant about neural activity. The spike-based view: this postulate is not correct. This does not mean that « rate » is irrelevant!
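
To make these definitions concrete, here is a minimal sketch (my own illustration, not from the talk; the 5 Hz Poisson neuron, the function names, and all parameter values are assumptions) of rate as a temporal average and as a trial-averaged expectation:

```python
# A minimal sketch of two of the usual rate estimators, assuming spike
# trains are given as arrays of spike times in seconds.
import numpy as np

rng = np.random.default_rng(0)
T = 10.0  # trial duration (s)

def temporal_average(spike_times, T):
    """Rate as a temporal average: spike count divided by duration."""
    return len(spike_times) / T

def trial_average(trials, t, dt):
    """Rate as an expectation, estimated by averaging over trials (PSTH bin)."""
    counts = sum(np.histogram(tr, bins=[t, t + dt])[0][0] for tr in trials)
    return counts / (len(trials) * dt)

# Example: 20 trials of a homogeneous Poisson neuron firing at 5 Hz.
trials = [np.sort(rng.uniform(0, T, rng.poisson(5.0 * T))) for _ in range(20)]
print(temporal_average(trials[0], T))   # ~5 Hz, from a single trial
print(trial_average(trials, 2.0, 0.5))  # ~5 Hz, from the ensemble around t = 2 s
```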

Rate in spike-based theories:
1) Spike-based computation requires spikes.
2) More spikes, more computation.
3) Therefore, firing rate determines the quantity of information.
Spike-based view: rate determines the quantity of information. Rate-based view: rate determines the content of information.

The tuning curve: firing rate varies with stimulus properties. Rate-based reading: the firing rate « encodes » direction. Spike-based reading: the neuron spends more energy at its « preferred » direction (rate is a correlate of computation). The question is not « is firing rate or spike timing more informative/useful? » but « which one is the basis of computation? »
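
For reference, a minimal sketch of a tuning curve (a standard half-rectified cosine model of direction tuning; an illustration of my own, with all parameter values assumed, not taken from the talk):

```python
# A toy direction-tuning curve: mean firing rate as a function of stimulus
# direction, peaking at the neuron's preferred direction.
import numpy as np

def tuning_curve(theta, r_max=30.0, r_0=5.0, theta_pref=np.pi / 2):
    """Mean firing rate (Hz) for stimulus direction theta (rad)."""
    return r_0 + (r_max - r_0) * np.maximum(np.cos(theta - theta_pref), 0.0)

directions = np.linspace(0, 2 * np.pi, 8, endpoint=False)
print(np.round(tuning_curve(directions), 1))  # peaks at theta_pref
```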

“Both rate and spike timing are important for coding, so the truth is in between.” Answer: in the spike-based view, rate determines the quantity of information; in the rate-based view, rate determines the content of information. These are not two positions on the same axis, so there is no « in between ».

“Neural responses are variable in vivo, therefore neural codes can only be based on rates”

Neural variability. Temporal irregularity: interspike interval (ISI) distributions, e.g. in V1, are close to Poisson statistics [figure: ISI histogram vs. rate (Hz), V1]. Rate-based view: spike trains have Poisson statistics (an ad hoc hypothesis). Spike-based view: spike trains have Poisson statistics (maximum information). Lack of reproducibility: empirically questionable, and could result from an uncontrolled variable. But let's assume it is true and examine the argument!
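
A minimal sketch (assuming homogeneous Poisson firing; rate and duration are arbitrary choices of mine) of why Poisson statistics are the benchmark for irregular spiking: the coefficient of variation (CV) of the interspike intervals is close to 1.

```python
# Poisson ISIs are exponentially distributed, so their CV is ~1.
import numpy as np

rng = np.random.default_rng(1)
rate, T = 10.0, 1000.0                # firing rate (Hz), duration (s)

isis = rng.exponential(1.0 / rate, size=int(rate * T))  # Poisson ISIs
cv = isis.std() / isis.mean()
print(f"CV of ISIs: {cv:.3f}")        # ~1.0, the Poisson signature
# A regular, clock-like neuron would instead have CV close to 0.
```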

No reproducibility => rate-based? Lack of reproducibility implies that the dynamics are either stochastic or chaotic. This is about stochastic/chaotic vs. deterministic, not about rate-based vs. spike-based. The implicit logic: the responses of N neurons are irreproducible => there exist N dynamic quantities that 1) completely characterize the state of the system and its evolution, and 2) determine the probability of firing of the neurons. This is pure speculation!

A counter-example: sparse coding. Imagine you want to code a signal with the spike trains of N neurons, so that the signal can be reconstructed by summing the PSPs. The problem is degenerate, so there are many solutions (each with a given rate).

A counter-example (continued). Because the problem is degenerate, very different spike trains reconstruct the same signal equally well: single-neuron responses need not be reproducible even though the code is precise. A toy version of this construction is sketched below.
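
This is a minimal sketch of the degeneracy, a toy construction of my own rather than the talk's actual model: spikes are placed whenever the summed-PSP reconstruction falls too far below the signal, and the choice of which neuron fires is left arbitrary. Two runs then produce different spike trains with the exact same reconstruction.

```python
# Toy sparse coding: N neurons emit spikes whose summed exponential PSPs
# track a target signal. Which neuron fires is degenerate (random), so
# individual spike trains vary across runs while the reconstruction does not.
import numpy as np

dt, T, N, tau = 0.001, 1.0, 5, 0.02             # step (s), duration, neurons, PSP decay
t = np.arange(0, T, dt)
signal = 1.0 + 0.5 * np.sin(2 * np.pi * 3 * t)  # arbitrary target signal

def encode(seed):
    rng = np.random.default_rng(seed)
    x = 0.0                                     # current reconstruction (PSP sum)
    spikes = [[] for _ in range(N)]
    recon = np.zeros_like(t)
    for i, ti in enumerate(t):
        x *= np.exp(-dt / tau)                  # PSPs decay exponentially
        if signal[i] - x > 0.5:                 # fire when the error gets too large
            spikes[rng.integers(N)].append(ti)  # degenerate: any neuron will do
            x += 1.0                            # each spike adds one unit PSP
        recon[i] = x
    return spikes, recon

spikes_a, recon_a = encode(seed=1)
spikes_b, recon_b = encode(seed=2)
print([len(s) for s in spikes_a], [len(s) for s in spikes_b])  # different trains
print(np.max(np.abs(recon_a - recon_b)))        # 0.0: identical reconstruction
```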

The argument strikes back: do rate-based theories account for neural variability? 1) Rate-based theories are deterministic. 2) A deterministic description is obtained by averaging, i.e. by removing variability. So rate-based theories do not account for neural variability; they merely acknowledge that there is neural variability. To account for the variability of spike trains requires spikes, i.e. a stochastic/chaotic spike-based theory.

“Neural responses are variable in vivo, therefore neural codes can only be based on rates.” Answer: rate-based theories do not account for neural variability; they acknowledge that there is neural variability and postulate that it is irrelevant (averaging). To account for the variability of spike trains requires spikes, i.e. a stochastic/chaotic spike-based theory.

“A stochastic spike-based theory is nothing else than a rate-based theory, only at a finer timescale”

In terms of stimulus-response properties, there is about the same information in the time-varying rate as in the spikes. Rate-based postulate: 1) for each neuron, there exists a private quantity r(t) whose evolution depends only on the other quantities r_i(t), i.e. r = f(r_1, r_2, ..., r_n); 2) spike trains are derived stochastically from r(t) only. It is assumed that r(t) is approximately the same for all realizations.

“A stochastic spike-based theory is nothing else than a rate-based theory, only at a finer timescale.” Under the rate-based postulate (r = f(r_1, r_2, ..., r_n), with spike trains derived stochastically from r(t) alone), the implication is that spike trains are realizations of independent random processes, with a source of stochasticity entirely intrinsic to the neuron. This has nothing to do with the timescale! A sketch of this structure follows.
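
A minimal sketch (an illustration of my own, with an assumed sinusoidal r(t)) of what the postulate asserts: spike trains as independent realizations of a shared time-varying rate, generated here by discrete-time binning. Note that shrinking the bin width changes the timescale but not the independence structure, which is the point.

```python
# Two spike trains from the same r(t): similar statistics, private randomness.
import numpy as np

rng = np.random.default_rng(3)
dt = 0.001                                    # 1 ms bins
t = np.arange(0.0, 1.0, dt)
r = 20.0 + 15.0 * np.sin(2 * np.pi * 2 * t)   # shared rate r(t), in Hz

# Each neuron draws its own randomness; the only shared quantity is r(t).
spikes_1 = rng.random(t.size) < r * dt
spikes_2 = rng.random(t.size) < r * dt
print(spikes_1.sum(), spikes_2.sum())         # similar counts (~20 each)
print(np.sum(spikes_1 & spikes_2))            # few coincidences: independence
```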

Reformulating the question

Is neural computation based on spikes or on firing rates? Reformulated: can neural activity and computation be entirely and consistently described by the dynamics of time-varying rates in the network? Spelling out the rate-based postulate: 1) for each neuron, there exists a private quantity r(t) whose evolution depends only on the other quantities r_i(t); 2) r_i(t) is the expected firing probability of neuron i; 3) spike trains (realizations) depend on r(t) only, through a private stochastic process (independent neurons).

Testing the postulate. Example 1: random networks. If the postulate is true, then the r_i(t) can be found by writing self-consistent equations (cf. Brunel); this works for sparse random networks, but not in general. A sketch of the fixed-point idea is given below. Example 2: sparse coding. Signal reconstruction from spikes is more accurate than from rates.
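
The following is a minimal sketch of the self-consistency idea only: a generic illustration of my own with an assumed sigmoidal transfer function and made-up parameters, not Brunel's actual mean-field equations. The network rate is the fixed point of "output rate = transfer function of recurrent input".

```python
# Self-consistent rate via fixed-point iteration: solve r* = f(r*).
import numpy as np

def f(r, w=0.8, r_max=50.0, theta=10.0, beta=0.3):
    """Assumed transfer function: output rate (Hz) given recurrent input w*r."""
    return r_max / (1.0 + np.exp(-beta * (w * r - theta)))

r = 1.0                       # initial guess (Hz)
for _ in range(100):          # iterate the map until it settles
    r = f(r)
print(f"self-consistent rate: {r:.2f} Hz")
```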

Marr’s three levels. The three levels of analysis of an information-processing system: 1. the computational level; 2. the algorithmic/representational level; 3. the physical level (Marr, 1982, Vision, MIT Press). « Rate-based computation » is the postulate that levels #2 and #3 are independent. The postulate is methodological (convenient), not based on either evidence or reasoning.

Neurons: actors or observers? Two metaphors: the coding metaphor (the neuron observes and represents its environment) and the acting metaphor (the neuron acts on its environment).

The chaos argument: “Neural networks are chaotic, therefore neural codes can only be based on rates” (a special case of the variability argument). Given a number of dynamical variables v_1, v_2, ..., v_n whose exact evolution is unpredictable, treat them the same as random variables. Main problem: chaos is deterministic [figure: Lorenz attractor].

The chaos argument transposed to climate: climate equations are chaotic, therefore one could replace the variables by random variables without loss. This is wrong: 1) in the Lorenz equations, the variables are still constrained to a deterministic set (the Lorenz attractor); 2) you can make short-term predictions that are better than seasonal means. The sketch below illustrates both points.
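
A minimal sketch using the standard Lorenz-63 system (an illustration of my own, with simple Euler integration and an assumed perturbation size): two nearby initial conditions stay close at first (short-term predictability), then diverge to the size of the attractor, yet the trajectories never leave the attractor.

```python
# Deterministic chaos still permits short-term prediction.
import numpy as np

def lorenz_step(state, dt=0.01, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    """One Euler step of the Lorenz-63 equations."""
    x, y, z = state
    dx = sigma * (y - x)
    dy = x * (rho - z) - y
    dz = x * y - beta * z
    return state + dt * np.array([dx, dy, dz])

a = np.array([1.0, 1.0, 1.0])
b = a + np.array([1e-6, 0.0, 0.0])    # tiny perturbation of the initial state
for step in range(1, 3001):
    a, b = lorenz_step(a), lorenz_step(b)
    if step in (100, 1000, 3000):     # t = 1, 10, 30 model time units
        print(step, np.linalg.norm(a - b))
# The separation is tiny at first (prediction beats the "seasonal mean"),
# then saturates at the attractor's size, but stays on the attractor.
```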