

Ising Models for Neural Data
John Hertz, Niels Bohr Institute and Nordita
Work done with Yasser Roudi (Nordita) and Joanna Tyrcha (SU)
Math Bio Seminar, SU, 26 March 2009. arXiv: v1 (2009)

Background and basic idea:
New recording technology makes it possible to record from hundreds of neurons simultaneously.
But what to make of all these data?
Construct a model of the spike-pattern distribution: find the "functional connectivity" between neurons.
Here: results for model networks.

Outline
Data
Model and methods, exact and approximate
Results: accuracy of approximations, scaling of functional connections
Quality of the fit to the data distribution

Get spike data from simulations of a model network
[Diagram: excitatory and inhibitory populations, with excitatory external input]
Two populations in the network: excitatory, inhibitory
Excitatory external drive
HH-like neurons, conductance-based synapses
Random connectivity: the probability of connection between any two neurons is c = K/N, where N is the size of the population and K is the average number of presynaptic neurons
Results here for c = 0.1, N = 1000

[Raster plot, tonic input: inhibitory population (100 neurons), 16.1 Hz; excitatory population (400 neurons), 7.9 Hz]

Stimulus modulation: rapidly-varying input
[Plot of R_ext vs t (sec): filtered white noise, correlation time 100 ms]

[Raster plot, stimulus-modulated input: inhibitory population (100 neurons), 15.1 Hz; excitatory population (400 neurons), 8.6 Hz]

Correlation coefficients (tonic data): data in 10-ms bins, cc ~ ±

Correlation coefficients ("stimulus" data): cc ~ ±
Experiments: cited values of cc ~ 0.01 [Schneidman et al., Nature (2006)]

Modeling the distribution of spike patterns
Have sets of spike patterns {S_i}_k, with S_i = ±1 for spike/no spike (we use 10-ms bins; temporal order irrelevant).
Construct a distribution P[S] that generates the observed patterns (i.e., has the same means and correlations).
Simplest nontrivial model (Schneidman et al., Nature (2006); Tkačik et al., arXiv:q-bio.NC/ ): the Ising model, parametrized by fields h_i and couplings J_ij:
P[S] ∝ exp( Σ_i h_i S_i + Σ_{i<j} J_ij S_i S_j )
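As a concrete toy illustration, the pairwise model above can be written down and normalized explicitly for a handful of neurons. The fields h_i, couplings J_ij, and N = 3 below are made-up numbers, not values from the talk:

```python
import itertools
import numpy as np

def ising_probs(h, J):
    """Enumerate all 2^N patterns S in {-1,+1}^N and return (patterns, P[S])."""
    N = len(h)
    patterns = np.array(list(itertools.product([-1, 1], repeat=N)))
    # Unnormalized log-weight: sum_i h_i S_i + sum_{i<j} J_ij S_i S_j,
    # written as h.S + (1/2) S.J.S for symmetric J with zero diagonal.
    logw = patterns @ h + 0.5 * np.einsum('ki,ij,kj->k', patterns, J, patterns)
    w = np.exp(logw - logw.max())        # stabilize before normalizing
    return patterns, w / w.sum()

# Hypothetical parameters for N = 3 neurons (negative h_i -> low firing rates)
h = np.array([-0.5, -0.5, -0.5])
J = np.array([[0.0, 0.2, 0.0],
              [0.2, 0.0, 0.1],
              [0.0, 0.1, 0.0]])
patterns, p = ising_probs(h, J)
means = p @ patterns                     # model means <S_i>
```

For realistic N this enumeration is hopeless (2^N patterns), which is why the rest of the talk is about approximate inference.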

An inverse problem: have the statistics (means <S_i> and correlations <S_i S_j>); want h_i, J_ij.
Exact method: Boltzmann learning, iterating
δh_i ∝ <S_i>_data - <S_i>_model,   δJ_ij ∝ <S_i S_j>_data - <S_i S_j>_model
until the model statistics match the data.
Requires long Monte Carlo runs to compute the model statistics.
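A minimal sketch of Boltzmann learning under these update rules. For a tiny system we can afford exact enumeration of the model statistics in place of the talk's long Monte Carlo runs; the target parameters below are invented so that recovery can be checked:

```python
import itertools
import numpy as np

def model_stats(h, J):
    """Exact <S_i> and <S_i S_j> of the Ising model by enumeration (small N)."""
    N = len(h)
    S = np.array(list(itertools.product([-1, 1], repeat=N)))
    logw = S @ h + 0.5 * np.einsum('ki,ij,kj->k', S, J, S)
    p = np.exp(logw - logw.max())
    p /= p.sum()
    return p @ S, np.einsum('k,ki,kj->ij', p, S, S)

# Invented target model: its exact statistics play the role of the data.
h_true = np.array([-0.3, 0.1, -0.2])
J_true = np.array([[0.00, 0.25, -0.10],
                   [0.25, 0.00, 0.15],
                   [-0.10, 0.15, 0.00]])
m_data, C_data = model_stats(h_true, J_true)

# Boltzmann learning: ascend the log-likelihood by matching statistics.
h = np.zeros(3)
J = np.zeros((3, 3))
eta = 0.2
for _ in range(5000):
    m, C = model_stats(h, J)
    h += eta * (m_data - m)
    dJ = eta * (C_data - C)
    np.fill_diagonal(dJ, 0.0)            # no self-couplings
    J += dJ
```

Because the log-likelihood is concave in (h, J), this gradient ascent converges to the unique parameters reproducing the data statistics; the cost in practice is estimating m and C by sampling.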

1. (Naïve) mean-field theory
Mean-field equations: m_i = tanh( h_i + Σ_j J_ij m_j )
Inverse susceptibility (inverse correlation) matrix: (C^-1)_ij = δ_ij/(1 - m_i²) - J_ij
So, given the measured correlation matrix, invert it and read off the couplings: J_ij = -(C^-1)_ij for i ≠ j.
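The inversion recipe above amounts to a few lines of linear algebra. This sketch assumes binned ±1 spike data in a (time, neuron) array and uses toy random data (independent "spikes", so the true couplings are zero):

```python
import numpy as np

def nmft_couplings(S):
    """Naive mean-field inversion. S: (T, N) array of +/-1 spike patterns."""
    m = S.mean(axis=0)
    C = np.cov(S, rowvar=False)          # connected correlations <S_iS_j> - m_i m_j
    J = -np.linalg.inv(C)                # off-diagonal of -C^{-1} gives the couplings
    np.fill_diagonal(J, 0.0)
    h = np.arctanh(m) - J @ m            # back out fields from the MF equations
    return h, J

# Toy data: independent +/-1 "spikes"; inferred J's should be near zero.
rng = np.random.default_rng(0)
S = rng.choice([-1, 1], size=(5000, 5))
h, J = nmft_couplings(S)
```

One matrix inversion replaces the entire Monte Carlo loop of Boltzmann learning, which is the appeal of all the approximations that follow.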

2. TAP approximation
"TAP equations" (improved MFT for spin glasses): Thouless, Anderson & Palmer, Phil Mag 35 (1977); Kappen & Rodriguez, Neural Comp 10 (1998); Tanaka, PRE (1998):
m_i = tanh( h_i + Σ_j J_ij m_j - m_i Σ_j J_ij² (1 - m_j²) ), where the last term is the Onsager "reaction term".
Inverting the susceptibility now gives, for i ≠ j, (C^-1)_ij = -J_ij - 2 J_ij² m_i m_j: a quadratic equation to solve for J_ij.
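Assuming the quadratic form above, a sketch of the TAP inversion, taking the root that reduces to the naive mean-field answer J_ij = -(C^-1)_ij as the magnetizations vanish:

```python
import numpy as np

def tap_couplings(S):
    """TAP inversion. S: (T, N) array of +/-1 spike patterns. Returns J."""
    m = S.mean(axis=0)
    Cinv = np.linalg.inv(np.cov(S, rowvar=False))
    N = len(m)
    J = np.zeros((N, N))
    for i in range(N):
        for j in range(N):
            if i == j:
                continue
            a = 2.0 * m[i] * m[j]          # coefficient of J^2 in a J^2 + J + b = 0
            b = Cinv[i, j]
            disc = 1.0 - 4.0 * a * b
            if abs(a) < 1e-12 or disc < 0.0:
                J[i, j] = -b               # fall back to the naive MFT root
            else:
                J[i, j] = (-1.0 + np.sqrt(disc)) / (2.0 * a)
    return J

rng = np.random.default_rng(1)
S = rng.choice([-1, 1], size=(5000, 4))    # toy independent data
J_tap = tap_couplings(S)
```

Expanding the chosen root for small a recovers J = -b, i.e., the naive mean-field coupling, so TAP is a strict refinement at essentially the same cost.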

3. Independent-pair approximation
Solve the two-spin problem exactly: for an isolated pair (i, j) with pair distribution p(S_i, S_j),
J_ij = (1/4) ln [ p(+,+) p(-,-) / ( p(+,-) p(-,+) ) ]
Low-rate limit: for low firing rates this reduces (in 0/1 spike variables) to J_ij ≈ ln [ p_ij / (p_i p_j) ], the log of the ratio of the joint firing probability to its independent-neuron value.
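The two-spin formula can be applied directly to empirical pair counts. A sketch with synthetic spike trains; the correlation level and the tiny pseudocount are arbitrary choices:

```python
import numpy as np

def pair_coupling(Si, Sj):
    """Independent-pair J for +/-1 spike trains Si, Sj."""
    eps = 1e-9                                 # pseudocount avoids log(0)
    ppp = np.mean((Si == 1) & (Sj == 1)) + eps
    pmm = np.mean((Si == -1) & (Sj == -1)) + eps
    ppm = np.mean((Si == 1) & (Sj == -1)) + eps
    pmp = np.mean((Si == -1) & (Sj == 1)) + eps
    return 0.25 * np.log(ppp * pmm / (ppm * pmp))

rng = np.random.default_rng(2)
T = 20000
Si = rng.choice([-1, 1], size=T)
Sj = np.where(rng.random(T) < 0.9, Si, -Si)    # Sj copies Si 90% of the time
J_corr = pair_coupling(Si, Sj)                 # strongly positive coupling
J_ind = pair_coupling(Si, rng.choice([-1, 1], size=T))  # near zero
```

The approximation ignores all other neurons, so it is accurate only when indirect correlations through the rest of the network are weak.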

4. Sessak-Monasson approximation
A combination of naïve mean-field theory and the independent-pair approximation:
J_ij^SM = J_ij^nMFT + J_ij^pair - J_ij^low-rate
(the last term is subtracted to avoid double-counting the contribution that appears in both).

Comparing approximations: scatter plots of approximate vs Boltzmann-learned couplings
[N = 20 and N = 200 panels: nMFT, ind-pair, low-rate, TAP, SM, TAP/SM]
TAP/SM: the winner!

Error measures
[Plots of reconstruction error for nMFT, ind-pair, low-rate, TAP, SM, and SM/TAP]

N-dependence: How do the inferred couplings depend on the size of the set of neurons used in the inference algorithm?
[Scatter plots of the J's inferred at N = 20 vs N = 200, highlighting the largest and smallest J's]
Relative sizes of the different J's are preserved; absolute sizes shrink.

N-dependence of the mean and variance of the J's: theory
From MFT for spin glasses (which assumes iid J's), in the normal (i.e., not glassy) state, write the susceptibility in terms of the statistics of the J's and invert to find the statistics of the inferred J's.
Result: a 1/(const + N) dependence of both the mean and the variance.

[Mean and standard deviation of the J's vs N: Boltzmann, TAP, SM, SM/TAP, and theory compared]

Heading for a spin-glass state?
Tkačik et al. speculated (on the basis of their data, N up to 40) that the system would reach a spin-glass transition around N = 100.
Criterion for stability of the normal (not SG) phase: an AT-type bound on the inferred J's and magnetizations (de Almeida and Thouless, 1978).
In all our results, we always find the stability condition satisfied: no sign of an approaching spin-glass transition.

Quality of the Ising-model fit
The Ising model fits the means and correlations correctly, but it does not generally get the higher-order statistics right.
Quality-of-fit measure: the KL distance d_Ising = Σ_S P_data(S) ln [ P_data(S) / P_Ising(S) ].
Compare with an independent-neuron model (J_ij = 0), with KL distance d_ind.
Goodness-of-fit measure: G = 1 - d_Ising/d_ind (G = 1 for a perfect pairwise fit; G = 0 when the pairwise couplings add nothing beyond independent neurons).
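A sketch of these two measures on a made-up 2-neuron example; the form G = 1 - d_Ising/d_ind is our reading of the talk's ratio, and the distributions below are illustrative (for 2 neurons the pairwise model can fit the data exactly, hence G = 1):

```python
import numpy as np

def kl(p, q):
    """KL distance sum_S p(S) ln(p(S)/q(S)) over the full pattern set."""
    mask = p > 0
    return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

# Hypothetical data distribution over the 4 patterns (++, +-, -+, --) of 2 neurons
p_data = np.array([0.50, 0.20, 0.20, 0.10])
# Independent-neuron fit: product of the data's single-neuron marginals (0.7 each)
p_ind = np.array([0.49, 0.21, 0.21, 0.09])
# For 2 neurons the pairwise (Ising) model can match the data exactly
p_ising = p_data.copy()

d_ind = kl(p_data, p_ind)
d_ising = kl(p_data, p_ising)
G = 1.0 - d_ising / d_ind
```

For larger N both KL distances must be estimated over exponentially many patterns, which is why the results below are restricted to small samples.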

Results (can only do small samples)
[Plots of d_Ising/d_ind and of G vs N, for increasing run times, with extrapolation]
Linear for small N; looks like G → 0 for N ~ 200.
The model misses something essential about the distribution for large N.

Summary
Ising distribution fits means and correlations of neuronal firing.
TAP and SM approximations give good, fast estimates of the functional couplings J_ij.
Spin-glass MFT describes the scaling of the J_ij's with sample size N.
Quality of the fit to the data distribution deteriorates as N grows.
Read more at arXiv: v1 (2009)