Ising Models for Neural Data
John Hertz, Niels Bohr Institute and Nordita
Work done with Yasser Roudi (Nordita) and Joanna Tyrcha (SU)
Math Bio Seminar, SU, 26 March 2009
arXiv: v1 (2009)
Background and basic idea: New recording technology makes it possible to record from hundreds of neurons simultaneously. But what to make of all these data? Construct a model of the spike-pattern distribution: find "functional connectivity" between neurons. Here: results for model networks.
Outline
Data
Model and methods, exact and approximate
Results: accuracy of approximations, scaling of functional connections
Quality of the fit to the data distribution
Get Spike Data from Simulations of a Model Network
Two populations in the network: excitatory and inhibitory, with an excitatory external drive.
HH-like neurons, conductance-based synapses.
Random connectivity: the probability of connection between any two neurons is c = K/N, where N is the size of the population and K is the average number of presynaptic neurons.
Results here for c = 0.1, N = 1000.
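The random-connectivity rule above is easy to sketch. A minimal example (NumPy; N, K, and c are the values from the slide, the random seed is arbitrary):

```python
import numpy as np

rng = np.random.default_rng(0)

N = 1000      # population size
K = 100       # average number of presynaptic neurons
c = K / N     # connection probability, c = 0.1 as in the talk

# Each directed pair is connected independently with probability c.
A = (rng.random((N, N)) < c).astype(int)
np.fill_diagonal(A, 0)   # no self-connections

mean_in_degree = A.sum(axis=1).mean()   # should be close to c * (N - 1)
```

The resulting graph is a directed Erdos-Renyi graph; each neuron receives on average about K inputs from within the population.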
[Rasters, tonic input: inhibitory (100 neurons) and excitatory (400 neurons) populations; mean rates 16.1 Hz and 7.9 Hz]
Stimulus modulation: rapidly varying input. The external rate R_ext(t) is filtered white noise with correlation time 100 ms. [Plot of R_ext vs t (sec)]
[Rasters, stimulus-modulated input: inhibitory (100 neurons) and excitatory (400 neurons) populations; mean rates 15.1 Hz and 8.6 Hz]
Correlation coefficients (data in 10-ms bins), for both the tonic and the "stimulus" data. Experiments: cited values of cc ~ 0.01 [Schneidman et al, Nature (2006)].
Modeling the distribution of spike patterns
Have sets of spike patterns {S_i}_k, with S_i = ±1 for spike/no spike (we use 10-ms bins; temporal order irrelevant).
Construct a distribution P[S] that generates the observed patterns (i.e., has the same correlations).
Simplest nontrivial model (Schneidman et al, Nature (2006); Tkačik et al, arXiv:q-bio.NC/): the Ising model, parametrized by J_ij and h_i:
P[S] = (1/Z) exp( Σ_i h_i S_i + ½ Σ_{i≠j} J_ij S_i S_j )
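To make the model concrete, here is a minimal Gibbs sampler that draws ±1 patterns from this distribution (a generic sketch, not code from the talk; the function name and defaults are mine):

```python
import numpy as np

def sample_ising(h, J, n_samples, n_sweeps=50, rng=None):
    """Gibbs-sample ±1 patterns from P[S] ∝ exp(h·S + ½ SᵀJS).

    J is assumed symmetric with zero diagonal.
    """
    rng = rng or np.random.default_rng(0)
    N = len(h)
    S = rng.choice([-1, 1], size=N)
    out = []
    for _ in range(n_samples):
        for _ in range(n_sweeps):          # decorrelate between samples
            for i in range(N):
                # Local field on spin i (exclude any diagonal term).
                field = h[i] + J[i] @ S - J[i, i] * S[i]
                # Conditional probability P(S_i = +1 | rest).
                p_up = 1.0 / (1.0 + np.exp(-2.0 * field))
                S[i] = 1 if rng.random() < p_up else -1
        out.append(S.copy())
    return np.array(out)
```

For independent spins (J = 0) the sampled means approach tanh(h_i), which is a quick sanity check.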
An inverse problem: have the statistics ⟨S_i⟩ and ⟨S_i S_j⟩; want h_i and J_ij.
Exact method: Boltzmann learning,
δh_i = η ( ⟨S_i⟩_data − ⟨S_i⟩_model ),  δJ_ij = η ( ⟨S_i S_j⟩_data − ⟨S_i S_j⟩_model ).
Requires long Monte Carlo runs to compute the model statistics.
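The learning rule above can be sketched directly. For small N the model averages can be computed exactly by enumerating all 2^N states instead of by Monte Carlo (the function names and step size are mine):

```python
import itertools
import numpy as np

def exact_stats(h, J):
    """Exact <S_i> and <S_i S_j> for small N by enumerating all 2^N states."""
    N = len(h)
    states = np.array(list(itertools.product([-1, 1], repeat=N)), dtype=float)
    E = states @ h + 0.5 * np.einsum('ki,ij,kj->k', states, J, states)
    p = np.exp(E - E.max())
    p /= p.sum()
    m = p @ states                      # <S_i>
    C = (states.T * p) @ states         # <S_i S_j>
    return m, C

def boltzmann_learn(m_data, C_data, eta=0.1, n_iter=2000):
    """Gradient ascent on the log-likelihood: the Boltzmann learning rule."""
    N = len(m_data)
    h, J = np.zeros(N), np.zeros((N, N))
    for _ in range(n_iter):
        m, C = exact_stats(h, J)
        h += eta * (m_data - m)
        dJ = eta * (C_data - C)
        np.fill_diagonal(dJ, 0.0)
        J += dJ
    return h, J
```

For real data one replaces `exact_stats` with long Monte Carlo runs, which is exactly why the method is slow and why the approximations below are interesting.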
1. (Naïve) mean field theory
Mean-field equations: m_i = tanh( h_i + Σ_j J_ij m_j )
Inverse susceptibility (inverse correlation) matrix: (C⁻¹)_ij = δ_ij /(1 − m_i²) − J_ij
So, given the correlation matrix C_ij = ⟨S_i S_j⟩ − m_i m_j, invert it, and read off J_ij = −(C⁻¹)_ij (i ≠ j).
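In code, the nMFT estimate is a single matrix inversion; a sketch (the helper name is mine, and h follows from the mean-field equations above):

```python
import numpy as np

def nmft_couplings(S):
    """S: (n_samples, N) array of ±1 spins -> nMFT estimates of h and J."""
    m = S.mean(axis=0)
    C = np.cov(S, rowvar=False)       # connected correlations C_ij
    J = -np.linalg.inv(C)             # J_ij = -(C^-1)_ij for i != j
    np.fill_diagonal(J, 0.0)
    # Invert the mean-field equations m_i = tanh(h_i + sum_j J_ij m_j):
    h = np.arctanh(m) - J @ m
    return h, J
```

This is orders of magnitude faster than Boltzmann learning: one covariance estimate and one matrix inversion, with no Monte Carlo at all.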
2. TAP approximation
Thouless, Anderson, Palmer, Phil Mag 35 (1977); Kappen & Rodriguez, Neural Comp 10 (1998); Tanaka, PRE (1998): "TAP equations" (improved MFT for spin glasses):
m_i = tanh( h_i + Σ_j J_ij m_j − m_i Σ_j J_ij² (1 − m_j²) )
The last term is the Onsager "reaction term". Inverting the correlation matrix now gives a quadratic equation to solve for J_ij:
(C⁻¹)_ij = −J_ij − 2 J_ij² m_i m_j  (i ≠ j)
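Per pair this is the quadratic 2 m_i m_j J² + J + (C⁻¹)_ij = 0; one takes the root that reduces to the nMFT answer J_ij = −(C⁻¹)_ij as the m's go to zero. A sketch of that inversion (root selection and function name are mine):

```python
import numpy as np

def tap_couplings(m, C):
    """TAP estimate: solve 2*m_i*m_j*J^2 + J + (C^-1)_ij = 0 for each pair."""
    Cinv = np.linalg.inv(C)
    N = len(m)
    J = np.zeros((N, N))
    for i in range(N):
        for j in range(N):
            if i == j:
                continue
            a = 2.0 * m[i] * m[j]
            b = Cinv[i, j]
            if abs(a) < 1e-12:
                J[i, j] = -b          # nMFT limit when magnetizations vanish
            else:
                # Root of the quadratic that tends to -b as a -> 0.
                disc = 1.0 - 8.0 * m[i] * m[j] * b
                J[i, j] = (-1.0 + np.sqrt(max(disc, 0.0))) / (4.0 * m[i] * m[j])
    return J
```

The cost is still dominated by one matrix inversion, so TAP is essentially as fast as nMFT while correcting it at second order in the couplings.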
3. Independent-pair approximation
Solve the two-spin problem exactly for each pair, then solve for J:
J_ij = ¼ ln [ p(+,+) p(−,−) / ( p(+,−) p(−,+) ) ],
where p(s_i, s_j) are the joint pair probabilities.
Low-rate limit: with n_i = (1 + S_i)/2 and rates r_i = ⟨n_i⟩, this reduces to J_ij ≈ ¼ ln [ ⟨n_i n_j⟩ / (r_i r_j) ].
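Estimating the pair probabilities directly from data gives the coupling in a few lines; a sketch (the small pseudocount guarding against empty cells is my addition):

```python
import numpy as np

def pair_coupling(Si, Sj, eps=1e-6):
    """Independent-pair J_ij from two ±1 spike trains of equal length."""
    p_pp = np.mean((Si == 1) & (Sj == 1)) + eps
    p_mm = np.mean((Si == -1) & (Sj == -1)) + eps
    p_pm = np.mean((Si == 1) & (Sj == -1)) + eps
    p_mp = np.mean((Si == -1) & (Sj == 1)) + eps
    return 0.25 * np.log(p_pp * p_mm / (p_pm * p_mp))
```

Perfectly correlated trains give a large positive J, anticorrelated trains a large negative J, and independent trains give J near zero.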
4. Sessak-Monasson approximation
A combination of naïve mean field theory and the independent-pair approximation; the last term is subtracted to avoid double-counting the contribution common to both.
Comparing approximations
[Scatter plots of approximate vs Boltzmann couplings for N = 20 and N = 200: nMFT, independent pair, low-rate, TAP, SM, TAP/SM. The winner: TAP/SM]
Error measures
[Error of each approximation vs N: SM/TAP, SM, TAP, nMFT, low-rate, independent pair]
N-dependence: How do the inferred couplings depend on the size N of the set of neurons used in the inference algorithm?
Tracking the largest and smallest J's from N = 20 to N = 200: the relative sizes of the different J's are preserved, while their absolute sizes shrink.
N-dependence of mean and variance of the J's: theory
From MFT for spin glasses (which assumes the J's are iid) in the normal (i.e., not glassy) state, the measured correlations can be related to the statistics of the couplings. Inverting these relations to find the statistics of the J's gives a 1/(const + N) dependence for both the mean and the variance.
N-dependence: theory vs computed
[Mean and standard deviation of the J's vs N: spin-glass theory compared with Boltzmann, TAP, SM/TAP, and SM estimates]
Heading for a spin glass state?
Tkačik et al speculated (on the basis of their data, N up to 40) that the system would reach a spin-glass transition around N = 100.
Criterion for stability of the normal (not SG) phase (de Almeida and Thouless, 1978): in all our results, we always find the stability condition satisfied, i.e., no sign of an approaching spin-glass transition.
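For concreteness, one common form of the de Almeida-Thouless criterion for the SK model is that the normal phase is stable while N·var(J) · (1/N) Σ_i (1 − m_i²)² < 1. That specific form is my paraphrase of the standard spin-glass result, not copied from the talk; a sketch under that assumption:

```python
import numpy as np

def at_stability_measure(m, J):
    """SK-style AT measure; the normal phase is stable while the value is < 1.

    Uses the condition N * var(J_offdiag) * mean((1 - m_i^2)^2) < 1, which is
    a paraphrase of the standard SK-model criterion (an assumption here).
    """
    N = len(m)
    offdiag = J[~np.eye(N, dtype=bool)]
    return N * offdiag.var() * np.mean((1.0 - m ** 2) ** 2)
```

With couplings shrinking like 1/(const + N), this measure can stay below 1 as N grows, consistent with never reaching the glassy phase.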
Quality of the Ising-model fit
The Ising model fits the means and correlations correctly, but it does not generally get the higher-order statistics right.
Quality-of-fit measure: the KL distance d_Ising = Σ_S P_data(S) ln [ P_data(S) / P_Ising(S) ].
Compare with an independent-neuron model (J_ij = 0), with KL distance d_ind.
Goodness-of-fit measure: G = 1 − d_Ising / d_ind (G = 1: perfect fit; G = 0: no better than independent neurons).
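Given the three pattern distributions, these quantities are straightforward to compute; a sketch (reading the slide's ratio as G = 1 − d_Ising/d_ind, and with a small epsilon guard that is my addition):

```python
import numpy as np

def kl(p, q, eps=1e-12):
    """KL distance d = sum_S p(S) ln[p(S)/q(S)] over pattern probabilities."""
    p, q = np.asarray(p, float), np.asarray(q, float)
    mask = p > 0                       # terms with p(S) = 0 contribute nothing
    return float(np.sum(p[mask] * np.log(p[mask] / (q[mask] + eps))))

def goodness_of_fit(p_data, p_ising, p_ind):
    """G = 1 for a perfect Ising fit, G = 0 if no better than independent."""
    return 1.0 - kl(p_data, p_ising) / kl(p_data, p_ind)
```

Because d requires the full empirical pattern distribution, it can only be estimated reliably for small sets of neurons, which is why the next slide is restricted to small samples.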
Results (can only do small samples)
[Plot of G vs N for increasing run times, with extrapolation]
Linear for small N; looks like G → 0 for N ~ 200.
The model misses something essential about the distribution for large N.
Summary
Ising distribution fits means and correlations of neuronal firing.
TAP and SM approximations give good, fast estimates of the functional couplings J_ij.
Spin-glass MFT describes the scaling of the J_ij's with sample size N.
Quality of the fit to the data distribution deteriorates as N grows.
Read more at arXiv: v1 (2009)