What does the synapse tell the axon?
Idan Segev
Interdisciplinary Center for Neural Computation, Hebrew University
Thanks: Miki London, Galit Fuhrman, Adi Shraibman, Elad Schneidman
Outline
Introduction
- Questions in my lab
- A brief history of the synapse and of "synaptic efficacy": what does it mean?
- Complications with "synaptic efficacy"
Information theory and synaptic efficacy
- Basic definitions (entropy, mutual information)
- The "Plug & Play" model
Preliminary results: "synaptic efficacy"
- In simple neuron models
- In passive dendritic structures
- In excitable dendrites
Conclusions
Future questions
Research focus in my group
1. Neuronal "noise" and input-output properties of neurons
   - Ion channels, synaptic noise and AP reliability
   - Optimization of information transmission with noise
2. Nonlinear cable theory
   - Threshold conditions for excitation in excitable dendrites
   - Active propagation in excitable trees
3. "Learning rules" for ion channels and synapses
   - How to build an "H&H" axon?
   - How to "read" synaptic plasticity?
4. The synapse: what does it say?
   - Could dynamic synapses encode the timing of the presynaptic spikes?
   - "Synaptic efficacy": what does it mean?
THE “Synapse”
Motivation: a single synapse matters
"Synaptic efficacy"
- Artificial neural networks: synaptic efficacy reduced to a single number, Wij (Jij).
- Biophysics: utilizing the (average) properties of the PSP (peak, rise time, area, charge, ...).
- Cross-correlation: relating the presynaptic input to the postsynaptic output (the firing probability). How do synaptic properties affect the cross-correlation?
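The biophysical measures listed above can be read directly off a PSP waveform. A minimal sketch, using an alpha-function EPSP; the function, time constant, and the 10-90% definition of rise time are illustrative assumptions, not taken from the talk:

```python
import numpy as np

# Alpha-function EPSP: v(t) = (t/tau) * exp(1 - t/tau), normalized to peak 1 at t = tau.
def alpha_epsp(t, tau=2.0):
    return (t / tau) * np.exp(1.0 - t / tau)

def epsp_measures(t, v):
    """Peak, 10-90% rise time, and area (charge-like integral) of a PSP trace."""
    peak = float(v.max())
    t_peak = t[v.argmax()]
    rising = t <= t_peak                       # monotonic rising phase
    # First crossings of 10% and 90% of the peak on the rising phase
    t10 = t[rising][np.searchsorted(v[rising], 0.1 * peak)]
    t90 = t[rising][np.searchsorted(v[rising], 0.9 * peak)]
    area = float(np.sum(v) * (t[1] - t[0]))    # rectangle-rule integral
    return peak, t90 - t10, area

t = np.linspace(0.0, 50.0, 5001)   # ms
v = alpha_epsp(t)
peak, rise, area = epsp_measures(t, v)
```

The slide's point is precisely that these three numbers can rank the "same" synapse differently, which is why a single scalar efficacy is ambiguous.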
Complications: Who is more “effective” and by how much?
EPSP peak is equal but the rise time is different?
EPSP area is equal but the peak is different?
Complications: Background synaptic activity
Spontaneous in vivo voltage fluctuations in a neuron from the cat visual cortex (L.J. Borg-Graham, C. Monier & Y. Frégnac)
The "Plug & Play" model
(Diagram: a "neuron" receives the tested input together with background activity and noise, and produces an output spike train; the mutual information is computed between the input and the output, given the background activity.)
Compression, Entropy and Mutual Information
(Diagram labels: output spike train → compressed spike train → entropy; output compressed given the known synaptic input → mutual information.)
The mutual information is the extra bits saved by knowing the input.
1. Compress the output spike train (replacing longer strings with shorter ones). Entropy = length of the compressed string (the ratio gives the entropy per character).
2. Given an input:
   - "01" in the input always goes with "01" in the output (namely, given "01" in the input we know the output).
   - Given "001" in the input, the output could be either "000" or ... Thus, a single bit is sufficient to encode which of these two possibilities occurs ("0" for the first case, "1" for the second).
   - The output is now represented with 4 characters (this is all we need to know about the output, assuming the input is known).
   - The number of bits that were "saved" because the input is known is the MI.
Compression → information estimation. We use the CTW (Context Tree Weighting) compression algorithm, the best known today.
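The "bits saved by knowing the input" idea can be sketched with any general-purpose compressor. The sketch below uses Python's zlib as a stand-in for the CTW compressor used in the actual work; the estimator C(y) − (C(x+y) − C(x)) and the toy binary sequences are illustrative assumptions:

```python
import random
import zlib

def clen(data: bytes) -> int:
    """Compressed length in bytes (zlib stands in for the CTW compressor)."""
    return len(zlib.compress(data, 9))

def mi_bits_estimate(x: bytes, y: bytes) -> float:
    """Rough MI estimate in bits: cost of y alone minus cost of y given x,
    with the conditional cost approximated as C(x + y) - C(x)."""
    return 8.0 * (clen(y) - (clen(x + y) - clen(x)))

random.seed(0)
x = bytes(random.getrandbits(1) for _ in range(4000))       # "input" train
y_dependent = x                                             # output fully determined by input
y_independent = bytes(random.getrandbits(1) for _ in range(4000))

mi_dep = mi_bits_estimate(x, y_dependent)    # large: knowing x saves many bits
mi_ind = mi_bits_estimate(x, y_independent)  # near zero: x tells us nothing about y
```

Generic compressors are weaker than CTW on short sequences, so real estimates need the bias corrections the talk alludes to; the sketch only shows the direction of the effect.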
I&F (effect of potentiation)
(Figure: isolated synapse vs. background synapse; threshold; background ×5.)
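The setup on this slide can be sketched as a minimal leaky integrate-and-fire neuron: one tested EPSP delivered on top of Gaussian background noise. All parameters and the pulse-style EPSP below are illustrative assumptions, not the model actually used in the talk:

```python
import numpy as np

rng = np.random.default_rng(1)

dt, T = 0.1, 200.0                      # ms
tau_m, v_th, v_reset = 20.0, 1.0, 0.0   # membrane time constant, threshold, reset

t = np.arange(0.0, T, dt)

def run(epsp_amp=0.0, epsp_t=100.0, noise_sd=0.2):
    """Leaky I&F driven by background noise plus one isolated EPSP.
    Returns the list of spike times (ms)."""
    v = 0.0
    spikes = []
    for ti in t:
        i_in = noise_sd * rng.standard_normal() / np.sqrt(dt)
        if abs(ti - epsp_t) < dt / 2:   # deliver the tested EPSP as a brief pulse
            i_in += epsp_amp / dt
        v += dt * (-v / tau_m + i_in)
        if v >= v_th:                   # threshold crossing -> spike, then reset
            spikes.append(float(ti))
            v = v_reset
    return spikes
```

With the noise switched off, a suprathreshold EPSP produces exactly one spike locked to its onset; with background noise on, the same EPSP only modulates the spike probability, which is why an information measure is needed.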
(I&F) - EPSP parameters and the MI
(Panels: fixed peak; fixed charge.)
Why does the MI correspond best to the EPSP peak?
Input: marking the spikes that follow an EPSP. Only the cases with at least one spike (in a window of 50 ms) are plotted. For the sharp EPSP there were 200 cases (out of 2000 input EPSPs) with at least one spike in the chosen time window; most of these spikes are locked to the EPSP. For the broader EPSP there was a larger total number of spikes (500/2000 in the lower case), but these spikes are less locked to the input. The cross-correlation is essentially the histogram of the raster plot. The peak of the CC is larger for the sharp EPSP, whereas the width (and integral) of the CC is larger for the broad EPSP (more spikes). Can we use the CC to say which is more efficient? This result depends on bin size (3 ms here); e.g., with dT = 15 ms, the lower raster would also appear locked. So why does the MI depend only weakly on bin size?
Sharp EPSP: fewer spikes, more accurate. Smeared EPSP: more spikes, less accurate.
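The cross-correlogram described above is just a histogram of spike times relative to each presynaptic EPSP. A minimal sketch, with toy spike trains (the locked-at-2-ms data, the 50 ms window, and the 3 ms bin are assumptions for illustration):

```python
import numpy as np

def cross_correlogram(epsp_times, spike_times, window=50.0, bin_ms=3.0):
    """Histogram of postsynaptic spike lags relative to each presynaptic EPSP
    (the 'histogram of the raster plot' on the slide)."""
    lags = []
    for te in epsp_times:
        for ts in spike_times:
            d = ts - te
            if 0.0 <= d < window:
                lags.append(d)
    edges = np.arange(0.0, window + bin_ms, bin_ms)
    counts, _ = np.histogram(lags, bins=edges)
    return counts, edges

# Toy data: one spike tightly locked ~2 ms after each of 10 EPSPs.
epsps = np.arange(0.0, 1000.0, 100.0)
spikes = epsps + 2.0
counts, edges = cross_correlogram(epsps, spikes)
```

Re-running with a coarse `bin_ms` (say 15 ms) merges the lag structure into one bin, which is exactly the bin-size sensitivity of the CC that the slide contrasts with the MI.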
Passive Cable with synapses
MI (efficacy) of distal synapses scales with EPSP peak
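For intuition about why distal EPSPs arrive smaller at the soma, the textbook steady-state attenuation along a semi-infinite passive cable is V(x) = V0·exp(−x/λ), with λ = sqrt((Rm/Ri)·(d/4)). A sketch with typical textbook parameter values (the numbers are assumptions, not those of the talk; transient EPSPs attenuate even more strongly than this steady-state bound):

```python
import math

def space_constant(d_cm, Rm=20000.0, Ri=150.0):
    """Space constant lambda (cm) for cable diameter d (cm),
    membrane resistivity Rm (ohm*cm^2), axial resistivity Ri (ohm*cm)."""
    return math.sqrt((Rm / Ri) * (d_cm / 4.0))

def attenuation(x_cm, d_cm):
    """Steady-state voltage ratio V(x)/V0 along a semi-infinite passive cable."""
    return math.exp(-x_cm / space_constant(d_cm))

# A 1 um dendrite has lambda of roughly half a millimeter with these values.
lam = space_constant(1e-4)
```

This is why, in the passive tree, an MI-based efficacy that tracks the somatic EPSP peak automatically penalizes distal synapses.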
MI with Active dendritic currents (Linear synapses)
With active dendrites (Na and I_A):
1. Background activity is boosted.
(i) The proximal synapse transmits less information (the "noise" is larger and the proximal EPSP is almost passive).
(ii) The distal synapse is relatively more boosted due to its large input impedance (the active current is manifested more).
(iii) The intermediate synapse is boosted roughly as much as the noise is, so it does not transmit much more information.
Important: boosting affects each individual synapse but also the "noise".
(Panels: proximal, intermediate, distal.)
Conclusions
- Peak EPSP is the dominant parameter for the mutual information carried by a synaptic input.
- Validity and generality of the method.
- Advantage of modeling for such issues: the possibility to ask many questions (with control).
- Applicability to experimental data.
Future Questions
Natural generalizations:
- Dendritic trees
- MI (efficacy) of inhibitory synapses
- Depressing and facilitating synapses
- Other noise sources
- "Selfish" or cooperative strategies for maximizing information transfer (each synapse may "want" to increase its EPSP peak, but so do the others)
Establishing and improving the method (confidence limits, better estimates, ...)
Stochastic Model for Dynamic Synapses
Two types of "randomness":
1. Is there a vesicle in the release site?
2. Would a vesicle be released in response to a presynaptic AP?
(Diagram: release sites 1, 2, 3 and an arriving AP.)
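The two sources of randomness named on the slide can be sketched as a single release site with stochastic refilling and stochastic AP-evoked release. The refill and release probabilities below are illustrative assumptions, not fitted values:

```python
import random

random.seed(0)

def simulate_release(n_aps, p_release=0.5, p_refill=0.3):
    """One release site: 0/1 release outcome per presynaptic AP.
    Randomness 1: whether a vesicle is docked (refilled between APs).
    Randomness 2: whether a docked vesicle is released by the AP."""
    docked = True
    out = []
    for _ in range(n_aps):
        if not docked and random.random() < p_refill:
            docked = True                  # vesicle replenished between APs
        released = docked and random.random() < p_release
        if released:
            docked = False                 # site stays empty until refilled
        out.append(int(released))
    return out

outcomes = simulate_release(10000)
rate = sum(outcomes) / len(outcomes)
```

Because a release empties the site, successes are anticorrelated across APs; this is the depression-like dynamics that makes such a synapse sensitive to presynaptic spike timing.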
How can we quantify the relation between an input's properties and its efficacy?
Which of these two inputs is more efficient? By how much?
Entropy estimation. Given a sequence generated by a stationary ergodic source, the Shannon-McMillan-Breiman theorem states that -(1/n) log2 p(x1, ..., xn) → H as n → ∞. Two problems: (1) the sequence is finite; (2) we don't know the true probability p of the sequence (we can only estimate it).
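Both problems show up directly in the naive "plug-in" estimator, which replaces the true word probabilities by empirical frequencies: for long words and finite data it is biased downward. A minimal sketch (the word lengths and fair-coin test sequence are illustrative assumptions):

```python
import math
import random
from collections import Counter

def plugin_entropy_rate(seq: str, word_len: int) -> float:
    """Naive plug-in estimate of the entropy rate (bits/symbol) from
    empirical word frequencies; biased downward when words are undersampled."""
    words = [seq[i:i + word_len] for i in range(len(seq) - word_len + 1)]
    counts = Counter(words)
    n = len(words)
    h_words = -sum((c / n) * math.log2(c / n) for c in counts.values())
    return h_words / word_len

# Fair-coin sequence: the true entropy rate is exactly 1 bit/symbol.
random.seed(0)
seq = ''.join(random.choice('01') for _ in range(20000))

h1 = plugin_entropy_rate(seq, 1)    # well sampled: close to 1
h14 = plugin_entropy_rate(seq, 14)  # 2^14 words, ~20k samples: biased low
```

This undersampling bias is one reason the talk uses a compression algorithm (CTW) rather than raw word counts to estimate entropies.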
Effect of bin size
(Figure: wide vs. sharp EPSPs, ×5 and ×3 scaling, and control.)