What does the synapse tell the axon?

Presentation transcript:

What does the synapse tell the axon? Idan Segev, Interdisciplinary Center for Neural Computation, Hebrew University. Thanks: Miki London, Galit Fuhrman, Adi Shraibman, Elad Schneidman

Outline
- Introduction
  - A brief history of the synapse and of "synaptic efficacy" - what does it mean?
  - Complications with "synaptic efficacy"
- Information theory and synaptic efficacy
  - Basic definitions (entropy, mutual information)
  - The "Plug & Play" model
- Preliminary results: "synaptic efficacy"
  - In simple neuron models
  - In passive dendritic structures
  - In excitable dendrites
- Conclusions
- Future questions
- Questions in my lab

Research focus in my group
1. Neuronal "noise" and input-output properties of neurons.
   - Ion channels, synaptic noise and AP reliability
   - Optimization of information transmission with noise
2. Nonlinear cable theory.
   - Threshold conditions for excitation in excitable dendrites
   - Active propagation in excitable trees
3. "Learning rules" for ion channels and synapses.
   - How to build an "H&H" axon?
   - How to "read" synaptic plasticity?
4. The synapse: what does it say?
   - Could dynamic synapses encode the timing of the pre-synaptic spikes?
   - "Synaptic efficacy" - what does it mean?

THE “Synapse”

Motivation: Single synapse matters

"Synaptic efficacy"
- Artificial neural networks: synaptic efficacy reduced to a single number, Wij (Jij).
- Biophysics: utilizing the (average) properties of the PSP (peak, rise time, area, charge, ...).
- Cross-correlation: relating the pre-synaptic input to the post-synaptic output (the firing probability). How do synaptic properties affect the cross-correlation?
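The cross-correlation view can be made concrete in code. Below is a minimal sketch, assuming binary spike trains sampled on a common time grid; the function name, bin width, and lag window are illustrative choices, not taken from the talk:

```python
import numpy as np

def cross_correlogram(pre, post, dt=1.0, max_lag_ms=50.0):
    """Histogram of post-synaptic spikes around each pre-synaptic spike.

    pre, post: binary arrays on a common grid with bin width dt (ms).
    Returns lags (ms) and mean post-spike count per pre-spike at each lag.
    """
    max_lag = int(max_lag_ms / dt)
    lags = np.arange(-max_lag, max_lag + 1)
    cc = np.zeros(lags.size)
    pre_idx = np.flatnonzero(pre)
    for k, lag in enumerate(lags):
        j = pre_idx + lag                      # post bins at this lag
        valid = (j >= 0) & (j < len(post))     # drop out-of-range indices
        cc[k] = post[j[valid]].sum()
    return lags * dt, cc / max(len(pre_idx), 1)
```

A sharp, well-locked EPSP shows up as a tall narrow peak just after lag zero; a smeared EPSP as a lower, broader one - exactly the ambiguity the next slides point at.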

Complications: who is more "effective", and by how much?
- EPSP peak is equal, but the rise time is different?
- EPSP area is equal, but the peak is different?

Complications: background synaptic activity
Spontaneous in vivo voltage fluctuations in a neuron from the cat visual cortex (L.J. Borg-Graham, C. Monier & Y. Frégnac).

The "Plug & Play" model
[Schematic: a known synaptic input, plus background activity ("noise"), is fed into a model neuron; the mutual information between the input and the output spike train is measured.]
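A minimal sketch of how such a plug-and-play setup might be simulated, assuming a leaky integrate-and-fire neuron, one labelled synapse, and a Gaussian stand-in for the background barrage; every parameter value here is illustrative, not the talk's:

```python
import numpy as np

rng = np.random.default_rng(0)
dt, T = 0.1, 2000.0                       # ms
n = int(T / dt)
tau_m, v_rest, v_th, v_reset = 20.0, -70.0, -54.0, -70.0   # ms, mV

pre = np.zeros(n, dtype=bool)             # the "known" synaptic input
pre[:: int(50.0 / dt)] = True             # regular 20 Hz presynaptic spikes

tau_s, w = 2.0, 8.0                       # synaptic time constant (ms), weight
sigma_bg = 2.0                            # background "noise" amplitude

g, v = 0.0, v_rest
out = np.zeros(n, dtype=bool)
for i in range(n):
    g += -g * dt / tau_s + w * pre[i]     # synaptic drive from labelled input
    v += dt / tau_m * (v_rest - v + g) + sigma_bg * np.sqrt(dt) * rng.normal()
    if v >= v_th:                         # threshold crossing -> output spike
        out[i] = True
        v = v_reset
```

The MI between `pre` and `out` (estimated, e.g., by the compression method on the next slide) is then the synapse's "efficacy" in this framework.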

Compression, Entropy and Mutual Information
[Schematic: compressing the output spike train alone gives its entropy; compressing it given the known synaptic input gives the conditional entropy; the difference is the mutual information.]
Worked example:
- Output spike train: 01000010010100100001, parsed as 01 000 01 001 01 001 000 01.
- Known synaptic input: 01 001 01 001 01 001 001 01.
1. Compress the output spike train (replacing longer strings with shorter ones). Entropy = length of the compressed string (the ratio is the entropy per character).
2. Given the input:
   - "01" in the input always goes with "01" in the output (so given "01" in the input, we know the output).
   - Given "001" in the input, the output could be either "000" or "001"; thus a single bit is sufficient to encode which of the two possibilities occurs ("0" for the first case, "1" for the second).
   - The output is now represented with 4 characters (0 1 1 0) - all we need to know about the output, assuming we know the input.
   - The number of bits "saved" because the input is known is the mutual information.
The mutual information is the extra bits saved by knowing the input.
For the information estimation we use the CTW compression algorithm (the best known today).
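A toy version of the compression idea, using Python's built-in zlib as a stand-in for the CTW algorithm the talk uses (zlib is a far weaker compressor, so the numbers are only illustrative); the MI is approximated as the bits saved by compressing input and output together rather than separately:

```python
import zlib
import random

def cbits(s: bytes) -> int:
    """Compressed length in bits - a crude upper bound on the entropy."""
    return 8 * len(zlib.compress(s, 9))

def mi_estimate(x: bytes, y: bytes) -> float:
    """MI(X;Y) ~ C(X) + C(Y) - C(X,Y): bits saved on Y by knowing X."""
    return cbits(x) + cbits(y) - cbits(x + y)

random.seed(1)
inp = bytes(random.getrandbits(1) for _ in range(4000))   # input train
out = bytes(b ^ (random.random() < 0.05) for b in inp)    # noisy copy of input
print(mi_estimate(inp, out), "bits saved (crude MI estimate)")
```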

I&F (effect of potentiation)
[Figure: I&F voltage traces relative to threshold, comparing an isolated synapse with a synapse embedded in background activity, with x5 potentiation.]

(I&F) - EPSP parameters and the MI
[Figure: MI for EPSPs with a fixed peak vs. a fixed charge.]

Why does the MI correspond best to the EPSP peak?
[Raster plot: spikes following an input EPSP. Only the cases with at least one spike (in a window of 50 ms) are plotted.]
- For the sharp EPSP, there were 200 cases (out of 2000 input EPSPs) with at least one spike in the chosen time window. Most of these spikes are locked to the EPSP.
- For the broader EPSP there was a larger total number of spikes (500/2000 in the lower case), but these spikes are less locked to the input.
- The cross-correlation is essentially the histogram of the raster plot. The peak of the CC is larger for the sharp EPSP, whereas the width (and integral) of the CC is larger for the broad one (more spikes). Can we use the CC to say which is more efficient?
- This result depends on the bin size (3 ms here). E.g., with dT = 15 ms, the lower raster would also look locked. So why does the MI depend only weakly on bin size?
Sharp EPSP: fewer spikes, more accurate. Smeared EPSP: more spikes, less accurate.
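The bin-size effect is easy to reproduce: the same jittered responses look "locked" or not depending on the discretization. A small sketch with illustrative numbers (not the talk's simulations):

```python
import numpy as np

rng = np.random.default_rng(0)
epsp_t, trials = 100.0, 2000          # input EPSP time (ms), number of trials

def locked_fraction(jitter_ms, bin_ms):
    """Fraction of output spikes landing in the same bin as the EPSP -
    how 'locked' the response looks at this bin size."""
    spikes = epsp_t + rng.normal(0.0, jitter_ms, trials)
    return np.mean(np.floor(spikes / bin_ms) == np.floor(epsp_t / bin_ms))

for jitter in (1.0, 8.0):             # sharp vs. smeared EPSP
    print(jitter, [round(locked_fraction(jitter, b), 2) for b in (3.0, 15.0)])
```

With 3 ms bins only the sharp-EPSP responses stay in one bin; with 15 ms bins both look locked, which is why a cross-correlogram read at a single bin size can mislead.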

Passive Cable with synapses

MI (efficacy) of distal synapses scales with the EPSP peak
[Figure: MI vs. EPSP peak for proximal and distal synapses on the passive cable.]

MI with active dendritic currents (linear synapses)
[Figure: MI for proximal, intermediate and distal synapses with active dendrites.]
With active dendrites (Na and I_A):
1. Background activity is boosted.
(i) The proximal synapse transmits less information (the "noise" is larger and the proximal EPSP is almost passive).
(ii) The distal synapse is relatively more boosted, due to the large input impedance (the active current is manifested more).
(iii) The intermediate synapse is boosted roughly as much as the noise is, so it does not transmit much information.
Important: boosting affects not only the individual synapse but also the "noise".

Conclusions
- The EPSP peak is the dominant parameter for the mutual information of a synaptic input.
- Validity and generality of the method:
  - The advantage of modeling for such issues
  - The possibility to ask many questions (with control)
  - Applicability to experimental data

Future questions
- Natural generalizations:
  - Dendritic trees
  - MI of inhibitory synapses
  - Depressing and facilitating synapses
  - Other noise sources
- Efficacy of an inhibitory synapse
- "Selfish" or cooperative strategies for maximizing information transfer (each synapse may want to increase its own EPSP peak, but so do the others)
- Establishing and improving the method (confidence limits, better estimates, ...)

Stochastic model for dynamic synapses
[Schematic: release sites (1, 2, 3) responding to a presynaptic AP.]
Two types of "randomness":
1. Is there a vesicle in the release site?
2. Would a vesicle be released in response to a presynaptic AP?
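A minimal sketch of such a two-stage stochastic release model - site occupancy, then AP-triggered release - with recovery and release probabilities that are illustrative, not taken from the talk:

```python
import numpy as np

rng = np.random.default_rng(0)
n_sites = 3
p_release = 0.5    # P(release | site occupied, AP arrives)  - randomness 2
p_recover = 0.2    # P(empty site refills per time step)     - randomness 1

occupied = np.ones(n_sites, dtype=bool)
for t, ap in enumerate([1, 0, 0, 1, 0, 1, 1, 0, 0, 1]):   # presynaptic APs
    occupied |= rng.random(n_sites) < p_recover   # empty sites may refill
    released = 0
    if ap:
        fire = occupied & (rng.random(n_sites) < p_release)
        released = int(fire.sum())
        occupied &= ~fire                         # released sites go empty
    print(t, released)
```

Because release depletes the sites, the response to each AP depends on the recent spike history - which is what would let a dynamic synapse encode the timing of the pre-synaptic spikes.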

How can we quantify the relation between an input's properties and its efficacy? Which of these two inputs is more efficient? By how much?

Entropy estimation
000000100010101010101010101010101001001011......
Given a sequence generated by a (stationary, ergodic) source, the Shannon-McMillan-Breiman theorem states that
$-\frac{1}{n} \log_2 p(x_1, \dots, x_n) \to H$ as $n \to \infty$ (almost surely).
Two problems:
- The sequence is finite.
- We don't know the true probability p of the sequence (we can only estimate it).
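Both problems show up clearly in a naive "plug-in" estimator that counts empirical block frequencies: it is biased downward once the number of possible blocks approaches the sequence length. A small sketch (illustrative; this is not the CTW estimator the talk uses):

```python
import numpy as np
from collections import Counter

def block_entropy_rate(bits, k):
    """Plug-in entropy rate (bits/symbol) from empirical k-block frequencies."""
    blocks = [tuple(bits[i:i + k]) for i in range(len(bits) - k + 1)]
    counts = np.array(list(Counter(blocks).values()), dtype=float)
    p = counts / counts.sum()
    return float(-(p * np.log2(p)).sum() / k)

rng = np.random.default_rng(0)
bits = rng.integers(0, 2, 5000)   # fair-coin source: true entropy = 1 bit/symbol
for k in (1, 4, 8, 12):
    print(k, round(block_entropy_rate(bits, k), 3))   # sinks below 1 as k grows
```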

Effect of bin size
[Figure: MI at different bin sizes for wide vs. sharp EPSPs (x3, x5 potentiation) and control.]