Idan Segev Interdisciplinary Center for Neural Computation Hebrew University Thanks to: Miki London Galit Fuhrman Adi Shraibman Elad Schneidman What does the synapse tell the axon?

Outline
- Introduction
- Questions in my group
- A brief history of the synapse: what does “synaptic efficacy” mean?
- Complications with “synaptic efficacy”
- Information theory (I.T.) and synaptic efficacy: basic definitions (entropy, compression & mutual information); the “noisy input-output” model
- Preliminary results: “synaptic efficacy” in the context of I.T., in simple neuron models, in passive dendritic structures, and in excitable dendrites
- Conclusions
- Future questions

Research focus in my group
1. Neuronal “noise” and input-output properties of neurons (Elad Schneidman, Miki London): ion channels, synaptic noise and AP reliability; optimization of information transmission with noise.
2. Nonlinear cable theory (Claude Meunier): threshold conditions for excitation in excitable dendrites; active propagation in excitable trees.
3. “Learning rules” for ion channels and synapses: how to build an “H&H” axon? How to “read” synaptic plasticity?
4. The synapse: what does it say? (Miki London, Galit Fuhrman): could dynamic synapses encode the timing of the pre-synaptic spikes? “Synaptic efficacy”: what does it mean?

The “father” of the synapse: Sir Charles Scott Sherrington. Sherrington first proposed “syndesm” (“connection”); “synapsis” (“clasp”) was suggested by Verrall, a Greek scholar at Cambridge. “Each synapsis offers an opportunity for a change in the character of nervous impulses, that the impulse as it passes over from the terminal arborescence of an axon into the dendrite of another cell, starts in that dendrite an impulse having character different from its own.” Foster and Sherrington, 1897

Whitney Museum Presents: Synapsis Shuffle, a New Masterwork by Robert Rauschenberg. “Robert Rauschenberg has organized a hodgepodge group of famous names, from the highbrow (Robert Hughes, Chuck Close) to the lowbrow (Martha Stewart, Michael Ovitz), around the not-especially-radical idea that anyone can create a Rauschenberg. Each participant chose an image (by lottery) from a total of 52 Rauschenberg transfer photographs, and then created a composition. ‘Blown Synapses’: the result is bland, homogeneous work on an unnecessarily large scale. Perhaps if the project's parameters had been more narrowly defined (say, if each participant were allotted the same five images) these works would offer more insight into the minds of their composers. As it is, Rauschenberg's shuffle dulls the synapses.” Karen Rosenberg

Motivation: Single synapse matters. 400 exc. synapses (10/sec), 100 inh. synapses (65/sec); Mainen & Sejnowski model.

Motivation: Single synapse matters 200 sec simulation (10 spikes/sec)

Motivation: Single synapse matters

“Synaptic efficacy”
- Artificial neural networks: synaptic efficacy is reduced to a single number, W_ij (J_ij).
- Biophysics: utilizing the (average) properties of the PSP (peak; rise time; area/charge; ...).
- Cross-correlation: relating the pre-synaptic input to the post-synaptic output (the firing probability). But how should the shape of the cross-correlation be interpreted?

Complications with “synaptic efficacy”: PSPs have different shape indices. Which synapse is more “effective”, and by how much? For example: the EPSP peak is equal but the rise time differs, or the EPSP area (charge) is equal but the peak differs.

Complications with “synaptic efficacy”: synapses are dynamic (facilitating vs. depressing).

Complications with “synaptic efficacy”: the synapse is a voice in the crowd. The effect of a synapse depends on the context (and the synapse itself is probabilistic). (Figure: spontaneous in vivo voltage fluctuations in a neuron from the cat visual cortex; L.J. Borg-Graham, C. Monier & Y. Frégnac.)

A new definition for “synaptic efficacy”. (Figure: a model “neuron” receives the synaptic input plus background activity and noise, and produces a spike output.) Mutual information: what does the synaptic input tell us about the spike output? Proposed definition: “synaptic efficacy” is the mutual information between the input and the output.
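Since the talk proposes the mutual information between input and output as the efficacy measure, a minimal plug-in (histogram) estimator over binned spike trains may help make the definition concrete. This is an illustrative sketch, not the estimator actually used in the talk (which is compression-based); the function name and the toy binary sequences are my own.

```python
import math
from collections import Counter

def mutual_information(x, y):
    """Plug-in estimate of I(X;Y) in bits from two paired symbol sequences
    (e.g. per-bin 'input spike present' vs. 'output spike present')."""
    n = len(x)
    px, py = Counter(x), Counter(y)
    pxy = Counter(zip(x, y))
    mi = 0.0
    for (a, b), c in pxy.items():
        # p(a,b) * log2( p(a,b) / (p(a) p(b)) ), written with raw counts
        mi += (c / n) * math.log2(c * n / (px[a] * py[b]))
    return mi

# Binned input vs. output: here the output perfectly follows the input,
# and both symbols are equiprobable, so the MI is exactly 1 bit.
inp = [0, 0, 1, 1, 0, 1, 0, 1]
out = [0, 0, 1, 1, 0, 1, 0, 1]
print(mutual_information(inp, out))  # → 1.0
```

With a noisy or unreliable synapse the output only partially follows the input and the estimate drops toward 0, which is exactly the sense in which a weak synapse “tells the axon” little.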

Computing the mutual information (compression, entropy and mutual information). The mutual information (MI) is the extra bits saved in encoding the output by knowing the input: compress the output spike train on its own, then compress the output spike train given the synaptic input; the difference in code length estimates the MI. We use the CTW compression algorithm (the best compressor known today).
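The recipe above relies on the fact that a good compressor's code length upper-bounds the entropy of a sequence. CTW is not available in the Python standard library, so the sketch below uses zlib purely as an illustrative stand-in; the contrast between a predictable and a noisy spike train is the point, not the exact numbers.

```python
import random
import zlib

def compressed_bits(bits):
    """Code length in bits after zlib compression: a (loose) upper bound on
    the entropy of the sequence. The talk uses the much stronger CTW compressor."""
    return 8 * len(zlib.compress(bytes(bits), 9))

# A predictable (periodic) spike train compresses far better than a noisy one,
# so its estimated entropy is lower.
random.seed(0)
periodic = [1, 0, 0, 0] * 500                        # regular firing pattern
noisy = [random.randint(0, 1) for _ in range(2000)]  # irregular train
print(compressed_bits(periodic) < compressed_bits(noisy))  # → True
```

In the same spirit, compressing the output with the input prepended as side information and subtracting gives a crude MI estimate; CTW makes this subtraction much tighter.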

Mutual information in a simple I&F model (effect of potentiation). (Figure: an isolated synapse rides on a background of synapses; threshold crossings produce the output spike train; the isolated synapse is potentiated ×5.)
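A simple integrate-and-fire setup like the one on this slide can be sketched in a few lines. All parameters here (time constant, weights, background rate) are illustrative choices of mine, not the values used in the talk; the structure is the point: one tracked synapse riding on random background input, with threshold-and-reset output.

```python
import random

def lif_sim(steps, dt=0.1, tau=10.0, thresh=1.0, w_syn=0.3,
            syn_times=(), bg_rate=0.05, bg_w=0.1, seed=1):
    """Minimal leaky integrate-and-fire sketch: one 'isolated' synapse with
    known activation times rides on a random excitatory/inhibitory background."""
    rng = random.Random(seed)
    syn = set(syn_times)
    v, spikes = 0.0, []
    for t in range(steps):
        v += dt * (-v / tau)                 # passive leak toward rest (0)
        if t in syn:
            v += w_syn                       # the synapse under study
        if rng.random() < bg_rate:
            v += bg_w * rng.choice([1, -1])  # background "noise" synapses
        if v >= thresh:                      # threshold crossing -> output spike
            spikes.append(t)
            v = 0.0                          # reset after the spike
    return spikes

# Potentiating the tracked synapse (raising w_syn) changes how strongly the
# output spike train reflects its activation times, and hence the MI.
out = lif_sim(1000, syn_times=range(0, 1000, 25))
print(len(out) > 0)  # → True
```

Feeding the synapse's activation times and the resulting output train into an MI estimator then quantifies the “efficacy” of this one input against the background.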

Which of the EPSP parameters affects the MI? Comparing EPSPs with a fixed peak versus EPSPs with a fixed charge, the MI corresponds best to the EPSP peak.

Why does the MI correspond best to the EPSP peak? A sharp EPSP evokes fewer output spikes, but their timing is more accurate; a broad EPSP evokes more spikes, with less accurate timing.
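The peak-versus-charge distinction can be made concrete with the standard alpha-function EPSP shape, g(t) = g_peak · (t/τ) · e^(1−t/τ), whose peak is g_peak (reached at t = τ) and whose area grows with τ. The functions and parameter values below are illustrative, not taken from the talk.

```python
import math

def alpha_epsp(t, tau, g_peak=1.0):
    """Alpha-function EPSP, normalized so that its peak (at t = tau) is g_peak."""
    return g_peak * (t / tau) * math.exp(1.0 - t / tau) if t >= 0 else 0.0

def epsp_charge(tau, g_peak=1.0, dt=0.01, t_max=100.0):
    """Numerical area ('charge') under the EPSP; analytically it is e * g_peak * tau."""
    n = int(t_max / dt)
    return sum(alpha_epsp(i * dt, tau, g_peak) for i in range(n)) * dt

# Equal peaks, different rise times: the broader EPSP carries more charge,
# yet (per the slide) it is the shared peak that best predicts the MI.
print(epsp_charge(tau=5.0) > epsp_charge(tau=1.0))  # → True
```

Holding the charge fixed instead would mean scaling g_peak by 1/τ, which is the other comparison made on the previous slide.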

MI (“synaptic efficacy”) in realistic models: a passive cable with (linear) synapses + an H&H axon.

(Cable with linear synapses.) The MI (synaptic efficacy) of distal synapses scales with the somatic EPSP peak (proximal vs. distal synapses shown).
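The scaling with the somatic EPSP peak connects to elementary cable theory: in a semi-infinite passive cable, the steady-state voltage attenuates as V(x)/V(0) = exp(−x/λ). The sketch below uses an illustrative space constant of λ = 500 μm (my assumption, not a value from the talk); transient EPSP peaks attenuate even more strongly than this steady-state bound.

```python
import math

def steady_state_attenuation(x_um, lambda_um=500.0):
    """Steady-state voltage attenuation factor exp(-x/lambda) for a
    semi-infinite passive cable; lambda_um = 500 um is an illustrative value."""
    return math.exp(-x_um / lambda_um)

# A synapse 1000 um from the soma delivers only ~14% of its local
# steady-state depolarization to the soma.
print(round(steady_state_attenuation(1000.0), 3))  # → 0.135
```

This is why, in the passive case, distal synapses produce small somatic EPSPs and correspondingly low MI, setting up the contrast with the active-dendrite results on the next slide.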

MI with active dendritic currents (proximal, intermediate and distal synapses). The active boosting affects both the input synapse and the “background noise”:
(i) The proximal synapse transmits less information than in the passive case (the “noise” is larger, while the proximal EPSP remains almost passive).
(ii) The distal synapse is relatively more boosted, owing to the large local input impedance.
(iii) The intermediate synapse is boosted only as much as the noise is, so it does not transmit more information in the active case.

Conclusions
- The mutual information measure provides a functional link between the synaptic input and the spike output; hence, the MI can be interpreted as “synaptic efficacy”.
- “Synaptic efficacy” depends on the context within which the synapse operates.
- The EPSP peak (rather than its area) corresponds most closely to the mutual information.
- Active dendritic currents affect both the “background noise” and the input synapse. The relative effect of this noise on the efficacy of the synaptic input depends on the location of the input; typically, distal synapses tend to be relatively more boosted.

Future Questions
Natural generalizations for characterizing “synaptic efficacy”:
- MI (efficacy) of inhibitory synapses
- Depressing, facilitating and probabilistic synapses
- Dependence on input structure (regular input; bursting input)
- Dependence on the context (correlated background)
- Dependence on dendritic excitability (I_h, I_A, I_Ca, ...)
- Dependence on the number and site of connections
“Synaptic efficacy” for many pre-synaptic inputs:
- “Selfish” or cooperative strategies for maximizing information transfer (each synapse may want to increase its own EPSP peak, but the others do too)

Effect of bin size. (Figure: sharp vs. wide EPSPs; control, ×3 and ×5 potentiation.)

12,000 Na channels, 3,600 K channels, 200 μm².