Fundamentals of Computational Neuroscience, T. P. Trappenberg, 2002.


4. Neurons in a network Lecture Notes on Brain and Computation Byoung-Tak Zhang Biointelligence Laboratory School of Computer Science and Engineering Graduate Programs in Cognitive Science, Brain Science and Bioinformatics Brain-Mind-Behavior Concentration Program Seoul National University E-mail: btzhang@bi.snu.ac.kr This material is available online at http://bi.snu.ac.kr/

Outline 4.1 Organization of neural networks 4.2 Information transmission in networks 4.3 Population dynamics: modeling the average behavior of neurons 4.4 The sigma node 4.5 Networks with non-classical synapses: the sigma-pi node

4.1 Organization of neural networks Higher-order mental abilities are an emergent property of specialized neural networks. The central nervous system contains on the order of 10^12 neurons. We aim to understand the principal organization of neuron-like elements and how such structures can support and enable particular mental processes. The anatomy of the brain areas: Neocortex (cerebral cortex, cortex) Cerebellum Subcortical areas

4.1.1 Neocortical organization Brodmann's cortical map: 52 cortical areas Functional correlates of different cortical areas Fig. 4.1 Outline of the lateral view of the human brain, including the cortex, cerebellum, and brainstem. The neocortex is divided into four lobes. The numbers correspond to Brodmann's classification of cortical areas.

4.1.2 Staining techniques Fig. 4.2 Examples of stained neocortical slices showing the layered structure of neocortex. (B) Illustration of different staining techniques.

4.1.3 Common neuronal types in the neocortex Pyramidal cell Stellate cell

4.1.4 The layered structure of neocortex The neocortex has a generally layered structure with laminar-specific connectivity (Fig. 4.2 A, C). Fig. 4.2 Examples of stained neocortical slices showing the layered structure of neocortex. (A) Nissl-stained visual cortex showing cell bodies. (C) Different sizes of cortical layers in different areas.

4.1.5 Columnar organization and cortical modules Fig. 4.3 Columnar organization and topographic maps in neocortex. (A) Ocular dominance columns. (B) Schematic illustration of the relation between orientation and ocular dominance columns. (C) Topographic representation of the visual field in the primary visual cortex. (D) Topographic representation of touch-sensitive areas of the body in the somatosensory cortex.

4.1.6 Connectivity between neocortical layers Fig. 4.4 Schematic connectivity patterns between neurons in a cortical layer. Open cell bodies represent (spiny) excitatory neurons such as the pyramidal neuron and the spiny stellate neuron. Their axons are plotted with solid lines that end at open triangles that represent the axon terminals. The dendritic boutons are indicated by open circles. Inhibitory (smooth) stellate neurons have solid cell bodies and synaptic terminals, and their axons are represented by dashed lines.

4.1.7 Cortical parameters

4.2 Information transmission in networks 4.2.1 The simple chain A simple chain is biologically not plausible: A single presynaptic spike is not sufficient to elicit a postsynaptic spike. Synaptic transmission is lossy. The death of a single neuron would disrupt the transmission. Fig. 4.5 (A) A sequential transmission line of four nodes. Parallel chains are made out of many such sequential transmission lines without connections between them.

4.2.2 Diverging-converging chains The number of neurons, N Divergence rate, m Convergence rate, C Fully connected network: N = m = C Synaptic efficiency (weight), w Feedback loops Fig. 4.5 (B) Diverging/converging chains, where each node can contact several other nodes in neighboring transmission chains.

4.2.3 Immunity of random networks to spontaneous background activity (1) Cortical neurons typically fire with some background activity, e.g. mean 5 Hz, variance 3 Hz. Consider a neuron with 10,000 excitatory dendritic synapses. The number of spikes arriving in each time interval (1 ms) then has mean μ = 10000 × 0.005 = 50, while each single synapse contributes a spike-count variance of σ² = 0.003 per interval. With a weight of w = 1, a single spike arriving at a synapse would elicit a postsynaptic spike. For the neuron to be immune against the background firing, we need w < 1/50 = 0.02.
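The back-of-the-envelope estimate above can be sketched in a few lines; the helper name is mine, the numbers follow the slide.

```python
def background_input(n_synapses, rate_hz, dt_s):
    """Expected number of presynaptic spikes arriving in one time bin."""
    return n_synapses * rate_hz * dt_s

# 10,000 synapses, 5 Hz background, 1 ms bins -> ~50 spikes per bin
mean_spikes = background_input(10_000, 5, 0.001)

# A weight of w = 1 is taken to mean that one presynaptic spike alone
# triggers a postsynaptic spike; to keep the neuron silent on background
# input, the summed background drive must stay below threshold:
w_max = 1 / mean_spikes   # ~0.02
```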

4.2.3 Immunity of random networks to spontaneous background activity (2) To compare these values to experimental data we need to measure the average synaptic efficiency, e.g. by stimulating a presynaptic neuron while recording from the postsynaptic neuron. Asynchronous gain: the average number of extra spikes that are added to the firing of a postsynaptic neuron by each presynaptic spike. If 100 presynaptic spikes per second (100 Hz) lead to 2 extra postsynaptic spikes per second (2 Hz), the synaptic efficiency is 2/100 = 0.02. Fig. 4.6 Schematic illustration of the influence of a single presynaptic spike on the average firing rate of the postsynaptic neuron. The bump in the curve reflects the synaptic transmission delay, after which, on average, more postsynaptic spikes are generated within a short time window compared to the spontaneous activity of the neuron.

4.2.4 Noisy background The background input has large variability, e.g. μ = 50, variance σ² = 50. We demand that the probability of a postsynaptic spike being generated by the background firing is less than a certain value, p_bg. This is the probability of having more than x simultaneous presynaptic spikes, which we can estimate from a Gaussian distribution (eqns 4.1–4.3). If p_bg = 0.1, then x ≈ 59, and therefore w < 1/59 ≈ 0.017.
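The value x ≈ 59 can be reproduced numerically, assuming the spike count per bin is approximately Gaussian with the stated mean and variance (a sketch, not the book's exact derivation):

```python
from math import sqrt
from statistics import NormalDist

mu, var = 50, 50       # mean and variance of background spikes per bin
p_bg = 0.1             # tolerated probability of a background-driven spike

# x is the spike count exceeded only with probability p_bg
x = NormalDist(mu, sqrt(var)).inv_cdf(1 - p_bg)   # ~59

# weight bound so that only > x simultaneous spikes reach threshold
w_max = 1 / x                                      # ~0.017
```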

4.2.5 Information transmission in large random networks With the previous condition, at least 59 − 50 = 9 additional presynaptic spikes are needed to elicit a meaningful postsynaptic spike. Consider a large random network of 10^10 neurons, each connecting to 10,000 other neurons. If we stimulate 1000 neurons, then 1000 × 10000 = 10^7 spikes are transmitted. The probability that a given neuron receives one spike is 10^7 / 10^10 = 10^-3, and two spikes (10^-3)² = 10^-6. This is not sufficient to generate secondary spikes — a consequence of the small number of connections per neuron relative to the large number of neurons in the network.
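The probability estimate above is simple enough to check directly (the variable names are mine):

```python
n_neurons = 10**10      # neurons in the network
n_synapses = 10**4      # connections per neuron
n_stimulated = 10**3    # externally stimulated neurons

spikes_sent = n_stimulated * n_synapses   # 10**7 spikes transmitted
p_one = spikes_sent / n_neurons           # chance a neuron gets one spike
p_two = p_one ** 2                        # chance of two independent hits
```

With p_two around 10^-6, essentially no neuron accumulates the several coincident spikes needed to fire, which is the point of the section.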

4.2.6 The spread of activity in small random networks Netlets: small networks with only a small number of highly efficient synapses, so that only very few active presynaptic neurons can elicit a postsynaptic spike in functionally correlated neurons. The absolute refractory period controls the number of simultaneously active neurons. Two asymptotic states: With small initial activity we get an inactive netlet. With large initial activity we get a nearly maximally active netlet.

4.2.7 The expected number of active neurons in netlets The expected fraction of active nodes, f, depends on the average number of synapses per neuron, C, and the firing threshold, Θ (eqn 4.4). The map has an attractive fixpoint, but it corresponds to a firing rate of about 500 Hz. Fig. 4.7 The fraction of active nodes of netlets in the time interval t + 1 as a function of the fraction of active nodes in the previous time interval. The different curves correspond to different numbers of presynaptic spikes Θ that are necessary to elicit a postsynaptic spike. (A) Netlets with only excitatory neurons. (B) Netlets with the same amount of excitatory and inhibitory connections. 4.2.8 Netlets with inhibition Cortical neurons fire in the range 10–100 Hz. Even with inhibitory neurons included, the activity of the netlet still exceeds this range.
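The two asymptotic states can be illustrated by iterating a standard binomial approximation of the excitatory netlet map (a sketch in the spirit of eqn 4.4, not the book's exact formula; C and theta here are illustrative values):

```python
from math import comb

def next_fraction(a, C, theta):
    """Expected fraction of active nodes at t+1, given fraction a at t:
    a node fires if at least theta of its C inputs were active."""
    p_below = sum(comb(C, k) * a**k * (1 - a)**(C - k) for k in range(theta))
    return 1 - p_below

# Small initial activity dies out; large initial activity saturates.
for a0 in (0.05, 0.5):
    a = a0
    for _ in range(50):
        a = next_fraction(a, C=10, theta=4)
    print(a0, a)
```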

4.3 Population dynamics: modeling the average behavior of neurons Simulating networks of spiking neurons is limited by computing power, so we often use the average firing rate instead. We need to understand the relationship of such models to populations of spiking neurons, and under what conditions these common approximations are useful and faithful descriptions of neuronal characteristics. Rate models cannot incorporate all aspects of networks of spiking neurons, but many investigations in computational neuroscience have used such models.

4.3.1 Firing rate With a rectangular time window, the average temporal spike rate of a neuron is the number of spikes in a window of size Δt divided by Δt (eqn 4.5). A Gaussian window gives a smoother estimate (eqn 4.6). Fig. 4.8 (A) Temporal average of a single spike train with a time window ΔT that has to be large compared to the average interspike interval.

4.3.2 Population averages The average population activity A(t) of a pool of neurons can be defined with very small time windows (eqns 4.7, 4.8). Fig. 4.8 (B) Pool or local population of neurons with similar response characteristics. The pool average is defined as the average firing rate over the neurons in the pool within a relatively small time window.

4.3.3 Population dynamics in response to slowly varying inputs The average behavior of a pool of neurons can be described with a membrane time constant τ, an input current I, and an activation function g (eqn 4.9). In the stationary state the population rate is given by the activation function applied to the input (eqn 4.10).

4.3.4 Rapid response of populations A noisy input current is stepped up at 100 ms. Compare the population dynamics (eqn 4.9) with the average population spike rate in response to this rapidly varying input. Adding noise (fluctuations) makes the model more realistic. Fig. 4.9 Simulation of a population of 1000 independent integrate-and-fire neurons with a membrane time constant τm = 10 ms and threshold θ = 10. Each neuron receives an input with white noise around a mean RIext. This mean is switched from RIext = 11 to RIext = 16 at t = 100 ms. The spike count of the population almost instantaneously follows this jump in the input, whereas the average population rate, calculated from eqn 4.9 with a linear activation function, follows this change in input only slowly when the time constant is set to τ = τm.
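The slow side of this comparison can be sketched with a forward-Euler integration of the rate dynamics of eqn 4.9, tau * dA/dt = -A + g(I), using a linear gain g(I) = I and the step values from Fig. 4.9 (the time grid and function names are my choices):

```python
def simulate(tau_ms=10.0, dt_ms=0.1, n_steps=2000):
    """Euler-integrate the rate model; input steps from 11 to 16 at 100 ms."""
    A, trace = 0.0, []
    for step in range(n_steps):
        t = step * dt_ms
        I = 11.0 if t < 100.0 else 16.0
        A += dt_ms / tau_ms * (-A + I)    # forward Euler update of eqn 4.9
        trace.append((t, A))
    return trace

trace = simulate()
```

The rate A relaxes toward each stationary value only with time constant tau, which is the slow response that the population spike count in Fig. 4.9 does not show.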

4.3.5 Advanced descriptions of population dynamics The spike response model (eqns 4.11–4.14): the ε term is ignored, the index i labels the postsynaptic neurons, and the average synaptic efficiency is used; there is no spike-time adaptation in the neuron. The model captures the mean influence of the postsynaptic potential, plus noise. For slowly varying input, eqn 4.9 can be used with the gain function of eqn 4.15. Fig. 4.10 (A) The gain function of eqn 4.15 that can be used to approximate the dynamics of a population response to slowly varying inputs (adiabatic limit). (B) Examples of physiological gain functions from a hippocampal pyramidal cell. The discharge frequency is based on the inverse of the first interspike interval after the cell started to respond to rectangular current pulses of different strengths.

4.4 The sigma node 4.4.1 Minimal neuron model The McCulloch-Pitts node is a rate model: the timing of spikes is irrelevant. But spike times can play a critical role for fast responses. The sigma node operates on the rate values r of neuronal groups (eqns 4.16–4.19). Fig. 4.11 Schematic summary of the functionality of a sigma node most commonly used in networks of artificial neurons. Such a node weights the input value of each channel with the corresponding weight value of that channel and sums up all these weighted inputs. The output of the node is then a function of this internal activation.

4.4.2 Common activation functions The activation function g is commonly chosen as a generalized sigmoid function (eqn 4.20).
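The sigma node described above (weighted sum passed through an activation function) can be sketched directly; the function names are mine, the forms are the standard threshold, linear, and sigmoid choices:

```python
import math

def g_linear(h):
    return h

def g_step(h, theta=0.0):
    """Threshold (McCulloch-Pitts style) output."""
    return 1.0 if h >= theta else 0.0

def g_sigmoid(h, beta=1.0):
    """Smooth, saturating gain function; beta sets the slope."""
    return 1.0 / (1.0 + math.exp(-beta * h))

def sigma_node(rates, weights, g=g_sigmoid):
    """Weight each input channel, sum, and apply the activation g."""
    h = sum(w * r for w, r in zip(weights, rates))
    return g(h)

out = sigma_node([1.0, 0.5, 0.0], [0.2, -0.4, 0.7])  # h = 0.0, g(0) = 0.5
```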

4.4.3 The dynamic sigma node The discrete sigma node can be turned into continuous dynamics by taking the limit Δt → 0 (eqn 4.21). 4.4.4 Leaky integrator characteristics The resulting leaky integrator dynamics (eqns 4.22, 4.23) can be solved without external input, which shows the relaxation behavior (eqn 4.24). Fig. 4.12 Time course of the activation of a leaky integrator node with initial value h = 0. In the lower curve no input Iin was applied, leading to an exponential decay toward rest. The upper curve corresponds to a constant input current Iin = 0.5. The resting activation of the node was set to hrest = 0.1.

4.4.5 Discrete formulation of continuous dynamics The exponential response to short inputs can be captured by taking time steps on a logarithmic scale (eqn 4.25). Standard numerical integration methods: Euler method Higher-order Runge-Kutta methods Adaptive time-step algorithms
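As a check on the simplest of these methods, a forward-Euler discretization of the leaky integrator, tau * dh/dt = -h + h_rest + I_in, can be compared against the exact exponential solution for the no-input case (h_rest = 0.1 echoes Fig. 4.12; the other parameter values are my choices):

```python
import math

tau, h_rest, dt = 10.0, 0.1, 0.1   # ms units; dt << tau for accuracy
h, h0 = 0.0, 0.0

for step in range(1000):            # integrate 100 ms with I_in = 0
    h += dt / tau * (-h + h_rest)   # forward Euler step of eqn 4.22

t = 1000 * dt
h_exact = h_rest + (h0 - h_rest) * math.exp(-t / tau)  # exact relaxation

err = abs(h - h_exact)              # small when dt is much less than tau
```

Halving dt roughly halves the error, which is the first-order convergence that motivates the higher-order Runge-Kutta and adaptive methods listed above.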

4.5 Networks with non-classical synapses: the sigma-pi node The sigma node is a very rough abstraction of a real neuron: it ignores the interaction of different ion channels, and information processing can involve nonlinear interactions between input channels beyond the average firing rate. 4.5.1 Logical AND and sigma-pi nodes A nonlinear interaction between two ion channels with a firing threshold that requires at least two spikes in some temporal proximity corresponds to a logical AND function. Generalizing this idea yields the sigma-pi node (eqn 4.26).

4.5.2 Divisive inhibition The interaction between an excitatory synapse and an inhibitory synapse can take the form of divisive (shunting) inhibition (eqn 4.27).
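A minimal sketch of a sigma-pi node in the spirit of eqn 4.26: each term multiplies a group of input rates before the weighted sum, so the node can express AND-like interactions. The divisive-inhibition form below is one common way to write the effect in eqn 4.27; treat both expressions as illustrative rather than the book's exact equations.

```python
def sigma_pi(groups):
    """groups: list of (weight, [rates]); h = sum_i w_i * prod_j r_ij."""
    h = 0.0
    for w, rates in groups:
        term = w
        for r in rates:
            term *= r          # multiplicative interaction within a group
        h += term
    return h

def divisive_inhibition(r_exc, w_exc, r_inh, w_inh):
    """Excitatory drive scaled down (not subtracted) by inhibition."""
    return (w_exc * r_exc) / (1.0 + w_inh * r_inh)

# AND-like behavior: a term contributes only if all its inputs are active.
both_on = sigma_pi([(1.0, [1.0, 1.0])])   # 1.0
one_off = sigma_pi([(1.0, [1.0, 0.0])])   # 0.0
```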

4.5.3 Further sources of modulatory effects between synaptic inputs Fig. 4.13 Some sources of nonlinear (modulatory) effects between synapses as modeled by sigma-pi nodes. (A) Shunting (divisive) inhibition, which is often recorded as the effect of inhibitory synapses on the cell body. (B) The effect of simultaneously activated voltage-gated excitatory synapses that are in close physical proximity to each other (synaptic clusters) can be larger than the sum of the effects of each individual synapse. Examples are clusters of AMPA and NMDA type synapses. (C) Some cortical synaptic terminals have nicotinic acetylcholine (ACh) receptors. ACh release from cholinergic afferents can thus produce a larger efflux of neurotransmitter and thereby increase EPSPs in the postsynaptic neuron of this synaptic terminal. (D) Metabotropic receptors can trigger intracellular messengers that can influence the gain of ion channels. (E) Ion channels can be linked to the underlying cytoskeleton with adapter proteins and can thus influence other ion channels through this link.

Conclusion The brain displays characteristic neuronal organizations. Properties of networks of spiking neurons: The spread of neuronal activity through random networks Transmission of information Sensible activity in random recurrent networks The self-organization of synaptic efficiencies The sigma node: a rate model of the average firing rate of populations of neurons The sigma-pi node: a rate model with nonlinear interactions between input channels