Lecture 9: Introduction to Neural Networks

Refs:
- Dayan & Abbott, Ch. 7 (Gerstner & Kistler, Chs. 6 and 7)
- D. Amit & N. Brunel, Cerebral Cortex 7, 232-252 (1997)
- C. van Vreeswijk & H. Sompolinsky, Science 274, 1724-1726 (1996); Neural Computation 10, 1321-1371 (1998)
Basics

N neurons, spike trains. Input current to neuron i: "current-based synapses". Synaptic kernel K(·), normally taken independent of i and j, with a normalization chosen so that one presynaptic spike changes the postsynaptic potential by J_ij. Recall the parametrization.
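The equations on this slide are images lost in the transcript. A standard current-based form consistent with the wording above (a sketch; the symbol t_j^k for the k-th spike time of neuron j is an assumption):

```latex
I_i(t) = \sum_{j} J_{ij} \sum_{k} K\!\left(t - t_j^{k}\right),
\qquad \int_0^{\infty} K(s)\, ds = 1 ,
```

with the normalization of K chosen so that a single presynaptic spike deflects the postsynaptic potential by J_ij.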
Conductance-based synapses

A better description:
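The equation itself is missing from the transcript; the conventional conductance-based form (a hedged sketch, symbols assumed) makes the current depend on the postsynaptic voltage through a driving force:

```latex
I_i(t) = \sum_j g_{ij}(t)\,\bigl( E_j - V_i(t) \bigr),
```

where g_{ij}(t) is the synaptic conductance (a kernel-filtered version of the presynaptic spike train) and E_j is the reversal potential, excitatory or inhibitory depending on the presynaptic neuron.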
Differential form

For an exponential kernel, differentiating the convolution yields a first-order ODE for the synaptic current.
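A standard reconstruction of the missing equations (symbols assumed; τ_s is the synaptic time constant):

```latex
K(t) = \frac{1}{\tau_s}\, e^{-t/\tau_s}\, \Theta(t)
\;\;\Longrightarrow\;\;
\tau_s \frac{dI_i}{dt} = -I_i + \sum_j J_{ij} \sum_k \delta\!\left(t - t_j^{k}\right).
```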
Membrane potential

Integrate-and-fire neurons. Split the input current into its mean plus fluctuations. If the mean varies slowly, the firing rate tracks it adiabatically.
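A standard reconstruction of the missing equations, in the leaky integrate-and-fire convention (parameter names θ for threshold and V_r for reset are assumptions):

```latex
\tau_m \frac{dV_i}{dt} = -V_i + I_i(t), \qquad
V_i \to V_r \ \text{when} \ V_i = \theta ;
\qquad
I_i(t) = \mu_i(t) + \sigma_i\, \eta_i(t) .
```

If μ_i varies slowly compared with the membrane dynamics, the rate follows the static transfer function, r_i(t) = φ(μ_i(t), σ_i).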
Architectures (e.g. retina to LGN in the visual system)

Feedforward; recurrent. The total input to neuron i is the sum of feedforward and recurrent contributions.
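One plausible form of the total input, separating the two architectures (the weight symbols W for feedforward and M for recurrent are assumptions, not the lecturer's notation):

```latex
I_i(t) =
\underbrace{\sum_{j \in \text{input}} W_{ij} \sum_k K\!\left(t - t_j^{k}\right)}_{\text{feedforward}}
\;+\;
\underbrace{\sum_{j \in \text{network}} M_{ij} \sum_k K\!\left(t - t_j^{k}\right)}_{\text{recurrent}} .
```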
Stationary states

In the stationary limit, compute the mean and the fluctuations of the input current. Approximation: assume neurons fire as independent Poisson processes. With a large number of inputs, I_i(t) is Gaussian.
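With K normalized to unit integral and independent Poisson inputs, the standard mean and variance are (a reconstruction, not the slide's own equations):

```latex
\mu_i = \langle I_i \rangle = \sum_j J_{ij}\, r_j ,
\qquad
\sigma_i^2 = \sum_j J_{ij}^2\, r_j \int_0^{\infty} K(s)^2\, ds ,
```

and by the central limit theorem, with many inputs, I_i(t) is approximately Gaussian.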
Simple model

Input population firing at rate r_0. Dilute excitatory feedforward connections; dilute inhibitory recurrent connections.
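"Dilute" connectivity is conventionally modeled with independent Bernoulli draws; a sketch (the connection probability c and strengths J_0, J are assumed symbols):

```latex
W_{ij} =
\begin{cases}
J_0 > 0 & \text{with probability } c \\
0 & \text{otherwise}
\end{cases}
\qquad
M_{ij} =
\begin{cases}
-J < 0 & \text{with probability } c \\
0 & \text{otherwise.}
\end{cases}
```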
Input current statistics

Mean of the input current; fluctuations about the mean; then average over neurons.
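Under a dilute model with C_0 = cN_0 feedforward and C = cN recurrent connections per neuron, the population-averaged statistics take the form (a hedged reconstruction; symbols are assumptions):

```latex
\mu = C_0 J_0\, r_0 - C J\, r ,
\qquad
\sigma^2 \propto C_0 J_0^2\, r_0 + C J^2\, r ,
```

with neuron-to-neuron variation of μ_i around this average coming from the randomness of the connectivity.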
Mean field theory (white-noise approximation)

In the previous lecture we learned how to compute the firing rate of a neuron driven by a constant current plus white noise. Here: use that result, and solve self-consistently for r.
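The self-consistency can be sketched numerically: the Siegert formula gives the LIF rate for white-noise input, and the network rate solves r = φ(μ(r), σ(r)). This assumes Brunel-style diffusion statistics μ = τ_m(C_0 J_0 r_0 − C J r), σ² = τ_m(C_0 J_0² r_0 + C J² r); all parameter values are illustrative, not from the lecture.

```python
import math

def lif_rate(mu, sigma, tau_m=0.02, theta=0.02, v_reset=0.0, tau_ref=0.002):
    """Stationary firing rate (Hz) of an LIF neuron driven by white noise
    with mean mu and std sigma (Siegert formula, diffusion approximation)."""
    a = (v_reset - mu) / sigma
    b = (theta - mu) / sigma
    # midpoint rule for the integral of exp(u^2) * erfc(-u) over [a, b];
    # erfc(-u) == 1 + erf(u) but stays accurate deep in the tails
    n = 2000
    du = (b - a) / n
    integral = 0.0
    for k in range(n):
        u = a + (k + 0.5) * du
        integral += math.exp(u * u) * math.erfc(-u) * du
    return 1.0 / (tau_ref + tau_m * math.sqrt(math.pi) * integral)

def network_rate(r0, C0=1000, J0=0.2e-3, C=250, J=0.4e-3, tau_m=0.02):
    """Self-consistent rate r solving r = phi(mu(r), sigma(r)) by bisection,
    for excitatory feedforward drive plus dilute recurrent inhibition.
    Parameter values (volts, seconds) are illustrative."""
    def excess(r):
        mu = tau_m * (C0 * J0 * r0 - C * J * r)            # mean input (V)
        var = tau_m * (C0 * J0**2 * r0 + C * J**2 * r)     # input variance (V^2)
        return lif_rate(mu, math.sqrt(var)) - r
    lo, hi = 0.0, 40.0   # excess(lo) > 0 > excess(hi) for these parameters
    for _ in range(60):
        mid = 0.5 * (lo + hi)
        if excess(mid) > 0.0:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)
```

Increasing the external drive r_0 shifts the self-consistent rate upward, which is the dependence examined graphically on the next slides.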
Insight from graphical solution

The root lies approximately where the mean input reaches threshold, giving a threshold-linear dependence of r on r_0.
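In equations (a hedged reconstruction; the symbols C_0, J_0, C, J for in-degrees and strengths are assumptions), setting the mean input to the threshold and solving gives:

```latex
\tau_m \left( C_0 J_0\, r_0 - C J\, r \right) \approx \theta
\;\;\Longrightarrow\;\;
r \approx \left[ \frac{C_0 J_0\, r_0 - \theta/\tau_m}{C J} \right]_+ ,
```

which is threshold-linear in r_0.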
Balance of excitation and inhibition condition

Total average input current (including the leak, for V near threshold) = 0. At low rates (r << 1) the membrane potential has to be approximately stationary just below threshold, with firing driven by noise, so the net average current must vanish.
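One way to write the missing equation (a reconstruction; the LIF form τ_m dV/dt = −V + I makes the leak term explicit):

```latex
\underbrace{\tau_m \left( C_0 J_0\, r_0 - C J\, r \right)}_{\text{synaptic drive}}
\;-\;
\underbrace{\langle V \rangle}_{\text{leak},\ \sim\, \theta}
\;\approx\; 0 .
```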
Amit-Brunel model

Two populations (plus an external driving one). Indices a, b, … = 0, 1, 2 label populations: a = 0 external, a = 1 excitatory, a = 2 inhibitory. Synaptic strengths: (excitatory) external to excitatory and inhibitory; recurrent.
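The population-level mean inputs in this notation (a sketch; C_{ab} for the number of connections a type-a neuron receives from population b is an assumed symbol):

```latex
\mu_a = \tau_m \sum_{b=0}^{2} C_{ab}\, J_{ab}\, r_b , \qquad a = 1, 2,
```

with J_{ab} > 0 for excitatory presynaptic populations (b = 0, 1) and J_{a2} < 0.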
Balance conditions, mean rates

Set the net mean current to each population to zero and solve for r_1, r_2: the result is a threshold-linear dependence on r_0.
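Setting both net mean currents to zero gives two linear equations, so the rates can be read off with Cramer's rule. A minimal sketch; the couplings w_ab ~ C_ab·J_ab are illustrative assumptions, chosen only so that both rates come out positive:

```python
def balanced_rates(r0, w_e0=3.0, w_ee=1.0, w_ei=2.0,
                       w_i0=1.0, w_ie=1.0, w_ii=1.0):
    """Solve the balance conditions for the two-population model:
        w_e0*r0 + w_ee*r1 - w_ei*r2 = 0   (net current to excitatory pop.)
        w_i0*r0 + w_ie*r1 - w_ii*r2 = 0   (net current to inhibitory pop.)
    for r1 (excitatory rate) and r2 (inhibitory rate) by Cramer's rule.
    The couplings w_ab ~ C_ab * J_ab are illustrative, not from the lecture."""
    det = w_ee * (-w_ii) - (-w_ei) * w_ie
    r1 = ((-w_e0 * r0) * (-w_ii) - (-w_ei) * (-w_i0 * r0)) / det
    r2 = (w_ee * (-w_i0 * r0) - (-w_e0 * r0) * w_ie) / det
    return r1, r2
```

Because the system is linear and homogeneous in (r0, r1, r2), both solved rates scale proportionally with r_0, which is the threshold-linear dependence stated on the slide (the threshold offset appears once subleading terms like θ/τ_m are restored).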