Arealization and Memory in the Cortex


1 Arealization and Memory in the Cortex
Main perspectives: hierarchical, modular, content-based. (figure: monkey)

2 The hierarchical perspective
The Elizabeth Gardner approach: instead of doing thermodynamics over neural activity (as in the Hopfield model), where retrieval means

r_i^μ = g[ ∑_j w_ij r_j^μ − Θ ]_+ , with Hebbian weights w_ij^HEBB ≈ ∑_μ r_i^μ r_j^μ ,

do thermodynamics over the connection weights, i.e. consider whether, among all their possible values, there are some which satisfy the retrieval conditions.

3 The hierarchical perspective
The Elizabeth Gardner approach. Backpropagation and E-M algorithms:
Forward step (network activation): Δr. Backward step (error propagation): Δw.
Expectation: sampling the world. Maximization: of the match between the world and our internal model of the world.
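The forward/backward structure can be illustrated with a minimal one-layer delta-rule sketch. The linear unit, the squared-error loss, and the learning rate are assumptions for illustration; the slide only names the two steps.

```python
import numpy as np

def train_step(w, x, target, lr=0.1):
    """One forward/backward pass for a single linear layer.
    Forward step: compute the activation r = w x.
    Backward step: propagate the error back and change the weights,
    delta_w = -lr * dE/dw for E = ||r - target||^2 / 2."""
    r = w @ x                         # forward: network activation (delta-r)
    err = r - target                  # error at the output
    w = w - lr * np.outer(err, x)     # backward: weight change (delta-w)
    return w, r
```

Iterating `train_step` drives the output toward the target; stacking several such layers and propagating `err` through them is backpropagation proper.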

4 The hierarchical perspective
The Elizabeth Gardner approach. Backpropagation and E-M algorithms. Generative models: paper by Hinton & Ghahramani.

5

6 The modular perspective
The Braitenberg model: N pyramidal cells in √N compartments of √N cells each; Apical synapses, Basal synapses.

7 The modular perspective
The Braitenberg model and modular associative memories: memory glass & capacity issues (with D. O'Kane); sparsity (with C. Fulvi Mari); latching dynamics (with E. Kropff).

8 The modular perspective
The Braitenberg model and modular associative memories: metricity in associative memory.

9 slides of Anastasia Anishchenko (Brown)
Autoassociative memory retrieval is often studied using neural network models in which connectivity does not follow any geometrical pattern, i.e. it is either all-to-all or, if sparse, randomly assigned. Storage capacity (the maximum number of patterns that can be retrieved) is proportional to the number k of connections per unit.
Networks with a regular geometrical rule informing their connectivity, instead, often display geometrical patterns of activity, i.e. they stabilize into activity-profile "bumps" of width proportional to k.
Recently, applications in various fields have used the fact that small-world networks, characterized by a connectivity intermediate between regular and random, have graph-theoretic properties different from those of either regular or random networks.
Autoassociative memory retrieval?.. Geometrical activity patterns?..

10 GRAPH THEORY DEFINITIONS
A graph consists of a nonempty set of elements, called vertices, and a list of unordered pairs of elements, called edges. Order n of the graph = number of vertices; coordination number k = average number of edges connected to one vertex.
CREATING A SMALL WORLD (Watts & Strogatz 1998): start with a 1D lattice ring and rewire each edge at random with probability p; rewiring only a few edges is enough.
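The construction can be sketched directly. This is a minimal Watts-Strogatz rewiring pass; the adjacency-set representation is an implementation choice, not from the slides.

```python
import random

def watts_strogatz(n, k, p, seed=0):
    """Build a ring lattice of n vertices, each linked to its k nearest
    neighbours (k even), then rewire each lattice edge with probability p
    (Watts & Strogatz 1998). Returns {vertex: set(neighbours)}."""
    rng = random.Random(seed)
    adj = {v: set() for v in range(n)}
    for v in range(n):                        # regular ring lattice
        for d in range(1, k // 2 + 1):
            adj[v].add((v + d) % n)
            adj[(v + d) % n].add(v)
    for v in range(n):                        # rewiring pass
        for d in range(1, k // 2 + 1):
            w = (v + d) % n
            if rng.random() < p and w in adj[v]:
                # move the far endpoint to a uniformly chosen non-neighbour
                choices = [u for u in range(n) if u != v and u not in adj[v]]
                if choices:
                    u = rng.choice(choices)
                    adj[v].discard(w); adj[w].discard(v)
                    adj[v].add(u); adj[u].add(v)
    return adj
```

Rewiring moves edges one at a time, so the total number of edges (and hence the average coordination number k) is preserved for any p.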

11 PATH LENGTH and CLUSTERING
Characteristic path length L = length of the shortest path between two vertices, averaged over all pairs of vertices; intuitively, the average number of links in the shortest chain connecting two people. For a chain x0, x1, …, xM, L(x0, xM) = M.
Clustering coefficient C = average fraction of the vertices linked to a given vertex which are also linked to each other. All-to-all connectivity: C = 1. Random connections: C = k/n << 1 for a large network.
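Both quantities are cheap to compute exactly on small graphs. A straightforward BFS-based sketch, over a graph stored as {vertex: set(neighbours)}:

```python
from collections import deque

def path_lengths(adj, src):
    """Breadth-first search: shortest path lengths from src to every
    reachable vertex of an undirected graph."""
    dist = {src: 0}
    q = deque([src])
    while q:
        v = q.popleft()
        for u in adj[v]:
            if u not in dist:
                dist[u] = dist[v] + 1
                q.append(u)
    return dist

def char_path_length(adj):
    """L = shortest path length averaged over all (ordered) pairs."""
    n = len(adj)
    total = sum(d for v in adj for d in path_lengths(adj, v).values())
    return total / (n * (n - 1))

def clustering(adj):
    """C = fraction of a vertex's neighbour pairs that are themselves
    linked, averaged over all vertices with at least two neighbours."""
    cs = []
    for v, nb in adj.items():
        nb = list(nb)
        k = len(nb)
        if k < 2:
            continue
        links = sum(1 for i in range(k) for j in range(i + 1, k)
                    if nb[j] in adj[nb[i]])
        cs.append(2 * links / (k * (k - 1)))
    return sum(cs) / len(cs)
```

On a ring lattice with k = 4, for example, each vertex's four neighbours form six pairs of which three are linked, giving C = 0.5, consistent with the large-k lattice value C(0) ≈ 3/4 quoted later.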

12 THREE DIFFERENT WORLDS
Regular (p = 0): C_regular = C(0) ≈ 3/4 (LARGE), L_regular = L(0) ≈ n/2k (LARGE).
Small world: C(p) ≈ C(0) (LARGE), L(p) ≈ L(1) (small).
Random (p = 1): C_random = C(1) ≈ k/n (small), L_random = L(1) ≈ ln(n)/ln(k) (small).
Figure from Watts & Strogatz 1998.

13 THE BRAIN IS A SMALL WORLD
Characteristic path length L and clustering coefficient C for cortex (n = neurons, k = connections per neuron):
L_regular ≈ n/2k ≈ … synapses: too large. L_random ≈ ln(n)/ln(k) ≈ … synapses: realistic! But C_random ≈ k/n: too small.
Cortex has short L (like a random graph) and large C (like a regular lattice) ⇒ by definition, the brain is a small world!..

14 NETWORK MODEL
N = 1000 integrate-and-fire neurons on a ring: V0 = −70 mV, Vthr = −54 mV, τm = 5 msec, τ1 = 30 msec, τ2 = 4 msec, tref = 3 msec.
Store M = 5 patterns drawn from a binary distribution (1 or 0), sparseness a = 0.2.
Connections: on average k = 50 per neuron; excitatory, all of the same strength; probabilistic, Gaussian with a baseline, rewired with the parameter of randomness p. Global inhibition, τinh = 4 msec. Normalized modification of synaptic strength.
Retrieval: give a cue for one of the patterns stored, "+" current into the "1" cells, "−" current into the "0" cells; part of the cue may be randomly corrupted (partial cue).
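A single neuron of the kind used in this network can be sketched as follows. This is a minimal leaky integrate-and-fire Euler integration using the membrane parameters from the slide; the driving current, the integration time step, and the omission of the synaptic time constants τ1 and τ2 are assumptions.

```python
import numpy as np

def lif_trace(i_syn, v0=-70.0, v_thr=-54.0, tau_m=5.0, t_ref=3.0, dt=0.1):
    """Euler-integrate a leaky integrate-and-fire neuron:
    tau_m dV/dt = -(V - v0) + i_syn(t).
    On reaching v_thr the neuron spikes, resets to v0, and stays
    silent for the refractory period t_ref (all times in msec)."""
    v, spikes, refractory = v0, [], 0.0
    trace = []
    for step, i_t in enumerate(i_syn):
        t = step * dt
        if refractory > 0.0:
            refractory -= dt          # sit out the refractory period
        else:
            v += dt / tau_m * (-(v - v0) + i_t)
            if v >= v_thr:
                spikes.append(t)      # spike, reset, go refractory
                v = v0
                refractory = t_ref
        trace.append(v)
    return np.array(trace), spikes
```

A constant suprathreshold drive produces regular firing, with interspike intervals bounded below by tref; zero drive leaves the neuron silent at V0.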

15 CALCULATING L and C
Analytical estimate of the clustering coefficient: C = [1/√3 − k/n](1 − p)³ + k/n.
Comparison with the numerical results (n = 1000, k = 50): numerical L and C as functions of the parameter of randomness p, normalized and averaged over three network realizations.
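The estimate is easy to tabulate against p. The functional form below is taken from the slide, with n and k as in the simulations; it interpolates between the lattice value at p = 0 and the random-graph value k/n at p = 1.

```python
def clustering_estimate(p, k=50, n=1000):
    """Analytical estimate of the clustering coefficient from the slide:
    C(p) = [1/sqrt(3) - k/n] (1 - p)^3 + k/n."""
    return (1 / 3 ** 0.5 - k / n) * (1 - p) ** 3 + k / n
```

The (1 − p)³ factor is the probability that all three edges of a neighbour triangle survive rewiring, which is why C decays steeply but smoothly with p.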

16 SPONTANEOUS ACTIVITY BUMPS
Networks with a regular geometrical rule guiding their connections (parameter of randomness p → 0) often display geometrical patterns of activity, i.e. they stabilize into activity-profile "bumps". The width of the bump is proportional to the number k of connections per neuron. [figure: number of spikes vs. neuron position]

17 ASSOCIATIVE MEMORY
Overlaps O_i, i = 1..M, of the network activity with the M = 5 stored patterns, before, during, and after a 75%-corrupted cue for pattern 2 was given. Samples taken for 50 msec every 50 msec during the simulation time.
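The overlap measure used in these plots can be sketched as a normalized projection onto each stored pattern. Cosine normalization is an assumption here; the slide only says "overlap".

```python
import numpy as np

def overlaps(activity, patterns):
    """Normalized overlap O_mu of the current activity vector with each
    stored binary pattern (cosine similarity, one value per pattern)."""
    a = activity / (np.linalg.norm(activity) + 1e-12)
    p = patterns / (np.linalg.norm(patterns, axis=1, keepdims=True) + 1e-12)
    return p @ a
```

Tracking this vector over time shows retrieval as the overlap with the cued pattern rising toward 1 while the others stay near the chance level.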

18 RETRIEVAL PERFORMANCE
Retrieval performance degrades gradually as the cue quality decreases. Memories can be retrieved even when 90% of the cue input is corrupted.

19 RETRIEVAL vs. “BUMPINESS” - I
The ability to retrieve memories dies unless p "prefers randomness" (small p kills retrieval). The ability to form activity bumps dies unless p "prefers regularity" (large p kills bumps). Can they both be "alive enough" when 0.4 < p < 0.6?..

20 Effect of changing cue size

21 RETRIEVAL vs. “BUMPINESS” - II
The "almost" question: is there still a chance for a successful coexistence at the same p? Increasing the number k of connections per neuron helps the memories but interferes with the bumps. Changing k does not affect the qualitative picture, i.e. retrieval performance and bumpiness favor almost non-overlapping ranges of p.

23 CONCLUSIONS
The spontaneous activity bumps formed in the regular network can be observed up to p = 0.6. Storing random binary patterns in the network does not affect the bumps, but retrieval performance is very poor for small p. As the randomness increases (p > 0.4), robust retrieval is observed even for partial cues. Changing k does not affect the qualitative network behavior. The abilities to form stable activity bumps and to retrieve associative memories are thus favored at distinct ranges of the parameter of randomness; hence the "almost" question...
Special thanks to the EU Advanced Course in Computational Neuroscience, Obidos, Portugal 2002.

24 New CONCLUSIONS
Those were from Anastasia's simulations. Enter Yasser, with analytical calculations on a simpler (threshold-linear) model, supported by extensive simulations. NEXT SLIDE

25

26 The content-based perspective
An example: Plaut's model of semantic memory.
