Slide 1: On Bubbles and Drifts: Continuous attractor networks in brain models
Thomas Trappenberg, Dalhousie University, Canada
Slide 2: Once upon a time ... (my CANN shortlist)
Wilson & Cowan (1973), Grossberg (1973), Amari (1977), …, Sompolinsky & Hansel (1996), Zhang (1997), Stringer et al. (2002)
Slide 3: It's just a 'Hopfield' net …
Recurrent architecture; synaptic weights
Slide 4: In mathematical terms …
Updating network states (network dynamics); gain function; weight kernel
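The equations on this slide did not survive extraction; a standard form for a one-dimensional CANN of this kind (the kernel symbols A, σ, and C are assumptions here, the remaining symbols follow slide 30) is:

```latex
% Network dynamics: leaky integration of recurrent and external input
\tau \frac{\partial u(x,t)}{\partial t} = -u(x,t) + \int w(x,y)\, r(y,t)\, dy + I^{\mathrm{ext}}(x,t)

% Gain function: sigmoid with slope \beta and threshold \theta
r(x,t) = \frac{1}{1 + e^{-\beta\,(u(x,t) - \theta)}}

% Weight kernel: local Gaussian excitation minus global inhibition
w(x,y) = A\, e^{-(x-y)^2 / (2\sigma^2)} - C
```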
Slide 5: Weights describe the effective interaction profile in the Superior Colliculus
TT, Dorris, Klein & Munoz, J. Cogn. Neurosci. 13 (2001)
Slide 6: The network can form bubbles of persistent activity (in Oxford English: activity packets)
[Figure: end states of the network activity]
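A minimal rate-model sketch of bubble formation, assuming a ring of N nodes, the Gaussian-minus-inhibition kernel above, and illustrative parameter values that may need tuning to hold a packet:

```python
import numpy as np

N, tau, dt = 100, 1.0, 0.1
J_E, sigma, J_I = 1.5, 6.0, 0.5          # kernel amplitude, width, global inhibition (assumed)
beta, theta = 1.0, 2.0                   # sigmoid slope and threshold (assumed)

# Gaussian-minus-inhibition kernel on a ring (periodic boundary conditions)
d = np.abs(np.arange(N)[:, None] - np.arange(N)[None, :])
d = np.minimum(d, N - d)
w = J_E * np.exp(-d**2 / (2 * sigma**2)) - J_I

def g(u):                                # sigmoidal gain function
    return 1.0 / (1.0 + np.exp(-beta * (u - theta)))

u = np.zeros(N)
I_ext = np.zeros(N)
I_ext[45:55] = 5.0                       # brief localized cue

for step in range(500):                  # Euler integration of the leaky integrator
    if step == 100:
        I_ext[:] = 0.0                   # cue removed; the packet should persist
    u += dt / tau * (-u + w @ g(u) + I_ext)

print("peak of the persistent packet at node", np.argmax(u))
```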
Slide 7: Space is represented with activity packets in the hippocampal system
From Samsonovich & McNaughton, Path integration and cognitive mapping in a continuous attractor neural network model, J. Neurosci. 17 (1997)
Slide 8: There are phase transitions in the weight-parameter space
Slide 9: CANNs work with spiking neurons
Xiao-Jing Wang, Trends Neurosci. 24 (2001)
Slide 10: Shutting off also works in the rate model
[Figure: node activity over time]
Slide 11: Various gain functions are used
[Figure: end states for different gain functions]
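The slide does not name which gain functions were compared; three common choices in rate models, sketched here with assumed parameter names, are:

```python
import numpy as np

def sigmoid(u, beta=1.0, theta=2.0):     # smooth and saturating
    return 1.0 / (1.0 + np.exp(-beta * (u - theta)))

def threshold_linear(u, theta=0.0):      # rectified linear, non-saturating
    return np.maximum(u - theta, 0.0)

def step(u, theta=0.0):                  # binary threshold unit
    return (u > theta).astype(float)
```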
Slide 12: CANNs can be trained with Hebbian learning
Training pattern: [figure] (see the training sketch after slide 13)
Slide 13: Normalization is important for a convergent method
[Figure: evolution of w(x,y) and the profile w(x,50) over training time, from random initial states, with weight normalization]
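A sketch covering slides 12 and 13 together: Hebbian outer-product training on Gaussian activity packets, followed by a row-wise weight normalization. The bump width, learning rate, and the particular norm are assumptions:

```python
import numpy as np

N = 100
x = np.arange(N)

def bump(center, sigma=6.0):             # Gaussian training pattern on the ring
    d = np.minimum(np.abs(x - center), N - np.abs(x - center))
    return np.exp(-d**2 / (2 * sigma**2))

w = np.zeros((N, N))
eps = 0.1                                # learning rate (assumed)
for c in range(N):                       # one training pattern per location
    r = bump(c)
    w += eps * np.outer(r, r)            # Hebb: dw_ij = eps * r_i * r_j

# Weight normalization keeps repeated training epochs from diverging;
# fixing the Euclidean norm of each row is one common choice.
w /= np.linalg.norm(w, axis=1, keepdims=True)
```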
Slide 14: Gradient-descent learning is also possible (Kechen Zhang)
Gradient descent with regularization = Hebb + weight decay
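One way this equivalence is usually written (the error function, learning rate η, and decay constant λ are assumptions, not taken from the slide):

```latex
% Quadratic reconstruction error with an L2 penalty on the weights
E = \sum_i \Big( r_i - \sum_j w_{ij} r_j \Big)^2 + \lambda \sum_{ij} w_{ij}^2

% Gradient descent on E gives a Hebbian-like correlation term plus weight decay
\Delta w_{ij} = \eta \Big( r_i - \sum_k w_{ik} r_k \Big) r_j - \eta \lambda\, w_{ij}
```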
Slide 15: CANNs have a continuum of point attractors
Point attractors with basins of attraction; a line of point attractors; the two can be mixed: Rolls, Stringer & Trappenberg, A unified model of spatial and episodic memory, Proc. R. Soc. Lond. B 269 (2002)
Slide 16: Neuroscience applications of CANNs
- Persistent activity (memory) and winner-takes-all (competition)
- Working memory (e.g. Compte, Wang, Brunel)
- Place and head-direction cells (e.g. Zhang, Redish, Touretzky, Samsonovich, McNaughton, Skaggs, Stringer et al.)
- Attention (e.g. Olshausen, Salinas & Abbott, etc.)
- Population decoding (e.g. Wu et al., Pouget, Zhang, Deneve, etc.)
- Oculomotor programming (e.g. Kopecz & Schoener, Trappenberg)
- etc.
Slide 17: The superior colliculus integrates exogenous and endogenous inputs
[Diagram: input pathways to the superior colliculus, including the cerebellum]
Slide 18: The Superior Colliculus is a CANN
TT, Dorris, Klein & Munoz, J. Cogn. Neurosci. 13 (2001)
Slide 19: A CANN with adaptive input strength explains express saccades
Slide 20: CANNs are great for population decoding (a fast implementation of pattern matching)
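A decoding sketch under the same assumed ring model as above: the noisy population response is used as the initial state, the network relaxes to its nearest attractor, and the peak of the settled packet is read out as the estimate:

```python
import numpy as np

N = 100
x = np.arange(N)
d = np.minimum(np.abs(x[:, None] - x[None, :]), N - np.abs(x[:, None] - x[None, :]))
w = 1.5 * np.exp(-d**2 / (2 * 6.0**2)) - 0.5          # same assumed kernel as before

def g(u):
    return 1.0 / (1.0 + np.exp(-(u - 2.0)))

rng = np.random.default_rng(0)
true_pos = 60
dist = np.minimum(np.abs(x - true_pos), N - np.abs(x - true_pos))
noisy = 5.0 * np.exp(-dist**2 / 72.0) + rng.normal(0.0, 1.0, N)

u = noisy.copy()                                      # initialize with the noisy response
for _ in range(300):                                  # relax without external input
    u += 0.1 * (-u + w @ g(u))

print("decoded position:", np.argmax(u))              # peak of the settled packet
```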
Slide 21: CANNs (integrators) are stiff: the activity packet resists being moved by new input
Slide 22: … and drift and jump
TT, ICONIP'98
Slide 23: A modified CANN solves path integration
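A sketch of one standard path-integration mechanism (asymmetric, derivative-like weights gated by a velocity signal, in the spirit of Zhang 1996 and Stringer et al. 2002; the simple multiplicative gating and all parameter values are assumptions):

```python
import numpy as np

N = 100
x = np.arange(N)
d = (x[:, None] - x[None, :] + N // 2) % N - N // 2   # signed ring distance
w_sym = 1.5 * np.exp(-d**2 / 72.0) - 0.5              # symmetric packet-holding kernel
w_asym = -(d / 36.0) * np.exp(-d**2 / 72.0)           # derivative of the Gaussian part

def g(u):
    return 1.0 / (1.0 + np.exp(-(u - 2.0)))

dist = np.minimum(np.abs(x - 50), N - np.abs(x - 50))
u = 5.0 * np.exp(-dist**2 / 72.0)                     # start with a packet at node 50
v = 0.3                                               # velocity (rotation-cell) signal
for _ in range(1000):
    u += 0.1 * (-u + (w_sym + v * w_asym) @ g(u))

print("packet moved to node", np.argmax(u))           # shifts away from 50 when v != 0
```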
Slide 24: CANNs can learn dynamic motor primitives
Stringer, Rolls, TT & de Araujo, Neural Networks 16 (2003)
Slide 25: Drift is caused by asymmetries
NMDA stabilization
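A hedged sketch of the NMDA-style stabilization named here and on slide 30, following the clamping idea in Stringer et al. (2002): nodes that were already firing above a floor are not allowed to drop below it, which pins the packet against drift. The floor value and the exact rule are assumptions:

```python
import numpy as np

def nmda_stabilize(r_new, r_old, gamma=0.5):
    # Hysteresis clamp: previously active nodes keep firing at least at gamma
    return np.where(r_old > gamma, np.maximum(r_new, gamma), r_new)
```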
Slide 26: CANNs can support multiple packets
Stringer, Rolls & TT, Neural Networks 17 (2004)
Slide 27: How many activity packets can be stable?
TT, Neural Information Processing - Letters and Reviews 1 (2003)
Slide 28: Stabilization can be too strong
TT & Standage, CNS'04
Slide 29: CANNs can discover dimensionality
Slide 30: The model equations

Continuous dynamics (leaky integrator):
\tau \frac{du_i}{dt} = -u_i + \frac{c}{N} \sum_j w_{ij}\, r_j - C + I_i^{\mathrm{vis}}

Gain function (sigmoid):
r_i = \frac{1}{1 + e^{-\beta (u_i - \theta)}}

Hebbian learning:
\Delta w_{ij} \propto r_i\, r_j

NMDA-style stabilization: a floor on the rates of active nodes (see slide 25).

Symbols:
u_i: activity of node i
r_i: firing rate
w_{ij}: synaptic efficacy matrix
C: global inhibition
I^{\mathrm{vis}}: visual input
\tau: time constant
c: scaling factor
N: number of connections per node
\beta: slope
\theta: threshold