On Bubbles and Drifts: Continuous Attractor Networks in Brain Models
Thomas Trappenberg, Dalhousie University, Canada
Once upon a time ... (my CANN shortlist): Wilson & Cowan (1973), Grossberg (1973), Amari (1977), …, Sompolinsky & Hansel (1996), Zhang (1997), Stringer et al. (2002)
It's just a 'Hopfield' net … recurrent architecture, synaptic weights
In mathematical terms … updating network states (network dynamics), gain function, weight kernel
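As a minimal illustrative sketch of these ingredients (not code from the talk): a 1D ring of nodes, a Gaussian excitatory kernel with uniform global inhibition, a sigmoidal gain function, and leaky-integrator dynamics; all parameter values below are assumptions chosen only for illustration.

```python
import numpy as np

N = 100                       # nodes on a 1D ring (illustrative size)
tau, dt = 10.0, 1.0           # time constant and Euler integration step
A, sigma, C = 0.2, 5.0, 0.02  # excitation strength, kernel width, global inhibition
beta, theta = 5.0, 0.5        # slope and threshold of the gain function

# Weight kernel: local Gaussian excitation minus uniform inhibition (periodic distance)
x = np.arange(N)
d = np.abs(x[:, None] - x[None, :])
d = np.minimum(d, N - d)
w = A * np.exp(-d**2 / (2 * sigma**2)) - C

def gain(u):
    """Sigmoidal gain function: node activity u -> firing rate r."""
    return 1.0 / (1.0 + np.exp(-beta * (u - theta)))

def step(u, I_ext):
    """One Euler step of the leaky-integrator network dynamics."""
    r = gain(u)
    return u + (-u + w @ r + I_ext) * dt / tau
```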
Weights describe the effective interaction profile in Superior Colliculus TT, Dorris, Klein & Munoz, J. Cog. Neuro. 13 (2001)
Network can form bubbles of persistent activity (in Oxford English: activity packets) [figure: end states]
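Continuing the sketch above: a localized input is applied briefly and then removed; with these illustrative parameters the activity packet should sustain itself as an end state of the dynamics.

```python
u = np.zeros(N)
dc = np.minimum(np.abs(x - 50), N - np.abs(x - 50))
I_ext = np.exp(-dc**2 / (2 * 3.0**2))     # localized input centred on node 50

for t in range(100):                      # stimulate
    u = step(u, I_ext)
for t in range(500):                      # remove the input; the packet persists
    u = step(u, np.zeros(N))

print("Packet peak after input removal:", np.argmax(gain(u)))
```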
Space is represented with activity packets in the hippocampal system. From Samsonovich & McNaughton, Path integration and cognitive mapping in a continuous attractor neural network model, J. Neurosci. 17 (1997)
There are phase transitions in the weight-parameter space
CANNs work with spiking neurons Xiao-Jing Wang, Trends in Neurosci. 24 (2001)
Shutting-off also works in the rate model [figure: node activity over time]
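Continuing the same sketch: one way to switch the packet off in the rate model (an assumed mechanism for illustration, not necessarily the one used in the talk) is a brief, strong uniform inhibitory input, after which the network relaxes back to its low-activity state.

```python
for t in range(50):                       # brief, strong uniform inhibitory input
    u = step(u, -2.0 * np.ones(N))
for t in range(500):                      # input removed; packet does not re-form
    u = step(u, np.zeros(N))

print("Maximum rate after shut-off:", round(gain(u).max(), 3))
```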
Various gain functions are used [figure: end states]
CANNs can be trained with Hebbian learning [figure: training pattern]
Normalization is important for a convergent method [figure panels: random initial states; weight normalization; w(x,y); w(x,50) over training time]
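A self-contained sketch of Hebbian training on Gaussian training patterns with weight normalization after each update; the training schedule, pattern shape, and values are illustrative assumptions, not taken from the talk.

```python
import numpy as np

N, sigma, eta = 100, 5.0, 0.05          # nodes, pattern width, learning rate (illustrative)
x = np.arange(N)
w = np.random.rand(N, N) * 0.01         # random initial weights

for epoch in range(20):
    for c in range(N):                  # one Gaussian training pattern per ring location
        d = np.minimum(np.abs(x - c), N - np.abs(x - c))
        r = np.exp(-d**2 / (2 * sigma**2))
        w += eta * np.outer(r, r)       # Hebbian update
        w /= np.linalg.norm(w, axis=1, keepdims=True)  # row-wise weight normalization
```

The normalization keeps the weights bounded, so repeated training converges to a translation-invariant kernel w(x,y) that depends only on |x-y|.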
Gradient-descent learning is also possible (Kechen Zhang): gradient descent with regularization = Hebb + weight decay
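A sketch of this correspondence, assuming a cost built from the negative Hebbian correlation plus an L2 penalty on the weights (η and λ are illustrative symbols, not taken from the talk):

\Delta w_{ij} \;=\; \eta \left( r_i r_j - \lambda\, w_{ij} \right)

that is, a Hebbian term plus weight decay.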
CANNs have a continuum of point attractors: point attractors with basins of attraction, a line of point attractors, and these can be mixed. Rolls, Stringer & Trappenberg, A unified model of spatial and episodic memory, Proceedings of the Royal Society B 269:1087-1093 (2002)
Neuroscience applications of CANNs
Persistent activity (memory) and winner-takes-all (competition)
Working memory (e.g. Compte, Wang, Brunel, etc.)
Place and head-direction cells (e.g. Zhang, Redish, Touretzky, Samsonovich, McNaughton, Skaggs, Stringer, et al.)
Attention (e.g. Olshausen, Salinas & Abbott, etc.)
Population decoding (e.g. Wu et al., Pouget, Zhang, Deneve, etc.)
Oculomotor programming (e.g. Kopecz & Schoener, Trappenberg)
etc.
Superior colliculus integrates exogenous and endogenous inputs [diagram: cortical and cerebellar input pathways]
Superior Colliculus is a CANN TT, Dorris, Klein & Munoz, J. Cog. Neuro. 13 (2001)
CANN with adaptive input strength explains express saccades
CANNs are great for population decoding (fast pattern-matching implementation)
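For nodes labelled by preferred angles θ_i on a ring (an assumed encoding for illustration), one standard readout of the decoded value from the settled activity packet is the population vector

\hat{\theta} \;=\; \arg\Bigl( \sum_i r_i \, e^{\,\mathrm{i}\theta_i} \Bigr),

while the relaxation of the network itself does the pattern matching, pulling a noisy input profile toward the nearest stored bump.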
CANNs (integrators) are stiff
… and drift and jump TT, ICONIP'98
Modified CANN solves path-integration
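A hedged sketch of the usual modification (in the spirit of Zhang 1997 and Stringer et al. 2002; the exact form used in the talk may differ): the symmetric kernel is complemented by an asymmetric component gated by an idiothetic velocity signal v,

\tau \frac{du_i}{dt} \;=\; -u_i + \sum_j \bigl( w^{\mathrm{SYM}}_{ij} + v\, w^{\mathrm{ASYM}}_{ij} \bigr)\, r_j + I^{\mathrm{ext}}_i - C,

so the packet is stationary for v = 0 and moves with a speed proportional to v otherwise.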
CANNs can learn dynamic motor primitives Stringer, Rolls, TT, de Araujo, Neural Networks 16 (2003).
Drift is caused by asymmetries; NMDA stabilization counteracts it
CANN can support multiple packets Stringer, Rolls & TT, Neural Networks 17 (2004)
How many activity packets can be stable? T.T., Neural Information Processing-Letters and Reviews, Vol. 1 (2003)
Stabilization can be too strong TT & Standage, CNS’04
CANN can discover dimensionality
The model equations
Continuous dynamics (leaky integrator), with: u_i activity of node i, r_i firing rate, w_ij synaptic efficacy matrix, C global inhibition, I_i^ext visual input, τ time constant, k scaling factor, N number of connections per node, β slope and θ threshold of the gain function.
NMDA-style stabilization; Hebbian learning.
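A hedged reconstruction of these equations in standard notation (the NMDA-style stabilization rule below is an assumed form; γ and θ_NMDA are illustrative symbols not listed on the slide):

\tau \frac{du_i(t)}{dt} \;=\; -u_i(t) + \frac{k}{N} \sum_j w_{ij}\, r_j(t) - C + I^{\mathrm{ext}}_i(t)

r_i(t) \;=\; \frac{1}{1 + \exp\!\bigl(-\beta\,(u_i(t) - \theta)\bigr)}

NMDA-style stabilization (rates of strongly driven nodes are clamped from below):
r_i \leftarrow \max(r_i, \gamma) \quad \text{for nodes with } u_i > \theta_{\mathrm{NMDA}}

Hebbian learning: \delta w_{ij} \propto r_i\, r_j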