Continuous attractor neural networks (CANNs)

Presentation transcript:

Continuous attractor neural networks (CANNs)
Thomas P. Trappenberg, Dalhousie University, Canada

- CANN models and their relation to ANNs
- Phase transitions in the weight-parameter space
- Hebbian learning and dimensionality discovery
- Path integration
- Drifting activity packets and NMDA stabilization

'Basic/standard' Grossberg-Hopfield-type recurrent networks, or spiking versions

'Basic/standard' CANN model

Activity packet

Phase transitions in the weight-parameter space

Various gain functions

Hebbian learning: training on Gaussian patterns. See also Kechen Zhang '96: gradient-descent training.
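The Hebbian training on Gaussian patterns mentioned here can be sketched as follows. This is a minimal NumPy illustration, not the code from the talk; the network size and pattern width are illustrative assumptions.

```python
import numpy as np

# Hebbian training on shifted Gaussian activity patterns, as the slide
# sketches. Network size n and pattern width sigma are illustrative
# assumptions, not values from the talk.
n, sigma = 100, 5.0
x = np.arange(n)

W = np.zeros((n, n))
for c in range(n):  # one Gaussian pattern centred on each node
    d = np.minimum(np.abs(x - c), n - np.abs(x - c))  # periodic distance
    r = np.exp(-d**2 / (2 * sigma**2))                # pattern firing rates
    W += np.outer(r, r)                               # Hebbian outer product
W /= n

# Training on all translations yields a translation-invariant weight
# matrix: w_ij depends only on the ring distance between nodes i and j.
print(np.allclose(W[0, 10], W[20, 30]))  # -> True
```

Because the training set contains every translation of the same pattern, the learned weights become a symmetric distance-dependent kernel, which is exactly the weight structure the basic CANN model assumes.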

Dimensionality discovery

Path-integration

The model equations. Continuous dynamics (leaky integrator):

τ du_i/dt = −u_i + (k/N) Σ_j w_ij r_j − C + I_i^ext,   r_i = 1/(1 + exp(−β(u_i − θ)))

- u_i: activity of node i
- r_i: firing rate
- w_ij: synaptic efficacy matrix
- C: global inhibition
- I^ext: visual input
- τ: time constant
- k: scaling factor
- N: number of connections per node
- β: slope
- θ: threshold

NMDA-style stabilization and Hebbian learning terms extend these dynamics.
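The leaky-integrator dynamics with the symbols listed on this slide can be simulated in a few lines. This is a minimal sketch: all parameter values (gain slope, threshold, inhibition, kernel width, cue location) are illustrative assumptions, not the values used in the talk.

```python
import numpy as np

# Minimal simulation of the leaky-integrator CANN on a ring:
#   tau du_i/dt = -u_i + (k/N) sum_j w_ij r_j - C + I_i^ext
# with a sigmoidal gain of slope beta and threshold theta.
# All parameter values are illustrative assumptions.
n, tau, dt = 100, 1.0, 0.1
beta, theta = 5.0, 0.5      # gain-function slope and threshold
C, k = 0.2, 1.0             # global inhibition and scaling factor

x = np.arange(n)
d = np.minimum(np.abs(x[:, None] - x[None, :]),
               n - np.abs(x[:, None] - x[None, :]))   # ring distances
w = 12.0 * np.exp(-d**2 / (2 * 5.0**2))               # symmetric kernel

u = np.zeros(n)
cue = np.exp(-(x - 50)**2 / (2 * 5.0**2))  # transient "visual" input at node 50

for step in range(500):
    r = 1.0 / (1.0 + np.exp(-beta * (u - theta)))  # firing rates
    I_ext = cue if step < 100 else 0.0             # cue removed after 100 steps
    u += dt / tau * (-u + k * (w @ r) / n - C + I_ext)

# A localized activity packet remains centred on the cued location
# even after the external input has been removed.
print(int(np.argmax(u)))  # -> 50
```

The key design point is the combination of short-range excitation (the Gaussian kernel) with global inhibition (the constant C): that balance is what lets a packet of activity sustain itself at any position on the ring, giving a continuum of attractor states.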

Drifting activity packets and NMDA stabilization
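Drift can be illustrated by breaking the symmetry of the recurrent weights: shifting the excitatory kernel by one node makes the activity packet travel around the ring (the same asymmetry that supports path integration), and it is this kind of sideways motion that NMDA-style stabilization is meant to counteract. A sketch of the drift half, with all model parameters being illustrative assumptions:

```python
import numpy as np

# Drift of the activity packet under asymmetric weights: shifting the
# recurrent kernel by one node makes the packet travel around the ring.
# Model form and all parameter values are illustrative assumptions.
n, tau, dt = 100, 1.0, 0.1
beta, theta, C = 5.0, 0.5, 0.2

x = np.arange(n)
shift = 1  # asymmetry: node i now receives most from node i - 1
delta = (x[:, None] - x[None, :] - shift) % n
d = np.minimum(delta, n - delta)           # ring distance to shifted source
w = 12.0 * np.exp(-d**2 / (2 * 5.0**2))    # shifted (asymmetric) kernel

u = np.zeros(n)
cue = np.exp(-(x - 50)**2 / (2 * 5.0**2))  # transient cue at node 50

for step in range(250):
    r = 1.0 / (1.0 + np.exp(-beta * (u - theta)))
    I_ext = cue if step < 100 else 0.0
    u += dt / tau * (-u + (w @ r) / n - C + I_ext)

# The packet has moved away from the cued location in the direction
# of the weight asymmetry.
print(int(np.argmax(u)) > 50)  # -> True
```

With perfectly symmetric weights the packet is marginally stable at every position, so even small asymmetries or noise cause it to wander; a slow, activity-gated (NMDA-style) current deepens the attractor at the packet's current location and so reduces this drift.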