OCNC---2004 Statistical Approach to Neural Learning and Population Coding ---- Introduction to Mathematical Neuroscience


OCNC---2004 Statistical Approach to Neural Learning and Population Coding ---- Introduction to Mathematical Neuroscience. Shun-ichi Amari, Laboratory for Mathematical Neuroscience, RIKEN Brain Science Institute

BRAIN: biological science and information science ---- computational neuroscience, neurocomputing, mathematical neuroscience

I. Mathematical Neuroscience ---- classical theories
II. Population Coding ---- modern topics
III. Bayesian Inference ---- its merits and critique

1. Mathematical Neurons
2. Dynamics of Neuro-Ensembles
3. Dynamics of Neuro-Fields
4. Learning and Self-Organization
5. Self-Organization of Neuro-Fields

I. Mathematical Neurons ---- a simple model

Output function: the neuron forms the weighted sum $u = \sum_i w_i x_i - h$ of its inputs and emits the output $y = f(u)$ through the output function $f$.

Spiking neuron: integrate-and-fire model; rate coding.
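
As a concrete illustration of the integrate-and-fire description and of rate coding, here is a minimal leaky integrate-and-fire sketch; the parameter values are illustrative assumptions, not taken from the slides.

```python
import numpy as np

def lif_spike_times(I=1.5, tau=20.0, v_th=1.0, v_reset=0.0, dt=0.1, T=200.0):
    """Leaky integrate-and-fire neuron: tau dv/dt = -v + I.
    Returns the spike times produced by a constant input current I (times in ms)."""
    v, spikes = 0.0, []
    for step in range(int(T / dt)):
        v += dt / tau * (-v + I)        # Euler integration of the membrane equation
        if v >= v_th:                   # threshold crossing: emit a spike
            spikes.append(step * dt)
            v = v_reset                 # reset after the spike
    return spikes

# the firing rate (spikes per second) is the quantity used in rate coding
print(len(lif_spike_times()) / 0.2, "spikes per second")
```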

Synchrony: spatial correlations, firing probability.

Rate coding and ensemble coding.

1-layer network

Ensemble of networks: macroscopic state and macroscopic law.

Stability: a macroscopic equilibrium is a fixed point $\bar{X} = \phi(\bar{X})$ of the macroscopic law $X_{t+1} = \phi(X_t)$, and it is stable when $|\phi'(\bar{X})| < 1$.

Associative memory: m pattern pairs.

Randomly generated patterns; random connection matrix.
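
One standard reading of "m pairs" stored in a "random matrix" is the correlation (Hebbian) construction; it is offered here as a gloss, not as the slide's exact formula. Randomly generated pattern pairs $(x^{\mu}, y^{\mu})$ are written into the connections as

\[
W \;=\; \frac{1}{n}\sum_{\mu=1}^{m} y^{\mu}\,(x^{\mu})^{\top},
\qquad x^{\mu}, y^{\mu} \in \{-1,+1\}^{n},
\]

so that presenting $x^{\nu}$ gives $W x^{\nu} = y^{\nu} + \text{crosstalk}$, and recall is reliable as long as the random patterns are nearly orthogonal.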

II. Dynamics of Neuro-Ensembles ---- spiking neurons: stochastic point processes, synchronization; ensemble coding: macrodynamics.

Simple examples: bistable and multistable dynamics.

Oscillation: Amari (1971); Wilson-Cowan (1972).
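
A minimal sketch of the oscillation in a two-population (excitatory-inhibitory) rate model of the Wilson-Cowan type; the weights and time constants below are assumptions chosen so that the fixed point is unstable and a limit cycle appears, not values from the slides.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def ei_oscillator(T=100.0, dt=0.01, tau_I=2.0):
    """Excitation drives both populations; inhibition feeds back onto E."""
    E, I = 0.1, 0.1
    traj = []
    for _ in range(int(T / dt)):
        dE = -E + sigmoid(10.0 * E - 10.0 * I)        # fast excitatory population
        dI = (-I + sigmoid(10.0 * E - 5.0)) / tau_I   # slower inhibitory population
        E, I = E + dt * dE, I + dt * dI
        traj.append(E)
    return np.array(traj)

E = ei_oscillator()
# a wide late-time range of E indicates a sustained limit cycle for these parameters
print("late-time E range:", E[5000:].min(), "to", E[5000:].max())
```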

Competitive model: winner-take-all, winner-share-some.

Multistable dynamics: associative memory (Anderson, Amari, Nakano, Kohonen, Hopfield); decision processes (Hopfield), e.g. the travelling salesman problem.

General theory: transients and attractors ---- stable states, limit cycles, chaos (strange attractors).

Chaotic behavior: random stable states, chaos; chaotic memory search.

Associative memory (content-addressable memory): dynamics, random attractors.
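
A minimal sketch of content-addressable recall by a correlation-matrix network, tracking the direction cosine (overlap) with the stored pattern; the network size, load, corruption level, and synchronous updating are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
n, m = 500, 20                                   # n neurons, m stored random patterns
X = rng.choice([-1, 1], size=(m, n))
W = (X.T @ X) / n                                # auto-correlation (Hebbian) matrix
np.fill_diagonal(W, 0.0)                         # no self-connections

s = X[0].copy()
flip = rng.choice(n, size=100, replace=False)    # corrupt 20% of the bits of pattern 0
s[flip] *= -1

for t in range(10):                              # synchronous recall dynamics
    overlap = (s @ X[0]) / n                     # direction cosine with the correct pattern
    print(f"t={t}  overlap={overlap:.3f}")
    s = np.sign(W @ s)
    s[s == 0] = 1                                # break ties deterministically
```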

Theory 1: equilibrium (signal-to-noise) analysis of the stored patterns.

Theory 2: dynamics of the recall process (statistical neurodynamics).

Macroscopic state (Amari & Maginu, 1988).
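
The macroscopic description follows a small number of order parameters instead of all n neurons. Writing $s(t)$ for the network state and $x^{1}$ for the pattern being recalled, the key quantity (the "direction cosine" of the next slides) is the overlap

\[
m_t \;=\; \frac{1}{n}\sum_{i=1}^{n} x_i^{1}\, s_i(t),
\]

and the Amari-Maginu theory closes the recall dynamics with one further macroscopic variable, the variance of the crosstalk noise, giving a low-dimensional recursion for $(m_t, \sigma_t^{2})$ whose trajectories are compared with simulations on the following slides.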

Dynamics of the recall process: direction cosine with the correct pattern plotted against time (simulations).

Direction cosine against time: theoretical prediction.

Simulations: threshold of recall; spurious memories.

Dynamics of temporal sequences (Amari, 1972); non-monotonic output function (the Morita model).

The non-monotonic model: a non-monotonic output function, as sketched below.
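
One simple way to picture a non-monotonic output function is a sign-type response that reverses when the potential grows too large; this piecewise form is only an illustration of the idea, not necessarily the exact function used in the Morita model.

```python
import numpy as np

def nonmonotonic(u, h=1.5):
    """Sign-type output that reverses for large potentials:
    +1 on (0, h], -1 beyond h, and antisymmetric for negative u."""
    return np.sign(u) * np.where(np.abs(u) <= h, 1.0, -1.0)

u = np.linspace(-3, 3, 13)
print(np.round(nonmonotonic(u), 1))   # rises with u, then falls back for large |u|
```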

Memory capacity: sparse; exact recall: no spurious memories; chaotic oscillation; inhibitory connections.

Biology: hippocampus CA3 (Rolls et al.; Tonegawa et al.); chaotic associative memory and chaotic search (Aihara et al.).

Associative memory dynamics of a chaotic neural network. Each neuron shows chaotic dynamics; the synaptic weights are determined by the auto-correlation matrix of the stored patterns. [Figure: the stored patterns and snapshots of the network state at time steps t = 0 through t = 29.]
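
The chaotic associative dynamics in the snapshots can be sketched with a network of chaotic neurons of the Aihara type (recurrent feedback plus refractoriness passed through a steep sigmoid), with auto-correlation weights. The parameter values below are assumptions in the range typically used for such models, not taken from the slides.

```python
import numpy as np

rng = np.random.default_rng(1)
n, m = 100, 4
P = rng.choice([0, 1], size=(m, n)).astype(float)   # stored binary patterns
B = 2 * P - 1
W = (B.T @ B) / m                                   # auto-correlation weight matrix
np.fill_diagonal(W, 0.0)

kf, kr, alpha, a, eps = 0.2, 0.9, 10.0, 2.0, 0.015  # assumed chaotic-neuron parameters
x = rng.random(n)                                   # analog outputs in (0, 1)
eta = np.zeros(n)                                   # feedback internal state
zeta = np.zeros(n)                                  # refractory internal state

for t in range(30):
    eta = kf * eta + W @ x                          # decaying recurrent feedback
    zeta = kr * zeta - alpha * x + a                # refractoriness after firing
    u = np.clip((eta + zeta) / eps, -60.0, 60.0)    # clip to avoid overflow in exp
    x = 1.0 / (1.0 + np.exp(-u))                    # steep sigmoid output
    overlaps = (2 * x - 1) @ B.T / n                # closeness to each stored pattern
    print(t, np.round(overlaps, 2))
```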

III. Field Dynamics of Neural Excitation ---- timing; local excitations; travelling waves; oscillations; memory; decision. (Amari, Biol. Cybern., 1978)

Dynamics of Neural Fields
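
The dynamics referred to here obey Amari's field equation: for the averaged membrane potential $u(x,t)$ at field position $x$,

\[
\tau\,\frac{\partial u(x,t)}{\partial t}
\;=\; -\,u(x,t) \;+\; \int w(x-x')\,f\bigl(u(x',t)\bigr)\,dx' \;+\; h \;+\; s(x,t),
\]

where $w$ is a lateral-inhibition (Mexican-hat) connection kernel, $f$ the output nonlinearity, $h$ the resting level, and $s(x,t)$ the external stimulus. Localized excitations ("bumps"), travelling waves, and oscillations appear depending on $w$ and $h$.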

Stable and unstable stationary solutions of the field.
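
A minimal discretized sketch of how a stable localized solution (a "bump") arises and persists: a brief local input switches the field from the resting state into a self-sustained excitation that stores the stimulus position. The kernel shape, stimulus, grid, and resting level below are illustrative assumptions, not taken from the slides.

```python
import numpy as np

def simulate_bump(N=200, L=20.0, T=30.0, dt=0.05, h=-2.0):
    """1-D neural field with a Mexican-hat kernel and Heaviside output."""
    x = np.linspace(-L / 2, L / 2, N, endpoint=False)
    dx = L / N
    d = np.abs(x[:, None] - x[None, :])
    w = 5.0 * np.exp(-d**2 / 2.0) - 1.5 * np.exp(-d**2 / 18.0)   # local excitation, broad inhibition
    u = np.full(N, h, dtype=float)
    for step in range(int(T / dt)):
        s = 6.0 * np.exp(-x**2) if step * dt < 5.0 else 0.0      # brief localized stimulus
        f = (u > 0).astype(float)                                 # Heaviside output
        u += dt * (-u + w @ f * dx + h + s)
    return x, u

x, u = simulate_bump()
print("bump width after the stimulus is gone:", (u > 0).sum() * (20.0 / 200))
```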

Excitatory and inhibitory fields: travelling waves and oscillations.

Neural Learning (Hebbian): classical theory; information source I.

Learning rules (Amari, Biol. Cybern., 1978): Hebbian correlation, generalized inverse, principal component analyzer, perceptron.

Neural learning (STDP): spike-timing-dependent plasticity; LTP and LTD windows; emergence of synchrony.
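
A common way to write the STDP rule sketched here is an exponential window in the pre-post spike time difference: LTP when the presynaptic spike precedes the postsynaptic one, LTD otherwise. The amplitudes and time constants below are assumed values of a typical order, not taken from the slides.

```python
import numpy as np

def stdp_dw(dt, a_plus=0.01, a_minus=0.012, tau_plus=20.0, tau_minus=20.0):
    """Exponential STDP window: pre-before-post (dt > 0) gives LTP,
    post-before-pre (dt < 0) gives LTD.  dt = t_post - t_pre in ms."""
    return np.where(dt > 0,
                    a_plus * np.exp(-dt / tau_plus),
                    -a_minus * np.exp(dt / tau_minus))

dts = np.array([-40.0, -10.0, 10.0, 40.0])
for dt, dw in zip(dts, stdp_dw(dts)):
    print(f"dt = {dt:+.0f} ms  ->  dw = {dw:+.5f}")
```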

Learning potential.

1. Hebbian learning. 2. Correlation associative memory.

3. Generalized inverse (least squares).

4. Principal component analyzer (Amari, 1978; Oja, 1980; see the sketch below). 5. Perceptron.
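
Item 4 can be sketched with the classic single-neuron rule of the Oja type: a Hebbian term plus a normalizing decay, which drives the weight vector to the leading principal component of the input covariance. The covariance, data size, and learning rate below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
C = np.array([[3.0, 1.0], [1.0, 2.0]])                  # input covariance
X = rng.multivariate_normal(np.zeros(2), C, size=5000)  # zero-mean input stream

w = rng.normal(size=2)
eta = 0.01
for x in X:
    y = w @ x
    w += eta * y * (x - y * w)      # Hebbian term y*x with a normalizing decay y^2*w

eigval, eigvec = np.linalg.eigh(C)
pc = eigvec[:, -1]                   # leading principal component of C
print("learned w:", np.round(w, 3))
print("|cos angle to first PC|:", round(abs(w @ pc) / np.linalg.norm(w), 3))
```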

Theory of Learning Networks (Amari, IEEE Trans. C, 1967); PDP: backpropagation; natural gradient.
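
The natural gradient mentioned here replaces the ordinary gradient by the steepest-descent direction measured in the Riemannian (Fisher information) metric $G(\theta)$ of the parameter space:

\[
\theta_{t+1} \;=\; \theta_{t} \;-\; \eta_{t}\, G(\theta_{t})^{-1}\, \nabla_{\theta}\, \ell(\theta_{t}),
\]

which follows the geometry of the statistical model rather than the raw parameter coordinates.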

Learning algorithm

outer world

Self-organization.

Proof: the receptive field (RF) of a neuron.

Special case ---- Theorem: each neuron acquires a receptive field, and the size of the receptive field is determined by the self-organizing dynamics.

Self-Organizing Nerve Field: from signal space to neural field. 1. Resolution. 2. Topological properties.

Topology: a higher-dimensional signal space (position × orientation) is mapped onto a 2-dimensional neural field.

[Figure: signal space (position, orientation) mapped onto the neural field.]
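
Amari's analysis concerns a continuous nerve field; as an algorithmic cousin, a Kohonen-style self-organizing map illustrates how a 2-D sheet can order itself to represent a higher-dimensional signal space. The grid size, neighbourhood, and learning schedule below are assumptions for illustration only.

```python
import numpy as np

rng = np.random.default_rng(0)
grid = 10
W = rng.random((grid, grid, 3))                        # 2-D sheet of units, 3-D signal space
gy, gx = np.mgrid[0:grid, 0:grid]

for t in range(5000):
    x = rng.random(3)                                  # signal drawn from the signal space
    d = ((W - x) ** 2).sum(axis=2)
    wy, wx = np.unravel_index(d.argmin(), d.shape)     # best-matching unit (winner)
    sigma = 3.0 * np.exp(-t / 2000.0)                  # shrinking neighbourhood
    eta = 0.5 * np.exp(-t / 2500.0)                    # decaying learning rate
    nb = np.exp(-((gy - wy) ** 2 + (gx - wx) ** 2) / (2 * sigma ** 2))
    W += eta * nb[:, :, None] * (x - W)                # move winner and neighbours toward x

# neighbouring units should end up with similar weight vectors (topological order)
print("mean neighbour distance:", np.linalg.norm(np.diff(W, axis=0), axis=2).mean())
```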

Self-organizing nerve field

Dynamical stability: patch structure, variational equation, stability analysis (Takeuchi & Amari, Biol. Cybern., 35, 63-72, 1979).

Topological properties: emergence of block structure.

Bayesian vs. Fisherian: any confrontation? ---- historical. A new framework? (Amari, 1967). New in neuroscience?

Bayesian framework: MLE vs. MAP (information and decision; asymptotically equivalent); regularization theory; predictive distribution.
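
In symbols, the contrast on this slide is between

\[
\hat{\theta}_{\mathrm{MLE}} \;=\; \arg\max_{\theta}\, p(D \mid \theta),
\qquad
\hat{\theta}_{\mathrm{MAP}} \;=\; \arg\max_{\theta}\, p(D \mid \theta)\,\pi(\theta),
\]

which agree asymptotically because the log-likelihood grows with the data while the log-prior stays bounded. The fully Bayesian route keeps the whole posterior and predicts through the predictive distribution

\[
p(x_{\mathrm{new}} \mid D) \;=\; \int p(x_{\mathrm{new}} \mid \theta)\, p(\theta \mid D)\, d\theta .
\]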

Priors: uniform, Jeffreys, smoothness priors; hierarchical (empirical) Bayes; the choice of prior.

Singular statistical models: singular model and prior.