Cerebellar Spiking Engine: EDLUT simulator. UGR with input from other partners.

Presentation transcript:

Cerebellar Spiking Engine: EDLUT simulator. UGR with input from other partners.

Motivation
1. Simulation of biologically plausible spiking neural structures.
2. Allow the simulation of different neuron models.
3. Allow the incorporation of new neural features into neuron models without needing to modify the simulator code.
4. Real-time simulation of medium-scale neural networks (thousands of neurons).

Cerebellum model
Inputs encoding the movement are sent (upward arrow) through the mossy fibers to the granular layer. These inputs encode the desired and actual position and velocity of each joint along the trajectory, as well as context-related information. Inputs encoding the error are sent (upper downward arrow) through the inferior olive (IO). Cerebellar outputs are provided by the deep cerebellar nuclei (DCN) cells (lower downward arrow). The DCN collects activity from the mossy fibers (excitatory inputs) and the Purkinje cells (inhibitory inputs). The outputs of the DCN are added as a corrective torque.

Using tables
The precision of the simulation mainly depends on:
- Table size and access mode (interpolation).
- Table structure (coordinate distribution): if neuron behavior exhibits abrupt changes along a specific dimension, denser sampling is required. It is also possible to use a non-uniform coordinate distribution.

Neuron model
- Each incoming spike causes the conductance of the corresponding synapse (g_j) to follow an exponential decay function.
- When the membrane potential (Vm) reaches the threshold, the neuron emits a spike.
(A code sketch of these dynamics is given after this slide.)

Synaptic plasticity & architecture

Some implementation details
- Uniform treatment of all possible events that can occur in the simulation process: every event type inherits from an Event class.
- Different event types (firing a single spike, spike propagation, TCP/IP communication, synchronization with another system, saving the current simulation status, simulation ending, etc.).
- A ProcessEvent method implements the specific treatment of each event (see the Event class sketch below).
- Implemented in C++, for Linux and Windows (with Cygwin) so far.
- The same precompiled lookup tables are used on both 32-bit and 64-bit architectures.

Network topology (figure)
Joint-related mossy fibers [120] and groups of 4 random mossy fibers project to the granule cells [1500]; the parallel fibers form plastic synapses onto the Purkinje cells [48]; the inferior olive neurons [48] deliver the teaching signal; the deep cerebellar nuclei cells [24] receive excitatory connections from the mossy fibers and inhibitory connections from the Purkinje cells.

Population coding (figure)
The trajectory in joint coordinates is translated into spike trains by population coding with leaky integrate-and-fire neurons, driven by input currents I_N1(t), I_N2(t), ..., I_N-1(t), I_N(t).
Event-driven simulation
- Advantage: spikes of biological neurons are well localized in time and not very frequent, so the number of events is low (sparse coding).
- Disadvantage: we need a mathematical expression (or method) to calculate the value of each state variable after an arbitrary time (the time of the next event).

Table-based event-driven simulator
- Before the simulation: the state-variable equations are integrated with the Euler / Runge-Kutta method to precompute an (N+1)-dimensional table for each state variable v_n (Table 1, ..., Table N).
- During the simulation: the neuron state after an arbitrary time is obtained by table lookup (see the table sketch below).

Conductance-based synaptic input
- Neural parameters: Cm, Eexc, τexc, Einh, τinh, Erest, grest.
- Neural state variables: Vm, Gexc, Ginh.
- Vm depends on Vm, Gexc, Ginh and t: a 4-dimensional table Vm(Vm0, Gexc0, Ginh0, Δt).
- Gexc depends on Gexc and t: a 2-dimensional table Gexc(Gexc0, Δt).
- Ginh depends on Ginh and t: a 2-dimensional table Ginh(Ginh0, Δt).

Simulator architecture (figure)
The simulator takes a network definition (NET.CFG: neuron numbers and types; connections with their delays and synapses), the synaptic weights (WEIGHTS.DAT), and the neuron-model lookup tables (MODEL_N.DAT), built by function approximation from the neuron model definitions (equations and table definitions, TAB2.CFG): Vm(Vi,Gi,Ge,t), Ge(t), ..., and the firing-time tables Tf(Vi,Gi,Ge) and Tf_end(Vi,Gi,Ge). Given input spikes, it produces output spikes.

Learning laws
- Long-term depression (LTD): applied each time a spike from the inferior olive arrives.
- Long-term potentiation (LTP): applied each time a spike from a granule cell arrives at synapse i.

Simulation loop
While the end time is not yet reached:
1. Extract the event with the shortest latency from the spike heap.
2. Update the corresponding neuron state to the current event time.
3. Apply the current event's effect.
4. Insert newly produced events into the spike heap.

Learning laws (figure)
Amount of LTD as a function of the last IO spike arrival time (ms), for the kernels x·e^(-x), sin(x)^2·e^(-x), and sin(x)^20·e^(-x).
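As an example of the "before the simulation / during the simulation" split, the sketch below precomputes the 2-dimensional table Gexc(Gexc0, Δt) by Euler integration of dG/dt = -G/τ and reads it back with linear interpolation along Δt. The class name, grid sizes, value ranges, uniform sampling and interpolation scheme are all assumptions; the slides note that non-uniform coordinate distributions can be used where the behavior changes abruptly.

```cpp
#include <vector>

// Sketch of one precompiled lookup table, Gexc(Gexc0, dt): built before
// the simulation by numerically integrating dG/dt = -G/tau (Euler steps),
// then read during the simulation with interpolation along dt.
class ConductanceTable {
public:
    ConductanceTable(double gMax, double tMax, int nG, int nT, double tau)
        : gMax_(gMax), tMax_(tMax), nG_(nG), nT_(nT), data_(nG * nT) {
        const double h = 1e-4;  // Euler integration step (assumed)
        for (int i = 0; i < nG_; ++i) {
            double g0 = gMax_ * i / (nG_ - 1);
            for (int j = 0; j < nT_; ++j) {
                double dt = tMax_ * j / (nT_ - 1);
                double g = g0;
                for (double t = 0.0; t < dt; t += h)
                    g += h * (-g / tau);  // Euler step of dG/dt = -G/tau
                data_[i * nT_ + j] = g;
            }
        }
    }

    // Nearest grid point along Gexc0, linear interpolation along dt.
    double Lookup(double g0, double dt) const {
        int i = static_cast<int>(g0 / gMax_ * (nG_ - 1) + 0.5);
        if (i > nG_ - 1) i = nG_ - 1;
        double x = dt / tMax_ * (nT_ - 1);
        int j = static_cast<int>(x);
        if (j >= nT_ - 1) return data_[i * nT_ + (nT_ - 1)];
        double f = x - j;
        return (1.0 - f) * data_[i * nT_ + j] + f * data_[i * nT_ + j + 1];
    }

private:
    double gMax_, tMax_;
    int nG_, nT_;
    std::vector<double> data_;
};
```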
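The simulation loop above maps naturally onto a priority queue ordered by event time. This sketch reuses the Event class from the earlier sketch; the comparator, smart pointers and function signature are assumptions, not EDLUT's actual implementation.

```cpp
#include <memory>
#include <queue>
#include <vector>

// Orders the heap so that the event with the shortest latency is on top.
struct LaterEvent {
    bool operator()(const std::shared_ptr<Event>& a,
                    const std::shared_ptr<Event>& b) const {
        return a->Time() > b->Time();  // min-heap on event time
    }
};

void RunSimulation(Simulation& sim,
                   std::priority_queue<std::shared_ptr<Event>,
                                       std::vector<std::shared_ptr<Event>>,
                                       LaterEvent>& heap,
                   double endTime) {
    // While the end time is not yet reached:
    while (!heap.empty() && heap.top()->Time() <= endTime) {
        std::shared_ptr<Event> ev = heap.top();  // shortest-latency event
        heap.pop();
        // Updating the neuron state to the event time, applying the event
        // effect, and inserting newly produced events into the heap all
        // happen inside the event's own ProcessEvent method.
        ev->ProcessEvent(sim);
    }
}
```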
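Finally, a hedged sketch of the learning laws: LTP as a fixed increment on each granule-cell spike, and LTD driven by inferior-olive spikes using the x·e^(-x) kernel from the figure above. The struct, the constants kLtpStep and kLtdScale, and the non-negativity clamp are assumptions for illustration.

```cpp
#include <cmath>
#include <vector>

// One parallel-fiber synapse onto a Purkinje cell (illustrative).
struct ParallelFiberSynapse {
    double weight;
    double lastPreSpike;  // time of the last granule-cell spike (ms)
};

const double kLtpStep = 0.001;   // assumed LTP increment
const double kLtdScale = 0.005;  // assumed LTD scaling factor

// LTP: each time a spike from a granule cell arrives at synapse i.
void OnGranuleSpike(ParallelFiberSynapse& syn, double t) {
    syn.weight += kLtpStep;
    syn.lastPreSpike = t;
}

// LTD: each time a spike from the inferior olive arrives, every synapse
// is depressed by an amount depending on its last granule-cell spike time,
// here with the kernel f(x) = x * exp(-x) shown in the figure.
void OnInferiorOliveSpike(std::vector<ParallelFiberSynapse>& synapses,
                          double t) {
    for (ParallelFiberSynapse& syn : synapses) {
        double x = t - syn.lastPreSpike;         // elapsed time (ms)
        double kernel = x * std::exp(-x);        // LTD kernel
        syn.weight -= kLtdScale * kernel;        // associative depression
        if (syn.weight < 0.0) syn.weight = 0.0;  // keep weights non-negative
    }
}
```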