SP4 Theoretical Neuroscience – HBP 2nd Periodic Review – June 2016. SP4 Major Achievements: Ramp-Up Phase overall. (Co-funded by the European Union.)


WP4.2: Synaptic Plasticity and Learning – Gruning lab
Task: Structures of spiking learning algorithms. ALGO MultilayerSpiker (Gruning lab, Surrey); GIT repository.
Links: SGA1-SP9 neuromorphic; SP4 CDP5 (ramp-up phase, work finished); SGA1-SP6 NEST Simulator; CDP5.
References: Gardner, Sporea, Gruning, “Learning Spatio-Temporally Encoded Pattern Transformations in Structured Spiking Neural Networks”, Neural Computation 27(12). Gardner, Gruning, “Optimal Supervised Learning…”, submitted March 2016.

Big question: How can neural-level mechanisms “conspire” to yield supervised learning at the systems level with technically acceptable performance?

Sample learning task: learning a target output spike train in response to a single, fixed input pattern. The network contained n_i = 100 input neurons, n_h = 10 hidden neurons, and a single output neuron. The input pattern was repeatedly presented to the network over 1000 episodes, each lasting T = 500 ms. The target output spike train contained five spikes. (A) A spike raster of the input pattern. (B) The activity of one hidden neuron in each episode. (C) The activity of the output neuron, with target output spike times indicated by crosses. (D) An illustration of the multilayer network setup. (E) The evolution of the distance/error between the actual and target output spike trains of the network, given as a moving average of the van Rossum distance.
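The van Rossum distance used as the error measure in panel (E) can be sketched in a few lines of NumPy. This is an illustrative discretisation, not the deck's own implementation; the function name and the kernel time constant `tau` are assumptions (the 500 ms window matches the task above):

```python
import numpy as np

def van_rossum_distance(spikes_a, spikes_b, tau=10.0, t_max=500.0, dt=0.1):
    """Discretised van Rossum distance between two spike trains (times in ms).

    Each train is convolved with a causal exponential kernel exp(-t/tau);
    the distance is the L2 norm of the difference of the filtered traces,
    normalised by tau so a lone unpaired spike contributes ~sqrt(1/2).
    """
    t = np.arange(0.0, t_max, dt)

    def filtered(spikes):
        trace = np.zeros_like(t)
        for s in spikes:
            mask = t >= s
            trace[mask] += np.exp(-(t[mask] - s) / tau)
        return trace

    diff = filtered(spikes_a) - filtered(spikes_b)
    return np.sqrt(np.sum(diff ** 2) * dt / tau)
```

Identical trains give distance 0, and the measure is symmetric in its two arguments; the moving average in panel (E) would simply smooth this value over successive episodes.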

The dependence of network performance on the number of input patterns, the number of hidden neurons n_h, and the number of target output spikes for a single output neuron. (Left) Performance as a function of the number of input patterns, for n_h = 10 (A), n_h = 20 (B), and n_h = 30 (C) hidden neurons. In each panel, the different curves (blue, red, green) correspond to n_a = 1, 5, 10 target output spikes identifying each input. (Right) The number of episodes to convergence in learning.
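A task instance of the kind scanned in this experiment (many input patterns, each tagged by a handful of precisely timed target spikes) can be generated with a short sketch. The default sizes follow the slides; the Poisson input rate `rate_hz` and all function names are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(42)

def make_task(n_inputs=100, n_patterns=200, n_target_spikes=5,
              rate_hz=6.0, duration_ms=500.0):
    """Generate random Poisson input patterns and, for each pattern,
    a set of precisely timed target output spikes identifying it."""
    patterns, targets = [], []
    for _ in range(n_patterns):
        # each input neuron fires as an independent Poisson process
        pattern = [np.sort(rng.uniform(0.0, duration_ms,
                           rng.poisson(rate_hz * duration_ms / 1000.0)))
                   for _ in range(n_inputs)]
        # target: n_target_spikes timed output spikes labelling this pattern
        target = np.sort(rng.uniform(0.0, duration_ms, n_target_spikes))
        patterns.append(pattern)
        targets.append(target)
    return patterns, targets
```

Sweeping `n_patterns` and `n_target_spikes` over such instances, while varying the hidden-layer size, reproduces the shape of the scan described above.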

Main results:
1. New learning algorithm MultiLayerSpiker. Compared to other learning algorithms for spiking neural networks, we can learn:
   - more input-output mappings: 200 pattern pairs here;
   - more timed output spikes: up to 10 individually timed spikes here;
   - with multiple outputs: up to 30 here vs. 1.
2. Conclusions for HBP from a comparison of diverse supervised learning rules (both biologically and machine-learning oriented):
   - all need either a target or a reward signal (“third signals” / three-factor rules: neuromodulators, weak synapses, etc.);
   - the neuromorphic platforms (entirely) and NEST (partially) do not yet implement the infrastructure to support third signals;
   - hence changes are to be implemented in NEST and SpiNNaker during SGA1 in CDP5.
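The three-factor structure behind point 2 can be illustrated with a minimal NumPy sketch. This is a generic toy rule, not the MultilayerSpiker algorithm itself; all sizes, constants, and names are illustrative assumptions. Local pre/post coincidences charge a per-synapse eligibility trace, and a globally broadcast third signal (reward or target error) gates whether the trace is converted into a weight change:

```python
import numpy as np

rng = np.random.default_rng(0)

n_pre, n_post = 100, 10
w = rng.normal(0.0, 0.1, size=(n_post, n_pre))   # synaptic weights
eligibility = np.zeros_like(w)                    # per-synapse eligibility trace
tau_e = 20.0                                      # trace time constant (ms)
dt = 1.0                                          # time step (ms)
eta = 0.01                                        # learning rate

def step(pre_activity, post_activity, third_signal):
    """One update of a generic three-factor rule.

    Factors 1 and 2 (local pre- and postsynaptic activity) only charge
    the eligibility trace; factor 3, the scalar third signal, decides
    the sign and size of the actual weight change.
    """
    # local Hebbian coincidence: outer product of post x pre activity
    coincidence = np.outer(post_activity, pre_activity)
    eligibility[:] += dt * (-eligibility / tau_e + coincidence)
    # globally broadcast neuromodulatory signal gates the update
    w[:] += eta * third_signal * eligibility
```

The infrastructure gap noted above is exactly the broadcast of `third_signal`: without a mechanism to deliver such a scalar to every synapse, a simulator or neuromorphic platform can only express the purely local two-factor part of the rule.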