Grid-based Simulations of Mammalian Visual System
Grzegorz M. Wójcik and Wiesław A. Kamiński
Maria Curie-Sklodowska University, Lublin, Poland


Abstract

Large biological neural networks are examined. Ensembles of simulated microcircuits model the behaviour of the mammalian visual system in some detail. All neural cells are simulated according to Hodgkin-Huxley theory; in this model, each neuron is treated as a set of several non-linear differential equations. Accurate simulation of large groups of Hodgkin-Huxley neurons usually requires high computational power. The modular structure of the visual system makes it a suitable task for grid computations. In this paper we report the first results of applying the CLUSTERIX grid to the modelling of vision processes. The MPGENESIS simulator is used for all simulations. We investigate networks consisting of 16 thousand Hodgkin-Huxley neurons. The first simulations were run on a local cluster with 24 nodes. In the next stage of experiments, we are going to measure the simulation time for a larger number of processors, using CLUSTERIX grid resources. This number of simulated neurons allowed us to observe liquid computing phenomena. In theory, cortical microcircuits are treated as Liquid State Machines (LSM); the work of each machine resembles the behaviour of particles in a liquid. We also present some results confirming the thesis that neural liquids tend to be in different states for different stimulations changing in time, and that such biological structures can have computational power.

Introduction

The human brain, built of about 10^11 neural cells, is a hard object to simulate even for contemporary supercomputers. An idea for whole-brain modelling was suggested by Maass and has since been called the Liquid State Machine (LSM) [1-3]. In general, the brain (or a fragment of it) is treated as a liquid. The mammalian cortex is built of neurons organised into microcircuits [4]. Microcircuits form columns, and the function of each column depends on its location in the brain. Cortical microcircuits turn out to be very good "liquids" for computing on perturbations because of the large diversity of their elements (neurons and synapses) and the large variety of mechanisms and time constants characterising their interactions, involving recurrent connections on multiple spatial scales. Like the Turing machine, the LSM model is based on a strict mathematical framework that, under ideal conditions, guarantees universal computational power [3].
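For reference, the "set of several non-linear differential equations" per neuron mentioned in the abstract is the standard Hodgkin-Huxley membrane model; its textbook form [5] is sketched below (this is the canonical formulation, not transcribed from the authors' GENESIS scripts):

$$C_m \frac{dV}{dt} = -\bar{g}_{\mathrm{Na}}\, m^3 h\, (V - E_{\mathrm{Na}}) - \bar{g}_{\mathrm{K}}\, n^4 (V - E_{\mathrm{K}}) - g_L (V - E_L) + I_{\mathrm{ext}},$$

with each gating variable $x \in \{m, h, n\}$ obeying

$$\frac{dx}{dt} = \alpha_x(V)\,(1 - x) - \beta_x(V)\, x,$$

so that every simulated neuron contributes four coupled non-linear ODEs.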
Fig. 1. Scheme of the simulated visual system: Retina (256 neurons, 16 (4×4) patches) – Liquid (16 (4×4) columns, 16×1024 neurons) – Readout (256 neurons, 16 (4×4) patches).

We investigate a model in which the Liquid is simulated by an ensemble of 16 HHLSM (Hodgkin-Huxley Liquid State Machine) columns, about 16 thousand neurons in total. 30% of randomly chosen retinal cells are stimulated, and the signal is transformed by the Liquid; as a result we obtain some activity of the Readout device. We simulate 20 ms of biological work of this system. The main objective of the reported research is to check how the simulation time depends on the number of processors used and on the percentage of connections established among the neurons of the liquid. As mentioned, the architecture of the network implies that the problem is most effectively parallelised into sixteen main nodes with one controlling node. The results confirm our expectations (see Fig. 2): the simulation time reaches its minimum when the model parallelised for 17 nodes runs on 17 processors. Note that increasing the number of processors beyond that is useless for the algorithm with the 17-node parallelisation implemented.

Model of mammalian visual system

The discussed model of the mammalian visual system consists of three main modules (Fig. 1). The Retina is built on a 16×16 square grid and is divided into 16 patches (4×4). Each patch is connected to an HHLSM column, which simulates the Lateral Geniculate Nucleus (LGN) and an ensemble of cortical microcircuits. An HHLSM consists of 1024 cells put on an 8×8×16 grid, i.e. 16 layers of 8×8 cells arranged in each column. The set of columns simulates the Liquid, which is connected to a so-called Readout device. The Readout's architecture is similar to the Retina's: in analogy, it is divided into 16 patches with 16 cells in each patch. Connections among layers, and among the neurons of each layer, are established with some probability, e.g. p = 10%. All simulations discussed in this paper are conducted in the parallel version of GENESIS for the MPI environment [6]. Such a model can easily be scaled to a multiprocessor simulation. In the reported research, each column with its corresponding retinal and readout patches should be simulated on one node; in that case we require 16 processors for the best realisation of the model and an additional one for controlling the simulation. However, both the Retina and the Readout may easily be divided into 4 (2×2), 64 (8×8) or 256 (16×16) patches, depending on the number of processors available. Thus, if each patch is connected to a corresponding HHLSM column, it becomes possible to conduct a simulation of about 256 thousand Hodgkin-Huxley neural cells.

Fig. 2. Time of simulation as a function of the number of processors.

Next, 500 ms of the real system's work was simulated. 30% of the retinal cells were stimulated with random spike trains varying in time. The signal was then transformed by the liquid and, as a result, some activity appeared on the readout. We investigated whether Liquid states evolving in the high-dimensional state space remain significantly distant. Fig. 3 shows the typical Euclidean distance between two liquid states calculated for two different stimulating patterns. One can note a peak of characteristic liquid activity appearing within about the first 50 ms. Such peaks are also observed in Maass's work [1]. This shows that liquid computing appears also in networks of Hodgkin-Huxley neurons, and that its properties are quite similar to those known from networks of integrate-and-fire neurons. After the abovementioned 50 ms, some oscillations can still be observed; however, they stay at the same level until the end of the simulation. This implies that the network is in two completely different states for the two different stimulations.

Fig. 3. Distance of the liquid states for two different stimulating patterns.
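To make the distance measure behind Fig. 3 concrete, here is a minimal Python sketch of how such a curve can be computed. The choice of liquid state (an exponentially low-pass-filtered spike count per neuron, with time constant tau) and the 1 ms bin width are our assumptions following common LSM practice [1], not the authors' exact procedure; spike_times is a hypothetical list of per-neuron spike-time arrays exported from the simulator.

```python
import numpy as np

def liquid_states(spike_times, n_neurons, t_end=500.0, dt=1.0, tau=30.0):
    """Turn per-neuron spike-time arrays (ms) into a state trajectory:
    an exponentially filtered spike count per neuron, sampled every dt ms."""
    n_bins = int(t_end / dt)
    states = np.zeros((n_bins, n_neurons))
    trace = np.zeros(n_neurons)
    decay = np.exp(-dt / tau)
    for b in range(n_bins):
        trace *= decay                      # passive decay of the filter
        t0, t1 = b * dt, (b + 1) * dt
        for i in range(n_neurons):
            # count the spikes of neuron i falling into the current bin
            s = np.asarray(spike_times[i])
            trace[i] += np.count_nonzero((s >= t0) & (s < t1))
        states[b] = trace
    return states

def state_distance(states_a, states_b):
    """Euclidean distance between two liquid-state trajectories,
    one value per time step -- the quantity plotted in Fig. 3."""
    return np.linalg.norm(states_a - states_b, axis=1)
```

Applying state_distance to trajectories obtained for two different input patterns should reproduce the qualitative picture described above: an initial transient peak within roughly the first 50 ms and a sustained non-zero separation thereafter.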
References

[1] Maass W., Natschläger T., Markram H.: Real-Time Computing Without Stable States: A New Framework for Neural Computation Based on Perturbations. Neural Computation 14(11) (2002).
[2] Kamiński W.A., Wójcik G.M.: Liquid State Machine Built of Hodgkin-Huxley Neurons – Pattern Recognition and Informational Entropy. Annales Informatica UMCS, vol. 1, Lublin (2003).
[3] Wójcik G.M., Kamiński W.A.: Liquid State Machine Built of Hodgkin-Huxley Neurons and Pattern Recognition. Neurocomputing, vol. 239 (2004).
[4] Gupta A., Wang Y., Markram H.: Organizing Principles for a Diversity of GABAergic Interneurons and Synapses in the Neocortex. Science 287 (2000).
[5] Hodgkin A., Huxley A.: Currents Carried by Sodium and Potassium Ions through the Membrane of the Giant Axon of Loligo. J. Physiol. (London) (1952).
[6] Bower J.M., Beeman D.: The Book of GENESIS – Exploring Realistic Neural Models with the GEneral NEural SImulation System. Telos, New York.
[7] CLUSTERIX: