Nikola Kasabov, FIEEE, FRSNZ INNS Education Symposium – Lima, Peru, 24-25.01.2011 New Trends in Artificial Neural Networks for Evolving Intelligent Systems.


Nikola Kasabov, FIEEE, FRSNZ
INNS Education Symposium – Lima, Peru, 24-25.01.2011
New Trends in Artificial Neural Networks for Evolving Intelligent Systems and Knowledge Engineering
Knowledge Engineering and Discovery Research Institute (KEDRI), Auckland University of Technology (AUT), New Zealand
Immediate Past President, International Neural Network Society

The Scope of the Contemporary NN Research:
- Understanding, modelling and curing the brain
- Cognition and cognitive modelling
- Memory; learning and development
- Neurogenetic modelling
- Neuroinformatics
- Mathematics of the brain
- Brain ontologies
- Bioinformatics
- Molecular computing
- Quantum information processing; quantum-inspired neural networks
- Novel methods of soft computing for adaptive modelling and knowledge discovery
- Hybrid NN, fuzzy and evolutionary algorithms
- Methods of evolving intelligence (EI)
- Evolving molecular processes and their modelling
- Evolving processes in the brain and their modelling
- Evolving language and cognition
- Adaptive integrated/embedded systems

The Scope of the Contemporary NN Applications:
- Adaptive speech, image and multimodal processing
- Biosecurity
- Adaptive decision support systems
- Dynamic time-series modelling
- Adaptive control
- Adaptive intelligent systems on the WWW
- Medicine, health, information technologies, horticulture, agriculture, business and finance, process and robot control, arts and design
- Space research
- Earth and oceanic research

Abstract of the talk
The talk presents theoretical foundations and practical applications of evolving intelligent information processing systems inspired by information principles in Nature, in their interaction and integration. These include neuronal, genetic and quantum information principles, all manifesting the feature of evolvability. First, the talk reviews the main principles of information processing at the neuronal, genetic and quantum information levels. Each of these levels has already inspired the creation of efficient computational models that incrementally evolve their structure and functionality from incoming data and through interaction with the environment. The talk also extends these paradigms with novel methods and systems that integrate these principles. Examples of such models are: evolving spiking neural networks; computational neurogenetic models (where interaction between genes, either artificial or real, is part of the neuronal information processing); quantum-inspired evolutionary algorithms; and probabilistic spiking neural networks utilising quantum computation as a probability theory. The new models are significantly faster in feature selection and learning and can be applied to efficiently solving complex biological and engineering problems for adaptive, incremental learning and knowledge discovery in large-dimensional spaces and in new environments. Examples include: incremental learning systems; on-line multimodal audiovisual information processing; evolving neuro-genetic systems; bioinformatics; biomedical decision support systems; and cyber-security. Open questions, challenges and directions for further research are presented.

Text and research book:
- N. Kasabov (2007) Evolving Connectionist Systems: The Knowledge Engineering Approach, Springer, London
Main references:
- N. Kasabov, Evolving Intelligence in Humans and Machines: Integrative Connectionist Systems Approach, feature article, IEEE CIS Magazine, August 2008, vol. 3, no. 3
- N. Kasabov, Integrative Connectionist Learning Systems Inspired by Nature: Current Models, Future Trends and Challenges, Natural Computing, Int. Journal, Springer, vol. 8, issue 2, 2009
- N. Kasabov, To spike or not to spike: A probabilistic spiking neuron model, Neural Networks, vol. 23, issue 1, January 2010
- L. Benuskova and N. Kasabov, Computational Neuro-genetic Modelling, Springer, New York, 2007, 290 pages
- P. Angelov, D. Filev and N. Kasabov (eds), Evolving Intelligent Systems, IEEE Press and Wiley, 2010

Content of the talk
SNN have tremendous potential as a new paradigm to implement and achieve computational intelligence (CI). Based on biological evidence, new SNN models can be developed to solve complex generic and specific tasks of CI.
1) SNN: models, applications, challenges
2) Evolving SNN (eSNN)
3) Probabilistic evolving SNN (peSNN)
4) Quantum-inspired optimisation of evolving SNN
5) Neuro-genetic evolving SNN (ngeSNN)
6) Further NN models and applications

1. SNN: Models, Applications, Challenges
Information processing principles in neurons and neural networks:
- LTP and LTD
- Trains of spikes
- Time, frequency and space
- Synchronisation and stochasticity
- Evolvability
They offer the potential for:
- Modelling cognitive functions through patterns of neuronal spiking activity
- Modelling neuronal activities based on genes and proteins
- Integration of different 'levels' of information processing

Models of Spiking Neurons
Spiking neuron models incorporate the concepts of time and phase in relation to the neuronal and synaptic states.
Microscopic level: modelling of ion channels that depend on the presence/absence of various chemical messenger molecules:
- Hodgkin-Huxley's model
- Izhikevich's model
Macroscopic level: the neuron is a homogeneous unit, receiving and emitting spikes according to defined internal dynamics:
- Spike response model (SRM) (Maass)
- Integrate-and-fire models (IF, LIF) (Maass, Gerstner)
- Thorpe's model
- A probabilistic spiking neuron model (pSNM)
Integrative:
- A quantum-inspired optimisation of evolving SNN
- A neuro-genetic evolving SNN (ngeSNN)
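As a rough illustration of the macroscopic level, the following is a minimal leaky integrate-and-fire (LIF) neuron sketch; the `simulate_lif` helper and all parameter values are illustrative choices, not taken from the talk.

```python
def simulate_lif(input_current, dt=1.0, tau=10.0, v_rest=0.0,
                 v_threshold=1.0, v_reset=0.0):
    """Simulate a leaky integrate-and-fire neuron.

    Returns the membrane-potential trace and the list of spike times
    (indices into the input sequence). Parameters are illustrative.
    """
    v = v_rest
    trace, spikes = [], []
    for t, i_in in enumerate(input_current):
        # Leaky integration: the potential decays toward rest and is
        # driven upward by the input current.
        v += dt / tau * (-(v - v_rest) + i_in)
        if v >= v_threshold:      # threshold crossing -> emit a spike
            spikes.append(t)
            v = v_reset           # reset the membrane after firing
        trace.append(v)
    return trace, spikes

# A constant supra-threshold current produces regular spiking.
trace, spikes = simulate_lif([1.5] * 100)
```

With these parameters the potential converges geometrically toward the input value, so a constant input above the threshold yields a regular spike train.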

Rank Order Population Encoding
Distributes a single real input value to multiple neurons and may cause the excitation and firing of several responding neurons.
Implementation based on Gaussian receptive fields, introduced by Bohte et al.
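A sketch of this encoding scheme, assuming the common convention for receptive-field centres and widths from Bohte et al.; the function name and the mapping from excitation to firing time are my own simplifications, and details vary between implementations.

```python
import math

def gaussian_population_encode(x, m=5, x_min=0.0, x_max=1.0, beta=1.5):
    """Encode one real value into m firing times via Gaussian
    receptive fields (population rank coding).

    Returns firing times in [0, 1]: strongly excited neurons fire
    early, weakly excited ones fire late (or effectively not at all).
    """
    # Evenly spread centres, with two fields extending past the range
    # boundaries (one common convention; others exist).
    centers = [x_min + (2 * j - 3) / 2.0 * (x_max - x_min) / (m - 2)
               for j in range(1, m + 1)]
    sigma = (x_max - x_min) / (beta * (m - 2))
    times = []
    for c in centers:
        excitation = math.exp(-0.5 * ((x - c) / sigma) ** 2)  # in (0, 1]
        times.append(1.0 - excitation)  # high excitation -> early spike
    return times

# The input 0.5 maximally excites the middle receptive field,
# so the middle neuron fires first.
times = gaussian_population_encode(0.5)
```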

Learning in SNN
Due to the time dependence, learning methods in SNN are rather complex; recurrent SNN introduce additional difficulties and complexity.
- Unsupervised Hebbian learning
- Spike-timing dependent plasticity (STDP)
- Reinforcement learning (e.g. for robotics applications)
- SpikeProp: supervised error back-propagation, similar to learning in classical MLP
- (Linear) readout functions for liquid state machines (Maass et al.)
- ReSuMe (Remote Supervised Method): capable of learning the mapping from input to output spike trains
- Weight optimisation based on evolutionary algorithms (EA)
- Combined EA and STDP

Spike-Timing Dependent Plasticity (STDP)
- A Hebbian form of plasticity, expressed as long-term potentiation (LTP) and long-term depression (LTD)
- Synaptic efficacies are strengthened or weakened based on the timing of post-synaptic action potentials relative to pre-synaptic spikes
- Pre-synaptic activity that precedes post-synaptic firing can induce LTP; reversing this temporal order causes LTD
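The timing rule above is commonly modelled with an exponential STDP window; here is a minimal sketch. The function name and the amplitude/time-constant values are illustrative assumptions, not values from the talk.

```python
import math

def stdp_weight_change(t_pre, t_post, a_plus=0.1, a_minus=0.12,
                       tau_plus=20.0, tau_minus=20.0):
    """Weight change for one pre/post spike pair (exponential window).

    Pre before post (dt > 0) -> potentiation (LTP);
    post before pre (dt < 0) -> depression (LTD);
    the magnitude decays with the time difference.
    """
    dt = t_post - t_pre
    if dt > 0:        # causal order: strengthen the synapse
        return a_plus * math.exp(-dt / tau_plus)
    elif dt < 0:      # anti-causal order: weaken the synapse
        return -a_minus * math.exp(dt / tau_minus)
    return 0.0
```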

Thorpe's Model
A simple but computationally efficient neural model in which earlier spikes are weighted more strongly (time-to-first-spike learning). The model was inspired by the neural processing of the human eye and introduced by S. Thorpe et al.
The PSP u_i(t) of a neuron i is:
u_i(t) = Σ_{j | f(j) < t} w_{j,i} · m_i^order(j)
where w_{j,i} is the weight of the connection between neuron j and neuron i, f(j) is the firing time of j, and m_i is a parameter of the model (modulation factor). The function order(j) represents the rank of the spike emitted by neuron j and received at neuron i.
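A direct sketch of this PSP computation (the `thorpe_psp` helper and the example weights are mine; with a modulation factor below 1, rank 0 contributes the most):

```python
def thorpe_psp(spike_order, weights, mod=0.9):
    """PSP of a Thorpe-model neuron: u_i = sum_j w_ji * mod ** order(j).

    spike_order lists the input-neuron indices in the order they fired
    (rank 0 = earliest spike); earlier spikes are modulated less and
    therefore contribute more to the PSP.
    """
    psp = 0.0
    for rank, j in enumerate(spike_order):
        psp += weights[j] * mod ** rank
    return psp

# Hypothetical example: three inputs with unit weights fire in the
# order 2, 0, 1; contributions are 1.0, 0.9 and 0.81 respectively.
u = thorpe_psp(spike_order=[2, 0, 1], weights=[1.0, 1.0, 1.0], mod=0.9)
```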

Software and hardware platforms for SNN applications
Software simulators:
- NEURON
- GENESIS
- Izhikevich's software
- eSNN and CNGM
- SpikeNet (Delorme and Thorpe)
- many others
Hardware realisations:
- SPINN (TU Berlin)
- SpiNNaker (U. Manchester)
- FPGA (U. Ulster, McGinnity)
- BlueGene (IBM Almaden, Modha)
- EMBRACE (EU)
- Chip design (G. Indiveri, U. Zurich)

2. Evolving SNN (eSNN)
Inspiration from the brain: the brain evolves through genetic "pre-wiring" and life-long learning.
- Evolving structures and functions
- Evolving features
- Evolving knowledge
- Local (e.g. cluster-based) learning and global optimisation
- Memory (prototype)-based learning, "traceable"
- Multimodal, incremental learning
- Spiking activity
- Genes/proteins involved
- Quantum effects in ion channels
The challenge: to develop evolving SNN models (eSNN) that facilitate the creation of evolving CI.

An eSNN model (Kasabov, 2007; Wysoski, Benuskova and Kasabov)
Creating and merging neurons based on localised information; uses the first-spike principle (Thorpe et al.) for fast on-line training. Weights change based on spike time arrival.
For each input vector:
a) Create (evolve) a new output spiking neuron and its connections.
b) Propagate the input vector into the network and train the newly created neuron.
c) Calculate the similarity between the weight vectors of the newly created neuron and the existing neurons: IF similarity > SIMthreshold, THEN merge the newly created neuron with the most similar neuron, averaging over the N samples previously used to update the respective neuron.
d) Update the corresponding PSP threshold ϑ.
Three main parameters of the eSNN: modulation factor m; spiking threshold ϑ; SIMthreshold.
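A minimal sketch of this one-pass training loop, assuming rank-order-encoded inputs. The `esnn_train` helper, the cosine similarity measure and the running-average merge are my simplifications; the PSP thresholds ϑ and the output spiking dynamics are omitted.

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two weight vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb) if na and nb else 0.0

def esnn_train(samples, sim_threshold=0.95, mod=0.9):
    """One-pass eSNN training sketch.

    Each sample is a list of input-neuron indices in firing order.
    For each sample a new output neuron is evolved with weights set
    from spike rank (w_j = mod ** order(j)); it is merged into the
    most similar existing neuron when similarity > sim_threshold,
    using a running average over the N samples merged so far.
    """
    repository = []   # each entry: {'w': weights, 'n': samples merged}
    for order in samples:
        w = [0.0] * len(order)
        for rank, j in enumerate(order):
            w[j] = mod ** rank
        best, best_sim = None, -1.0
        for neuron in repository:
            s = cosine_similarity(w, neuron['w'])
            if s > best_sim:
                best, best_sim = neuron, s
        if best is not None and best_sim > sim_threshold:
            n = best['n']                      # running-average merge
            best['w'] = [(wv * n + nv) / (n + 1)
                         for nv, wv in zip(w, best['w'])]
            best['n'] = n + 1
        else:
            repository.append({'w': w, 'n': 1})
    return repository

# Two identical firing orders merge into one neuron; the reversed
# order is dissimilar enough (at this threshold) to evolve a new one.
reps = esnn_train([[0, 1, 2], [0, 1, 2], [2, 1, 0]], sim_threshold=0.99)
```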

eSNN for integrated audio-visual information processing
Person authentication based on speech and face data (Wysoski, Benuskova and Kasabov, Proc. ICONIP 2007; Neural Networks, 2010)

eSNN for taste recognition
S. Soltic, S. Wysoski and N. Kasabov, Evolving spiking neural networks for taste recognition, Proc. WCCI 2008, Hong Kong, IEEE Press, 2008
- The L2 layer evolves during the learning stage (S_Θ).
- Each class C_i is represented by an ensemble of L2 neurons; each ensemble (G_i) is trained to represent one class.
- The latency of L2 neurons' firing is decided by the order of incoming spikes.
[Architecture diagram: tastants (food, beverages) → "artificial tongue" → GRF layer with population rank coding (m receptive fields) → L1 neurons (j) → L2 neurons (i), grouped into ensembles G_1 (C_1) … G_k (C_k)]

Knowledge discovery through eSNN (S. Soltic and N. Kasabov, Int. J. Neural Systems, 2010)
The eSNN architecture can be analysed for new information discovery at several levels:
- Feature subset selection (important variables are discovered for the problem at hand);
- The (optimised) parameters can be interpreted in the context of the problem;
- Association fuzzy rules can be extracted from the trained eSNN structure, since the eSNN learns through clustering.

3. Probabilistic evolving SNN (peSNN)
Probabilistic spiking neuron model, pSNM (Kasabov, Neural Networks, 2010). The information in the pSNM is represented as both connection weights and probabilistic parameters for spikes to occur and propagate. The neuron n_i receives input spikes from pre-synaptic neurons n_j (j = 1, 2, …, m). The state of neuron n_i is described by the sum of the inputs received from all m synapses – the postsynaptic potential PSP_i(t). When PSP_i(t) reaches a firing threshold ϑ_i(t), neuron n_i fires, i.e. emits a spike.
The PSP_i(t) is calculated using a new formula:
PSP_i(t) = Σ_{p=t0,…,t} Σ_{j=1,…,m} e_j · g(p_{cj,i}(t-p)) · f(p_{sj,i}(t-p)) · w_{j,i}(t) + η(t-t0)
where: e_j is 1 if a spike has been emitted from neuron n_j, and 0 otherwise; g(p_{cj,i}(t)) is 1 with probability p_{cj,i}(t), and 0 otherwise; f(p_{sj,i}(t)) is 1 with probability p_{sj,i}(t), and 0 otherwise; t0 is the time of the last spike emitted by n_i; η(t-t0) is an additional term representing decay in the PSP.
As a special case, when all or some of the probability parameters are fixed to 1, the pSNM simplifies and resembles some already known spiking neuron models, such as the SRM.
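One evaluation of the probabilistic PSP formula above can be sketched as follows. The `psnm_psp` helper is my own; for simplicity the weights are held constant over time and the decay term defaults to zero, both of which are simplifying assumptions.

```python
import random

def psnm_psp(spike_history, p_cj, p_sj, w, t, t0,
             eta=lambda d: 0.0, rng=random):
    """One stochastic evaluation of the pSNM postsynaptic potential.

    spike_history[p][j] is the term e_j: 1 if pre-synaptic neuron n_j
    emitted a spike at time p, else 0. p_cj[j] is the probability the
    spike propagates through the connection (the g gate); p_sj[j] is
    the probability the synapse contributes it (the f gate); w[j] are
    connection weights (held constant here). eta models PSP decay
    since the last output spike at time t0.
    """
    psp = 0.0
    for p in range(t0, t + 1):
        for j in range(len(w)):
            if not spike_history[p][j]:
                continue
            g = 1 if rng.random() < p_cj[j] else 0   # connection gate
            f = 1 if rng.random() < p_sj[j] else 0   # synapse gate
            psp += g * f * w[j]
    return psp + eta(t - t0)

# With all probabilities fixed to 1 the gates always pass and the
# model reduces to a deterministic weighted sum, as noted above.
history = [[1, 0], [1, 1]]
psp = psnm_psp(history, p_cj=[1.0, 1.0], p_sj=[1.0, 1.0],
               w=[0.5, 0.25], t=1, t0=0)
```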

peSNN for Spatiotemporal and Spectrotemporal Data Modelling
S. Schliebs, N. Nuntalid and N. Kasabov, Towards spatio-temporal pattern recognition using evolving spiking neural networks, Proc. ICONIP 2010, LNCS

peSNN for Spatiotemporal and Spectrotemporal Data Modelling - Applications

4. Quantum-inspired optimisation of eSNN
1) The principle of quantum probability feature representation: at any time a feature is both present and not present in a computational model, as defined by the probability density amplitudes. When the model computes, the feature state 'collapses' into either 0 (not used) or 1 (used).
2) Quantum probability representation of the connections of the peSNN.
3) Quantum probability representation of the eSNN parameters.
N. Kasabov, Integrative connectionist learning systems inspired by Nature: current models, future trends and challenges, Natural Computing, Springer, 2009, vol. 8

Quantum-inspired Evolutionary Algorithms (QiEA)
QiEA use a q-bit representation of a chromosome of n "genes" at a time t. Each q-bit is defined as a pair of numbers (α, β) – probability density amplitudes, with α² + β² = 1. An n-element q-bit vector can represent 2^n states probabilistically at any time. The output is obtained after the q-bit vector is collapsed into a single state. The probability density is changed with quantum gates, e.g. a rotation gate.
M. Defoin-Platel, S. Schliebs and N. Kasabov, Quantum-inspired Evolutionary Algorithm: A multi-model EDA, IEEE Trans. Evolutionary Computation, Dec.
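The collapse and rotation operations can be sketched as below; the helper names are mine, and the rotation-angle schedule in a full QiEA (which depends on the best solution found so far) is omitted here.

```python
import math
import random

def collapse(qbits, rng=random):
    """Collapse a q-bit vector into one binary state.

    qbits is a list of (alpha, beta) pairs with alpha**2 + beta**2 = 1;
    bit j is observed as 1 with probability beta_j**2.
    """
    return [1 if rng.random() < beta ** 2 else 0 for _, beta in qbits]

def rotate(qbit, theta):
    """Apply a rotation gate to one q-bit.

    theta > 0 shifts probability density toward state 1, theta < 0
    toward state 0; the amplitudes stay normalised.
    """
    alpha, beta = qbit
    return (alpha * math.cos(theta) - beta * math.sin(theta),
            alpha * math.sin(theta) + beta * math.cos(theta))

# A q-bit initialised at (1/sqrt(2), 1/sqrt(2)) represents 0 and 1
# with equal probability; repeated small rotations drive it toward 1.
q = (1 / math.sqrt(2), 1 / math.sqrt(2))
for _ in range(20):
    q = rotate(q, 0.05)
```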

Integrated feature selection and parameter optimisation for classification tasks using eSNN and QiEA (S. Schliebs, M. Defoin-Platel and N. Kasabov, Proc. ICONIP 2008; Neural Networks, 2010)

eSNN and QiEA for feature selection in ecological modelling (insect establishment prediction)
Evolution of classification accuracy on the climate data set over 3,000 generations; evolution of the features on the climate data set. The best-accuracy model is obtained with 15 features.

5. Neurogenetic evolving SNN (ngeSNN)
Gene information processing principles:
- Nature via nurture
- Complex interactions between the thousands of genes expressed in the brain and proteins (more than 100,000)
- Different time scales
- Stochastic processes
They offer the potential for integrating molecular and neuronal information processing (possibly with the particle level as well).
The challenge: how do we integrate molecular and spiking neuronal processes in an SNN?

Computational Neurogenetic Modelling A ngeSNN is an eSNN that incorporates a gene regulatory network to capture the interaction of several genes related to neuronal activities.

6. Future NN models and applications
- Modelling evolving connectivity in a large-scale CNGM: synaptic ontogenesis
- Integrated probabilistic evolving ngSNN for CNGM (pengSNN)
- Integrated brain-gene ontology with ngSNN
- Methods and algorithms for generic tasks at a large scale: finite automata; associative memories
- Neurogenetic robots
- Medical decision support systems for personalised risk and outcome prediction of brain diseases: AD; clinical depression; stroke and TBI; bipolar disease (data from the Wellcome Trust, UK)
- New hardware and software realisations
- Large-scale applications for cognitive function modelling
- Large-scale engineering applications, e.g. cyber security, environmental disaster prediction, climate change prediction, …

KEDRI: The Knowledge Engineering and Discovery Research Institute
Auckland University of Technology (AUT), New Zealand
- Established June 2002
- Funded by AUT, NERF (FRST) and NZ industry; external funds approx. NZ$4 million
- 4 senior research fellows and post-docs; 25 PhD and Masters students; 25 associated researchers
- Both fundamental and applied research (theory + practice)
- 320 refereed publications; 5 patents
- Multicultural environment (10 ethnic origins)
- Strong national and international collaboration
New PhD students are welcome to apply.