Hybrid CA-Systems: Coupling Cellular Automata with Artificial Neural Nets. Christina Stoica, Institute for Computer Science and Business Information Systems (www.cobasc.de).


Hybrid CA-Systems: Coupling Cellular Automata with Artificial Neural Nets Christina Stoica Institute for Computer Science and Business Information Systems University of Duisburg-Essen Germany

Hybrid Systems The nearly universal usability of cellular automata (CA) is well known. Models become even more powerful by coupling CA with artificial neural nets (NN). Such CA-NN models may be called "hybrid systems"; they exhibit certain characteristics of learning, adaptability and flexibility.

First Example: SIMULATION OF TRAFFIC FLOWS. The cells of the CA represent different types of cars; a Kohonen Feature Map regulates the special traffic lights.

Second Example: NEURAL CA-SYSTEM. The cells of the CA consist of Bi-directional Associative Memory nets (BAM) and a Kohonen Feature Map (KFM); the system models individual learning processes in dependency on a certain social milieu.

simulation of traffic flows - CA The cells of the CA represent different types of cars, i.e. cars that differ with respect to velocity and style of driving. These artificial cars move on different lanes of a highway. Because of the different velocities and driving styles, accidents and other problems will occur that lead to backups. In particular, a high density of traffic will increase the probability of accidents.

simulation of traffic flows - CA Obstacle: overtake if possible, else slow down. Speed limit: adapt the speed. The state of the cells is defined as S = {0, 1, 2}: 0 = speed up; 1 = overtake; 2 = adapt the speed. An "enlarged" Moore neighbourhood is used, i.e. two additional cells outside of the Moore neighbourhood are taken into consideration. If the cell is in the right lane, then only the cells in front, on the left, and the two additional cells on the left side are considered.

simulation of traffic flows - CA If the car is in the right lane, then compute the state S_t+1: if R is the set of all cells in the right lane, then for the states S_ir ∈ R: if Σ S_ir > 4, the car speeds up. If L is the set of all cells in the left lane, then for all S_il ∈ L: if Σ S_il > 4, the car can overtake.
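The lane-change rule above can be sketched as follows. This is a minimal, hypothetical implementation: the slide does not spell out the exact semantics of the threshold test, so the interpretation that a neighbourhood sum above 4 signals a free lane, and the function and parameter names, are assumptions.

```python
# Cell states as defined on the previous slide (assumed encoding).
SPEED_UP, OVERTAKE, ADAPT = 0, 1, 2

def next_state(right_lane_cells, left_lane_cells, threshold=4):
    """Next state of a car in the right lane (sketch).

    right_lane_cells / left_lane_cells: values of the cells in the
    enlarged Moore neighbourhood (front, side, plus two extra cells).
    """
    if sum(right_lane_cells) > threshold:   # right lane sufficiently free
        return SPEED_UP
    if sum(left_lane_cells) > threshold:    # left lane free enough to pass
        return OVERTAKE
    return ADAPT                            # otherwise adapt the speed
```

Each car cell would apply this rule synchronously per CA time step, as usual for cellular automata.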

simulation of traffic flows In order to regulate the traffic and to avoid too many accidents, the access roads to the highways are regulated by special traffic lights. These traffic lights stop the access if the density of traffic is too high and/or if there are already accidents with the corresponding backups.

simulation of traffic flows In the CA-model the traffic lights are regulated by a Kohonen feature map, which belongs to the type of unsupervised learning nets. The net is trained on certain critical values of traffic density.

simulation of traffic flows Assumptions: There exists a station of measurement, e.g. 1 km before and after an access road. The numbers of cars, the distances and the speeds are measured. The collected data from the CA are the training data for the KFM.

simulation of traffic flows - KFM

Learning rule: Winner-take-all. The amount the units learn is governed by a neighbourhood kernel h, which is a decreasing function of the distance of a unit from the winning unit on the map lattice. X = (x_1, ..., x_n) is the input vector and W_j = (w_1j, ..., w_nj) the connection strengths of unit j. The weights are changed according to the formula: w_j(t+1) = w_j(t) + η(t) · h_zj(t) · (x − w_j(t)), where j − z is the distance of neuron j to the winning neuron z, and the radius of the kernel determines within which range units are changed.
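A minimal sketch of one winner-take-all update step for a one-dimensional Kohonen map, under the assumptions that the winner is the unit with the smallest Euclidean distance to the input and that the neighbourhood kernel h is Gaussian (the slide does not fix a particular kernel; function names are hypothetical):

```python
import math

def som_step(weights, x, eta, radius):
    """One winner-take-all update of a 1-D Kohonen map (sketch).

    weights: list of weight vectors W_j; x: input vector;
    eta: learning rate; radius: neighbourhood radius of the kernel h.
    Returns the index z of the winning unit.
    """
    # Winner z: the unit whose weight vector is closest to the input.
    dists = [sum((wi - xi) ** 2 for wi, xi in zip(w, x)) for w in weights]
    z = dists.index(min(dists))
    # Gaussian neighbourhood kernel h(j, z), decreasing with |j - z|.
    for j, w in enumerate(weights):
        h = math.exp(-((j - z) ** 2) / (2 * radius ** 2))
        # w_j(t+1) = w_j(t) + eta * h * (x - w_j(t))
        weights[j] = [wi + eta * h * (xi - wi) for wi, xi in zip(w, x)]
    return z
```

Repeating this step over the traffic-density training data, while shrinking eta and radius over time, yields the usual SOM training loop.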

simulation of traffic flows - KFM Learning rate (for this model):

simulation of traffic flows

The practical use of such a system is the possible optimization of the real regulating systems that already exist on the Autobahnen in the German Rhein-Ruhr Region.

Second Example: The Evolution of Neural Networks in a "Social" Context

A computational model as a possibility to analyse some important concepts of cognitive development embedded in a social context. The model consists of: Neural Networks (BAM / SOM) - individual level; Cellular Automaton / Boolean Network - social level.

Theoretical descriptions of cognitive ontogenesis have a long and famous tradition; in recent years research has changed the focus of these descriptions by including the interdependency between social context and cognitive development.

"dependency of social context" cognitive development of a learning system gets information from its environment organizes its own evolution by constructing cognitive representations

The factual development of the system is dependent on: its particular developmental logic, i.e. the cognitive dynamics that governs its evolutionary path; and its environment or context, which determines the development by orienting the system in certain directions and by slowing down or speeding up the whole process.

Referring to cognitive ontogenesis, the fact must be taken into consideration that intelligent actors actively "construct" the concepts and cognitive categories they use for world representation. Even learning processes by which people take over concepts from other people are not simple imitation processes but rather complex constructive ones, whose results depend on the individual learning biography of the learners and the social context in which they take over the new concepts.

Concept Building Analogy Conceptual learning (supervised vs. unsupervised learning) Social learning

Supervised vs. unsupervised learning Supervised learning means that the learner gets an immediate response (valuation) after solving a problem. Unsupervised learning means that the cognitive task has to be fulfilled by applying particular schemas that the learner has learned before. Usually these processes are done without immediate responses or valuations by the environment.

Bi-directional Associative Memory (BAM) Hetero-associative network. The network gets pairs of vectors, e.g.:
X_1 = (x_11, x_12, ..., x_1n)^T, Y_1 = (y_11, y_12, ..., y_1m)^T
X_2 = (x_21, x_22, ..., x_2n)^T, Y_2 = (y_21, y_22, ..., y_2m)^T
X_3 = (x_31, x_32, ..., x_3n)^T, Y_3 = (y_31, y_32, ..., y_3m)^T
The X-vectors contain the features, the Y-vectors the concepts for the features; all components (x, y) are taken from {1, -1}.

BAM Learning rule: the weight matrix is computed by the Hebbian rule W = Σ_k X_k Y_k^T, summed over all stored pairs.
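The BAM weight matrix and the recall step can be sketched as below. This assumes the standard bipolar BAM formulation (components in {1, -1}); the function names are hypothetical, and the sign convention for ties is an assumption.

```python
def bam_weights(pairs):
    """Weight matrix W = sum_k X_k Y_k^T (Hebbian rule, sketch).

    pairs: list of (X, Y) vectors with bipolar components in {1, -1}.
    """
    n, m = len(pairs[0][0]), len(pairs[0][1])
    W = [[0] * m for _ in range(n)]
    for X, Y in pairs:
        for i in range(n):
            for j in range(m):
                W[i][j] += X[i] * Y[j]   # outer-product contribution
    return W

def recall(W, X):
    """Forward pass: Y = sign(X W), with sign(0) taken as +1."""
    n, m = len(W), len(W[0])
    return [1 if sum(X[i] * W[i][j] for i in range(n)) >= 0 else -1
            for j in range(m)]
```

For example, after storing two feature/concept pairs, presenting a stored X-vector recalls its associated Y-vector.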

Creating new concepts Each cognitive system often has the task of creating concepts on its own. This creative operation is not done arbitrarily but mainly by formation of analogy: if a learner has to create new concepts by himself - without supervision - (s)he will rather often (perhaps not always) do so by applying the logic (s)he has learned before.

Creating new concepts -> Analogy If (X, -) is a new vector with no corresponding Y-part, then Y is calculated as Y = XW', where W' is the weight matrix of the stored pair (X', Y') with H(X, X') = min over all stored X'; H(X, X') is the Hamming distance of X and X'.
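The analogy rule can be sketched as follows: find the stored X' closest to the new X in Hamming distance, build that pair's weight matrix W' = X'Y'^T, and recall Y = sign(XW'). Again a minimal sketch with assumed names; it presupposes the bipolar BAM conventions from the previous slide.

```python
def hamming(a, b):
    """Hamming distance H(a, b) between two bipolar vectors."""
    return sum(ai != bi for ai, bi in zip(a, b))

def analogy(stored_pairs, x_new):
    """Y-part for a new X with no stored counterpart (sketch)."""
    # Pick the stored pair (X', Y') minimising H(X, X').
    xp, yp = min(stored_pairs, key=lambda p: hamming(p[0], x_new))
    n, m = len(xp), len(yp)
    # W' = X' Y'^T, then Y = sign(x_new W').
    W = [[xp[i] * yp[j] for j in range(m)] for i in range(n)]
    return [1 if sum(x_new[i] * W[i][j] for i in range(n)) >= 0 else -1
            for j in range(m)]
```

A new input near a stored feature vector thus inherits, by analogy, the concept of its nearest stored neighbour.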

Building semantical networks The second network type, used to model the generation of semantic networks, is a "Kohonen Feature Map" (KFM), which is able to learn in an unsupervised way. The KFM is the best known example of unsupervised learning. Its task is the collecting and ordering of singular concepts, that is, the forming of concept clusters. Learning in this network type conforms to the following learning rule:

Kohonen Feature Map (KFM) Self-Organising Map (SOM) Learning rule: Winner-take-all. The amount the units learn is governed by a neighbourhood kernel h, which is a decreasing function of the distance of a unit from the winning unit on the map lattice. X = (x_1, ..., x_n) is the input vector and W_j = (w_1j, ..., w_nj) the connection strengths of unit j. The weights are changed according to the formula: w_j(t+1) = w_j(t) + η(t) · h_zj(t) · (x − w_j(t)), where j − z is the distance of neuron j to the winning neuron z, and the radius of the kernel determines within which range units are changed.

Building semantical networks The resulting ensemble of clusters is a formal representation of a semantic network. The KFM gets the information directly from the different BAM networks. The Y-vectors represent the concepts that shall be clustered according to the X-vectors, which consist of the respective features of perceptions. The KFM clusters only the concepts and not the features, so it is not always evident why the clusters are generated this way (this fact can be observed in human interactions as well).

Learning in a social milieu Each learner A (a cell in the CA) can be represented as the corresponding set of concepts C_A = {c_1, ..., c_n} with c_i = (X_i, Y_i). If B ∈ N(A) (the Moore neighbourhood of A) has a set C_B with |C_B| ≥ |C_D| for all D ∈ N(A), and if c_k ∈ C_B and c_k ∉ C_A, and if X_k is presented to A, then C_A = {c_1, ..., c_n, c_k} in the next time step.
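One step of this social learning rule can be sketched as below. The sketch assumes concepts are represented as hashable, orderable labels, and that when several unknown concepts are available one of them is adopted per step (which one is not specified on the slide; picking the smallest label is an arbitrary assumption). All names are hypothetical.

```python
def social_learning_step(learners, neighbours_of):
    """One synchronous step of the social learning rule (sketch).

    learners: dict actor -> set of concepts C_A.
    neighbours_of: dict actor -> list of neighbouring actors N(A).
    Each actor A looks at the neighbour B with the largest concept set
    and adopts one concept c_k in C_B that is not yet in C_A.
    """
    updates = {}
    for a, c_a in learners.items():
        neigh = neighbours_of.get(a, [])
        if not neigh:
            continue
        b = max(neigh, key=lambda d: len(learners[d]))  # |C_B| maximal
        new = learners[b] - c_a                         # c_k in C_B, not in C_A
        if new:
            updates[a] = c_a | {min(new)}               # adopt one concept
    for a, c in updates.items():                        # synchronous update
        learners[a] = c
    return learners
```

Iterating this step over all cells of the CA lets concepts diffuse through the Moore neighbourhoods of the grid.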

Reproduction Two actors who are placed together on the grid and who have reached a sufficient age can get a child, i.e. a new actor is generated with an age of 0. The relations between the parents and the child become asymmetrical, that is, one-directional from the parents to the child.

Transformation Basically the actors (learners) are placed on the grid according to the topology of a cellular automaton (CA). This means that the relations between the actors are symmetrical: R(a,b) = R(b,a). If two artificial actors become parents, then the CA is transformed into a Boolean net (BN) with asymmetrical or anti-symmetrical relations: R(a,b) ≠ R(b,a).
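The transformation from symmetric CA relations to asymmetric parent-child relations can be sketched with directed edges; a symmetric relation is stored as both (a, b) and (b, a), and reproduction adds only the parent-to-child direction. This is a minimal illustration with hypothetical names, not the authors' implementation.

```python
def add_child(relations, parents, child):
    """Sketch of the CA-to-Boolean-net transformation.

    relations: set of directed edges (a, b); R(a, b) = R(b, a) is
    represented by storing both directions.  For the new child only
    (parent, child) is added, so R(parent, child) != R(child, parent).
    """
    for p in parents:
        relations.add((p, child))      # one-directional: parent -> child
        relations.discard((child, p))  # no edge back from child to parent
    return relations
```

For example, starting from the symmetric pair {("a","b"), ("b","a")} and adding a child "c" of "a" and "b" yields edges toward "c" but none from "c" back to its parents.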

The Program

Conclusions The differences of individual developments are often (although not always) due to the temporal order in which learners become acquainted with new concepts. Therefore it is not enough to analyse the differences between learning milieus in terms of the number of concepts they offer to the learners; it is nearly as important to observe the temporal order of informational processes. In this sense culture, as ordered sets of concepts, must be taken into account when analysing learning processes.

Conclusions A social milieu that forces the learner to learn everything the social environment offers can be counterproductive for the learner: he has to spend all his time taking over knowledge that is already known and cannot unfold his own innovative capability. Therefore a cognitive development that allows the learner to unfold his creativity must rely upon an environment that allows "social forgetting", i.e. ignoring some knowledge that has been achieved by elders.

Thank You