Soft Computing Colloquium 2. Selection of neural networks; hybrid neural networks.
14.11.2005 Objectives:
–Why are there so many models of neural networks (NN)?
–Classes of tasks and classes of NN
–Hybrid neural networks
–A hybrid model based on MLP and ART-2
–Paths to improving neural networks
Questions submitted for discussion. Paths to improving neural networks:
–Development of growing neural networks with feedback and delays
–Development of the theory of spiking neurons and the building of associative memory based on them
–Development of a neural network in which, during learning, logical (verbal) inference emerges from associative memory
Why are there so many models of neural networks (NN)? Models of neural networks simulate separate aspects of the working of the brain (e.g. associative memory), but how the brain works as a whole is unknown to us. Questions: 1) What is consciousness? 2) What is the role of emotions? 3) How are different areas of the brain coordinated? 4) How are associative links transformed and used in logical inference and calculations?
Neuromathematics. Classes of tasks:
–prediction
–classification
–data association
–data conceptualization
–data filtering
Classes of Neural Networks:
Multi-layer networks
–Multi-Layer Perceptron (MLP): supervised learning
–Radial Basis Function networks (RBF networks): supervised learning
–Recurrent neural networks (Elman, Jordan): supervised learning, reinforcement learning
–Counterpropagation network: supervised learning
One-layer networks
–Self-Organizing Map (SOM): unsupervised learning
–Adaptive Resonance Theory (ART): unsupervised learning
–Hamming network: supervised learning
Fully interconnected networks
–Hopfield network: supervised learning
–Boltzmann machine: supervised learning
–Bidirectional associative memory: supervised learning
Spiking networks: supervised, unsupervised and reinforcement learning
Counterpropagation network
Network Selector Table
Hybrid Neural Networks. A hybrid includes:
–a main neural network
–another neural network for preprocessing or postprocessing
Some models of neural networks consist of layers working in different manners, so such neural networks may themselves be viewed as hybrid (as they include more elementary networks). Some authors call hybrid those models which combine the paradigms of neural networks and knowledge engineering.
Hybrid Neural Network based on the Multi-Layer Perceptron and Adaptive Resonance Theory models (A. Gavrilov, 2005)
–Aims to keep the capabilities of ART (plasticity and stability)
–Includes in ART the capability of an MLP to learn complex secondary features from primary features (i.e. to approximate any function)
Disadvantages of the ART-2 model for image recognition:
–It uses a metric on the primary features of images to recognize a class or to create a new one
–Transformations of graphic images (shift, rotation and others) essentially change the distance between input vectors
So it is unsuitable for the control system of a mobile robot.
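The sensitivity to shifts can be illustrated with a tiny numeric sketch (the image and sizes are made up for illustration): a one-pixel shift of a small bright square makes the Euclidean distance between the raw-pixel vectors as large as the distance to a blank image.

```python
import numpy as np

# Hypothetical illustration: a one-pixel shift of a simple binary image
# produces a large Euclidean distance between the raw-pixel vectors,
# which is why ART-2 on primary features is sensitive to shifts.

img = np.zeros((10, 10))
img[4:6, 4:6] = 1.0                # a small bright square

shifted = np.roll(img, 1, axis=1)  # shift the image one pixel to the right

dist = np.linalg.norm(img.ravel() - shifted.ravel())
blank = np.linalg.norm(img.ravel())  # distance from the image to an all-zero image
print(dist, blank)                   # the two distances coincide here
```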
Architecture of the hybrid neural network (figure): input vector (x1 … xn) → input layer of the perceptron (input variables) → hidden layer of the perceptron → output layer of the perceptron, which serves as the input layer of ART-2 (y1 … ym) → output layer of ART-2: clusters (output vector).
Algorithm of learning without a teacher:
1. Set the initial weights of the neurons; Nout := 0.
2. Input an example image and calculate the outputs of the perceptron.
3. If Nout = 0, form a new cluster (output neuron).
4. If Nout > 0, calculate the distances between the weight vectors of ART-2 and the output vector of the perceptron, select the minimum among them (the winner output neuron), and decide whether or not to create a new cluster.
5. If a new cluster is not created, calculate new values of the weights of the winner output neuron and new weights of the perceptron with the error back-propagation algorithm.
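The steps above can be sketched as a toy Python loop. All sizes, names and the concrete update rules here are illustrative assumptions, not the exact formulas of the original model; only the overall control flow (winner selection, radius test, back-propagation toward the winner) follows the slide.

```python
import numpy as np

rng = np.random.default_rng(0)
n_in, n_hidden, n_out = 16, 8, 4   # toy sizes (the experiments use 10000/20/10)
W1 = rng.normal(0, 0.1, (n_hidden, n_in))   # perceptron input -> hidden
W2 = rng.normal(0, 0.1, (n_out, n_hidden))  # perceptron hidden -> output
clusters = []                                # ART-2 weight vectors (one per cluster)
R = 0.5                                      # fixed cluster radius (variant 1)

def sigmoid(x, a=1.0):
    return x / (a + np.abs(x))               # rational sigmoid, as in the slides

def forward(x):
    h = sigmoid(W1 @ x)
    return h, sigmoid(W2 @ h)

def learn(x, lr=0.1):
    global W1, W2
    h, y = forward(x)
    if not clusters:                         # Nout = 0: form the first cluster
        clusters.append(y.copy())
        return 0
    d = [np.linalg.norm(c - y) for c in clusters]
    k = int(np.argmin(d))                    # winner output neuron
    if d[k] > R:                             # too far from every cluster: new one
        clusters.append(y.copy())
        return len(clusters) - 1
    # otherwise move the winner toward y and nudge the perceptron (one EBP step)
    clusters[k] += lr * (y - clusters[k])
    err = clusters[k] - y                    # pull the perceptron output to the winner
    dy = err / (1.0 + np.abs(W2 @ h)) ** 2   # derivative of x/(1+|x|) is 1/(1+|x|)^2
    W2 += lr * np.outer(dy, h)
    W1 += lr * np.outer((W2.T @ dy) / (1.0 + np.abs(W1 @ x)) ** 2, x)
    return k

for _ in range(5):                           # feed a few random "images"
    learn(rng.normal(size=n_in))
```

The radius test `d[k] > R` plays the role of the ART vigilance decision: a close enough winner is adapted, a distant input founds a new cluster.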
Illustration of the algorithm (figure: example points 1–5 and cluster radius R1).
Images and parameters used in the experiments:
–Number of input neurons (pixels): 10000 (100×100)
–Number of neurons in the hidden layer of the perceptron: 20
–Number of output neurons of the perceptron (the input layer of ART-2), Nout: 10
–The cluster radius R was set in different ways: 1) adapted and then fixed; 2) calculated for every image as S/(2·Nout), where S is the average input signal and Nout is the number of output neurons of the perceptron; 3) calculated as 2·Dmin, where Dmin is the minimal distance between the input vector of ART-2 and the weight vectors for the previous image
–Activation function of the perceptron neurons: rational sigmoid with parameter a = 1
–Learning step of the perceptron: 1
–Number of iterations of recalculating the perceptron weights: from 1 to 10
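As a minimal numeric sketch of radius variants 2 and 3 (only the two formulas come from the slides; the values of S and Dmin are made up for illustration):

```python
# Hypothetical values; only the formulas R = S/(2*N_out) and R = 2*D_min
# come from the slides above.
S = 0.4                  # average input signal (assumed)
N_out = 10               # number of output neurons of the perceptron
R2 = S / (2 * N_out)     # variant 2

D_min = 0.15             # minimal distance to the previous image (assumed)
R3 = 2 * D_min           # variant 3

print(R2, R3)            # R2 ≈ 0.02, R3 ≈ 0.3
```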
Series of images 1
Program for experiments
For the sequence of images of series 1, 2, 1, 2 (dark points correspond to the 2nd kind of vigilance calculation, light points to the 1st).
For the sequence of images of series 1 at different numbers of iterations of the EBP (error back-propagation) algorithm: 1, 3, 5, 7, 9.
Paths to improving neural networks:
–Development of growing neural networks with feedback and delays
–Development of the theory of spiking neural networks and the building of associative memory based on them
–Development of a neural network in which, during learning, logical (verbal) inference emerges from associative memory