
1 GMDH and Neural Network Application for Modeling Vital Functions of Green Algae under Toxic Impact
Oleksandra Bulgakova, Volodymyr Stepashko, Tetyana Parshikova, Iryna Novikova

2 GMDH and Neural Network Application for Modeling Vital Functions of Green Algae under Toxic Impact
This work presents the results of modeling the influence of the toxicant potassium bichromate on the vital functions of green algae cells using two intelligent computing techniques: the Group Method of Data Handling (GMDH) and a neural network (NN) trained with the backpropagation learning algorithm. The data come from laboratory experiments performed at Kyiv National University. In all experiments the toxicant was potassium bichromate at concentrations from 0.05 to 135 mg/L; in some experiments it was combined with alginic acid. The goal of our work is to improve the forecasting of the influence of potassium bichromate on the vital functions of green algae cells and to compare the results obtained with the different methods.

3 GMDH
GMDH constructs a model by the following six procedures:
(1) Separating the original data into training and control sets. The training data are used to estimate the parameters of the partial descriptions, which capture partial relationships of the nonlinear system; the control data are used to organize the complete description of the relationship between input and output variables.
(2) Generating combinations of input variables in each layer. All combinations of two input variables (xi, xj) are generated in each layer.
(3) Calculating partial descriptions. For each combination, a partial description of the system is fitted by applying regression analysis to the training data. The output variables of the partial descriptions are called intermediate variables.
(4) Selecting intermediate variables. The L intermediate variables that give the L smallest errors on the control data are selected from the generated intermediate variables.
(5) Iterating the calculations. The L selected intermediate variables become the input variables of the next layer, and procedures 2 to 4 are repeated, building up the multilayered architecture.
(6) Stopping the multilayered iteration. When the control error in each layer stops decreasing, the iterative calculation is terminated. Finally, the complete description of the nonlinear system is assembled from the partial descriptions generated in each layer.

Neural network with the backpropagation learning algorithm
Backpropagation is the most widely applied learning algorithm for neural networks. Given a network with a fixed architecture of interconnections, it learns the weights of the multilayer network. Backpropagation employs gradient descent to minimize the squared error between the network's output values and the desired values for those outputs.
The goal of gradient-descent learning is to minimize the sum of squared errors by propagating error signals backward through the network architecture upon the presentation of training samples from the training set. These error signals are used to calculate the weight updates, which represent the knowledge learned by the network.
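The backward propagation of error signals described above can be sketched as follows. This is a minimal illustration, not the network actually used in the experiments: the data, network size, and learning rate are invented for the example.

```python
import numpy as np

# Minimal sketch of a one-hidden-layer network trained by
# backpropagation: gradient descent on the squared error, with
# error signals propagated backward through the layers.
# Data, network size, and learning rate are illustrative only.

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def forward(X, W1, b1, W2, b2):
    h = sigmoid(X @ W1 + b1)            # hidden-layer activations
    return h, sigmoid(h @ W2 + b2)      # network output

rng = np.random.default_rng(0)
X = rng.uniform(0.0, 1.0, (20, 1))      # e.g. toxicant concentration
y = 0.5 + 0.3 * np.sin(3.0 * X)         # synthetic target response

W1 = rng.normal(0.0, 0.5, (1, 5)); b1 = np.zeros(5)
W2 = rng.normal(0.0, 0.5, (5, 1)); b2 = np.zeros(1)
lr = 0.5

_, out0 = forward(X, W1, b1, W2, b2)
mse0 = float(np.mean((out0 - y) ** 2))  # error before training

for epoch in range(2000):
    h, out = forward(X, W1, b1, W2, b2)
    err = out - y                           # output error signal
    d_out = err * out * (1.0 - out)         # through output sigmoid
    d_hid = (d_out @ W2.T) * h * (1.0 - h)  # through hidden layer
    # gradient-descent weight updates
    W2 -= lr * h.T @ d_out / len(X); b2 -= lr * d_out.mean(axis=0)
    W1 -= lr * X.T @ d_hid / len(X); b1 -= lr * d_hid.mean(axis=0)

_, out = forward(X, W1, b1, W2, b2)
mse = float(np.mean((out - y) ** 2))    # error after training
```

The squared error after training should be noticeably below its initial value, which is exactly the quantity gradient descent is driving down.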

4 by GMDH
We ran our experiments using 60% of the data for training, 30% for control, and the remaining 10% for forecasting. Fig. 1 shows the real (y) and predicted (ypred) data.
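The layer-by-layer GMDH procedure from the previous slide can be sketched as follows. The data here are synthetic and the split is simplified to training/control; the quadratic two-variable partial description is a common GMDH choice and is our assumption, not necessarily the exact one used in the experiments.

```python
import itertools
import numpy as np

# Sketch of multilayer GMDH: pairwise quadratic partial descriptions
# are fitted on the training part, ranked by their error on the
# control part, and the best L feed the next layer.

rng = np.random.default_rng(1)
X = rng.uniform(-1.0, 1.0, (100, 4))
y = (1.0 + X[:, 0] * X[:, 1] + 0.5 * X[:, 2] ** 2
     + 0.05 * rng.normal(size=100))

n_train = 60                      # 60 % training, rest for control
Xt, Xc = X[:n_train], X[n_train:]
yt, yc = y[:n_train], y[n_train:]

def design(a, b):
    # quadratic partial description in two variables
    return np.column_stack([np.ones_like(a), a, b, a * b, a**2, b**2])

L = 4                             # intermediate variables kept per layer
best_err = np.inf
for layer in range(3):
    candidates = []
    for i, j in itertools.combinations(range(Xt.shape[1]), 2):
        A = design(Xt[:, i], Xt[:, j])
        coef, *_ = np.linalg.lstsq(A, yt, rcond=None)
        zt = A @ coef                            # training output
        zc = design(Xc[:, i], Xc[:, j]) @ coef   # control output
        err = float(np.mean((zc - yc) ** 2))     # selection criterion
        candidates.append((err, zt, zc))
    candidates.sort(key=lambda c: c[0])
    if candidates[0][0] >= best_err:  # stop when control error rises
        break
    best_err = candidates[0][0]
    top = candidates[:L]
    # selected intermediate variables become next layer's inputs
    Xt = np.column_stack([c[1] for c in top])
    Xc = np.column_stack([c[2] for c in top])
```

Note that the control set, not the training set, drives both the selection of intermediate variables and the stopping rule, which is what protects GMDH against overfitting.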

5 ypred = 178.98 - 3.13x + 0.04x² - 0.0002x³, where x is the concentration
ypred = 175.37 - 3.71x + 0.04x² - 0.00015x³
ypred = 2287.21 - 45.21x + 0.533x² - 0.002x³
Exploring the response of Euglena gracilis cells at different phases of development
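The first of these fitted models can be evaluated directly for any concentration in the studied range; a minimal sketch (the function name `ypred` is ours):

```python
# Evaluating the first GMDH model above: ypred as a cubic polynomial
# of the toxicant concentration x, coefficients taken from the slide.

def ypred(x):
    return 178.98 - 3.13 * x + 0.04 * x**2 - 0.0002 * x**3

# predicted response at several concentrations in the studied
# 0.05-135 mg/L range
curve = [ypred(x) for x in (0.05, 1.0, 50.0, 135.0)]
```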

6

7 by NN
We ran our experiments using 90% of the data for training (training and control) and the remaining 10% for forecasting. The neurons in the network had a sigmoidal discriminant function, and all networks were trained using the standard quadratic error function. To compare GMDH and NN, we used experiment 1-a. Fig. 4 (GMDH, NN) shows the real (y) and predicted (ypred) data modeled by GMDH and by NN.

8 Interpolation model, t = 1, 4, 7; n = 0, 1, 2, 3, 4
For a more exact description of the model, we built the dependence of the output variable that allows reproducing the missing information for intermediate times and concentrations. For this we expressed the output variable as a function of time, where x1 is the concentration and the coefficients are:

y(t)    a0        a1         a2         a3        a4
y(1)    467.45    -83.23     -106.73    166.45    -62.24
y(4)    512.89    -331.61     25.33     337.13    -177.60
y(7)    439, ,186172, ,821781,35

On this basis, we obtained the following model:
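The formula for the model itself appears as an image on the slide and is absent from this transcript. Assuming, from the index range n = 0..4, that it is a degree-4 polynomial with the tabulated coefficients a0..a4, it could be evaluated as follows (shown for the y(1) row; the variable name `u` and the polynomial form are our assumptions):

```python
# Hypothetical evaluation of the degree-4 interpolation polynomial
# with the tabulated coefficients for y(1); the polynomial form is
# assumed, since the original formula is missing from the transcript.

a = [467.45, -83.23, -106.73, 166.45, -62.24]   # a0..a4 for y(1)

def y1(u):
    return sum(c * u**n for n, c in enumerate(a))
```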

9

10 Thank you for your attention