Inductive modelling: Detection of system states validity
Pavel Kordík, Department of Computer Science and Engineering, FEE, CTU Prague


7th International Student Conference on Electrical Engineering, POSTER 2003, May 22, 2003, FEE, Czech Technical University, Prague, Czech Republic

Introduction: Inductive modelling using the Modified GMDH

We would like to introduce the approach we have developed for locating the correct (sufficiently defined) states of a system. Inductive models representing the system are constructed by our method, the Modified GMDH, from a data set. The data set contains records of the input and output variables of the system, i.e. it describes the states of the modelled system. Inductive models can derive the values of the dependent output variables for any configuration of the input variables. The correct states of the modelled system are those that are well defined by the data set; the remaining states are either insufficiently defined or not described by any record in the data set at all. Our technique allows automatic detection of the correct system states (correct input configurations of the inductive model) without further computation on the data set.

Keywords: Group Method of Data Handling (GMDH), perceptron neural network, inductive modelling, visualization techniques.

Visualization techniques and system states validity detection

Conclusion

We have introduced a technique for the automatic detection of system states validity. It allows us to locate interesting areas and to avoid ill-defined areas of the system behaviour. This is especially important when modelling real data sets, which are mostly multidimensional and define their systems only partly.

Acknowledgements: Thanks to the Neural Computing Group members, to Doc. Miroslav Šnorek for his support, and to Philip Prendergast from Hort Research, New Zealand, for the Mandarin data set.

We generated a small data set (40 data vectors) describing an artificial system.
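A minimal Python sketch of how such an artificial data set can be generated: the `target` function below is a hypothetical stand-in (the poster's actual equation is not reproduced here), while the sample size of 40 and the uniform sampling over the unit square follow the description above.

```python
import numpy as np

rng = np.random.default_rng(42)

def target(x1, x2):
    # Hypothetical stand-in for the poster's analytic equation y = f(x1, x2).
    return np.sin(4.0 * x1) + 0.5 * x2 ** 2

# 40 training vectors, inputs distributed uniformly over [0, 1] x [0, 1]
X = rng.uniform(0.0, 1.0, size=(40, 2))
y = target(X[:, 0], X[:, 1])

print(X.shape, y.shape)  # (40, 2) (40,)
```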
The output variable y depends on the input variables x1 and x2 according to a fixed analytic equation. The training data vectors are distributed uniformly over the area x1, x2 ∈ [0, 1].

Building data set

The data set is problem A of the "Great Energy Predictor Shootout" contest ("The Great Energy Predictor Shootout" – The First Building Data Analysis and Prediction Competition; ASHRAE Meeting; Denver, Colorado; June 1993).

A Java application has been created that allows real-time simulation of a group of Modified GMDH models. When we compute the mean square error (MSE) of all models in the group, we can usually find a few models whose error is much bigger than that of the rest of the group. The quality of these models is lower (in the area of the input space defined by the presence of the testing vectors); they were not generated successfully and should be eliminated from the group. The remaining models always have approximately the same mean square error. To check whether the models in the group are equivalent, we designed a method for visualizing their responses.

Visualization of nine models developed on the artificial data in the graph of the input-output relationship.

Mandarin tree water consumption data set

The data set is the property of the Hort Research company, New Zealand. The variable x5 (PAR), like the majority of natural variables, has a normal distribution. This distribution is not well suited to defining the shape of the response curves, because the values concentrate in a small dense cloud. If we exposed the tree to very different conditions, the cloud would grow and the shape of the curves in the graph would become better defined, as can be seen in the graph for the variable x1 (Time), which has a uniform distribution.

The Modified GMDH, which is being developed at our university, proceeds from the GMDH introduced by Ivakhnenko in the late 1960s. It uses a data set to construct a model of a complex system. The model is represented by a network.
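The elimination of badly generated models by their group MSE can be sketched as follows. The models are represented here as hypothetical callables, and the "drop anything above a factor times the group median" rule is an illustrative heuristic, not the poster's exact criterion:

```python
import numpy as np

def eliminate_bad_models(models, X_test, y_test, factor=3.0):
    """Drop models whose test MSE is much larger than the group median.

    `models` are callables mapping an (n, d) input array to (n,) predictions.
    The factor-times-median threshold is an assumed heuristic.
    """
    mses = np.array([np.mean((m(X_test) - y_test) ** 2) for m in models])
    kept = [m for m, e in zip(models, mses) if e <= factor * np.median(mses)]
    return kept, mses

# Toy group: three nearly equivalent models and one badly generated one.
X_test = np.linspace(0.0, 1.0, 20).reshape(-1, 1)
y_test = X_test[:, 0] ** 2
models = [
    lambda X: X[:, 0] ** 2,           # near-perfect model
    lambda X: X[:, 0] ** 2 + 0.01,    # small bias
    lambda X: X[:, 0] ** 2 - 0.01,    # small bias
    lambda X: np.full(len(X), 5.0),   # failed model, huge error
]
kept, mses = eliminate_bad_models(models, X_test, y_test)
print(len(kept))  # 3 -- the failed model is eliminated
```

The surviving models then have approximately the same MSE, matching the behaviour described above.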
Layers of units transfer the input signals to the output of the network. The coefficients of the units' transfer functions are estimated from the data set describing the modelled system. Units in the network can be of several types (e.g. a linear unit); the units whose transfer function suits the nature of the data set survive in the model, so data of various types can be modelled.

The Modified GMDH generates a group of models from a single training data set. Random processes influence the construction procedure: the weights and coefficients of the units are randomly initialized, the transfer functions of many unit types are defined pseudo-randomly when a unit is initialized, and the inputs of the units are selected pseudo-randomly as well. The modelling method searches only a subspace of all possible architectures and unit connections, so the topologies of models developed on the same training data set differ.

Given a group of models developed on the same training set, the question is how to determine the quality of the models in the group and how to detect the validity of a model's response. It is hard to determine the correct system states unless the data set is available, and computing the distribution and density of data vectors in the neighbourhood of a system state is often unsatisfactory. Our approach is instead to monitor the responses of the inductive models constructed on the data set: the valid system states lie in the areas where the models' responses coincide. Where the responses differ, there is not enough information in the data set for the inductive models to be valid within those areas.
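The coincidence criterion above can be sketched as an agreement test over the model group: where the spread of the responses is small, the state is taken as valid. The models, the grid, and the spread threshold `tol` below are all illustrative assumptions:

```python
import numpy as np

def valid_states(models, grid, tol=0.05):
    """Mark input configurations where the group's responses coincide.

    A state is considered valid when the standard deviation of the
    models' responses stays below `tol` (an assumed threshold).
    """
    preds = np.stack([m(grid) for m in models])  # (n_models, n_points)
    spread = preds.std(axis=0)
    return spread < tol

# Models that agree near x = 0 (well covered by data) and diverge elsewhere.
models = [
    lambda X: X[:, 0] + 0.01,
    lambda X: X[:, 0] - 0.01,
    lambda X: X[:, 0] + 0.5 * X[:, 0] ** 3,  # diverges away from the data
]
grid = np.linspace(0.0, 1.0, 11).reshape(-1, 1)
mask = valid_states(models, grid)
print(mask)  # True near x = 0, False where the responses diverge
```

No further computation on the original data set is needed: only the trained models are queried, which mirrors the claim in the introduction.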