Approximating the Algebraic Solution of Systems of Interval Linear Equations with Use of Neural Networks Nguyen Hoang Viet Michal Kleiber Institute of Fundamental Technological Research. Polish Academy of Sciences.

N.H. Viet & M. Kleiber, IFTR, Polish Academy of Sciences. Outline: Introduction to systems of interval linear equations (SILE); Solving SILE as an optimization task; Neural networks in solving SILE; Results and conclusions.

Introduction to SILE. Interval real number: [a] = [a⁻, a⁺] with a⁻ ≤ a⁺. Operations on interval real numbers (addition, subtraction, multiplication) are defined via the endpoints. Endpoint notation: a⁻ denotes the lower and a⁺ the upper endpoint.
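The standard endpoint rules can be sketched in a few lines of Python; the tuple representation `(lo, hi)` and the function names are illustrative, not from the slides:

```python
# An interval [a] = [a_lo, a_hi] is represented as a pair of floats.

def i_add(a, b):
    """[a] + [b] = [a_lo + b_lo, a_hi + b_hi]"""
    return (a[0] + b[0], a[1] + b[1])

def i_sub(a, b):
    """[a] - [b] = [a_lo - b_hi, a_hi - b_lo]"""
    return (a[0] - b[1], a[1] - b[0])

def i_mul(a, b):
    """[a] * [b]: min and max over the four endpoint products."""
    p = (a[0] * b[0], a[0] * b[1], a[1] * b[0], a[1] * b[1])
    return (min(p), max(p))
```

Note that multiplication must consider all four endpoint products because signs can flip which product is extremal.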

Introduction to SILE. Distance between interval real values: q([a], [b]) = max(|a⁻ − b⁻|, |a⁺ − b⁺|), extended componentwise to interval vectors.
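Assuming the standard Moore-style metric (maximum of endpoint differences), the distance can be sketched as:

```python
def i_dist(a, b):
    """Distance between two intervals: max of absolute endpoint differences."""
    return max(abs(a[0] - b[0]), abs(a[1] - b[1]))

def vec_dist(x, y):
    """Distance between interval vectors: max of componentwise distances."""
    return max(i_dist(a, b) for a, b in zip(x, y))
```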

Introduction to SILE. System of Interval Linear Equations: [A][x] = [b]. SILEs arise in many real-world applications (for example in finite element methods), where uncertainties are described in terms of imprecision. The two most popular types of solution are the united solution and the algebraic solution.

Introduction to SILE. Algebraic solution: an interval vector [x] such that the product [A][x] gives an interval vector equal to the right-hand side [b], where the product of interval real numbers is taken as the min/max over the four endpoint products.
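Under this endpoint definition of interval multiplication, the product [A][x] can be sketched as follows (helper names are illustrative; the algebraic solution is an [x] for which the result equals [b] exactly):

```python
def i_mul(a, b):
    """Interval product: min/max over the four endpoint products."""
    p = (a[0] * b[0], a[0] * b[1], a[1] * b[0], a[1] * b[1])
    return (min(p), max(p))

def i_add(a, b):
    """Interval sum: endpoint-wise addition."""
    return (a[0] + b[0], a[1] + b[1])

def sile_product(A, x):
    """Interval matrix-vector product: row i gives sum_j [A_ij] * [x_j]."""
    out = []
    for row in A:
        acc = (0.0, 0.0)
        for a_ij, x_j in zip(row, x):
            acc = i_add(acc, i_mul(a_ij, x_j))
        out.append(acc)
    return out
```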

Introduction to SILE. Algebraic solution (...): in effect, the interval equality must hold at both endpoints of every component. It is well known that computing the algebraic solution of a SILE is an NP-hard problem.

Solving SILE as an optimization task. The cost function measures the deviation of the product [A][x] from the right-hand side [b]. This cost function is not differentiable, and not even continuous.
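One plausible form of such a cost, consistent with the endpoint distance defined earlier, is the sum of squared endpoint deviations of [A][x] from [b]. This is a sketch, not the slides' exact formula; the helpers are repeated so the snippet is self-contained:

```python
def i_mul(a, b):
    p = (a[0] * b[0], a[0] * b[1], a[1] * b[0], a[1] * b[1])
    return (min(p), max(p))

def i_add(a, b):
    return (a[0] + b[0], a[1] + b[1])

def row_product(row, x):
    """One row of the interval matrix-vector product."""
    acc = (0.0, 0.0)
    for a_ij, x_j in zip(row, x):
        acc = i_add(acc, i_mul(a_ij, x_j))
    return acc

def cost(A, x, b):
    """Sum of squared endpoint deviations of [A][x] from [b]."""
    total = 0.0
    for row, bi in zip(A, b):
        yi = row_product(row, x)
        total += (yi[0] - bi[0]) ** 2 + (yi[1] - bi[1]) ** 2
    return total
```

The min/max inside `i_mul` is what makes this cost non-smooth in the components of [x].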

Solving SILE as an optimization task. The product of IRNs is approximated by a differentiable function, and the cost function is modified accordingly.
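The transcript does not reproduce the approximating function itself; as one possible differentiable surrogate (an assumption, not the slides' construction), the min/max over the four endpoint products can be replaced by log-sum-exp soft versions:

```python
import math

def soft_max(vals, beta=50.0):
    """Smooth, differentiable surrogate for max via log-sum-exp."""
    m = max(vals)  # shift by the true max for numerical stability
    return m + math.log(sum(math.exp(beta * (v - m)) for v in vals)) / beta

def soft_min(vals, beta=50.0):
    return -soft_max([-v for v in vals], beta)

def smooth_i_mul(a, b, beta=50.0):
    """Differentiable approximation of the interval product."""
    p = [a[0] * b[0], a[0] * b[1], a[1] * b[0], a[1] * b[1]]
    return (soft_min(p, beta), soft_max(p, beta))
```

As beta grows, the surrogate approaches the exact interval product while remaining smooth.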

Solving SILE as an optimization task. The modified cost function is now differentiable, so gradient-based minimization can be applied.

Neural network for IRN multiplication. Network architecture: all neurons are sigmoidal. Such a network represents a function that is differentiable with respect to each input variable.
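A minimal NumPy sketch of such a network, assuming 4 inputs (the endpoints of both factors), 5 sigmoidal hidden neurons as in the experiments, and 2 outputs for the product endpoints. The random weights and the sigmoidal output layer (which presumes targets scaled into (0, 1)) are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# 4 inputs (endpoints of both factors) -> 5 sigmoidal hidden -> 2 outputs.
W1, b1 = rng.normal(size=(5, 4)), np.zeros(5)
W2, b2 = rng.normal(size=(2, 5)), np.zeros(2)

def net(a, b):
    """Map the endpoints of [a] and [b] to an approximate product (lo, hi)."""
    x = np.array([a[0], a[1], b[0], b[1]])
    h = sigmoid(W1 @ x + b1)     # sigmoidal hidden layer
    return sigmoid(W2 @ h + b2)  # sigmoidal output layer
```

Because every neuron is sigmoidal, the composed map is smooth in all four inputs, which is what the gradient computation in the next slide relies on.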

Neural network for IRN multiplication. A hybrid network computes the gradient of the outputs with respect to the input signals.
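The gradient with respect to the input signals follows from the chain rule through the sigmoidal hidden layer. A sketch with illustrative random weights (linear output layer assumed for simplicity), verifiable against finite differences:

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Illustrative weights for a 4 -> 5 (sigmoid) -> 2 (linear) network.
W1, b1 = rng.normal(size=(5, 4)), np.zeros(5)
W2, b2 = rng.normal(size=(2, 5)), np.zeros(2)

def net_and_input_grad(x):
    """Forward pass plus the Jacobian dy/dx of the outputs wrt the inputs."""
    h = sigmoid(W1 @ x + b1)
    y = W2 @ h + b2
    # Chain rule: dy/dx = W2 @ diag(h * (1 - h)) @ W1
    J = W2 @ ((h * (1 - h))[:, None] * W1)
    return y, J
```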

Results and Conclusions. A set of 50 networks with 5 hidden neurons was trained using 500 training samples and 200 validation samples (randomly generated). These networks were then tested on a set of 2000 samples, and the best one was chosen for use in SILE solving. Both NN training and cost-function minimization were done with the Scaled Conjugate Gradient algorithm.
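The training setup can be sketched as follows: random interval pairs with the exact product endpoints as targets, fitted by plain batch gradient descent. The optimizer is a simplification substituted for the Scaled Conjugate Gradient algorithm used in the slides; the 500-sample training set and 5 hidden neurons follow the stated setup:

```python
import numpy as np

rng = np.random.default_rng(1)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def make_samples(n):
    """Random intervals in [-1, 1] with exact product endpoints as targets."""
    e = rng.uniform(-1, 1, size=(n, 4))
    a, b = np.sort(e[:, :2], axis=1), np.sort(e[:, 2:], axis=1)
    X = np.hstack([a, b])
    p = np.stack([a[:, 0] * b[:, 0], a[:, 0] * b[:, 1],
                  a[:, 1] * b[:, 0], a[:, 1] * b[:, 1]], axis=1)
    Y = np.stack([p.min(axis=1), p.max(axis=1)], axis=1)
    return X, Y

X, Y = make_samples(500)
W1, b1 = rng.normal(scale=0.5, size=(4, 5)), np.zeros(5)
W2, b2 = rng.normal(scale=0.5, size=(5, 2)), np.zeros(2)

losses, lr = [], 0.1
for _ in range(200):
    H = sigmoid(X @ W1 + b1)      # hidden activations
    P = H @ W2 + b2               # network outputs (linear readout)
    E = P - Y
    losses.append(float(np.mean(E ** 2)))
    # Backpropagation of the mean-squared error.
    gP = 2 * E / len(X)
    gW2, gb2 = H.T @ gP, gP.sum(axis=0)
    gH = gP @ W2.T * H * (1 - H)
    gW1, gb1 = X.T @ gH, gH.sum(axis=0)
    W2 -= lr * gW2; b2 -= lr * gb2
    W1 -= lr * gW1; b1 -= lr * gb1
```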

Results and Conclusions. Left-hand-side interval matrices and right-hand-side interval vectors were generated to test the performance of the proposed approach. Results for various problem sizes:

Results and Conclusions. Example 1 (f = 1.38e-30, 312 iterations).

Results and Conclusions. Example 2 (f = 7.8e-5, 42 iterations).

Results and Conclusions. A new approach to the NP-hard problem of solving SILE algebraically was proposed. The neural network is small and, once trained, needs no updating. Results can be obtained in real time even for large systems. The approach is an alternative to others, for example genetic algorithms. Similar techniques are being developed for solving systems of fuzzy linear equations.

Thank you for your attention