OPTIMIZATION OF MODELS: LOOKING FOR THE BEST STRATEGY


OPTIMIZATION OF MODELS: LOOKING FOR THE BEST STRATEGY Pavel Kordík, Oleg Kovářík, Miroslav Šnorek Department of Computer Science and Engineering, Faculty of Eletrical Engineering, Czech Technical University in Prague, Czech Republic kordikp@fel.cvut.cz (Pavel Kordík)

Motivation Continuous optimization. Several methods are available. Which is the best? Is there any strategy for choosing the best method for a given task?

Our task: FAKE GAME research project

The GAME engine for automated data mining How does it work inside?

The GAME engine: building a model Group of Adaptive Models Evolution (GAME) builds an inductive model from heterogeneous units. A niching genetic algorithm (explained later) is employed in each layer to optimize the topology of GAME networks.
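The layer-by-layer induction described above can be sketched in code. This is a hypothetical illustration, not the real FAKE GAME API: the unit type, class names, and the truncation-style selection are all simplifications (the real engine evolves heterogeneous units with a niching genetic algorithm; here, reserving one candidate per input is a crude stand-in for niching).

```python
import random

class LinearUnit:
    """A trivial candidate unit: y = a*x + b on a single input,
    fitted by least squares (illustrative stand-in for GAME's
    heterogeneous unit types)."""
    def __init__(self, input_idx):
        self.input_idx = input_idx
        self.a, self.b = 0.0, 0.0

    def fit(self, features, targets):
        xs = [f[self.input_idx] for f in features]
        n = len(xs)
        mx, my = sum(xs) / n, sum(targets) / n
        var = sum((x - mx) ** 2 for x in xs) or 1e-12  # guard constant input
        self.a = sum((x - mx) * (y - my) for x, y in zip(xs, targets)) / var
        self.b = my - self.a * mx

    def error(self, features, targets):
        return sum((self.a * f[self.input_idx] + self.b - y) ** 2
                   for f, y in zip(features, targets))

def build_layer(features, targets, population=8, survivors=2):
    """Generate candidate units, train each, keep the fittest.
    Each input gets at least one candidate (diversity preservation,
    loosely mimicking niching); the rest are drawn at random."""
    n_inputs = len(features[0])
    idxs = list(range(n_inputs))
    idxs += [random.randrange(n_inputs) for _ in range(population - n_inputs)]
    pool = [LinearUnit(i) for i in idxs]
    for u in pool:
        u.fit(features, targets)
    pool.sort(key=lambda u: u.error(features, targets))
    return pool[:survivors]
```

In the full engine, the surviving units' outputs become inputs for the next layer, and construction stops when a new layer no longer improves validation error.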

Heterogeneous units in GAME

Optimization of coefficients (learning) Transfer function of the Gaussian unit (GaussianNeuron): y' = (1 + a_{n+2}) * exp( -sum_{i=1..n} (x_i - a_i)^2 ) + a_{n+1}. We are looking for optimal values of the coefficients a_0, a_1, ..., a_{n+2}. We have inputs x_1, x_2, ..., x_n and the target output y in the training data set. The difference between the unit output y' and the target value y should be minimal for all vectors from the training data set.
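The objective described here can be written down directly. The sketch below assumes the Gaussian transfer function y' = (1 + a[n+1]) * exp(-sum_i (x_i - a_i)^2) + a[n] (a reconstruction of the slide's formula; the 0-based indexing of the coefficient vector is this sketch's convention, not GAME's):

```python
import math

def gaussian_unit(x, a):
    """Output of a Gaussian unit: a[:n] are centres, a[n] an offset,
    a[n+1] a scale term (assumed form, reconstructed from the slide)."""
    n = len(x)
    s = sum((xi - ai) ** 2 for xi, ai in zip(x, a[:n]))
    return (1.0 + a[n + 1]) * math.exp(-s) + a[n]

def training_error(data, a):
    """Sum of squared differences between unit output y' and target y
    over all training vectors -- the quantity the optimizer minimizes."""
    return sum((gaussian_unit(x, a) - y) ** 2 for x, y in data)
```

Training a unit then means finding the coefficient vector `a` that minimizes `training_error` over the training set.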

What is an analytic gradient and how do we derive it? Error of the unit on the training data (energy surface); gradient of the error; unit with Gaussian transfer function; partial derivative of the error in the direction of coefficient a_i.

Partial derivatives of the Gauss unit
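Assuming the reconstructed transfer function y' = (1 + a_{n+2}) exp(-Σ (x_i - a_i)²) + a_{n+1} from the previous slide, the partial derivatives of the squared error over m training vectors follow from the chain rule:

```latex
% Squared error of the unit over the training set:
E = \sum_{v=1}^{m} \left( y'(x^v) - y^v \right)^2

% Partial derivatives (assumed transfer function, derived by chain rule):
\frac{\partial E}{\partial a_i} =
  \sum_{v=1}^{m} 2\,\bigl(y'(x^v) - y^v\bigr)\,
  (1 + a_{n+2})\, e^{-\sum_{j=1}^{n}(x_j^v - a_j)^2}\,
  2\,(x_i^v - a_i), \qquad i = 1,\dots,n

\frac{\partial E}{\partial a_{n+1}} =
  \sum_{v=1}^{m} 2\,\bigl(y'(x^v) - y^v\bigr)

\frac{\partial E}{\partial a_{n+2}} =
  \sum_{v=1}^{m} 2\,\bigl(y'(x^v) - y^v\bigr)\,
  e^{-\sum_{j=1}^{n}(x_j^v - a_j)^2}
```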

Optimization of their coefficients a_1, a_2, ..., a_n: a) The unit does not provide an analytic gradient, just its error. The optimization method repeatedly proposes new coefficient values starting from the initial ones; the unit computes its error on the training data, and the gradient must be estimated from these error evaluations until the final values are reached. b) The unit provides an analytic gradient together with its error. The optimization method iterates in the same way, but at each step the unit returns both the training error and its gradient.
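The two modes above can be contrasted on a toy Gaussian unit (the function form and all names below are this sketch's assumptions, not the GAME source): mode (a) estimates the gradient by finite differences, costing two extra error evaluations per coefficient, while mode (b) evaluates the hand-derived partial derivatives directly.

```python
import math

def unit(x, a):
    """y' = (1 + a[-1]) * exp(-sum (x_i - a_i)^2) + a[-2]  (assumed form)."""
    s = sum((xi - ai) ** 2 for xi, ai in zip(x, a))
    return (1.0 + a[-1]) * math.exp(-s) + a[-2]

def error(data, a):
    return sum((unit(x, a) - y) ** 2 for x, y in data)

def estimated_gradient(data, a, h=1e-6):
    """Mode (a): central differences; 2 error evaluations per coefficient."""
    g = []
    for i in range(len(a)):
        ap, am = a[:], a[:]
        ap[i] += h
        am[i] -= h
        g.append((error(data, ap) - error(data, am)) / (2 * h))
    return g

def analytic_gradient(data, a):
    """Mode (b): partial derivatives of the squared error, derived by hand."""
    n = len(a) - 2
    g = [0.0] * len(a)
    for x, y in data:
        s = sum((xi - ai) ** 2 for xi, ai in zip(x, a[:n]))
        e = math.exp(-s)
        r = 2.0 * (unit(x, a) - y)  # d(error)/d(y')
        for i in range(n):
            g[i] += r * (1.0 + a[-1]) * e * 2.0 * (x[i] - a[i])
        g[-2] += r          # offset coefficient
        g[-1] += r * e      # scale coefficient
    return g
```

Both return (numerically) the same vector; the analytic version is what lets a quasi-Newton optimizer skip the expensive per-coefficient error evaluations.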

Very efficient gradient-based training for hybrid networks developed! Quasi-Newton method: estimated gradient vs. analytic gradient supplied by the unit.

Optimization methods available in GAME

Experimental results of competing optimization methods on the Building data set: RMS error on the testing data, averaged over 5 runs.

RMS error on the Boston data set

Classification accuracy [%] on the Spiral data set

Evaluation on diverse data sets: what does it all mean?

Remember the genetic algorithm optimizing the structure of GAME?

Conclusion It is wise to combine several different optimization strategies for the training of inductive models. Evolution of optimization methods works, but it is not significantly better than random selection of methods. Nature-inspired methods are slow on this problem (they do not exploit the analytic gradient). Future work: utilize the gradient in nature-inspired methods.