EGEE-III INFSO-RI-222667 Enabling Grids for E-sciencE www.eu-egee.org EGEE and gLite are registered trademarks
Grid-enabled parameter initialization for high performance machine learning tasks

Presentation transcript:

Grid-enabled parameter initialization for high performance machine learning tasks
Kyriakos Chatzidimitriou, Fotis Psomopoulos and Pericles Mitkas
Thessaloniki, Greece

Presentation Overview
Introduction to the scope of work
The algorithm
–NeuroEvolution of Augmented Reservoir (NEAR) = NeuroEvolution of Augmented Topologies + Echo State Networks (NEAT + ESN)
Questions we want to answer
Testbeds
–Supervised learning
–Reinforcement learning
Experimental setup
Results obtained
Conclusions
Future work

Introduction
Approach – Problem – Solution
Reinforcement Learning paradigm → appropriate for agents
Real world/complex tasks → Function Approximator
Echo State Networks → non-linear/non-Markovian tasks
Evolution and learning → adapt the reservoir to the problem at hand
How? → NeuroEvolution (NEAT) and Temporal Difference learning
[Diagram: hierarchy of involved areas – Reinforcement Learning, Function Approximator, Evolutionary Computation, NeuroEvolution of Augmented Topologies, Temporal Difference, NeuroEvolution, Neural Networks, Recurrent NN, Reservoir Computing, Echo State Networks]

Echo State Networks
An Echo State Network (ESN) [Jaeger, 2001 & 2002]
[Diagram: Input (linear features) → Reservoir (non-linear features, memory; static) → Output (linear/non-linear combination; trainable)]
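To make the diagram above concrete, the following is a minimal sketch of what the slide describes: a fixed (static) random reservoir providing non-linear features with memory, and a trainable linear readout fitted by ridge regression. The names and hyper-parameters here (reservoir size, spectral radius, sparsity, regularization, washout length) are illustrative assumptions, not the settings used in NEAR.

```python
import numpy as np

class EchoStateNetwork:
    """Minimal ESN sketch: fixed random reservoir, trainable linear readout."""

    def __init__(self, n_in, n_res, n_out, spectral_radius=0.9, sparsity=0.1, seed=0):
        rng = np.random.default_rng(seed)
        self.W_in = rng.uniform(-0.5, 0.5, (n_res, n_in))        # static input weights
        W = rng.uniform(-0.5, 0.5, (n_res, n_res))
        W *= rng.random((n_res, n_res)) < sparsity                # sparse reservoir
        # Rescale so the largest eigenvalue magnitude equals the spectral radius
        W *= spectral_radius / max(abs(np.linalg.eigvals(W)))
        self.W = W                                                # static reservoir weights
        self.W_out = np.zeros((n_out, n_res + n_in))              # trainable readout
        self.x = np.zeros(n_res)                                  # reservoir state

    def _update(self, u):
        u = np.atleast_1d(np.asarray(u, float))
        self.x = np.tanh(self.W_in @ u + self.W @ self.x)
        return np.concatenate([self.x, u])                        # non-linear + linear features

    def fit(self, U, Y, ridge=1e-6, washout=50):
        """Ridge-regress the readout on collected reservoir states."""
        states = np.array([self._update(u) for u in U])[washout:]
        targets = np.asarray(Y, float)[washout:]
        A = states.T @ states + ridge * np.eye(states.shape[1])
        self.W_out = np.linalg.solve(A, states.T @ targets).T

    def predict(self, u):
        return self.W_out @ self._update(u)
```

Only the output weights are learned; the reservoir stays fixed, which is what the "trainable" vs. "static" labels in the slide's diagram refer to.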

NeuroEvolution of Augmented Topologies
Start minimally & complexify
Weight & structural mutation
Speciation through clustering to protect innovation
Crossover networks through historical markings on connections
[Stanley, PhD, 2004]
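As a rough illustration of the NEAT mechanics listed above, the sketch below shows connection genes with historical markings (innovation numbers), weight and structural mutation (complexification by adding nodes and connections), and crossover that aligns genes by their markings. It is a simplified, hypothetical rendering for this transcript, not Stanley's reference implementation; speciation and fitness sharing are omitted.

```python
import random
from dataclasses import dataclass, field

_innovations = {}       # (src, dst) -> historical marking
_next_innovation = 0

def innovation(src, dst):
    """Return a stable historical marking for the connection src -> dst."""
    global _next_innovation
    if (src, dst) not in _innovations:
        _innovations[(src, dst)] = _next_innovation
        _next_innovation += 1
    return _innovations[(src, dst)]

@dataclass
class ConnectionGene:
    src: int
    dst: int
    weight: float
    enabled: bool = True
    innov: int = 0

@dataclass
class Genome:
    nodes: list = field(default_factory=list)   # node ids
    conns: list = field(default_factory=list)   # ConnectionGene list

    def mutate_weights(self, sigma=0.5):
        for c in self.conns:
            c.weight += random.gauss(0.0, sigma)            # weight mutation

    def mutate_add_connection(self):
        src, dst = random.sample(self.nodes, 2)
        if any(c.src == src and c.dst == dst for c in self.conns):
            return
        self.conns.append(ConnectionGene(src, dst, random.uniform(-1, 1),
                                         innov=innovation(src, dst)))

    def mutate_add_node(self):
        """Complexification: split an existing connection with a new node."""
        if not self.conns:
            return
        c = random.choice(self.conns)
        c.enabled = False
        new = max(self.nodes) + 1
        self.nodes.append(new)
        self.conns.append(ConnectionGene(c.src, new, 1.0, innov=innovation(c.src, new)))
        self.conns.append(ConnectionGene(new, c.dst, c.weight, innov=innovation(new, c.dst)))

def crossover(fitter, other):
    """Align genes by historical marking; inherit matching genes randomly,
    disjoint/excess genes from the fitter parent."""
    by_innov = {c.innov: c for c in other.conns}
    child = Genome(nodes=list(fitter.nodes))
    for c in fitter.conns:
        pick = random.choice([c, by_innov[c.innov]]) if c.innov in by_innov else c
        child.conns.append(ConnectionGene(pick.src, pick.dst, pick.weight,
                                          pick.enabled, pick.innov))
    return child
```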

NeuroEvolution of Augmented Reservoirs (NEAR)
Use NEAT as a meta-search method
Start from minimal reservoirs (1 neuron)
Perform weight and structural mutation
–Add neurons, add connections
Maintain ESN constraints
Apply speciation through clustering
–Similarity metric ~ reservoir's macroscopic features (spectral radius, reservoir neurons & sparseness)
Apply crossover using historical markings on neurons
Identical, and in some cases better, performance against “rival” algorithms
Work under review
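The clustering-based speciation above compares individuals through macroscopic reservoir features rather than gene-by-gene topology. Below is a hedged sketch of such a similarity computation; the exact feature weighting and clustering algorithm used in NEAR are not given in the slides, so the coefficients here are assumptions.

```python
import numpy as np

def reservoir_features(W):
    """Macroscopic descriptors of a reservoir weight matrix W (n x n):
    spectral radius, number of reservoir neurons, and sparseness
    (fraction of zero weights)."""
    spectral_radius = max(abs(np.linalg.eigvals(W)))
    n_neurons = W.shape[0]
    sparseness = float(np.mean(W == 0.0))
    return np.array([spectral_radius, n_neurons, sparseness])

def distance(W_a, W_b, weights=(1.0, 0.05, 1.0)):
    """Weighted distance between two reservoirs' macroscopic feature vectors.
    The weighting coefficients are illustrative assumptions."""
    fa, fb = reservoir_features(W_a), reservoir_features(W_b)
    return float(np.sum(np.asarray(weights) * np.abs(fa - fb)))
```

Individuals whose feature vectors lie close together under such a distance would be grouped into the same species, for example by k-means clustering over the feature vectors.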

Questions
We ask the following questions:
Questions are formulated as sets of parameters
Experimentation to answer them
[Diagram of the parameter questions – Selection: degree of elitism; Reproduction: with crossover vs. mutation only; Fitness: speciation vs. individualism; Crossover: continuous complexification vs. survival of the fittest; Reservoir: sparse vs. dense]

Set up
7 problems, 5 parameters, 64 parameter sets, 30 runs per experiment
A total of … evolutionary procedures
Population of 100 individuals
Evolutionary process of 100 generations
… networks were evaluated
[Workflow diagram: Jobs (Algorithm, Parameters, Problem) → Results → Aggregation → Processing]
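The set-up above maps naturally onto an embarrassingly parallel grid workload: one job per (problem, parameter set, run) triple. The sketch below enumerates such jobs; the parameter names echo the Questions slide, while the value lists, problem identifiers, and seeding scheme are hypothetical placeholders rather than the actual EGEE/gLite job descriptions used in the experiments.

```python
import itertools
import json

# Candidate values per design question (illustrative assumptions).
PARAMETER_VALUES = {
    "elitism":    [True, False],
    "crossover":  [True, False],   # False = mutation-only reproduction
    "speciation": [True, False],   # speciation vs. individualism
    "complexify": [True, False],   # continuous complexification vs. survival of the fittest
    "sparse":     [True, False],   # sparse vs. dense reservoir
}

PROBLEMS = ["mackey_glass", "mso", "lorenz", "2d_mountain_car",
            "3d_mountain_car", "single_pole", "double_pole"]
RUNS_PER_EXPERIMENT = 30

def parameter_sets():
    """Enumerate every combination of candidate parameter values."""
    keys = list(PARAMETER_VALUES)
    for values in itertools.product(*(PARAMETER_VALUES[k] for k in keys)):
        yield dict(zip(keys, values))

def jobs():
    """One job = (problem, parameter set, run index, deterministic seed)."""
    for problem in PROBLEMS:
        for i, params in enumerate(parameter_sets()):
            for run in range(RUNS_PER_EXPERIMENT):
                yield {"problem": problem, "params": params, "run": run,
                       "seed": i * RUNS_PER_EXPERIMENT + run}

if __name__ == "__main__":
    all_jobs = list(jobs())
    print(len(all_jobs), "jobs")            # problems x parameter sets x runs
    print(json.dumps(all_jobs[0], indent=2))
```

Note that with two candidate values per choice this enumeration yields 32 combinations; the talk's 64 parameter sets suggest at least one dimension had more than two candidate values, a detail the slides do not spell out.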

Testbeds – Time Series
–Mackey-Glass
–Multiple Superimposed Oscillator
–Lorenz Attractor
Predict the next value of the sequence
Train on sequence T
Calculate fitness, 1/NRMSE, by feeding output to input on F chunks of sequence T
Validate on sequence V
[Plots: Mackey-Glass, MSO, Lorenz Attractor]
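A sketch of the fitness computation described above (1/NRMSE over free-running prediction on chunks of the training sequence) is given below. The chunking, warm-up length, and one-step predictor interface are assumptions for illustration; the actual experiments may differ in these details.

```python
import numpy as np

def nrmse(predictions, targets):
    """Normalized root mean squared error: RMSE divided by the target std."""
    predictions, targets = np.asarray(predictions, float), np.asarray(targets, float)
    return float(np.sqrt(np.mean((predictions - targets) ** 2)) / np.std(targets))

def free_run_fitness(predict_next, sequence, n_chunks=10, warmup=50):
    """Fitness = 1 / mean NRMSE over free-running prediction on chunks of the
    training sequence: the predictor is driven by the true values for a short
    warm-up, then its own output is fed back as the next input.
    Assumes each chunk is longer than the warm-up window."""
    chunks = np.array_split(np.asarray(sequence, float), n_chunks)
    errors = []
    for chunk in chunks:
        u, preds = chunk[0], []
        for t in range(1, len(chunk)):
            y = predict_next(u)                   # one-step-ahead prediction
            preds.append(y)
            u = chunk[t] if t < warmup else y     # teacher forcing, then feedback
        errors.append(nrmse(preds, chunk[1:]))
    return 1.0 / float(np.mean(errors))
```

The validation NRMSE reported on the result slides is the same error measure, computed on the held-out sequence V.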

Testbeds – Reinforcement Learning
–Single and double pole balancing: balance one pole, or two poles of different lengths and masses, for more than … time steps
–2D and 3D mountain car: escape from a valley by moving the car in two or three dimensions, starting from random states
 Only 10 runs for 3D due to the extremely large execution time
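For the mountain-car testbeds, the performance measure on the following slides is the average number of steps needed to escape the valley over 1000 random starting states. A schematic evaluation loop is sketched below; the environment and controller interfaces are hypothetical placeholders, not the actual testbed code.

```python
def average_steps_to_goal(controller, make_env, n_starts=1000, max_steps=2500):
    """Average number of steps the controller needs to reach the goal state,
    measured over many random starting states.  `controller(obs) -> action`
    and `make_env(seed) -> env` are placeholder interfaces; max_steps is an
    illustrative cut-off for episodes that never escape."""
    total_steps = 0
    for i in range(n_starts):
        env = make_env(seed=i)                 # random start determined by the seed
        obs, done, steps = env.reset(), False, 0
        while not done and steps < max_steps:
            obs, reward, done = env.step(controller(obs))
            steps += 1
        total_steps += steps
    return total_steps / n_starts
```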

Mackey-Glass
Performance measure: validation NRMSE
Inconclusive
[Results table: Rank, Perf., Elitism, Crossover, Speciation, Complexify, Sparse for the best- and worst-ranked parameter sets]

MSO (Multiple Superimposed Oscillator)
Performance measure: validation NRMSE
The most difficult task
Known to be an especially difficult task for ESNs
Even the best results exhibit poor error behavior
The poor performance does not allow us to derive concrete conclusions
[Results table: Rank, Perf., Elitism, Crossover, Speciation, Complexify, Sparse]

Lorenz Attractor
Performance measure: validation NRMSE
[Results table: Rank, Perf., Elitism, Crossover, Speciation, Complexify, Sparse for the best- and worst-ranked parameter sets]

2D Mountain Car
Performance measure: average number of steps to escape the valley, over 1000 random starting states
Inconclusive
Similar results versus the classic CMAC SARSA and the recent NEAT+Q
[Results table: Rank, Perf., Elitism, Crossover, Speciation, Complexify, Sparse for the best- and worst-ranked parameter sets]

3D Mountain Car
Performance measure: average number of steps to escape the valley, over 1000 random starting states
Inconclusive
[Results table: Rank, Perf., Elitism, Crossover, Speciation, Complexify, Sparse for the best- and worst-ranked parameter sets]

Single Pole Balancing
Performance measure: number of networks evaluated
[Results table: Rank, Perf., Elitism, Crossover, Speciation, Complexify, Sparse for the best- and worst-ranked parameter sets]

Double Pole Balancing
Performance measure: number of networks evaluated
Important parameters: elitism, crossover
[Results table: Rank, Perf., Elitism, Crossover, Speciation, Complexify, Sparse for the best- and worst-ranked parameter sets]

Summary
The testbeds are very different
The no-free-lunch theorem holds for the parameters
Results inconclusive, apart from speciation
–This is actually good
–There are multiple ways of finding a good solution in many environments, without worrying much about specific parameter settings
–The one case where the algorithm does not work well (MSO) is mainly due to restrictions of the model itself

Execution period on the Grid
Execution time for 1 run on the Grid:
–MG: ~… sec
–MSO: ~… sec
–Lorenz: ~… sec
–2DMC: ~… sec
–3DMC: ~… sec
–SPB: ~… sec
–DPB: ~… sec
Total time ~… ~ 316 days of sequential execution time
Experimentation period on the Grid ~ 60 days
–allowing for testing, errors, outages, inactivity periods, etc.

Future Work
Add more testbeds to the grid search
–Non-Markovian cases that make the pole balancing and mountain car problems more difficult (implemented)
–Server job scheduling (implemented)
More research on speciation and the clustering similarity metric
Increase the number of generations when searching for a suitable network for the MSO testbed

Thank you for your attention!
Fotis Psomopoulos
Intelligent Systems and Software Engineering Labgroup, Electrical and Computer Eng. Dept., Aristotle University of Thessaloniki
Informatics and Telematics Institute, Centre for Research and Technology-Hellas
Thessaloniki, Greece