Surrogate Model Based Differential Evolution for Multiobjective Optimization (GP-DEMO)
Miha Mlakar
20th workshop "Algoritmi po vzorih iz narave" (Nature-Inspired Algorithms), Šmarna gora, 20 September 2012

Overview
– Motivation
– Surrogate models
  – Evolution control
  – Gaussian processes
– Outline of GP-DEMO
– Comparison of solutions under uncertainty
– Selection procedure under uncertainty
– Testing and results
– Discussion and future work

Motivation
– MOEAs are effective but require numerous fitness function evaluations
– Solution evaluations can be:
  – very complex and computationally time-consuming
  – expensive
  – dangerous
– Our goal is to build an algorithm that:
  – returns results comparable to other MOEAs
  – requires fewer exact evaluations than other MOEAs

Surrogate models
– Surrogate models (also called meta-models) approximate the original fitness function
– Surrogate models can be neural networks, Gaussian processes, SVMs, ...
– Optimization with surrogate models uses:
  – solution evaluation (with the original function)
  – solution approximation (with the surrogate model)
– Evolution control balances the use of the surrogate model

Evolution control
Three main approaches:
– No evolution control
– Fixed evolution control
  – Individual-based evolution control
  – Generation-based evolution control
– Adaptive evolution control

Individual-based evolution control
In each generation, only some individuals are evaluated exactly; the rest are approximated

Generation-based evolution control
Some generations are evaluated exactly, others are approximated

Adaptive evolution control
– Solve the optimization problem with the model and exactly evaluate the best (nondominated) points
– Exactly evaluate only the points that the model predicts to be better
– Evaluate points with low model confidence to improve the model prediction

Gaussian process modeling
– The Gaussian process model is built from previously evaluated solutions
– The result of a solution approximation is a normal distribution:
  – solution fitness value = predictive mean
  – solution confidence interval (95%) = twice the predictive standard deviation
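The prediction described above (mean as fitness, twice the standard deviation as the 95% confidence half-width) can be sketched with a minimal NumPy-only GP regression. This is an illustrative sketch, not the implementation used in GP-DEMO; the squared-exponential kernel and its `length_scale` are assumptions:

```python
import numpy as np

def gp_predict(X_train, y_train, X_test, length_scale=1.0, noise=1e-6):
    """Gaussian process regression with a squared-exponential kernel.

    Returns the predictive mean and the 95% confidence half-width
    (two standard deviations), matching the slide's definitions:
    fitness = mean, confidence interval = 2 * std.
    """
    def kernel(A, B):
        d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
        return np.exp(-0.5 * d2 / length_scale ** 2)

    K = kernel(X_train, X_train) + noise * np.eye(len(X_train))
    Ks = kernel(X_test, X_train)
    Kss = kernel(X_test, X_test)
    K_inv = np.linalg.inv(K)
    mean = Ks @ K_inv @ y_train                 # predictive mean
    var = np.diag(Kss - Ks @ K_inv @ Ks.T)      # predictive variance
    std = np.sqrt(np.maximum(var, 0.0))
    return mean, 2.0 * std                      # fitness, 95% half-width

# Example: model f(x) = sin(x) from five exact evaluations.
X = np.linspace(0, 3, 5).reshape(-1, 1)
y = np.sin(X).ravel()
mu, ci = gp_predict(X, y, np.array([[1.5]]))
```

Near the training data the confidence interval is tight; far from it the interval grows, which is exactly the signal the adaptive evolution control uses to decide when to evaluate exactly.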

Gaussian process modeling (2) [figure]

GP-DEMO
– Algorithm for surrogate-model-based optimization with Gaussian processes
– Based on DEMO (Differential Evolution for Multiobjective Optimization)
– Adaptive evolution control in two parts of the algorithm:
  a) comparison of parent and candidate solutions
  b) determining the set of nondominated solutions and environmental selection

GP-DEMO pseudocode
1. Randomly create and evaluate the initial population
2. Create the GP model
3. Until the stopping criterion is met, repeat:
   – For every solution in the generation:
     – Create a candidate and approximate it with the GP model
     – If the confidence interval is too large, exactly evaluate the candidate
     – Compare the candidate with its parent under uncertainty
   – Perform the selection procedure under uncertainty
   – Update the model with the newly exactly evaluated solutions
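The pseudocode above can be sketched as a minimal Python loop. This is a simplified illustration, not the actual GP-DEMO implementation: `approximate` stands in for the GP model and must return objective values with confidence half-widths, the model "update" is represented by an archive of exactly evaluated solutions, and selection is reduced to exact parent-vs-candidate dominance replacement:

```python
import random

def gp_demo(evaluate, approximate, dim, bounds, pop_size=20, generations=30,
            ci_threshold=0.5, F=0.5, CR=0.9, seed=1):
    """Skeleton of the GP-DEMO loop (minimization).  `evaluate` is the
    exact (expensive) function returning a tuple of objectives;
    `approximate` plays the role of the GP model and returns
    (objectives, half_widths).  A real implementation would refit the
    GP model from `archive` and compare under uncertainty."""
    rng = random.Random(seed)
    lo, hi = bounds
    pop = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(pop_size)]
    objs = [evaluate(x) for x in pop]           # step 1: exact evaluations
    archive = list(zip(pop, objs))              # step 2: data for the model

    def dominates(f, g):                        # exact Pareto dominance
        return all(a <= b for a, b in zip(f, g)) and any(a < b for a, b in zip(f, g))

    for _ in range(generations):                # step 3
        for i in range(pop_size):
            r1, r2, r3 = rng.sample([j for j in range(pop_size) if j != i], 3)
            # DE/rand/1 mutation with binomial crossover (simplified)
            cand = [min(hi, max(lo, pop[r1][d] + F * (pop[r2][d] - pop[r3][d])))
                    if rng.random() < CR else pop[i][d] for d in range(dim)]
            f_cand, hw = approximate(cand)      # GP approximation
            if max(hw) > ci_threshold:          # interval too wide: evaluate
                f_cand = evaluate(cand)
                archive.append((cand, f_cand))  # data for the model update
            if dominates(f_cand, objs[i]):      # candidate replaces parent
                pop[i], objs[i] = cand, f_cand
    return pop, objs, archive

# Toy bi-objective problem; the "model" here pretends perfect confidence.
def evaluate(x):
    return (sum(v * v for v in x), sum((v - 2.0) ** 2 for v in x))

def approximate(x):                             # stand-in for the GP model
    return evaluate(x), (0.0, 0.0)

pop, objs, archive = gp_demo(evaluate, approximate, dim=2, bounds=(-5, 5))
```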

Comparison of parent and candidate solutions under uncertainty
Three types of comparison:
– evaluated parent compared to approximated candidate
– parent and candidate both approximated
– parent and candidate both evaluated

Comparison of parent and candidate solutions under uncertainty (2)
– For every objective a separate model is needed
– An approximated solution has, for every objective, a mean (fitness) value and a confidence interval
– Solutions are compared separately for every objective
– If the solutions' intervals (mean ± confidence interval) overlap, that objective is incomparable
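The interval-overlap rule above can be sketched as a small comparator. This is a minimal sketch of the idea, not the exact GP-DEMO routine: each solution is a list of `(mean, half_width)` pairs, one per objective, and minimization is assumed (an exactly evaluated solution simply has half-width 0):

```python
def compare_under_uncertainty(sol_a, sol_b):
    """Pareto-compare two solutions whose objectives are given as
    (mean, half_width) pairs, one pair per objective (minimization).

    Objectives whose intervals [mean - hw, mean + hw] overlap are
    treated as incomparable, as on the slide.  Returns:
      -1 if a dominates b, 1 if b dominates a, 0 otherwise.
    """
    a_better = b_better = False
    for (ma, ha), (mb, hb) in zip(sol_a, sol_b):
        if ma + ha < mb - hb:        # a's interval entirely below b's
            a_better = True
        elif mb + hb < ma - ha:      # b's interval entirely below a's
            b_better = True
        # overlapping intervals: this objective decides nothing
    if a_better and not b_better:
        return -1
    if b_better and not a_better:
        return 1
    return 0

# a is clearly better in both objectives; exact values have half-width 0.
a = [(1.0, 0.0), (2.0, 0.1)]
b = [(3.0, 0.2), (4.0, 0.0)]
```

The wider the confidence intervals, the more often the result is "incomparable", which is what triggers exact reevaluation in the selection procedure.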

Comparison of parent and candidate solutions under uncertainty (3)-(7) [figures: worked examples of the comparison cases]

NSGA-II selection procedure
NSGA-II environmental selection:
– nondominated sorting
– crowding distance
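The two NSGA-II components named above can be sketched in Python (a minimal, unoptimized version; `objs` is a list of objective tuples and minimization is assumed):

```python
def nondominated_sort(objs):
    """Fast nondominated sorting: partition solution indices into
    fronts F1, F2, ... as in NSGA-II environmental selection."""
    n = len(objs)
    dominates = lambda f, g: all(a <= b for a, b in zip(f, g)) and f != g
    dom_count = [0] * n                 # how many solutions dominate i
    dominated = [[] for _ in range(n)]  # solutions that i dominates
    fronts = [[]]
    for i in range(n):
        for j in range(n):
            if i != j:
                if dominates(objs[i], objs[j]):
                    dominated[i].append(j)
                elif dominates(objs[j], objs[i]):
                    dom_count[i] += 1
        if dom_count[i] == 0:
            fronts[0].append(i)
    k = 0
    while fronts[k]:
        nxt = []
        for i in fronts[k]:
            for j in dominated[i]:
                dom_count[j] -= 1
                if dom_count[j] == 0:
                    nxt.append(j)
        fronts.append(nxt)
        k += 1
    return fronts[:-1]

def crowding_distance(objs, front):
    """Crowding distance of each index in `front` (larger = less crowded);
    boundary solutions get infinite distance."""
    dist = {i: 0.0 for i in front}
    for k in range(len(objs[0])):
        order = sorted(front, key=lambda i: objs[i][k])
        dist[order[0]] = dist[order[-1]] = float('inf')
        span = objs[order[-1]][k] - objs[order[0]][k] or 1.0
        for a, b, c in zip(order, order[1:], order[2:]):
            dist[b] += (objs[c][k] - objs[a][k]) / span
    return dist

objs = [(1, 5), (2, 2), (5, 1), (3, 3), (4, 4)]
fronts = nondominated_sort(objs)
dist = crowding_distance(objs, fronts[0])
```

Environmental selection then fills the next population front by front, breaking ties within the last admitted front by descending crowding distance.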

NSGA-II selection procedure under uncertainty
– The rank of solutions is determined with comparison under uncertainty
– If, because of the uncertainty, the dominance status cannot be determined, the solution is marked as a candidate for reevaluation
– After comparison with all other solutions:
  – if a solution is nondominated and marked, it is exactly reevaluated
– In the end, the solutions on the front are nondominated

NSGA-II selection procedure under uncertainty: example of when to reevaluate a solution [figure]

Testing environment
– 9 WFG benchmark problems
– Continuous steel casting problem
– Stopping criterion: maximum number of evaluations
– Maximum number of evaluations:
– Population size: 100
– Results of our algorithm compared with DEMO

Average time for solution approximation [figure]

WFG1
Algorithm | Avg. number of exact evaluations | Hypervolume | Avg. opti. time | Avg. eval. time
DEMO | | | :00:15 |
GP-DEMO | | | :37:24 |

WFG2
Algorithm | Avg. number of function evaluations | Hypervolume | Avg. opti. time | Avg. eval. time
DEMO | | | |
GP-DEMO | | | :21:52 |

WFG3
Algorithm | Avg. number of function evaluations | Hypervolume | Avg. opti. time | Avg. eval. time
DEMO | | | |
GP-DEMO | | | :54:15 |

WFG4
Algorithm | Avg. number of function evaluations | Hypervolume | Avg. opti. time | Avg. eval. time
DEMO | | | |
GP-DEMO | | | :55:33 |

WFG5
Algorithm | Avg. number of function evaluations | Hypervolume | Avg. opti. time | Avg. eval. time
DEMO | | | |
GP-DEMO | | | :50:12 |

WFG6
Algorithm | Avg. number of function evaluations | Hypervolume | Avg. opti. time | Avg. eval. time
DEMO | | | |
GP-DEMO | | | :54:05 |

WFG7
Algorithm | Avg. number of function evaluations | Hypervolume | Avg. opti. time | Avg. eval. time
DEMO | | | |
GP-DEMO | | | :58:11 |

WFG8
Algorithm | Avg. number of function evaluations | Hypervolume | Avg. opti. time | Avg. eval. time
DEMO | | | |
GP-DEMO | | | :00:48 |

WFG9
Algorithm | Avg. number of function evaluations | Hypervolume | Avg. opti. time | Avg. eval. time
DEMO | | | |
GP-DEMO | | | :15:55 |

Continuous steel casting problem
– Searching for the best quality of cast steel
– Continuous steel casting simulator:
  – 4 input variables
  – 3 objectives
– 3000 solution evaluations

Continuous steel casting problem (results)
Algorithm | Avg. number of function evaluations | Hypervolume | Avg. opti. time | Avg. eval. time
DEMO | 3000 | | |
GP-DEMO | (3000) | | :20:05 | 3 s

Observations
– Results depend on the problem
– If the problem is hard to model, the solutions have wide confidence intervals and almost all solutions must be reevaluated
– If the problem can be modeled efficiently, GP-DEMO is very effective

Future work
– Use of global and local models for approximation
– Tests on additional problems
– Comparison with related methods
– Publication in an SCI journal