Rigorous Analyses of Simple Diversity Mechanisms
Tobias Friedrich, Nils Hebbinghaus, Frank Neumann
Max-Planck-Institut für Informatik, Saarbrücken

Diversity
- An important issue when designing successful EAs
- Counteracts an overly strong selection pressure
- Assumption: choosing the right diversity mechanism can be crucial for the success of an algorithm
- Aim of this talk: confirm this observed behavior by rigorous runtime analyses

Runtime Analysis
- A lot of progress in recent years
- Results for pseudo-Boolean functions
- Results for well-known combinatorial optimization problems
- Most results are for the (1+1) EA
- Some examine the choice of the "right" population size
- No analyses so far consider the impact of diversity
- Question in this talk: what about diversity in populations?

Simple Diversity Mechanisms
- Diversify the population with respect to search points
- Diversify the population with respect to fitness values
- Show situations where the behavior of these two strategies differs significantly

Search point diversifying (μ+1)-EA
[Figure: initial population, fitness indicated by box size]
1. Select a random individual from the population.
2. Mutate this individual.
3. If the offspring is already contained in the population, go to step 1.
4. Add the new offspring to the population.
5. Delete an individual with lowest fitness from the population.
[Figure: current population, fitness indicated by box size]
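As a concrete illustration of these steps, here is a minimal Python sketch of the search point diversifying (μ+1)-EA. It is not the authors' implementation: the fitness function, population size and iteration budget are parameters, mutation is standard bit mutation with per-bit probability 1/n, and all names are chosen freely.

```python
import random

def mutate(x, n):
    """Standard bit mutation: flip each bit independently with probability 1/n."""
    return tuple(b ^ 1 if random.random() < 1.0 / n else b for b in x)

def search_point_diversifying_ea(fitness, n, mu, max_iters):
    """Sketch of the search point diversifying (mu+1)-EA.

    Offspring that duplicate a search point already in the population are
    rejected, so the population stays diverse with respect to search points.
    """
    population = [tuple(random.randint(0, 1) for _ in range(n)) for _ in range(mu)]
    for _ in range(max_iters):
        parent = random.choice(population)
        child = mutate(parent, n)
        if child in population:          # diversity w.r.t. search points
            continue                     # reject the duplicate, pick a new parent
        population.append(child)
        population.remove(min(population, key=fitness))  # delete a worst individual
    return max(population, key=fitness)
```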

Fitness diversifying (μ+1)-EA
[Figure: initial population, fitness indicated by box size]
1. Select a random individual from the population.
2. Mutate this individual.
3. If an individual with the same fitness is already in the population, replace it by the new offspring and go to step 1.
4. Otherwise, add the new offspring to the population.
5. Delete an individual with lowest fitness from the population.
[Figure: current population, fitness indicated by box size]
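Continuing in the same style, a minimal sketch of the fitness diversifying variant (again only an illustration, reusing mutate and the random import from the sketch above): an offspring whose fitness value already occurs in the population replaces that individual instead of being added.

```python
def fitness_diversifying_ea(fitness, n, mu, max_iters):
    """Sketch of the fitness diversifying (mu+1)-EA (mutate as defined above).

    An offspring whose fitness value is already present in the population
    replaces that individual, so duplicate fitness values are avoided.
    """
    population = [tuple(random.randint(0, 1) for _ in range(n)) for _ in range(mu)]
    for _ in range(max_iters):
        parent = random.choice(population)
        child = mutate(parent, n)
        clash = next((i for i, x in enumerate(population)
                      if fitness(x) == fitness(child)), None)
        if clash is not None:            # diversity w.r.t. fitness values
            population[clash] = child    # replace the same-fitness individual, no deletion
            continue
        population.append(child)
        population.remove(min(population, key=fitness))  # delete a worst individual
    return max(population, key=fitness)
```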

Plateaus
- Examine the choice of the diversity mechanism on plateau functions
- Plateaus are regions of the search space in which all search points have the same fitness
- Their size and structure determine how difficult they are for evolutionary search
- Previous investigations for the (1+1) EA on pseudo-Boolean functions, maximum matchings, Eulerian cycles

Investigations
- Search point vs. fitness diversifying (μ+1)-EA
- Constant population size μ
- Search space {0,1}^n, each bit mutated independently with probability 1/n
- Compare the two algorithms on different plateau functions
- Runtime := number of fitness evaluations until an optimal search point is reached
- Show advantages and disadvantages of the two diversity mechanisms

Theorem 1
On the plateau function
f(x) := \begin{cases} |x|_0 & \text{if } x \notin \{1^i 0^{n-i} \mid 0 < i \le n\} \\ n+1 & \text{if } x \in \{1^i 0^{n-i} \mid 0 < i < n\} \\ n+2 & \text{if } x = 1^n \end{cases}
the search point diversifying (μ+1)-EA has expected runtime O(n^3), while the fitness diversifying (μ+1)-EA has exponential runtime with overwhelming probability.
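The plateau function translates directly into code. The following Python version (an illustration, assuming bit strings are represented as tuples of 0s and 1s as in the sketches above) can be plugged into either EA sketch as the fitness argument.

```python
def plateau_fitness(x):
    """Plateau function f from Theorem 1 for a bit string x of length n.

    Points off the path are rewarded for zeros (|x|_0), the path points
    1^i 0^(n-i) with 0 < i < n form a plateau of fitness n+1, and the
    all-ones string is the unique optimum with fitness n+2.
    """
    n = len(x)
    ones = sum(x)
    on_path = ones > 0 and all(b == 1 for b in x[:ones])  # x = 1^i 0^(n-i) with i > 0
    if not on_path:
        return n - ones        # |x|_0, the number of zeros
    if ones < n:
        return n + 1           # plateau
    return n + 2               # optimum 1^n
```

For small n, running e.g. search_point_diversifying_ea(plateau_fitness, n=20, mu=5, max_iters=10**5) next to the fitness diversifying sketch gives a small-scale impression of the contrast stated in Theorem 1; the theorem itself is an asymptotic statement.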

Formal Proof of Theorem 1
Plateau function in theory: f as defined in Theorem 1.
Plateau in the real world:
[Photo: Tonto Plateau, Grand Canyon; © Prof. Ian Parker, Univ. of California]

Proof of Theorem 1
[Figure: the search space drawn as a path from 0^n to 1^n; the points 1^i 0^{n-i} with 0 < i < n form the plateau with fitness n+1, the endpoint 1^n is the optimum with fitness n+2, and all other points ("otherwise") have fitness |x|_0]

Proof of Theorem 1: fitness diversifying (μ+1)-EA
[Figure: population on the path between 0^n and 1^n]

Proof of Theorem 1: fitness diversifying (μ+1)-EA
- With probability about 1/n, mutation places a new individual on the plateau
- Selection then kills the individual already on the plateau, since at most one individual per fitness value is kept
[Figure: mutation onto the plateau followed by selection removing the plateau individual]

Proof of Theorem 1: fitness diversifying (μ+1)-EA
- The individual on the plateau therefore cannot perform a random walk
- ⇒ exponential runtime with overwhelming probability

Proof of Theorem 1: search point diversifying (μ+1)-EA
- Only duplicate search points are forbidden, so an individual on the plateau survives and can perform a random walk along it
- ⇒ expected polynomial runtime
[Figure: Optimum found!]

Theorem 2
On the double-plateau function defined below,
- the search point diversifying (μ+1)-EA has exponential runtime with probability 1/2 - o(1),
- the fitness diversifying (μ+1)-EA has expected runtime O(n^3).

Proof of Theorem 2
The proof uses a double-plateau function, defined on the next slide.

Proof of Theorem 2
Double-plateau function:
f(x) := \begin{cases} n+1 & \text{if } x \in \text{Plateau 1 (without the optimum)} \\ n+2 & \text{if } x \in \text{Plateau 2} \\ n+3 & \text{if } x = \text{Optimum (located on Plateau 1)} \\ |x|_0 & \text{otherwise} \end{cases}
Double-plateau in the real world:
[Photo]

Proof of Theorem 2
Double-plateau function as above.
Double-plateau "close to the real world":
[Photo]

Proof of Theorem 2
[Figure: search space with Plateau 1 (containing the optimum), Plateau 2, and the remaining points with fitness |x|_0]

Proof of Theorem 2: search point diversifying (μ+1)-EA (only avoiding duplicates)
[Figure: by mutation, the population reaches either Plateau 1 or Plateau 2, each case occurring with probability 1/2]

Proof of Theorem 2: search point diversifying (μ+1)-EA
- Reaches the optimum with probability 1/2
[Figure: Optimum found!]

Proof of Theorem 2: search point diversifying (μ+1)-EA
- Otherwise the population collects on Plateau 2, which has higher fitness than Plateau 1 but does not contain the optimum
- ⇒ expected exponential runtime in this case

Proof of Theorem 2: fitness diversifying (μ+1)-EA
- Plateau 1 and Plateau 2 have different fitness values (n+1 vs. n+2), so an individual on Plateau 1 is kept in the population even after Plateau 2 has been reached
- ⇒ expected polynomial runtime
[Figure: Optimum found!]

Larger Populations

Conclusions
- Ensuring diversity is important for successful EAs
- This is the first rigorous runtime analysis on this topic
- Using the "right" diversity strategy can have a great impact on the runtime
- Proven here for some basic plateau functions
- The same effect can be observed in multi-objective optimization (upcoming CEC paper)
- Future work: other diversity measures, classical combinatorial optimization problems
Thanks!