AUTOMATED REAL-WORLD PROBLEM SOLVING EMPLOYING META-HEURISTICAL BLACK-BOX OPTIMIZATION
November 2013
Daniel R. Tauritz, Ph.D.
Guest Scientist, Los Alamos National Laboratory
Director, Natural Computation Laboratory, Department of Computer Science, Missouri University of Science and Technology

Black-Box Search Algorithms
Many complex real-world problems can be formulated as generate-and-test problems. Black-Box Search Algorithms (BBSAs) iteratively generate trial solutions employing solely the information gained from previous trial solutions, but no explicit problem knowledge.
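To make the generate-and-test loop concrete, here is a minimal sketch (not taken from the slides) of a BBSA in Python: a simple (1+1)-style hill climber that only ever sees candidate solutions and their objective values, never the problem structure. The function names and parameters are illustrative.

```python
import random

def black_box_search(objective, dim, budget, step=0.1):
    """Minimal (1+1)-style hill-climber sketch of a BBSA: trial solutions are
    generated and tested using only previously observed objective values."""
    best = [random.uniform(-5.0, 5.0) for _ in range(dim)]
    best_f = objective(best)                                  # the only feedback available
    for _ in range(budget - 1):
        trial = [x + random.gauss(0.0, step) for x in best]   # generate
        trial_f = objective(trial)                            # test
        if trial_f < best_f:                                  # keep improvements (minimization)
            best, best_f = trial, trial_f
    return best, best_f

# Usage: minimize the sphere function without exposing its structure to the search.
sphere = lambda x: sum(xi * xi for xi in x)
print(black_box_search(sphere, dim=5, budget=2000))
```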

Practitioner’s Dilemma
1. How to decide, for a given real-world problem, whether it is beneficial to formulate it as a black-box search problem?
2. How to formulate the real-world problem as a black-box search problem?
3. How to select/create a BBSA?
4. How to configure the BBSA?
5. How to interpret the result?
6. All of the above are interdependent!

Theory-Practice Gap
While BBSAs, including EAs, are steadily improving in scope and performance, their impact on routine real-world problem solving remains underwhelming. A scalable solution enabling domain-expert practitioners to routinely solve real-world problems with BBSAs is needed.

Two typical real-world problem categories
Solving a single-instance problem: automated BBSA selection
Repeatedly solving instances of a problem class: evolve a custom BBSA

Part I: Solving Single-Instance Problems Employing Automated BBSA Selection

Requirements
1. Need a diverse set of high-performance BBSAs
2. Need an automated approach to select the most appropriate BBSA from the set for a given problem
3. Need an automated approach to configure the selected BBSA

Automated BBSA Selection
1. Given a set of BBSAs, a priori evolve a set of benchmark functions which cluster the BBSAs by performance
2. Given a real-world problem, create a surrogate fitness function
3. Find the benchmark function most similar to the surrogate
4. Execute the corresponding BBSA on the real-world problem
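The slides do not specify how similarity between the surrogate and the benchmark functions is measured. The sketch below assumes a hypothetical metric, comparing fitness statistics of uniform random samples, purely to illustrate the selection flow; `sample_stats` and `select_bbsa` are invented names, not part of the presented method.

```python
import random

def sample_stats(f, dim, n=200):
    """Summarize a function by fitness statistics of uniform random samples
    (a stand-in feature vector; the actual similarity measure is not specified)."""
    values = sorted(f([random.uniform(-5, 5) for _ in range(dim)]) for _ in range(n))
    return (sum(values) / n, values[n // 2], values[-1] - values[0])  # mean, median, range

def select_bbsa(surrogate, benchmarks, dim):
    """benchmarks: list of (benchmark_function, bbsa) pairs established a priori.
    Returns the BBSA paired with the benchmark most similar to the surrogate."""
    target = sample_stats(surrogate, dim)
    def distance(bench):
        return sum((a - b) ** 2 for a, b in zip(target, sample_stats(bench, dim)))
    _, best_bbsa = min(benchmarks, key=lambda pair: distance(pair[0]))
    return best_bbsa
```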

A Priori, Once Per BBSA Set: a Benchmark Generator takes BBSA 1, BBSA 2, …, BBSA n and produces a benchmark problem BP i for each BBSA i.

A Priori, Once Per Problem Class: Real-World Problem → Sampling Mechanism → Surrogate Objective Function → match with the most “similar” BP k → identified most appropriate BBSA k.

Per Problem Instance: Real-World Problem → Sampling Mechanism → Surrogate Objective Function → apply the a priori established most appropriate BBSA k.

Requirements
1. Need a diverse set of high-performance BBSAs
2. Need an automated approach to select the most appropriate BBSA from the set for a given problem
3. Need an automated approach to configure the selected BBSA

Static vs. dynamic parameters
Static parameters remain constant during evolution; dynamic parameters can change. Dynamic parameters require parameter control. The optimal values of the strategy parameters can change during evolution [1].
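As a concrete, textbook example of a dynamic strategy parameter (not the specific mechanism studied in [1]), the sketch below adapts a mutation step size during the run with Rechenberg's 1/5 success rule.

```python
import random

def one_fifth_rule_search(objective, dim, budget, sigma=1.0):
    """Dynamic parameter control sketch: the mutation step size sigma is a strategy
    parameter adapted during the run using Rechenberg's 1/5 success rule."""
    x = [random.uniform(-5, 5) for _ in range(dim)]
    fx = objective(x)
    successes, trials = 0, 0
    for _ in range(budget):
        y = [xi + random.gauss(0, sigma) for xi in x]
        fy = objective(y)
        trials += 1
        if fy < fx:                               # minimization
            x, fx = y, fy
            successes += 1
        if trials == 20:                          # adapt sigma every 20 mutations
            sigma *= 1.5 if successes / trials > 0.2 else 0.82
            successes, trials = 0, 0
    return x, fx, sigma
```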

Solution
Create self-configuring BBSAs employing dynamic strategy parameters

Supportive Coevolution [3]

Self-Adaptation
Dynamical evolution of parameters
Individuals encode parameter values
Indirect fitness
Poor scaling

Coevolution Basics
Def.: In Coevolution, the fitness of an individual is dependent on other individuals (i.e., individuals are part of the environment).
Def.: In Competitive Coevolution, the fitness of an individual is inversely proportional to the fitness of some or all other individuals.

Supportive Coevolution
Primary Population – encodes the problem solution (primary individuals)
Support Population – encodes configuration information (support individuals)
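A minimal sketch of the supportive-coevolution idea as summarized on this slide: a support population of mutation step sizes is evolved alongside the primary population, with each support individual credited by the quality of the primary offspring it helped produce. The credit-assignment and propagation details below are illustrative assumptions, not the actual mechanism of [3].

```python
import random

def supportive_coevolution(objective, dim, generations, pop=20):
    """Sketch: the primary population encodes candidate solutions; the support
    population encodes configuration information (here, mutation step sizes).
    Support individuals are credited by the fitness of offspring they produce."""
    primary = [[random.uniform(-5, 5) for _ in range(dim)] for _ in range(pop)]
    support = [random.uniform(0.01, 1.0) for _ in range(pop)]        # step sizes
    credit = [float("inf")] * pop
    for _ in range(generations):
        for i in range(pop):
            j = random.randrange(pop)                                # pair with a support individual
            child = [g + random.gauss(0, support[j]) for g in primary[i]]
            f_child = objective(child)
            if f_child < objective(primary[i]):                      # primary selection (minimization)
                primary[i] = child
            credit[j] = min(credit[j], f_child)                      # indirect fitness of support individual
        best = min(range(pop), key=lambda k: credit[k])              # evolve the support population
        support = [max(1e-3, support[best] * random.uniform(0.8, 1.25)) for _ in range(pop)]
        credit = [float("inf")] * pop
    return min(primary, key=objective)
```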

Multiple Support Populations
Each parameter evolved separately
Propagation rate separated

Multiple Primary Populations

Benchmark Functions
Rastrigin
Shifted Rastrigin – randomly generated offset vector; individuals are offset before evaluation
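For reference, the standard Rastrigin function and the shifted variant described above (a randomly generated offset vector applied to individuals before evaluation) can be written as follows; the helper names are ours.

```python
import math
import random

def rastrigin(x):
    """Standard Rastrigin benchmark: f(x) = 10*n + sum(x_i^2 - 10*cos(2*pi*x_i))."""
    return 10 * len(x) + sum(xi * xi - 10 * math.cos(2 * math.pi * xi) for xi in x)

def make_shifted_rastrigin(dim, low=-5.12, high=5.12):
    """Shifted Rastrigin: individuals are offset by a randomly generated vector
    before evaluation, moving the optimum away from the origin."""
    offset = [random.uniform(low, high) for _ in range(dim)]
    return lambda x: rastrigin([xi - oi for xi, oi in zip(x, offset)])
```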

Fitness Results
SA vs. SuCo compared on Rastrigin and Shifted Rastrigin, each at two problem sizes N; all results statistically significant according to a t-test. [Numeric fitness values were not preserved in this transcript.]

Conclusions
Proof of concept successful – SuCo outperformed SA on all but one test problem
SuCo mutation more successful – adapts better to changes in population fitness

Self-Configuring Crossover [2]

Motivation
Performance Sensitive to Crossover Selection
Identifying & Configuring Best Traditional Crossover is Time Consuming
Existing Operators May Be Suboptimal
Optimal Operator May Change During Evolution

Some Possible Solutions
Meta-EA – exceptionally time consuming
Self-Adaptive Algorithm Selection – limited by the algorithms it can choose from

Self-Configuring Crossover (SCX)
Each Individual Encodes a Crossover Operator
Crossovers Encoded as a List of Primitives – Swap, Merge
Each Primitive has three parameters – each given as a Number, Random, or Inline construct
Example Offspring Crossover: Swap(3, 5, 2), Swap(r, i, r), Merge(1, r, 0.7)

Applying an SCX
Parent 1 genes and Parent 2 genes are concatenated into a single gene list.

The Swap Primitive – Swap(3, 5, 2)
Each Primitive has a type – Swap represents crossovers that move genetic material
First Two Parameters – Start Position, End Position
Third Parameter Primitive Dependent – Swaps use “Width”

Applying an SCX
The first primitive of the offspring crossover [Swap(3, 5, 2), Swap(r, i, r), Merge(1, r, 0.7)] is applied to the concatenated genes.

The Merge Primitive – Merge(1, r, 0.7)
Third Parameter Primitive Dependent – Merges use “Weight”
Random Construct – all past primitive parameters used the Number construct; “r” marks a primitive using the Random construct; allows primitives to act stochastically

Applying an SCX
Merge(1, r, 0.7) is applied using g(i) = α*g(i) + (1-α)*g(j). With α = 0.7 and gene values 1.0 and 6.0: g(1) = 1.0*(0.7) + 6.0*(1-0.7) = 2.5 and g(2) = 6.0*(0.7) + 1.0*(1-0.7) = 4.5.

The Inline Construct – Swap(r, i, r)
Only Usable by First Two Parameters
Denoted as “i”
Forces Primitive to Act on the Same Loci in Both Parents

Applying an SCX
The next primitive, Swap(r, i, r), is applied to the concatenated genes.

Applying an SCX
After all primitives have been applied, excess genes are removed to produce the offspring genes.
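Pulling the preceding slides together, here is a sketch of how an SCX operator, a list of Swap and Merge primitives, might be applied to two real-valued parents. The handling of positions, the Random ("r") and Inline ("i") constructs, and excess-gene removal follows the description above, with unspecified details (0-based indexing, inline only in the second position) filled in as assumptions rather than taken from [2].

```python
import random

def apply_scx(crossover, parent1, parent2):
    """Sketch of applying a Self-Configuring Crossover: the parents are
    concatenated, each primitive is applied in order, and excess genes are
    removed so the offspring has the original length.
    Parameter constructs: a number is used directly, 'r' draws a random valid
    value at application time, and 'i' (inline, assumed here only in the second
    position) forces the second position to the same locus in the other parent."""
    genes = list(parent1) + list(parent2)
    n = len(parent1)

    def resolve(param, kind, first=None):
        if param == 'i':                      # inline construct
            return (first + n) % len(genes)
        if param == 'r':                      # random construct
            return random.randrange(len(genes)) if kind == 'pos' else random.random()
        return param                          # number construct

    for name, p1, p2, p3 in crossover:
        a = resolve(p1, 'pos') % len(genes)
        b = resolve(p2, 'pos', first=a) % len(genes)
        if name == 'swap':                    # moves genetic material
            width = int(resolve(p3, 'pos'))
            for k in range(width):
                i, j = (a + k) % len(genes), (b + k) % len(genes)
                genes[i], genes[j] = genes[j], genes[i]
        elif name == 'merge':                 # blends genetic material by weight
            w = resolve(p3, 'weight')
            gi, gj = genes[a], genes[b]
            genes[a] = w * gi + (1 - w) * gj  # g(i) = alpha*g(i) + (1-alpha)*g(j)
            genes[b] = w * gj + (1 - w) * gi
    return genes[:n]                          # remove excess genes

# The offspring crossover from the slides: Swap(3, 5, 2), Swap(r, i, r), Merge(1, r, 0.7).
scx = [('swap', 3, 5, 2), ('swap', 'r', 'i', 'r'), ('merge', 1, 'r', 0.7)]
print(apply_scx(scx, [1.0, 2.0, 3.0, 4.0, 5.0], [6.0, 7.0, 8.0, 9.0, 10.0]))
```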

Evolving Crossovers
The crossover operators are themselves evolved: parent crossovers (lists of primitives such as Swap(3, 5, 2), Merge(1, r, 0.7), Swap(r, i, r), Merge(i, 8, r), Swap(4, 2, r), Swap(r, 7, 3)) are recombined to produce an offspring crossover (e.g., containing Merge(r, r, r)).

Empirical Quality Assessment
Compared Against – Arithmetic Crossover, N-Point Crossover, Uniform Crossover
On Problems – Rosenbrock, Rastrigin, Offset Rastrigin, NK-Landscapes, DTrap
Results (mean fitness, standard deviation in parentheses; some values were not preserved in this transcript):
Problem            Comparison        SCX
Rosenbrock         (54.54)           (23.33)
Rastrigin          -59.2 (6.998)     (0.021)
Offset Rastrigin   (0.116)           -0.03 (0.028)
NK                 0.771 (0.011)     (0.013)
DTrap              (0.005)           (0.021)

SCX Overhead
Requires No Additional Evaluation
Adds No Significant Increase in Run Time – all linear operations
Adds Initial Crossover Length Parameter – testing showed results fairly insensitive to this parameter; even the worst settings tested achieved better results than the comparison operators

Conclusions
Removes Need to Select Crossover Algorithm
Better Fitness Without Significant Overhead
Benefits From Dynamically Changing Operator

Current Work
Combining Supportive Coevolution and Self-Configuring Crossover by employing the latter as a support population in the former [4]

Part II: Repeatedly Solving Problem Class Instances by Evolving Custom BBSAs Employing Meta-GP [5]

Problem
Difficult Repeated Problems – Cell Tower Placement, Flight Routing
Which BBSA, such as Evolutionary Algorithms, Particle Swarm Optimization, or Simulated Annealing, to use?

Possible Solutions
Automated Algorithm Selection
Self-Adaptive Algorithms
Meta-Algorithms – Evolving Parameters / Selecting Operators, Evolve the Algorithm

Our Solution
Evolving BBSAs employing meta-Genetic Programming (GP)
Post-ordered parse tree
Evolve a repeated iteration

Evolved BBSA structure: Initialization → Iteration → Check for Termination → Terminate (the Iteration step repeats until termination).

Our Solution
Genetic Programming
Post-ordered parse tree
Evolve a repeated iteration
High-level operations

Parse Tree
Represents one iteration; nodes operate on sets of solutions; the root returns the ‘Last’ set.

Nodes: Variation Nodes
Mutate
Mutate w/o creating new solution (Tweak)
Uniform Recombination
Diagonal Recombination

Nodes: Selection Nodes
k-Tournament
Truncation

Nodes: Set Nodes
makeSet
Persistent Sets

Nodes: Set Nodes
makeSet
Persistent Sets
addSet
‘Last’ Set

Nodes: Evaluation Node
Evaluates the nodes passed in
Allows multiple operations and accurate selections within an iteration
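A rough sketch of the parse-tree representation described on the preceding slides: nodes operate on sets of candidate solutions, variation and selection nodes transform them, and the evolved iteration is executed repeatedly with the root's output becoming the next iteration's 'Last' set. The node semantics and class names below are illustrative; [5] defines the actual primitive set.

```python
import random

class Solution:
    """Candidate solution with cached fitness (maximization assumed)."""
    def __init__(self, genome, objective):
        self.genome, self.objective = genome, objective
        self.fitness = objective(genome)
    def mutated(self, step):
        return Solution([g + random.gauss(0, step) for g in self.genome], self.objective)

class LastSet:
    """Set node: returns the 'Last' set produced by the previous iteration."""
    def execute(self, last_set):
        return last_set

class Mutate:
    """Variation node: creates new solutions by mutating its child's output."""
    def __init__(self, child, step):
        self.child, self.step = child, step
    def execute(self, last_set):
        return [s.mutated(self.step) for s in self.child.execute(last_set)]

class TournamentSelect:
    """Selection node: k-tournament selection over its child's output set."""
    def __init__(self, child, k, count):
        self.child, self.k, self.count = child, k, count
    def execute(self, last_set):
        pool = self.child.execute(last_set)
        return [max(random.sample(pool, self.k), key=lambda s: s.fitness)
                for _ in range(self.count)]

def run_evolved_bbsa(root, population, iterations):
    """The evolved iteration (the parse tree) is executed repeatedly; the root's
    output becomes the 'Last' set of the next iteration."""
    for _ in range(iterations):
        population = root.execute(population)
    return max(population, key=lambda s: s.fitness)

# Example tree: mutate a tournament-selected subset of the 'Last' set each iteration.
objective = lambda g: -sum(x * x for x in g)
population = [Solution([random.uniform(-5, 5) for _ in range(10)], objective) for _ in range(30)]
tree = Mutate(TournamentSelect(LastSet(), k=3, count=30), step=0.1)
print(run_evolved_bbsa(tree, population, iterations=200).fitness)
```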

Meta-Genetic Program
Create Valid Population → Generate Children → Evaluate Children → Select Survivors → Check Termination

BBSA Evaluation
Evaluating a child in the meta-GP loop (Create Valid Population → Generate Children → Evaluate Children → Select Survivors) means evaluating a candidate BBSA.
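One plausible way a candidate BBSA could be scored in the Evaluate Children step is sketched below, reusing the node and Solution classes from the previous sketch: run the evolved iteration on the target problem several times and average the best fitness found. This is an assumption for illustration; the exact meta-level fitness used in [5] may differ.

```python
def evaluate_bbsa(tree, make_population, iterations, runs=5):
    """Meta-level fitness sketch (assumed, not from [5]): run the candidate BBSA,
    an evolved parse tree as in the previous sketch, several times on the target
    problem and average the best fitness it finds."""
    scores = []
    for _ in range(runs):
        population = make_population()          # fresh initial population per run
        for _ in range(iterations):
            population = tree.execute(population)
        scores.append(max(s.fitness for s in population))
    return sum(scores) / runs
```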

Termination Conditions
Evaluations
Iterations
Operations
Convergence

Testing
Deceptive Trap Problem: the bit string is partitioned into traps, e.g., blocks 0|0|1|1|0, 0|1|0|1|0, 1|1|1|1|0.
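The slides do not give the trap fitness explicitly; the sketch below is the standard trap-k formulation (a k-bit block scores k if it is all ones, otherwise k - 1 minus its number of ones), which matches the block structure shown.

```python
def deceptive_trap(bits, trap_size):
    """Standard deceptive trap-k fitness (assumed formulation): each block of
    trap_size bits scores trap_size if all ones, otherwise trap_size - 1 minus
    its number of ones, which deceptively rewards strings with fewer ones."""
    total = 0
    for start in range(0, len(bits), trap_size):
        ones = sum(bits[start:start + trap_size])
        total += trap_size if ones == trap_size else trap_size - 1 - ones
    return total

# Bit-length 100, trap size 5, as in the training configuration below.
print(deceptive_trap([0] * 100, trap_size=5))   # 80: the deceptive all-zeros local optimum
print(deceptive_trap([1] * 100, trap_size=5))   # 100: the global optimum
```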

Testing (cont.)
Problem Configuration – Bit-length = 100, Trap Size = 5
Verification Problem Configurations (External Verification):
– Bit-length = 100, Trap Size = 5
– Bit-length = 200, Trap Size = 5
– Bit-length = 105, Trap Size = 7
– Bit-length = 210, Trap Size = 7

Results
60% Success Rate

Results: Bit-Length = 100, Trap Size = 5

Results: Bit-Length = 200, Trap Size = 5

Results: Bit-Length = 105, Trap Size = 7

Results: Bit-Length = 210, Trap Size = 7

Evolved BBSAs: BBSA1, BBSA2, BBSA3

BBSA1

BBSA2

BBSA3

Over-Specialization
Performance on the Trained Problem Configuration vs. an Alternate Problem Configuration

BBSA2

Summary
Created a novel meta-GP approach for evolving BBSAs tuned to specific problem classes
Ideal for solving repeated problems
Evolved a custom BBSA which outperformed a standard EA and a hill-climber on all tested problem instances
Future work includes adding additional primitives and testing against state-of-the-art BBSAs on more challenging problems

Take Home Message
Practitioners need automated algorithm selection & configuration
The number of BBSAs is increasing rapidly, making the selection of the best one to employ for a given problem increasingly difficult
Some recent BBSAs facilitate automated real-world problem solving

References
[1] Brian W. Goldman and Daniel R. Tauritz. Meta-Evolved Empirical Evidence of the Effectiveness of Dynamic Parameters. In Proceedings of the 13th Annual Conference Companion on Genetic and Evolutionary Computation (GECCO '11), Dublin, Ireland, July 12-16, 2011.
[2] Brian W. Goldman and Daniel R. Tauritz. Self-Configuring Crossover. In Proceedings of the 13th Annual Conference Companion on Genetic and Evolutionary Computation (GECCO '11), Dublin, Ireland, July 12-16, 2011.
[3] Brian W. Goldman and Daniel R. Tauritz. Supportive Coevolution. In Proceedings of the 14th Annual Conference Companion on Genetic and Evolutionary Computation (GECCO '12), pages 59-66, Philadelphia, U.S.A., July 7-11, 2012.
[4] Nathaniel R. Kamrath, Brian W. Goldman and Daniel R. Tauritz. Using Supportive Coevolution to Evolve Self-Configuring Crossover. In Proceedings of the 15th Annual Conference Companion on Genetic and Evolutionary Computation (GECCO '13), Amsterdam, The Netherlands, July 6-10, 2013.
[5] Matthew A. Martin and Daniel R. Tauritz. Evolving Black-Box Search Algorithms Employing Genetic Programming. In Proceedings of the 15th Annual Conference Companion on Genetic and Evolutionary Computation (GECCO '13), Amsterdam, The Netherlands, July 6-10, 2013.