Symposium “New Directions in Evolutionary Computation” Dr. Daniel Tauritz Director, Natural Computation Laboratory Associate Professor, Department of Computer Science Research Investigator, Intelligent Systems Center Collaborator, Energy Research & Development Center New Directions in Parameterless Evolutionary Algorithms

Vision
NOW: problem instance + fitness function + representation + EA operators + EA parameters → EA → solution (good solution if operators and parameters are suitably configured)
GOAL: problem instance + fitness function + representation → Parameterless EA → good solution

EA Operators
Parent selection, mate pairing
Recombination
Mutation
Survival selection

EA Parameters
Population size
Initialization related parameters
Parent selection parameters
Number of offspring
Recombination parameters
Mutation parameters
Survivor selection parameters
Termination related parameters
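To make this parameter burden concrete, here is a minimal sketch (not from the slides; all names and default values are hypothetical) of a configuration object listing the kinds of parameters a conventional GA exposes; every field would ordinarily need a priori tuning.

```python
from dataclasses import dataclass

@dataclass
class EAConfig:
    """Illustrative (hypothetical) parameter set for a conventional GA."""
    population_size: int = 100      # mu
    offspring_size: int = 100       # lambda
    tournament_size: int = 4        # parent selection pressure
    crossover_rate: float = 0.9     # recombination parameter
    mutation_rate: float = 0.01     # per-gene mutation probability
    max_evaluations: int = 50_000   # termination criterion

print(EAConfig())
```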

Motivation for Parameterless EAs
Parameterless EAs do not require parameters to be specified a priori
A priori parameter tuning is computationally expensive
Facilitate use by non-experts

Static vs. dynamic parameters
Static parameters remain constant during evolution; dynamic parameters can change
The optimal value of a parameter can change during evolution
Parameterless EAs with static parameters need a fully automated tuning mechanism (still computationally expensive & suboptimal)
Therefore desired: a parameterless EA with dynamic parameters

Parameter Control
While dynamic parameters can benefit from tuning, they can be much less sensitive to initial values than static ones
Parameter control adjusts dynamic parameters during evolution
Three main parameter control classes:
–Blind
–Adaptive
–Self-Adaptive
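As an illustration of the three classes, a minimal sketch (my own, not from the presentation) contrasting blind, adaptive, and self-adaptive control of a single mutation-rate parameter; the schedules and update constants are illustrative assumptions.

```python
import math
import random

def blind_mutation_rate(generation, initial=0.10, decay=0.99):
    # Blind (deterministic) control: the rate follows a fixed,
    # pre-specified schedule and ignores feedback from the search.
    return initial * decay ** generation

def adaptive_mutation_rate(current_rate, success_ratio, target=0.2):
    # Adaptive control: the rate is adjusted from explicit feedback,
    # here the fraction of offspring that improved on their parents.
    return current_rate * 1.1 if success_ratio > target else current_rate * 0.9

def self_adaptive_mutation_rate(parent_rate, tau=0.1):
    # Self-adaptive control: the rate is encoded in the genotype and is
    # itself varied (log-normal perturbation), so good rates hitchhike
    # along with good solutions.
    return parent_rate * math.exp(tau * random.gauss(0.0, 1.0))

print(blind_mutation_rate(10))
print(adaptive_mutation_rate(0.05, success_ratio=0.3))
print(self_adaptive_mutation_rate(0.05))
```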

Prior (Semi-)Parameterless EAs
1994 Genetic Algorithm with Varying Population Size (GAVaPS)
2000 Genetic Algorithm with Adaptive Population Size (APGA)
–dynamic population size as emergent behavior of individual survival tied to age
–both introduce two new parameters: MinLT and MaxLT; furthermore, population size converges to 0.5 * offspring size * (MinLT + MaxLT)
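A rough sketch of the lifetime idea behind GAVaPS/APGA, under my own simplifying assumptions (linear lifetime allocation, dict-based individuals): each individual receives a lifetime between MinLT and MaxLT at birth and dies once its age exceeds it, so the population size emerges rather than being set.

```python
import random

MIN_LT, MAX_LT = 1, 11   # the two extra parameters these methods introduce

def assign_lifetime(fitness, f_min, f_max):
    # Fitter individuals receive longer lifetimes; linear scaling here is
    # illustrative (GAVaPS defines several allocation strategies).
    if f_max == f_min:
        return (MIN_LT + MAX_LT) // 2
    frac = (fitness - f_min) / (f_max - f_min)
    return round(MIN_LT + frac * (MAX_LT - MIN_LT))

def age_population(population):
    # Survival is tied to age rather than to centralized survivor selection:
    # an individual dies once its age exceeds its allotted lifetime.
    survivors = []
    for ind in population:
        ind["age"] += 1
        if ind["age"] <= ind["lifetime"]:
            survivors.append(ind)
    return survivors

population = [{"fitness": random.random(), "age": 0} for _ in range(10)]
fitnesses = [ind["fitness"] for ind in population]
for ind in population:
    ind["lifetime"] = assign_lifetime(ind["fitness"], min(fitnesses), max(fitnesses))
for _ in range(5):
    population = age_population(population)
print(len(population), "individuals remain after 5 generations of aging")
```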

Prior (Semi-)Parameterless EAs
1995 (1,λ)-ES with dynamic offspring size employing adaptive control
–adjusts λ based on the second best individual created
–goal is to maximize local serial progress-rate, i.e., expected fitness gain per fitness evaluation
–maximizes convergence rate, which often leads to premature convergence on complex fitness landscapes

Prior (Semi-)Parameterless EAs
1999 Parameter-less GA
–runs multiple fixed size populations in parallel
–the sizes are powers of 2, starting with 4 and doubling the size of the largest population to produce the next largest population
–smaller populations are preferred by allotting them more generations
–a population is deleted if a) its average fitness is exceeded by the average fitness of a larger population, or b) the population has converged
–no limit on number of parallel populations
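The following is a heavily simplified sketch (my own) of the parameter-less GA's population bookkeeping: `evolve_one_generation` is a stand-in for a real GA generation, and running population i every 2**i steps only approximates the original counter-based schedule; convergence-based deletion is omitted.

```python
import random

def avg_fitness(pop):
    return sum(pop) / len(pop)

def evolve_one_generation(pop):
    # Stand-in for a real GA generation: each "individual" is just a fitness
    # value that drifts upward, which is enough to illustrate the scheduling.
    return [f + random.uniform(0.0, 0.1) for f in pop]

def new_population(size):
    return [random.random() for _ in range(size)]

populations = [new_population(4)]          # sizes 4, 8, 16, ... as they spawn
for step in range(200):
    # Smaller populations are run more often (population i runs every 2**i steps).
    for i, pop in enumerate(populations):
        if step % (2 ** i) == 0:
            populations[i] = evolve_one_generation(pop)
    # Periodically spawn a new population twice as large as the largest one.
    if step > 0 and step % (2 ** len(populations)) == 0:
        populations.append(new_population(2 * len(populations[-1])))
    # Delete a population whose average fitness is exceeded by a larger one.
    populations = [p for i, p in enumerate(populations)
                   if not any(avg_fitness(q) > avg_fitness(p)
                              for q in populations[i + 1:])]

print("surviving population sizes:", [len(p) for p in populations])
```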

Prior (Semi-)Parameterless EAs
2003 self-adaptive selection of reproduction operators
–each individual contains a vector of probabilities of using each reproduction operator defined for the problem
–probability vectors updated every generation
–in the case of a multi-ary reproduction operator, another individual is selected which prefers the same reproduction operator
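A small sketch of the per-individual operator probability vector idea; the reinforcement rule shown is an illustrative assumption, not the update rule of the 2003 work.

```python
import random

OPERATORS = ["one_point_xover", "uniform_xover", "bit_flip_mutation"]

def choose_operator(individual):
    # Each individual carries its own probability vector over the
    # reproduction operators and draws its operator from it.
    return random.choices(OPERATORS, weights=individual["op_probs"])[0]

def update_probs(individual, op, produced_fitter_offspring, step=0.05):
    # Illustrative per-generation update: reward operators whose use
    # produced fitter offspring, then renormalize.
    probs = individual["op_probs"]
    i = OPERATORS.index(op)
    probs[i] = max(probs[i] + (step if produced_fitter_offspring else -step), 0.01)
    total = sum(probs)
    individual["op_probs"] = [p / total for p in probs]

ind = {"op_probs": [1 / 3] * 3}
op = choose_operator(ind)
update_probs(ind, op, produced_fitter_offspring=True)
print(op, ind["op_probs"])
```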

Prior (Semi-)Parameterless EAs
2004 Population Resizing on Fitness Improvement GA (PRoFIGA)
–dynamically balances exploration versus exploitation by tying population size to the magnitude of fitness increases, with a special mechanism to escape local optima
–introduces several new parameters

Prior (Semi-)Parameterless EAs
2005 (1+λ)-ES with dynamic offspring size employing adaptive control
–adjusts λ based on the number of offspring fitter than their parent: if none are fitter, then double λ; otherwise divide λ by the number that are fitter
–idea is to quickly increase λ when it appears to be too small, otherwise to decrease it based on the current success rate
–has problems with complex fitness landscapes that require a large λ to ensure that successful offspring lie on the path to the global optimum
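The λ update rule just described can be written down directly; a minimal sketch, where the integer division and the lower bound of 1 are my assumptions:

```python
def adjust_lambda(current_lambda, num_fitter_offspring):
    # Rule from the slide: if no offspring beat their parent, double lambda;
    # otherwise divide lambda by the number of offspring that did.
    if num_fitter_offspring == 0:
        return 2 * current_lambda
    return max(1, current_lambda // num_fitter_offspring)

print(adjust_lambda(8, 0))   # lambda looks too small -> 16
print(adjust_lambda(8, 2))   # 2 successful offspring -> 4
```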

Prior (Semi-)Parameterless EAs
2006 self-adaptation of population size and selective pressure
–employs a “voting system” by encoding an individual’s contribution to population size in its genotype
–population size is determined by summing up all the individual “votes”
–adds new parameters p_min and p_max that determine an individual’s vote value range
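A tiny sketch of the vote-summing idea; the field names and the clamping step are my assumptions.

```python
P_MIN, P_MAX = 1, 5   # new parameters bounding each individual's vote

def population_size_from_votes(population):
    # Each genotype encodes a vote for the population size; the next size is
    # the sum of all votes, each clamped to [P_MIN, P_MAX].
    return sum(min(max(ind["vote"], P_MIN), P_MAX) for ind in population)

votes = [{"vote": 2}, {"vote": 4}, {"vote": 9}]   # 9 is clamped to P_MAX = 5
print(population_size_from_votes(votes))          # -> 11
```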

NC-LAB Vision for a New Direction in Parameterless EAs: Autonomous EAs (AutoEAs)

Motivation
Selection operators are not commonly used in an adaptive manner
Most selection pressure mechanisms are based on Boltzmann selection
Framework for creating Parameterless EAs
Centralized population size control, parent selection, mate pairing, offspring size control, and survival selection are highly unnatural!

Approach
Remove unnatural centralized control by:
–Letting individuals select their own mates
–Letting couples decide how many offspring to have
–Giving each individual its own survival chance

Autonomous EAs (AutoEAs)
An AutoEA is an EA where all the operators work at the individual level (as opposed to traditional EAs, where parent selection and survival selection work at the population level in a decidedly unnatural centralized manner)
Population & offspring size become dynamic derived variables determined by the emergent behavior of the system

Self-Adaptive Semi-Autonomous Parent Selection (SASAPAS)
Each individual has an evolving mate selection function
Two ways to pair individuals:
–Democratic approach
–Dictatorial approach

Democratic Approach

Dictatorial Approach

Self-Adaptive Semi-Autonomous Dictatorial Parent Selection (SASADIPS)
Each individual has an evolving mate selection function
First parent selected in a traditional manner
Second parent selected by the first parent – the dictator – using its mate selection function

Mate selection function representation
Expression tree as in GP
Set of primitives – pre-built selection methods

Mate selection function evolution
Let F be a fitness function defined on a candidate solution
Let improvement(x) = F(x) − max{F(p1), F(p2)}
Max fitness plot; the slope at generation i is s(g_i)

Mate selection function evolution
IF improvement(offspring) > s(g_{i-1})
–Copy the first parent’s mate selection function (single parent inheritance)
Otherwise
–Recombine the two parents’ mate selection functions using standard GP crossover (multi-parent inheritance)
–Apply a mutation chance to the offspring’s mate selection function
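A small sketch of this inheritance rule; the GP crossover and mutation operators are represented by placeholder functions, and the data structures and names are my assumptions.

```python
import random

def improvement(offspring_fitness, parent1_fitness, parent2_fitness):
    # improvement(x) = F(x) - max{F(p1), F(p2)} from the previous slide
    return offspring_fitness - max(parent1_fitness, parent2_fitness)

def inherit_mate_selector(parent1, parent2, offspring_improvement,
                          previous_slope, gp_crossover, gp_mutate,
                          mutation_chance=0.1):
    # If the offspring improved faster than the max-fitness slope at the
    # previous generation, keep the first (dictator) parent's mate selection
    # function; otherwise recombine both parents' functions and maybe mutate.
    if offspring_improvement > previous_slope:
        return parent1["mate_selector"]
    child = gp_crossover(parent1["mate_selector"], parent2["mate_selector"])
    if random.random() < mutation_chance:
        child = gp_mutate(child)
    return child

# Placeholder GP operators so the sketch runs; real SASADIPS uses GP trees.
fake_crossover = lambda a, b: f"xover({a}, {b})"
fake_mutate = lambda t: f"mutate({t})"
p1 = {"mate_selector": "tournament(3)"}
p2 = {"mate_selector": "fitness_proportional()"}
print(inherit_mate_selector(p1, p2, improvement(0.9, 0.6, 0.5), 0.2,
                            fake_crossover, fake_mutate))
```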

Experiments
Counting ones
4-bit deceptive trap
–If 4 ones => fitness = 8
–If 3 ones => fitness = 0
–If 2 ones => fitness = 1
–If 1 one => fitness = 2
–If 0 ones => fitness = 3
SAT
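For concreteness, a minimal sketch of the 4-bit deceptive trap fitness listed above, applied blockwise to a bitstring; scoring a full solution as the sum over consecutive 4-bit blocks is my assumption.

```python
def trap4(block):
    # 4-bit deceptive trap from the slide: the all-ones block scores 8,
    # otherwise fitness rewards having FEWER ones (3->0, 2->1, 1->2, 0->3),
    # which deceives hill-climbing toward the all-zeros local optimum.
    ones = sum(block)
    return 8 if ones == 4 else 3 - ones

def deceptive_fitness(bitstring):
    # Sum trap scores over consecutive 4-bit blocks (assumed scoring scheme).
    return sum(trap4(bitstring[i:i + 4]) for i in range(0, len(bitstring), 4))

print(deceptive_fitness([1, 1, 1, 1, 0, 0, 0, 0]))   # 8 + 3 = 11
print(deceptive_fitness([1, 1, 1, 0, 0, 1, 0, 0]))   # 0 + 2 = 2
```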

Counting ones results

Highly evolved mate selection function

SAT results

4-bit deceptive trap results

SASADIPS shortcomings
Steep fitness increase in the early generations may lead to premature convergence to suboptimal solutions
Good mate selection functions hard to find
Provided mate selection primitives may be insufficient to build a good mate selection function
New parameters were introduced
Only semi-autonomous

Greedy Population Sizing (GPS)

The parameter-less GA
[diagram: populations P_0, P_1, P_2, … with |P_1| = 2|P_0|, …, |P_{i+1}| = 2|P_i|, and the fitness evaluations allotted to each]
Evolve an unbounded number of populations in parallel
Smaller populations are given more fitness evaluations
Terminate a smaller population whose avg. fitness is exceeded by a larger population

Greedy Population Sizing
[diagram: populations P_0 … P_5 and fitness evaluation budgets F_1 … F_4]
Evolve exactly two populations in parallel
Equal number of fitness evals. per population
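The bookkeeping might look roughly like the sketch below; this is my reading of GPS (doubling sizes, equal evaluation budgets, greedy replacement of the smaller population once the larger one overtakes it), with a stand-in `run_one_generation`, not the authors' implementation.

```python
import random

def avg_fitness(pop):
    return sum(pop) / len(pop)

def run_one_generation(pop):
    # Stand-in for a real GA generation; "individuals" are fitness values
    # that drift upward, which is enough to show the bookkeeping.
    return [f + random.uniform(0.0, 0.05) for f in pop]

def greedy_population_sizing(initial_size=4, rounds=6):
    smaller = [random.random() for _ in range(initial_size)]
    larger = [random.random() for _ in range(2 * initial_size)]
    for _ in range(rounds):
        # Both populations get the same fitness-evaluation budget, so the
        # smaller one is run for proportionally more generations.
        budget = len(larger)
        for _ in range(budget // len(smaller)):
            smaller = run_one_generation(smaller)
        larger = run_one_generation(larger)
        # Greedy step: once the larger population's average fitness catches
        # up, drop the smaller population and open a new one of twice the
        # larger population's size.
        if avg_fitness(larger) >= avg_fitness(smaller):
            smaller, larger = larger, [random.random()
                                       for _ in range(2 * len(larger))]
    return len(smaller), len(larger)

print("final population sizes:", greedy_population_sizing())
```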

GPS-EA vs. parameter-less GA
[table comparing fitness evaluations F_1 … F_4 spent at population sizes N and 2N]
Total fitness evaluations – parameter-less GA: 2F_1 + 2F_2 + … + 2F_k + 3N; GPS-EA: F_1 + F_2 + … + F_k + 2N

GPS-EA vs. the parameter-less GA, OPS-EA and TGA
GPS-EA < parameter-less GA
TGA < GPS-EA < OPS-EA
GPS-EA finds overall better solutions than the parameter-less GA
[plot: Deceptive Problem]

Limiting Cases
F_avg(P_{i+1}) < F_avg(P_i)
No larger populations are created
No fitness improvements until termination
Approx. 30% limiting cases
Large std. dev., but lower MBF
Automatic detection of the limiting cases is needed

GPS-EA Summary
Advantages
–Automated population size control
–Finds high quality solutions
Problems
–Limiting cases
–Restart of evolution each time

Estimated Learning Offspring Optimizing Mate Selection (ELOOMS)

Traditional Mate Selection
[diagram: MATES]
t-tournament selection; t is user-specified

ELOOMS
[diagram: candidate mates accept (YES) or reject (NO) each other before pairing]

Mate Acceptance Chance (MAC)
[diagram: individual j, with desired features d_1 d_2 d_3 … d_L, asks “How much do I like k?” of a candidate mate k with bits b_1 b_2 b_3 … b_L]

Desired Features
Build a model d_1 d_2 d_3 … d_L of the desired potential mate for each individual j
d_i = (# times past mates’ b_i = 1 was used to produce fit offspring) / (# times past mates’ b_i was used to produce offspring)
Update the model for each encountered mate
Similar to Estimation of Distribution Algorithms
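A minimal sketch of how such a desired-feature model could be maintained and used; the slide does not give the MAC formula, so the agreement-based acceptance chance below is an illustrative assumption.

```python
def update_desired_features(model, mate_bits, offspring_was_fit):
    # model["fit_counts"][i]  : times a past mate had bit i = 1 and the
    #                           pairing produced fit offspring
    # model["used_counts"][i] : times a past mate's bit i was used at all
    for i, bit in enumerate(mate_bits):
        model["used_counts"][i] += 1
        if bit == 1 and offspring_was_fit:
            model["fit_counts"][i] += 1

def desired_feature(model, i):
    # d_i: estimated chance that a desirable mate has bit i set.
    return model["fit_counts"][i] / max(model["used_counts"][i], 1)

def mate_acceptance_chance(model, candidate_bits):
    # Illustrative MAC: average agreement between the candidate's bits and
    # the desired-feature model (assumed formula, not from the slide).
    agreement = sum(desired_feature(model, i) if bit == 1
                    else 1.0 - desired_feature(model, i)
                    for i, bit in enumerate(candidate_bits))
    return agreement / len(candidate_bits)

model = {"fit_counts": [0] * 4, "used_counts": [0] * 4}
update_desired_features(model, [1, 1, 0, 1], offspring_was_fit=True)
update_desired_features(model, [1, 0, 0, 1], offspring_was_fit=False)
print(mate_acceptance_chance(model, [1, 1, 0, 1]))
```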

ELOOMS vs. TGA
[plots: Easy Problem, with mutation, L=500 and L=1000]

ELOOMS vs. TGA
[plots: Deceptive Problem, L=100, without and with mutation]

Why ELOOMS works on Deceptive Problem
More likely to preserve optimal structure
Will equally like: [example individuals not preserved in the transcript]
But will dislike individuals not of the form: 1111 xxxx …

Why ELOOMS does not work as well on Easy Problem
High fitness – short distance to optimal
Mating with high fitness individuals – closer to optimal offspring
Fitness – good measure of good mate
ELOOMS – approximate measure of good mate

ELOOMS computational overhead
L – solution length
μ – population size
T – avg. # mates evaluated per individual
Update stage:
–6L additions
Mate selection stage:
–2L*T*μ additions

ELOOMS Summary
Advantages
–Autonomous mate pairing
–Improved performance (some cases)
–Natural termination condition
Disadvantages
–Relies on competition selection pressure
–Computational overhead can be significant

GPS-EA + ELOOMS Hybrid

Expiration of population P_i
If F_avg(P_{i+1}) > F_avg(P_i)
–Limiting cases possible
If no mate pairs in P_i (ELOOMS)
–Detection of the limiting cases

Comparing the Algorithms
[plots: Deceptive Problem, L=100, without and with mutation]

GPS-EA + ELOOMS vs. parameter-less GA and TGA
[plots: Deceptive Problem, L=100, without and with mutation]

GPS-EA + ELOOMS vs. parameter-less GA and TGA
[plots: Easy Problem, L=500, without and with mutation]

GPS-EA + ELOOMS Summary
Advantages
–No population size tuning
–No parent selection pressure tuning
–No limiting cases
–Superior performance on deceptive problem
Disadvantages
–Reduced performance on easy problem
–Relies on competition selection pressure

NC-LAB’s current AutoEA research
Make λ a dynamic derived variable by self-adapting each individual’s desired offspring size
Promote “birth control” by penalizing fitness based on “child support” and use fitness-based survival selection
Make μ a dynamic derived variable by giving each individual its own survival chance
Make individuals mortal by having them age and making an individual’s survival chance dependent on its age as well as its fitness