Evolutionary Computational Intelligence, Lecture 8: Memetic Algorithms. Ferrante Neri, University of Jyväskylä.


The Optimization Problem
Every problem considered here can be formulated as an optimization problem: the search for the maximum (or the minimum) of a given objective function.
– Deterministic methods can fail because they may converge to a local optimum.
– Evolutionary Algorithms can fail because they may converge to a sub-optimal solution.

“Dialects” Developing in Artificial Intelligence
– Fogel and Owens (USA, 1965): Evolutionary Programming
– Holland (USA, 1973): Genetic Algorithms
– Rechenberg and Schwefel (Germany, 1973): Evolution Strategies
– 1990s: the umbrella term Evolutionary Algorithms (EA)

Historical Info about MAs
The term Memetic Algorithm (MA) was coined by Moscato (1989), but, as often happens, the same idea has also appeared under other names:
– Hybrid GAs
– Baldwinian GAs
– Lamarckian GAs
– others

The Metaphor
The meme, from “The Selfish Gene” (Dawkins, 1976): a meme is a unit of “cultural transmission”, in the same way that genes are units of biological transmission. In EAs, genes encode candidate solutions; in MAs, memes are also “strategies” for how to improve those solutions.

Memetic Algorithms
The combination of Evolutionary Algorithms with local search operators that work within the EA loop has been termed “Memetic Algorithms”. The term also applies to EAs that use instance-specific knowledge in their operators. Memetic Algorithms have been shown to be orders of magnitude faster and more accurate than plain EAs on some problems, and are the state of the art on many problems.

Michalewicz’s View on EAs (figure only on this slide)

Local Searchers
A Local Searcher (LS) is a deterministic method able to find the nearest local optimum. Local searchers can be classified according to:
– order
– pivot rule
– depth
– neighborhood

Local Searchers’ Classification
Order: zero if the LS uses only function values (direct search), one if it uses the first derivative, two if it uses the second derivative.
Steepest ascent pivot rule: the LS explores the whole neighborhood before moving (e.g. the Hooke-Jeeves method). Greedy pivot rule: the LS takes the first improving search direction found (e.g. the Nelder-Mead method).

Local Searchers’ Classification (continued)
Depth: the depth of the local search defines the termination condition for its outer loop (stop criterion).
Neighborhood: the neighborhood generating function n(i) defines the set of points that can be reached by applying some move operator to the point i.
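The two pivot rules above can be contrasted in a minimal sketch (an assumed, illustrative example, not from the slides): a local search on bit strings whose neighborhood n(i) is “flip exactly one bit”, with the depth acting as the outer-loop budget.

```python
def neighbours(x):
    """n(i): all strings reachable by flipping exactly one bit of x."""
    return [x[:k] + [1 - x[k]] + x[k + 1:] for k in range(len(x))]

def local_search(x, fitness, pivot="steepest", max_depth=100):
    for _ in range(max_depth):            # "depth" = outer-loop budget
        best, improved = x, False
        for y in neighbours(x):
            if fitness(y) > fitness(best):
                best, improved = y, True
                if pivot == "greedy":     # take the first improving move
                    break
        if not improved:                  # no improving neighbour: local optimum
            return x
        x = best                          # steepest: best move of the full scan
    return x

onemax = sum                              # fitness: number of ones
print(local_search([0, 1, 0, 0, 1], onemax))  # [1, 1, 1, 1, 1]
```

On this unimodal toy fitness both rules reach the same optimum; they differ in how many fitness evaluations each step costs.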

General Scheme of EAs (figure only on this slide)

Pseudo-Code for a Typical EA (figure only on this slide)
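The slide’s pseudo-code is not preserved in this transcript; the following is a hedged reconstruction of the typical EA loop it refers to. All operator names (initialize, evaluate, select, recombine, mutate, replace, terminated) are placeholders for problem-specific choices.

```python
def evolutionary_algorithm(initialize, evaluate, select, recombine,
                           mutate, replace, terminated):
    """Generic EA skeleton: the concrete operators are passed in."""
    population = initialize()
    fitness = evaluate(population)
    while not terminated(population, fitness):
        parents = select(population, fitness)      # parent selection
        offspring = mutate(recombine(parents))     # variation operators
        population = replace(population, offspring, fitness)  # survivor selection
        fitness = evaluate(population)
    return population, fitness
```

Any concrete GA, ES, or EP instance is obtained by plugging its own representation and operators into this skeleton.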

How to Combine EA and LS (figure only on this slide)

Intelligent Initialization
The initial population is not generated pseudo-randomly but according to a heuristic rule. Examples: quasi-random generators, orthogonal arrays. This increases the average fitness but decreases the diversity.
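As a hedged sketch of such a heuristic rule (an assumed example in the spirit of quasi-random generators, not the slides’ own method), a stratified, Latin-hypercube-style sample spreads the initial population over [0, 1]^d more evenly than independent pseudo-random draws:

```python
import random

def stratified_init(pop_size, dim, seed=0):
    """One value per stratum [k/pop_size, (k+1)/pop_size) in each dimension,
    independently shuffled, so every coordinate axis is covered evenly."""
    rng = random.Random(seed)
    columns = []
    for _ in range(dim):
        column = [(k + rng.random()) / pop_size for k in range(pop_size)]
        rng.shuffle(column)
        columns.append(column)
    # transpose: one individual per row
    return [list(ind) for ind in zip(*columns)]

pop = stratified_init(pop_size=5, dim=2)
# every individual's first coordinate falls in a distinct fifth of [0, 1]
```

Unlike a purely random start, no region of any axis is left empty, which illustrates the fitness/diversity trade-off mentioned above.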

Intelligent Variation Operators
Intelligent crossover: finds the best combination of the parents in order to generate the best-performing offspring (e.g. heuristic selection of the cut point).
Intelligent mutation: tries several possible mutated individuals in order to obtain the “luckiest” mutation (e.g. the best bit to flip).
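An intelligent mutation of the bit-flip kind can be sketched as follows (an assumed, illustrative example): instead of flipping a random bit, evaluate every single-bit mutant and keep the fittest one.

```python
def intelligent_mutation(x, fitness):
    """Try all single-bit flips of x and return the best-scoring mutant."""
    mutants = [x[:k] + [1 - x[k]] + x[k + 1:] for k in range(len(x))]
    return max(mutants, key=fitness)

print(intelligent_mutation([1, 0, 1], sum))  # [1, 1, 1]
```

The cost is one fitness evaluation per bit, which is exactly the trade-off such operators make: more evaluations per variation in exchange for better offspring.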

Properly Said Memetic Algorithms: Local Search Acting on the Offspring
This can be viewed as a sort of “lifetime learning”: the LS is applied to the offspring in order to obtain better-performing individuals. A LS can also be viewed as a special mutation operator, and it is often (but not only!) used to speed up the “endgame” of an EA by searching in the vicinity of good solutions. In fact, EAs are efficient at finding solutions near the optimum, but not at finalizing the search.
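A runnable toy instance of this scheme (an assumed sketch, not the lecture’s own code): a steady-state GA on the OneMax problem, where each offspring is refined by a greedy one-bit-flip hill-climber before entering the population; the hill-climb is the “memetic” step.

```python
import random

rng = random.Random(1)
DIM, POP, GENS = 12, 8, 20
fitness = sum                                  # OneMax: maximise number of ones

def local_search(x):
    """Greedy hill-climb: accept the first improving one-bit flip, repeat."""
    improved = True
    while improved:
        improved = False
        for k in range(len(x)):
            y = x[:k] + [1 - x[k]] + x[k + 1:]
            if fitness(y) > fitness(x):
                x, improved = y, True
                break
    return x

pop = [[rng.randint(0, 1) for _ in range(DIM)] for _ in range(POP)]
for _ in range(GENS):
    a, b = rng.sample(pop, 2)                  # parent selection
    cut = rng.randrange(1, DIM)                # one-point crossover
    child = a[:cut] + b[cut:]
    k = rng.randrange(DIM)                     # mutation: flip one random bit
    child[k] = 1 - child[k]
    child = local_search(child)                # the "lifetime learning" step
    pop.sort(key=fitness)
    pop[0] = child                             # replace the worst individual
print(max(fitness(ind) for ind in pop))        # 12
```

On OneMax the greedy hill-climber alone already finishes the job, which makes the point of the slide concrete: the EA supplies good starting points and diversity, while the LS finalizes the search.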

How to Apply a Local Searcher?
Krasnogor (2002) shows that there are theoretical advantages to using a local search whose move operator differs from the move operators used by mutation and crossover. But:
– How many iterations of the local search are performed?
– Is local search applied to the whole population, just the best, just the worst, or to a certain part of the population according to some rule?
Basically, the right choice depends on the problem!

Two Models of Lifetime Adaptation
Lamarckian: traits acquired by an individual during its lifetime can be transmitted to its offspring (the genotype is refreshed), e.g. the individual is replaced by a fitter neighbour.
Baldwinian: traits acquired by an individual cannot be transmitted to its offspring (they suggest a new search direction), e.g. the individual receives the fitness (but not the genotype) of a fitter neighbour.
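The difference between the two models is one line of code. The sketch below is an assumed, illustrative example: `improve` stands for any local search returning a fitter neighbour.

```python
def lamarckian(individual, fitness, improve):
    refined = improve(individual)
    # genotype AND fitness are overwritten by the refined solution
    return refined, fitness(refined)

def baldwinian(individual, fitness, improve):
    refined = improve(individual)
    # the individual keeps its genotype but is credited the refined fitness
    return individual, fitness(refined)

step = lambda x: [1] * len(x)         # toy "local search": jump to all-ones
geno, fit = baldwinian([0, 0, 1], sum, step)
print(geno, fit)                      # [0, 0, 1] 3
```

Under Baldwinian learning the search result only biases selection; under Lamarckian learning it is written back into the gene pool.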

Efficiency and Robustness of Memetic Algorithms
Fitness landscapes are usually multimodal and very complex, or the decision space is very large. We would like an algorithm which converges, every time it is run, to the optimal solution in a short time, avoiding both premature convergence and stagnation.

Adaptivity and Self-Adaptivity
To enhance the efficiency and robustness of an MA, an adaptive or self-adaptive scheme can be used.
Adaptive: the memes are controlled during the evolution by rules that depend on the state of the population.
Self-adaptive: the adaptation rules are encoded in the genotype of each individual.

Multi-Meme Systems
A memetic algorithm uses one LS (usually complex); a multi-meme algorithm (M-MA) employs a set (a list) of LSs (usually simple). If an M-MA is implemented, the problem of how and when to run the LSs arises, and some rules are therefore needed.

Adaptivity + Multi-Meme
To properly select from the list the LS to use at different stages of the evolution, an adaptive strategy can be used. If the “necessities” of the evolutionary process are efficiently encoded, it is possible to use different LSs at different moments and on different individuals (or sets of individuals).

The Use of Several Local Searchers
Local searchers with different features explore the search space from different perspectives. Different local searchers should “compete” and “cooperate” (Ong, 2004), working to solve the classical EA problem of balancing “exploration” and “exploitation”.

An Example: Adaptivity + Multi-Meme Based on Population Diversity
The state of convergence of the algorithm can be measured by means of a diversity coefficient:
– if convergence is approaching but still quite far, Nelder-Mead is applied, since it is greedy and explorative, in order to jump out of the nearest basin of attraction;
– if convergence is very near, Hooke-Jeeves is run, since it is a LS with a steepest ascent pivot rule and can then finalize the work in the (hopefully global) optimum found.
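The coefficient itself is not preserved in this transcript; the sketch below uses one plausible, assumed choice (the relative gap between average and best fitness, for positive fitness values) and an illustrative threshold, only to make the selection rule above concrete.

```python
def diversity(pop_fitness):
    """Assumed coefficient: |1 - f_avg / f_best|, tending to 0 as the
    population converges. Valid for positive fitness values only."""
    best = max(pop_fitness)
    avg = sum(pop_fitness) / len(pop_fitness)
    return abs(1.0 - avg / best) if best != 0 else 0.0

def select_meme(pop_fitness, threshold=0.1):
    """Far from convergence: explorative meme; near convergence: finisher."""
    psi = diversity(pop_fitness)
    return "nelder-mead" if psi > threshold else "hooke-jeeves"

print(select_meme([1.0, 5.0, 9.0]))    # nelder-mead  (spread-out population)
print(select_meme([4.99, 5.0, 5.0]))   # hooke-jeeves (nearly converged)
```

The same pattern extends to longer meme lists: each range of the coefficient activates the meme whose pivot rule fits that stage of the run.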

Thank You for Your Attention. Questions?