Computational Intelligence Winter Term 2015/16 Prof. Dr. Günter Rudolph Lehrstuhl für Algorithm Engineering (LS 11) Fakultät für Informatik TU Dortmund.


Lecture 09 — Plan for Today

● Evolutionary Algorithms (EA)
● Optimization Basics
● EA Basics

Optimization Basics

(diagram: a system maps input to output; modelling = infer the system from known input and output, simulation = compute the output from known system and input, optimization = find the input that yields the desired output)

Optimization Basics

given: objective function f: X → ℝ, feasible region X (= nonempty set)

objective: find solution with minimal or maximal value!

optimization problem: find x* ∈ X such that f(x*) = min{ f(x) : x ∈ X }

note: max{ f(x) : x ∈ X } = − min{ −f(x) : x ∈ X }

x* : global solution, f(x*) : global optimum

Optimization Basics

local solution x* ∈ X : ∀ x ∈ N(x*) : f(x*) ≤ f(x)

neighborhood N(x*) = bounded subset of X

example: X = ℝⁿ, N_ε(x*) = { x ∈ X : ||x − x*||₂ ≤ ε }

if x* is a local solution, then f(x*) is a local optimum / minimum

remark: evidently, every global solution / optimum is also a local solution / optimum; the reverse is false in general!

example: f: [a,b] → ℝ with global solution at x* (figure: graph over [a,b] with several local minima besides the global one at x*)

Optimization Basics

What makes optimization difficult?

some causes:
- local optima (is it a global optimum or not?)
- constraints (ill-shaped feasible region)
- non-smoothness (weak causality) → strong causality needed!
- discontinuities (⇒ nondifferentiability, no gradients)
- lack of knowledge about the problem (⇒ black / gray box optimization)

example: f(x) = a₁x₁ + ... + aₙxₙ → max! with xᵢ ∈ {0,1}, aᵢ ∈ ℝ ⇒ easy: xᵢ* = 1 iff aᵢ > 0

add constraint g(x) = b₁x₁ + ... + bₙxₙ ≤ b ⇒ NP-hard

add capacity constraint to TSP ⇒ CVRP ⇒ still harder

Optimization Basics

When to use which optimization method?

mathematical algorithms:
- problem explicitly specified
- problem-specific solver available
- problem well understood
- resources for designing algorithm affordable
- solution with proven quality required
⇒ don't apply EAs

randomized search heuristics:
- problem given by black / gray box
- no problem-specific solver available
- problem poorly understood
- insufficient resources for designing algorithm
- solution with satisfactory quality sufficient
⇒ EAs worth a try

Evolutionary Algorithm Basics

idea: use biological evolution as metaphor and as pool of inspiration
⇒ interpretation of biological evolution as iterative method of improvement

feasible solution x ∈ X = S₁ × ... × Sₙ = chromosome of individual

multiset of feasible solutions = population: multiset of individuals

objective function f: X → ℝ = fitness function

often: X = ℝⁿ, X = 𝔹ⁿ = {0,1}ⁿ, X = Sₙ = { π : π is a permutation of {1,2,...,n} }

also: combinations like X = ℝⁿ × 𝔹ᵖ × S_q or non-Cartesian sets

⇒ structure of feasible region / search space defines representation of individual

Evolutionary Algorithm Basics

algorithmic skeleton:
1. initialize population
2. evaluation
3. parent selection
4. variation (yields offspring)
5. evaluation (of offspring)
6. survival selection (yields new population)
7. stop? if no, continue at 3; if yes, output: best individual found

Evolutionary Algorithm Basics

Specific example: (1+1)-EA in 𝔹ⁿ for minimizing some f: 𝔹ⁿ → ℝ

population size = 1, number of offspring = 1, selects best from 1+1 individuals (parent + offspring)

1. initialize X(0) ∈ 𝔹ⁿ uniformly at random, set t = 0
2. evaluate f(X(t))
3. select parent: Y = X(t)   (no choice, here)
4. variation: flip each bit of Y independently with probability p_m = 1/n
5. evaluate f(Y)
6. selection: if f(Y) ≤ f(X(t)) then X(t+1) = Y, else X(t+1) = X(t)
7. if not stopping, then t = t+1, continue at (3)
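The seven steps of the (1+1)-EA on bit strings can be sketched directly in Python; the objective used below (minimize the number of ones) is a toy assumption, not from the lecture:

```python
import random

def one_plus_one_ea(f, n, steps=5000, seed=0):
    """(1+1)-EA on bit strings, minimization: flip each bit with prob. 1/n,
    accept the offspring iff it is not worse than the parent."""
    rng = random.Random(seed)
    x = [rng.randint(0, 1) for _ in range(n)]          # 1. uniform initialization
    fx = f(x)                                          # 2. evaluate parent
    for _ in range(steps):
        y = [b ^ (rng.random() < 1.0 / n) for b in x]  # 4. standard bit mutation
        fy = f(y)                                      # 5. evaluate offspring
        if fy <= fx:                                   # 6. plus-selection
            x, fx = y, fy
    return x, fx

# toy objective (an assumption): minimize the number of ones
best, value = one_plus_one_ea(sum, n=20)
```

With 5000 iterations the all-zero string is found reliably; the expected optimization time for this kind of objective is of order n·log n evaluations.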

Evolutionary Algorithm Basics

Specific example: (1+1)-EA in ℝⁿ for minimizing some f: ℝⁿ → ℝ

population size = 1, number of offspring = 1, selects best from 1+1 individuals (parent + offspring)

1. initialize X(0) ∈ C ⊂ ℝⁿ uniformly at random (C compact set = closed & bounded), set t = 0
2. evaluate f(X(t))
3. select parent: Y = X(t)   (no choice, here)
4. variation = add random vector: Y = Y + Z, e.g. Z ~ N(0, Iₙ)
5. evaluate f(Y)
6. selection: if f(Y) ≤ f(X(t)) then X(t+1) = Y, else X(t+1) = X(t)
7. if not stopping, then t = t+1, continue at (3)
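The real-valued variant differs only in the variation step. A minimal sketch, assuming a fixed mutation strength sigma and the sphere function as a toy objective (step-size adaptation is not covered here):

```python
import random

def one_plus_one_ea_real(f, x0, sigma=0.5, steps=20000, seed=1):
    """(1+1)-EA on R^n, minimization: Y = X + Z with Z ~ N(0, sigma^2 * I_n),
    accept the offspring iff it is not worse than the parent."""
    rng = random.Random(seed)
    x, fx = list(x0), f(x0)
    for _ in range(steps):
        y = [xi + rng.gauss(0.0, sigma) for xi in x]   # 4. Gaussian mutation
        fy = f(y)                                      # 5. evaluate offspring
        if fy <= fx:                                   # 6. plus-selection
            x, fx = y, fy
    return x, fx

# toy objective (an assumption): sphere function with minimum 0 at the origin
def sphere(v):
    return sum(vi * vi for vi in v)

best, value = one_plus_one_ea_real(sphere, x0=[5.0, -3.0, 2.0])
```

With a fixed sigma the progress slows down near the optimum, which is the usual motivation for adapting the mutation strength.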

Evolutionary Algorithm Basics

Selection

(a) select parents that generate offspring → selection for reproduction
(b) select individuals that proceed to next generation → selection for survival

necessary requirements:
- selection steps must not favor worse individuals
- one selection step may be neutral (e.g. select uniformly at random)
- at least one selection step must favor better individuals

typically: selection only based on fitness values f(x) of individuals
seldom: additionally based on individuals' chromosomes x (→ maintain diversity)

Evolutionary Algorithm Basics

Selection methods

population P = (x₁, x₂, ..., x_μ) with μ individuals

two approaches:
1. repeatedly select individuals from the population with replacement
2. rank individuals somehow and choose those with best ranks (no replacement)

uniform / neutral selection: choose index i with probability 1/μ

fitness-proportional selection: choose index i with probability sᵢ = f(xᵢ) / ( f(x₁) + ... + f(x_μ) )

problems:
- f(x) > 0 for all x ∈ X required ⇒ use g(x) = exp( f(x) ) > 0, but then already sensitive to additive shifts g(x) = f(x) + c
- almost deterministic if fitness differences are large, almost uniform if they are small
⇒ don't use!
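A roulette-wheel sketch makes the shift-sensitivity problem concrete; the fitness values below are illustrative assumptions:

```python
import random

def fitness_proportional(fits, rng):
    """roulette wheel: choose index i with probability f_i / (f_1 + ... + f_mu);
    all fitness values must be strictly positive (maximization convention)"""
    total = sum(fits)
    r = rng.random() * total
    acc = 0.0
    for i, fi in enumerate(fits):
        acc += fi
        if r < acc:
            return i
    return len(fits) - 1        # guard against floating-point round-off

rng = random.Random(42)
fits = [1.0, 2.0, 7.0]
share = sum(fitness_proportional(fits, rng) == 2 for _ in range(10000)) / 10000
# the best individual is chosen with probability 7/10; after an additive
# shift of +100 its probability collapses to 107/310, close to uniform:
shifted = [fi + 100.0 for fi in fits]
share_shifted = sum(fitness_proportional(shifted, rng) == 2 for _ in range(10000)) / 10000
```

The empirical shares approach 0.70 and 0.345, illustrating why an additive constant in the fitness function distorts the selection pressure.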

Evolutionary Algorithm Basics

Selection methods

population P = (x₁, x₂, ..., x_μ) with μ individuals

rank-proportional selection
- order individuals according to their fitness values, assign ranks
- fitness-proportional selection based on ranks
⇒ avoids all problems of fitness-proportional selection,
but: best individual has only a small selection advantage (can be lost!) → outdated!

k-ary tournament selection
- draw k individuals uniformly at random (typically with replacement) from P
- choose the individual with best fitness (break ties at random)
⇒ has all advantages of rank-based selection, and the probability that the best individual is not drawn into a single tournament is (1 − 1/μ)ᵏ
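A short sketch of k-ary tournament selection, checked against the probability above (the fitness values are illustrative; minimization convention assumed):

```python
import random

def tournament(fits, k, rng):
    """k-ary tournament, minimization: draw k indices with replacement,
    return the index with the best (smallest) fitness value"""
    contenders = [rng.randrange(len(fits)) for _ in range(k)]
    return min(contenders, key=lambda i: fits[i])

rng = random.Random(0)
fits = list(range(10))          # index 0 is the unique best individual
wins = sum(tournament(fits, k=3, rng=rng) == 0 for _ in range(10000)) / 10000
# P{ best is drawn into a single 3-ary tournament } = 1 - (1 - 1/10)^3 = 0.271
```

The empirical winning frequency of the best individual approaches 0.271, matching 1 − (1 − 1/μ)ᵏ for μ = 10, k = 3.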

Evolutionary Algorithm Basics

Selection methods without replacement

population P = (x₁, x₂, ..., x_μ) with μ parents and
population Q = (y₁, y₂, ..., y_λ) with λ offspring

(μ, λ)-selection or truncation selection on offspring or comma-selection
- rank the λ offspring according to their fitness
- select the μ offspring with best ranks
⇒ best individual may get lost, λ ≥ μ required

(μ + λ)-selection or truncation selection on parents + offspring or plus-selection
- merge the λ offspring and the μ parents
- rank them according to their fitness
- select the μ individuals with best ranks
⇒ best individual survives for sure
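Both truncation schemes fit in a few lines; in the demo below plain numbers stand in for individuals and the identity is used as fitness (an illustrative assumption, minimization convention):

```python
def comma_selection(offspring, mu, f):
    """(mu, lambda)-selection: keep the mu best of the lambda offspring only;
    requires lambda >= mu"""
    assert len(offspring) >= mu
    return sorted(offspring, key=f)[:mu]

def plus_selection(parents, offspring, mu, f):
    """(mu + lambda)-selection: keep the mu best of parents and offspring merged"""
    return sorted(parents + offspring, key=f)[:mu]

parents, offspring = [1, 5], [3, 4, 6]
# plus-selection keeps the best parent (1) for sure; comma-selection loses it:
assert plus_selection(parents, offspring, 2, lambda x: x) == [1, 3]
assert comma_selection(offspring, 2, lambda x: x) == [3, 4]
```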

Evolutionary Algorithm Basics

Selection methods: Elitism

Elitist selection: best parent is not replaced by a worse individual.

- Intrinsic elitism: method selects from parents and offspring, best survives with probability 1
- Forced elitism: if the best individual has not survived, re-inject it into the population, i.e., replace the worst selected individual by the previously best parent

method                  P{ select best }   intrinsic elitism
neutral                 < 1                no
fitness proportionate   < 1                no
rank proportionate      < 1                no
k-ary tournament        < 1                no
(μ + λ)                 = 1                yes
(μ, λ)                  = 1                no

Evolutionary Algorithm Basics

Variation operators: depend on representation

mutation → alters a single individual
recombination → creates single offspring from two or more parents

may be applied
● exclusively (either recombination or mutation), chosen in advance
● exclusively (either recombination or mutation), in probabilistic manner
● sequentially (typically recombination before mutation), for each offspring
● sequentially (typically recombination before mutation), with some probability

Evolutionary Algorithm Basics

Variation in 𝔹ⁿ — ● Mutation (individuals ∈ {0,1}ⁿ)

a) local → choose index k ∈ {1, ..., n} uniformly at random, flip bit k, i.e., x_k = 1 − x_k
b) global → for each index k ∈ {1, ..., n}: flip bit k with probability p_m ∈ (0,1)
c) "nonlocal" → choose K indices at random and flip the bits with these indices
d) inversion → choose start index k_s and end index k_e at random, invert the order of the bits between start and end index
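Three of these operators as a minimal Python sketch (bit strings as lists of 0/1):

```python
import random

def local_mutation(x, rng):
    """a) flip exactly one uniformly chosen bit"""
    y = x[:]
    k = rng.randrange(len(y))
    y[k] = 1 - y[k]
    return y

def global_mutation(x, pm, rng):
    """b) flip each bit independently with probability pm"""
    return [1 - b if rng.random() < pm else b for b in x]

def inversion(x, rng):
    """d) reverse the bit order between a random start and end index"""
    ks, ke = sorted(rng.sample(range(len(x)), 2))
    return x[:ks] + x[ks:ke + 1][::-1] + x[ke + 1:]

rng = random.Random(3)
x = [0, 1, 1, 0, 1, 0, 0, 1]
assert sum(a != b for a, b in zip(x, local_mutation(x, rng))) == 1
assert len(global_mutation(x, 0.5, rng)) == len(x)
assert sorted(inversion(x, rng)) == sorted(x)   # inversion only reorders bits
```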

Evolutionary Algorithm Basics

Variation in 𝔹ⁿ — ● Recombination (two parents), individuals ∈ {0,1}ⁿ

a) 1-point crossover → draw cut-point k ∈ {1, ..., n−1} uniformly at random; choose the first k bits from the 1st parent, the last n−k bits from the 2nd parent
b) K-point crossover → draw K distinct cut-points uniformly at random; choose bits 1 to k₁ from the 1st parent, bits k₁+1 to k₂ from the 2nd parent, bits k₂+1 to k₃ from the 1st parent, and so forth
c) uniform crossover → for each index i: choose bit i with equal probability from the 1st or 2nd parent
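The 1-point and uniform variants in a few lines of Python (all-0 and all-1 parents make the cut visible):

```python
import random

def one_point(p1, p2, rng):
    """a) cut at k in {1,...,n-1}; first k bits from parent 1, rest from parent 2"""
    k = rng.randrange(1, len(p1))
    return p1[:k] + p2[k:]

def uniform_crossover(p1, p2, rng):
    """c) each bit taken from either parent with equal probability"""
    return [a if rng.random() < 0.5 else b for a, b in zip(p1, p2)]

rng = random.Random(7)
p1, p2 = [0] * 8, [1] * 8
child = one_point(p1, p2, rng)
# child is a nonempty block of 0s followed by a nonempty block of 1s:
assert 1 <= child.count(0) <= 7 and child == sorted(child)
assert set(uniform_crossover(p1, p2, rng)) <= {0, 1}
```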

Evolutionary Algorithm Basics

Variation in 𝔹ⁿ — ● Recombination (multiparent: ρ = #parents), individuals ∈ {0,1}ⁿ

a) diagonal crossover (2 < ρ < n) → choose ρ − 1 distinct cut points, select chunks from the diagonals

   AAAAAAAAAA        ABBBCCDDDD
   BBBBBBBBBB   →    BCCCDDAAAA
   CCCCCCCCCC        CDDDAABBBB
   DDDDDDDDDD        DAAABBCCCC

   can generate ρ offspring; otherwise choose the initial chunk at random for a single offspring

b) gene pool crossover (ρ > 2) → for each gene: choose the donating parent uniformly at random

Evolutionary Algorithm Basics

Variation in Sₙ — ● Mutation (individuals are permutations of {1, ..., n})

a) local → 2-swap / 1-translocation
b) global → draw a number K of 2-swaps, apply 2-swaps K times

K is a positive random variable; its distribution may be uniform, binomial, geometric, ...; its expectation E[K] and variance V[K] may control the mutation strength
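Both permutation mutations can be sketched briefly; the uniform distribution chosen for K below is one of the options named above, picked here as an assumption:

```python
import random

def two_swap(perm, rng):
    """a) local: exchange the entries at two distinct random positions"""
    y = perm[:]
    i, j = rng.sample(range(len(y)), 2)
    y[i], y[j] = y[j], y[i]
    return y

def global_perm_mutation(perm, rng, kmax=4):
    """b) global: apply a random positive number K of 2-swaps; the distribution
    of K (uniform on {1,...,kmax} here, an assumption) sets the mutation strength"""
    y = perm[:]
    for _ in range(rng.randint(1, kmax)):
        y = two_swap(y, rng)
    return y

rng = random.Random(5)
p = list(range(8))
assert sorted(two_swap(p, rng)) == p            # result is again a permutation
assert sorted(global_perm_mutation(p, rng)) == p
```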

Evolutionary Algorithm Basics

Variation in Sₙ — ● Recombination (two parents), individuals are permutations of {1, ..., n}

a) order-based crossover (OBX)
- select two indices k₁ and k₂ with k₁ ≤ k₂ uniformly at random
- copy genes k₁ to k₂ from the 1st parent to the offspring (keep positions)
- copy genes from left to right from the 2nd parent, starting after position k₂

b) partially mapped crossover (PMX)
- select two indices k₁ and k₂ with k₁ ≤ k₂ uniformly at random
- copy genes k₁ to k₂ from the 1st parent to the offspring (keep positions)
- copy all genes not already contained in the offspring from the 2nd parent (keep positions)
- from left to right: fill in the remaining genes from the 2nd parent
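A sketch of the order-based variant a); the cyclic fill order starting after k₂ follows the description above, and the check confirms that the offspring is always a valid permutation:

```python
import random

def order_based_crossover(p1, p2, rng):
    """copy genes k1..k2 from the 1st parent (keeping positions), then fill the
    remaining positions, cyclically starting after position k2, with the 2nd
    parent's unused genes in their order of appearance"""
    n = len(p1)
    k1, k2 = sorted(rng.sample(range(n), 2))
    child = [None] * n
    child[k1:k2 + 1] = p1[k1:k2 + 1]
    used = set(p1[k1:k2 + 1])
    fill = [g for g in p2[k2 + 1:] + p2[:k2 + 1] if g not in used]
    slots = list(range(k2 + 1, n)) + list(range(k1))
    for pos, gene in zip(slots, fill):
        child[pos] = gene
    return child

rng = random.Random(11)
p1, p2 = list(range(8)), [7, 6, 5, 4, 3, 2, 1, 0]
child = order_based_crossover(p1, p2, rng)
assert sorted(child) == list(range(8))   # offspring is a valid permutation
```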

Evolutionary Algorithm Basics

Variation in ℝⁿ — ● Mutation (individuals X ∈ ℝⁿ)

additive: Y = X + Z (Z: n-dimensional random vector), i.e., offspring = parent + mutation

a) local → Z with bounded support
b) nonlocal → Z with unbounded support (most frequently used!)

Definition: Let f_Z : ℝⁿ → ℝ₊ be the p.d.f. of the random vector Z. The set { x ∈ ℝⁿ : f_Z(x) > 0 } is termed the support of Z.

(figures: a density with bounded support vs. a density with unbounded support, both centered at 0)

Evolutionary Algorithm Basics

Variation in ℝⁿ — ● Recombination (two parents), individuals X ∈ ℝⁿ

a) all crossover variants adapted from 𝔹ⁿ
b) intermediate: z = ξ x + (1 − ξ) y with ξ ∈ [0,1]
c) intermediate (per dimension): zᵢ = ξᵢ xᵢ + (1 − ξᵢ) yᵢ with ξᵢ ∈ [0,1]
d) discrete: for each dimension choose zᵢ from { xᵢ, yᵢ } with equal probability
e) simulated binary crossover (SBX) → for each dimension, with probability p_c draw a spread factor β from a distribution concentrated around 1 and set zᵢ = ½ ( (1 + β) xᵢ + (1 − β) yᵢ )
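The variants b)–d) are one-liners; a minimal sketch with an illustrative parent pair:

```python
import random

def intermediate(x, y, xi=0.5):
    """b) z = xi*x + (1 - xi)*y: a point on the segment between the parents"""
    return [xi * a + (1 - xi) * b for a, b in zip(x, y)]

def intermediate_per_dim(x, y, rng):
    """c) an independent mixing weight xi_i ~ U[0,1] in every dimension"""
    z = []
    for a, b in zip(x, y):
        t = rng.random()
        z.append(t * a + (1 - t) * b)
    return z

def discrete(x, y, rng):
    """d) each coordinate copied from either parent with equal probability"""
    return [a if rng.random() < 0.5 else b for a, b in zip(x, y)]

rng = random.Random(2)
x, y = [0.0, 0.0], [2.0, 4.0]
assert intermediate(x, y) == [1.0, 2.0]
z = intermediate_per_dim(x, y, rng)
assert 0.0 <= z[0] <= 2.0 and 0.0 <= z[1] <= 4.0   # stays in the parents' box
assert all(c in (a, b) for c, a, b in zip(discrete(x, y, rng), x, y))
```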

Evolutionary Algorithm Basics

Variation in ℝⁿ — ● Recombination (multiparent), ρ ≥ 3 parents x₁, ..., x_ρ ∈ ℝⁿ

a) intermediate: z = ξ₁ x₁ + ξ₂ x₂ + ... + ξ_ρ x_ρ where ξ_k ≥ 0 and ξ₁ + ... + ξ_ρ = 1 (all points in the convex hull of the parents)

b) intermediate (per dimension): zᵢ = ξ₁ᵢ x₁ᵢ + ... + ξ_ρᵢ x_ρᵢ with ξ_kᵢ ≥ 0 and ξ₁ᵢ + ... + ξ_ρᵢ = 1 for every dimension i

Evolutionary Algorithm Basics

Theorem
Let f: ℝⁿ → ℝ be a strictly quasiconvex function. If f(x) = f(y) for some x ≠ y, then every offspring generated by intermediate recombination is better than its parents.

Proof:
An offspring generated by intermediate recombination has the form z = ξ x + (1 − ξ) y with ξ ∈ (0,1). By strict quasiconvexity, f(z) < max{ f(x), f(y) } = f(x) = f(y). ■

Evolutionary Algorithm Basics

Theorem
Let f: ℝⁿ → ℝ be a differentiable function and f(x) < f(y) for some x ≠ y. If (y − x)′ ∇f(x) < 0, then there is a positive probability that an offspring generated by intermediate recombination is better than both parents.

Proof:
Consider z(t) = x + t (y − x) for t ∈ (0,1). By differentiability, f(z(t)) = f(x) + t (y − x)′ ∇f(x) + o(t). Since (y − x)′ ∇f(x) < 0, there is a t* > 0 such that f(z(t)) < f(x) < f(y) for all t ∈ (0, t*). If the mixing weight is drawn from a distribution whose support contains (0, t*) (e.g. uniformly on [0,1]), such an offspring occurs with positive probability. ■