Evolutionary Computation Melanie Mitchell Portland State University and Santa Fe Institute Complex Systems Summer School Thursday June 19, 2008 Copyright © 2008 by Melanie Mitchell
Outline of Lectures Brief history of evolutionary computation A simple example: Evolving Robby the Robot EC in the real world: Case studies Co-evolutionary learning
Adaptive Computation Using ideas from natural adaptive systems to build adaptive or “life-like” computing systems. –Neural networks –Genetic algorithms –Cellular automata –Ant-colony optimization –Swarm computing –Artificial immune systems –Market-inspired artificial intelligence
Evolution made simple
Essentials of Darwinian evolution:
–Organisms reproduce in proportion to their fitness in the environment
–Offspring inherit traits from parents
–Traits are inherited, with some variation, via mutation and sexual recombination
[Image: Charles Darwin, 1809–1882]
Essentials of evolutionary algorithms:
–Computer “organisms” (e.g., programs) reproduce in proportion to their fitness in the environment (e.g., how well they perform a desired task)
–Offspring inherit traits from their parents
–Traits are inherited, with some variation, via mutation and “sexual recombination”
Appeal of ideas from evolution: Successful method of searching large spaces for good solutions (chromosomes / organisms) Massive parallelism Adaptation to changing environments Emergent complexity from simple rules
Evolutionary Computation: A Brief History
1940s–1950s:
–“Automata theory” considered to be both an engineering discipline and an approach to theoretical biology (e.g., von Neumann, Wiener, Turing)
1960s–1970s:
–Evolutionary programming (Fogel, Owens, and Walsh)
–Evolution strategies (Rechenberg and Schwefel)
–Genetic algorithms (Holland)
Evolutionary Computation: A Brief History 1980s –1990s: –Genetic Programming (Koza) –“Evolutionary Computation”
EC in the real world: A few examples Automating parts of aircraft design (Boeing) Analyzing satellite images (Los Alamos) Automating assembly line scheduling (John Deere) Improving telecommunications networks (British Telecom) Generating realistic computer animation (The Lord of the Rings, Troy)
Drug discovery (Daylight Chemical Information Systems) Detecting fraudulent stock trades (London Stock Exchange) Analysis of credit card data (Capital One) Forecasting financial markets and portfolio optimization (First Quadrant). Interactive evolution for graphics design (Karl Sims)
A Simple Example: Using Genetic Algorithms to Evolve Robby the Robot
What Robby can see: North: empty South: empty East: empty West: can Center: empty
North: wall South: empty East: wall West: empty Center: empty
What Robby can do: Actions 0: Move north 1: Move south 2: Move east 3: Move west 4: Stay put 5: Pick up can 6: Random move
Example Strategy (243 possible “situations” or states)
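Robby's five sensory sites (north, south, east, west, center) each take one of three values (empty, can, wall), giving 3^5 = 243 situations; a strategy is then a 243-digit string of actions indexed by situation number. A minimal sketch of the lookup, with the site ordering and numeric state codes as illustrative assumptions:

```python
# Robby's strategy lookup: 3^5 = 243 situations, one action digit each.
# The site order and state codes below are illustrative assumptions.
STATES = {"empty": 0, "can": 1, "wall": 2}
ACTIONS = ["move-north", "move-south", "move-east", "move-west",
           "stay-put", "pick-up", "random-move"]

def situation_index(north, south, east, west, center):
    """Treat the five site states as the digits of a base-3 number."""
    idx = 0
    for state in (north, south, east, west, center):
        idx = idx * 3 + STATES[state]
    return idx

def choose_action(strategy, *sites):
    """`strategy` is a 243-character string of action digits 0-6."""
    return ACTIONS[int(strategy[situation_index(*sites)])]
```

For example, the all-'5' strategy always tries to pick up a can, whatever Robby sees.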
Genetic Algorithm Components of a GA: Population of candidate solutions to a given problem (“chromosomes”) Fitness function that assigns fitness to each chromosome in the population Selection operator that selects individuals to reproduce Genetic operators that take existing chromosomes and produce offspring with variation (e.g., mutation, crossover)
Evolving Robby: Details
1. Generate initial population of 200 random strategies.
2. Calculate the fitness of each strategy in the population.
3. Apply evolution to the current population of strategies to create a new population of 200 strategies.
4. Return to step 2 with this new generation.
Repeat steps 2–4 for 1000 generations.
Example random initial population Individual 1: Individual 2: Individual 3: Individual 200:
Calculate the fitness of each strategy
Points:
Successfully picks up can: +10
Tries to pick up can in empty site: -1
Crashes into wall: -5 (bounces back to original site)
Calculate the fitness of each strategy
Run for 200 actions per session, 100 different random sessions.
Fitness = average number of points per session.
Apply evolution to the current population of strategies to create a new population of 200 strategies. Individual 1: Fitness: Individual 2: Fitness: Individual 3: Fitness: Individual 200: Fitness:
Select pair of parents, crossover, and mutate
Parent 1:  Parent 2:
Cross over parents: Child 1:  Child 2:
Mutate: At each locus, with low probability, replace digit by random digit in [0,6]
Add children to new generation
Continue this process until new generation is full (200 strings)
Replace current population with new generation.
Repeat algorithm for a total of 1000 generations
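The whole loop above (selection, single-point crossover, per-locus mutation, repeated over generations) can be sketched generically. The fitness function is left abstract since scoring requires the Robby simulator, and the rank-based parent choice below stands in for whatever selection scheme is actually used (an assumption):

```python
import random

def evolve(fitness, pop_size=200, genome_len=243, generations=1000,
           mutation_rate=0.005):
    """Minimal GA sketch: selection, single-point crossover, mutation.
    `fitness` maps a genome string to a number (e.g., mean Robby score)."""
    pop = ["".join(random.choice("0123456") for _ in range(genome_len))
           for _ in range(pop_size)]
    for _ in range(generations):
        scored = sorted(pop, key=fitness, reverse=True)
        new_pop = []
        while len(new_pop) < pop_size:
            # Rank-biased parent choice stands in for fitness-proportional
            # selection (the lecture does not fix the scheme).
            p1, p2 = random.choices(scored[:pop_size // 2], k=2)
            cut = random.randrange(genome_len)       # single-point crossover
            for child in (p1[:cut] + p2[cut:], p2[:cut] + p1[cut:]):
                child = "".join(random.choice("0123456")
                                if random.random() < mutation_rate else c
                                for c in child)
                new_pop.append(child)
        pop = new_pop[:pop_size]
    return max(pop, key=fitness)
```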
Strategy M (Designed by me): “If there is a can in the current site, pick it up. Otherwise, if there is a can in one of the adjacent sites, move to that site. Otherwise, choose a random direction to move in.”
Fitness: 346 (out of maximum 500)
Strategy G (evolved by GA): [243-digit string not shown]
Fitness: 483 (out of maximum 500)
Strategy M in a cluster of cans
Strategy G in a cluster of cans
Dynamics from one run
[Plot: best fitness vs. generation]
Evolutionary Computation in the Real World: Case Studies
Genetic Programming (Koza, 1990)
Any computer program can be expressed as a “parse tree”: (* PI (* R R))
Koza’s genetic programming algorithm:
1. Choose a set of functions and terminals for the program, e.g., {+, -, *, /, sqrt, sin, cos, abs, pow, R, PI, D, C, rand()}
2. Generate an initial population of random programs (trees), each up to some maximum depth.
3. Run the GA: –Fitness: Run each program on “training data”. Fitness is how many training cases the program gets right (or how close it gets to correct answer). –Selection: Select parents probabilistically, based on fitness. –Crossover: Exchange subtrees of parents. –Mutation: Replace subtree with a random tree.
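Random tree generation and subtree crossover can be sketched with parse trees represented as nested lists. The function and terminal sets below are a small illustrative subset of those on the previous slide:

```python
import random

FUNCTIONS = {"+": 2, "-": 2, "*": 2}      # function set with arities (subset)
TERMINALS = ["R", "PI", "D"]              # terminal set (subset)

def random_tree(depth=3):
    """Grow a random program tree (nested lists) up to the given depth."""
    if depth == 0 or random.random() < 0.3:
        return random.choice(TERMINALS)
    f = random.choice(list(FUNCTIONS))
    return [f] + [random_tree(depth - 1) for _ in range(FUNCTIONS[f])]

def subtrees(tree, path=()):
    """Yield (path, subtree) pairs; a path indexes into the nested lists."""
    yield path, tree
    if isinstance(tree, list):
        for i, child in enumerate(tree[1:], start=1):
            yield from subtrees(child, path + (i,))

def replace_at(tree, path, new):
    """Return a copy of `tree` with the subtree at `path` replaced."""
    if not path:
        return new
    tree = list(tree)
    tree[path[0]] = replace_at(tree[path[0]], path[1:], new)
    return tree

def crossover(t1, t2):
    """GP crossover: graft a random subtree of t2 into a random spot of t1."""
    p1, _ = random.choice(list(subtrees(t1)))
    _, s2 = random.choice(list(subtrees(t2)))
    return replace_at(t1, p1, s2)
```

In this representation the slide's example (* PI (* R R)) is `["*", "PI", ["*", "R", "R"]]`.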
Genetic Art and Computer Graphics (Karl Sims, 1993)
GA individuals: equations that generate a color for each pixel coordinate
Function set: +, -, /, mod, round, min, max, abs, expt, log, and, or, xor, sin, cos, atan, if, dissolve, hsv-to-rgb, vector, transform-vector, bw-noise, color-noise, warped-bw-noise, warped-color-noise, blur, band-pass, grad-mag, grad-dir, bump, ifs, warped-ifs, warp-abs, warp-rel, warp-by-grad
Terminal set: X, Y, rand_scalar(), rand_vector(n)
Each function returns an image (an array of pixel colors)
Left to right, top to bottom:
a. Y
b. X
c. (abs X)
d. (mod X (abs Y))
e. (and X Y)
f. (bw-noise .2 2)
g. (color-noise .1 2)
h. (grad-direction (bw-noise .15 2) 0 0)
i. (warped-color-noise (* X .2) Y .1 2)
Demo
Some Results
(round (log (+ y (color-grad (round (+ (abs (round (log (+ y (color-grad (round (+ y (log (invert y) 15.5)) x) #( ) 1.35)) 0.19) x)) (log (invert y) 15.5)) x) #( ) 1.35)) 0.19) x)
Evolution of Collective Computation in Cellular Automata Jim Crutchfield Rajarshi Das Wim Hordijk Melanie Mitchell Erik van Nimwegen Cosma Shalizi
One-dimensional cellular automata
Rule:
[Space-time diagram of ECA 110: space horizontal, time downward]
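One update step of an elementary CA like ECA 110 is just a lookup of each three-cell neighborhood in the rule's bit pattern; iterating the step and stacking the rows produces a space-time diagram like the one above. A minimal sketch:

```python
def eca_step(lattice, rule_number):
    """One synchronous update of an elementary CA (radius 1, wrap-around).
    `rule_number` follows the standard Wolfram numbering, e.g. 110."""
    n = len(lattice)
    new = []
    for i in range(n):
        # Read the neighborhood (left, self, right) as a 3-bit number 0..7.
        idx = 4 * lattice[(i - 1) % n] + 2 * lattice[i] + lattice[(i + 1) % n]
        new.append((rule_number >> idx) & 1)
    return new
```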
Cellular automata as computers: Current renewal of interest due to:
–Distributed spatial computing in sensor networks
–Reconfigurable computing (FPGAs)
–Molecular and quantum-scale computation (e.g., quantum dot cellular automata; molecular self-assembly of nanoscale electronics)
–Renewed interest in how biological systems compute (and how that can inspire new computer architectures)
–A “new kind of science”?
A computational task for cellular automata
Design a cellular automaton to decide whether or not the initial pattern has a majority of “on” cells.
–If a majority of cells are initially on, then after some number of iterations, all cells should turn on.
–Otherwise, after some number of iterations, all cells should turn off.
[Space-time diagrams: majority on initially, final state all on; majority off initially, final state all off]
How to design a CA to do this?
We used cellular automata with 6 neighbors for each cell:
Rule: [rule table with an output bit for each of the 2^7 = 128 neighborhood configurations]
A candidate solution that does not work: local majority voting
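The local-majority rule fails because blocks of 0s and 1s become stable, so the lattice never reaches global consensus. A small demonstration (the two-block initial configuration is a constructed example, not from the lecture):

```python
def majority_vote_step(lattice, radius=3):
    """Each cell adopts the majority value among itself and its
    `radius` neighbors on each side (wrap-around lattice)."""
    n = len(lattice)
    out = []
    for i in range(n):
        votes = sum(lattice[(i + d) % n] for d in range(-radius, radius + 1))
        out.append(1 if votes > radius else 0)  # majority of 2*radius+1 cells
    return out

# Two large blocks: 74 ones then 75 zeros. The correct answer is "all off"
# (zeros are the majority), but both block boundaries are stable under
# local majority voting, so this configuration is a fixed point and
# global consensus is never reached.
ic = [1] * 74 + [0] * 75
state = ic
for _ in range(100):
    state = majority_vote_step(state)
```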
Evolving cellular automata with genetic algorithms
Create a random population of candidate cellular automata rules.
The “fitness” of each cellular automaton is how well it performs the task. (Analogous to surviving in an environment.)
The fittest cellular automata get to reproduce themselves, with mutations and crossovers.
This process continues for many generations.
The “chromosome” of a cellular automaton is an encoding of its rule table: Rule table: “Chromosome”:
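With 6 neighbors plus the cell itself, each neighborhood has 7 cells, so the rule table has 2^7 = 128 entries and the chromosome is a 128-bit string. A sketch of decoding such a chromosome into an update function (the bit-ordering convention is an assumption):

```python
def make_ca_rule(chromosome):
    """Interpret a 128-bit string as the rule table of a 7-cell-neighborhood
    CA: entry i gives the new state for the neighborhood whose cells, read
    left to right, spell i in binary (an assumed encoding convention)."""
    assert len(chromosome) == 2 ** 7
    table = [int(b) for b in chromosome]

    def step(lattice):
        n = len(lattice)
        out = []
        for i in range(n):
            idx = 0
            for d in range(-3, 4):            # 3 neighbors each side + self
                idx = (idx << 1) | lattice[(i + d) % n]
            out.append(table[idx])
        return out

    return step
```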
Create a random population of candidate cellular automata rules:
rule 1: rule 2: rule 3: rule 100:
Calculating the Fitness of a Rule For each rule, create the corresponding cellular automaton. Run that cellular automaton on many initial configurations. Fitness of rule = fraction of correct classifications
rule 1 rule table: ...
Run corresponding cellular automaton on many random initial lattice configurations (each classified correctly or incorrectly, etc.)
Fitness of rule = fraction of correct classifications
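The fitness computation can be sketched as follows. Lattice size 149 and the uniform IC sampler are assumptions consistent with the setup described, and `step` is a CA update function such as one decoded from a chromosome:

```python
import random

def rule_fitness(step, n_trials=100, lattice_size=149, max_steps=300):
    """Fraction of random initial configurations the CA classifies
    correctly: final state all-1s iff 1s were the initial majority.
    (Sampler and cutoff parameters are illustrative assumptions.)"""
    correct = 0
    for _ in range(n_trials):
        lattice = [random.randint(0, 1) for _ in range(lattice_size)]
        target = 1 if sum(lattice) * 2 > lattice_size else 0
        state = lattice
        for _ in range(max_steps):
            nxt = step(state)
            if nxt == state:                  # reached a fixed point
                break
            state = nxt
        if state == [target] * lattice_size:
            correct += 1
    return correct / n_trials
```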
GA Population:
rule 1: Fitness = 0.5
rule 2: Fitness = 0.2
rule 3: Fitness = 0.4
...
rule 100: Fitness = 0.0
A cellular automaton evolved by the genetic algorithm (performance 80%)
[Space-time diagrams: majority-on and majority-off initial configurations]
Another Task: Synchronization
Evolving image-processing algorithms GENIE project (GENetic Image Exploitation), Los Alamos National Laboratory Brumby, Harvey, Perkins, Porter, Bloch, Szymanski, et al.
Consider a large library of multi-spectral photographic images from satellites and high-flying airplanes.
How to automatically find images containing desired features? E.g., water, clouds, craters on Mars, forest fire damage, vegetation, urban areas, roads, airports, bridges, beaches,...
Special-purpose algorithms can (possibly) be developed for any particular feature, but goal of GENIE project is to have a general-purpose system: A user can specify any feature (via training examples), and system will quickly find images containing that feature.
Training Examples for GENIE Input images consist of ~ spectral planes. User brings up training image(s) in GUI User paints examples of “true” and “false” pixels.
From Brumby et al., 2000
Human-created training images
Green = “feature”  Red = “non-feature”  Black = “not classified”
Roads  Vegetation  Water
Data and Feature Planes Data planes (spectral channels) “Feature” planes Truth plane
What GENIE Evolves GENIE evolves an image processing algorithm that operates on data planes to produce a new set of feature planes. New feature planes are used as training data for conventional classifiers (e.g., linear discriminants, neural networks, support vector machines)
GENIE’s Genes
Genes are elementary image-processing operations that read from data planes (spectral channels) or feature planes and write to feature planes, e.g.,
addp(d1, d2, s2)
linear_comb(s2, s1, s1, 0.9)
erode(s1, s1, 3, 1)
Elementary operators include spectral, morphological, logical, and thresholding operators.
GENIE’s Chromosomes
Chromosomes are algorithms made up of genes:
lawd (d3, s2)                  ; Law’s textural operator
variance (s2, s2, 3, 1)        ; local variance in circular neighb.
min (d1, d2, s1)               ; min at each pixel location
linear_comb (s1, d0, s1, 0.9)
smooth_r5r5 (s2, s2)           ; textural operator r5r5
open (s2, s2, 3, 1)            ; erode, then dilate
subp (d1, s1, s1)              ; subtract planes
min (d0, s2, s2)
adds (d1, s0, 0.25)            ; add scalar
opcl (s1, s1, 1, 1)            ; erode, dilate, dilate, erode
h_basin (s0, s0, 18)           ; regional minima
linear_comb (s0, d9, s0, 0.2)
linear_comb (d4, s2, s2, 0.9)
addp (d6, s0, s0)              ; add planes
GENIE’s Population
Initial population is a set of randomly generated chromosomes:
lawd (d3, s2) variance (s2, s2, 3, 1) min (d1, d2, s1) linear_comb (s1, d0, s1, 0.9) smooth_r5r5 (s2, s2) open (s2, s2, 3, 1) subp (d1, s1, s1) min (d0, s2, s2) adds (d1, s0, 0.25) opcl (s1, s1, 1, 1)
erode (d3, s2, 3, 1) linear_comb (d2, d1, s2, 0.5) min (s1, s2, s2) open (s2, s2, 3, 1)
open (s1, s2, 1, 1) variance (s2, s2, 3, 1) addp (s1, s0, s2) smooth_r5r5 (d1, s3) opcl (s1, s1, 1, 3) lawd (d2, s0) subp (s1, s1, s1)
GENIE’s Fitness
–Run chromosome on image data to produce feature planes.
–Find linear combination of planes that maximizes spectral separation between labeled “true” and “false” pixels.
–Threshold resulting plane to get binary image (each pixel is 1 or 0): 1 = feature, 0 = non-feature.
–Use the threshold that maximizes fitness.
GENIE’s Fitness
F = 500 (R_d + (1 - R_f))
where R_d is the detection rate: fraction of “feature” pixels (green) classified correctly, and R_f is the false alarm rate: fraction of non-feature pixels (red) classified incorrectly.
F = 1000 is perfect fitness.
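This fitness, F = 500 (R_d + (1 - R_f)) (reconstructed from the stated R_d, R_f, and perfect score of 1000), can be computed over the painted pixels only; the dictionary-based pixel labeling below is an illustrative assumption:

```python
def genie_fitness(predicted, truth):
    """F = 500 * (R_d + (1 - R_f)) over the labeled pixels only.
    `predicted` maps pixel -> 0/1; `truth` maps painted pixels to
    "true"/"false" (unlabeled pixels are simply absent)."""
    feature = [p for p, label in truth.items() if label == "true"]
    non_feature = [p for p, label in truth.items() if label == "false"]
    r_d = sum(predicted[p] for p in feature) / len(feature)       # detections
    r_f = sum(predicted[p] for p in non_feature) / len(non_feature)  # false alarms
    return 500 * (r_d + (1 - r_f))
```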
The GA
Selection: Select some number of best in population
Crossover: Recombine different genes from two different parents
Mutation: Randomly change a chosen gene
Parent 1: lawd (d3, s2) variance (s2, s2, 3, 1) min (d1, d2, s1) linear_comb (s1, d0, s1, 0.9) smooth_r5r5 (s2, s2) open (s2, s2, 3, 1) subp (d1, s1, s1) min (d0, s2, s2) adds (d1, s0, 0.25) opcl (s1, s1, 1, 1)
Parent 2: open (s1, s2, 1, 1) variance (s2, s2, 3, 1) addp (s1, s0, s2) smooth_r5r5 (d1, s3) opcl (s1, s1, 1, 3) lawd (d2, s0) subp (s1, s1, s1)
Child after crossover (head of Parent 1 + tail of Parent 2): lawd (d3, s2) variance (s2, s2, 3, 1) min (d1, d2, s1) linear_comb (s1, d0, s1, 0.9) smooth_r5r5 (s2, s2) open (s2, s2, 3, 1) smooth_r5r5 (d1, s3) opcl (s1, s1, 1, 3) lawd (d2, s0) subp (s1, s1, s1)
Child after mutation (open (s2, s2, 3, 1) replaced by lawd (s1, s3)): lawd (d3, s2) variance (s2, s2, 3, 1) min (d1, d2, s1) linear_comb (s1, d0, s1, 0.9) smooth_r5r5 (s2, s2) lawd (s1, s3) smooth_r5r5 (d1, s3) opcl (s1, s1, 1, 3) lawd (d2, s0) subp (s1, s1, s1)
The GA is run until acceptable performance on the training examples is found. Generalization performance is evaluated, and user then creates new training example, based on generalization performance.
Some Results
From Brumby et al., 2000
Roads and buildings  Vegetation  Water
[Columns: Training Data | Evolved classifier]
From Brumby et al., 2000
Recent work by Ghosh & Mitchell Combine GENIE’s texture-learning ability with genetic shape-learning algorithm. Application: Find prostate in CAT scans of pelvic region
Training image, contour drawn by radiologist From Ghosh & Mitchell, 2006
Test image, with hand-segmentation by a radiologist
Results from our hybrid algorithm
Results from GENIE alone
Examples of Advanced EC topics Multi-objective optimization Self-adapting evolutionary algorithms Stochastic process theory of EC More realistic analogies with evolution: –Diploid chromosomes –Development of phenotype –Ecological interactions (e.g., co-evolution, spatial proximity)
Co-evolutionary Learning Melanie Mitchell (Portland State U.) Ludo Pagie (U. Utrecht) Mick Thomure (Portland State U.) Jennifer Meneghin (Portland State U.) Martin Cenek (Portland State U.)
Problem for GAs (and other machine learning methods): How to select training examples appropriate to different stages of learning? One solution: Co-evolve training examples, using inspiration from host-parasite co-evolution in nature.
Host-parasite co-evolution in nature Hosts evolve defenses against parasites Parasites find ways to overcome defenses Hosts evolve new defenses Continual “biological arms race”
Heliconius-egg mimicry in Passiflora
Darwin recognized the importance of co-evolution in driving evolution
Co-evolution was later hypothesized to be a major factor in the evolution of sexual reproduction
Co-evolutionary Learning
Candidate solutions and training examples co-evolve.
–Fitness of candidate solution (host): how well it performs on training examples.
–Fitness of training example (parasite): how well it defeats candidate solutions.
Co-evolutionary learning originally proposed by Hillis (1990).
Why should we expect co-evolution to work?
Hypotheses:
1. Allows arms races to emerge, with the evolution of training examples targeted to weaknesses in learners, and subsequent adaptation of learners to those training examples, and so on.
2. Helps maintain diversity in the population.
Effect: Increases success of learning while reducing the amount of training data needed.
These hypotheses are plausible but have been largely untested in work on co-evolution.
Results: Percentage of successful runs (mutation only)

                        Cellular Automata   Function Induction
Spatial Coev.           64% (32/50)         78% (39/50)
Non-Spatial Coev.        0% (0/50)
Spatial Evol.            0% (0/50)          14% (7/50)
Non-Spatial Evol.        0% (0/50)           6% (3/50)
Non-Spatial Boosting     0% (0/10)          —
Spatial Co-evolution
2D toroidal lattice with one host (h) and one parasite (p) per site.
Each h is replaced by mutated copy of winner of tournament among itself and 8 neighboring hosts.
Each p is replaced by mutated copy of winner of tournament among itself and 8 neighboring parasites.
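One generation of the spatial scheme for the host lattice can be sketched as below. For simplicity the tournament is replaced by deterministic best-of-neighborhood, and host fitness is passed in as a plain function (in the real system it depends on the parasite at the same site); both simplifications are assumptions:

```python
def spatial_update(hosts, fitness, mutate):
    """One generation on a 2D toroidal grid: each site is replaced by a
    mutated copy of the fittest among itself and its 8 lattice neighbors.
    `hosts` is a list of rows; `fitness` and `mutate` are callables."""
    rows, cols = len(hosts), len(hosts[0])
    new = [[None] * cols for _ in range(rows)]
    for r in range(rows):
        for c in range(cols):
            # 3x3 Moore neighborhood with wrap-around (toroidal lattice).
            neighborhood = [hosts[(r + dr) % rows][(c + dc) % cols]
                            for dr in (-1, 0, 1) for dc in (-1, 0, 1)]
            winner = max(neighborhood, key=fitness)
            new[r][c] = mutate(winner)
    return new
```

The parasite lattice is updated the same way with its own fitness measure.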
Non-Spatial Co-evolution
No spatial distribution of host and parasite populations.
Each h is replaced by mutated copy of winner of tournament among itself and 8 randomly chosen hosts.
Each p is replaced by mutated copy of winner of tournament among itself and 8 randomly chosen parasites.
Spatial Evolution:
–Same as spatial co-evolution, except parasites don’t evolve.
–A new population of random parasites is generated at each generation.
Non-Spatial Evolution:
–Same as non-spatial co-evolution, except parasites don’t evolve.
–A new sample of 100 random parasites is generated at each generation.
–Fitness of a host is classification accuracy on these 100 randomly generated parasites.
In short: Spatial co-evolution significantly outperforms other methods on both problems
Analysis
Why was spatial co-evolution successful? Hypotheses:
1. Maintains diversity over a long period of time
2. Creates extended “arms race” between host and parasite populations
Here we examine these hypotheses for the CA task.
The GA evolves four types of strategies: –Random –Default –Block Expanding –Particle
“Default” strategy (performance ≈ 0.5)
“Block-expanding” strategy (0.55 ≤ performance < 0.72)
“Particle” strategy (performance ≥ 0.72)
Measuring diversity in host population
Plot distribution of host strategies in typical CA runs:
Random: very low performance
Default: low performance
Block expanding: medium performance
Particle: high performance
Spatial Co-evolution (typical run)
Non-Spatial Co-evolution (typical run)
Spatial Evolution (typical run)
Non-Spatial Evolution (typical run)
Summary of diversity results
Spatial co-evolution seems to preserve more diversity in the host population than the other methods.
In the other methods:
–Spatial methods: One strategy quickly comes to dominate the population
–Nonspatial methods: Population oscillates between two different strategies
Arms races between the host and parasite populations
Hypothesis: Initial configurations are evolving “deceptive blocks”:
–Occurrence of a block of seven or more adjacent 0s or 1s in the 149-bit IC
–Probability of such an occurrence in a randomly generated string of the same density is less than 0.01.
For a typical run of spatial co-evolution, we plotted the fraction of parasites with deceptive blocks.
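Detecting a deceptive block in an IC is a simple run-length check. (Whether the original measure treated the 149-bit string as linear or circular is not stated; this sketch uses the linear reading.)

```python
def has_deceptive_block(bits, min_len=7):
    """True if the bit string contains a run of >= min_len identical bits."""
    run = 1
    for i in range(1, len(bits)):
        run = run + 1 if bits[i] == bits[i - 1] else 1
        if run >= min_len:
            return True
    return False
```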
[Plot: fraction of parasites with deceptive blocks vs. generations]
Summary: Why does spatial co-evolution improve performance?
Maintains diversity in population.
Produces “arms race” with ICs targeting weaknesses in CAs (e.g., block expanding) and CAs adapting to overcome weaknesses (e.g., particle strategies).
Note that spatial co-evolution requires significantly fewer training examples for successful learning.
But why, precisely, does putting the population on a spatial lattice improve performance?
Still an open question.
Possible applications to real-world problems
–Drug design to foil evolving viruses/bacteria
–Coevolving software/hardware with test cases
–Evolving game-playing programs
–Coevolving computer security systems with possible threats