Evolutionary Computation


Evolutionary Computation
Instructor: Sushil Louis, sushil@cse.unr.edu, http://www.cse.unr.edu/~sushil

Informed Search: Best-First Search
Topics: basic idea, A*, heuristics.
Basic idea: order nodes for expansion using a specific search strategy; order nodes n using an evaluation function f(n). Most evaluation functions include a heuristic h(n), for example the estimated cost of the cheapest path from the state at node n to a goal state. Heuristics provide domain information to guide informed search.

Romania with the straight-line-distance heuristic: h(n) = straight-line distance from n to Bucharest.

Greedy search
f(n) = h(n) = straight-line distance to the goal.
Exercise: draw the search tree and list the nodes in order of expansion (5 minutes). Time? Space? Complete? Optimal?

Greedy search

Greedy analysis
Optimal? No: for Arad to Bucharest, the path through Rimnicu Vilcea is shorter than the one greedy search finds.
Complete? Consider going from Iasi to Fagaras: Neamt is closer to Fagaras in straight-line distance than Vaslui, so greedy tree search oscillates between Iasi and Neamt. The graph-search version is complete in finite spaces.
Time and space: worst case O(b^m), where m is the maximum depth of the search space. A good heuristic can reduce complexity.

A*
f(n) = g(n) + h(n) = cost to reach n + estimated cost from n to the goal = estimated cost of the cheapest solution through n.

A*
Exercise: draw the search tree and list the nodes and their associated cities in order of expansion for going from Arad to Bucharest (5 minutes).

A*

A*
f(n) = g(n) + h(n) = cost to reach n + estimated cost from n to the goal = estimated cost of the cheapest solution through n. Seem reasonable?
If the heuristic is admissible, A* is optimal and complete for tree search. Admissible heuristics never overestimate the cost to the goal.
If the heuristic is consistent, A* is optimal and complete for graph search. Consistent heuristics obey the triangle inequality: if n' is a successor of n, then h(n) ≤ c(n, a, n') + h(n'); that is, the estimate at n is no more than the cost of going from n to n' plus the estimate from n' to the goal. Otherwise you should have expanded n' before n, and you need a different heuristic. With a consistent heuristic, f costs are non-decreasing along any path.
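
As a concrete check, here is the first expansion from Arad, using step costs and straight-line distances from the standard Russell & Norvig Romania map (assumed here, since the map itself lives on a slide figure):

    f(Sibiu)     = g + h = 140 + 253 = 393
    f(Timisoara) = g + h = 118 + 329 = 447
    f(Zerind)    = g + h =  75 + 374 = 449

A* therefore expands Sibiu first.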

Non-classical search: the path does not matter, just the final state; maximize an objective function.

We have a black box "evaluate" function that takes a candidate state and returns an objective-function value; this is an application-dependent fitness function (figure: candidate state → Evaluate → objective-function value).

Consider a problem: a combination lock with 30 digits. How many combinations are there? (10^30: ten choices for each of the 30 digits.)

Generate and test
Generate a candidate solution and test to see if it solves the problem; repeat.
Information used by this algorithm: you know when you have found the solution, and no candidate solution is better than any other, except for the correct combination. A minimal sketch of this loop follows.
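
A minimal generate-and-test loop for the lock in C. The helper names (random_combination, is_solution) and the stored secret are assumptions for illustration; a real lock only answers "open or not":

    #include <stdbool.h>
    #include <stdlib.h>
    #include <string.h>

    #define DIGITS 30

    static int secret[DIGITS];  /* hypothetical target combination */

    /* Generate a random candidate: one digit 0-9 per position. */
    static void random_combination(int combo[DIGITS])
    {
        for (int i = 0; i < DIGITS; i++)
            combo[i] = rand() % 10;
    }

    /* The only feedback generate-and-test gets: solved or not. */
    static bool is_solution(const int combo[DIGITS])
    {
        return memcmp(combo, secret, sizeof secret) == 0;
    }

    /* Generate and test: loop until the lock opens. */
    static long generate_and_test(int combo[DIGITS])
    {
        long tries = 0;
        do {
            random_combination(combo);
            tries++;
        } while (!is_solution(combo));
        return tries;
    }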

Combination lock (figure: the mapping from solutions to objective-function values).

Generate and Test is Search
Exhaustive search: how many combinations must you try before p(success) > 0.5? (Half the space: 0.5 × 10^30.) How long will this take? Will you eventually open the lock?
Random search: the same analysis applies.
RES = Random or Exhaustive Search.

Search techniques: hill climbing / gradient descent
Needs more information: whether you are getting closer to or further from the correct combination.
Quicker than RES.
Problems: the distance metric could be misleading, and there are local hills.

Hill climbing issues (state-space landscape figure: the path does not matter, just the final state; maximize the objective function).

Search as a solution to hard problems
Strategy: generate a potential solution and see if it solves the problem. Make use of available information to guide the generation of potential solutions.
How much information is available?
Very little: we only know the solution when we find it.
Lots: the problem is linear, continuous, …
A little: we can compare two solutions and tell which is "better".

Search tradeoff
Very little information implies we have no algorithm better than RES; we have to explore the space thoroughly, since there is no information to exploit.
Lots of information (linear, continuous, …) means we can exploit it to arrive directly at a solution, without any exploration.
A little information (a partial ordering) implies we must trade off exploring the search space against exploiting the information to concentrate search in promising areas.

Exploration vs exploitation
More exploration: better chance of finding the solution (more robust), but takes longer.
More exploitation: less chance of finding the solution and a better chance of getting stuck in a local optimum, but takes less time.

Choosing a search algorithm
The amount of information available about a problem influences our choice of search algorithm and how we tune it. How does a search algorithm balance exploration of the search space against exploitation of (possibly misleading) information about that space? What assumptions is the algorithm making?

Information used by RES
Exhaustive search and random search use only whether a candidate solution solves the problem or not (figure: the mapping from solutions to objective-function values).

Search techniques: hill climbing / gradient descent
Needs more information: whether you are getting closer to or further from the correct combination, i.e., gradient information (a partial ordering).
Problems: the distance metric could be misleading, and there are local hills. A sketch on the lock problem follows.
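
A hill-climbing sketch on the lock, continuing the C fragment above. The closeness() score (number of correct digits) is a hypothetical gradient that a real lock would not provide; that is exactly the "needs more information" requirement:

    /* Hypothetical gradient: how many digits are already correct. */
    static int closeness(const int combo[DIGITS])
    {
        int score = 0;
        for (int i = 0; i < DIGITS; i++)
            if (combo[i] == secret[i])
                score++;
        return score;
    }

    /* Hill climbing: keep moving to a better neighboring combination
       until no one-digit change improves the score. */
    static void hillclimb(int combo[DIGITS])
    {
        int best = closeness(combo);
        int improved = 1;
        while (improved) {
            improved = 0;
            for (int i = 0; i < DIGITS; i++) {
                int keep = combo[i];
                for (int d = 0; d < 10; d++) {   /* try each neighbor */
                    combo[i] = d;
                    int s = closeness(combo);
                    if (s > best) { best = s; keep = d; improved = 1; }
                }
                combo[i] = keep;  /* retain the best digit found */
            }
        }
    }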

Parallel hillclimbing
Everyone has a different starting point; perhaps not everyone will get stuck at a local optimum. More robust, and perhaps quicker (sketch below).
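
A sketch of parallel hillclimbing built on the fragments above (the function name and return convention are assumptions):

    /* Run many independent climbers from random starting points and
       keep the best final combination found. */
    static int parallel_hillclimb(int nclimbers, int best_combo[DIGITS])
    {
        int best = -1;
        for (int k = 0; k < nclimbers; k++) {
            int combo[DIGITS];
            random_combination(combo);  /* different starting point */
            hillclimb(combo);
            int s = closeness(combo);
            if (s > best) {
                best = s;
                memcpy(best_combo, combo, sizeof combo);
            }
        }
        return best;
    }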

Genetic Algorithms
Parallel hillclimbing with information exchange among candidate solutions: a population of candidate solutions, with crossover for information exchange. Good across a variety of problem domains.

Assignment 1
Maximize a function of 100 bits; we represent the bits as integers whose values are 0 or 1.
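
The assignment's actual objective function is not given on this slide; as a placeholder, a fitness in the same shape (onemax: count the 1s) might look like:

    #define CHROM_LEN 100

    /* Placeholder fitness over a 100-bit chromosome stored as ints
       in {0, 1}; onemax is an assumption, not the assigned function. */
    static double eval_bits(const int chrom[CHROM_LEN])
    {
        double f = 0.0;
        for (int i = 0; i < CHROM_LEN; i++)
            f += chrom[i];
        return f;
    }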

Applications
Boeing 777 engines, designed by GE.
i2 Technologies: ERP package uses GAs.
John Deere: manufacturing optimization.
US Army: logistics.
Cap Gemini + KiQ: marketing, credit, and insurance modeling.

Niche
Poorly understood problems: non-linear, discontinuous, multiple optima, … where no other method works well.
Search, optimization, machine learning.
Quickly produces good (usable) solutions, but is not guaranteed to find the optimum.

History
1960s, Larry Fogel: evolutionary programming.
1970s, John Holland: Adaptation in Natural and Artificial Systems.
1970s, Hans-Paul Schwefel: evolution strategies.
1980s, John Koza: genetic programming.
Natural selection is a great search/optimization algorithm. In GAs, crossover plays an important role in this search/optimization, fitness is evaluated on candidate solutions, and operators work on an encoding of the solution.

History
1989, David Goldberg (our textbook): consolidated the body of work in one book, provided examples and code, and gave a readable, accessible introduction.
2011, GECCO: 600+ attendees, industrial use of GAs, combinations with other techniques.

Genetic Algorithms
Model natural selection, the process of evolution.
Search through a space of candidate solutions.
Work with an encoding of the solution.
Non-deterministic (but not random).
Parallel search.

Vocabulary
Natural selection is the process of evolution (Darwin); evolution works on populations of organisms.
A chromosome is made up of genes; the possible values of a gene are its alleles.
A genotype is the set of genes making up an organism; the genotype specifies the phenotype, the organism that gets evaluated.
Selection depends on fitness.
During reproduction, chromosomes cross over with high probability; genes mutate with very low probability. A data layout matching this vocabulary is sketched below.
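
One plausible C layout consistent with the code fragments later in these slides; the actual POPULATION and IPTR definitions are not shown, so these fields are assumptions:

    /* Assumed definitions matching pop[i].chrom[j], pop[i].fitness,
       p->lchrom, p->pCross, and p->pMut in the fragments below. */
    typedef struct {
        int    *chrom;    /* genotype: array of genes (bits) */
        double  fitness;  /* evaluated on the decoded phenotype */
    } INDIVIDUAL, *IPTR;

    typedef struct {
        IPTR    pop;      /* population of candidate solutions */
        int     popSize;  /* number of individuals */
        int     lchrom;   /* chromosome length */
        double  pCross;   /* crossover probability */
        double  pMut;     /* per-gene mutation probability */
    } POPULATION;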

Genetic Algorithm
    Generate pop(0)
    Evaluate pop(0)
    T = 0
    While (not converged) do
        Select pop(T+1) from pop(T)
        Recombine pop(T+1)
        Evaluate pop(T+1)
        T = T + 1
    Done
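
The same loop in C, using the fixed generation count from the later version of this slide; the helper functions are named after the pseudocode steps and are assumptions tying together the fragments that follow:

    /* Hypothetical helpers named after the pseudocode steps. */
    void generate_pop(POPULATION *p);   /* random bit strings */
    void evaluate_pop(POPULATION *p);   /* application-dependent fitness */
    void select_pop(POPULATION *p);     /* e.g., roulette wheel selection */
    void recombine_pop(POPULATION *p);  /* crossover + mutation */

    void run_ga(POPULATION *p, int maxGen)
    {
        generate_pop(p);
        evaluate_pop(p);
        for (int t = 0; t < maxGen; t++) {
            select_pop(p);
            recombine_pop(p);
            evaluate_pop(p);
        }
    }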

Generate pop(0)
Initialize the population with randomly generated strings of 1's and 0's:

    for (i = 0; i < popSize; i++) {
        for (j = 0; j < chromLen; j++) {
            pop[i].chrom[j] = flip(0.5);  /* flip(p): 1 with probability p */
        }
    }

Evaluate pop(0)
Decode each individual and pass it to the application-dependent fitness function, which returns the individual's fitness (figure: decoded individual → Evaluate → fitness).
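
For instance, decoding a bit chromosome to an integer and scoring it; both the binary decoding and the f(x) = x^2 objective here are illustrative assumptions:

    /* Illustrative decode + evaluate: read the chromosome as an
       unsigned binary number (assumes lchrom <= 64) and maximize
       f(x) = x * x. */
    static double decode_and_eval(const int *chrom, int lchrom)
    {
        unsigned long long x = 0;
        for (int j = 0; j < lchrom; j++)
            x = (x << 1) | (unsigned long long)chrom[j];  /* genotype -> phenotype */
        return (double)x * (double)x;                      /* fitness */
    }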

Genetic Algorithm
    Generate pop(0)
    Evaluate pop(0)
    T = 0
    While (T < maxGen) do
        Select pop(T+1) from pop(T)
        Recombine pop(T+1)
        Evaluate pop(T+1)
        T = T + 1
    Done
(Here the convergence test is replaced by a fixed number of generations, maxGen.)

Selection
Each member of the population gets a share of a roulette-wheel pie proportional to its fitness relative to the other members. Spin the roulette wheel and pick the individual on whose slice the ball lands. Selection focuses search in promising areas.

Code

    int roulette(IPTR pop, double sumFitness, int popsize)
    {
        /* Select a single individual by roulette wheel selection. */
        double rand, partsum;
        int i;

        partsum = 0.0;
        rand = f_random() * sumFitness;  /* spin: a point on the wheel */
        i = -1;
        do {
            i++;
            partsum += pop[i].fitness;   /* walk slices until we pass it */
        } while (partsum < rand && i < popsize - 1);
        return i;
    }

Crossover and mutation
Mutation probability = 0.001: an insurance operator.
Crossover probability = 0.7: the exploration operator.

Crossover code

    void crossover(POPULATION *p, IPTR p1, IPTR p2, IPTR c1, IPTR c2)
    {
        /* One-point crossover from parents p1, p2 into children c1, c2;
           muteX applies per-gene mutation as each gene is copied. */
        int *pi1, *pi2, *ci1, *ci2;
        int xp, i;

        pi1 = p1->chrom; pi2 = p2->chrom;
        ci1 = c1->chrom; ci2 = c2->chrom;

        if (flip(p->pCross)) {                  /* cross over with probability pCross */
            xp = rnd(0, p->lchrom - 1);         /* random crossover point */
            for (i = 0; i < xp; i++) {          /* heads come from same-side parents */
                ci1[i] = muteX(p, pi1[i]);
                ci2[i] = muteX(p, pi2[i]);
            }
            for (i = xp; i < p->lchrom; i++) {  /* tails are swapped */
                ci1[i] = muteX(p, pi2[i]);
                ci2[i] = muteX(p, pi1[i]);
            }
        } else {                                /* no crossover: copy parents through */
            for (i = 0; i < p->lchrom; i++) {
                ci1[i] = muteX(p, pi1[i]);
                ci2[i] = muteX(p, pi2[i]);
            }
        }
    }

Mutation code

    int muteX(POPULATION *p, int pa)
    {
        /* Flip the bit with probability pMut. */
        return (flip(p->pMut) ? 1 - pa : pa);
    }

Simulated annealing
Gradient descent (not ascent). Accept bad moves with probability e^(ΔE/T), where ΔE < 0 for a bad move; T decreases every iteration. If the cooling schedule(t) lowers T slowly enough, we approach finding the global optimum with probability 1. A sketch of the acceptance test follows.
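
A sketch of the acceptance test in C, under the convention above (ΔE is the change in objective value, negative for a worsening move):

    #include <math.h>
    #include <stdlib.h>

    /* Always accept improvements; accept a worsening move with
       probability exp(dE / T), which shrinks as T cools. */
    static int sa_accept(double dE, double T)
    {
        if (dE > 0.0)
            return 1;                                  /* good move */
        return ((double)rand() / RAND_MAX) < exp(dE / T);
    }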

Crossover helps if

Linear and quadratic programming
Constrained optimization: optimize f(x) subject to constraints.
Linear objective with linear convex constraints: solvable in time polynomial in the number of variables.
Quadratic constraints: polynomial time in special cases.
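
For concreteness, the textbook linear-programming form (standard notation, not specific to this course):

    minimize    c^T x
    subject to  A x <= b,  x >= 0

which is solvable in time polynomial in the number of variables.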