Chapter 4: Evolutionary Computation Implementations

Evolutionary Computation Implementations: Outline
Genetic Algorithm
– Mainly a canonical version
– Crossover: one-point, two-point, uniform
– Selection: roulette wheel, tournament, ranking
– Five benchmark functions
Particle Swarm Optimization
– Global and local versions
– Multiple-swarm capability
– Same benchmark functions as the GA, plus three for constraint satisfaction

EC Implementation Issues (Generic)
– Homogeneous vs. heterogeneous representation
– Online adaptation vs. offline adaptation
– Static adaptation vs. dynamic adaptation
– Flowcharts vs. finite state machines

Homogeneous vs. Heterogeneous Representation
Homogeneous representation
– Used traditionally; simple, and existing EC operators can be reused
– Binary is the traditional coding for GAs; it is simple and general
– Use integer representations for discrete-valued parameters
– Use real values to represent real-valued parameters where possible
Heterogeneous representation
– Often the most natural way to represent the problem
– Real values represent real parameters; integers or binary strings represent discrete parameters
– The complexity of the evolutionary operators increases
– Representation-specific operators are needed
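As an illustration of the operator complexity that a heterogeneous representation brings, here is a minimal C sketch. The struct fields, NUM_MACHINES, and the mutation scheme are all hypothetical, not taken from the book's code; the point is only that each gene type needs its own mutation operator.

```c
#include <stdlib.h>

enum { NUM_MACHINES = 5 };

/* Hypothetical heterogeneous chromosome: two real-valued genes
   alongside one discrete gene. */
typedef struct {
    double temperature;   /* real-valued parameter */
    double pressure;      /* real-valued parameter */
    int    machine_id;    /* discrete parameter, 0..NUM_MACHINES-1 */
} Chromosome;

/* Each gene type gets a representation-specific operator: a small
   real-valued perturbation for the doubles, a categorical re-draw for
   the integer gene. */
void mutate(Chromosome *c, double step) {
    c->temperature += step * ((double)rand() / RAND_MAX - 0.5);
    c->pressure    += step * ((double)rand() / RAND_MAX - 0.5);
    if (rand() % 10 == 0)                 /* occasional discrete jump */
        c->machine_id = rand() % NUM_MACHINES;
}
```

A homogeneous (all-binary) coding would avoid the per-type operators at the cost of encoding and decoding every parameter.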

Binary Representations
Advantages
– Simple and popular
– Standard operators can be used
Disadvantages
– Can result in long chromosomes
– Can introduce inaccuracies
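The "inaccuracies" come from quantizing a real interval onto a finite bit string. A minimal sketch of the decoding step, assuming one bit per unsigned char (as the Data Types slide describes) and fewer than 32 bits per parameter; the function name is illustrative:

```c
/* Decode `nbits` bits (one bit per unsigned char, bits[0] = least
   significant) into a real value in [lo, hi]. Assumes nbits < 32.
   Resolution is (hi - lo) / (2^nbits - 1): values between adjacent
   steps cannot be represented, which is the inaccuracy noted above. */
double decode_bits(const unsigned char *bits, int nbits,
                   double lo, double hi) {
    unsigned long v = 0;
    for (int b = nbits - 1; b >= 0; --b)
        v = (v << 1) | (bits[b] & 1u);
    unsigned long maxv = (1ul << nbits) - 1ul;
    return lo + (hi - lo) * (double)v / (double)maxv;
}
```

With 8 bits on [-1, 1] the step size is 2/255, so an optimum at, say, 0.001 can only be approximated.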

Final Thoughts on Representation The best representation is usually problem-dependent. Representation is often a major part of solving a problem. In general, represent a problem the way it appears in the system implementation.

Population Adaptation Versus Individual Adaptation
Individual: Most commonly used. Pittsburgh approach: each chromosome represents the entire problem. The performance of each candidate solution is proportional to the fitness of its representation.
Population: Used when the system cannot be evaluated offline. Michigan approach: the entire population represents one solution (only one system is evaluated each generation). Cooperation and competition occur among all components of the system.

Static Adaptation Versus Dynamic Adaptation
Static: Most commonly used; algorithm parameters have fixed (or pre-determined) values.
Adaptive: Can be done at the environment level, the population level (most common, when done), the individual level, or the component level. The goal is to balance exploration and exploitation.

Flowcharts Versus Finite State Machines
Flowcharts: Easy to understand and use; traditionally used; best for simpler systems.
Finite state machine diagrams: Used for systems with frequent user interaction and for more complex systems; better suited to structured systems and to situations where multitasking is involved.

Handling Multiple Similar Cases
If there are two possibilities, use if-then. If there are three or more, use switch (with cases) or a function-pointer table (where order is critical).
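A sketch of the function-pointer alternative, showing why the slide warns that order is critical: the table must line up exactly with the enum values used to index it. All names here are illustrative, not the book's.

```c
/* Crossover types, mirroring the flag values used later (0, 1, 2). */
typedef enum { ONE_POINT = 0, UNIFORM = 1, TWO_POINT = 2 } CrossType;

static const char *cross_one_point(void) { return "one-point"; }
static const char *cross_uniform(void)   { return "uniform"; }
static const char *cross_two_point(void) { return "two-point"; }

/* The table order MUST match the enum values: entry 0 is ONE_POINT,
   entry 1 is UNIFORM, entry 2 is TWO_POINT. Reordering entries without
   reordering the enum silently dispatches the wrong operator. */
static const char *(*cross_table[])(void) = {
    cross_one_point, cross_uniform, cross_two_point
};

const char *dispatch(CrossType t) { return cross_table[t](); }
```

A switch statement makes the same choice explicitly and is safer when cases are sparse; the table wins when the cases are dense and uniform, as with these operator flags.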

Allocating and Freeing Memory Space
Arrays and vectors should be dynamically configured. Allocate memory with calloc(); release it with free().
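A hedged sketch of this dynamic allocation for a population sized at run time, assuming the one-bit-per-byte layout the Data Types slide describes; the function names are illustrative. calloc is a good fit here because it also zero-initializes the bits.

```c
#include <stdlib.h>

/* Allocate a population of `pop_size` chromosomes, each `len` bits,
   one bit per unsigned char. Returns NULL on allocation failure. */
unsigned char **alloc_population(int pop_size, int len) {
    unsigned char **pop = calloc((size_t)pop_size, sizeof *pop);
    if (!pop) return NULL;
    for (int i = 0; i < pop_size; ++i) {
        pop[i] = calloc((size_t)len, sizeof **pop);
        if (!pop[i]) {              /* unwind partial allocation */
            while (i--) free(pop[i]);
            free(pop);
            return NULL;
        }
    }
    return pop;
}

/* Release everything alloc_population() obtained, in reverse order. */
void free_population(unsigned char **pop, int pop_size) {
    if (!pop) return;
    for (int i = 0; i < pop_size; ++i) free(pop[i]);
    free(pop);
}
```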

Error Checking
Use it frequently; use it to debug. assert() can be used [and removed once the program is debugged].
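A small example of the assert() pattern. Compiling with -DNDEBUG disables the checks, which matches the "remove when the program is debugged" advice without deleting the lines by hand; the function is illustrative.

```c
#include <assert.h>

/* Catch a bad denominator during debugging; with -DNDEBUG the
   assert compiles away and only the division remains. */
double safe_ratio(double num, double den) {
    assert(den != 0.0);
    return num / den;
}
```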

Genetic Algorithm Implementation
Essentially a canonical GA that uses crossover and mutation. Uses a binary representation. Searches for optima over real-valued parameters. Several benchmark functions are included.

Data Types
An enumeration data type is used for the selection types, the crossover types, and to select the test function. C has no data type for 'bit', so the unsigned character type is used for the population: a byte represents each bit, so computational-complexity issues must be addressed.

The GA main() Routine
The GA_Start_Up routine:
– Reads problem-related parameters, such as the number of bits per parameter, from the input file
– Allocates memory
– Initializes the population
The GA_Main_Loop runs the GA algorithm:
– Evaluation
– Selection
– Crossover
– Mutation
The GA_Clean_Up routine:
– Stores results in an output file
– De-allocates memory
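A skeleton of this three-phase structure. The routine names GA_Start_Up, GA_Main_Loop, and GA_Clean_Up follow the slide; the bodies are stubs of my own so the sketch compiles and runs, and do not reflect the book's actual code.

```c
/* Stub state: a real implementation would hold the population,
   parameters read from the run file, and so on. */
static int gen, max_gen = 3;

static void GA_Start_Up(const char *run_file) {
    (void)run_file;   /* real code: read parameters, allocate, initialize */
    gen = 0;
}

static int termination_met(void) { return gen >= max_gen; }

static void GA_Main_Loop(void) {
    while (!termination_met()) {
        /* real code: evaluation, selection, crossover, mutation */
        ++gen;
    }
}

static void GA_Clean_Up(void) {
    /* real code: store results in the output file, free memory */
}

int ga_run(const char *run_file) {
    GA_Start_Up(run_file);
    GA_Main_Loop();
    GA_Clean_Up();
    return gen;   /* generations executed */
}
```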

GA Selection Mechanisms
All are in the ga_selection() routine, and all use elitism:
– Proportional selection: a roulette wheel that uses fitness shifting to keep fitnesses positive
– Binary tournament selection: the better of two randomly selected individuals
– Ranking selection: evenly spaced fitness values, then treated like a roulette wheel
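A sketch of roulette-wheel selection with fitness shifting as described: the minimum fitness is subtracted (plus a small offset) so every slice of the wheel has positive width. The +1.0 offset and the caller-supplied random number in [0, 1) are choices of this sketch, not necessarily the book's.

```c
/* Pick an index in [0, n) with probability proportional to the
   shifted fitness fit[i] - min(fit) + 1.0. The caller supplies r in
   [0, 1) so the function stays deterministic for testing. */
int roulette_pick(const double *fit, int n, double r) {
    double fmin = fit[0], total = 0.0, acc = 0.0;
    for (int i = 1; i < n; ++i)
        if (fit[i] < fmin) fmin = fit[i];
    for (int i = 0; i < n; ++i)
        total += fit[i] - fmin + 1.0;      /* shifted, always positive */
    double spin = r * total;
    for (int i = 0; i < n; ++i) {
        acc += fit[i] - fmin + 1.0;
        if (spin < acc) return i;          /* wheel stopped in slice i */
    }
    return n - 1;                          /* guard against rounding */
}
```

Without the shift, a negative fitness (common when minimizing) would give a negative slice width and break the wheel, which is exactly what fitness shifting prevents.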

Mutate According to Bit Position Flag
When 0, bits are considered one by one. When 1, a mutation is done that approximates a Gaussian: the probability of mutation m_b varies with bit position, where b = 0 for the least significant bit, 1 for the next, and so on, and m_0 is the value in the run file. The bit position is calculated for each variable. The mutation rate for the first bit is thus about 0.4 times the value in the run file. (This mutation is similar to the Gaussian mutation carried out in EP and ES.)

Crossover Flag
0: one-point crossover
1: uniform crossover
2: two-point crossover
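Hedged sketches of the three operators on the one-bit-per-byte representation. Cut points and the uniform mask are supplied by the caller here for determinism; a real GA would draw them at random, and the book's code may structure this differently.

```c
/* One-point crossover: swap everything from `cut` to the end. */
void one_point_crossover(unsigned char *a, unsigned char *b,
                         int len, int cut) {
    for (int i = cut; i < len; ++i) {
        unsigned char t = a[i]; a[i] = b[i]; b[i] = t;
    }
}

/* Two-point crossover: swap only the middle segment [cut1, cut2). */
void two_point_crossover(unsigned char *a, unsigned char *b,
                         int cut1, int cut2) {
    for (int i = cut1; i < cut2; ++i) {
        unsigned char t = a[i]; a[i] = b[i]; b[i] = t;
    }
}

/* Uniform crossover: swap each bit independently where mask[i] is set. */
void uniform_crossover(unsigned char *a, unsigned char *b,
                       const unsigned char *mask, int len) {
    for (int i = 0; i < len; ++i)
        if (mask[i]) { unsigned char t = a[i]; a[i] = b[i]; b[i] = t; }
}
```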

GA.RUN (annotated run file)
result.dat — result file name
dimension
function type (0: F6; 1: PARABOLIC; 2: ROSENBROCK; 3: RASTRIGIN; 4: GRIEWANK)
maximum number of iterations
bits per parameter
population size
rate of crossover
rate of mutation
termination criterion (not used in this implementation, but must be present)
mutation flag (0: base mutation; 1: bit-position mutation)
crossover operator (0: one-point; 1: uniform; 2: two-point)
selection operator (0: roulette; 1: binary tournament; 2: ranking)
To run the implementation, from the directory containing ga.exe and the run file: C:\> ga ga.run

Result file, part 1 of 2 — header fields:
resultFile
function type
input dim
max. no. of generations
bits for each parameter
boundary value
popu_size
individual length
crossover rate
mutation rate
termination criterion
flag_m (1: bit position; 0: constant)
c_type (0: one-point; 1: uniform; 2: two-point)
selection type
Then, for each generation: the best fitness and the variance.

Result file, part 2 of 2 — final population:
fitness values: fit[0] through fit[19]
parameters: para[0] through para[9]
begin and finish time stamps (e.g., Mon Oct 01 08:35 … 08:36)

PSO Implementation
The basic PSO described previously is implemented first. A multi-swarm version (co-evolutionary PSO) is also implemented. The implementation is based on a state machine: arrows represent transitions, and transition labels indicate the trigger for each transition. The population can be initialized symmetrically or asymmetrically.

PSO Attributes
– Symmetric or asymmetric initialization
– Minimize or maximize
– Choice of five functions
– Inertia weight can be constant, linearly decreasing, or noisy
– Choice of population size
– Specify the number of dimensions (variables)

PSO State Machine
Nine states. A state handler performs its action until a state transition occurs. The state machine runs until it reaches PSOS_DONE.
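A minimal driver in the state-machine style described, with three toy states standing in for the nine. One handler per state, dispatched through a table indexed by the current state, is a common way to code this; the book's actual dispatch may differ.

```c
/* Toy states; a handler returns the next state, and the machine runs
   until the terminal state (the PSOS_DONE analogue) is reached. */
typedef enum { S_EVALUATE, S_UPDATE, S_DONE, NUM_STATES } State;

static State handle_evaluate(void) { return S_UPDATE; }
static State handle_update(void)   { return S_DONE;   }

/* Handler table indexed by state; S_DONE has no handler. */
static State (*handlers[NUM_STATES])(void) = {
    handle_evaluate, handle_update, 0
};

int run_state_machine(State s) {
    int steps = 0;
    while (s != S_DONE) {
        s = handlers[s]();   /* current state selects the handler */
        ++steps;
    }
    return steps;
}
```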

PSO State Diagram

Definitions of States and Data Types

Definitions of States and Data Types, Cont’d.

State Handling Routines
The state-handling routine called depends on the current state. The routine runs until its conditions are met, e.g., the maximum population index is reached.

PSO main() Routine
Simple. Startup reads parameters and allocates memory for dynamic variables. Cleanup stores results and de-allocates memory.

The Co-Evolutionary PSO
Can be used for problems with multiple constraints. Uses the augmented Lagrangian method to convert the problem into coupled min and max problems: one swarm solves the min problem with the max problem as a fixed environment, and the other solves the max problem with the min problem as a fixed environment.

Co-Evolutionary PSO Procedure
1. Initialize two PSOs.
2. Run the first PSO for max_gen_1 generations.
3. If this is not the first cycle, re-evaluate the pbest values for the second PSO.
4. Run the second PSO for max_gen_2 generations.
5. Re-evaluate the pbest values for the first PSO.
6. Loop to step 2 until the termination criterion is met.
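The numbered procedure above can be sketched as an alternating loop. The run and re-evaluate routines are stubs of my own here (they only count generations), standing in for the real swarm updates.

```c
static int evals_min, evals_max;   /* generations run by each swarm */

static void run_min_pso(int gens)  { evals_min += gens; }  /* step 2 */
static void run_max_pso(int gens)  { evals_max += gens; }  /* step 4 */
static void reeval_pbest_max(void) { /* lambdas' environment changed */ }
static void reeval_pbest_min(void) { /* variables' environment changed */ }

/* Alternate the two swarms for a fixed number of cycles; each swarm
   treats the other's current population as a fixed environment. */
void co_pso(int cycles, int max_gen_1, int max_gen_2) {
    evals_min = evals_max = 0;             /* step 1: initialize */
    for (int c = 0; c < cycles; ++c) {
        run_min_pso(max_gen_1);            /* step 2 */
        if (c > 0) reeval_pbest_max();     /* step 3: skip first cycle */
        run_max_pso(max_gen_2);            /* step 4 */
        reeval_pbest_min();                /* step 5 */
    }                                      /* step 6: loop */
}
```

Re-evaluating pbest values is necessary because a particle's stored best was computed against an environment (the other swarm's values) that has since changed.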

Augmented Lagrangian
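The formula on this slide did not survive the transcript. As a general reference, a standard Lagrangian min–max formulation for a minimization problem with inequality constraints $g_i(\mathbf{x}) \le 0$, which matches the two-swarm min/max split described above, is (the specific augmented variant used in the implementation may differ):

```latex
\min_{\mathbf{x}} \; \max_{\boldsymbol{\lambda} \ge \mathbf{0}} \;
L(\mathbf{x}, \boldsymbol{\lambda})
  \;=\; f(\mathbf{x}) \;+\; \sum_{i} \lambda_i \, g_i(\mathbf{x})
```

The "augmented" form adds a quadratic penalty term in the constraint violations with a penalty parameter, which improves conditioning of the min problem for fixed multipliers.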

Method of Lagrange Multipliers (Constrained Optimization) Example
Suppose a nuclear reactor is to have the shape of a cylinder of radius R and height H. Neutron diffusion theory tells us that such a reactor must satisfy the constraint
(2.405 / R)² + (π / H)² = B², with B a constant.
We would like to minimize the volume of the reactor, V = πR²H. Setting ∇V = λ∇g for the constraint function g(R, H) = (2.405/R)² + (π/H)² − B² gives
2πRH = −2λ(2.405)² / R³
πR² = −2λπ² / H³
Multiplying the first equation by R/2 and the second by H, you should obtain
πR²H = −λ(2.405)² / R² = −2λπ² / H²
so (π/H)² = (1/2)(2.405/R)², i.e., H = √2 πR / 2.405.

Co-Evolutionary PSO Example
1st PSO: each population member is a vector of elements (variables); run as a minimization problem.
2nd PSO: each population member is a vector of λ values in [0, 1]; run as a maximization problem.
Process:
1. Run the first PSO for max_gen_1 generations (e.g., 10); the fitness of a particle is the maximum obtained with any λ vector (the λ values are fixed).
2. If this is not the first cycle, re-calculate the pbests for the 2nd PSO.
3. Run the second PSO for max_gen_2 generations; optimize with respect to the λ values in the 2nd population (the variable values are fixed).
4. Recalculate the pbest values for the first PSO.
5. Increment the cycle count and go to step 1 if the maximum number of cycles has not been reached.

Benchmark Problems
For all benchmark problems, the population size was set to 40, with a fixed number of generations per PSO per cycle. Different numbers of cycles were tested: 40, 80, and 120. In the book, a linearly decreasing inertia weight was used. Fifty runs (to the maximum number of cycles) were done for each combination of settings.

State Machine for Multi-PSO Version
typedef enum PSO_State_Tag {
    PSO_UPDATE_INERTIA_WEIGHT,   // Update inertia weight
    PSO_EVALUATE,                // Evaluate particles
    PSO_UPDATE_GLOBAL_BEST,      // Update global best
    PSO_UPDATE_LOCAL_BEST,       // Update local best
    PSO_UPDATE_VELOCITY,         // Update particle's velocity
    PSO_UPDATE_POSITION,         // Update particle's position
    PSO_GOAL_REACH_JUDGE,        // Judge whether the goal has been reached
    PSO_NEXT_GENERATION,         // Move to the next generation
    PSO_UPDATE_PBEST_EACH_CYCLE, // Update pbest each cycle for co-PSO,
                                 // since the environment has changed
    PSO_NEXT_PSO,                // Move to the next PSO in the same cycle,
                                 // or the first PSO in the next cycle
    PSOS_DONE,                   // Finish one cycle of PSOs
    NUM_PSO_STATES               // Total number of PSO states
} PSO_State_Type;

Multi-PSOs State Diagram

PSO-Evaluate for Multi-PSOs
For the co-evolutionary PSO, each PSO passes its function type to the evaluate_functions() routine, which calls the corresponding function to evaluate that PSO's performance. For example, if the problem to be solved is the G7 problem, the PSO solving the minimization problem calls G7_MIN(), and the PSO solving the maximization problem calls G7_MAX().

G1 Problem
Minimize
f(x) = 5(x1 + x2 + x3 + x4) − 5(x1² + x2² + x3² + x4²) − (x5 + x6 + … + x13)
subject to
2x1 + 2x2 + x10 + x11 ≤ 10
2x1 + 2x3 + x10 + x12 ≤ 10
2x2 + 2x3 + x11 + x12 ≤ 10
−8x1 + x10 ≤ 0
−8x2 + x11 ≤ 0
−8x3 + x12 ≤ 0
−2x4 − x5 + x10 ≤ 0
−2x6 − x7 + x11 ≤ 0
−2x8 − x9 + x12 ≤ 0
where 0 ≤ xi ≤ 1 (i = 1, …, 9, 13) and 0 ≤ xi ≤ 100 (i = 10, 11, 12).
The global minimum is known to be x* = (1, 1, 1, 1, 1, 1, 1, 1, 1, 3, 3, 3, 1) with f(x*) = −15.

For G1 Problem For both swarms, the function that is evaluated is the augmented Lagrangian.

Sample PSOS Run File, Part 1
2       # of PSOs
1       update pbest each cycle flag
300     total number of cycles to run
0       optimization type (0 = min, 1 = max)
0       function type (G1_MIN)
1       inertia update method (1 = linearly decreasing)
1       initialization (1 = asymmetric)
0.0     left initialization range
50.0    right initialization range
10      max velocity
100     max position
100     max generations per cycle
30      population size
13      dimensions
0.9     initial inertia weight
1       boundary flag (1 = enabled)
(lower and upper boundaries for the parameters follow — 13 pairs for G1)

Sample PSOS Run File, Part 2
Values for the second swarm, as in Part 1, except:
1       optimization type (1 = max)
1       function type (G1_MAX)

Single PSO Run File (annotated)
1       number of PSOs
0       pso_update_pbest_each_cycle_flag (only for multiple swarms)
40      total cycles of running PSOs
0       optimization type (0 = min, 1 = max)
6       evaluation function (F6)
1       inertia weight update method (1 = linearly decreasing)
1       initialization type (0 = symmetric, 1 = asymmetric)
-10.0   left initialization range
50.0    right initialization range
40      maximum velocity
100     maximum position
50      max number of generations per cycle
30      population size
2       dimension
0.9     initial inertia weight
0       boundary flag (0 = disabled, 1 = enabled)
(boundaries follow if the boundary flag is 1)
Evaluation functions: 0: G1_MIN; 1: G1_MAX; 2: G7_MIN; 3: G7_MAX; 4: G9_MIN; 5: G9_MAX; 6: F6; 7: SPHERE; 8: ROSENBROCK; 9: RASTRIGIN; 10: GRIEWANK