Differential Evolution (DE) and its Variant Enhanced DE (EDE)


Differential Evolution (DE) and its Variant Enhanced DE (EDE)
Presented By: Ayesha Zafar, MS(CS) student
Research Domain: Energy Management in Smart Grid
Supervised By: Dr. Nadeem Javaid

Differential Evolution: Introduction

Metaheuristics:
- Evolutionary: Genetic algorithm, Genetic programming, Memetic algorithm, Differential evolution
- Swarm intelligence: Particle swarm optimization, Ant colony optimization, Artificial bee colony, Cuckoo search
- Stochastic: Random search, Tabu search, Hill climbing

DE:
- Population-based evolutionary algorithm
- Proposed by Storn and Price in 1995 [1]
- The idea behind DE is a scheme for generating trial parameter vectors

Advantages of DE: simple structure, ease of use, speed

[1] R. Storn and K. V. Price, "Differential Evolution - A Simple and Efficient Adaptive Scheme for Global Optimization over Continuous Spaces," ICSI, TR-95-012, March 1995.

DE Basic Steps (1/6)
- Initialization
- Mutation
- Recombination / Crossover
- Selection

DE Basic Steps (2/6): Initialization
Randomly generate the initial population within the search bounds:
Xini = Xl + rand()*(Xu - Xl)
where Xl and Xu are the lower and upper bounds of the decision variables.
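As a rough MATLAB sketch (not taken from the slides), initialization can be written as below; the names NP, D, Xl and Xu for population size, problem dimension and bounds are assumptions:

% Minimal initialization sketch (all values assumed for illustration)
NP = 30;                       % population size
D  = 2;                        % problem dimension
Xl = -10;  Xu = 10;            % lower and upper bounds
X  = Xl + rand(NP, D) .* (Xu - Xl);   % each row of X is one candidate vector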

DE Basic Steps (3/6): Mutation
- Expands the search space
- For each target vector Xi, randomly select three vectors Xr1, Xr2, Xr3, where r1, r2, r3 are distinct from each other
- Add the weighted difference of two vectors to the third to form the mutant vector:
  Vi = Xr1 + F·(Xr2 - Xr3)
- F is a constant scale factor in the range [0, 2]
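Continuing the initialization sketch above, a hedged MATLAB illustration of the mutation step for one target vector; the choices F = 0.5, the example index i and the index handling are assumptions, not values from the slides:

% Mutation sketch for one target vector (continues the sketch above)
i   = 1;                       % example target index (assumed)
F   = 0.5;                     % scale factor, typically chosen in [0, 2]
idx = randperm(NP);            % random ordering of population indices
idx(idx == i) = [];            % keep r1, r2, r3 distinct from the target index
r   = idx(1:3);
V   = X(r(1), :) + F * (X(r(2), :) - X(r(3), :));   % mutant vector Vi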

DE Basic Steps (4/6): Crossover or Recombination
- Recombination incorporates successful solutions from the previous generation
- Exchanges elements of the mutant vector and the target vector to form the trial vector
- Types of crossover: exponential and binomial; binomial crossover is more commonly used than exponential [2]

[2] Arafa, M., Sallam, E. A., & Fahmy, M. M. (2014, May). An enhanced differential evolution optimization algorithm. In Digital Information and Communication Technology and its Applications (DICTAP), 2014 Fourth International Conference on (pp. 216-225). IEEE.

DE Basic Steps (5/6): Crossover or Recombination
Xi = (Xi1, Xi2, Xi3, Xi4, Xi5)
Vi = (Vi1, Vi2, Vi3, Vi4, Vi5)
Ui = (Vi1, Xi2, Xi3, Vi4, Xi5)
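A hedged MATLAB sketch of binomial crossover producing a trial vector like the Ui above, continuing the earlier sketches; the value CR = 0.9 and the jrand safeguard (forcing at least one mutant component) are standard conventions assumed here, not taken from the slides:

% Binomial crossover sketch: build trial vector U from target X(i,:) and mutant V
CR    = 0.9;                   % crossover rate (assumed)
jrand = randi(D);              % force at least one component from the mutant vector
U     = X(i, :);
for j = 1:D
    if rand() <= CR || j == jrand
        U(j) = V(j);           % take this component from the mutant vector
    end
end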

DE Basic Steps (6/6): Selection
- Survival of the fittest solution
- Compare the target vector with the trial vector; the one with the better fitness is admitted to the next generation
- Mutation, crossover and selection continue until a stopping criterion is reached
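A short MATLAB sketch of the greedy selection step, continuing the sketches above; the objective handle f is an assumption (here the sphere function used in the next example):

% Selection sketch: the better of target and trial survives to the next generation
f = @(v) sum(v.^2);            % assumed objective to minimize
if f(U) <= f(X(i, :))
    X(i, :) = U;               % trial vector replaces the target vector
end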

Example DE (1/5)
Consider the following two-dimensional function:
f(x, y) = x^2 + y^2
Start with a population of 5 candidates randomly initialized in the range [-10, 10].
Initial population:
X1,0 = (2, -1)
X2,0 = (6, 1)
X3,0 = (-3, 5)
X4,0 = (-2, 6)
X5,0 = (6, -7)

Example DE (2/5)
Randomly select three vectors X2,0, X4,0 and X5,0, then form the mutant vector:
V1,0 = X2,0 + F·(X4,0 - X5,0)

Example DE (3/5)
Now form the trial vector by exchanging components of V1,0 with the target vector X1,0.
Set CR = 0.9.
Let rand(0,1) = 0.6; since 0.6 < 0.9, U1,1,0 = V1,1,0 = -0.4.
For the next component, let rand(0,1) = 0.95; since 0.95 > 0.9, U1,2,0 = X1,2,0 = -1.
Finally, the trial vector is U1,0 = [-0.4, -1].

Example DE (4/5)
Compare the fitness of the trial vector with that of the target vector. If the fitness of the trial vector is better, the trial vector replaces the target vector at G = 1.
f(x, y) = x^2 + y^2
Fitness of target: f(2, -1) = 2^2 + (-1)^2 = 5
Fitness of trial: f(-0.4, -1) = (-0.4)^2 + (-1)^2 = 1.16

Example DE (5/5)
Columns: initial population (G = 0), initial fitness, mutant vector, trial vector, fitness of trial vector, evolved population (G = 1).
X1,0 = (2, -1),   5,  V1,0 = (-0.4, 10.4),  U1,0 = (-0.4, -1),   1.16,   X1,1 = (-0.4, -1)
X2,0 = (6, 1),   37,  V2,0 = (1.2, -0.2),   U2,0 = (1.2, 1),     2.44,   X2,1 = (1.2, 1)
X3,0 = (-3, 5),  34,  V3,0 = (-4.4, -0.2),  U3,0 = (-4.4, -0.2), 19.4,   X3,1 = (-4.4, -0.2)
X4,0 = (-2, 6),  40,  V4,0 = (9.2, 6),      U4,0 = (9.2, 6),     120.64, X4,1 = (-2, 6)
X5,0 = (6, -7),  85,  V5,0 = (5.2, 0.2),    U5,0 = (6, 0.2),     36.04,  X5,1 = (6, 0.2)
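Putting the four steps together, a compact MATLAB sketch of the whole DE loop on f(x, y) = x^2 + y^2; the population size, F, CR and the iteration budget are illustrative choices, not values stated on the slides:

% End-to-end DE sketch on the sphere function (all parameter values assumed)
f  = @(v) sum(v.^2);
NP = 5;  D = 2;  F = 0.5;  CR = 0.9;  Gmax = 50;
Xl = -10;  Xu = 10;
X  = Xl + rand(NP, D) .* (Xu - Xl);                            % generation G = 0
for G = 1:Gmax
    for i = 1:NP
        idx = randperm(NP);  idx(idx == i) = [];  r = idx(1:3);
        V = X(r(1), :) + F * (X(r(2), :) - X(r(3), :));        % mutation
        U = X(i, :);  jrand = randi(D);
        for j = 1:D
            if rand() <= CR || j == jrand, U(j) = V(j); end    % binomial crossover
        end
        if f(U) <= f(X(i, :)), X(i, :) = U; end                % selection
    end
end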

Enhanced DE (EDE) Introduction Developed in 2014 [2] Improve trial vector strategy Increase accuracy Create five different trial vectors First three trail vectors are obtained by taking three different crossover rates 0.3, 0.6 and 0.9 instead of one in case of DE Fourth trial vector increase its convergence speed Last trial vector increases diversity of search space [2] Arafa, M., Sallam, E. A., & Fahmy, M. M. (2014, May). An enhanced differential evolution optimization algorithm. In Digital Information and Communication Technology and it’s Applications (DICTAP), 2014 Fourth International Conference on (pp. 216-225). IEEE.
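A hedged MATLAB sketch of how the five EDE trial vectors can be formed for one target/mutant pair, reusing X, i, V and D from the DE sketches above and mirroring the crossover code shown later in the energy-optimization walkthrough; the array layout and per-component random weights are assumptions:

% Five EDE trial vectors from target X(i,:) and mutant V (one row per trial vector)
CRs = [0.3 0.6 0.9];
U5  = zeros(5, D);
for k = 1:3                               % trial vectors 1-3: binomial crossover with CR = 0.3, 0.6, 0.9
    for j = 1:D
        if rand() <= CRs(k)
            U5(k, j) = V(j);              % component from the mutant vector
        else
            U5(k, j) = X(i, j);           % component from the target vector
        end
    end
end
U5(4, :) = rand(1, D) .* X(i, :);         % trial vector 4: randomly scaled target (faster convergence)
w        = rand(1, D);
U5(5, :) = w .* V + (1 - w) .* X(i, :);   % trial vector 5: random blend of mutant and target (more diversity)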

Enhanced DE (EDE)  

Enhanced DE (EDE) Algorithm
Randomly initialize the population
While termination criterion not satisfied do
    Perform mutation
    If G ≤ 100 then
        Perform crossover using Eqs. (1)-(5), giving five groups of trial vectors
        Find the best member in each group of trial vectors
        Compare these trial vectors and choose the one with the best value
    End if
    Perform selection
    Set G = G + 1
End while
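A self-contained MATLAB sketch of this EDE loop on a placeholder objective; all parameter values, the bounds and the greedy best-of-five selection against the target are assumptions based on the slides that follow, not a definitive implementation:

% EDE main-loop sketch (all parameter values assumed; crossover applied while G <= 100)
f    = @(v) sum(v.^2);                    % placeholder objective to minimize
NP   = 30;  D = 9;  F = 0.5;  Gmax = 100;
CRs  = [0.3 0.6 0.9];
Xl   = 0;  Xu = 1;
X    = Xl + rand(NP, D) .* (Xu - Xl);
for G = 1:Gmax
    for i = 1:NP
        idx = randperm(NP);  idx(idx == i) = [];  r = idx(1:3);
        V = X(r(1), :) + F * (X(r(2), :) - X(r(3), :));            % mutation
        U5 = zeros(5, D);
        for k = 1:3                                                % trial vectors 1-3: binomial crossover
            for j = 1:D
                if rand() <= CRs(k), U5(k, j) = V(j); else, U5(k, j) = X(i, j); end
            end
        end
        U5(4, :) = rand(1, D) .* X(i, :);                          % trial vector 4
        w = rand(1, D);  U5(5, :) = w .* V + (1 - w) .* X(i, :);   % trial vector 5
        cost = arrayfun(@(k) f(U5(k, :)), 1:5);                    % cost of each trial vector
        [bestCost, kBest] = min(cost);                             % best of the five
        if bestCost <= f(X(i, :)), X(i, :) = U5(kBest, :); end     % greedy selection
    end
end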

Energy Optimization by EDE (1/13): Parameter initialization

EDE parameters:
- Population size: 30
- CR1: 0.3, CR2: 0.6, CR3: 0.9
- MaxItr: 100

HEMS parameters:
- Number of homes: 1
- Number of appliances: 9
- Pricing signal: Real Time Pricing (RTP)
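A small sketch of how these settings might be collected in MATLAB before the run; the struct and field names are assumptions, the values come from the slide:

% Parameter setup sketch (struct/field names assumed)
param.pop_size    = 30;
param.CR          = [0.3 0.6 0.9];   % CR1, CR2, CR3
param.MaxItr      = 100;
param.nHomes      = 1;
param.nAppliances = 9;
param.pricing     = 'RTP';           % real-time pricing signal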

Energy Optimization by EDE (2/13): Appliances
Columns: appliance group, appliance, power rating (kWh), daily usage (hours).
Interruptible burst load: Vacuum cleaner (0.7, 6), Water heater (5, 12), Water pump (1, 8), Dish washer (1.8, 10)
Base load: Refrigerator (0.225, 18), AC (1.5, 15), Oven (2.15, -)
Non-interruptible load: Washing machine (-, -), Cloth dryer (4, -)
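For illustration only, the appliance data from this table could be stored as parallel MATLAB arrays; the variable names are assumptions, and values not given on the slide are left as NaN:

% Appliance data sketch (NaN marks values missing from the slide)
appliances = {'Vacuum cleaner','Water heater','Water pump','Dish washer', ...
              'Refrigerator','AC','Oven','Washing machine','Cloth dryer'};
power_kWh  = [0.7  5   1   1.8  0.225 1.5  2.15 NaN 4  ];   % power rating per the table
usage_h    = [6    12  8   10   18    15   NaN  NaN NaN];   % daily usage hours per the table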

Energy Optimization by EDE (3/13): Generate initial population

for i = 1:pop_size
    for j = 1:D
        X(i,j) = xl + rand(1)*(xu - xl);   % random value within [xl, xu] for appliance j of member i
    end
end

(Slide illustrates row 1 of X with one column per appliance: vacuum cleaner, water heater, water pump, dish washer, washing machine, cloth dryer, refrigerator, AC, oven.)

Energy Optimization by EDE (4/13): Mutation

b1 = randperm(pop_size);          % random permutation of population indices
b11 = b1(1,1);  r11 = X(b11,:);   % first randomly selected vector (also used as the target)
b21 = b1(1,2);  r21 = X(b21,:);   % second randomly selected vector
b31 = b1(1,3);  r31 = X(b31,:);   % third randomly selected vector
Mutant1 = r11 + 0.5*(r21 - r31);  % mutant vector with F = 0.5

(Slide illustrates one appliance status row of the population.)

Energy Optimization by EDE (5/13): Mutation

Mutant1 = r11 + 0.5*(r21 - r31);

(Slide illustrates the resulting mutant vector 1 across the nine appliances.)

Energy Optimization by EDE (6/13): Crossover

for i = 1:D
    if rand(1) >= 0.3
        Y11(1,i) = r11(1,i);       % take component from the target vector
    else
        Y11(1,i) = Mutant1(1,i);   % take component from the mutant vector
    end
end

(Slide illustrates mutant vector 1, target vector 1 and the first trial vector.)

Energy Optimization by EDE (7/13): Crossover

for i = 1:D
    if rand(1) >= 0.6
        Y21(1,i) = r11(1,i);       % take component from the target vector
    else
        Y21(1,i) = Mutant1(1,i);   % take component from the mutant vector
    end
end

(Slide illustrates mutant vector 1, target vector 1 and the second trial vector.)

Energy Optimization by EDE (8/13): Crossover

for i = 1:D
    if rand(1) >= 0.9
        Y31(1,i) = r11(1,i);       % take component from the target vector
    else
        Y31(1,i) = Mutant1(1,i);   % take component from the mutant vector
    end
end

(Slide illustrates mutant vector 1, target vector 1 and the third trial vector.)

Energy Optimization by EDE (9/13): Crossover

for i = 1:D
    Y41(1,i) = rand(1)*r11(1,i);                               % fourth trial vector: randomly scaled target
    Y51(1,i) = rand(1)*Mutant1(1,i) + (1 - rand(1))*r11(1,i);  % fifth trial vector: random blend of mutant and target
end

(Slide illustrates mutant vector 1, target vector 1, the fourth trial vector and the fifth trial vector.)

Energy Optimization by EDE (10/13): Crossover

F11 = Electricity_cost*Y11';     % cost of the first trial vector
F21 = Electricity_cost*Y21';
F31 = Electricity_cost*Y31';
F41 = Electricity_cost*Y41';
F51 = Electricity_cost*Y51';
F = [F11 F21 F31 F41 F51];       % costs of the five trial vectors

if min(F) == F11
    T = Y11;
elseif min(F) == F21
    T = Y21;
elseif min(F) == F31
    T = Y31;
elseif min(F) == F41
    T = Y41;
else
    T = Y51;                     % min(F) == F51
end

(Slide example: the five trial vectors cost 14.5, 11, 16, 17.5 and 19, so the second one, with cost 11, becomes the final trial vector.)

Energy Optimization by EDE (11/13): Selection

T11 = Electricity_cost*T';       % cost of the final trial vector
T12 = Electricity_cost*r11';     % cost of the target vector
if T12 < T11
    X(b11,:) = r11;              % target vector survives
else
    X(b11,:) = T;                % final trial vector replaces the target
end

(Slide example: the target vector costs 25 and the final trial vector costs 11, so the trial vector is admitted to the population.)

Energy Optimization by EDE (12/13): Selection
(Slide illustrates the appliance status matrix with its fitness column, e.g. 25, 15, 18, 13, 12, next to the final trial vector, whose cost is 11.)

Energy Optimization by EDE (13/13): Selection
(Slide shows the final trial vector written into row 1 of the appliance status matrix.)

Thank You. Any Questions?