- Divided Range Multi-Objective Genetic Algorithms -

Presentation transcript:

The New Model of Parallel Genetic Algorithm in Multi-Objective Optimization Problems - Divided Range Multi-Objective Genetic Algorithms - ○Tomoyuki Hiroyasu, Mitsunori Miki, Shinya Watanabe. Intelligent Systems Design Laboratory, Doshisha University, Japan. I'm Shinya Watanabe, a graduate student at Doshisha University in Japan. I will now talk about our study, titled "Parallel Evolutionary Multi-Criterion Optimization for Block Layout Problems".

EMO Background (1) ● Multi-criterion optimization ● Genetic Algorithms (e.g. VEGA, MOGA, NPGA, etc.) ・High computation cost → parallel computing. This is the background of our work. Multi-criterion optimization solved by evolutionary algorithms is often called EMO. There have been many studies on EMO, and many of them obtain good results. However, EMO is known to have several problems, one of which is its high computation cost. One of the simplest and most powerful solutions is to perform EMO on parallel computers, because evolutionary algorithms have inherent parallelism and PC cluster systems have become very popular these days.

Background (2) ● Parallel EMO algorithms: master-slave model, distributed GA model, cellular GA model → Divided Range Multi-Objective Genetic Algorithms (DRMOGA). Now we'd like to focus on parallel EMO algorithms. Some parallel models for EMO have been proposed, but there are few studies on the validity of these parallel models. So we proposed a new parallel model, Divided Range Multi-Objective Genetic Algorithms (DRMOGA). This model was applied to some test functions and was found to be effective for continuous multi-objective problems. However, it had not yet been applied to discrete problems. Therefore, to examine the effectiveness of DRMOGA, we apply it to discrete problems; we selected block layout problems as the discrete problems.

Multi-Criterion Optimization Problems (1) ● Multi-Criterion Optimization Problems (MOPs): design variables X = {x1, x2, …, xn}; objective functions F = {f1(x), f2(x), …, fm(x)}; constraints Gi(x) < 0 (i = 1, 2, …, k). [Figure: feasible region in the (f1, f2) plane, with Pareto-optimal solutions drawn as a red line and weak Pareto-optimal solutions as a blue line.] In optimization problems, when there are several objective functions, the problems are called multi-objective or multi-criterion optimization problems (MOPs). In general, MOPs are formulated as above. Usually there are trade-off relations between the objective functions, so the optimum solution is not unique; instead, the concept of the Pareto-optimal solution is used. The figure shows the Pareto-optimal solutions of a two-objective problem: Pareto-optimal solutions are illustrated as a red line, and weak Pareto-optimal solutions as a blue line. A weak Pareto solution is a solution that attains the minimum value of one objective function. Since the Pareto-optimal solutions are the rational solutions to an MOP, the first goal in solving an MOP is to obtain the Pareto-optimal solutions.

Multi-Criterion Optimization Problems (2) ・Pareto dominance and the ranking method. Pareto-optimal set: the set of non-inferior individuals in each generation. Ranking: Rank = 1 + (number of individuals that dominate the individual). [Figure: ranks of individuals in a two-objective minimization problem; the red line is the Pareto-optimal front.] Here I describe the method used to judge whether an individual is a Pareto solution or not. The figure shows a minimization problem with two objective functions. As you can see, the blue points in this figure are non-inferior in comparison with the red points from any point of view. In brief, the Pareto-optimal set is the set of non-inferior individuals in each generation. In MOPs, the concept of ranking is usually employed; it extends the concept of Pareto optimality. Fonseca defined the rank of an individual as 1 plus the number of individuals that dominate it. These ranks are usually used as the fitness for selection operators such as roulette selection.
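As a concrete illustration of the dominance test and ranking rule described above, here is a minimal Python sketch (ours, not from the original slides); the names `dominates` and `fonseca_rank` are our own, and minimization of all objectives is assumed.

```python
def dominates(a, b):
    """True if objective vector a Pareto-dominates b (minimization):
    a is no worse than b in every objective and strictly better in one."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def fonseca_rank(objs):
    """Fonseca's ranking: rank = 1 + number of individuals that dominate it.
    Rank-1 individuals form the non-dominated (Pareto) set of the population."""
    return [1 + sum(dominates(other, me) for other in objs if other is not me)
            for me in objs]

# Example: three points of a two-objective minimization problem.
population = [(1.0, 4.0), (2.0, 2.0), (3.0, 3.0)]
print(fonseca_rank(population))  # -> [1, 1, 2]; (3,3) is dominated by (2,2)
```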

Genetic Algorithms. Flow: Start (initialization) → Evaluation → Selection → Crossover → Mutation → iterate → End (solution found). Features: derived from a metaphor of the mechanisms of evolution in nature; stochastic search; multi-point search; high calculation cost. Here I briefly explain genetic algorithms. GAs are optimization methods that derive their behavior from a metaphor of the mechanisms of evolution in nature. The features of GAs are stochastic search and multi-point search. In GAs there are several search points, called individuals, so it is possible to apply GAs both to problems with continuous variables and to problems with discrete variables. This is the typical flow of a simple GA: evaluation, selection, crossover, and mutation are the genetic operations repeated every generation. After some generations, we may obtain a good solution. One disadvantage of GAs is their high calculation cost: GAs need many iterations, which takes much time. One solution to this problem is to perform GAs on parallel computers.
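The flow above can be made concrete with a minimal single-objective GA skeleton (our own illustration, not code from the slides); `evaluate` is a placeholder objective function and all parameter values are arbitrary.

```python
import random

def evaluate(ind):
    # Placeholder objective: minimize the sphere function.
    return sum(x * x for x in ind)

def ga(pop_size=100, n_vars=5, generations=200,
       crossover_rate=1.0, mutation_rate=0.05):
    # Initialization: random real-valued individuals.
    pop = [[random.uniform(-5, 5) for _ in range(n_vars)] for _ in range(pop_size)]
    for _ in range(generations):
        fit = [evaluate(ind) for ind in pop]          # evaluation
        def tournament():                             # selection
            a, b = random.sample(range(pop_size), 2)
            return pop[a] if fit[a] < fit[b] else pop[b]
        children = []
        while len(children) < pop_size:
            p1, p2 = tournament(), tournament()
            if random.random() < crossover_rate:      # one-point crossover
                cut = random.randrange(1, n_vars)
                p1, p2 = p1[:cut] + p2[cut:], p2[:cut] + p1[cut:]
            for child in (p1, p2):
                child = [x + random.gauss(0, 0.1)     # Gaussian mutation
                         if random.random() < mutation_rate else x
                         for x in child]
                children.append(child)
        pop = children[:pop_size]
    return min(pop, key=evaluate)

print(ga())
```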

Multi-objective GA (1) ・Multi-objective GA. [Figure: population in the (f1, f2) objective space after 1, 5, 10, 30, and 50 generations, converging toward the Pareto-optimal solutions.] Now I'd like to talk about multi-objective GAs. In a multi-objective GA, the population is scattered over the objective space, as in this figure. As in a single-objective GA, the genetic operations of evaluation, selection, crossover, and mutation are repeated, and as the generations pass, the set of individuals moves toward the Pareto-optimal solutions.

Multi-objective GA (2). Representative EMO algorithms: VEGA, Schaffer (1985); Ranking, Goldberg (1989); MOGA, Fonseca (1993); Ranking + sharing, Srinivas (1994); VEGA + Pareto optimum individuals, Tamaki (1995); Non-Pareto-optimum elimination, Kobayashi (1996); and others. There has been much research focused on multi-objective GAs, and I would like to briefly mention the leading work. Schaffer developed VEGA, the first algorithm in this category. Goldberg introduced the ranking method, and Fonseca developed MOGA. More and more other studies are appearing in this area. In this way, there are several models of multi-objective GA, and they can derive good Pareto-optimal solutions. However, many iterations are needed to calculate the values of the objective functions and the constraints, which leads to a high calculation cost. One solution to this problem is to perform the GA on parallel computers. Moreover, a multi-objective GA needs more memory, because many search points are necessary when there are many objective functions.

Parallelization of Genetic Algorithms. ・Master-slave model: global parallelization (only evaluation is done in parallel). ・Distributed GA model: island model (free topology). ・Cellular GA model: neighborhood model (mainly grid topology).
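As a small illustration of the master-slave model listed above (a sketch of ours, not from the slides), only the fitness evaluations are farmed out to worker processes, while the master keeps the genetic operations sequential; `evaluate` is assumed to be an expensive, pure function.

```python
from multiprocessing import Pool

def evaluate(ind):
    # Stand-in for an expensive objective-function evaluation.
    return sum(x * x for x in ind)

def master_slave_evaluate(population, workers=4):
    # Master-slave (global) parallelization: selection, crossover, and
    # mutation stay on the master; only evaluation runs in parallel.
    with Pool(workers) as pool:
        return pool.map(evaluate, population)

if __name__ == "__main__":
    pop = [[i * 0.1, -i * 0.2] for i in range(8)]
    print(master_slave_evaluate(pop))
```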

DGA model. [Figure: in a DGA, each island searches the same region of the (f1, f2) objective space.] ・Cannot perform an efficient search. ・Needs a big population size in each island. Now I'd like to talk about parallel EMO. Regardless of whether the GA is single- or multi-objective, most parallel GAs are DGAs. In this model, the population is divided into several subpopulations, and an SGA is performed on each subpopulation; occasionally, individuals are exchanged through the migration operation. As some researchers have shown, this model can obtain better solutions than an SGA. We also proposed a new parallel model for multi-objective GA.
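The island-model mechanics described above (independent subpopulations plus periodic migration) might be sketched as follows; this is our own minimal illustration, with `sga_step` standing in for one generation of a simple GA on an island and a ring migration topology assumed.

```python
import random

def fitness(ind):
    # Placeholder objective (minimization).
    return sum(x * x for x in ind)

def sga_step(island):
    # Stand-in for one generation of an SGA on one island (selection,
    # crossover, mutation); here just a tiny random perturbation.
    return [[x + random.gauss(0, 0.05) for x in ind] for ind in island]

def migrate(islands, migration_rate=0.2):
    # Ring topology: each island sends its best individuals to the next
    # island, where they replace that island's worst individuals.
    n_mig = max(1, int(len(islands[0]) * migration_rate))
    best = [sorted(isl, key=fitness)[:n_mig] for isl in islands]
    for i, isl in enumerate(islands):
        isl.sort(key=fitness)
        isl[-n_mig:] = best[i - 1]   # island i receives from island i-1

islands = [[[random.uniform(-1, 1) for _ in range(3)] for _ in range(10)]
           for _ in range(4)]
for gen in range(1, 51):
    islands = [sga_step(isl) for isl in islands]
    if gen % 5 == 0:                 # migration interval = 5, as in this study
        migrate(islands)
```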

Divided Range Multi-Objective GA (1). [Figure: the population is divided into Division 1 and Division 2 by the value of the focused objective function f1, between its minimum and maximum over the Pareto-optimal solutions.] Algorithm: 1st, the individuals are sorted by the values of the focused objective function; 2nd, the N/m individuals are chosen in sequence; 3rd, an SGA is performed on each subpopulation; 4th, after some generations, the algorithm returns to the first step. This model is called the Divided Range Multi-Objective GA (DRMOGA). DRMOGA is one of the divided-population models: a population is divided into subpopulations. The figure shows the concept of DRMOGA with two objective functions, where the individuals are divided into two groups by the value of the focused objective function f1. The algorithm follows the four steps above; as a result, there exist m subpopulations. The most important point is that the search domain is different in each subpopulation.
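The four steps above can be written down directly; the following sketch (ours, assuming minimization and a list of objective vectors aligned with the population) performs the divided-range split for one focused objective.

```python
def divided_range_split(population, objectives, m, focused=0):
    """Split the population into m subpopulations by the focused objective.
    population: list of individuals; objectives: matching list of objective
    vectors; focused: index of the objective function used for sorting."""
    order = sorted(range(len(population)),
                   key=lambda i: objectives[i][focused])   # step 1: sort
    size = len(population) // m                            # N/m per island
    return [[population[i] for i in order[k * size:(k + 1) * size]]
            for k in range(m)]                             # step 2: slice

# Steps 3 and 4: run an SGA on each returned subpopulation for the sort
# interval (5 generations in this study), then gather all individuals and
# split again, typically cycling the focused objective.
```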

Divided Range Multi-Objective GA (2). [Figure: in a DGA (island model), every subpopulation searches the same region of the (f1, f2) space; in DRMOGA, each subpopulation searches a different range of the focused objective.] This figure compares the searches of DGA and DRMOGA. As you can see, the subpopulations of a DGA search the same feasible domain, so an efficient search cannot be performed. On the other hand, the subpopulations of DRMOGA are determined by the value of the focused objective function, so the search areas of the subpopulations do not overlap, and we think DRMOGA can maintain high diversity.

Configuration of GA (1). Expression of genes: real-valued vector, e.g. a1 = {0.02, 10.03, …, 7.52}. Crossover: Center Neighborhood Crossover. Selection: rank-1 selection with sharing / roulette selection / roulette selection + sharing. Mutation: none. Terminal condition: when the movement of the Pareto frontier is very small. Now I'd like to explain the configuration of the GA. This figure shows a coding example. In the block packing method, a chromosome has two kinds of information: the block number and the direction of the block. The table shows the genetic operations we selected. For selection, we chose the Pareto reservation strategy, in which all rank-1 individuals are preserved. For crossover, the PMX method is used; PMX was originally developed for traveling salesman problems. For mutation, the 2-bit substitution method is used: two arbitrary bits are selected and substituted.
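The "sharing" that appears in the selection schemes above is standard fitness sharing; the sketch below (ours, not from the slides) degrades the fitness of individuals that have many close neighbors in objective space, with `sigma_share` as an assumed niche radius and maximization of the raw fitness assumed.

```python
import math

def shared_fitness(objs, raw_fitness, sigma_share=0.1):
    """Fitness sharing: divide raw fitness by the niche count so that
    crowded regions of objective space are de-emphasized."""
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    shared = []
    for i, me in enumerate(objs):
        # Triangular sharing function: sh(d) = 1 - d/sigma for d < sigma.
        niche = sum(max(0.0, 1.0 - dist(me, other) / sigma_share)
                    for other in objs)
        shared.append(raw_fitness[i] / niche)  # niche >= 1 (includes self)
    return shared
```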

Configuration of GA (2) ・Center Neighborhood Crossover (CNX). 1st: N+1 parent individuals are selected randomly. 2nd: the center of gravity is derived as rg = (1/(N+1)) Σ ri. 3rd: a new individual is generated around the center of gravity: rchild = rg + Σ ti ei, with σi = α |ri − rg|. [Figure: the N+1 parents (x1, y1), (x2, y2), (x3, y3), …, their center of gravity (×), and a generated child.]
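A possible reading of CNX as code, under our own interpretation of the formulas above: we assume ti is drawn from N(0, σi²) with σi = α|ri − rg|, and ei is the unit vector from the center of gravity toward parent i. This is a hedged sketch, not the authors' implementation.

```python
import math
import random

def cnx(parents, alpha=3.0):
    """Center Neighborhood Crossover sketch: child = r_g + sum_i t_i * e_i,
    t_i ~ N(0, sigma_i^2), sigma_i = alpha * |r_i - r_g| (assumptions ours)."""
    n_vars = len(parents[0])
    # Center of gravity of the N+1 parents.
    rg = [sum(p[j] for p in parents) / len(parents) for j in range(n_vars)]
    child = list(rg)
    for p in parents:
        diff = [p[j] - rg[j] for j in range(n_vars)]
        dist = math.sqrt(sum(d * d for d in diff))
        if dist == 0.0:
            continue
        e = [d / dist for d in diff]          # unit vector toward parent p
        t = random.gauss(0.0, alpha * dist)   # sigma_i = alpha * |r_i - r_g|
        child = [c + t * ej for c, ej in zip(child, e)]
    return child

print(cnx([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0]], alpha=3.0))
```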

Configuration of GA (3) ・Normal distribution. [Figure: the normal distributions used for child generation with α = 3 and α = 6.]

Parameters (1). Application models: SGA, DGA, DRMOGA. GA parameters:

  Parameter                            SGA    DGA・DRMOGA
  Population size                      100    500
  Crossover rate                       1.0    1.0
  Mutation rate                        (not given in the transcript)
  Migration interval (sort interval)   —      5
  Migration rate                       —      0.2

We applied SGA, DGA, and DRMOGA to the problems. To investigate the characteristics of the three models, we used two layout problems, with thirteen and twenty-seven blocks. A table (not reproduced here) shows the width and length of each block in the 13-block problem. The parameters of the GA are shown in the table above.

Parameters (2). Cases:

  Case    α    Selection method
  Case1   3    Rank-1 (Pareto) selection with sharing
  Case2   6    Rank-1 (Pareto) selection with sharing
  Case3   3    Roulette selection only
  Case4   6    Roulette selection only
  Case5   3    Roulette selection with sharing
  Case6   6    Roulette selection with sharing

Metrics - Evaluation methods - ・Number of Pareto optimum individuals obtained. ・Error (smaller values are better, E > 0). ・Cover rate (index of diversity, 0 < C < 1). ・Generations (smaller values are better).

Cover rate. [Figure: the range of each objective function is divided into 10 intervals; the cover rate counts how many intervals contain obtained solutions.] Example: cover rate(f1) = 8/10 = 0.8; cover rate(f2) = 9/10 = 0.9; overall cover rate = (0.8 + 0.9) / 2 = 0.85.
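Under the definition suggested by the worked example above (divide each objective's observed range into 10 intervals, take the fraction that contain at least one solution, then average over objectives), the cover rate might be computed as follows; this sketch is ours, not from the slides.

```python
def cover_rate(objs, n_bins=10):
    """Average, over objectives, of the fraction of intervals of the
    objective's range that contain at least one obtained solution."""
    n_obj = len(objs[0])
    rates = []
    for j in range(n_obj):
        vals = [o[j] for o in objs]
        lo, hi = min(vals), max(vals)
        width = (hi - lo) / n_bins or 1.0   # guard against a degenerate range
        filled = {min(int((v - lo) / width), n_bins - 1) for v in vals}
        rates.append(len(filled) / n_bins)
    return sum(rates) / n_obj

print(cover_rate([(0.1 * i, 1 - 0.1 * i) for i in range(10)]))  # -> 1.0
```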

Cluster system for calculation. Spec. of cluster (5 nodes):

  Processor       Pentium II (Deschutes)
  Clock           400 MHz
  # Processors    1 × 5
  Main memory     128 MB × 5
  Network         Fast Ethernet (100 Mbps)
  Communication   TCP/IP, MPICH 1.1.2
  OS              Linux 2.2.10
  Compiler        gcc (egcs-2.91.61)

The numerical examples were performed on this PC cluster system; the table shows its specification.

Numerical examples. Test problems taken from: Tamaki et al. (1995); Veldhuizen and Lamont (1999); K. Deb (1999).

Example 1. Objective functions and constraints. [The formulas appear only as images on the original slide and are not recoverable from the transcript.]

Example 2. Objective functions and constraints (three objectives: f1, f2, f3). [Formulas shown as images on the original slide.]

Example 3. Objective functions f1, f2. [The formulas and the plot of f1 versus x1 are shown as images on the original slide.]

Example 4. Objective functions f1, f2, with g(x2, …, xN) = 1 + 10(N − 1) + Σ (…). [The summand of the Σ and the remaining formulas are shown as images on the original slide and are not recoverable from the transcript.]

Results (Example 1). [Figure: solutions in the (f1, f2) plane obtained by DGA (Case 5) and DRMOGA (Case 5).]

Results (Example 1):

  Model    Case    Number of solutions   Error   Cover rate   Generations
  Simple   Case1   436                   0.00    1.00          799
           Case2   382                   0.03    1.00         1000
           Case3   471                   0.00    1.00           35
           Case4   444                   0.00    1.00          367
           Case5   461                   0.00    1.00           39
           Case6   330                   0.00    1.00         1000
  Island   Case1   436                   0.01    1.00           43
           Case2   438                   0.01    1.00           59
           Case3   423                   0.01    1.00          273
           Case4   435                   0.01    1.00           44
           Case5   431                   0.01    1.00           66
           Case6   404                   0.01    1.00          927
  DR       Case1   500                   0.00    1.00           40
           Case2   500                   0.00    1.00           48
           Case3   494                   0.00    1.00          105
           Case4   494                   0.00    1.00          548
           Case5   495                   0.00    1.00          199
           Case6   494                   0.00    1.00          814

Results (Example 2). [Figure: solutions obtained by DGA (Case 1) and DRMOGA (Case 1) in objective space.]

Results (Example 2). [Figure: solutions obtained by DGA (Case 1) and DRMOGA (Case 1) in the (x1, x2) design-variable space.]

Results (Example 2):

  Model    Case    Number of solutions   Cover rate   Generations
  Simple   Case1   500                   0.75           15
           Case2   500                   0.74           18
           Case3   491                   0.51           19
           Case4   485                   0.50           30
           Case5   316                   0.48           19
           Case6   207                   0.47          198
  Island   Case1   428                   0.79           19
           Case2   426                   0.79           36
           Case3   434                   0.76           22
           Case4   403                   0.77           55
           Case5     6                   0.04         1000
           Case6   125                   0.43          943
  DR       Case1   386                   0.95           44
           Case2   330                   0.96          256
           Case3   429                   0.92           82
           Case4   255                   0.85          277
           Case5   337                   0.88           66
           Case6    90                   0.53          117

Results (Example 3). [Figure: solutions in the (f1, f2) plane obtained by DGA (Case 4) and DRMOGA (Case 4).]

Results (Example 3). [Figure: f1 versus x1 plots for DGA (Case 4) and DRMOGA (Case 4).]

Results (Example 3):

  Model    Case    Number of solutions   Error   Cover rate   Generations
  Simple   Case1   500                   7.21    0.41          394
           Case2   456                   5.92    0.32          612
           Case3   469                   3.75    0.14         1000
           Case4   374                   2.37    0.47         1000
           Case5   482                   3.84    0.48         1000
           Case6   423                   2.60    0.43         1000
  Island   Case1   345                   6.41    0.46          570
           Case2   322                   5.88    0.48          919
           Case3   301                   3.70    0.22         1000
           Case4   220                   2.60    0.35         1000
           Case5   283                   3.39    0.43         1000
           Case6   240                   2.41    0.31         1000
  DR       Case1   412                   6.87    0.38          533
           Case2   363                   5.38    0.28          774
           Case3   425                   4.53    0.40          780
           Case4   293                   0.01    0.99         1000
           Case5   393                   3.92    0.41          692
           Case6   254                   0.14    0.94          971

Results (Example 3). [Figure: DRMOGA (Case 4) with object sharing versus DRMOGA (Case 4) with plan sharing.]

Results (Example 3). [Figure: f(x1) versus x1 for DRMOGA (Case 4) with object sharing and with plan sharing.]

Results (Example 4). [Figure: solutions in the (f1, f2) plane obtained by DGA (Case 6) and DRMOGA (Case 6).]

Results (Example 4). [Figure: f1 versus x1 plots for DGA (Case 6) and DRMOGA (Case 6).]

Results (Example 4):

  Model    Case    Number of solutions   Error   Cover rate   Generations
  Simple   Case1   500                   1.70    0.31          209
           Case2   500                   1.71    0.38          358
           Case3   470                   0.32    0.22         1000
           Case4   477                   0.03    0.40         1000
           Case5   492                   0.34    0.58          855
           Case6   493                   0.08    0.60          899
  Island   Case1   385                   1.89    0.40          333
           Case2   409                   1.75    0.53          403
           Case3   304                   0.31    0.33         1000
           Case4   361                   0.24    0.46         1000
           Case5   376                   0.27    0.60         1000
           Case6   365                   0.25    0.60         1000
  DR       Case1   494                   1.93    0.37          212
           Case2   457                   3.10    0.34           54
           Case3   460                   0.39    0.30          262
           Case4   451                   0.03    0.52          387
           Case5   442                   0.39    0.47          291
           Case6   402                   0.07    0.61          654

Conclusions. In this study, we introduced a new model of genetic algorithm for multi-objective optimization problems: the Divided Range Multi-Objective Genetic Algorithm (DRMOGA). DRMOGA is a model that ・is suitable for parallel processing; ・can derive solutions in a short time; ・can derive solutions with high accuracy; ・can sometimes derive better solutions than the single island model.

Results of the 27-blocks case. [Figure: layouts obtained by SGA and DGA for the 27-block problem.] These are the results of DGA and SGA; they often obtained only weak Pareto solutions.