A Hybrid Optimization Approach for Global Exploration
Academic Year 2005, No. 713, Satoru HIWA, Intelligent Systems Design Laboratory

Optimization  Optimization problem consists of: -Objective function: we want to minimize or maximize. -Design variables: affect the objective function value. -Constraints: allow the design variables to take on certain values but exclude others. Mathematical discipline that concerns the finding of minima or maxima of functions, subject to constraints Real-world applications  Optimization techniques have been applied to various real- world problems. e.g.)Structural design Electric device design

Problem Solving by Optimization
- There are many good optimization algorithms, and each method has its own characteristics.
  - It is difficult to choose the best method for a given optimization problem.
- It is important to select and apply appropriate algorithms according to the complexity of the problem.
- It is hard to solve a complicated problem with only one algorithm.
- A hybrid optimization approach, which combines multiple optimization algorithms, is therefore necessary.
Purpose of the research: to develop an efficient hybrid optimization algorithm.

Hybrid Optimization Approach
Goal: to develop an efficient hybrid optimization algorithm.
- A hybrid optimization algorithm can provide performance that cannot be achieved with a single algorithm.
- We have to determine what kinds of solutions are required.
- The desired solutions may vary depending on the user:
  - One user may require a better result within a reasonable time.
  - Another may want not only the optimum, but also information about the landscape.
- Optimization strategy: first, how the optimization process is performed should be determined.

Optimization Strategy
The proposed strategy: to explore the search space uniformly and equally.
- By following this strategy, we can obtain not only the optimum point, but also information about the landscape.
- Many optimization algorithms are designed only to derive an optimum.
Why is the strategy needed?

Why is the Strategy Needed?
- When we solve real-world optimization problems:
  - Usually, the landscape and the optimum are unknown.
  - In this case, the obtained results should be reliable.
- Genetic Algorithms (GAs) are powerful techniques for obtaining the global optimum.
  - Probabilistic algorithms inspired by evolutionary biology.
- Example of optimization by GAs:
  - When the true landscape is unknown and unexplored areas remain, the result is not reliable: is the real optimum in the unexplored area? The strategy is not achieved by GAs alone.
  - When the search space is explored uniformly and equally, the landscape is grasped and the reliability of the result can be evaluated: the strategy is achieved.

Optimization Algorithms
- The strategy is not achieved by GAs alone.
- Another algorithm that provides a more global search is needed.
- However, a globally intensified search converges slowly compared to GAs or local search algorithms: much time is consumed in exploring the entire search space.
- There is a trade-off between search breadth and convergence rate, so it is necessary to balance global and local search.
- Algorithms used:
  - GAs
  - DIRECT: explores the search space globally.
  - SQP: a fast-converging local search method.
- Both global and local search algorithms are hybridized.

DIRECT  Deterministic, global optimization algorithm  Its name comes from ‘DIviding RECTangles’. -Search space is considered to be a hyper-rectangle (box). -Each box is trisected in each dimension. -Center point of each box is sampled as solution.  Boxes to be divided -are mathematically guaranteed to be promising. -are called ‘potentially optimal boxes.’

Characteristics of the DIRECT search  Potentially optimal boxes potentially contain a better value than any other box.  DIRECT divides the potentially optimal boxes at each iteration.

Characteristics of the DIRECT search
- Example: 2-dimensional Schwefel function.
  - Several local optima exist far from the global optimum.
  - DIRECT explores the search space uniformly and equally.
  - DIRECT also detects the promising area.
[Figure: 2-D Schwefel function landscape showing the global optimum and the distant local optima]

Genetic Algorithms (GAs)
- Heuristic algorithms inspired by evolutionary biology.
  - Solutions are called 'individuals', and genetic operators (crossover, selection, mutation) are applied to them.
- Real-coded GAs
  - Individuals are represented by a real-number vector.
- Although GAs are global optimization algorithms, their search breadth is narrower than DIRECT's.
- GAs are therefore used as a more locally intensified search than DIRECT.
[Figure: parent individuals producing child individuals]

Sequential Quadratic Programming (SQP)
- Gradient-based local search algorithm.
  - One of the most efficient methods in nonlinear programming.
  - By using gradient information, SQP rapidly converges to the optimum.
- Advantage: high convergence speed.
- Disadvantage: SQP is often trapped in local optima on problems that have many local optima.
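As a concrete illustration (not from the original study), SciPy's SLSQP routine, an SQP-type solver, shows the typical behavior described above; the objective and starting point here are hypothetical:

```python
import numpy as np
from scipy.optimize import minimize

# Hypothetical smooth objective: a simple quadratic bowl with minimum at x = 1.
def objective(x):
    return float(np.sum((x - 1.0) ** 2))

x0 = np.zeros(4)                    # starting point
bounds = [(-5.0, 5.0)] * 4          # bounds on the design variables

# SLSQP solves a quadratic subproblem per iteration using gradient information,
# so it converges quickly near an optimum but can stall in a local optimum
# on multimodal problems.
result = minimize(objective, x0, method="SLSQP", bounds=bounds)
print(result.x, result.fun)
```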

Idea of the Proposed Hybrid Optimization Approach
- Global exploration by DIRECT.
- Locally intensified search by GAs.
- Fine tuning by SQP, leading to the optimum.
Procedure of the proposed algorithm:
1. Perform the DIRECT search.
2. Execute GAs.
3. Improve the best solution obtained in the GA search by SQP.
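A minimal sketch of this three-stage pipeline (illustrative only; the DIRECT and GA stages are replaced by simple stand-ins so the code runs, and SciPy's SLSQP stands in for the SQP stage):

```python
import numpy as np
from scipy.optimize import minimize

def direct_stage_stub(f, bounds, n_points=50, seed=0):
    # Stand-in for the DIRECT stage: uniform random sampling only.
    # The real algorithm would return the centers of the potentially optimal boxes.
    rng = np.random.default_rng(seed)
    lo, hi = np.array(bounds, float).T
    return rng.uniform(lo, hi, size=(n_points, len(bounds)))

def ga_stage_stub(f, population):
    # Stand-in for the GA stage: simply return the best seeded point.
    # The real algorithm would evolve the population with crossover, selection, etc.
    return min(population, key=f)

def hybrid_optimize(f, bounds):
    seeds = direct_stage_stub(f, bounds)                      # 1. global exploration (DIRECT)
    best = ga_stage_stub(f, seeds)                            # 2. locally intensified search (GAs)
    res = minimize(f, best, method="SLSQP", bounds=bounds)    # 3. fine tuning (SQP)
    return res.x, res.fun
```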

How to Combine DIRECT and GAs
- GAs utilize the center points of the potentially optimal boxes in DIRECT as their individuals: DIRECT stops, then GAs start.
- Number of potentially optimal boxes = number of individuals.
  - The number of potentially optimal boxes differs at each iteration.
  - The number of individuals is determined according to the complexity of the problem (e.g., in an N-dimensional space, N×10 individuals are recommended).
- The number of potentially optimal boxes should therefore be adjusted to match the number of individuals.

How to Combine DIRECT and GAs  If the number of potentially optimal is smaller than Ni, randomly generated individuals are added.  If the number of potentially optimal is larger than Ni, a certain number of potentially optimal boxes are selected. Box selection rules are proposed and applied. Ni: Number of individuals in GAs

Box Selection Rules for DIRECT
Idea of selecting the boxes to be divided:
- DIRECT sometimes performs a local refinement.
- In the hybrid optimization, it is not necessary for DIRECT to perform a locally intensified search.
- The proposed rules reduce the number of crowded boxes:
  - The distance from the box with the best function value is calculated.
  - A certain number of boxes far from the best point are selected.
  - The rules are applied at each iteration of the DIRECT search.
As a result, potentially optimal boxes near the best point are discarded and a locally biased search is prevented: the number of potentially optimal boxes is reduced without breaking the global search characteristics of DIRECT.

Experiments
Verification of the effectiveness of the hybrid approach:
- A numerical example is shown to verify whether the proposed method achieves the proposed strategy, i.e., to explore the search space uniformly and equally.
- The proposed hybrid optimization algorithm is applied to the benchmark problem and compared to the search by GAs alone.
Target problem: 10-dimensional Schwefel function.
- Many local optima exist.
- The function value of the global optimum is zero.
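For reference, a commonly used form of the Schwefel benchmark, consistent with the statement that the global optimum has value zero (the slides do not state which variant was used, so this exact form is an assumption):

```latex
f(\mathbf{x}) = 418.9829\,n - \sum_{i=1}^{n} x_i \sin\!\left(\sqrt{|x_i|}\right),
\qquad -500 \le x_i \le 500,
```

with the global minimum f(x*) = 0 attained near x_i ≈ 420.97 for all i; here n = 10.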

Results and Discussions: Searching Ability (average of 30 runs)
- The average function value and the average number of function evaluations are shown below.
- The proposed hybrid algorithm obtains a better function value than GAs alone, with fewer function evaluations.

                         Hybrid        GAs
  Function value         9.07×10^…     …×10^2
  Function evaluations   129,373       279,703

Results and Discussions
To see whether the proposed strategy is achieved:
- The search histories of DIRECT and GAs in the hybrid algorithm are checked.
- The history in the 10-dimensional space is projected onto 2-dimensional planes: (x1, x2, …, x10) → (x1, x2), (x1, x3), …
- Although 45 such plots exist (one for each pair of variables), 4 typical examples are picked.

Search History of DIRECT
[Figure: DIRECT search history projected onto the (x1, x2), (x2, x5), (x3, x6), and (x7, x9) planes]

Search Histories of DIRECT and GAs
[Figure: combined search histories of DIRECT and GAs on the same four projections]

The proposed strategy is achieved.

Conclusions
- An 'optimization strategy' is proposed: to explore the search space uniformly and equally.
- A hybrid optimization approach is proposed; the optimization algorithms used for the strategy are DIRECT, GAs, and SQP.
- Modification to DIRECT: box selection rules are proposed and applied.
- Hybrid optimization algorithm:
  - It achieved the proposed strategy.
  - It provided more efficient performance than the search by GAs alone.

Paper List
International conference proceedings:
- Mitsunori Miki, Satoru Hiwa, Tomoyuki Hiroyasu, "Simulated Annealing using an Adaptive Search Vector," Proceedings of the IEEE International Conference on Cybernetics and Intelligent Systems 2006 (Bangkok, Thailand).
Journal article:
- Mitsunori Miki, Satoru Hiwa, Tomoyuki Hiroyasu, "A Fundamental Study of a Color-Mixing Lighting System Using LEDs," The Science and Engineering Review of Doshisha University, Vol. 46, No. 3, pp. 9-18, 2005.
Oral presentations (in Japan):
- Satoru Hiwa, Tomoyuki Hiroyasu, Mitsunori Miki, "Dynamic Control of Multiple Optimization Methods for Global Optimization," JSME 7th Optimization Symposium, 2006.
- Satoru Hiwa, Tomoyuki Hiroyasu, Mitsunori Miki, "Dynamic Control of Multiple Optimization Methods for Global Optimization," JSME 6th Design and Systems Division Conference, 2006.
- Mitsunori Miki, Satoru Hiwa, Tomoyuki Hiroyasu, "Simulated Annealing with an Adaptive Search Vector," JSME 8th Computational Mechanics Conference, 2005.

Lipschitzian Optimization [Shubert 1972]
- It requires the user to specify the Lipschitz constant K.
- K is used as a prediction of the maximum possible slope of the objective function over the global domain.
[Figure: piecewise-linear lower bounds on [a, b], built from lines of slope −K and +K through the sampled points x1, x2, x3]
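In equation form, the bound the figure illustrates is the standard Lipschitz lower bound (not quoted verbatim from the slides): if f has Lipschitz constant K, every sampled point x_k gives

```latex
f(x) \;\ge\; f(x_k) - K\,\lvert x - x_k \rvert \qquad \text{for all } x \in [a, b],
```

so the lower envelope formed by lines of slope −K and +K through the sampled points is piecewise linear, and Shubert's method samples next where this envelope is lowest.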

DIRECT (one-dimensional)
[Figure: one-dimensional DIRECT on [a, b]; the interval is trisected into Boxes 1-3 and then further into Boxes 4 and 5, and the boxes are compared using lines of slope K, K1, and K2 through their center points]
- If box i is potentially optimal, then f(c_i) <= f(c_j) for all boxes j of the same size as box i.
- Among the largest boxes, the box with the best function value is potentially optimal.

DIRECT: Potentially Optimal Boxes
Identification of potentially optimal boxes:
- DIRECT divides all potentially optimal boxes.
- Let c_j be the center point of box j, and d_j the distance from the center point to its vertices.
- A hyper-box j is potentially optimal if there exists some constant K > 0 such that
  f(c_j) − K d_j <= f(c_i) − K d_i for all boxes i, and
  f(c_j) − K d_j <= f_min − ε |f_min|,
  where f_min is the best function value found so far and ε is a small positive parameter.

DIRECT: Potentially Optimal Boxes (graphical identification)
- DIRECT divides all potentially optimal boxes.
- For each box j in the search space, take its center point c_j and its center-to-vertex distance d_j, and plot the point (d_j, f(c_j)).
- Add the point (0, f_min − ε|f_min|).
- Construct the convex hull that contains all points.
- Boxes on the lower part of the convex hull are selected as potentially optimal.
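A minimal sketch of this identification step, assuming each box is summarized by its center value f(c_j) and its center-to-vertex distance d_j; it applies the rate-of-change criterion stated above rather than the convex-hull construction, and is illustrative rather than the authors' implementation:

```python
import numpy as np

def potentially_optimal(d, f, eps=1e-4):
    """Indices of potentially optimal boxes.

    d[j]: center-to-vertex distance of box j
    f[j]: objective value at the center of box j
    Box j is selected if some K > 0 satisfies
      f[j] - K*d[j] <= f[i] - K*d[i] for all i, and
      f[j] - K*d[j] <= f_min - eps*|f_min|.
    """
    d, f = np.asarray(d, float), np.asarray(f, float)
    f_min = f.min()
    selected = []
    for j in range(len(d)):
        same = np.isclose(d, d[j])
        if f[j] > f[same].min():          # a same-size box has a better value
            continue
        smaller, larger = d < d[j], d > d[j]
        k_lo = 0.0                        # lower bound on the feasible K
        if smaller.any():
            k_lo = max(k_lo, np.max((f[j] - f[smaller]) / (d[j] - d[smaller])))
        if d[j] > 0:
            k_lo = max(k_lo, (f[j] - f_min + eps * abs(f_min)) / d[j])
        k_hi = np.inf                     # upper bound on the feasible K
        if larger.any():
            k_hi = min(k_hi, np.min((f[larger] - f[j]) / (d[larger] - d[j])))
        if k_lo <= k_hi:
            selected.append(j)
    return selected
```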

Genetic Algorithms (GAs)  Global search algorithm inspired by evolutionary biology. -Solutions are called ‘individuals’, and genetic operators (Crossover, Selection, Mutation) are applied.  Real-Coded GAs (RCGAs) -Individuals are represented by real number vector. -Crossover operator significantly affects the searching ability.  Simplex Crossover (SPX) -One of the efficient crossover operator for RCGAs. -Generates offspring in a simplex, which is formed by n+1 individuals in n-dimensional space  RCGAs using the SPX operator -has both global and local search characteristics. RCGAs using the SPX operator are used.

GAs and SQP
GAs (Genetic Algorithms)
- Heuristic algorithms inspired by evolutionary biology.
- Solutions are called 'individuals', and genetic operators (crossover, selection, mutation) are applied to them.
SQP (Sequential Quadratic Programming)
- Gradient-based local search algorithm.
- By using gradient information, SQP rapidly converges to the optimum.
[Figure: parent individuals producing child individuals]

Stopping Criterion  is terminated when the size of the best potentially optimal box is less than certain value prescribed.  A certain depth of search space exploration is obtained. DIRECT  are terminated when their individuals converged.  Spread of the individuals in design variable space: x max – x min < threshold GAs SQP  continues its search until the improvement of solution becomes a minute value.

Stopping Criterion (DIRECT)
- DIRECT is terminated when the longest side length of the best potentially optimal box is less than a prescribed threshold.
- A certain depth of search-space exploration is thereby obtained.

Stopping Criterion (GAs)
- GAs are terminated when their individuals have converged.
- Spread of the individuals in design-variable space: Spread_i = x_max,i − x_min,i, where
  x_max,i is the maximum value of the i-th design variable over all individuals, and
  x_min,i is the minimum value of the i-th design variable over all individuals.
- If Spread_i is smaller than a given coefficient × the feasible range for all dimensions, GAs are terminated (a sketch of this check follows below).
[Figure: a converged population, with Spread_1 and Spread_2 indicated]
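A minimal sketch of this convergence test, assuming the population is stored as an array of shape (population_size, n); the coefficient is a user-chosen parameter, since its value did not survive in the transcript:

```python
import numpy as np

def ga_converged(population, bounds, coeff=1e-3):
    """GA stopping test: the spread of every design variable is small enough.

    population: array (pop_size, n); bounds: list of (low, high) per dimension.
    coeff: fraction of the feasible range used as the threshold (assumed value).
    """
    pop = np.asarray(population, float)
    lo, hi = np.array(bounds, float).T
    spread = pop.max(axis=0) - pop.min(axis=0)       # Spread_i for each dimension
    return bool(np.all(spread < coeff * (hi - lo)))
```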

Results (of each algorithm)
[Table: average and standard deviation of the function value and of the number of evaluations, for DIRECT (52 potentially optimal boxes), GAs (100 individuals), SQP, the Hybrid algorithm, and GAs only (100 individuals)]

How to Combine DIRECT and GAs  GAs utilize the center points of the potentially optimal boxes in DIRECT as their individuals.  If N po > N ind -Box selection rules are applied.  If N po < N ind -Randomly generated individuals are added to GAs. DIRECT stopped. GAs start.

Modification to DIRECT: Box Selection Rules
1. Select two boxes from the set of potentially optimal boxes: the one with the smallest size and the one with the largest.
2. For each remaining box, calculate its distances from these two boxes.
3. Sort the boxes by distance in descending order, and select N boxes from the top (see the sketch below).
Potentially optimal boxes near the two reference boxes are discarded, and a locally biased search is prevented: the number of potentially optimal boxes is reduced without breaking the global search characteristics of DIRECT.
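A minimal sketch of this rule, assuming each potentially optimal box is summarized by its center point and a scalar size, and that the distance to the two reference boxes is the Euclidean distance between centers (an assumption; the slides do not specify the measure):

```python
import numpy as np

def select_boxes(centers, sizes, n_keep):
    """Box selection rule: keep the N boxes farthest from the smallest and
    largest potentially optimal boxes.

    centers: array (n_boxes, dim) of box center points.
    sizes:   array (n_boxes,) of box sizes.
    n_keep:  number of boxes to keep (N).
    """
    centers = np.asarray(centers, float)
    sizes = np.asarray(sizes, float)
    ref_small = centers[np.argmin(sizes)]   # smallest potentially optimal box
    ref_large = centers[np.argmax(sizes)]   # largest potentially optimal box
    # Combined distance of every center to the two reference boxes.
    dist = (np.linalg.norm(centers - ref_small, axis=1)
            + np.linalg.norm(centers - ref_large, axis=1))
    order = np.argsort(dist)[::-1]          # sort by distance, descending
    return order[:n_keep]                   # indices of the selected boxes
```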

Potentially Optimal Boxes (when DIRECT was terminated)
[Figure: centers of the potentially optimal boxes projected onto the (x1, x2), (x2, x5), (x3, x6), and (x7, x9) planes]

History of the Search by GAs Only
[Figure: GA-only search history projected onto the (x1, x2), (x2, x5), (x3, x6), and (x7, x9) planes]
GAs were trapped in local optima.