Introduction to Scatter Search
ENGG*6140 – Paper Review. Presented by Jason Harris and Stephen Coe, M.Sc. Candidates, University of Guelph.
Outline
Introduction
Scatter Search Template: Diversification Generation Method, Improvement Method, Reference Set Update Method, Subset Generation Method, Solution Combination Method
Example (0-1 Knapsack Problem)
Comparison with GA
Introduction
Evolutionary method
Fundamental concepts were first introduced in the 1970s and are based on formulations from the 1960s
The original proposal was published by Fred Glover in 1977; after that, scatter search received little attention until around 1990
Uses strategies that both diversify and intensify solutions
Solutions are generated using combination strategies as opposed to probabilistic learning approaches
Scatter Search Foundations
Useful information about the solution is typically contained in a diverse collection of elite solutions
Combination strategies incorporate both diversity (extrapolation) and intensification (interpolation)
Combining multiple solutions enhances the opportunity to exploit information contained in the union of elite solutions
Introduction
[Figure: solutions generated by convex (interpolated) and non-convex (extrapolated) combinations of reference solutions A and B.] These solutions are in a raw form; in most scatter search implementations they are subject to heuristic improvement.
Scatter Search Notation
RefSet – reference set of solutions
b – size of the reference set
x_i – the i-th solution in the reference set (x_1 is the best, x_b the worst)
P – set of solutions generated by the diversification generation method
PSize – size of the population of diverse solutions
s – a subset of reference solutions
sSize – size of the subset of reference solutions
Scatter Search Template
[Flow diagram:] Diversification Generation Method → Improvement Method → Reference Set Update Method (producing RefSet) → Subset Generation Method (producing subsets s) → Solution Combination Method → Improvement Method → back to the Reference Set Update Method. The loop stops when no new reference solutions are added.
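The template loop can be sketched in code. This is an illustrative skeleton only, not code from the deck: the diversification, improvement and combination routines are hypothetical callbacks, only Type 1 (pair) subsets are generated, and the reference set is updated by quality alone, omitting the diversity tier for brevity. The toy demo at the end maximizes the number of ones in a 10-bit string.

```python
import itertools
import random

def scatter_search(diversify, improve, combine, objective,
                   psize=10, b=5, max_iter=5, seed=0):
    """Minimal driver: diversification -> improvement -> RefSet update ->
    subset generation -> combination -> improvement, looping until no new
    reference solutions are added."""
    rng = random.Random(seed)
    pop = [improve(s) for s in diversify(psize, rng)]
    # Initial RefSet: the b best solutions (diversity tier omitted here)
    refset = sorted(set(pop), key=objective, reverse=True)[:b]
    for _ in range(max_iter):
        children = []
        for s1, s2 in itertools.combinations(refset, 2):  # Type 1 subsets
            children.extend(improve(c) for c in combine(s1, s2, rng))
        updated = sorted(set(refset) | set(children),
                         key=objective, reverse=True)[:b]
        if set(updated) == set(refset):  # no new reference solutions added
            break
        refset = updated
    return refset

# Toy demo: maximize the number of ones in a 10-bit string.
n = 10
diversify = lambda p, rng: [tuple(rng.randint(0, 1) for _ in range(n))
                            for _ in range(p)]
improve = lambda x: x                      # no-op improvement for the demo
combine = lambda a, b, rng: [tuple(ai if rng.random() < 0.5 else bi
                                   for ai, bi in zip(a, b))]
best = scatter_search(diversify, improve, combine, sum)[0]
```

Because the update step always keeps the best solutions seen so far, the returned best solution is at least as good as the best member of the initial diverse population.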
Diversification Generation Method
The idea behind the diversification generation method is to generate a collection of diverse solutions
The quality of the solutions is not important at this stage
Generation methods are often customized to specific problems
PSize is usually set to the maximum of 100 and 5*b
The method can be totally deterministic or partially random
Improvement Method
Must be able to handle both feasible and infeasible solutions
It is possible to generate multiple instances of the same solution
Generally employs local searches such as those previously introduced in this class (e.g., steepest descent)
This is the only component that is not strictly necessary to implement the scatter search algorithm
Subset Generation Method
Constructs subsets of Type 1, Type 2, Type 3 and Type 4
For a RefSet of size b there are approximately (3b-7)*b/2 subset combinations
The number of subsets can be reduced by considering just one layer of subsets, reducing computational time
Solution Combination Method
Generally problem specific, because it is directly tied to the solution representation
Can generate more than one solution, and can depend on the quality of the solutions being combined
Can also generate infeasible solutions
If a subset was already combined in a previous iteration, the calculation need not be repeated
Reference Set Update Method
The objective is to build a collection of both high-quality and diverse solutions
The number of solutions included in the RefSet is usually less than 20
The RefSet consists of the b1 best solutions from the preceding step (solution combination or diversification generation), plus the b2 solutions that have the largest distance (e.g., Euclidean) from the current RefSet solutions
Multiple techniques are employed to update the reference set (static, dynamic, 2-tier, etc.)
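A two-tier update of this kind can be sketched as follows. This is an illustrative implementation, not the deck's own code: `update_refset`, `b1` and `b2` are our names, and the Hamming distance (number of differing components) stands in for the Euclidean distance, which orders 0-1 vectors identically.

```python
def update_refset(candidates, objective, b1=3, b2=2):
    """Two-tier reference set update: keep the b1 best solutions, then add
    the b2 solutions maximizing the minimum distance to the set so far."""
    pool = sorted(set(candidates), key=objective, reverse=True)
    refset, rest = pool[:b1], pool[b1:]

    def distance(a, b):  # Hamming distance; same ranking as Euclidean on 0-1
        return sum(ai != bi for ai, bi in zip(a, b))

    for _ in range(b2):
        # pick the candidate whose nearest reference solution is farthest
        pick = max(rest, key=lambda c: min(distance(c, r) for r in refset))
        refset.append(pick)
        rest.remove(pick)
    return refset

# Small illustrative call (hypothetical data; objective = number of ones)
rs = update_refset([(1, 1, 1, 1), (1, 1, 1, 0), (1, 1, 0, 0),
                    (1, 0, 0, 0), (0, 0, 0, 0)], sum, b1=2, b2=2)
```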
Example (0-1 Knapsack)
Maximize: 11x1 + 10x2 + 9x3 + 12x4 + 10x5 + 6x6 + 7x7 + 5x8 + 3x9 + 8x10 (coefficients represent the profit of each item)
Subject to: 33x1 + 27x2 + 16x3 + 14x4 + 29x5 + 30x6 + 31x7 + 33x8 + 14x9 + 18x10 ≤ 100 (coefficients represent the weight of each item)
xi ∈ {0,1} for i = 1,…,10
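The instance can be written down directly. In this sketch the data comes from the example above, while the function name `evaluate` is ours, not the slides':

```python
# 0-1 knapsack instance from the example: profits, weights, capacity 100.
PROFITS = [11, 10, 9, 12, 10, 6, 7, 5, 3, 8]
WEIGHTS = [33, 27, 16, 14, 29, 30, 31, 33, 14, 18]
CAPACITY = 100

def evaluate(x):
    """Return (objective value, total weight, feasible?) for a 0-1 vector."""
    value = sum(p * xi for p, xi in zip(PROFITS, x))
    weight = sum(w * xi for w, xi in zip(WEIGHTS, x))
    return value, weight, weight <= CAPACITY
```

Selecting every item, for instance, gives objective 81 at weight 245, which violates the capacity.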
Diversification Generator
Our goal is to generate a diverse population from a random seed: x = (0,0,0,0,0,0,0,0,0,0)
Select h ≤ n−1; the table below uses h = 1,…,5
x′_{1+hk} = 1 − x_{1+hk} for k = 0, 1, …, ⌊(n−1)/h⌋ (positions that are not visited keep the value of the original seed x)
Another set of diverse solutions is generated from the complement of x′: x″_i = 1 − x′_i

h | x′ | x″
1 | (1,1,1,1,1,1,1,1,1,1) | (0,0,0,0,0,0,0,0,0,0)
2 | (1,0,1,0,1,0,1,0,1,0) | (0,1,0,1,0,1,0,1,0,1)
3 | (1,0,0,1,0,0,1,0,0,1) | (0,1,1,0,1,1,0,1,1,0)
4 | (1,0,0,0,1,0,0,0,1,0) | (0,1,1,1,0,1,1,1,0,1)
5 | (1,0,0,0,0,1,0,0,0,0) | (0,1,1,1,1,0,1,1,1,1)
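The generator rule can be reproduced in a few lines (a sketch; the function name is ours):

```python
def diversify(seed, h):
    """Flip every h-th component starting at position 1 (1-based):
    x'_{1+hk} = 1 - x_{1+hk}; unvisited positions keep the seed value.
    x'' is the complement of x'."""
    n = len(seed)
    xp = list(seed)
    for k in range((n - 1) // h + 1):
        xp[h * k] = 1 - xp[h * k]   # 0-based index h*k is 1-based 1 + h*k
    return tuple(xp), tuple(1 - v for v in xp)
```

Running it on the zero seed for h = 1,…,5 reproduces the table above.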
Diversification Generator
Solution | Trial Solution | Objective Value | Weight
1 | (1,1,1,1,1,1,1,1,1,1) | 81 | 245
2 | (1,0,1,0,1,0,1,0,1,0) | 40 | 123
3 | (1,0,0,1,0,0,1,0,0,1) | 38 | 96
4 | (1,0,0,0,1,0,0,0,1,0) | 24 | 76
5 | (1,0,0,0,0,1,0,0,0,0) | 17 | 63
6 | (0,0,0,0,0,0,0,0,0,0) | 0 | 0
7 | (0,1,0,1,0,1,0,1,0,1) | 41 | 122
8 | (0,1,1,0,1,1,0,1,1,0) | 43 | 149
9 | (0,1,1,1,0,1,1,1,0,1) | 57 | 169
10 | (0,1,1,1,1,0,1,1,1,1) | 64 | 182
Improvement Method (applied to trial solution 9)
Profit/weight ratios: x1 = 0.333, x2 = 0.370, x3 = 0.563, x4 = 0.857, x5 = 0.345, x6 = 0.200, x7 = 0.226, x8 = 0.152, x9 = 0.214, x10 = 0.444
Phase 1 – while the solution is infeasible, drop the selected item with the lowest ratio:
Start: (0,1,1,1,0,1,1,1,0,1), Objective = 57, Weight = 169 (infeasible)
x8: 1→0 → (0,1,1,1,0,1,1,0,0,1), Objective = 52, Weight = 136 (infeasible)
x6: 1→0 → (0,1,1,1,0,0,1,0,0,1), Objective = 46, Weight = 106 (infeasible)
x7: 1→0 → (0,1,1,1,0,0,0,0,0,1), Objective = 39, Weight = 75 (feasible)
Phase 2 – try adding unselected items in decreasing ratio order, reverting any addition that violates the weight constraint:
x5: 0→1 → Objective = 49, Weight = 104 (infeasible, revert)
x1: 0→1 → Objective = 50, Weight = 108 (infeasible, revert)
x7: 0→1 → Objective = 46, Weight = 106 (infeasible, revert)
x9: 0→1 → (0,1,1,1,0,0,0,0,1,1), Objective = 42, Weight = 89 (feasible, keep)
x6: 0→1 → Objective = 48, Weight = 119 (infeasible, revert)
Most improved feasible solution of #9: (0,1,1,1,0,0,0,0,1,1), Objective = 42, Weight = 89
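One plausible reading of the drop/add procedure above, sketched in code (the function and variable names are ours, not the slides'): drop the lowest-ratio selected item while infeasible, then attempt unselected items in decreasing ratio order, keeping only additions that stay within capacity.

```python
PROFITS = [11, 10, 9, 12, 10, 6, 7, 5, 3, 8]
WEIGHTS = [33, 27, 16, 14, 29, 30, 31, 33, 14, 18]
CAPACITY = 100
RATIOS = [p / w for p, w in zip(PROFITS, WEIGHTS)]  # 0.333, 0.370, ...

def improve(x):
    """Ratio-based repair/improvement for a 0-1 knapsack solution."""
    x = list(x)
    weight = sum(w * xi for w, xi in zip(WEIGHTS, x))
    # Phase 1: while infeasible, drop the selected item with the lowest ratio
    while weight > CAPACITY:
        i = min((i for i in range(len(x)) if x[i]), key=RATIOS.__getitem__)
        x[i], weight = 0, weight - WEIGHTS[i]
    # Phase 2: try unselected items in decreasing ratio order; keep only
    # additions that stay feasible (equivalent to the add-and-revert steps)
    for i in sorted((i for i in range(len(x)) if not x[i]),
                    key=RATIOS.__getitem__, reverse=True):
        if weight + WEIGHTS[i] <= CAPACITY:
            x[i], weight = 1, weight + WEIGHTS[i]
    return tuple(x)
```

With this rule, trial solutions 1 and 9 both improve to (0,1,1,1,0,0,0,0,1,1) with objective 42, matching the tables.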
Improvement Method
Solution | Trial Solution | Objective | Improved Solution | Objective
1 | (1,1,1,1,1,1,1,1,1,1) | 81 | (0,1,1,1,0,0,0,0,1,1) | 42
2 | (1,0,1,0,1,0,1,0,1,0) | 40 | (1,0,1,1,1,0,0,0,0,0) | 42
3 | (1,0,0,1,0,0,1,0,0,1) | 38 | (1,0,0,1,0,0,1,0,0,1) | 38
4 | (1,0,0,0,1,0,0,0,1,0) | 24 | (1,0,0,1,1,0,0,0,1,0) | 36
5 | (1,0,0,0,0,1,0,0,0,0) | 17 | (1,0,1,1,0,1,0,0,0,0) | 38
6 | (0,0,0,0,0,0,0,0,0,0) | 0 | (0,1,1,1,0,0,0,0,0,1) | 39
7 | (0,1,0,1,0,1,0,1,0,1) | 41 | (0,1,0,1,0,1,0,0,0,1) | 36
8 | (0,1,1,0,1,1,0,1,1,0) | 43 | (0,1,1,1,1,0,0,0,1,0) | 44
9 | (0,1,1,1,0,1,1,1,0,1) | 57 | (0,1,1,1,0,0,0,0,1,1) | 42
10 | (0,1,1,1,1,0,1,1,1,1) | 64 | (0,1,1,1,0,0,0,0,1,1) | 42
Reference Set Update
Solution | Improved Solution | Objective Value
1 | (0,1,1,1,0,0,0,0,1,1) | 42
2 | (1,0,1,1,1,0,0,0,0,0) | 42
3 | (1,0,0,1,0,0,1,0,0,1) | 38
4 | (1,0,0,1,1,0,0,0,1,0) | 36
5 | (1,0,1,1,0,1,0,0,0,0) | 38
6 | (0,1,1,1,0,0,0,0,0,1) | 39
7 | (0,1,0,1,0,1,0,0,0,1) | 36
8 | (0,1,1,1,1,0,0,0,1,0) | 44
(Solutions 9 and 10 improve to the same solution as solution 1, leaving eight distinct candidates.)
Reference Set Update
In the previous slide, the elite solutions were taken as 1, 2 and 8 (the three best objective values: 42, 42 and 44). Diversity is then incorporated into the reference set by adding the candidate solutions that are farthest from these best solutions, measuring distance by the number of differing components.
Example – distance between solution 1 and solution 8:
(0,1,1,1,0,0,0,0,1,1) Solution 1
(0,1,1,1,1,0,0,0,1,0) Solution 8
They differ in two positions, so the distance is 2.
Candidate | d(Sol 1) | d(Sol 2) | d(Sol 8) | Minimum Distance
3 (1,0,0,1,0,0,1,0,0,1) | 5 | 4 | 7 | 4
4 (1,0,0,1,1,0,0,0,1,0) | 5 | 2 | 3 | 2
5 (1,0,1,1,0,1,0,0,0,0) | 5 | 2 | 5 | 2
6 (0,1,1,1,0,0,0,0,0,1) | 1 | 4 | 3 | 1
7 (0,1,0,1,0,1,0,0,0,1) | 3 | 6 | 5 | 3
Candidates 3 and 7 have the largest minimum distances and join the reference set.
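The distance calculations can be reproduced in a few lines (a sketch; the names are ours). `min_dist` gives each candidate's distance to its nearest elite solution; the two largest values identify the diverse additions:

```python
SOL = {1: (0, 1, 1, 1, 0, 0, 0, 0, 1, 1), 2: (1, 0, 1, 1, 1, 0, 0, 0, 0, 0),
       8: (0, 1, 1, 1, 1, 0, 0, 0, 1, 0), 3: (1, 0, 0, 1, 0, 0, 1, 0, 0, 1),
       4: (1, 0, 0, 1, 1, 0, 0, 0, 1, 0), 5: (1, 0, 1, 1, 0, 1, 0, 0, 0, 0),
       6: (0, 1, 1, 1, 0, 0, 0, 0, 0, 1), 7: (0, 1, 0, 1, 0, 1, 0, 0, 0, 1)}
ELITE, CANDIDATES = [1, 2, 8], [3, 4, 5, 6, 7]

def dist(a, b):  # number of differing components
    return sum(ai != bi for ai, bi in zip(a, b))

min_dist = {c: min(dist(SOL[c], SOL[e]) for e in ELITE) for c in CANDIDATES}
# The two candidates with the largest minimum distance join the RefSet
diverse = sorted(CANDIDATES, key=min_dist.get, reverse=True)[:2]
```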
Subset Generation
RefSet | Solution | Objective Value
1 | (0,1,1,1,0,0,0,0,1,1) | 42
2 | (1,0,1,1,1,0,0,0,0,0) | 42
8 | (0,1,1,1,1,0,0,0,1,0) | 44
3 | (1,0,0,1,0,0,1,0,0,1) | 38
7 | (0,1,0,1,0,1,0,0,0,1) | 36
Type 1 subsets: (1,2) (1,8) (1,3) (1,7) (2,8) (2,3) (2,7) (8,3) (8,7) (3,7)
Type 2 subsets: (1,2,8) (1,3,8) (1,7,8) (2,3,8) (2,7,8) (3,7,8)
Type 3 subsets: (1,2,8,3) (1,7,8,2) (3,7,8,1)
Type 4 subsets: (1,2,8,3,7)
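The four subset types above follow a layered pattern: each type extends the subsets of the previous type with the best reference solution not yet included, dropping duplicates. A sketch (our naming; `refset` is passed best-first, here 8, 1, 2, 3, 7 by objective value):

```python
import itertools

def generate_subsets(refset):
    """refset is ordered best-first. Type 1: all pairs; each further type
    adds the best solution not already present, removing duplicates."""
    type1 = [frozenset(p) for p in itertools.combinations(refset, 2)]

    def extend(subsets):
        out = []
        for s in subsets:
            best = next(r for r in refset if r not in s)
            t = s | {best}
            if t not in out:
                out.append(t)
        return out

    type2 = extend(type1)
    type3 = extend(type2)
    type4 = extend(type3)
    return type1, type2, type3, type4
```

For b = 5 this yields 10 + 6 + 3 + 1 = 20 subsets, matching the (3b−7)·b/2 count quoted earlier.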
Solution Combination
Subset (3, 7, 8): each solution is weighted by its objective value, e.g. w3 = 38/118 = 0.322, w7 = 36/118 = 0.305, w8 = 44/118 = 0.373. score(i) is the sum of the weights of the solutions that contain item i:
Sol'n | Weight | x1 | x2 | x3 | x4 | x5 | x6 | x7 | x8 | x9 | x10
3 | 0.322 | 0.322 | | | 0.322 | | | 0.322 | | | 0.322
7 | 0.305 | | 0.305 | | 0.305 | | 0.305 | | | | 0.305
8 | 0.373 | | 0.373 | 0.373 | 0.373 | 0.373 | | | | 0.373 |
Total | | 0.322 | 0.678 | 0.373 | 1.000 | 0.373 | 0.305 | 0.322 | 0.000 | 0.373 | 0.627
Solution Combination
Rounding rule: x′_i = 1 if score(i) > 0.5, and x′_i = 0 if score(i) ≤ 0.5
Only x2 (0.678), x4 (1.000) and x10 (0.627) score above 0.5, so:
x′ = (0,1,0,1,0,0,0,0,0,1), Objective = 30, Weight = 59 (feasible)
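The scores and the combined solution can be reproduced as follows (a sketch; the objective-value weighting w_i = obj_i / 118 is inferred from the table's numbers, and the names are ours):

```python
# Subset (3, 7, 8): each entry is (solution vector, objective value)
SOLS = {3: ((1, 0, 0, 1, 0, 0, 1, 0, 0, 1), 38),
        7: ((0, 1, 0, 1, 0, 1, 0, 0, 0, 1), 36),
        8: ((0, 1, 1, 1, 1, 0, 0, 0, 1, 0), 44)}

total = sum(obj for _, obj in SOLS.values())                 # 118
weights = {k: obj / total for k, (_, obj) in SOLS.items()}
# score(i): summed weight of the subset solutions that include item i
score = [sum(w for k, w in weights.items() if SOLS[k][0][i])
         for i in range(10)]
combined = tuple(1 if s > 0.5 else 0 for s in score)         # rounding rule
```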
Scatter Search Comparison to GA
RefSet size vs. generation size: the reference set (usually under 20 solutions) is much smaller than a typical GA population
All solutions participate in combination in scatter search
Evolution of the population is controlled by deterministic rules
Local search procedures are integral to scatter search
Scatter search is not generally limited to combining two "parent" solutions
The initial population is not constructed in a purely random manner
Scatter Search Applications
Vehicle Routing – Rochat and Taillard (1995); Ochi et al. (1998); Atan and Secomandi (1999); Rego and Leão (2000); Corberán et al. (2002)
TSP – Olivia San Martin (2000)
Arc Routing – Greistorfer (1999)
Quadratic Assignment – Cung et al. (1996)
Financial Product Design – Consiglio and Zenios (1999)
Neural Network Training – Kelly, Rangaswamy and Xu (1996); Laguna and Martí (2001)
Combat Forces Assessment Model – Bulut (2001)
Graph Drawing – Laguna and Martí (1999)
Linear Ordering – Laguna, Martí and Campos (2001)
Unconstrained Optimization – Fleurent et al. (1996); Laguna and Martí (2000)
Bit Representation – Rana and Whitley (1997)
Multi-objective Assignment – Laguna, Lourenço and Martí (2000)
Optimization Simulation – Glover, Kelly and Laguna (1996); Grant (1998)
Tree Problems – Canuto, Resende and Ribeiro (2001); Xu, Chiu and Glover (2000)
References
Cung, V., T. Mautor, P. Michelon and A. Tavares (1997), “A Scatter Search Based Approach for the Quadratic Assignment Problem”, Proceedings of the IEEE-ICEC'97 Conference, Indianapolis, April 13-16.
Glover, F., A. Løkketangen and D. Woodruff (1999), “Scatter Search to Generate Diverse MIP Solutions”, in Computing Tools for Modeling, Optimization and Simulation: Interfaces in Computer Science and Operations Research, M. Laguna and J.L. González-Velarde (Eds.), Kluwer Academic Publishers.
Glover, F., M. Laguna and R. Martí (2000), “Scatter Search”, in Theory and Applications of Evolutionary Computation: Recent Trends, A. Ghosh and S. Tsutsui (Eds.), Springer-Verlag.
Glover, F., M. Laguna and R. Martí (2000), “Fundamentals of Scatter Search and Path Relinking”, Control and Cybernetics, 29 (3).
Glover, F., M. Laguna and R. Martí (2002), “Fundamentals of Scatter Search and Path Relinking: Foundations and Advanced Designs”, in New Optimization Techniques in Engineering, G. Onwubolu (Ed.).
Laguna, M. (2002), “Scatter Search”, in Handbook of Applied Optimization, P.M. Pardalos and M.G.C. Resende (Eds.), Oxford University Press.
Laguna, M. and R. Martí (2003), Scatter Search: Methodology and Implementations in C, Kluwer Academic Publishers, Boston, 312 pp.