Heuristic Optimization Methods
Scatter Search
Agenda: Scatter Search (SS)

Local search based metaheuristics:
SA: based on ideas from nature
TS: based on problem solving and learning

Population based metaheuristics:
GA: based on ideas from nature
SS: based on problem solving and learning

Nature works, but usually very slowly. Is being clever better than emulating nature?
Scatter Search: Methodology and Applications
The following is a presentation previously given at the ICS conference: Scatter Search: Methodology and Applications. Manuel Laguna (University of Colorado) and Rafael Martí (University of Valencia).
Based on: Laguna, M. and R. Martí (2003), Scatter Search: Methodology and Implementations in C, Kluwer Academic Publishers, Boston.
Scatter Search Methodology
Metaheuristic: a master strategy that guides and modifies other heuristics to produce solutions beyond those normally generated in a quest for local optimality. In other words, a metaheuristic is a procedure with the ability to escape local optimality.
Typical Search Trajectory
Metaheuristic Classification
The x/y/z classification:
x = A (adaptive memory) or M (memoryless)
y = N (systematic neighborhood search) or S (random sampling)
z = 1 (one current solution) or P (population of solutions)

Some classifications:
Tabu Search: A/N/1
Genetic Algorithms: M/S/P
Scatter Search: M/N/P
Diversification Generation
Repeat until |P| = PSize: generate a solution with the Diversification Generation Method and apply the Improvement Method, building the population P.
The Reference Set Update Method selects the RefSet from P.
Main loop: Subset Generation Method, Solution Combination Method, Improvement Method, Reference Set Update Method.
Stop when no more new solutions enter the RefSet.
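The template above can be sketched in a few lines of Python. This is a toy illustration on a 2-variable sphere function, not the tutorial's design: the diversification, improvement, and combination methods used here (random sampling, one halving step, midpoints) are deliberately minimal stand-ins, and all names and parameter values are illustrative.

```python
import random

random.seed(0)

def f(x):                        # objective to minimize
    return sum(v * v for v in x)

def diversify():                 # Diversification Generation Method
    return tuple(random.uniform(-10, 10) for _ in range(2))

def improve(x):                  # Improvement Method: one halving step
    y = tuple(v / 2 for v in x)
    return y if f(y) < f(x) else x

def combine(a, b):               # Solution Combination Method: midpoint
    return tuple((u + v) / 2 for u, v in zip(a, b))

P_SIZE, B = 20, 5
P = set()
while len(P) < P_SIZE:           # repeat until |P| = PSize
    P.add(improve(diversify()))

ref_set = sorted(P, key=f)[:B]   # Reference Set Update (quality only)
for _ in range(50):              # capped main loop; rebuilding omitted
    new_solutions = False
    pairs = [(a, b) for i, a in enumerate(ref_set) for b in ref_set[i + 1:]]
    for a, b in pairs:           # Subset Generation: all RefSet pairs
        trial = improve(combine(a, b))
        worst = max(ref_set, key=f)
        if trial not in ref_set and f(trial) < f(worst):
            ref_set[ref_set.index(worst)] = trial    # RefSet update
            new_solutions = True
    if not new_solutions:        # stop if no more new solutions
        break

best = min(ref_set, key=f)
```

Even with these crude components, the best RefSet member ends up far better than anything in the initial population, which is the point of the template: the structure does the work.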
Scatter Search with Rebuilding
As in the basic design: build P with the Diversification Generation Method and the Improvement Method (repeat until |P| = PSize), select the RefSet with the Reference Set Update Method, and loop through Subset Generation, Solution Combination, Improvement, and Reference Set Update. When no more new solutions enter the RefSet, the Diversification Generation Method rebuilds P and the RefSet is rebuilt. Stop when MaxIter is reached.
Tutorial: Unconstrained Nonlinear Optimization Problem
Diversification Generation Method
Each variable's range [-10, +10] is divided into four subranges: [-10, -5), [-5, 0), [0, +5), [+5, +10]. The probability of selecting a subrange is driven by a frequency count, so that the generated solutions spread over the whole range.
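A sketch of this frequency-driven subrange selection follows. One assumption is made explicit: the count is used inversely (rarely used subranges get higher selection probability), which is the usual way this generator is built; all names are illustrative.

```python
import random

random.seed(1)

SUBRANGES = [(-10, -5), (-5, 0), (0, 5), (5, 10)]

def diverse_value(freq):
    # weight subrange i by 1 / (1 + freq[i]): low count -> high weight
    weights = [1.0 / (1 + c) for c in freq]
    idx = random.choices(range(len(SUBRANGES)), weights=weights)[0]
    freq[idx] += 1
    lo, hi = SUBRANGES[idx]
    return random.uniform(lo, hi)

freq = [0, 0, 0, 0]
samples = [diverse_value(freq) for _ in range(400)]
```

The inverse weighting is self-balancing: whichever subrange leads in count immediately gets a lower weight, so the four counts stay close to 400/4 = 100 and the samples cover the whole range.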
Diverse Solutions
Improvement Method: the simplex method of Nelder and Mead (1965)
Reference Set Update Method (Initial RefSet)
The RefSet (of size b = b1 + b2) consists of:
b1 high-quality solutions, measured by objective function value
b2 diverse solutions, selected by a min-max criterion using Euclidean distances
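The two-tier construction can be sketched as follows: take the b1 best solutions, then repeatedly add the candidate whose nearest RefSet member is farthest away (the min-max criterion, with Euclidean distance). The toy points and the objective f are illustrative.

```python
import math

def initial_refset(P, f, b1, b2):
    pool = sorted(P, key=f)
    refset = pool[:b1]                       # quality tier
    rest = pool[b1:]
    for _ in range(b2):                      # diversity tier (min-max)
        pick = max(rest, key=lambda x: min(math.dist(x, r) for r in refset))
        refset.append(pick)
        rest.remove(pick)
    return refset

points = [(0, 0), (1, 0), (0, 1), (9, 9), (5, 5), (2, 2)]
rs = initial_refset(points, f=lambda x: x[0] + x[1], b1=2, b2=2)
```

On this toy data the quality tier takes (0, 0) and (1, 0); the diversity tier then picks (9, 9) and (5, 5), the points farthest from everything already selected.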
Initial RefSet High-Quality Solutions Diverse Solutions
Subset Generation Method
All pairs of reference solutions that include at least one new solution. In the initial RefSet every solution is new, so the method generates (b² − b)/2 pairs.
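A minimal sketch of this pair-generation rule, with illustrative names (is_new, s1, ...): keep only the pairs that contain at least one new solution.

```python
from itertools import combinations

def generate_pairs(refset, is_new):
    return [(a, b) for a, b in combinations(refset, 2)
            if is_new[a] or is_new[b]]

refset = ["s1", "s2", "s3", "s4", "s5"]          # b = 5
all_new = {s: True for s in refset}              # initial RefSet: all new
initial_pairs = generate_pairs(refset, all_new)  # (b*b - b)/2 = 10 pairs

one_new = {s: s == "s3" for s in refset}         # later: only s3 is new
later_pairs = generate_pairs(refset, one_new)    # only pairs containing s3
```

In later iterations most pairs were already combined, so the rule prunes the subset list down to the pairs touched by fresh solutions.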
Combination Method
Alternative Combination Method
Reference Set Update Method
(Diagram: the RefSet of size b is kept sorted by quality, from best (1) to worst (b). A new trial solution that is better than the worst member enters the updated RefSet and the worst member is dropped.)
Static Update: new trial solutions are collected in a pool, also sorted by quality from best to worst. The updated RefSet consists of the best b solutions from RefSet ∪ Pool.
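The static update rule fits in a few lines. The integer "solutions" and the identity objective below are illustrative stand-ins (minimization, lower is better).

```python
def static_update(refset, pool, f, b):
    merged = list(dict.fromkeys(refset + pool))  # union, duplicates dropped
    return sorted(merged, key=f)[:b]             # best b by quality

f = lambda x: x                                  # toy objective: minimize x
refset = [3, 5, 8, 12]                           # sorted best to worst
pool = [1, 7, 9, 5]                              # new trial solutions
updated = static_update(refset, pool, f, b=4)
```

Unlike the dynamic update, nothing enters the RefSet mid-iteration: all combinations are generated against the old RefSet, and the selection of the best b happens once at the end.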
RefSet after Update

x1      x2      x3      x4      f(x)
1.1383  1.2965  0.8306  0.7150  0.14
0.7016  0.5297  1.2078  1.4633  0.36
0.5269  0.2870  1.2645  1.6077  0.59
1.1963  1.3968  0.6801  0.4460  0.62
0.3326  0.1031  1.3632  1.8311  0.99
0.3368  0.1099  1.3818  1.9389  1.02
0.3127  0.0949  1.3512  1.8589  1.03
0.7592  0.5230  1.3139  1.7195  1.18
0.2004  0.0344  1.4037  1.9438  1.24
1.3892  1.9305  0.1252  1.45  (one coordinate lost in extraction)
Additional Strategies
Reference Set: rebuilding; multi-tier designs
Subset Generation: subsets of size > 2
Combination Method: variable number of solutions
Rebuilding the RefSet: the Diversification Generation Method and the Reference Set Update Method rebuild the RefSet, keeping the b1 high-quality solutions and replacing the b2 diversity tier with newly generated diverse solutions.
2-Tier RefSet: the RefSet is split into a quality tier (b1) and a diversity tier (b2). A solution produced by the Solution Combination Method and the Improvement Method first tries to enter the quality tier; if it fails, it tries the diversity tier.
3-Tier RefSet: as in the 2-tier design, a new solution first tries the quality tier (b1); if it fails, it tries the diversity tier (b2); the departing solution is tried in a third tier (b3).
Subset Generation
Subset Type 1: all 2-element subsets.
Subset Type 2: 3-element subsets, derived from the 2-element subsets by augmenting each with the best solution not already in it.
Subset Type 3: 4-element subsets, derived from the 3-element subsets by augmenting each with the best solution not already in it.
Subset Type 4: the subsets consisting of the best i elements, for i = 5 to b.
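The four subset types can be sketched as follows, assuming refset is sorted from best to worst (so "the best solution not in this subset" is the first element of refset the subset does not contain). Letters stand in for solutions.

```python
from itertools import combinations

def subsets_by_type(refset):
    b = len(refset)
    t1 = [set(s) for s in combinations(refset, 2)]
    # augment each subset with the best solution not already in it
    t2 = [s | {next(x for x in refset if x not in s)} for s in t1]
    t3 = [s | {next(x for x in refset if x not in s)} for s in t2]
    # best-i subsets for i = 5 .. b
    t4 = [set(refset[:i]) for i in range(5, b + 1)]
    return t1, t2, t3, t4

refset = ["a", "b", "c", "d", "e", "f"]      # best to worst, b = 6
t1, t2, t3, t4 = subsets_by_type(refset)
```

With b = 6 this yields 15 pairs, 15 triples, 15 quadruples, and the two best-i subsets for i = 5 and i = 6.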
Subsets of Size > 2
Variable Number of Solutions
(Diagram: the RefSet of size b is sorted by quality from best to worst; the Combination Method generates more solutions from subsets of high-quality solutions, e.g. 5 solutions near the best, 3 in the middle, 1 near the worst.)
Hybrid Approaches
Use of memory: Tabu Search mechanisms for intensification and diversification
GRASP constructions
Combination methods: GA operators, path relinking
Multiobjective Scatter Search
This is a fruitful research area. Many multiobjective evolutionary approaches exist (Coello et al., 2002), and SS can use similar techniques developed for MOEAs (multiobjective evolutionary algorithms).
Multiobjective EA Techniques
Independent sampling: search on f(x) = Σi wi fi(x); change the weights and rerun.
Criterion selection: divide the reference set into k subsets; admission to the i-th subset is governed by fi(x).
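The independent-sampling idea can be sketched as below: optimize the weighted sum f(x) = w1·f1(x) + w2·f2(x) for several weight vectors and keep one solution per run. The two toy objectives and the plain random-sampling "search" are illustrative, not an actual SS run.

```python
import random

random.seed(2)

def f1(x):
    return x * x                 # minimized at x = 0

def f2(x):
    return (x - 1) ** 2          # minimized at x = 1

def weighted_search(w1, w2, samples=2000):
    xs = [random.random() for _ in range(samples)]
    return min(xs, key=lambda x: w1 * f1(x) + w2 * f2(x))

# three weight vectors trace out three points of the trade-off curve
front = [weighted_search(w, 1 - w) for w in (0.1, 0.5, 0.9)]
```

For this pair of quadratics the weighted optimum sits at x = 1 − w1, so sweeping the weights walks along the trade-off between the two objectives.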
Advanced Designs
Reference Set Update: dynamic / static; 2-tier / 3-tier
Subset Generation
Use of memory: explicit memory, attributive memory
Path relinking
An Example: The Linear Ordering Problem
Given an m×m matrix of weights E = {eij}, the LOP consists of finding a permutation p of the columns (and rows) that maximizes the sum of the weights in the upper triangle:

maximize cE(p) = Σi<j e_p(i)p(j)

Applications: triangulation of input-output economic tables, aggregation of individual preferences, classifications in sports.
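The objective is direct to evaluate. The 3×3 matrix E below is an illustrative instance (0-based indices), not the one from the slides.

```python
def lop_value(E, p):
    # cE(p): sum the weights that permutation p places above the diagonal
    return sum(E[p[i]][p[j]]
               for i in range(len(p))
               for j in range(i + 1, len(p)))

E = [[0, 3, 1],
     [2, 0, 4],
     [5, 6, 0]]

identity_value = lop_value(E, (0, 1, 2))  # weights already above diagonal
best_value = lop_value(E, (2, 0, 1))      # a better ordering of rows/cols
```

Reordering the rows and columns does not change the matrix entries, only which of them land in the upper triangle — that is the whole game in the LOP.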
An instance with m = 4 (the weight matrix was lost in extraction): for p = (1,2,3,4) the value is cE(p) = 37, while the optimal permutation p* = (3,4,1,2) gives cE(p*) = 47.
Diversification Generator
Use problem structure to design generators that balance quality and diversity.
Quality: a deterministic constructive method.
Diversity: a random generator, or systematic generators (Glover, 1998).
GRASP constructions: the method randomly selects from a short list of the most attractive sectors.
Use of memory: modify Becker's measure of attractiveness with a frequency-based memory that discourages sectors from occupying positions they have frequently occupied.
Diversity vs. Quality: to compare the generators, create a set of 100 solutions with each one and measure d (standardized diversity) and C (standardized quality).
Improvement Method: a first-improvement strategy.
INSERT_MOVE(pj, i) consists of deleting pj from its current position j and inserting it in position i. The first-improvement strategy scans the list of sectors for the first sector whose move improves the solution:
MoveValue = CE(p') − CE(p)
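A sketch of first improvement with INSERT_MOVE for the LOP follows. The 3×3 matrix E is an illustrative instance (0-based indices); the scan restarts after every improving insertion until none is found.

```python
def lop_value(E, p):
    return sum(E[p[i]][p[j]] for i in range(len(p))
               for j in range(i + 1, len(p)))

def insert_move(p, j, i):
    q = list(p)
    q.insert(i, q.pop(j))        # delete position j, reinsert at i
    return tuple(q)

def first_improvement(E, p):
    improved = True
    while improved:
        improved = False
        base = lop_value(E, p)
        for j in range(len(p)):
            for i in range(len(p)):
                if i != j:
                    q = insert_move(p, j, i)
                    if lop_value(E, q) > base:   # LOP maximizes
                        p, improved = q, True    # take the FIRST improvement
                        break
            if improved:
                break
    return p

E = [[0, 3, 1],
     [2, 0, 4],
     [5, 6, 0]]
best = first_improvement(E, (0, 1, 2))
```

On this toy instance the scan climbs from cE = 8 to the optimal ordering; in general first improvement only guarantees a local optimum with respect to insertions.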
Solution Combination Method
The method scans each reference permutation from left to right. Each reference permutation votes for its first element that is not yet in the combined permutation (its "incipient element"). The votes determine the next element to enter the first unassigned position of the combined permutation; the vote of a reference solution is weighted according to the incipient element's position. Example, with the solution under construction (3,1,2,_,_):
(3,1,4,2,5) votes for 4
(1,4,3,5,2) votes for 4
(2,1,3,5,4) votes for 5
Element 4 wins, giving (3,1,2,4,_).
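The voting rule can be sketched as below, with one simplification: votes are unweighted here (the slide weights each vote by the incipient element's position), so the outcome can differ from a weighted implementation.

```python
from collections import Counter

def combine_by_voting(refs):
    n = len(refs[0])
    combined, used = [], set()
    while len(combined) < n:
        votes = Counter()
        for r in refs:
            # incipient element: first element of r not yet placed
            incipient = next(e for e in r if e not in used)
            votes[incipient] += 1            # unweighted vote
        winner = votes.most_common(1)[0][0]
        combined.append(winner)
        used.add(winner)
    return tuple(combined)

refs = [(3, 1, 4, 2, 5), (1, 4, 3, 5, 2), (2, 1, 3, 5, 4)]
result = combine_by_voting(refs)
```

On ties, Counter.most_common keeps first-encountered order, so the first reference permutation's incipient element wins; a weighted scheme would break ties by position instead.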
Experiments with LOLIB
49 input-output economic tables. Methods compared: GD, CK, CK10, TS, SS.
Deviation from optima: 0.15%, 0.02%, 0.04%, 0.01% (one value lost in extraction)
Number of optima: 11, 27, 33, 42 (one value lost in extraction)
Run time (seconds): 0.01, 0.10, 1.06, 0.49, 2.35
Another Example: A Commercial SS Implementation
OptQuest Callable Library (by OptTek). Like other context-independent methods, it separates the search method from the solution evaluation.
OptQuest based Applications
The library acts as the solution generator; the user's application acts as the solution evaluator.
Feasibility and Evaluation
The user implementation checks feasibility, evaluates the solution, and returns the result to OptQuest; the OptQuest engine then generates a new solution.
Comparison with Genocop
Average over 28 hard nonlinear instances.
Conclusions
The development of metaheuristics usually entails a fair amount of experimentation ("skill comes from practice").
Code objectives: quick start, benchmark, advanced designs.
Scatter Search provides a flexible framework for developing solution methods.
Metaheuristic Classification
The x/y/z classification:
x = A (adaptive memory) or M (memoryless)
y = N (systematic neighborhood search) or S (random sampling)
z = 1 (one current solution) or P (population of solutions)

Some classifications:
Tabu Search: A/N/1
Genetic Algorithms: M/S/P
Scatter Search: M/N/P
Some Classifications
Simulated Annealing: M/S/1 (local search, randomized)
Tabu Search: A/N/1 (local search, systematic)
Genetic Algorithm: M/S/P (population, randomized)
Scatter Search: M/N/P (population, systematic)
About the Classifications
Our four main methods (SA, TS, GA, SS) all sit far from the center of this classification: they are either strongly randomized or strongly systematic. Other methods combine some randomized and some systematic behaviour. Most implementations mix the ingredients: population based methods can include an element of local search (e.g., Memetic Algorithms), and systematic approaches can include randomness (such as a random tabu tenure in TS). The classification highlights the differences between methods, but there are also many similarities.
GA vs. SS (1)
GA has a "long" history: proposed in the 1970s and immediately popular. It was not initially used for optimization, but gradually morphed into a methodology whose major concern is the solution of optimization problems.
The concepts and principles of SS were also proposed early (in the 1970s, to solve integer programming problems), but SS was not popularized until the 1990s. The SS template most often used is from 1998.
GA vs. SS (2)
GA is based on natural processes (genetics, "survival of the fittest", imitation of nature).
SS is based on strategic ideas for how to use adaptive memory; some TS concepts are critically linked with SS.
GA vs. SS (3)
Diversification
GA: mutation
SS: favoring diverse solutions in the reference set, and generating diverse solutions in the initialization
Intensification
GA: probabilistic selection of parents, favoring the fittest (but this is not really very intensifying)
SS: the improvement method
GA vs. SS (4)
GA has a population that is usually about ten times larger than the reference set in SS.
GA applies its operators (mutation, crossover) to randomly chosen solutions; SS applies its operators (combination, improvement) non-randomly.
Evolution in GA follows a randomized "survival of the fittest"; in SS the reference set update method follows deterministic rules.
GA vs. SS (5)
Local search (improvement) is an integral part of SS, but is added to GA only to create hybrid/improved approaches.
GA is usually limited to combining a pair of solutions (parents), while SS allows combining any number of solutions.
GA uses full randomization to create the initial population, while SS balances diversity and quality (the diversification generation method).
Summary of Today's Lecture

Scatter Search is M/N/P: memoryless, systematic "neighborhood", population of solutions.
Components of Scatter Search:
Diversification Generation
Improvement
Reference Set Update
Subset Generation
Solution Combination