1
Comparing Genetic Algorithm and Guided Local Search Methods
Mehrdad Nojoumian & Divya Nair September 22, 2018 David R. Cheriton School of Computer Science
2
Contents
- Problem Clarification & Motivation
- Definition and Related Work
- Genetic Algorithms (GA)
- Guided Local Search (GLS)
- Experimental Results
- Conclusion & Future Work
3
Problem Definition & Motivation
- Selecting the best strategy for solving various TSP instances
- Many engineering problems can be mapped to the Travelling Salesman Problem
- Motivation: compare two major AI approaches by evaluating their performance on TSPs
  - Genetic Algorithm (GA)
  - Guided Local Search (GLS)
- Scrutinize the behaviour of these techniques on the solution space
4
Definition and Related Work
Travelling Salesman Problem
- Given a set of cities, represented as points in the plane with X & Y co-ordinates
- The goal is to find the shortest tour that visits each city exactly once
- The decision version of the problem is NP-complete (the optimization version is NP-hard)
Related Work
- Various GA implementations for TSP instances
- Comparisons of various search strategies (not GA & GLS)
- Hybrid approaches that combine GA and GLS
(Figure: a TSP instance with the 48 capitals of the US, plotted by X and Y co-ordinates.)
5
Local Search: HC (Hill Climbing), SA (Simulated Annealing), TS (Tabu Search), etc.
(Figure: a search landscape with a global maximum and a local maximum; the search moves from the start state s0 through intermediate states si towards sn, trading off exploration against exploitation.)
Iterative process:
- Generate s0
- While (not stopping condition):
  - Generate neighbours N(si)
  - Evaluate N(si)
  - s(i+1) = Select-Next(N(si))
- Return sn
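The loop above can be sketched in Java roughly as follows; the neighbourhood generator, the evaluation function, and the stopping condition are problem-specific placeholders rather than anything prescribed on the slides:

import java.util.*;
import java.util.function.*;

// Minimal hill-climbing style local search, following the loop on this slide.
// Neighbours(), the cost function and the iteration limit are placeholders.
public class LocalSearch<S> {
    public S run(S s0,
                 Function<S, List<S>> neighbours,   // generates N(s_i)
                 ToDoubleFunction<S> cost,          // evaluates a state (lower is better)
                 int maxIterations) {               // stopping condition
        S current = s0;
        for (int i = 0; i < maxIterations; i++) {
            List<S> candidates = neighbours.apply(current);
            // Select-Next: pick the best neighbour of the current state
            S best = current;
            for (S n : candidates) {
                if (cost.applyAsDouble(n) < cost.applyAsDouble(best)) best = n;
            }
            if (best == current) break;  // no improving neighbour: local optimum reached
            current = best;              // move to the improving neighbour
        }
        return current;                  // s_n
    }
}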
6
Population-Based: Genetic Algorithms
(Figure: the same landscape; a population of states S1,0 ... S4,0 explores the space in parallel, and candidates for the next generation are drawn towards promising regions, balancing exploration and exploitation.)
Iterative process:
- Generate p0
- While (not stopping condition):
  - pm = Mutate(pi)
  - pc = Crossover(pm)
  - p(i+1) = Select(pc)
- Return s*
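A minimal Java skeleton of this population-based loop, keeping the slide's ordering of the operators; Mutate, Crossover and Select are passed in as placeholders, with concrete TSP versions appearing on later slides:

import java.util.function.UnaryOperator;

// Skeleton of the GA loop on this slide, parameterized over a population type P.
public class GeneticLoop<P> {
    public P run(P p0,
                 UnaryOperator<P> mutate,     // p_m = Mutate(p_i)
                 UnaryOperator<P> crossover,  // p_c = Crossover(p_m)
                 UnaryOperator<P> select,     // p_(i+1) = Select(p_c)
                 int generations) {
        P population = p0;
        for (int g = 0; g < generations; g++) {
            P pm = mutate.apply(population);
            P pc = crossover.apply(pm);
            population = select.apply(pc);
        }
        return population; // the best individual s* is extracted from the final population
    }
}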
7
TSP Solution Space: with 500 cities there are on the order of 500! possible tours.
8
Genetic Algorithms (GA)
(Figure: the GA cycle. Random solutions (chromosomes) are generated to form the population, evaluated (example tour lengths: 20, 25, 35, 40), parents are selected and combined via crossover, the offspring are mutated, and the result forms the new population.)
9
Fitness Function & Mutation
Fitness function: calculate the length of each path (chromosome), e.g. Ch1 = 46, Ch2 = 52, ...
Mutation:
- Reciprocal mutation: swap two cities in the tour
- Inversion mutation: reverse a sub-tour
(Figure: example chromosomes before and after mutation, with the values 1 2 4 3 10 13 7 16 20 15.)
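A small sketch of the fitness function and the two mutation operators; it assumes a tour is an array of city indices and that a symmetric distance matrix dist is available (neither representation is spelled out on the slide):

import java.util.Random;

// Tour length (fitness) plus the two mutation operators named on this slide.
public class TspMutation {
    static double tourLength(int[] tour, double[][] dist) {
        double len = 0;
        for (int i = 0; i < tour.length; i++) {
            len += dist[tour[i]][tour[(i + 1) % tour.length]]; // wrap around to the start city
        }
        return len;
    }

    // Reciprocal (swap) mutation: exchange two randomly chosen cities.
    static void reciprocalMutation(int[] tour, Random rnd) {
        int i = rnd.nextInt(tour.length), j = rnd.nextInt(tour.length);
        int tmp = tour[i]; tour[i] = tour[j]; tour[j] = tmp;
    }

    // Inversion mutation: reverse the sub-tour between two random positions.
    static void inversionMutation(int[] tour, Random rnd) {
        int a = rnd.nextInt(tour.length), b = rnd.nextInt(tour.length);
        int lo = Math.min(a, b), hi = Math.max(a, b);
        while (lo < hi) {
            int tmp = tour[lo]; tour[lo] = tour[hi]; tour[hi] = tmp;
            lo++; hi--;
        }
    }
}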
10
Crossover Partially-Mapped Crossover
- Pick crossover points A and B randomly and copy the cities between A and B from P1 into the child
- For the parts of the child's array outside the range [A,B], copy only those cities from P2 which have not already been taken from P1
- Fill in the gaps with cities that have not yet been taken
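A sketch of the three steps above; it assumes tours are permutations of 0..n-1 and that the crossover points A and B are supplied by the caller:

import java.util.Arrays;

// Simplified partially-mapped crossover, following the three steps above.
public class PartiallyMappedCrossover {
    static int[] crossover(int[] p1, int[] p2, int a, int b) {
        int n = p1.length;
        int[] child = new int[n];
        Arrays.fill(child, -1);                     // -1 marks an empty position
        boolean[] used = new boolean[n];
        // Step 1: copy the cities between A and B from P1 into the child.
        for (int i = a; i <= b; i++) { child[i] = p1[i]; used[p1[i]] = true; }
        // Step 2: outside [A,B], copy only those cities from P2 not already taken from P1.
        for (int i = 0; i < n; i++) {
            if (i >= a && i <= b) continue;
            if (!used[p2[i]]) { child[i] = p2[i]; used[p2[i]] = true; }
        }
        // Step 3: fill the remaining gaps with cities that have not yet been taken.
        int next = 0;
        for (int i = 0; i < n; i++) {
            if (child[i] != -1) continue;
            while (used[next]) next++;
            child[i] = next; used[next] = true;
        }
        return child;
    }
}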
11
Crossover (Cont.) Order Crossover
- Choose points A and B and copy that range from P1 to the child
- Fill in the remaining indices with the unused cities, in the order that they appear in P2
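A corresponding sketch of order crossover, under the same assumptions about the tour representation:

import java.util.Arrays;

// Order crossover (OX): keep a slice of P1, then fill the remaining positions
// with the unused cities in the order they appear in P2.
public class OrderCrossover {
    static int[] crossover(int[] p1, int[] p2, int a, int b) {
        int n = p1.length;
        int[] child = new int[n];
        Arrays.fill(child, -1);
        boolean[] used = new boolean[n];
        // Copy the range [A,B] from P1 to the child.
        for (int i = a; i <= b; i++) { child[i] = p1[i]; used[p1[i]] = true; }
        // Fill the remaining indices with the unused cities, in P2 order.
        int j = 0; // scans P2 from the beginning
        for (int i = 0; i < n; i++) {
            if (child[i] != -1) continue;
            while (used[p2[j]]) j++;
            child[i] = p2[j]; used[p2[j]] = true;
        }
        return child;
    }
}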
12
Selection
Rank Selection
- Sort the population by their fitness values
- Each individual is assigned a rank: R = 1 for the best individual, and so on
- The probability of being selected is then P(Rank) = (0.5)^R: (0.5)^1 = 0.5, (0.5)^2 = 0.25, etc.
Tournament Selection
- Pick a handful of N individuals from the population at random (e.g. N = 20)
- With a fixed probability P (e.g. P = 1), choose the one with the highest fitness
- Choose the second best individual with probability P * (1 - P)
- Choose the third best individual with probability P * (1 - P)^2, and so on
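Both schemes can be sketched as follows; the geometric sampling for rank selection and the higher-is-better fitness convention (e.g. the reciprocal of tour length) are assumptions used to make the example concrete:

import java.util.*;

// Rank selection and tournament selection as described above; both return the
// index of the selected individual in the fitness array.
public class Selection {
    // Rank selection: sort so the best individual has rank 1, then pick rank R
    // with probability (0.5)^R, sampled geometrically.
    static int rankSelect(double[] fitness, Random rnd) {
        Integer[] order = new Integer[fitness.length];
        for (int i = 0; i < order.length; i++) order[i] = i;
        Arrays.sort(order, Comparator.comparingDouble((Integer i) -> fitness[i]).reversed());
        for (int r = 0; r < order.length - 1; r++) {
            if (rnd.nextDouble() < 0.5) return order[r];
        }
        return order[order.length - 1];
    }

    // Tournament selection: pick N individuals at random; take the best with
    // probability P, the second best with P*(1-P), the third with P*(1-P)^2, ...
    static int tournamentSelect(double[] fitness, int n, double p, Random rnd) {
        List<Integer> picked = new ArrayList<>();
        for (int i = 0; i < n; i++) picked.add(rnd.nextInt(fitness.length));
        picked.sort(Comparator.comparingDouble((Integer i) -> fitness[i]).reversed());
        for (int r = 0; r < picked.size() - 1; r++) {
            if (rnd.nextDouble() < p) return picked.get(r);
        }
        return picked.get(picked.size() - 1);
    }
}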
13
Local Search: The Basic Idea Behind Local Search
- Basic idea: perform an iterative improvement
- Keep a single current state (rather than multiple paths) and try to improve it
- Move iteratively to neighbours of the current state
- Do not retain the search path
- Constant space, often rather fast, but incomplete
- What is a neighbour? The neighbourhood has to be defined in an application-dependent way
14
A move operator: 2-opt. Take a sub-tour and reverse it.
(Figure: a tour through cities 9 1 8 7 5 4 3 2 6, with the sub-tour to be reversed marked.)
15
A move operator: 2-opt. Take a sub-tour and reverse it.
(Figure: the same tour after the sub-tour has been reversed.)
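A sketch of the 2-opt move on a tour stored as an array of city indices; the delta test deciding whether the reversal shortens the tour is the standard check and is not shown on the slides:

// 2-opt move: remove edges (a,b) and (c,d), reverse the sub-tour between them,
// and thereby add edges (a,c) and (b,d). Assumes 0 <= i < j < tour.length.
public class TwoOpt {
    static boolean tryMove(int[] tour, double[][] dist, int i, int j) {
        int n = tour.length;
        int a = tour[i], b = tour[(i + 1) % n];   // edge (a, b) is removed
        int c = tour[j], d = tour[(j + 1) % n];   // edge (c, d) is removed
        double delta = dist[a][c] + dist[b][d] - dist[a][b] - dist[c][d];
        if (delta >= 0) return false;             // the move would not shorten the tour
        // Reverse the sub-tour between positions i+1 and j (the "reverse" on the slide).
        for (int lo = i + 1, hi = j; lo < hi; lo++, hi--) {
            int tmp = tour[lo]; tour[lo] = tour[hi]; tour[hi] = tmp;
        }
        return true;
    }
}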
16
Guided Local Search (GLS)
Goals:
- To escape local minima
- Introduce memory into the search process
- Rationally distribute search effort
GLS augments the cost function with a set of penalty terms and passes the modified function to LS. Constrained by the penalty terms, LS concentrates its search on promising regions of the search space. Each time LS gets caught in a local minimum, the penalties are modified and LS is called again on the newly modified cost function. GLS penalizes solutions that contain specific, pre-defined features.
17
Solution Features
- A solution feature captures a specific property of solutions, based on the main constraints expressed by the problem. For the TSP, the solution features are the edges between cities.
- A set of features can be defined by considering all possible edges that may appear in a tour, with feature costs given by the edge lengths. For each feature, a feature cost and a penalty are defined.
- A feature fi is expressed using an indicator function:
  Ii(s) = 1 if solution s has property i (s ∈ S, the set of all feasible solutions), and Ii(s) = 0 otherwise.
- In the TSP, the indicator functions express the edges currently included in the candidate tour.
- The indicator functions are incorporated into the cost function to yield the augmented cost function.
18
GLS Specifics
As soon as a local minimum occurs during local search:
The penalties are modified and the cost function is upgraded to a new augmented cost function according to:
  h(s) = g(s) + λ * Σ (i = 1 to M) pi * Ii(s)
where
- g(s) is the objective function,
- M is the total number of features defined over solutions,
- λ is the regularization parameter,
- pi is the penalty associated with feature i.
The penalty vector is defined as P = (p1, ..., pM), and the feature costs are given by a cost vector C = (c1, ..., cM).
The penalty parameters are increased by 1 for all features for which the following utility expression is maximal:
  util(s, fi) = Ii(s) * ci / (1 + pi)
19
GLS Algorithm
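The GLS algorithm itself is shown as a figure on this slide; the following Java outline is a sketch assembled from the definitions on the previous slides (augmented cost h(s), penalty vector, utility-based penalty update). The localSearch argument and the lambda/maxCalls parameters are placeholders, not the authors' implementation:

import java.util.function.Function;

// GLS outline for the TSP: features are edges, feature costs are edge lengths.
public class GuidedLocalSearch {
    static int[] run(double[][] dist, int[] start, double lambda, int maxCalls,
                     Function<double[][], int[]> localSearch) {
        int n = dist.length;
        double[][] penalty = new double[n][n];          // p_i, one per edge feature
        int[] best = start.clone();
        for (int call = 0; call < maxCalls; call++) {
            // Augmented edge costs: d'(i,j) = d(i,j) + lambda * p(i,j), i.e. h(s).
            double[][] augmented = new double[n][n];
            for (int i = 0; i < n; i++)
                for (int j = 0; j < n; j++)
                    augmented[i][j] = dist[i][j] + lambda * penalty[i][j];
            int[] local = localSearch.apply(augmented); // LS gets caught in a local minimum of h
            if (tourLength(local, dist) < tourLength(best, dist)) best = local;
            // Penalty update: increase p_i by 1 for the edges of the local optimum
            // whose utility c_i / (1 + p_i) is maximal.
            double maxUtil = -1;
            for (int k = 0; k < n; k++) {
                int a = local[k], b = local[(k + 1) % n];
                maxUtil = Math.max(maxUtil, dist[a][b] / (1 + penalty[a][b]));
            }
            for (int k = 0; k < n; k++) {
                int a = local[k], b = local[(k + 1) % n];
                if (dist[a][b] / (1 + penalty[a][b]) == maxUtil) {
                    penalty[a][b] += 1; penalty[b][a] += 1;
                }
            }
        }
        return best;
    }

    static double tourLength(int[] tour, double[][] dist) {
        double len = 0;
        for (int i = 0; i < tour.length; i++) len += dist[tour[i]][tour[(i + 1) % tour.length]];
        return len;
    }
}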
20
Snapshot of Guided Local Search (1)
21
Snapshot of Guided Local Search (2)
22
Snapshot of Guided Local Search (3)
23
Fast Local Search (FLS)
- Purpose: speeding up local search through reduced neighbourhoods (sub-neighbourhoods)
- Method (GLS + FLS): associate solution features with sub-neighbourhoods, and associate an activation bit with each problem feature
- Procedure:
  - Initially all sub-neighbourhoods are active (bits set to 1)
  - FLS is called to reach the first local minimum
  - During the penalty-modification step, only the activation bits of the sub-neighbourhoods associated with the penalized features are set back to 1
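A minimal sketch of the activation-bit bookkeeping described above, assuming one sub-neighbourhood per city (a common choice for the TSP); the improvement step itself is left abstract:

// Fast Local Search bookkeeping: one activation bit per sub-neighbourhood (here,
// per city). Only active sub-neighbourhoods are scanned; a bit is switched off
// when its sub-neighbourhood yields no improving move.
public class FastLocalSearch {
    interface Improver { int[] improveAround(int city, int[] tour); } // returns null if no improvement

    static int[] run(int[] tour, Improver improver) {
        int n = tour.length;
        boolean[] active = new boolean[n];
        java.util.Arrays.fill(active, true);        // initially all sub-neighbourhoods are active
        boolean anyActive = true;
        while (anyActive) {
            for (int city = 0; city < n; city++) {
                if (!active[city]) continue;
                int[] improved = improver.improveAround(city, tour);
                if (improved != null) tour = improved;  // keep the bit on: more moves may follow
                else active[city] = false;              // exhausted sub-neighbourhood: switch off
            }
            anyActive = false;
            for (boolean b : active) anyActive |= b;
        }
        return tour;   // first local minimum with respect to the active sub-neighbourhoods
    }
}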
24
Experimental Results
1. Comparison of GLS-FLS-2opt with GLS-greedy LS on s-TSP
2. Comparison of GLS-FLS-2opt with GA on s-TSP
3. Comparison of GLS-FLS-2opt with Branch and Bound on s-TSP
GLS-FLS-2opt with GLS-greedy LS: (results shown on the slide)
25
Comparison of GLS-FLS-2opt with GA on s-TSP
26
Comparison of GLS-FLS-2opt with GA
(Charts: mean excess % and mean CPU time.)
27
Comparison of GLS-FLS-2opt with Branch and Bound on s-TSP
28
Conclusion
Implementations:
- GLS: GLS solver developed in C++
- GA: Java
- Branch and Bound: Volgenant BB technique (Pascal)
Findings:
- The GLS-FLS strategy yields the most promising performance on the s-TSP instances in terms of near-optimality and mean CPU time.
- GA results are comparable to the GLS-FLS results on the same s-TSP instances; the GA methods can also generate near-optimal solutions with a small compromise in CPU time (by increasing the number of iterations).
- The BB method generates optimal solutions similar to GLS-FLS, its mean CPU time is better than that of our GA approach, and it can guarantee a lower bound for the given s-TSP instance, but it only works for up to 100 cities.