1 COMP8620 Lecture 5-6 Neighbourhood Methods, and Local Search (with special emphasis on TSP)

2 Assignment “Project 1”

3 Neighbourhood For each solution S ∈ 𝒮, N(S) ⊆ 𝒮 is a neighbourhood. Each T ∈ N(S) is in some sense "close" to S. Defined in terms of some operation – very like the "action" in search.

4 Neighbourhood Exchange neighbourhood: exchange k things in a sequence or partition. Examples: Knapsack problem: exchange k₁ things inside the bag with k₂ things not in it (for k₁, k₂ ∈ {0, 1, 2, 3}). Matching problem: exchange one marriage for another.
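
As a concrete illustration (not from the slides), a minimal Python sketch of a 1-in/1-out exchange neighbourhood for the knapsack problem; the function name and data layout are assumptions:

```python
import itertools

def exchange_neighbours(inside, outside, weights, capacity):
    """1-in/1-out exchange neighbourhood for a knapsack solution:
    swap one item in the bag for one item outside it, keeping
    only swaps that respect the capacity."""
    load = sum(weights[i] for i in inside)
    for i, j in itertools.product(inside, outside):
        if load - weights[i] + weights[j] <= capacity:
            yield (inside - {i}) | {j}

# items 0..3 with weights [4, 3, 2, 5]; bag currently holds {0, 1}
print(list(exchange_neighbours({0, 1}, {2, 3}, [4, 3, 2, 5], capacity=10)))
```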

5–10 2-opt Exchange (animation frames: two arcs are removed from the tour and the two resulting paths are reconnected in the opposite orientation)
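
A minimal sketch of the 2-opt move on a tour stored as a list of cities (function names and the distance-matrix layout are assumptions): removing the two arcs and reconnecting is equivalent to reversing the segment between them.

```python
def two_opt_move(tour, i, j):
    """Apply a 2-opt move to a tour (a list of cities), 0 <= i < j < n:
    remove arcs (tour[i], tour[i+1]) and (tour[j], tour[(j+1) % n]),
    reconnect by reversing the segment tour[i+1 .. j]."""
    return tour[:i + 1] + tour[i + 1:j + 1][::-1] + tour[j + 1:]

def two_opt_gain(tour, i, j, dist):
    """Change in tour length if the move were applied (< 0 = improvement)."""
    n = len(tour)
    a, b = tour[i], tour[i + 1]
    c, d = tour[j], tour[(j + 1) % n]
    return (dist[a][c] + dist[b][d]) - (dist[a][b] + dist[c][d])
```

Computing the gain from the four affected arcs alone, rather than re-evaluating the whole tour, is what makes 2-opt neighbourhood scans cheap.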

11 3-opt exchange Select three arcs; replace them with three others. Two reconnection orientations are possible.

12–22 3-opt exchange (animation frames of the exchange)

23 Neighbourhood Strongly connected: any solution can be reached from any other (e.g. 2-opt). Weakly optimally connected: the optimum can be reached from any starting solution.

24 Neighbourhood Hard constraints create impenetrable mountain ranges in the solution landscape; soft constraints allow passes through the mountains. E.g. Map Colouring (k-colouring): colour a map (graph) so that no two adjacent countries (nodes) have the same colour, use at most k colours, minimise the number of colours.

25 Map Colouring (figure: a starting solution and two optimal solutions) Define neighbourhood as: change the colour of at most one vertex? ⇒ Make the k-colour constraint soft…

26 Iterative Improvement
1. Find initial (incumbent) solution S
2. Find T ∈ N(S) which minimises the objective
3. If z(T) ≥ z(S): Stop. Else S = T, goto 2
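
A sketch of this loop in best-improvement form; `z` and the `neighbours` generator are assumed to be supplied by the caller:

```python
def iterative_improvement(s, z, neighbours):
    """Best-first iterative improvement: move to the best neighbour
    until no neighbour improves the objective (a local minimum)."""
    while True:
        t = min(neighbours(s), key=z, default=None)
        if t is None or z(t) >= z(s):
            return s                   # local minimum reached
        s = t
```

First-found improvement (next slide) instead scans a randomised neighbourhood and takes the first T with z(T) < z(S).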

27 Iterative Improvement Best First (a.k.a. Greedy Hill-climbing, Discrete Gradient Ascent) – requires the entire neighbourhood to be evaluated; often uses randomness to break ties. First Found – randomise the neighbourhood exploration; implement the first improving change.

28 Local Minimum Iterative improvement will stop at a local minimum Local minimum is not necessarily a global minimum How do you escape a local minimum?

29 Restart Find initial solution using (random) procedure Perform Iterative Improvement Repeat, saving best Remarkably effective Used in conjunction with many other meta-heuristics

30 Results from SAT

31 Variable Depth Search Make a series of moves Not all moves are cost-decreasing Ensure that a move does not reverse previous move Very successful VDS: Lin-Kernighan algorithm for TSP (1973) (Originally proposed for Graph Partitioning Problem (1970))

32 Lin-Kernighan (1973) – the δ-path (figure: reconnection steps through nodes u, v, w, v′, w′)

33 Lin-Kernighan (1973) Essentially a series of 2-opt style moves. Find the best δ-path; partially implement the path; repeat until no more paths can be constructed. If an arc has been added (deleted) it cannot be deleted (added). Implement the best tour found if its cost is less than the original.

34 Dynasearch Requires all changes to be independent. Requires the objective change to be cumulative, e.g. a set of 2-opt changes where no two swaps touch the same section of the tour. Finds the best combination of exchanges – exponential in the worst case.

35 Variable Neighbourhood Search Large neighbourhoods are expensive; small neighbourhoods are less effective. Only search the larger neighbourhood when the smaller one is exhausted.

36 Variable Neighbourhood Search m neighbourhoods N_i with |N_1| < |N_2| < … < |N_m|
1. Find initial sol S; best = z(S)
2. k = 1
3. Search N_k(S) to find the best sol T
4. If z(T) < z(S): S = T, k = 1; else k = k + 1
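
A direct transcription of slide 36 as a sketch; each entry of `searches` is assumed to return the best solution in the corresponding neighbourhood:

```python
def vns(s, z, searches):
    """VNS sketch: searches[k] returns the best solution found in
    neighbourhood N_{k+1}(s); neighbourhoods are ordered by size."""
    k = 0
    while k < len(searches):
        t = searches[k](s)
        if z(t) < z(s):
            s, k = t, 0        # improvement: restart at N_1
        else:
            k += 1             # exhausted: try a larger neighbourhood
    return s
```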

37 Large Neighbourhood Search Partial restart heuristic:
1. Create an initial solution
2. Remove a part of the solution
3. Complete the solution as per step 1
4. Repeat, saving the best
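
The destroy/repair loop as a sketch; the `destroy` and `repair` callbacks carry the problem-specific "magic" mentioned on slide 55 below:

```python
def lns(initial, z, destroy, repair, iterations=1000):
    """LNS sketch: repeatedly destroy part of the incumbent and
    repair it with the constructive heuristic, keeping the best."""
    best = s = initial
    for _ in range(iterations):
        s = repair(destroy(s))     # always accept the rebuilt solution;
                                   # SA-style acceptance is also common
        if z(s) < z(best):
            best = s
    return best
```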

38–48 LNS – Construct, 49–52 LNS – Destroy, 53–54 LNS – Construct (animation frames of the destroy/repair cycle)

55 LNS The magic is in choosing which part of the solution to destroy. Different problems (and different instances) need different heuristics.

56 Speeding Up 2/3-opt For each node, store its k nearest neighbours; only link nodes if they appear on the list. k = 20 does not hurt performance much; k = … was …% better; k = 80 was worse. FD-trees to help initialise.
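
A brute-force sketch of building the candidate lists (the slide suggests tree structures for a faster initial build; function name and matrix layout are assumptions):

```python
def neighbour_lists(dist, k=20):
    """Precompute, for each node, its k nearest neighbours; 2-opt/3-opt
    moves are then restricted to arcs (i, j) with j in lists[i]."""
    n = len(dist)
    return [sorted((j for j in range(n) if j != i),
                   key=lambda j: dist[i][j])[:k]
            for i in range(n)]
```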

57 Advanced Stochastic Local Search Simulated Annealing, Tabu Search, Genetic Algorithms, Ant Colony Optimization

58 Simulated Annealing Kirkpatrick, Gelatt & Vecchi [1983] Always accept an improvement in the objective; sometimes accept an increase. P(accept increase of Δ) = e^(−Δ/T). T is the temperature of the system; update T according to a "cooling schedule". (T = 0) == greedy iterative improvement.

59 Simulated Annealing Nice theoretical result: as the number of iterations → ∞, the probability of finding the optimal solution → 1. Experimental confirmation: on many problems, long runs yield good results. Weak optimal connection is required.

60 Simulated Annealing
1. Generate initial S
2. Generate random T ∈ N(S)
3. Δ = z(T) − z(S)
4. If Δ < 0: S = T; goto 2
5. If rand() < e^(−Δ/T): S = T; goto 2
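
Slides 58–61 combined into a runnable sketch with geometric cooling; `random_neighbour`, the parameter defaults, and the stopping temperature are assumptions:

```python
import math
import random

def simulated_annealing(s, z, random_neighbour, t0, alpha=0.95,
                        moves_per_temp=100, t_min=1e-3):
    """SA sketch with geometric cooling T <- alpha*T: always accept
    improvements; accept an increase of delta with prob e^(-delta/T)."""
    best, temp = s, t0
    while temp > t_min:
        for _ in range(moves_per_temp):
            t = random_neighbour(s)
            delta = z(t) - z(s)
            if delta < 0 or random.random() < math.exp(-delta / temp):
                s = t
                if z(s) < z(best):
                    best = s
        temp *= alpha              # geometric cooling schedule
    return best
```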

61 Simulated Annealing Initial T: set equal to the maximum acceptable Δ. Updating T – geometric update: T_{k+1} = α·T_k, with α usually in [0.9, 0.999]. Don't want too many changes at one temperature (too hot): if (numChangesThisT > maxChangesThisT) updateT()

62 Simulated Annealing Updating T: many other update schemes; sophisticated ones look at the mean and std-dev of Δ. Re-boil (== restart): re-initialise T. 0-cost changes: handle randomly. Adaptive parameters: if you keep falling into the same local minimum, maxChangesThisT *= 2, or initialT *= 2

63 Tabu Search Glover [1986] Some similarities with VDS. Allows cost-increasing moves; selects the best move in the neighbourhood. Ensures that solutions don't cycle by making a return to a previous solution "tabu". Effectively a modified neighbourhood. Maintains more memory than just the best solution.

64 Tabu Search Theoretical result (also applies to SA): as k → ∞, P(find yourself at an optimal sol) gets larger relative to other solutions.

65 Tabu Search Basic Tabu Search:
1. Generate initial solution S; S* = S
2. Find best T ∈ N(S)
3. If z(T) ≥ z(S): add T to the tabu list
4. S = T
5. If z(S) < z(S*) then S* = S
6. If stopping condition not met, goto 2

66 Tabu Search Tabu List: list of solutions that cannot be revisited. Tabu Tenure: the length of time a solution remains tabu = the length of the tabu list. Tabu tenure t ensures no cycle of length t.

67 Tabu Search Difficult/expensive to store a whole solution; instead, store the "move" (the delta between S and T) and make the inverse move tabu – e.g. 2-opt adds 2 new arcs to the solution; make deletion of both(?) tabu. But: a cycle of length t is now possible, and some non-repeated states become tabu.

68 Tabu Search Tabu List: list of moves that cannot be undone. Tabu Tenure: the length of time a move remains tabu. Stopping criteria: no improvement for … iterations; others…

69 Tabu Search Diversification – make sure the whole solution space is sampled; don't get trapped in a small area. Intensification – search attractive areas well. Aspiration Criteria – ignore tabu restrictions if a move is very attractive (and can't cycle), e.g. z(T) < best.

70 Tabu Search Diversification –Penalise solutions near observed local minima –Penalise solution features that appear often –Penalties can “fill the hole” near a local min Intensification –Reward solutions near observed local minima –Reward solution features that appear often Use z'(S) = z(S) + penalties

71 Tabu Search – TSP TSP Diversification – penalise (pred, succ) pairs seen in local minima. TSP Intensification – reward (pred, succ) pairs seen in local minima. z′(S) = z(S) + Σ_{ij} w_{ij}·count(i, j), where count(i, j) is how many times we have seen arc (i, j), and w_{ij} is a weight depending on the diversify/intensify cycle.

72 Adaptive Tabu Search If t (tenure) is too small, we will return to the same local minimum. Adaptively modify t: if we see the same local minimum again, increase t; when we see evidence that the local minimum has been escaped (e.g. an improved solution), lower t. … my current favourite

73 Tabu Search
1. Generate initial solution S; S* = S
2. Generate V* ⊆ N(S) – not tabu, or meets the aspiration criteria
3. Find T ∈ V* which minimises z′
4. S = T
5. If z(S) < z(S*) then S* = S
6. Update tabu list, aspiration criteria, t
7. If stopping condition not met, goto 2
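
A sketch of this loop with a move-based tabu list of fixed tenure and the "beats the best found" aspiration criterion; `moves`, `apply_move`, and `inverse` are assumed problem-specific callbacks:

```python
from collections import deque

def tabu_search(s, z, moves, apply_move, inverse, tenure=10, max_iters=1000):
    """Tabu search sketch: take the best non-tabu move each iteration;
    the inverse of each applied move stays tabu for `tenure` iterations."""
    best = s
    tabu = deque(maxlen=tenure)        # fixed-tenure tabu list of moves
    for _ in range(max_iters):
        best_t, best_m = None, None
        for m in moves(s):
            t = apply_move(s, m)
            # m is forbidden if it would undo a recent move, unless it
            # meets the aspiration criterion (improves on the best found)
            if m in tabu and z(t) >= z(best):
                continue
            if best_t is None or z(t) < z(best_t):
                best_t, best_m = t, m
        if best_t is None:             # entire neighbourhood is tabu
            break
        tabu.append(inverse(best_m))   # make undoing this move tabu
        s = best_t
        if z(s) < z(best):
            best = s
    return best
```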

74 Path Relinking Basic idea: given two good solutions, perhaps a better solution lies somewhere in between. Try to combine "good features" from the two solutions; gradually convert one solution into the other.

75 Path Re-linking TSP:

76 Genetic Algorithms Simulated Annealing and Tabu Search have a single "incumbent" solution (plus the best found). Genetic Algorithms are "population-based" heuristics – a population of solutions is maintained.

77 Genetic Algorithms Problems are solved by an evolutionary process resulting in a best (fittest) solution (survivor). Evolutionary Computing – 1960s, by I. Rechenberg. Genetic Algorithms – invented by John Holland, 1975; made popular by John Koza, 1992. Nature solves some pretty tough questions – let's use the same method. …begs the question: if Homo sapiens is the answer, what was the question?

78 Genetic Algorithms Vocabulary Gene – An encoding of a single part of the solution space (often binary) Genotype – Coding of a solution Phenotype – The corresponding solution Chromosome – A string of “Genes” that represents an individual – i.e. a solution. Population - The number of “Chromosomes” available to test

79 Vocabulary (diagram: genotypes, the coded solutions in the search space, map to phenotypes, the actual solutions in the solution space, where fitness is measured) Note: in some evolutionary algorithms there is no clear distinction between genotype and phenotype.

80 Vocabulary

81 Crossover

82 Mutation Alter each gene independently with probability p_m (the mutation rate); typically 1/pop_size < p_m < 1/chromosome_length.

83 Reproduction Chromosomes are selected for crossover to produce offspring. Obey the law of Darwin: the best survive and create offspring. Selection schemes: roulette-wheel selection, tournament selection, rank selection, steady-state selection.

84 Roulette Wheel Selection (example fitnesses: Chr. 1 = 3, Chr. 2 = 1, Chr. 3 = 2) Main idea: better individuals get a higher chance – chances proportional to fitness – assign to each individual a part of the roulette wheel – spin the wheel n times to select n individuals.
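
A sketch of the spin, using the slide's example fitnesses (3, 1, 2 – assuming that is how the garbled table reads):

```python
import random

def roulette_select(population, fitness, n):
    """Spin the wheel n times; each spin picks an individual with
    probability proportional to its fitness."""
    total = sum(fitness)
    picks = []
    for _ in range(n):
        r, acc = random.uniform(0, total), 0.0
        for individual, f in zip(population, fitness):
            acc += f
            if r <= acc:
                picks.append(individual)
                break
    return picks

# With fitnesses 3, 1, 2 the selection probabilities are 1/2, 1/6, 1/3
print(roulette_select(["chr1", "chr2", "chr3"], [3, 1, 2], n=4))
```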

85 Tournament Selection A tournament among N randomly chosen individuals (N = 2) is held; the individual with the highest fitness value is the winner. Tournaments are repeated until the mating pool for generating new offspring is filled.
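
Tournament selection is even simpler; a sketch under the same assumptions:

```python
import random

def tournament_select(population, fitness, n, size=2):
    """Hold n tournaments of `size` random competitors each;
    the fittest competitor wins each tournament."""
    indices = range(len(population))
    return [population[max(random.sample(indices, size),
                           key=lambda i: fitness[i])]
            for _ in range(n)]
```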

86 Rank Selection Roulette-wheel has a problem when the fitness values differ greatly. In rank selection the worst individual has fitness 1, the next 2, …, the best has fitness N.

87 Rank Selection vs Roulette (pie charts: roulette wheel gives selection shares 75% / 10% / 8% / 5% / 2%; rank gives 33% / 27% / 20% / 13% / 7%)

88 Crossover Single-site crossover, multi-point crossover, uniform crossover.

89 Single-site Choose a random point on the two parents; split the parents at this crossover point; create children by exchanging tails. p_c typically in the range (0.6, 0.9).

90 n-point crossover Choose n random crossover points; split along those points; glue parts, alternating between parents. Generalisation of 1-point (still some positional bias).

91 Uniform crossover Assign 'heads' to one parent, 'tails' to the other Flip a coin for each gene of the first child Make an inverse copy for the second child Inheritance is independent of position
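
Uniform crossover and per-gene mutation (slide 82) as sketches over bit-string chromosomes:

```python
import random

def uniform_crossover(p1, p2):
    """One coin flip per gene: child 1 takes the gene from p1 on heads,
    from p2 on tails; child 2 is the inverse copy."""
    heads = [random.random() < 0.5 for _ in p1]
    c1 = [a if h else b for a, b, h in zip(p1, p2, heads)]
    c2 = [b if h else a for a, b, h in zip(p1, p2, heads)]
    return c1, c2

def mutate(chrom, p_m):
    """Flip each bit independently with probability p_m."""
    return [1 - g if random.random() < p_m else g for g in chrom]
```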

92 Genetic Algorithm

93 Memetic Algorithm Memetic Algorithm = Genetic Algorithm + Local Search E.g.: –LS after mutation –LS after crossover

94 Demo

95 Ant Colony Optimization Another "biological analogue". Observation: ants are very simple creatures but can achieve complex behaviours. They use pheromones to communicate.

96 Ant Colony Optimization An ant leaves a pheromone trail; trails influence subsequent ants; trails evaporate over time. E.g. in TSP – shorter tours leave more pheromone; evaporation helps avoid premature intensification.

97 ACO for TSP p_k(i, j) is the probability of moving from i to j at iteration k. In the standard formulation: p_k(i, j) = τ(i, j)^α · η(i, j)^β / Σ_{l ∈ allowed} τ(i, l)^α · η(i, l)^β, where τ is the pheromone level, η(i, j) = 1/d(i, j) is the heuristic desirability, and α, β are parameters.

98 ACO for TSP The pheromone trail evaporates at rate ρ: τ(i, j) ← (1 − ρ)·τ(i, j). Pheromone is added in proportion to tour quality, e.g. Q/L for each arc of a tour of length L.
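
A compact sketch of the classic Ant System for the TSP following slides 97–98; the parameter names (α, β, ρ, Q) and default values follow the standard formulation, not necessarily the lecture's:

```python
import random

def ant_system_tsp(dist, n_ants=20, iters=100, alpha=1.0, beta=3.0,
                   rho=0.1, q=1.0):
    """Ant System sketch: ants build tours guided by pheromone tau and
    heuristic eta = 1/d; pheromone evaporates at rate rho and is
    reinforced in proportion to tour quality (q / tour length)."""
    n = len(dist)
    tau = [[1.0] * n for _ in range(n)]
    best_tour, best_len = None, float("inf")

    def tour_length(tour):
        return sum(dist[tour[i]][tour[(i + 1) % n]] for i in range(n))

    for _ in range(iters):
        tours = []
        for _ in range(n_ants):
            start = random.randrange(n)
            tour, unvisited = [start], set(range(n)) - {start}
            while unvisited:
                i = tour[-1]
                # transition probabilities: tau^alpha * eta^beta
                weights = [tau[i][j] ** alpha * (1.0 / dist[i][j]) ** beta
                           for j in unvisited]
                j = random.choices(list(unvisited), weights=weights)[0]
                tour.append(j)
                unvisited.remove(j)
            tours.append(tour)
        # evaporation, then deposit q/L on every arc of every tour
        for i in range(n):
            for j in range(n):
                tau[i][j] *= (1 - rho)
        for tour in tours:
            length = tour_length(tour)
            if length < best_len:
                best_tour, best_len = tour, length
            for i in range(n):
                a, b = tour[i], tour[(i + 1) % n]
                tau[a][b] += q / length
                tau[b][a] += q / length
    return best_tour, best_len
```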

99 References Emile Aarts and Jan Karel Lenstra (Eds), Local Search in Combinatorial Optimization, Princeton University Press, Princeton, NJ, 2003. Holger H. Hoos and Thomas Stützle, Stochastic Local Search: Foundations and Applications, Elsevier, 2005.