Local Search Algorithms

Combinatorial problems:
- involve finding a grouping, ordering, or assignment of a discrete set of objects that satisfies certain constraints
- arise in many domains of computer science and in various application areas
- have high computational complexity (NP-hard)
- are solved in practice by searching an exponentially large space of candidate/partial solutions

Examples:
- finding shortest/cheapest round trips (TSP)
- planning, scheduling, timetabling
- resource allocation
- protein structure prediction
- genome sequence assembly
- microarchitecture design space exploration

TSP, optimization variant:
- For a given weighted graph G = (V, E, w), find a Hamiltonian cycle in G with minimal weight, i.e., find the shortest round trip visiting each vertex exactly once.

TSP, decision variant:
- For a given weighted graph G = (V, E, w) and a bound b, decide whether a Hamiltonian cycle with weight ≤ b exists in G.
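To make the optimization variant concrete, here is a minimal Python sketch of the objective, assuming the graph is given as a symmetric weight matrix; the brute-force solver is only there to make the search space explicit, and all names are illustrative:

    import itertools

    def tour_weight(tour, w):
        # Total weight of the round trip, including the closing edge
        # back to the starting vertex.
        n = len(tour)
        return sum(w[tour[i]][tour[(i + 1) % n]] for i in range(n))

    def brute_force_tsp(w):
        # Exact solver enumerating all (n-1)! tours with a fixed start;
        # only feasible for tiny instances.
        n = len(w)
        return min(((0,) + rest for rest in itertools.permutations(range(1, n))),
                   key=lambda tour: tour_weight(tour, w))

    # Tiny symmetric instance: w[i][j] is the weight of edge (i, j).
    w = [[0, 2, 9, 10],
         [2, 0, 6, 4],
         [9, 6, 0, 3],
         [10, 4, 3, 0]]
    best = brute_force_tsp(w)
    print(best, tour_weight(best, w))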

Types of search methods:
- systematic ←→ local search
- deterministic ←→ stochastic
- sequential ←→ parallel

Problem solving by search:
- Defining search problems
- Search strategies:
  - Uninformed search strategies
  - Informed (heuristic) search strategies
  - Local search strategies (Hill Climbing, Simulated Annealing, Tabu Search, evolutionary algorithms, PSO, ACO)

Simple local search: a single neighboring state is kept
- Hill climbing: choose the best neighbor
- Simulated annealing: probabilistically choose the best neighbor
- Tabu search: keep a list of recently visited solutions

Beam local search: several states are kept (a population of states)
- Evolutionary algorithms
- Optimization based on swarm behaviour (particle swarm optimisation)
- Ant-based optimization (ant colony optimisation)

Local search:
- start from an initial position
- iteratively move from the current position to a neighboring position
- use an evaluation function for guidance

Two main classes:
- local search on partial solutions
- local search on complete solutions

local search on partial solutions

Local search on partial solutions (DFS with node ordering):
- Fix an ordering of the variables.
- Build a tree in which, at each level, a given variable is assigned a value.
- Perform a depth-first search, but use heuristics to guide it: choose the best child according to some heuristic.

- search space: space of partial solutions
- search steps: extend partial solutions with an assignment for the next element
- solution elements are often ranked according to a greedy evaluation function (see the sketch below)
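As an illustration of this scheme, the following Python sketch constructs a TSP tour by repeatedly extending the partial solution with the element ranked best by a greedy evaluation function (the nearest unvisited city); the weight-matrix representation follows the earlier sketch:

    def greedy_nearest_neighbor(w, start=0):
        # Extend the partial solution (the tour so far) one element at a
        # time, always picking the element ranked best by the greedy
        # evaluation: the closest unvisited city.
        n = len(w)
        tour = [start]
        unvisited = set(range(n)) - {start}
        while unvisited:
            last = tour[-1]
            nearest = min(unvisited, key=lambda c: w[last][c])
            tour.append(nearest)
            unvisited.remove(nearest)
        return tour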

- Once a solution has been found (with the first dive into the tree), we can continue searching the tree with DFS and backtracking.
- In fact, this is what we did with depth-first branch and bound (DFBnB): DFBnB with node ordering.

Limited discrepancy search:
- At each node, the heuristic prefers one of the children.
- A discrepancy is a step that goes against the heuristic.
- Perform DFS from the root node with at most k discrepancies.
- Start with k = 0, then increase k by 1.
- Can stop at any time.
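A minimal recursive sketch of this procedure in Python (the basic variant, which revisits some paths as k grows); the children and is_goal callbacks are assumptions for illustration, not a fixed API:

    def lds(node, k, children, is_goal):
        # DFS allowing at most k discrepancies, i.e. at most k choices
        # that go against the heuristic. children(node) must return
        # successors in heuristic order (preferred child first).
        if is_goal(node):
            return node
        kids = children(node)
        if not kids:
            return None
        # Following the preferred (first) child costs no discrepancy.
        found = lds(kids[0], k, children, is_goal)
        if found is not None:
            return found
        if k > 0:
            for child in kids[1:]:   # each of these costs one discrepancy
                found = lds(child, k - 1, children, is_goal)
                if found is not None:
                    return found
        return None

    def iterative_lds(root, children, is_goal, max_k):
        # Start with k = 0, then increase k by 1; anytime behaviour.
        for k in range(max_k + 1):
            found = lds(root, k, children, is_goal)
            if found is not None:
                return found
        return None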

- Assume a binary tree where the heuristic always prefers the left child.

Advantages:
- Anytime algorithm
- Solutions are ordered according to the heuristic

local search on complete solutions

- initialize search at some point of the search space
- in each step, move from the current search position to a neighboring position with a better evaluation function value

Hill climbing: choose the neighbor with the largest improvement as the next state.

    f-value = evaluation(state)
    while f-value(state) < f-value(next-best(state))
        state := next-best(state)

[Figure: search landscape, states (x-axis) vs. f-value (y-axis).]

function Hill-Climbing(problem) returns a solution state
    current ← Make-Node(Initial-State[problem])
    loop do
        next ← a highest-valued successor of current
        if Value[next] ≤ Value[current] then return current
        current ← next
    end
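A compact Python rendering of the same loop, under the assumption that the problem supplies neighbors and value callbacks (both names are illustrative):

    def hill_climbing(initial, neighbors, value):
        # Steepest-ascent hill climbing: move to the highest-valued
        # neighbor until no neighbor improves on the current state.
        current = initial
        while True:
            nbrs = neighbors(current)
            if not nbrs:
                return current
            best = max(nbrs, key=value)
            if value(best) <= value(current):
                return current   # local maximum (or plateau)
            current = best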

Typical problems with local search (with hill climbing in particular):
- getting stuck in local optima
- being misguided by the evaluation/objective function

Randomization:
- randomize the initialization step
- randomize search steps so that suboptimal/worsening steps are allowed
- improved performance and robustness
- typically, the degree of randomization is controlled by a noise parameter

Pros:
- for many combinatorial problems, more efficient than systematic search
- easy to implement
- easy to parallelize

Cons:
- often incomplete (no guarantees for finding existing solutions)
- highly stochastic behavior
- often difficult to analyze theoretically/empirically

Random Search (Blind Guessing):
- In each step, randomly select one element of the search space.

(Uninformed) Random Walk:
- In each step, randomly select one of the neighboring positions in the search space and move there.
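A sketch of both baselines in Python, assuming the same neighbors/value callbacks as above plus a sample_state function that draws a uniform element of the search space (all illustrative):

    import random

    def random_search(sample_state, value, steps):
        # Blind guessing: evaluate independently drawn random states.
        return max((sample_state() for _ in range(steps)), key=value)

    def random_walk(initial, neighbors, value, steps):
        # Uninformed walk: move to a uniformly chosen neighbor,
        # remembering the best state seen so far.
        current = best = initial
        for _ in range(steps):
            current = random.choice(neighbors(current))
            if value(current) > value(best):
                best = current
        return best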

Random-restart hill climbing: hill climbing may get stuck at a point where no progress is possible. We can improve it as follows:
1. Pick a random starting point and run hill climbing.
2. If the solution found is better than the best one found so far, keep it.
3. Go back to step 1.
When do we stop?
- After a fixed number of iterations.
- After a fixed number of iterations in which no improvement over the best solution found so far was found.

[Figure: search landscape, states (x-axis) vs. f-value (y-axis).]

Randomized hill climbing:
- initialize search at some point of the search space
- search steps:
  - with probability p, move from the current search position to a randomly selected neighboring position
  - otherwise, move from the current search position to the neighboring position with the better evaluation function value
- there are many variations of how to choose the random neighbor, and how many of them
- example: take 100 steps in one direction ("army mistake correction") to escape from local optima
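A sketch combining the two step types, with the noise parameter p as described (the callback names are again illustrative):

    import random

    def randomized_hill_climbing(initial, neighbors, value, p, steps):
        # With probability p take a uniformly random neighbor (which may
        # worsen the state but can escape local optima); otherwise move
        # greedily to the best neighbor.
        current = best = initial
        for _ in range(steps):
            nbrs = neighbors(current)
            if random.random() < p:
                current = random.choice(nbrs)
            else:
                current = max(nbrs, key=value)
            if value(current) > value(best):
                best = current
        return best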

Simulated annealing is a combinatorial search technique inspired by the physical process of annealing [Kirkpatrick et al. 1983, Cerny 1985].

Outline:
- Select a neighbor at random.
- If it is better than the current state, go there.
- Otherwise, go there with some probability.
- The probability goes down with time (similar to the temperature cooling).

Acceptance probability: e^(ΔE/T)

function Simulated-Annealing(problem, schedule) returns a solution state
    current ← Make-Node(Initial-State[problem])
    for t ← 1 to ∞
        T ← schedule[t]   // T goes downwards
        if T = 0 then return current
        next ← Random-Successor(current)
        ΔE ← f-Value[next] − f-Value[current]
        if ΔE > 0 then current ← next
        else current ← next with probability e^(ΔE/T)
    end
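The pseudocode translates almost line for line into Python; a minimal sketch, assuming a maximization objective f, a neighbors callback, and a cooling schedule given as a function of t (all illustrative):

    import itertools
    import math
    import random

    def simulated_annealing(initial, neighbors, f, schedule):
        current = initial
        for t in itertools.count(1):
            T = schedule(t)              # T goes downwards
            if T <= 0:
                return current
            nxt = random.choice(neighbors(current))
            delta = f(nxt) - f(current)  # ΔE
            # Always accept improvements; accept worsening moves with
            # probability e^(ΔE/T), which shrinks as T cools.
            if delta > 0 or random.random() < math.exp(delta / T):
                current = nxt

    # Example of a simple geometric cooling schedule:
    # schedule = lambda t: 0 if 100 * 0.95 ** t < 1e-3 else 100 * 0.95 ** t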

Baseline implementation:
- start with a random initial solution
- use a 2-exchange neighborhood (sketched below)
- simple annealing schedule
- relatively poor performance

Improvements:
- look-up table for acceptance probabilities
- neighborhood pruning
- low-temperature starts
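The 2-exchange neighborhood mentioned above can be sketched as follows for a tour given as a list of cities (an illustrative, untuned version):

    def two_exchange_neighbors(tour):
        # All tours obtained by removing two edges and reversing the
        # segment between them (2-opt moves); the starting city is fixed,
        # and length-1 reversals (no-ops) are skipped.
        n = len(tour)
        return [tour[:i] + tour[i:j][::-1] + tour[j:]
                for i in range(1, n - 1)
                for j in range(i + 2, n + 1)]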

Simulated annealing...
- is historically important
- is easy to implement
- has interesting theoretical properties (convergence), but these are of very limited practical relevance
- achieves good performance, often at the cost of substantial run-times

Tabu search is a combinatorial search technique that relies heavily on an explicit memory of the search process to guide the search [Glover 1989, 1990]:
- the memory typically contains only specific attributes of previously seen solutions
- simple tabu search strategies exploit only short-term memory
- more complex tabu search strategies also exploit long-term memory

- in each step, move to the best neighboring solution, although it may be worse than the current one
- to avoid cycles, tabu search tries to avoid revisiting previously seen solutions by basing the memory on attributes of recently seen solutions
- the tabu list stores attributes of the tl most recently visited solutions; the parameter tl is called the tabu list length or tabu tenure
- solutions which contain tabu attributes are forbidden

Problem: previously unseen solutions may be tabu
- use aspiration criteria to override the tabu status

Stopping criteria:
- all neighboring solutions are tabu
- maximum number of iterations exceeded
- number of iterations without improvement

Variants: Robust Tabu Search [Taillard 1991], Reactive Tabu Search [Battiti & Tecchiolli 1994–1997]
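A minimal Python sketch of this scheme; for simplicity the complete state stands in for its own attribute, and a new global best overrides tabu status (the aspiration criterion). All names are illustrative:

    from collections import deque

    def tabu_search(initial, neighbors, value, tl, max_iters):
        current = best = initial
        tabu = deque(maxlen=tl)          # tabu tenure tl
        for _ in range(max_iters):
            allowed = [s for s in neighbors(current)
                       if s not in tabu or value(s) > value(best)]
            if not allowed:              # all neighboring solutions are tabu
                break
            # Move to the best allowed neighbor, even if it is worse.
            current = max(allowed, key=value)
            tabu.append(current)
            if value(current) > value(best):
                best = current
        return best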

Genetic algorithms are a combinatorial search technique inspired by the evolution of biological species:
- a population of individual solutions is represented as strings
- individuals within the population are evaluated based on their "fitness" (evaluation function value)
- the population is manipulated via evolutionary operators:
  - mutation
  - crossover
  - selection

How to generate the next generation:
1. Selection: select a number of states from the current generation (the fitness function can be used in any reasonable way).
2. Crossover: select 2 states and reproduce a child.
3. Mutation: change some of the genes.

POP = initial population
repeat {
    NEW_POP = EMPTY
    for i = 1 to POP_SIZE {
        x = fit_individual   // natural selection
        y = fit_individual
        child = cross_over(x, y)
        if (small random probability)
            mutate(child)
        add child to NEW_POP
    }
    POP = NEW_POP
} until solution found
return (best state in POP)
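A Python rendering of this pseudocode; selection here is fitness-proportionate via random.choices, and the operator callbacks are assumptions of the sketch (fitness values must be non-negative for the weights):

    import random

    def genetic_algorithm(pop, fitness, cross_over, mutate, p_mut, generations):
        for _ in range(generations):
            weights = [fitness(ind) for ind in pop]   # natural selection
            new_pop = []
            for _ in range(len(pop)):
                x, y = random.choices(pop, weights=weights, k=2)
                child = cross_over(x, y)
                if random.random() < p_mut:           # small random probability
                    child = mutate(child)
                new_pop.append(child)
            pop = new_pop
        return max(pop, key=fitness)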

Genetic algorithms:
- use populations, which leads to increased search space exploration
- allow for a large number of different implementation choices
- typically reach their best performance when using operators that are based on problem characteristics
- achieve good performance on a wide range of problems