Simulated Annealing: The Basics and Pseudo Code


Simulated Annealing: The Basics and Pseudo Code

SA Basics
- Fundamentally simple algorithm, not much more intelligent than guessing
- Relies on the algorithm's ability to span the search space effectively
- Uses simple but effective means to avoid local minima
- The name derives from annealing in metallurgy, where a material is heated and then slowly cooled so that its structure settles into a low-energy state

SA Glossary
- Solution: an answer to the problem, without regard to its apparent "value"
- Neighbor: a solution that is "next to" a given solution in the solution space (its neighborhood)
- Fitness: the "value" of a solution
- Temperature: a control parameter that determines how readily less fit solutions are currently accepted
- Annealing Schedule: the function that lowers (or raises) the temperature the algorithm uses during its search

SA Pseudo Code

    S0 = GenerateSolution()
    Temp = START_TEMP
    K = 0                                  // iteration count
    while (!stopping_condition(S0, Temp, K))
        S1 = Neighbor(S0)                  // make a neighbor of S0
        if (Fitness(S1) < Fitness(S0))
            S0 = S1                        // if S1 is better than S0, take it
        else if (rand() < tempFunc(S0, S1, Temp, K))
            S0 = S1                        // if S1 is worse than S0, conditionally take it
        end if
        AnnealingSchedule(S0, Temp, K)     // anneal the temperature
        K = K + 1                          // increase the iteration count
    end while

This does not deal with mesa situations, where Fitness(S1) == Fitness(S0).
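As a concrete illustration of the loop above, here is a minimal runnable sketch in Python, assuming a minimization problem; generate_solution, neighbor, and fitness are placeholder callables you would supply for your own problem, and the simple geometric cooling stands in for AnnealingSchedule:

    import math
    import random

    def simulated_annealing(generate_solution, neighbor, fitness,
                            start_temp=1000.0, cooling=0.9996,
                            min_temp=1e-3, max_iters=100_000):
        # Hypothetical interface: generate_solution() -> solution,
        # neighbor(s) -> a nearby solution, fitness(s) -> value to minimize.
        s0 = generate_solution()
        f0 = fitness(s0)
        temp = start_temp
        for _ in range(max_iters):
            if temp < min_temp:
                break                          # stopping condition on temperature
            s1 = neighbor(s0)                  # propose a neighboring solution
            f1 = fitness(s1)
            if f1 < f0:
                s0, f0 = s1, f1                # better solution: always take it
            elif random.random() < math.exp(-(f1 - f0) / temp):
                s0, f0 = s1, f1                # worse solution: conditionally take it
            temp *= cooling                    # simple geometric annealing schedule
        return s0, f0

For a real problem only the three callables change; the loop itself stays the same. The swap move sketched under "What is a Neighbor?" below is one plausible choice for neighbor.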

What is a Neighbor?
- A neighbor is just a permutation based on the current solution
- This is how the algorithm moves around the search space
- In the fuel bundle shown in the slide's figure, a neighbor would be a different arrangement of the fuel pins
[Figure: fuel bundle, image copyright 1997 Toshiba Corporation]
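As a hypothetical illustration, if the solution is an ordering of items (fuel pins in a bundle, cities in a tour), one common neighbor move is to swap two randomly chosen positions:

    import random

    def swap_neighbor(solution):
        # Produce a neighbor by swapping two randomly chosen positions.
        # Assumes the solution is a list whose ordering is what we are optimizing
        # (e.g. an arrangement of fuel pins or a tour of cities).
        s = list(solution)                     # copy so the current solution is untouched
        i, j = random.sample(range(len(s)), 2)
        s[i], s[j] = s[j], s[i]
        return s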

Why Temperature?
- Simulated Annealing uses temperature as a way to escape local minima
- Hill climbing can get stuck: because it never takes worse neighbors, it cannot move back out of a local minimum
- SA uses an annealing schedule to determine how high the temperature should be, and how long it should remain at a given temperature
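To see why this helps, consider how the probability of accepting a solution that is worse by some fixed amount shrinks as the temperature drops. A quick illustrative calculation, using the exponential acceptance rule described on the Metropolis-Hastings slide below (the numbers are made up purely for illustration):

    import math

    # Probability of accepting a solution that is worse by delta = 10
    # at a few temperatures (illustrative values only).
    delta = 10.0
    for temp in (1000.0, 100.0, 10.0, 1.0):
        print(f"temp = {temp:6.1f}   accept probability = {math.exp(-delta / temp):.4f}")

At high temperature almost every worse move is accepted, letting the search climb out of a local minimum; at low temperature almost none are, so the search settles into a descent.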

What Annealing Schedule?
- This is a complicated question! There are linear schedules, geometric schedules, oscillating schedules, and schedules based on the rate of improvement
- Some SA algorithms even use a genetic algorithm (GA) to design their annealing schedule!
- A very basic, yet effective, schedule is geometric but stepped: T = T * 0.9996, applied only every M iterations, where M is either determined by the rate of improvement or set to a hard figure (say, every 1000 iterations)
- Quantitative analysis is the only way to determine the most effective schedule for a given problem
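A minimal sketch of such a stepped geometric schedule, using the 0.9996 factor and a hard step length from the slide purely as illustrative defaults:

    def stepped_geometric_schedule(temp, iteration, factor=0.9996, step=1000):
        # Hold the temperature for `step` iterations, then apply the geometric factor.
        # Both `factor` and `step` are illustrative values, not prescriptions.
        if iteration > 0 and iteration % step == 0:
            return temp * factor
        return temp

In the loop sketched earlier this would replace the unconditional temp *= cooling line, with the iteration count passed in as the second argument.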

Putting it all Together
- When the new solution is worse than the old one, you must decide whether the bad solution should be accepted
- Another choice must be made: how do you interpret the temperature?
- The Metropolis-Hastings algorithm is commonly used (and has interesting ties to SA)
- The algorithm approximately samples from a probability distribution and was originally devised for the Boltzmann distribution
- It lets you generate samples for problems where it is hard to sample the entire space effectively, which is exactly what SA requires!

Metropolis-Hastings
- The probability of accepting a worse solution is
  P(accept) = exp(-(Fitness(S1) - Fitness(S0)) / (k * Temp))
  where k plays the role of Boltzmann's constant and is often simply set to 1
- This acceptance rule was given by Kirkpatrick et al. and does not exactly replicate what occurs in natural annealing
- Because acceptance depends on how much worse the new solution is relative to the current temperature, this approach takes bad guesses frequently while the temperature is high, and eventually settles into a descent until a minimum is found
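A small sketch of this acceptance test, assuming minimization and taking the Boltzmann-constant analogue k as 1, as is common in SA implementations:

    import math
    import random

    def accept_worse(fitness_old, fitness_new, temp):
        # Kirkpatrick-style acceptance test for a worse solution (minimization),
        # with the Boltzmann-constant analogue k taken as 1.
        delta = fitness_new - fitness_old      # positive when the new solution is worse
        return random.random() < math.exp(-delta / temp)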

Conclusion
- Simple changes to a greedy or hill-climbing algorithm can make effective use of computational resources to find solutions in a solution space too hard or too large to sample fully
- Simulated Annealing can be applied to many applications due to its extremely simple nature
- Simulated Annealing can be a good choice for a first-guess solution when optimizing a problem
- For certain problems, SA is among the best methods for finding optimal answers