Simulated Annealing G. Anuradha


What is it? Simulated annealing is a stochastic optimization method that derives its name from the annealing process used to re-crystallize metals. It belongs to the family of derivative-free, stochastic optimization techniques often grouped with evolutionary methods.

What is annealing? Annealing is a heat process whereby a metal is heated to a specific temperature and then allowed to cool slowly. This softens the metal, which means it can be cut and shaped more easily.

What happens during annealing? Initially, when the metal is heated to a high temperature, the atoms have plenty of freedom to move about. As the temperature is slowly reduced, the movement of the free atoms is gradually restricted, and finally the metal crystallizes into an ordered, low-energy structure.

Relation between annealing and simulated annealing Simulated annealing is analogous to this physical annealing process. Initially the search region is large: the input parameters are explored over a wide, mostly random space, and with each iteration this region gradually shrinks. This helps the algorithm reach a globally optimized value, although it takes considerably more time than a purely local search.

Analogy between annealing and simulated annealing
Thermodynamic system: the quantity of interest is the energy. At high temperatures, high-mobility atoms try to orient themselves with other, non-local atoms, and the energy state can occasionally go up. At low temperatures, low-mobility atoms can only orient themselves with local atoms, and the energy state is not likely to go up again.
Simulated annealing: the corresponding quantity is the value of the objective function. At high temperatures, SA allows function evaluations at faraway points and is likely to accept a new point. At low temperatures, SA evaluates the objective function only at local points, and the likelihood of accepting a new point with higher energy is much lower.

Cooling Schedule The cooling schedule specifies how rapidly the temperature is lowered from high to low values. It is usually application specific and requires some experimentation by trial and error.

Fundamental terminologies in SA
Objective function
Generating function
Acceptance function
Annealing schedule

Objective function E = f(x), where each x is viewed as a point in an input space. The task of SA is to sample the input space effectively to find an x that minimizes E.

Generating function A generating function specifies the probability density function of the difference between the current point and the next point to be visited: Δx = x_new - x is a random variable with probability density function g(Δx, T), where T is the temperature.
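A minimal Python/NumPy sketch of one possible generating function (the Gaussian choice is consistent with the "conventional SA" remark later in these slides; making the spread grow with T is an assumption, not taken from the original slide):

import numpy as np

def generate_step(x, T, rng=None):
    # Draw delta_x from a zero-mean Gaussian whose spread grows with the
    # temperature T, so a hot system proposes larger jumps than a cold one.
    rng = rng if rng is not None else np.random.default_rng()
    delta_x = rng.normal(loc=0.0, scale=np.sqrt(T), size=np.shape(x))
    return np.asarray(x) + delta_x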

Acceptance function Decides whether to accept or reject the new value x_new. The acceptance probability depends on the change in the objective value ΔE, the temperature T, and a system-dependent constant c. If ΔE is negative (the new point is better), SA accepts it; if ΔE is positive, SA may still accept the higher-energy state, with a probability that shrinks as T decreases. Consequently, SA can initially move both uphill and downhill.
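As a concrete illustration, a minimal sketch of a Metropolis-style acceptance rule. The exponential form and the default c = 1.0 are assumptions; they match the Metropolis criterion used in the algorithm steps later, not necessarily the exact formula on the original slide.

import math, random

def accept(delta_E, T, c=1.0):
    # Always accept moves that lower the energy; otherwise accept the
    # higher-energy state with probability exp(-delta_E / (c * T)).
    if delta_E < 0:
        return True
    return random.random() < math.exp(-delta_E / (c * T))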

Annealing schedule Specifies how the temperature T is decreased, typically by a certain percentage at each iteration (for example, geometric cooling T ← αT with 0 < α < 1).

Steps involved in general SA method

Steps involved in general SA method In conventional SA, the generating function is a Gaussian probability density function (the same scheme used in the Boltzmann machine).

Travelling Salesman Problem In a typical TSP instance there are n cities, and the distances (or costs) between all pairs of cities are given by an n x n distance (or cost) matrix D, where element d_ij represents the distance (cost) of travelling from city i to city j. The problem is to find a closed tour in which each city, except for the starting one, is visited exactly once, such that the total length (cost) is minimized. TSP is a combinatorial optimization problem; it belongs to a class of problems known as NP-complete.
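To make the setup concrete, a small Python sketch (the 0-based indexing and function name are illustrative assumptions) that evaluates the total length of a closed tour from the n x n distance matrix D:

import numpy as np

def tour_length(tour, D):
    # tour is a permutation of the city indices 0..n-1; the tour is closed,
    # so the last city connects back to the first one.
    n = len(tour)
    return sum(D[tour[i], tour[(i + 1) % n]] for i in range(n))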

TSP Inversion: Remove two edges from the tour and replace them with two new edges so that the result is another legal tour (this reverses the segment between the two cut points).

TSP Translation: Remove a section of the tour (e.g., cities 8-7) and then reinsert it between two randomly selected consecutive cities (e.g., 4 and 5).

TSP Switching: Randomly select two cities and switch them in the tour

Put together: inversion, translation and switching are alternative ways of generating a neighbouring tour; see the sketch below.
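A hedged Python sketch of the three moves operating on a tour represented as a list of city indices; the function names and implementation details are one reasonable reading of the slides, not the author's code. At each SA iteration, one of the three moves can be chosen at random to propose a new tour.

import random

def inversion(tour):
    # Remove two edges by reversing the segment between two cut points.
    i, j = sorted(random.sample(range(len(tour)), 2))
    return tour[:i] + tour[i:j + 1][::-1] + tour[j + 1:]

def translation(tour):
    # Remove a section of the tour and reinsert it at another position.
    i, j = sorted(random.sample(range(len(tour)), 2))
    section, rest = tour[i:j + 1], tour[:i] + tour[j + 1:]
    k = random.randint(0, len(rest))
    return rest[:k] + section + rest[k:]

def switching(tour):
    # Randomly select two cities and swap their positions in the tour.
    new = tour[:]
    i, j = random.sample(range(len(new)), 2)
    new[i], new[j] = new[j], new[i]
    return new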

SA (extracts from Sivanandam)
Step 1: Initialize the vector x to a random point in the set φ.
Step 2: Select an annealing schedule for the parameter T, and initialize T.
Step 3: Compute x_p = x + Δx, where Δx is the proposed change in the system's state.
Step 4: Compute the change in the cost, Δf = f(x_p) - f(x).

Algo contd.
Step 5: Using the Metropolis algorithm, decide whether x_p should be used as the new state of the system or whether to keep the current state x:
pr(x → x_p) = 1, if Δf < 0
pr(x → x_p) = exp(-Δf / T), if Δf ≥ 0
where T plays the role of k_B·T. When Δf ≥ 0, a random number n is drawn from a uniform distribution on [0, 1]. If pr(x → x_p) > n, the state x_p is used as the new state; otherwise the state remains at x.

Algo contd.
Step 6: Repeat Steps 3-5 n times.
Step 7: If an improvement has been made after these n iterations, set the centre point to be the best point found.
Step 8: Reduce the temperature.
Step 9: Repeat Steps 3-8 for t temperatures.
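A compact Python sketch of Steps 1-9 for a continuous objective f. The Gaussian proposal, the geometric cooling factor alpha, and the parameter names are illustrative assumptions, not part of the Sivanandam extract.

import math
import numpy as np

def simulated_annealing(f, x0, T0=1.0, alpha=0.9, n_inner=50, n_temps=100, seed=None):
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float)                 # Step 1: starting point in the search space
    fx = f(x)
    T = T0                                          # Step 2: initial temperature
    best_x, best_f = x.copy(), fx
    for _ in range(n_temps):                        # Step 9: loop over t temperatures
        for _ in range(n_inner):                    # Step 6: n proposals per temperature
            xp = x + rng.normal(0.0, math.sqrt(T), size=x.shape)   # Step 3: x_p = x + dx
            delta_f = f(xp) - fx                    # Step 4: change in cost
            # Step 5: Metropolis rule - always accept improvements, sometimes accept worse points
            if delta_f < 0 or rng.random() < math.exp(-delta_f / T):
                x, fx = xp, fx + delta_f
                if fx < best_f:
                    best_x, best_f = x.copy(), fx
        x, fx = best_x.copy(), best_f               # Step 7: recentre on the best point so far
        T *= alpha                                  # Step 8: reduce the temperature
    return best_x, best_f

# Example usage: minimize a simple quadratic bowl
# x_opt, f_opt = simulated_annealing(lambda v: float(np.sum(v**2)), [3.0, -2.0], seed=0)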

Random Search Explores the parameter space of an objective function sequentially, in a random fashion, to find the optimal point that maximizes or minimizes the objective function. It is simple to implement, but the optimization process can take a long time.

Primitive version (Matyas)
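A hedged sketch of the primitive version, assuming the usual Matyas-style scheme: add a zero-mean Gaussian perturbation and keep the new point only if it improves the objective. The step size sigma and the function name are assumptions, since the original slide's figure is not reproduced here.

import numpy as np

def primitive_random_search(f, x0, sigma=0.1, n_iter=1000, seed=None):
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float)
    fx = f(x)
    for _ in range(n_iter):
        xp = x + rng.normal(0.0, sigma, size=x.shape)  # random Gaussian step around x
        fp = f(xp)
        if fp < fx:                                    # keep the new point only if it improves f
            x, fx = xp, fp
    return x, fx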

Observations on the primitive version These observations lead to two refinements in the modified method: add a reverse step when the original step fails, and use a bias term as the centre of the random vector.

Modified random search

In the modified method, the initial bias is chosen as a zero vector. Each component of dx should be a random variable with zero mean and a variance proportional to the range of the corresponding parameter. This method is primarily used for continuous optimization problems.
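A minimal sketch of the modified random search, assuming a commonly used bias-update scheme (forward step, reverse step, and a decaying bias). The specific update constants (0.4, 0.2, 0.5) and the 0.1-of-range step size are assumptions, not taken from the slides.

import numpy as np

def modified_random_search(f, x0, ranges, n_iter=1000, seed=None):
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float)
    fx = f(x)
    b = np.zeros_like(x)                               # initial bias is the zero vector
    sigma = 0.1 * np.asarray(ranges, dtype=float)      # spread proportional to each parameter's range
    for _ in range(n_iter):
        dx = rng.normal(0.0, 1.0, size=x.shape) * sigma  # zero-mean Gaussian step, per-component spread
        if f(x + b + dx) < fx:                         # forward step succeeded
            x = x + b + dx
            fx = f(x)
            b = 0.2 * b + 0.4 * dx
        elif f(x + b - dx) < fx:                       # reverse step succeeded
            x = x + b - dx
            fx = f(x)
            b = b - 0.4 * dx
        else:                                          # both failed: shrink the bias
            b = 0.5 * b
    return x, fx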