Simulated Annealing G.Anuradha

What is it? Simulated Annealing is a stochastic optimization method that derives its name from the annealing process used to re-crystallize metals. It comes under the category of evolutionary techniques of optimization.

What is annealing? Annealing is a heat process whereby a metal is heated to a specific temperature and then allowed to cool slowly. This softens the metal, which means it can be cut and shaped more easily.

What happens during annealing? Initially, when the metal is heated to high temperatures, the atoms have lots of space to move about. As the temperature is slowly reduced, the movement of the free atoms is reduced as well, and finally the metal crystallizes.

Relation between annealing and simulated annealing Simulated annealing is analogous to this annealing process. Initially the search region is large: the input parameters are sampled over a wide, mostly random space, and with each iteration this space shrinks. This helps in achieving a globally optimized value, although it takes much more time to optimize.

Analogy between annealing and simulated annealing
Annealing
– Energy in a thermodynamic system
– At high temperatures, high-mobility atoms try to orient themselves with other non-local atoms, and the energy state can occasionally go up.
– At low temperatures, low-mobility atoms can only orient themselves with local atoms, and the energy state is not likely to go up again.
Simulated Annealing
– Value of the objective function
– At high temperatures, SA allows function evaluations at faraway points and is likely to accept a new point.
– At low temperatures, SA evaluates the objective function only at local points, and the likelihood of it accepting a new point with higher energy is much lower.

Cooling Schedule The cooling schedule determines how rapidly the temperature is lowered from high to low values. This is usually application specific and requires some experimentation by trial and error.

Fundamental terminologies in SA Objective function Generating function Acceptance function Annealing schedule

Objective function E = f(x), where each x is viewed as a point in an input space. The task of SA is to sample the input space effectively to find an x that minimizes E.

Generating function
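The slide's formula was not captured in the transcript. In conventional SA the generating function is a probability density used to propose the next candidate point from the current one; a Gaussian whose spread scales with the temperature is a common choice. A minimal sketch (the `scale` parameter and the exact form are illustrative assumptions):

```python
import random

def generate(x, T, scale=1.0):
    # Perturb each coordinate with Gaussian noise whose standard
    # deviation grows with the temperature T, so high temperatures
    # propose faraway points and low temperatures stay local.
    return [xi + random.gauss(0.0, scale * T) for xi in x]
```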

Acceptance function Decides whether to accept or reject a new point x_new, based on a probability involving c (a system-dependent constant), the temperature T, and the change in energy ΔE (the formula itself was not captured in the transcript). If ΔE is negative, SA accepts the new point; if ΔE is positive, SA may still accept the higher-energy state with some probability. This is why SA initially moves both uphill and downhill.
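As a concrete sketch of such an acceptance rule, here is the widely used Metropolis-style criterion (one common choice, not necessarily the slide's exact formula):

```python
import math
import random

def accept(delta_e, T, c=1.0):
    # Metropolis-style acceptance: downhill moves (delta_e <= 0) are
    # always accepted; uphill moves are accepted with probability
    # exp(-delta_e / (c*T)), which shrinks as the temperature T drops.
    if delta_e <= 0:
        return True
    return random.random() < math.exp(-delta_e / (c * T))
```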

Annealing schedule Decrease the temperature T by a certain percentage at each iteration.
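For example, a simple geometric schedule (the factor 0.95 is just an illustrative choice):

```python
def cooled(T0, alpha, k):
    # Geometric annealing schedule: after k temperature reductions
    # the temperature is T0 * alpha**k, with alpha just below 1
    # (e.g. 0.95 lowers T by 5% per iteration).
    return T0 * alpha ** k
```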

Steps involved in general SA method

A Gaussian probability density function (the Boltzmann machine) is used as the generating function in conventional SA.

Travelling Salesman Problem In a typical TSP there are n cities, and the distances (or costs) between all pairs of cities form an n x n distance (or cost) matrix D, where the element dij represents the distance (cost) of traveling from city i to city j. The problem is to find a closed tour in which each city, except the starting one, is visited exactly once, such that the total length (cost) is minimized. This is a combinatorial optimization problem; it belongs to a class of problems known as NP-complete.

TSP Inversion: Remove two edges from the tour and replace them to make it another legal tour.

TSP Translation: Remove a section (8-7) of the tour and then reinsert it between two randomly selected consecutive cities (4 and 5).

TSP Switching: Randomly select two cities and switch them in the tour
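The three moves above can be sketched as list operations on a tour (the index arguments are assumed valid; a real implementation would pick them at random):

```python
def inversion(tour, i, j):
    # Reverse the segment between positions i and j: this removes two
    # edges from the tour and reconnects it as another legal tour.
    i, j = min(i, j), max(i, j)
    return tour[:i] + tour[i:j + 1][::-1] + tour[j + 1:]

def translation(tour, i, j, k):
    # Remove the section tour[i..j] and reinsert it after position k
    # of the remaining tour.
    section = tour[i:j + 1]
    rest = tour[:i] + tour[j + 1:]
    return rest[:k + 1] + section + rest[k + 1:]

def switching(tour, i, j):
    # Swap the cities at two chosen positions in the tour.
    t = tour[:]
    t[i], t[j] = t[j], t[i]
    return t
```

Each move returns a new tour containing exactly the same cities, so any of them can serve as the "proposed change" inside an SA loop.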

Put together

SA (extracts from Sivanandam) Step 1: Initialize the vector x to a random point in the set φ. Step 2: Select an annealing schedule for the parameter T, and initialize T. Step 3: Compute xp = x + Δx, where Δx is the proposed change in the system's state. Step 4: Compute the change in the cost, Δf = f(xp) - f(x).

Algo contd….

Step 6: Repeat steps 3-5 n times. Step 7: If an improvement has been made after the n iterations, set the centre point to be the best point. Step 8: Reduce the temperature. Step 9: Repeat steps 3-8 for t temperatures.
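The steps above can be put together as a short sketch on a continuous test function (the parameter names and the Metropolis acceptance rule are illustrative assumptions, not the slides' exact formulas):

```python
import math
import random

def simulated_annealing(f, x0, T0=10.0, alpha=0.9, n_inner=20,
                        n_temps=50, step=1.0, c=1.0):
    x, best = list(x0), list(x0)
    T = T0
    for _ in range(n_temps):                           # step 9: t temperatures
        for _ in range(n_inner):                       # step 6: n inner moves
            # step 3: propose xp = x + dx
            xp = [xi + random.uniform(-step, step) for xi in x]
            df = f(xp) - f(x)                          # step 4: change in cost
            # step 5: accept downhill moves always, uphill moves with
            # probability exp(-df / (c*T))
            if df <= 0 or random.random() < math.exp(-df / (c * T)):
                x = xp
            if f(x) < f(best):                         # step 7: keep best point
                best = list(x)
        T *= alpha                                     # step 8: reduce T
    return best
```

Minimizing the sphere function from a poor starting point shows the loop steadily accepting improving moves while occasionally tolerating uphill ones early on.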

Random Search Explores the parameter space of an objective function sequentially, in a random fashion, to find the optimal point that maximizes or minimizes the objective function. It is simple, but the optimization process can take a long time.

Primitive version (Matyas)

Observations on the primitive version These observations motivate two modifications: adding a reverse step to the original method, and using a bias term as the center for the random vector.

Modified random search

The initial bias is chosen as a zero vector. Each component of dx should be a random variable having zero mean and variance proportional to the range of the corresponding parameter. This method is primarily used for continuous optimization problems.
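A sketch of the modified method with both changes: a bias vector that remembers recent successful directions, and a reverse step tried when the forward step fails (the update constants 0.4, 0.2, 0.5 follow common textbook presentations and should be treated as illustrative):

```python
import random

def random_search(f, x0, n_iter=500, sigma=0.5):
    x = list(x0)
    b = [0.0] * len(x0)            # initial bias: zero vector
    fx = f(x)
    for _ in range(n_iter):
        # each component of dx has zero mean
        dx = [random.gauss(0.0, sigma) for _ in x]
        fwd = [xi + bi + di for xi, bi, di in zip(x, b, dx)]
        if f(fwd) < fx:            # forward step succeeded
            x, fx = fwd, f(fwd)
            b = [0.2 * bi + 0.4 * di for bi, di in zip(b, dx)]
        else:                      # reverse step: try x - (b + dx)
            rev = [xi - bi - di for xi, bi, di in zip(x, b, dx)]
            if f(rev) < fx:
                x, fx = rev, f(rev)
                b = [bi - 0.4 * di for bi, di in zip(b, dx)]
            else:                  # both failed: shrink the bias
                b = [0.5 * bi for bi in b]
    return x
```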

Downhill Simplex Method (Nelder-Mead) Keep track of n+1 points in n dimensions – Vertices of a simplex (a triangle in 2D, a tetrahedron in 3D, etc.) At each iteration: the simplex can move, expand, or contract – Sometimes known as the amoeba method: the simplex "oozes" along the function

Downhill Simplex Method (Nelder-Mead) Basic operation: reflection. The reflection step reflects the worst point (the one with the highest function value) through the centroid of the remaining points to probe a new location.

Downhill Simplex Method (Nelder-Mead) If the reflection resulted in the best (lowest) value so far, try an expansion, which probes a point further along the same direction. Else, if the reflection helped at all, keep it.

Downhill Simplex Method (Nelder-Mead) If the reflection didn't help (the reflected point is still the worst), try a contraction, which probes a point between the centroid and the worst point.

Downhill Simplex Method (Nelder-Mead) If all else fails, shrink the simplex around the best point.

Downhill Simplex Method (Nelder-Mead) The method is fairly efficient at each iteration (typically 1-2 function evaluations). It can take lots of iterations, and it is somewhat flaky – it sometimes needs a restart after the simplex collapses on itself, etc. Benefits: simple to implement; doesn't need derivatives; doesn't care about function smoothness; etc.
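The four operations fit together into a compact sketch (standard coefficients: reflection 1, expansion 2, contraction 0.5, shrink 0.5; a fixed iteration budget stands in for a proper convergence test):

```python
def nelder_mead(f, x0, step=1.0, max_iter=200):
    n = len(x0)
    # Initial simplex: x0 plus one point offset along each axis.
    simplex = [list(x0)]
    for i in range(n):
        p = list(x0)
        p[i] += step
        simplex.append(p)
    for _ in range(max_iter):
        simplex.sort(key=f)
        best, worst = simplex[0], simplex[-1]
        # Centroid of all points except the worst.
        cen = [sum(p[i] for p in simplex[:-1]) / n for i in range(n)]
        # Reflection: mirror the worst point through the centroid.
        refl = [c + (c - w) for c, w in zip(cen, worst)]
        if f(refl) < f(best):
            # Best so far: try expanding further along the same line.
            expd = [c + 2.0 * (c - w) for c, w in zip(cen, worst)]
            simplex[-1] = expd if f(expd) < f(refl) else refl
        elif f(refl) < f(simplex[-2]):
            simplex[-1] = refl     # reflection helped at all: keep it
        else:
            # Contraction: probe between the centroid and the worst point.
            cont = [c + 0.5 * (w - c) for c, w in zip(cen, worst)]
            if f(cont) < f(worst):
                simplex[-1] = cont
            else:
                # All else failed: shrink the simplex around the best point.
                simplex = [best] + [
                    [b + 0.5 * (q - b) for b, q in zip(best, p)]
                    for p in simplex[1:]
                ]
    simplex.sort(key=f)
    return simplex[0]
```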