Heuristic Optimization Methods

Heuristic Optimization Methods Lecture 03 – Simulated Annealing

Simulated Annealing
- A metaheuristic inspired by statistical thermodynamics
- Based on an analogy with the cooling of material in a heat bath
- Used in optimization for over 20 years
- Very simple to implement
- A large body of literature
- Converges to the global optimum under weak assumptions (usually slowly)

Simulated Annealing (SA)
- Metropolis et al. (1953): an algorithm to simulate energy changes in physical systems during cooling
- Kirkpatrick, Gelatt and Vecchi (1983): suggested using the same type of simulation to look for good solutions to a combinatorial optimization problem (COP)

SA – Analogy

Thermodynamics               Discrete optimization
--------------------------   -----------------------------
Configuration of particles   Solution
System state                 Feasible solution
Energy                       Objective function
State change                 Move to neighboring solution
Temperature                  Control parameter
Final state                  Final solution

Simulated Annealing
- Can be interpreted as a modified random descent in the space of solutions
- Choose a random neighbor
- Improving moves are always accepted
- Deteriorating moves are accepted with a probability that depends on the amount of the deterioration and on the temperature (a parameter that decreases with time)
- Can escape local optima

Move Acceptance in SA
- We assume a minimization problem
- Set Δ = Obj(random neighbor) – Obj(current solution)
- If Δ < 0: accept (we have an improving move)
- Else: accept with probability e^(−Δ/t), i.e., accept if e^(−Δ/t) > u for a uniformly drawn random number u in [0, 1)
- If the move is not accepted: try another random neighbor
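
A minimal sketch of this acceptance rule in Python (the function name and signature are our own; the slides do not prescribe an implementation):

```python
import math
import random

def accept_move(delta: float, temperature: float) -> bool:
    """Metropolis acceptance rule for a minimization problem:
    improving moves are always accepted; deteriorating moves are
    accepted with probability exp(-delta / temperature)."""
    if delta < 0:
        return True
    return random.random() < math.exp(-delta / temperature)
```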

SA – Structure
- Initial temperature t0 is high (with t infinitely high, every move is accepted and the search is a pure random walk)
- Reduce t regularly: this requires a cooling schedule
  - if cooled too fast: the search stops in some local optimum too early
  - if cooled too slowly: convergence is too slow
- Might restart
- Choice of neighborhood structure is important

SA
- Statistical guarantee that SA finds the global optimum
- In practice this requires exponential (or infinite) running time
- The cooling schedule is vitally important; much research on this
  - Static schedules: specified in advance
  - Adaptive schedules: react to information from the search

Choice of Move in SA
- Modified "Random Descent"
- Select a random solution in the neighborhood
- Accept this:
  - unconditionally if better than the current solution
  - with a certain, finite probability if worse than the current solution
- The probability is controlled by a parameter called the temperature
- Can escape from local optima

SA – Cooling
[Figure: the effect of the temperature on the search, ranging from Random Walk (high t) to Random Descent (t = 0)]

SA – Overall Structure
- Set the initial value of the control variable t (t0) to a high value
- Do a certain number of iterations with the same temperature
- Then reduce the temperature; this needs a "cooling schedule"
- Stopping criterion: e.g. "minimum temperature"
- Repetition is possible
- Solution quality and speed depend on the choices made
- Choice of neighborhood structure is important (see the sketch below)
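
A minimal Python skeleton of this overall structure; all parameter names and default values below are illustrative assumptions, not prescribed by the slides:

```python
import math
import random

def simulated_annealing(initial, objective, random_neighbor,
                        t0=100.0, alpha=0.95, iters_per_temp=100, t_min=1e-3):
    """Skeleton of the overall SA structure described above (minimization).
    `objective` maps a solution to its cost; `random_neighbor` returns a
    random solution from the neighborhood of its argument."""
    current, current_cost = initial, objective(initial)
    best, best_cost = current, current_cost
    t = t0
    while t > t_min:                        # stopping criterion: minimum temperature
        for _ in range(iters_per_temp):     # iterations at the same temperature
            candidate = random_neighbor(current)
            delta = objective(candidate) - current_cost
            if delta < 0 or random.random() < math.exp(-delta / t):
                current, current_cost = candidate, current_cost + delta
                if current_cost < best_cost:
                    best, best_cost = current, current_cost
        t *= alpha                          # geometric cooling (see the later slides)
    return best, best_cost
```

In practice, `random_neighbor`, the cooling parameters, and the stopping criterion are exactly the problem-specific choices the following slides discuss.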

Statistical Analysis of SA
- Model: state transitions in the search space
- Transition probabilities [pij] (i, j are solutions)
- Only dependent on i and j: a homogeneous Markov chain
- If all transition probabilities are nonzero, the SA search converges towards a stationary distribution that is independent of the starting solution
- When the temperature approaches zero, this distribution approaches a uniform distribution over the global optima
- Statistical guarantee that SA finds a global optimum
- But: exponential (or infinite) search time to guarantee finding the optimum
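
For a fixed temperature t, the stationary distribution in question is the Boltzmann distribution over the solution space S with objective f; this is a standard result from the SA literature, not spelled out on the slide:

```latex
\pi_t(i) = \frac{e^{-f(i)/t}}{\sum_{j \in S} e^{-f(j)/t}},
\qquad
\lim_{t \to 0^+} \pi_t(i) =
\begin{cases}
1/|S^*| & \text{if } i \in S^*,\\
0 & \text{otherwise,}
\end{cases}
```

where S* ⊆ S denotes the set of global optima; the limit is exactly the "uniform distribution over the global optima" mentioned above.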

SA in Practice (1)
- Heuristic algorithm
- Behaviour strongly dependent on the cooling schedule
- Theory: an exponential number of iterations at each temperature
- Practice: either
  - a large number of iterations at each temperature and few temperatures, or
  - a small number of iterations at each temperature and many temperatures

SA in Practice (2)
- Geometric chain: t(i+1) = α·t(i), i = 0, …, K, with α < 1 (typically 0.8–0.99)
- Number of repetitions per temperature can be varied
- Adaptivity: variable number of moves before the temperature reduction
- Necessary to experiment (see the sketch below)
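
A sketch of the geometric chain as a Python generator (parameter names are our own):

```python
def geometric_schedule(t0: float, alpha: float, k: int):
    """Yield the geometric cooling chain t(i+1) = alpha * t(i) for i = 0..K."""
    t = t0
    for _ in range(k + 1):
        yield t
        t *= alpha

# Example: list(geometric_schedule(100.0, 0.9, 3))
#   -> [100.0, 90.0, 81.0, 72.9] (approximately, up to floating point)
```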

SA – General Decisions
- Cooling schedule
  - Initial temperature: can be based on the maximum difference in objective function value between neighboring solutions (see the sketch below)
  - Number of repetitions at each temperature
  - Reduction rate, α
- Adaptive number of repetitions
  - more repetitions at lower temperatures
  - count the number of accepted moves, but with a maximum limit
- Very low temperatures are not necessary
- Cooling rate is most important
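
One common way to act on the maximum-difference guideline is to probe the neighborhood and choose t0 so that even the worst sampled deterioration is accepted with high probability. A hedged sketch; the sample count, target probability, and function names are our assumptions:

```python
import math

def initial_temperature(solution, objective, random_neighbor,
                        samples=50, p_accept=0.8):
    """Choose t0 from exp(-max_delta / t0) = p_accept, where max_delta is
    the largest deterioration seen among `samples` random neighbors."""
    base = objective(solution)
    max_delta = max(
        max(objective(random_neighbor(solution)) - base, 0.0)
        for _ in range(samples)
    )
    # If no deterioration was sampled, any small positive t0 will do.
    return -max_delta / math.log(p_accept) if max_delta > 0 else 1.0
```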

SA – Problem Specific Decisions
- Important goals:
  - Response time
  - Quality of the solution
- Important choices:
  - Search space: infeasible solutions – should they be included?
  - Neighborhood structure
  - Move evaluation function
    - use of a penalty for violated constraints (see the sketch below)
    - approximation – if the function is expensive to evaluate
  - Cooling schedule
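
If infeasible solutions are included in the search space, the move evaluation function typically augments the raw objective with a penalty for violated constraints; a minimal sketch (the weight and the helper names are illustrative assumptions):

```python
def evaluate(solution, cost, violation_measure, penalty_weight=1000.0):
    """Penalized move evaluation: raw objective plus a weighted measure
    of how strongly the solution violates the constraints."""
    return cost(solution) + penalty_weight * violation_measure(solution)
```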

SA – Choice of Neighborhood
- Size
  - Variation in size
- Topology
  - Symmetry
  - Connectivity: every solution can be reached from all the others
- Topography
  - Spikes, plateaus, deep local optima
- Move evaluation function
  - How expensive is it to calculate?

SA – Speed
- Random choice of neighbor
  - Reduction of the neighborhood
  - Does not search through all the neighbors
- Cost of the new candidate solution
  - Compute the difference without a full evaluation
  - Approximation (using surrogate functions)
- Move acceptance criterion
  - Simplify

SA – Example: TSP
- Search space: (n−1)!/2 tours
- Neighborhood: 2-opt, with n(n−1)/2 neighbors
  - Connected: every tour can be reached from every other tour
  - Simple representation of moves
- Natural cost function (tour length)
  - The difference in cost between solutions is easy to calculate (see the sketch below)
- Generalization: k-opt
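
A sketch of the cheap difference calculation for a 2-opt move on a symmetric TSP, assuming a tour given as a list of city indices and a distance matrix `dist` (the representation and names are our choices):

```python
def two_opt_delta(tour, dist, i, j):
    """Cost change of the 2-opt move that reverses tour[i+1 .. j]:
    edges (a, b) and (c, d) are replaced by (a, c) and (b, d).
    Only four distances are involved, so no full tour evaluation is needed."""
    n = len(tour)
    a, b = tour[i], tour[(i + 1) % n]
    c, d = tour[j], tour[(j + 1) % n]
    return (dist[a][c] + dist[b][d]) - (dist[a][b] + dist[c][d])
```

A negative delta means the move shortens the tour; feeding this delta to the acceptance rule above completes an SA iteration for the TSP.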

SA – Fine Tuning
- Test problems
- Test bench
- Visualization of solutions
- Values to monitor:
  - cost / penalties
  - temperature
  - number / proportion of accepted moves
  - iterations / CPU time
- Dependencies between the SA parameters
- The danger of overfitting

SA – Summary
- Inspired by statistical mechanics (cooling)
- A metaheuristic based on local search / random descent
- Uses randomness to escape local optima
- Simple and robust method; easy to get started
- Proof of convergence to the global optimum (but the guarantee requires more time than complete search)
- In practice:
  - computationally expensive
  - fine tuning can give good results
- SA can be good where robust heuristics based on problem structure are difficult to make

Topics for the Next Lecture
- More local search based metaheuristics:
  - Simulated Annealing (the rest)
  - Threshold Accepting
  - Variable Neighborhood Search
  - Iterated Local Search
  - Guided Local Search
- This will prepare the ground for more advanced mechanisms, such as those found in:
  - Tabu Search
  - Adaptive Memory Procedures