Monte Carlo Optimization (Simulated Annealing). Mathematical Biology Lecture 6. James A. Glazier.


Optimization
The other major application of Monte Carlo methods is to find the optimal (or a nearly optimal) solution of an algorithmically hard problem. Given f defined on a state space Ω, we want to find the x* in Ω that minimizes f.
Definition: If f(x*) ≤ f(x) for all x in some neighborhood of x*, then x* is a local minimum of f.
Definition: If f(x*) ≤ f(x) for all x in Ω, then x* is the global minimum of f.
Definition: If several distinct points x1*, x2*, … all attain the minimum value of f, then f has multiple degenerate global minima.

Energy Surfaces
The number and shape of the local minima of f determine the texture of the 'energy surface,' also called the energy landscape, penalty landscape, or optimization landscape.
Definition: The basin of attraction of a local minimum x* is the set of points from which descent on f leads to x*.
Definition: The depth of the basin of attraction is the energy difference between x* and the lowest point on the basin's boundary (the barrier that must be crossed to escape).
Definition: The radius or size of the basin of attraction is the distance from x* to the nearest point of the basin's boundary.
If there are no local minima except the global minimum, then optimization is easy and the energy surface is smooth.

Energy Surfaces
If there are multiple local minima with large basins of attraction, we need to pick a starting point in each basin, find the corresponding local minimum, and pick the best. This corresponds to enumerating all states if the state space is a finite set, e.g. in the Traveling Salesman Problem (TSP). If there are many local minima, or the minima have small basins of attraction, then the energy surface is rough and optimization is difficult. In these cases we cannot guarantee finding the global minimum. However, we often only need a 'pretty good' solution.
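When the state space is finite and small, the enumeration just described can be done literally. A minimal sketch for a tiny TSP instance (the five-city coordinates and function names are my own illustrations, not from the lecture):

```python
import itertools
import math

# Hypothetical 5-city instance: with a finite state space we can simply
# enumerate every state (here, every itinerary) and keep the best one.
cities = [(0.0, 0.0), (1.0, 0.0), (1.0, 1.0), (0.0, 1.0), (0.5, 2.0)]

def tour_length(order):
    """Total closed-path length for an itinerary (a permutation of city indices)."""
    return sum(math.dist(cities[order[i]], cities[order[(i + 1) % len(order)]])
               for i in range(len(order)))

# Fix city 0 as the start so rotations of the same tour are not counted twice.
best_rest = min(itertools.permutations(range(1, len(cities))),
                key=lambda rest: tour_length((0,) + rest))
best_tour = (0,) + best_rest
print(best_tour, round(tour_length(best_tour), 3))
```

For N cities this costs (N-1)! evaluations, which is exactly why the stochastic methods below are needed once N grows.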

Monte Carlo Optimization
Deterministic methods, e.g. Newton-Raphson, only move towards better solutions and become trapped in basins of attraction. We need to move the wrong way sometimes to escape basins of attraction (also called traps).
Algorithm:
–Choose a step size λ and an acceptance function g.
–Start at x0. Propose a move from the current x to a nearby x'.
–If f(x') < f(x), accept the move; otherwise accept it with probability g(Δf), where Δf = f(x') - f(x) > 0.
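One accept/reject step of this scheme can be sketched as follows, using the Boltzmann acceptance function g(x) = exp(-x/T) discussed on the next slide. The two-minimum test function and all names here are illustrative assumptions, not from the lecture:

```python
import math
import random

def metropolis_step(x, f, propose, T, rng=random):
    """One Monte Carlo step: always accept downhill moves; accept an uphill
    move of size df with probability g(df) = exp(-df / T)."""
    x_new = propose(x)
    df = f(x_new) - f(x)
    if df < 0 or rng.random() < math.exp(-df / T):
        return x_new  # move accepted
    return x          # move rejected: stay put

# Toy energy with a shallow minimum near x = +1 and a deeper one near x = -1.
f = lambda x: (x * x - 1) ** 2 + 0.3 * x
propose = lambda x: x + random.uniform(-0.5, 0.5)

random.seed(0)
x = best_x = 1.0  # start in the shallower basin
for _ in range(20000):
    x = metropolis_step(x, f, propose, T=0.6)
    if f(x) < f(best_x):
        best_x = x
# The occasional accepted uphill moves let the walk cross the barrier near
# x = 0, so the best point found ends up near the deeper minimum at x ~ -1.
```

A purely downhill version of the same loop (reject whenever df > 0) stays trapped near x = +1, which is the failure mode the slide describes.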

Monte Carlo Optimization—Issues
Given infinite time, the pseudo-random walk will explore all of phase space. However, you never know when you have reached the global minimum, so you don't know when to stop. The walk can also take a very long time to escape from deep local basins of attraction. The optimal choice of g(x) and λ depends on the particular f. If g(x) ≈ 1 for x < x0, then the algorithm will not see minima with depths less than x0. A standard choice is the Boltzmann distribution, g(x) = e^(-x/T), where T is the fluctuation temperature. (The Boltzmann distribution has the right equilibrium thermodynamics, but it is not an essential choice in this application.)

Temperature and λ
Bigger T results in more frequent unfavorable moves. In general, the time spent in a basin of attraction is ~ exp(depth of basin / T). An algorithm with these kinetics is called an activated process. Bigger T is good for moving rapidly between large, deep basins of attraction, but ignores subtle (less than T) features of f. Similarly, a large λ moves faster, but can miss deep minima with small-diameter basins of attraction. A strategy for picking T is called an "annealing schedule."

Annealing Schedules
Ideally, we want the time spent in every local-minimum basin to be small, and the time spent in the global-minimum basin to be nearly infinite. A fixed value of T works if the depth of the global minimum's basin of attraction is much greater than the depth of every local minimum's basin, and the radius of the global minimum's basin is comparable to the radius of the largest local-minimum basin. If so, pick T between these two depths. If multiple local minima are almost degenerate with the global minimum, we cannot distinguish them, but the answer is almost optimal. If the global minimum is deep but has a very small basin of attraction (a "golf-course" energy landscape), then no method helps!

Annealing Schedules
If the energy landscape is hierarchical or fractal, start with a large T and gradually reduce it. This selects first among the large, deep basins, then among successively smaller and shallower ones, until the walk freezes into one. This procedure is called "simulated annealing." There is no optimal choice of the sequence of T values. A generally good strategy: start with T ~ Δf/2 if you know the typical values of Δf for a fixed step size λ, or T ~ a typical value of f if you do not. Run until the typical Δf << T. Then set T = T/2 and repeat. Repeat the whole procedure for many initial conditions and take the best solution.
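The halving schedule described above can be sketched end to end. The two-minimum test function, the starting temperature, and the restart points are illustrative assumptions:

```python
import math
import random

def anneal(f, x0, T0, n_levels=12, steps_per_level=2000, step=0.5, rng=None):
    """Simulated annealing with the halving schedule: run at temperature T,
    then set T = T/2 and repeat, tracking the best point seen."""
    rng = rng or random.Random(1)
    x, T, best_x = x0, T0, x0
    for _ in range(n_levels):
        for _ in range(steps_per_level):
            x_new = x + rng.uniform(-step, step)
            df = f(x_new) - f(x)
            if df < 0 or rng.random() < math.exp(-df / T):
                x = x_new
            if f(x) < f(best_x):
                best_x = x
        T /= 2.0  # the annealing schedule: halve T after each stage
    return best_x

# Two-minimum toy energy: deeper minimum near x = -1, shallower near x = +1.
f = lambda x: (x * x - 1) ** 2 + 0.3 * x
# Repeat for several initial conditions and take the best solution.
best = min((anneal(f, x0, T0=1.0, rng=random.Random(s))
            for s, x0 in enumerate([-2.0, 0.0, 2.0])), key=f)
```

By the final stages T is so small that the walk is effectively greedy, which is what lets it settle precisely into the basin it froze in.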

Example—The Traveling Salesman Problem
The simulated annealing method works for algorithmically hard (NP-complete) problems like the Traveling Salesman Problem. Put down N points in some space: x_1, …, x_N. Define an itinerary: a permutation (i_1, i_2, …, i_N) of (1, …, N) giving the order in which the points are visited. The penalty function, energy, or Hamiltonian is the total path length for a given itinerary:
H = Σ_{k=1..N} |x_{i_{k+1}} - x_{i_k}|, with i_{N+1} ≡ i_1 for a closed tour.

Example—The TSP (Contd.)
Pick any initial itinerary. At each Monte Carlo step, pick two positions k < l in the itinerary at random. If the initial itinerary is (i_1, …, i_k, …, i_l, …, i_N), then the trial itinerary is a permutation of it, e.g. with the entries i_k and i_l swapped, or with the whole segment between them reversed. Then apply the Metropolis algorithm. A good initial choice of T is of order the typical |ΔH| produced by such moves. This algorithm works well, giving a permutation with H within a percent or better of the global optimum in a reasonable amount of time.
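A runnable sketch of this scheme, using the segment-reversal (2-opt) trial move and the halving schedule from the earlier slide. The circle-of-cities instance, the starting temperature, and the stage lengths are illustrative assumptions:

```python
import math
import random

def anneal_tsp(cities, T0=1.0, n_levels=20, steps_per_level=400, seed=0):
    """Simulated annealing for the TSP with a segment-reversal trial move;
    the Metropolis rule accepts any shortening, and a lengthening of dH
    with probability exp(-dH / T)."""
    rng = random.Random(seed)
    n = len(cities)
    d = lambda a, b: math.dist(cities[a], cities[b])
    H = lambda tour: sum(d(tour[i], tour[(i + 1) % n]) for i in range(n))
    tour = list(range(n))
    T = T0
    for _ in range(n_levels):
        for _ in range(steps_per_level):
            i, j = sorted(rng.sample(range(n), 2))
            trial = tour[:i] + tour[i:j + 1][::-1] + tour[j + 1:]
            dH = H(trial) - H(tour)
            if dH < 0 or rng.random() < math.exp(-dH / T):
                tour = trial
        T /= 2.0  # halving annealing schedule
    return tour, H(tour)

# Eight cities on a unit circle, presented in shuffled order; the shortest
# closed tour is simply the circle, with length 16*sin(pi/8).
pts = [(math.cos(2 * math.pi * k / 8), math.sin(2 * math.pi * k / 8))
       for k in range(8)]
random.seed(3)
cities = pts[:]
random.shuffle(cities)
tour, length = anneal_tsp(cities)
```

Points in convex position make a convenient test: a tour through them is free of crossings only when it follows the hull, so the anneal can be checked against the known optimum 16·sin(π/8) ≈ 6.12.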