
Global Optimization
The Problem: minimize f(x) subject to g_i(x) ≥ b_i, i = 1,…,m, and h_j(x) = c_j, j = 1,…,n. When x is discrete we call this combinatorial optimization, i.e. an optimization problem with a finite number of feasible solutions. Note that when the objective function and/or constraints cannot be expressed analytically, solution techniques used in combinatorial problems can be used to solve this problem.

P vs. NP
P: optimization problems for which there exists an algorithm that solves them with polynomial time complexity. This means that the time it takes to solve the problem can be expressed as a polynomial in the dimension of the problem.
NP: stands for 'nondeterministic polynomial time' (note: this does not mean "not P"; every problem in P is also in NP).
NP-hard: if a problem Q is such that every problem in NP is polynomially transformable to Q, we say that Q is NP-hard.
The existence of NP-hard problems gives justification for heuristic methods for solving optimization problems.

Assignment Problem
A set of n people is available to carry out n tasks. If person i does task j, it costs c_ij units. Find an assignment {x_1,…,x_n} that minimizes Σ_{i=1}^n c_{i,x_i}. The solution is represented by the permutation {x_1,…,x_n} of the numbers {1,…,n}.
Example: with n = 3, the permutation {2, 3, 1} means person 1 does task 2, person 2 does task 3, and person 3 does task 1; its cost is c_{1,2} + c_{2,3} + c_{3,1}.
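Since the solution space is just the n! permutations, small instances can be solved by exhaustive enumeration. A minimal sketch (the 3x3 cost matrix in the usage note is made up for illustration):

```python
from itertools import permutations

def assignment_brute_force(cost):
    """cost[i][j] = cost of person i doing task j.
    Tries all n! assignments and returns (best permutation, its cost)."""
    n = len(cost)
    best = min(permutations(range(n)),
               key=lambda perm: sum(cost[i][perm[i]] for i in range(n)))
    return best, sum(cost[i][best[i]] for i in range(n))
```

For example, `assignment_brute_force([[4, 1, 3], [2, 0, 5], [3, 2, 2]])` assigns person 0 to task 1, person 1 to task 0, and person 2 to task 2, for a total cost of 5. For large n this factorial search is exactly what motivates the heuristics in later slides.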

Knapsack Problem
A set of n items is available to be packed into a knapsack with capacity C units. Item i has value v_i and uses up c_i units of capacity. Determine the subset I of items which should be packed in order to maximize Σ_{i∈I} v_i subject to Σ_{i∈I} c_i ≤ C. Here the solution is represented by the subset I of the set {1,…,n}.
Example:
item  value  capacity
1     2.7    C/2
2     3.2    C/4
3     1.1    C/2
Solution: I = {1, 2}, with value 5.9 and capacity 3C/4 ≤ C.
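Again the solution space is finite (the 2^n subsets), so a small instance can be enumerated directly; this sketch just mirrors the definition above, with items indexed from 0:

```python
from itertools import combinations

def knapsack_brute_force(values, weights, capacity):
    """Try every subset of items; keep the feasible one with largest value."""
    n = len(values)
    best_subset, best_value = (), 0.0
    for r in range(1, n + 1):
        for subset in combinations(range(n), r):
            if sum(weights[i] for i in subset) <= capacity:
                v = sum(values[i] for i in subset)
                if v > best_value:
                    best_subset, best_value = subset, v
    return set(best_subset), best_value
```

On the slide's example with C = 1 (so capacities 0.5, 0.25, 0.5), `knapsack_brute_force([2.7, 3.2, 1.1], [0.5, 0.25, 0.5], 1.0)` returns the subset {0, 1} (items 1 and 2 in the slide's 1-based numbering) with value 5.9.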

Traveling Salesman Problem (TSP)
A salesperson must visit n cities once before returning home. The distance from city i to city j is d_ij. What ordering of the cities minimizes the distance the salesperson must travel before returning home?
SETUP: minimize Σ_{i,j=1}^n d_ij x_ij subject to Σ_{i=1}^n x_ij = 1 for each j and Σ_{j=1}^n x_ij = 1 for each i, where x_ij = 1 if the tour goes from city i to city j and x_ij = 0 otherwise. (A complete formulation also needs subtour-elimination constraints; the assignment constraints alone permit disconnected subtours.) Note that this is an integer programming problem and there are (n-1)! possible solutions.
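For small n the (n-1)! orderings can simply be enumerated by fixing the starting city; a brute-force sketch (the 4-city distance matrix in the test is an illustrative assumption):

```python
from itertools import permutations

def tsp_brute_force(dist):
    """Fix city 0 as the start and try all (n-1)! orderings of the rest.
    dist[i][j] = distance from city i to city j."""
    n = len(dist)

    def tour_length(order):
        tour = (0,) + order + (0,)           # close the loop back home
        return sum(dist[tour[k]][tour[k + 1]] for k in range(n))

    best = min(permutations(range(1, n)), key=tour_length)
    return (0,) + best, tour_length(best)
```

Even at n = 15 this is already ~87 billion tours, which is why the heuristic methods on the following slides exist.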

Integer Programming
Integer problems involve large numbers of variables and constraints and quickly become very large problems.
Finding a Solution: If the function is piecewise linear, the problem can be solved exactly with a mixed integer programming method that uses branch and bound (described later). Otherwise, heuristic methods ('finding' methods) can be used to find approximate solutions.
What are heuristic methods? Definition: A heuristic is a technique which seeks good (i.e. near-optimal) solutions at a reasonable computational cost without being able to guarantee either feasibility or optimality, or even, in many cases, to state how close to optimality a particular feasible solution is.

Branch and Bound (In general)
Branch and bound is a general search method used to find the minimum of a function, f(x), where x is restricted to some feasible region. (L = lower bound, U = upper bound.)
Step 0: the whole region, region 0, has bounds L = 2, U = 9.
Step 1: split it into region 1 (L = 2, U = 7) and region 2 (L = 4, U = 9).
Step 2: keep region 1 (L = 2, U = 7); split region 2 into region 3 (L = 4, U = 4) and region 4 (L = 5, U = 9).
In step 2 the lower bound equals the upper bound in region 3, so 4 is an optimal value for region 3. Region 4 can be removed from consideration since it has a lower bound of 5, which is greater than 4. Continue branching until the location of the global optimum is found. The difficulty with this method is in determining the lower and upper bounds on each of the regions.
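The scheme above can be sketched for a one-dimensional function when a Lipschitz constant is known, which supplies the lower bound on each region (the test function, interval, and constant below are illustrative assumptions, not from the slides):

```python
import heapq

def branch_and_bound(f, lo, hi, lipschitz, tol=1e-3):
    """Minimize f on [lo, hi], assuming |f(x) - f(y)| <= lipschitz * |x - y|.
    Upper bound: best value seen so far.
    Lower bound on [a, b]: f(midpoint) - lipschitz * (b - a) / 2."""
    best_x, best_f = lo, f(lo)
    heap = [(f((lo + hi) / 2) - lipschitz * (hi - lo) / 2, lo, hi)]
    while heap:
        lb, a, b = heapq.heappop(heap)
        if lb >= best_f - tol:                 # region cannot improve: prune
            continue
        m = (a + b) / 2
        if f(m) < best_f:                      # tighten the upper bound
            best_x, best_f = m, f(m)
        for s, t in ((a, m), (m, b)):          # branch into two halves
            c = (s + t) / 2
            child_lb = f(c) - lipschitz * (t - s) / 2
            if child_lb < best_f - tol:
                heapq.heappush(heap, (child_lb, s, t))
    return best_x, best_f
```

Best-first order (the heap) always expands the region with the smallest lower bound, exactly the rule illustrated on the slide.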

Branch and Bound applied to Mixed Integer Programming
A Mixed Integer Program: the solution to the optimization problem includes elements that are integers. So minimize f(x), where x = (x_1, x_2, …, x_n) and x_i is an integer for at least one i.
Branch and Bound (suppose x_1 and x_2 must be integers):
1) Find a global minimum for the relaxed problem, with the integrality constraints dropped.
2) Branch on an integer variable and find minima for the subproblems I and II.
3) Branch again and find minima for the subproblems III, IV and V.

Clustering Methods
Clustering methods are an improvement on multistart methods.
Multistart methods: methods of optimization that determine the global optimum by comparing the local optima attained from a large number of different starting points. These methods are inefficient because many starting points may lead to the same local minimum.
Clustering methods: a form of multistart method, with one major difference: neighborhoods of starting points that lead to the same local minimum are estimated. This decreases redundancy in the local minima that are found.
Step 0: sample points. Step 1: create groups. Step 2: continue sampling.
The challenge is how to identify the groups.
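The multistart baseline that clustering improves on can be sketched in a few lines; the hill-climbing step size, iteration counts, and double-well test function are illustrative assumptions:

```python
import random

def local_search(f, x, step=0.1, iters=200):
    """Hill climbing: accept a random perturbation only if it lowers f."""
    for _ in range(iters):
        y = x + random.uniform(-step, step)
        if f(y) < f(x):
            x = y
    return x

def multistart(f, lo, hi, starts=30):
    """Run local search from many random starting points; keep the best.
    Note the redundancy: many starts fall into the same basin, which is
    exactly what clustering methods try to detect and avoid."""
    finishers = [local_search(f, random.uniform(lo, hi)) for _ in range(starts)]
    return min(finishers, key=f)
```

On f(x) = (x^2 - 1)^2, which has two global minima at x = ±1, most of the 30 runs land on one of the two minima, and about half of them duplicate work in the same basin.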

Simulated Annealing
A method that is useful in solving combinatorial optimization problems. At the heart of this method is the annealing process studied in thermodynamics: thermal mobility is lost as the temperature decreases from high to low.
Thermodynamic Structure → Combinatorial Optimization
System states → Feasible solutions
Energy → Cost
Change of state → Neighboring solution
Temperature → Control parameter
Frozen state → Heuristic solution

Simulated Annealing
The Boltzmann probability distribution: Prob(E) ~ exp(-E/kT), where k is Boltzmann's constant, which relates temperature to energy. A system in thermal equilibrium at temperature T has its energy probabilistically distributed among all different energy states E. Even if the temperature is low, there is a chance that the energy state will be high. (This is a small chance, but it is a chance nonetheless.) So there is a chance for the system to get out of local energy minima in search of the global minimum energy state. The general scheme of always taking downhill steps while sometimes taking an uphill step has come to be known as the Metropolis algorithm, after the 1953 paper of Metropolis et al. that introduced this acceptance rule for sampling physical systems.

Simulated Annealing
Algorithm to minimize the cost function f(s):
1) Select an initial solution s0, an initial temperature t0, and set iter = 0.
2) Select a temperature reduction function, α, and a maximum number of iterations per temperature, nrep.
3) Randomly select a new point s in the neighborhood of s0; set iter = iter + 1.
4) If f(s) < f(s0), set s0 = s and go to step 3 until iter > nrep.
5) Otherwise, generate a random x between 0 and 1.
6) If x < exp(-(f(s) - f(s0))/t), set s0 = s and go to step 3 until iter > nrep.
7) Otherwise, let s0 remain the same and go to step 3 until iter > nrep.
8) Set t = α(t), reset iter = 0, and repeat from step 3 until the stopping criterion is met.
9) The approximation to the optimal solution is s0.
All of the following parameters affect the speed and quality of the solution:
t0: a high value allows free exchange early on.
N(s0): the neighborhood, generated by swap, mutation, random perturbation, etc.
α: cooling should be gradual, i.e. a slow decay of t.
nrep: related to the dimension of the problem; can vary with t (higher at low t).
Stopping criteria: minimum temperature reached; total number of iterations exceeded; proportion of accepted moves too low.
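As a sketch, the nine steps above collapse into a short routine; the geometric cooling rule, the default values of t0, alpha, nrep, t_min, and the quadratic test function are illustrative assumptions, not values from the slides:

```python
import math
import random

def simulated_annealing(f, neighbor, s0, t0=10.0, alpha=0.95,
                        nrep=100, t_min=1e-3, seed=0):
    """Steps 3-8 of the slide's scheme: at each temperature, propose nrep
    neighbors; always accept improvements, accept uphill moves with
    probability exp(-delta/t); then cool with t = alpha * t."""
    random.seed(seed)
    s, t = s0, t0
    best = s
    while t > t_min:                     # stopping criterion: minimum temp
        for _ in range(nrep):
            cand = neighbor(s)
            delta = f(cand) - f(s)
            if delta < 0 or random.random() < math.exp(-delta / t):
                s = cand                 # downhill, or an accepted uphill move
                if f(s) < f(best):
                    best = s
        t *= alpha                       # gradual geometric cooling
    return best
```

For example, minimizing f(s) = s^2 from s0 = 10 with a uniform ±1 neighborhood drives the solution close to 0 despite the occasional uphill steps.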

Hybrid Methods
MINLP: Mixed Integer Nonlinear Programming.
Branch and Bound (discussed earlier):
1) Relax the integer constraints, forming a nonlinear problem.
2) Fix the integer variables at the integer values closest to the solution found in step 1.
3) Solve the new nonlinear programming problems for the fixed integer values until all of the integer parameters are determined.
Note: this requires a large number of NLP problems to be solved.

Hybrid Methods
Tree Annealing: simulated annealing applied to continuous functions.
Algorithm:
1) Randomly choose an initial point, x, over a search interval, S_0.
2) Randomly travel down the tree to an arbitrary terminal node i, and generate a candidate point, y, over the subspace defined by S_i.
3) If f(y) < f(x), then replace x with y and go to step 5.
4) Compute P = exp(-(f(y) - f(x))/T). If P > R, where R is a random number uniformly distributed between 0 and 1, then replace x with y.
5) If y replaced x, then decrease T slightly and update the tree.
6) Set i = i + 1, and go to step 2 until T < Tmin.

Hybrid Methods
Differences between tree annealing and standard simulated annealing:
1) The points x and y are sampled from a continuous space.
2) A minimum is indicated by an increasing density of nodes in a given region: the subspace over which candidate points are chosen decreases in size as a minimum is approached.
3) The probability of accepting y is governed by a modified criterion: P = g(x)p(y) / (g(y)p(x)), where p(y) = exp(-f(y)/T) and g(x) = (1/V_x)q_i. Here g depends on the volume V_x of the node associated with the subspace defined by x, as well as on the path from the root node to the current node, q_i.
Tree annealing is not guaranteed to converge, and convergence is often very slow. Use tree annealing as a first step, then use a gradient method to attain the minimum.

Statistical Global Optimization
A statistical model of the objective function is used to bias the selection of new sample points. This is a Bayesian approach: information gathered about the objective function so far is used to decide where to sample next.
Problems:
1) The statistical model used may not truly represent the objective function. If the statistical model is determined prematurely, the optimization algorithm may be biased and lead to unreliable solutions.
2) Determining a statistical model can be mathematically intensive.

Tabu Search
Tabu search is designed to cross boundaries of feasibility or local optimality normally treated as barriers, and to systematically impose and release constraints to permit exploration of otherwise forbidden regions.
Example:
1) Assume an initial solution (say, a permutation of 5 elements).
2) Define a neighborhood by some type of operation applied to this solution, such as a pairwise swap; the 10 adjacent solutions attained by such swaps form the neighborhood.
3) For each swap define a move value, i.e. the change in fitness value.
4) Classify a subset of the moves in a neighborhood as forbidden, or tabu; for example, pairs that have been swapped cannot be swapped again for 3 iterations. Call this the tabu classification.
5) Define an aspiration criterion that allows us to override the tabu classification, such as when the move results in a new global minimum.

Tabu Search Example
Tabu structure: an entry of 0 means the pair is free to swap; an entry > 0 means the swap is tabu for that many more iterations.
Iteration 0 (value = 10): swap values: (5,4): -6; (3,4): -2; (1,2): 0; (2,3): +2. Take the best move, swap (5,4); new value = 10 - 6 = 4, and the swapped pair becomes tabu.
Iteration 1 (value = 4): swap values: (3,1): -2; (1,5): +1; (3,4): +2; (2,3): tabu. Take swap (3,1); new value = 2.

Tabu Search Example (cont.)
Iteration 2 (value = 2): swap values: (1,3): +2; (3,4): +2; (4,5): +3; (2,3): +4 (tabu). No move improves the solution, so choose the least-worsening non-tabu move, swap (1,3); new value = 4.
Iteration 3 (value = 4): swap values: (4,5): -3 (tabu); (2,4): tabu; (1,3): +1; (3,5): +2. Although (4,5) is tabu, the resulting value obtained from this move is less than the lowest value attained thus far (4 - 3 = 1 < 2), so the aspiration criterion overrides the tabu classification and the swap is made; new value = 1.

Tabu Search Some of the factors that affect the efficiency and the quality of the tabu search include: The number of swaps per iteration. The number of moves that become tabu. The tenure of the tabu. Tabu restrictions (can restrict any swap that includes one member of a tabu pair). Can take into account the frequency of swaps and penalize move values between those pairs that have high swap frequencies.
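The moving parts listed above (swap neighborhood, tabu tenure, aspiration) can be combined into a minimal sketch; the cost function used in the test and the default tenure and iteration counts are illustrative assumptions:

```python
import itertools

def tabu_search(cost, perm, tenure=3, iters=50):
    """Swap-neighborhood tabu search over a permutation.
    cost(perm) -> value to minimize. A swapped index pair stays tabu
    for `tenure` iterations unless the move beats the best value so far
    (the aspiration criterion)."""
    perm = list(perm)
    best, best_val = perm[:], cost(perm)
    tabu = {}                                  # (i, j) -> iteration it frees up
    for it in range(iters):
        move, move_val = None, None
        for i, j in itertools.combinations(range(len(perm)), 2):
            perm[i], perm[j] = perm[j], perm[i]     # evaluate the swap...
            v = cost(perm)
            perm[i], perm[j] = perm[j], perm[i]     # ...then undo it
            allowed = tabu.get((i, j), 0) <= it or v < best_val  # aspiration
            if allowed and (move_val is None or v < move_val):
                move, move_val = (i, j), v
        if move is None:
            break
        i, j = move
        perm[i], perm[j] = perm[j], perm[i]    # take best admissible move,
        tabu[(i, j)] = it + tenure             # even if it is uphill
        if move_val < best_val:
            best, best_val = perm[:], move_val
    return best, best_val
```

Note that the chosen move need not improve the current solution, which is how the search climbs out of local minima, exactly as in the worked example above.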

Nested Partitions Method (Leyuan Shi, Operations Research, May 2000)
This method systematically partitions the feasible region into subregions, and then concentrates the computational effort in the most promising region.
1) Assume that we have a most promising region, σ.
2) Partition this region into M subregions and aggregate the entire surrounding region into one region.
3) Sample each of these M + 1 regions and determine the promising index for each.
4) Set σ equal to the most promising region and go to step 1. If the surrounding region is found to be the best, the algorithm backtracks to a larger region that contains the old most promising region.

Nested Partitions Example
Set M = 2 and let the whole region be Σ = {1,2,3,4,5,6,7,8}.
Partition Σ: σ1 = {1,2,3,4}, promising index 5; σ2 = {5,6,7,8}, promising index 4. Set the most promising region to σ1.
Partition σ1: σ3 = {1,2}, index 5; σ4 = {3,4}, index 3; surrounding region σ2 = {5,6,7,8}, index 4. Set the most promising region to σ3.
Partition σ3: σ5 = {1}, index 2; σ6 = {2}, index 3; surrounding region Σ\(σ5 ∪ σ6) = {3,4,5,6,7,8}, index 4. The surrounding region is best, so backtrack to σ1.
Re-partition σ1: σ7 = {1,2,3}, index 5; σ8 = {4}, index 2; surrounding region σ2 = {5,6,7,8}, index 4.
Continue until a minimum is found.

The Tunneling Method
Definitions:
f(q): the original objective function
n: the pole strength
f*: the local minimum determined in the minimization phase
q*: the pumping rate for the local minimum determined in the minimization phase
Cycle through the minimization and tunneling phases until no roots can be determined for the tunneling function.
Minimization phase: a local minimum value is determined.
Tunneling phase: the objective function is transformed into a tunneling function, whose roots become the starting points for the next minimization phase.
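The tunneling function itself did not survive extraction; a commonly used pole-based form, sketched here under the assumption that it matches the slide's f*, q*, and pole strength n, is:

```latex
T(q) = \frac{f(q) - f^{*}}{\left[\, (q - q^{*})^{\top} (q - q^{*}) \,\right]^{\,n}}
```

Roots of T(q) are points q ≠ q* with f(q) = f*, so each root is a fresh starting point at the same level as the last minimum; increasing the pole strength n pushes the search farther away from the known minimizer q*.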

Tunneling Method Illustrated
Starting point: x = 0; the first root of the tunneling function is found (at x ≈ 0.67 in the original figure), seeding the next minimization phase.