Local optimization technique G. Anuradha

Introduction The evaluation function assigns each candidate solution a quality measure (score); the scores over the whole search space define a landscape, also called a response surface or fitness landscape.

Hill Climbing Start at a randomly generated state; move to the neighbour with the best evaluation value; if a strict local minimum is reached, restart at another randomly generated state.
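As a minimal sketch (instantiated for a hypothetical toy problem of minimising f(x) = x^2 over the integers, with neighbours x - 1 and x + 1; the helper names are illustrative, not from the slides), the procedure looks like this in Python:

```python
import random

def evaluate(x):
    # Toy objective to minimise: f(x) = x^2, global minimum at x = 0.
    return x * x

def neighbours(x):
    # Integer neighbourhood: one step to the left or right.
    return [x - 1, x + 1]

def hill_climb(restarts=10):
    best = None
    for _ in range(restarts):
        s = random.randint(-100, 100)             # randomly generated state
        while True:
            n = min(neighbours(s), key=evaluate)  # neighbour with best value
            if evaluate(n) >= evaluate(s):        # strict local minimum reached
                break
            s = n                                 # move to the better neighbour
        if best is None or evaluate(s) < evaluate(best):
            best = s                              # remember best over restarts
    return best

print(hill_climb())  # 0 for this single-optimum objective
```

Because this toy objective has a single optimum, every restart reaches 0; on a multimodal landscape different restarts can end in different local optima, which is exactly the weakness discussed below.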

Flowchart of hill climbing: 1. Select a current solution s and evaluate it. 2. Select a new solution x from the neighbourhood of s and evaluate it. 3. Is x better than s? If yes, select x as the new current solution s. 4. Return to step 2.

Stopping condition Either the whole neighbourhood has been searched, or we have exceeded the threshold of allowed attempts. The last solution is taken as the best solution, or the current solution is stored and the same procedure is repeated again (iterated hill climbing).

Features of hill climbing techniques Provides local optimum values that depend on the starting solution. Cannot be used for finding the global optimum, because there is no general procedure for measuring the relative error with respect to the global optimum. The success of the algorithm depends on the initial value chosen.

Weaknesses of the hill climbing algorithm Terminates at local optimum values. There is no indication of how much the local optimum deviates from the global optimum. The optimum value obtained depends on the initial configuration. An upper bound on computation time cannot be provided. Hill climbers exploit the best available solution but neglect exploring a large portion of the search space.

Hill climbing and the car example A vector of 3000 values provides indices of auction sites from 1 to 50. Evaluate the solution and assign a quality measure score. Find a neighbour and evaluate it; if its evaluation score is better, move in that direction; otherwise select a new solution.

Stochastic Hill Climber The problem of getting stuck in local optima is eliminated to a certain extent in this approach. New solutions with a negative change in the quality measure score are also accepted. Some basic changes to ordinary hill climbing are made in the stochastic hill climbing approach.

Stochastic hill climbing approach: 1. Select a current solution s and evaluate it. 2. Select a new solution x from the neighbourhood of s and evaluate it. 3. Select x as the new current solution s with probability P. The probability of acceptance depends on the difference in quality measure score between the two solutions and on a parameter T.

How does this probabilistic function work? There are 3 cases: – 50% probable: the new solution x has the same quality measure score as the current solution s. – >50% probable: the new solution x is superior, so the probability of acceptance is greater than 50%. – <50% probable: the new solution x is inferior, so the probability of acceptance is smaller than 50%.
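The slides do not state the acceptance formula itself; a standard choice for the stochastic hill climber (assuming a maximisation problem) that reproduces all three cases is P = 1 / (1 + e^((eval(s) - eval(x)) / T)):

```python
import math

def accept_probability(f_s, f_x, T):
    # P = 1 / (1 + exp((f(s) - f(x)) / T)) for a maximisation problem:
    # equal scores give exactly 0.5, a superior x gives P > 0.5,
    # an inferior x gives P < 0.5.
    return 1.0 / (1.0 + math.exp((f_s - f_x) / T))

# The three cases, with T = 10:
print(accept_probability(50, 50, 10))  # 0.5 (same score)
print(accept_probability(50, 60, 10))  # > 0.5 (x superior)
print(accept_probability(50, 40, 10))  # < 0.5 (x inferior)
```

With equal scores the exponent is 0 and P is exactly 0.5; the sign of eval(s) - eval(x) pushes P above or below 0.5, and T controls how far, as the next slide describes.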

Effect of parameter T If the new solution x is superior, then the probability of acceptance is closer to 50% for high values of T, or closer to 100% for low values of T If the new solution x is inferior, then the probability of acceptance is closer to 50% for high values of T or closer to 0% for low values of T

How does this probabilistic function work? (contd.) The probability of accepting a new solution x also depends on the value of the parameter T (T remains constant during the execution of the algorithm). A superior solution x has a probability of acceptance of at least 50%, irrespective of T. An inferior solution x has a probability of acceptance of at most 50% (between 0% and 50%). T should be chosen neither too low nor too high for a particular problem. The stochastic hill climber is the forerunner of simulated annealing.

Annealing Heating steel to a suitable temperature, followed by relatively slow cooling. The purpose of annealing may be to remove stresses, to soften the steel, to improve machinability, to improve cold-working properties, or to obtain a desired structure. The annealing process usually involves allowing the steel to cool slowly in the furnace.

Simulated Annealing (flowchart as steps): 1. Set the initial temperature T; select a current solution s and evaluate it; set k = 0. 2. Inner loop: k = k + 1; select a new solution x in the neighbourhood of s and evaluate it; if x is better than s, select x as the new current solution s, otherwise select x as the new current solution s with probability p. Repeat until k is large enough. 3. Decrease T. If T is low, STOP; otherwise set k = 0 and return to step 2.
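A compact sketch of that loop, assuming a minimisation problem, the common acceptance probability e^(-Δ/T) for worse moves, and a geometric cooling schedule (the slides leave the actual schedule open):

```python
import math
import random

random.seed(0)

def simulated_annealing(evaluate, neighbour, s,
                        T=100.0, T_min=0.1, alpha=0.9, k_max=50):
    while T > T_min:                 # outer loop: stop when T is low
        for _ in range(k_max):       # inner loop: k_max tries at this T
            x = neighbour(s)
            delta = evaluate(x) - evaluate(s)
            # Accept every improving move; accept a worse move with
            # probability exp(-delta / T), which shrinks as T decreases.
            if delta < 0 or random.random() < math.exp(-delta / T):
                s = x
        T *= alpha                   # decrease T (geometric cooling)
    return s

# Toy run: minimise f(x) = x^2 over the integers, starting far from 0.
result = simulated_annealing(lambda x: x * x,
                             lambda x: x + random.choice([-1, 1]),
                             s=60)
print(result)  # close to 0
```

At high T almost every move is accepted (random search); as T falls, only improving moves survive (classic hill climbing), which is exactly the behaviour described two slides below.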

Analogy between annealing and simulated annealing:

Annealing          | Simulated annealing
State              | Feasible solution
Energy             | Evaluation function
Ground state       | Optimal solution
Rapid quenching    | Local search
Careful annealing  | Simulated annealing

SA resembles a random search at higher temperatures and a classic hill climber at lower temperatures. When applied to a specific application, some questions come to mind: – What is the representation? – How are neighbours defined? – What is the evaluation function? – How do we determine how big k is? – How do we cool the system, i.e. decrease the temperature? – How do we determine the stopping condition?

Tabu Search A meta-heuristic search algorithm that guides a local heuristic search procedure to search beyond local optimality. It uses adaptive memory and responsive exploration to explore the search space. It is deterministic in nature, but it is possible to add some probabilistic elements to it.

Flowchart of tabu search: 1. Set the initial memory M. 2. Select a current solution s and evaluate it. 3. Select a number of solutions x, y, z, ... from the neighbourhood of s and evaluate them. 4. Select one solution x as the new current solution s; the decision is based on the quality measure scores and on M. 5. Update M and return to step 3.

Memory component of tabu search Common ways of structuring the memory include: – Recency-based memory: the memory structure records the last few iterations and is updated after each iteration. – Frequency-based memory: the memory structure works over a longer time horizon and measures the frequency of change at each position.
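A simplified sketch of tabu search with recency-based memory, where M is a fixed-length queue of recently visited solutions (an assumption for illustration; real implementations usually record move attributes rather than whole solutions):

```python
from collections import deque

def tabu_search(evaluate, neighbours, s, tenure=5, iterations=100):
    # M is the recency-based memory: a fixed-length queue of the last
    # few visited solutions; solutions in M are forbidden (tabu).
    M = deque(maxlen=tenure)
    best = s
    for _ in range(iterations):
        candidates = [x for x in neighbours(s) if x not in M]
        if not candidates:
            break
        # Move to the best admissible neighbour even if it is worse than s;
        # the tabu list is what prevents cycling straight back.
        s = min(candidates, key=evaluate)
        M.append(s)
        if evaluate(s) < evaluate(best):
            best = s
    return best

# Toy run: minimise f(x) = x^2 over the integers.
result = tabu_search(lambda x: x * x, lambda x: [x - 1, x + 1], s=30)
print(result)  # 0
```

Note that unlike hill climbing the search keeps moving after reaching the optimum (the tabu list pushes it onward), but the best solution found so far is recorded and returned.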

Evolution Here's a very oversimplified description of how evolution works in biology. Organisms (animals or plants) produce a number of offspring which are almost, but not entirely, like themselves – Variation may be due to mutation (random changes) – Variation may be due to sexual reproduction (offspring have some characteristics from each parent) Some of these offspring may survive to produce offspring of their own; some won't – The better-adapted offspring are more likely to survive – Over time, later generations become better and better adapted Genetic algorithms use this same process to evolve better programs

Evolutionary Algorithms

Genotype and Phenotype Genes are the basic instructions for building an organism. A chromosome is a sequence of genes. Biologists distinguish between an organism's genotype (the genes and chromosomes) and its phenotype (what the organism is actually like). Example: you might have genes to be tall, but never grow to be tall for other reasons (such as poor diet). Similarly, genes may describe a possible solution to a problem, without actually being the solution.

The basic genetic algorithm Start with a large population of randomly generated attempted solutions to a problem. Repeatedly do the following: – Evaluate each of the attempted solutions – Keep a subset of these solutions (the best ones) – Use these solutions to generate a new population Quit when you have a satisfactory solution (or you run out of time)

Flowchart of an evolutionary algorithm: 1. Create an initial population A and initialize the counter t = 0. 2. Evaluate all solutions s from A. 3. Select a set of parents from A. 4. Create a set of offspring. 5. Create a new population A from the existing parents and offspring; t = t + 1. 6. If t is large, STOP; otherwise return to step 2.

A really simple example Suppose your organisms are 32-bit computer words. You want a string in which all the bits are ones. Here's how you can do it: – Create 100 randomly generated computer words – Repeatedly do the following: count the 1 bits in each word; exit if any of the words have all 32 bits set to 1; keep the ten words that have the most 1s (discard the rest); from each word, generate 9 new words by picking a random bit in the word and toggling (changing) it Note that this procedure does not guarantee that the next generation will have more 1 bits, but it's likely
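This bit-string example can be run directly; the sketch below follows the steps in the slide (100 random words, keep the 10 best, 9 single-bit mutants of each):

```python
import random

random.seed(1)

def ones(word):
    # Fitness: the number of 1 bits in a 32-bit word.
    return bin(word).count("1")

population = [random.getrandbits(32) for _ in range(100)]  # 100 random words
generations = 0
while not any(ones(w) == 32 for w in population):  # exit when a word is all 1s
    generations += 1
    survivors = sorted(population, key=ones, reverse=True)[:10]  # keep 10 best
    population = list(survivors)
    for word in survivors:
        for _ in range(9):                    # 9 new words per survivor
            bit = 1 << random.randrange(32)   # pick a random bit position...
            population.append(word ^ bit)     # ...and toggle it

solution = max(population, key=ones)
print(generations, hex(solution))  # hex(solution) is '0xffffffff'
```

Because the ten best words are carried over unchanged, the best fitness never decreases even though an individual mutation can flip a 1 back to 0.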

A more realistic example, part I Suppose you have a large number of (x, y) data points – For example, (1.0, 4.1), (3.1, 9.5), (-5.2, 8.6),... You would like to fit a polynomial (of up to degree 5) through these data points – That is, you want a formula y = ax^5 + bx^4 + cx^3 + dx^2 + ex + f that gives you a reasonably good fit to the actual data – Here's the usual way to compute goodness of fit: compute the sum of (actual y – predicted y)^2 for all the data points; the lowest sum represents the best fit There are some standard curve-fitting techniques, but let's assume you don't know about them. You can use a genetic algorithm to find a pretty good solution

A more realistic example, part II Your formula is y = ax^5 + bx^4 + cx^3 + dx^2 + ex + f. Your genes are a, b, c, d, e, and f. Your chromosome is the array [a, b, c, d, e, f]. Your evaluation function for one array is: – For every actual data point (x, y), compute ý = ax^5 + bx^4 + cx^3 + dx^2 + ex + f – Find the sum of (y – ý)^2 over all x; the sum is your measure of badness (larger numbers are worse) – Example: for [0, 0, 0, 2, 3, 5] and the data points (1, 12) and (2, 22): ý = 2x^2 + 3x + 5 = 2 + 3 + 5 = 10 when x = 1; ý = 2x^2 + 3x + 5 = 8 + 6 + 5 = 19 when x = 2; (12 – 10)^2 + (22 – 19)^2 = 4 + 9 = 13. If these are the only two data points, the badness of [0, 0, 0, 2, 3, 5] is 13

A more realistic example, part III Your algorithm might be as follows: – Create 100 six-element arrays of random numbers – Repeat 500 times (or any other number): compute the badness of each of the 100 arrays (using all data points); keep the ten best arrays (discard the other 90); from each array you keep, generate nine new arrays by picking a random element of the six, picking a random floating-point number between 0.0 and 2.0, and multiplying the random element of the array by that random floating-point number – After all 500 trials, pick the best array as your final answer
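A runnable sketch of parts I–III; the data set is an invented example (the points happen to lie on y = 2x^2 + 3x + 5), and because the ten best arrays are always carried over, the badness of the best array can never increase:

```python
import random

random.seed(2)

# Hypothetical data for illustration, sampled from y = 2x^2 + 3x + 5.
data = [(x, 2 * x * x + 3 * x + 5) for x in (-2, -1, 0, 1, 2, 3)]

def badness(coeffs):
    # Sum of (actual y - predicted y)^2 over all data points; lower is better.
    a, b, c, d, e, f = coeffs
    total = 0.0
    for x, y in data:
        pred = a * x**5 + b * x**4 + c * x**3 + d * x**2 + e * x + f
        total += (y - pred) ** 2
    return total

population = [[random.uniform(-1, 1) for _ in range(6)] for _ in range(100)]
initial_best = min(badness(c) for c in population)

for _ in range(500):                          # repeat 500 times
    survivors = sorted(population, key=badness)[:10]  # keep the 10 best arrays
    population = list(survivors)
    for coeffs in survivors:
        for _ in range(9):                    # 9 mutated copies per survivor
            child = list(coeffs)
            i = random.randrange(6)           # pick a random element of the six
            child[i] *= random.uniform(0.0, 2.0)  # scale it by a random factor
            population.append(child)

winner = min(population, key=badness)
print(initial_best, "->", badness(winner))    # badness shrinks across generations
```

One limitation worth noticing: multiplicative mutation can never flip a coefficient's sign, which is one reason real GAs prefer richer mutation operators like those listed at the end of this deck.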

The really simple example again Suppose your organisms are 32-bit computer words, and you want a string in which all the bits are ones. Here's how you can do it: – Create 100 randomly generated computer words – Repeatedly do the following: count the 1 bits in each word; exit if any of the words have all 32 bits set to 1; keep the ten words that have the most 1s (discard the rest); from each word, generate 9 new words as follows: – Choose one of the other words – Take the first half of this word and combine it with the second half of the other word

Asexual vs. sexual reproduction In the examples so far: – Each organism (or solution) had only one parent – Reproduction was asexual – The only way to introduce variation was through mutation (random changes) In sexual reproduction: – Each organism (or solution) has two parents – Assuming that each organism has just one chromosome, new offspring are produced by forming a new chromosome from parts of the chromosomes of each parent

Crossover and mutation operators (figures illustrating the mutation and crossover operators)

Crossover is a genetic operator that combines (mates) two chromosomes (parents) to produce a new chromosome (offspring). Types of crossover – One point – Two point – Arithmetic – Heuristic

Types of crossover (figures illustrating one-point and two-point crossover)

Arithmetic crossover Offspring1 = a * Parent1 + (1 – a) * Parent2; Offspring2 = (1 – a) * Parent1 + a * Parent2. Parent 1: (0.3)(1.4)(0.2)(7.4); Parent 2: (0.5)(4.5)(0.1)(5.6); a = 0.7. Offspring1: (0.36)(2.33)(0.17)(6.86); Offspring2: (0.44)(3.57)(0.13)(6.14)
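The arithmetic crossover formulas can be checked directly; with a = 0.7, each offspring gene is a weighted average of the corresponding parent genes:

```python
def arithmetic_crossover(p1, p2, a):
    # Offspring are weighted averages of the parents:
    #   o1 = a*p1 + (1 - a)*p2,  o2 = (1 - a)*p1 + a*p2
    o1 = [a * x + (1 - a) * y for x, y in zip(p1, p2)]
    o2 = [(1 - a) * x + a * y for x, y in zip(p1, p2)]
    return o1, o2

p1 = [0.3, 1.4, 0.2, 7.4]
p2 = [0.5, 4.5, 0.1, 5.6]
o1, o2 = arithmetic_crossover(p1, p2, a=0.7)
print([round(v, 2) for v in o1])  # [0.36, 2.33, 0.17, 6.86]
print([round(v, 2) for v in o2])  # [0.44, 3.57, 0.13, 6.14]
```

Because each offspring gene is a convex combination of the parents' genes, arithmetic crossover always produces children inside the "box" spanned by the two parents.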

Comparison of simple examples In the simple example (trying to get all 1s): – The sexual (two-parent, no mutation) approach, if it succeeds, is likely to succeed much faster, because up to half of the bits change each time, not just one bit – However, with no mutation, it may not succeed at all: by pure bad luck, maybe none of the first (randomly generated) words have (say) bit 17 set to 1; then there is no way a 1 could ever occur in this position – Another problem is lack of genetic diversity: maybe some of the first generation did have bit 17 set to 1, but none of them were selected for the second generation The best technique in general turns out to be sexual reproduction with a small probability of mutation

Flip bit – A mutation operator that simply inverts the value of the chosen gene (0 goes to 1 and 1 goes to 0). This mutation operator can only be used for binary genes.
Boundary – A mutation operator that replaces the value of the chosen gene with either the upper or lower bound for that gene (chosen randomly). This mutation operator can only be used for integer and float genes.
Non-uniform – A mutation operator that increases the probability that the amount of the mutation will be close to 0 as the generation number increases. This operator keeps the population from stagnating in the early stages of the evolution, then allows the genetic algorithm to fine-tune the solution in the later stages. It can only be used for integer and float genes.
Uniform – A mutation operator that replaces the value of the chosen gene with a uniform random value selected between the user-specified upper and lower bounds for that gene. It can only be used for integer and float genes.
Gaussian – A mutation operator that adds a unit-Gaussian-distributed random value to the chosen gene. The new gene value is clipped if it falls outside the user-specified lower or upper bounds for that gene. It can only be used for integer and float genes.
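Sketches of four of these operators in Python (the bounds and the Gaussian sigma are illustrative choices, not from the slides):

```python
import random

random.seed(3)

def flip_bit(gene):
    # Invert a binary gene: 0 -> 1, 1 -> 0.
    return 1 - gene

def boundary_mutation(gene, lower, upper):
    # Replace the gene with its lower or upper bound, chosen randomly.
    return random.choice([lower, upper])

def uniform_mutation(gene, lower, upper):
    # Replace the gene with a uniform random value between the bounds.
    return random.uniform(lower, upper)

def gaussian_mutation(gene, lower, upper, sigma=1.0):
    # Add a Gaussian-distributed random value (unit sigma by default),
    # then clip the result to the gene's bounds.
    return max(lower, min(upper, gene + random.gauss(0.0, sigma)))

print(flip_bit(0), flip_bit(1))          # 1 0
print(boundary_mutation(0.4, 0.0, 1.0))  # either 0.0 or 1.0
print(gaussian_mutation(0.4, 0.0, 1.0))  # somewhere in [0.0, 1.0]
```

Non-uniform mutation would follow the same shape as `gaussian_mutation` but with a step size that decays as the generation counter grows, trading early exploration for late fine-tuning.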