Societies of Hill-Climbers
Before presenting SoHCs, let's first talk about hill-climbing in general. For this lecture we will confine ourselves to binary-coded representations; however, the concept of hill-climbing can be extended to real-coded representations as well.

Deterministic Hill-Climbing
In our discussion of hill-climbing, let's try to solve two function optimization problems (maximization):
–F1(x) = x², and
–F2(x) = Σ xᵢ
We will apply two types of hill-climbing:
–Steepest Ascent, and
–Next Ascent.
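
To make the traces below concrete, here is a minimal Python sketch of the two objectives, assuming a candidate solution is a tuple of 0/1 bits decoded left-to-right as an unsigned integer; the helper names decode, f1, and f2 are ours, not from the slides.

    def decode(bits):
        # Interpret the bit tuple as an unsigned integer, e.g. (0,1,0,0,1) -> 9
        value = 0
        for b in bits:
            value = value * 2 + b
        return value

    def f1(bits):
        x = decode(bits)
        return x * x      # F1(x) = x^2

    def f2(bits):
        return sum(bits)  # F2(x) = sum of the bits x_i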

Deterministic Hill-Climbing
Hill-climbers are local search techniques; therefore it is important to define the neighborhood used by the local search. For our demonstration, all CSs that are a Hamming distance of 1 away from the parent (the individual being mutated) are considered neighbors.
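
A sketch of that Hamming-distance-1 neighborhood, under the same bit-tuple representation (the helper name neighbors is ours):

    def neighbors(bits):
        # All candidate solutions a Hamming distance of 1 from the
        # parent: flip each bit position in turn.
        result = []
        for i in range(len(bits)):
            flipped = list(bits)
            flipped[i] = 1 - flipped[i]
            result.append(tuple(flipped))
        return result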

Deterministic Steepest Ascent Hill-Climbing
Consider the representation of a CS for F1 and F2 to be composed of 5 bits. Deterministic Steepest Ascent Hill-Climbing works as follows:
–Step 1: Randomly generate and evaluate b, the current best individual.
–Step 2: Generate and evaluate ALL of the neighbors of b. Let Neighbors = the set of λ neighbors.
–Step 3: b = best-of(b ∪ Neighbors)
–Step 4: If the fitness of b does not improve, either terminate the search or go to Step 1; else go to Step 2.
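
A minimal, runnable sketch of Steps 1-4, assuming the decode/f1/neighbors helpers above; the restart-versus-terminate choice in Step 4 is simplified to termination:

    import random

    def steepest_ascent_hc(fitness, n_bits=5, seed=None):
        rng = random.Random(seed)
        # Step 1: randomly generate and evaluate b
        b = tuple(rng.randint(0, 1) for _ in range(n_bits))
        while True:
            # Steps 2-3: evaluate ALL neighbors; b = best-of(b u Neighbors)
            best = max([b] + neighbors(b), key=fitness)
            # Step 4: terminate when the fitness of b does not improve
            if fitness(best) <= fitness(b):
                return b
            b = best

For example, steepest_ascent_hc(f1) climbs to (11111), the maximum of F1 over 5 bits, since F1 is monotone in the decoded value.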

Deterministic Steepest Ascent Hill-Climbing
Let's use DSAHC to solve F1:
–Let b = (01001)
–Neighbors = {(11001), (00001), (01101), (01011), (01000)}
–The best of b ∪ Neighbors is (11001); therefore,
–Let b = (11001)

Deterministic Steepest Ascent Hill-Climbing
Continuing (second cycle):
–Let b = (11001)
–Neighbors = {(01001), (10001), (11101), (11011), (11000)}
–The best of b ∪ Neighbors is (11101); therefore,
–Let b = (11101)

Deterministic Steepest Ascent Hill-Climbing
Continuing (third cycle):
–Let b = (11101)
–Neighbors = {(01101), (10101), (11001), (11111), (11100)}
–The best of b ∪ Neighbors is (11111); therefore,
–Let b = (11111)
On the fourth cycle, we cannot find a candidate solution that is better than b, so we terminate our search.

Deterministic Steepest Ascent Hill-Climbing
For some problems the defined neighborhood may be too large to search exhaustively. When this is the case, a Stochastic Steepest Ascent Hill-Climber may be preferred: Neighbors becomes a set containing a user-specified number of neighbors sampled from the neighborhood, and again b = best-of(b ∪ Neighbors), as sketched below.
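
A sketch of one stochastic steepest-ascent step under the same helpers; k, the sample size, is the user-specified number of neighbors:

    def stochastic_steepest_ascent_step(b, fitness, k, rng):
        # Sample k neighbors from the neighborhood instead of
        # enumerating all of it, then keep the best of b and the sample.
        sampled = rng.sample(neighbors(b), k)
        return max([b] + sampled, key=fitness)

With a Hamming-1 neighborhood this only pays off for long strings; the same idea applies to larger neighborhoods (e.g., Hamming distance 2, whose size grows quadratically with the string length).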

Deterministic Next Ascent Hill-Climbing
Deterministic Next Ascent Hill-Climbing works as follows:
–Step 1: Randomly generate and evaluate b, the current best individual.
–For k = 1 to Number of Neighbors Do
–Step 2: Generate and evaluate the neighbors of b one at a time: generate and evaluate Neighbor_k.
–Step 3: b = best-of(b, Neighbor_k)
–End (For)
–Step 4: If the fitness of b does not improve, either terminate the search or go to Step 1; else go to Step 2.
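
The same steps as a runnable sketch; after each improvement the sweep restarts from the new b, matching the cycles traced below:

    def next_ascent_hc(fitness, n_bits=5, seed=None):
        rng = random.Random(seed)
        # Step 1: randomly generate and evaluate b
        b = tuple(rng.randint(0, 1) for _ in range(n_bits))
        improved = True
        while improved:
            improved = False
            # Steps 2-3: evaluate neighbors one at a time, in a fixed
            # order (bit 1 to bit n), keeping the first improvement
            for neighbor in neighbors(b):
                if fitness(neighbor) > fitness(b):
                    b = neighbor
                    improved = True
                    break  # restart the sweep from the new b
        return b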

Deterministic Next Ascent Hill-Climbing
Let's use DNAHC to solve F1:
–Let b = (01001)
–Neighbor = (11001)
–The best of b and Neighbor is (11001); therefore,
–Let b = (11001)

Deterministic Next Ascent Hill-Climbing
Continuing (second cycle):
–Let b = (11001)
–Neighbor = (01001); the best of b and Neighbor is (11001)
–Neighbor = (10001); the best of b and Neighbor is (11001)
–Neighbor = (11101); the best of b and Neighbor is (11101)
–Let b = (11101), go to third cycle

Deterministic Next Ascent Hill-Climbing
Continuing (third cycle):
–Let b = (11101)
–Neighbor = (01101); the best of b and Neighbor is (11101)
–Neighbor = (10101); the best of b and Neighbor is (11101)
–Neighbor = (11001); the best of b and Neighbor is (11101)
–Neighbor = (11111); the best of b and Neighbor is (11111)
–Let b = (11111), go to fourth cycle

Deterministic Next Ascent Hill-Climbing
Continuing (fourth cycle):
–Let b = (11111)
–Neighbor = (01111); the best of b and Neighbor is (11111)
–Neighbor = (10111); the best of b and Neighbor is (11111)
–Neighbor = (11011); the best of b and Neighbor is (11111)
–Neighbor = (11101); the best of b and Neighbor is (11111)
–Neighbor = (11110); the best of b and Neighbor is (11111)
–Let b = (11111); no neighbor improves on b, so terminate the search

Stochastic Next Ascent Hill-Climbing
In order to do deterministic hill-climbing, one must also specify the order in which the neighbors are to be generated and evaluated. This order can dramatically affect the performance of hill-climbing. In a Stochastic Next Ascent Hill-Climber, neighbors are instead selected at random from the neighborhood.
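
A sketch of one stochastic next-ascent sweep: identical to the deterministic version except that the neighborhood is visited in a random order:

    def stochastic_next_ascent_step(b, fitness, rng):
        order = neighbors(b)
        rng.shuffle(order)          # random visiting order
        for neighbor in order:
            if fitness(neighbor) > fitness(b):
                return neighbor     # accept the first improvement
        return b                    # no neighbor improved on b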

Some Hill-Climbing Questions
Of the two types of hill-climbers, which will perform better on F2?
For what types of problems will Steepest Ascent Hill-Climbing outperform Next Ascent Hill-Climbing?
For what types of problems will Next Ascent Hill-Climbing outperform Steepest Ascent Hill-Climbing?

Societies of Hill-Climbers
A Society of Hill-Climbers (SoHC), as proposed by Sebag & Schoenauer (1997), can be seen as being composed of:
–a (μ+λ)-EC, and
–a repoussoir (repeller), which is an adaptive external memory structure used to 'push' the search away from known sub-optimal solutions.

Societies of Hill-Climbers
Since they evolve an adaptive external memory structure, Societies of Hill-Climbers can be considered a swarm (Swarm Intelligence method), similar to:
–Particle Swarm Optimization [P-Vectors],
–Ant Colony Optimization [Pheromone Matrix],
–Cultural Algorithms [Belief Space],
–Tabu Search [Tabu List]

Societies of Hill-Climbers
Sebag & Schoenauer describe two types of Hill-Climber Societies:
–Natural Societies:
μ represents the number of hill-climbers (candidate solutions),
λ represents the number of offspring created (λ/μ offspring for each parent), and
M represents the number of bits to be mutated to create an offspring.

Societies of Hill-Climbers
–Historical Societies:
R is the percentage of the worst-fit CSs that will be used to create AvgWorst in order to update the repeller,
α represents the decay rate for the components of the repeller, and
T is the tournament size for selecting a variable (gene) of the parent to mutate.
The only sensitive parameter for historical societies is M.

Societies of Hill-Climbers
Historical Societies: the repeller is a real-valued vector of length n, where n is the number of variables. Each component of the repeller is initially assigned a value of 0.5. The repeller is updated as follows:
Repeller = (1 − α)·Repeller + α·AvgWorst

Societies of Hill-Climbers: Historical Societies
An Example SoHC
Procedure SoHC{
  t = 0;
  Initialize Pop(t);
  Evaluate Pop(t);
  Initialize Repoussoir(t);
  While (Not Done)
  {
    Offspring(t) = Procreate(Pop(t), Repoussoir(t));
    Evaluate(Offspring(t));
    Pop(t+1) = Best_μ_of(Pop(t), Offspring(t));
    AvgWorst = Worst_R%_of(Pop(t), Offspring(t));
    Repoussoir(t+1) = (1-α)Repoussoir(t) + αAvgWorst;
    t = t + 1;
  }
}
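
A runnable Python sketch of the procedure above, under these assumptions: λ/μ offspring are created per parent; each offspring mutates M bits, each chosen by a size-T tournament that flips the sampled gene closest in value to the repoussoir (so mutation pushes the child away from it); and AvgWorst is averaged over the worst R fraction of parents plus offspring. The tournament detail is our reading of Sebag & Schoenauer, not spelled out on the slides.

    import random

    def sohc(fitness, n_bits, mu=2, lam=4, M=1, R=0.5, alpha=0.5, T=2,
             generations=50, seed=None):
        rng = random.Random(seed)
        pop = [tuple(rng.randint(0, 1) for _ in range(n_bits))
               for _ in range(mu)]
        repoussoir = [0.5] * n_bits   # every component starts at 0.5

        def procreate(parent):
            child = list(parent)
            for _ in range(M):
                # Size-T tournament: flip the sampled gene whose value
                # is closest to the repoussoir's, moving away from it.
                genes = rng.sample(range(n_bits), T)
                g = min(genes, key=lambda i: abs(child[i] - repoussoir[i]))
                child[g] = 1 - child[g]
            return tuple(child)

        for _ in range(generations):
            offspring = [procreate(p) for p in pop
                         for _ in range(lam // mu)]
            union = sorted(pop + offspring, key=fitness, reverse=True)
            pop = union[:mu]                       # Best_mu_of(...)
            n_worst = max(1, int(R * len(union)))  # Worst_R%_of(...)
            worst = union[-n_worst:]
            avg_worst = [sum(ind[i] for ind in worst) / n_worst
                         for i in range(n_bits)]
            # Repoussoir(t+1) = (1 - alpha) Repoussoir(t) + alpha AvgWorst
            repoussoir = [(1 - alpha) * r + alpha * w
                          for r, w in zip(repoussoir, avg_worst)]
        return max(pop, key=fitness)

For instance, sohc(f1, n_bits=5, mu=2, lam=4, seed=0) mirrors the setup of the hand example that follows (two persons, four children per generation).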

Societies of Hill-Climbers: Historical Societies
Let's try an example by hand for F1!
Pop(0):
Person 1: (01010) Fit: 100
Person 2: (00011) Fit: 9
Offspring(0):
Child 1: (00010) Fit: 4
Child 2: (01110) Fit: 196
Child 3: (10011) Fit: 361
Child 4: (00001) Fit: 1

Societies of Hill-Climbers: Historical Societies
The worst 3 of Pop(0) ∪ Offspring(0):
Person 2: (00011) Fit: 9
Child 1: (00010) Fit: 4
Child 4: (00001) Fit: 1
AvgWorst: (0, 0, 0, 2/3, 2/3)
Repoussoir(0): (0.5, 0.5, 0.5, 0.5, 0.5)
Repoussoir(1): (1 − α)·Repoussoir(0) + α·AvgWorst

Societies of Hill-Climbers: Historical Societies
Repoussoir(1): as computed on the previous slide
Pop(1):
Person 1: (01110) Fit: 196
Person 2: (10011) Fit: 361
Offspring(1):
Child 1: (11011) Fit: 729
Child 2: (10111) Fit: 529
Child 3: (11110) Fit: 900
Child 4: (01111) Fit: 225

Societies of Hill-Climbers: Historical Societies
The worst 3 of Pop(1) ∪ Offspring(1):
Person 1: (01110) Fit: 196
Person 2: (10011) Fit: 361
Child 4: (01111) Fit: 225
AvgWorst: (1/3, 2/3, 2/3, 1, 2/3)
Repoussoir(2): (1 − α)·Repoussoir(1) + α·AvgWorst

Societies of Hill-Climbers: Historical Societies
Repoussoir(2): as computed on the previous slide
Pop(2):
Person 1: (11011) Fit: 729
Person 2: (11110) Fit: 900
Offspring(2):
Child 1: _____ Fit: ___
Child 2: _____ Fit: ___
Child 3: _____ Fit: ___
Child 4: _____ Fit: ___

Societies of Hill-Climbers: Historical Societies
Will the repoussoir work all of the time? How can we develop more efficient SoHCs?
What have we learned from our brief introduction to Swarm Intelligence?
–Some swarms are 'pullers': Ant Swarms, Ant Colony & Particle Swarm Optimizers, Cultural Algorithms
–Some swarms are 'pushers': Societies of Hill-Climbers, Tabu Search
Should we really rely on pushing or pulling exclusively? During what parts of the evolutionary process would 'pushing' be more beneficial? During what parts would 'pulling' be more beneficial?