Chapter 4 Beyond Classical Search


Chapter 4 Beyond Classical Search 4.1.1 Hill Climbing Search

Hill Climbing Issues
Also referred to as gradient ascent (or gradient descent, when minimizing)
Foothill problem: the search gets stuck at local maxima / local minima; can be mitigated with random restarts or a random walk
Other problems: ridges, plateaus
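The issues above can be made concrete with a small sketch. This is a minimal hill climber plus the random-restart fix mentioned on the slide; the function names (`hill_climb`, `random_restart_hill_climb`) and the generic `neighbors` callback are illustrative choices, not part of the original slides.

```python
import random

def hill_climb(f, neighbors, start, max_steps=1000):
    """Basic hill climbing: repeatedly move to the best neighbor until no
    neighbor improves on the current state (a local maximum)."""
    current = start
    for _ in range(max_steps):
        candidate = max(neighbors(current), key=f)
        if f(candidate) <= f(current):
            return current  # stuck on a local maximum, ridge, or plateau
        current = candidate
    return current

def random_restart_hill_climb(f, neighbors, random_start, restarts=10):
    """Random restarts: rerun hill climbing from fresh random starting
    points and keep the best local maximum found."""
    return max((hill_climb(f, neighbors, random_start())
                for _ in range(restarts)), key=f)
```

On a 1-D landscape with a foothill (e.g. values `[1, 3, 2, 0, 5, 9, 5]`), plain hill climbing started at index 0 stops on the foothill at index 1, while restarts usually reach the global peak at index 5.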

Chapter 4.1.4 Genetic algorithms The basic purpose of genetic algorithms (GAs) is optimization. Since optimization problems arise frequently, GAs are useful for a great variety of tasks. As in any optimization problem, we want to maximize (or minimize) an objective function f(x) over a given space X of arbitrary dimension. A brute-force search, which would examine every possible x in X in order to determine the element for which f is optimal, is clearly infeasible. GAs give a heuristic way of searching the input space for an optimal x: they approximate the effect of brute force without enumerating all the elements, and therefore bypass the performance problems of exhaustive search.

Underlying idea (1) We first select a certain number of inputs, say x1, x2, ..., xn, belonging to the input space X. In GA terminology, each input is called an organism or chromosome, and the set of chromosomes is called a colony or population. Computation proceeds in epochs; in each epoch the colony grows and evolves according to specific rules reminiscent of biological evolution.

Underlying idea (2) To each chromosome xi we assign a fitness value, which is simply f(xi). Stronger individuals, that is, chromosomes whose fitness is closer to the colony's best, have a greater chance of surviving across epochs and of reproducing than weaker individuals, which tend to perish. In other words, the algorithm tends to keep inputs that are close to the optimum in the set of inputs being considered (the colony) and to discard those that under-perform the rest.

Underlying idea (3) The crucial step in the algorithm is reproduction, or breeding, which occurs once per epoch. The contents of the two chromosomes participating in reproduction are literally merged to form a new chromosome that we call (guess what?) a child. This heuristic lets us combine the best of both individuals to yield, possibly, a better one (evolution).

Underlying idea (4) Moreover, during each epoch a given fraction of the organisms is allowed to mutate (yes, we also have parthenogenesis). This provides a degree of randomness that lets us span the whole input space by generating individuals with partly random genes.

Underlying idea (5) As mentioned earlier, each epoch ends with the death of inapt organisms: we eliminate inputs that perform badly compared with the overall group. This rests on the assumption that they are less likely to give birth to strong individuals, since they have poor-quality genes, and that we can therefore safely disregard them (selection).

The algorithm Every input x in X is an integer vector x = (x1, x2, ..., xn). For the sake of simplicity, assume 0 <= xi <= k for i = 1...n. In order to implement our genetic algorithm for optimizing f, we first need to encode each input as a chromosome. We can do so by using ceil(log2(k+1)) bits per component and directly encoding the value xi in binary (see Fig. in next slide). Each bit is termed a gene. Of course, we may choose any other encoding based on our requirements and the problem at hand.
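The binary encoding described above can be sketched as follows. This is one possible implementation under the slide's assumption that each component satisfies 0 <= xi <= k; the helper names `encode` and `decode` are my own.

```python
import math

def encode(x, k):
    """Encode an integer vector x (each component in 0..k) as a bit-string
    chromosome, using ceil(log2(k+1)) bits (genes) per component."""
    bits = max(1, math.ceil(math.log2(k + 1)))
    return "".join(format(xi, f"0{bits}b") for xi in x)

def decode(chrom, n, k):
    """Inverse of encode: split the bit string back into n integers."""
    bits = max(1, math.ceil(math.log2(k + 1)))
    return [int(chrom[i * bits:(i + 1) * bits], 2) for i in range(n)]
```

For example, with k = 7 each component takes 3 genes, so the vector (3, 5) becomes the chromosome "011101".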

The algorithm (cont’d) At epoch 0, we generate (possibly at random) an initial set of inputs in X. Then, at each epoch i, we perform fitness evaluation, reproduction, mutation, and selection. The algorithm stops when a specified criterion providing an estimate of convergence is reached.
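The epoch loop just described can be written as a short skeleton. This is a sketch, not the chapter's definitive implementation: the operator functions (`init_population`, `crossover`, `mutate`, `select`) are placeholders supplied by the caller, and the fixed epoch count stands in for the convergence criterion.

```python
import random

def genetic_algorithm(f, init_population, crossover, mutate, select,
                      epochs=100, mutation_rate=0.1):
    """Skeleton of the GA epoch loop: fitness evaluation, reproduction,
    mutation, and selection, repeated until the stopping criterion
    (here, a fixed number of epochs) is reached."""
    colony = init_population()
    for _ in range(epochs):
        # Fitness evaluation (requires hashable chromosomes)
        fitness = {c: f(c) for c in colony}
        # Reproduction: breed children from the current colony
        children = crossover(colony, fitness)
        # Mutation: randomly perturb a fraction of the new organisms
        children = [mutate(c) if random.random() < mutation_rate else c
                    for c in children]
        # Selection: keep the fittest, discard the rest
        colony = select(colony + children, f)
    return max(colony, key=f)
```

With simple operators (averaging crossover, bit-flip mutation, truncation selection) this loop reliably maximizes, say, f(x) = -(x - 10)^2 over the integers 0..31.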

Reproduction: At each epoch, we choose a set of chromosomes in the population that will mate; we choose to call such individuals females. Each female picks a random set of potential partners and mates with the fittest of the group (this is another way of achieving selection). Once two organisms have been chosen for crossover, we merge their genetic information to create a new organism. The split position is determined randomly.
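A single mating under this scheme might look as follows, assuming chromosomes are bit strings of equal length as on the previous slides; the function name `breed` and the `suitors` parameter are illustrative.

```python
import random

def breed(population, f, suitors=3):
    """One mating as described above: a 'female' chromosome picks a random
    group of potential partners and mates with the fittest of them, then
    the two bit strings are spliced at a random split position."""
    female = random.choice(population)
    partners = random.sample(population, suitors)
    male = max(partners, key=f)
    # Single-point crossover: the child takes the female's genes up to
    # the split position and the male's genes after it.
    split = random.randrange(1, len(female))
    return female[:split] + male[split:]
```

Note how choosing the fittest of a random group of partners is itself a mild selection pressure, independent of the death step.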

Mutation: A new organism is created by randomly modifying some of an existing organism's genes. This can be done right after reproduction, on the newly created child, or as a separate process.
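For bit-string chromosomes, mutation is simply an independent bit flip per gene; a minimal sketch (the per-gene `rate` parameter is an assumption, since the slides only say "some of its genes"):

```python
import random

def mutate(chromosome, rate=0.05):
    """Flip each gene (bit) independently with probability `rate`,
    producing a new, partly random organism."""
    return "".join(
        ("1" if g == "0" else "0") if random.random() < rate else g
        for g in chromosome)
```

Setting `rate=1.0` flips every gene, while `rate=0.0` returns the chromosome unchanged.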

Death: The worst performers in the colony are given a high probability of dying at the end of each epoch. We may also consider eliminating old chromosomes (note: the highest performer is immune from death from old age).
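One way to realize "a high probability of dying" for weak performers, while keeping the best organism immune as the slide notes, is rank-weighted removal. This is an illustrative sketch; the slides do not prescribe a particular death scheme, and the `cull` function and its rank weighting are my own choices.

```python
import random

def cull(colony, f, survivors):
    """Probabilistic death: the worse an organism ranks, the likelier it
    dies at the end of the epoch. The single best performer is always
    kept, mirroring the slide's immunity note."""
    ranked = sorted(colony, key=f, reverse=True)
    best, rest = ranked[0], ranked[1:]
    # Weight each remaining organism by its rank, so the worst
    # performers are the least likely to be kept.
    pool, weights = list(rest), [len(rest) - i for i in range(len(rest))]
    kept = []
    while pool and len(kept) < survivors - 1:
        pick = random.choices(range(len(pool)), weights=weights)[0]
        kept.append(pool.pop(pick))
        weights.pop(pick)
    return [best] + kept
```

Because survival is probabilistic rather than a hard cutoff, an occasional weak organism survives, which helps preserve genetic diversity.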

Remarks: In order to have fine-grained control over the computation, we have to adjust parameters such as colony size, rate of reproduction, rate of death, and so on. Obviously these must be set empirically to fine-tune the performance of the GA. We can do even better by incorporating such parameters into each chromosome, so that optimal values may be found by the GA itself. A major problem with most optimization techniques is the hill-climbing trap: the algorithm gets stuck in a local maximum and stops when the convergence criterion is reached. Mutation, which introduces randomness into the method, allows us to avoid, or at least reduce, this undesirable effect.