Meta-Heuristic Algorithms 16B1NCI637


Meta-Heuristic Algorithms 16B1NCI637
Raju Pal, Assistant Professor, JIIT, Sector 128, Noida
http://bit.ly/2dCUlOO

Course Objectives

Why?
Metaheuristics are used to solve many real-world, complex, or large problems where classical methods do not work.
The number of application areas is increasing substantially.
These methods are easy to apply and simple to learn.

Optimization: Introduction
Metaheuristics form a subfield of stochastic optimization. Stochastic (= probabilistic) optimization is optimization in which the input parameters are subject to randomness.
Optimization is the process of making something (such as the design of a system, or a decision) as fully perfect, functional, or effective as possible.
It is used almost everywhere: science, economics, finance, and every area of engineering and industry.
The objectives are to save time, money, energy, and resources, and to maximize efficiency, performance, and quality.

Examples of Optimization
Routing and scheduling: transport routes, bus/train/airline routes, project scheduling
Planning: use of resources, time, and money for optimal output, services, and performance
Design optimization: structural optimization, shape optimization, parameter optimization, leading to better products
Optimal control: optimal control of dynamic systems (e.g., spacecraft, moon landers, cars, airplanes)
Economics: portfolios, financial derivatives, banking
Optimization is like treasure hunting: how do you find a hidden treasure of 1 million dollars? What is your best strategy?

Mathematical Representation of an Optimization Problem
minimize f_i(x), i = 1, 2, ..., M
subject to h_j(x) = 0, j = 1, 2, ..., J
and g_k(x) <= 0, k = 1, 2, ..., K
where x = (x_1, x_2, ..., x_n)
The components x_i of x are called design or decision variables; they can be real continuous, discrete, or a mix of these two.
The functions f_i, i = 1, 2, ..., M, are called the objective functions or simply cost functions.
The space spanned by the decision variables is called the design space or search space.
The space formed by the objective function values is called the solution space or response space.
The equalities for h_j and inequalities for g_k are called constraints.
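The formulation above can be sketched directly in code. A minimal illustration (the particular objective and constraint functions are invented for the example, not from the slides):

```python
# A toy constrained problem mirroring the notation above:
# minimize f(x) subject to one equality and one inequality constraint.
# All function choices here are illustrative.

def objective(x):
    # f(x) = x1^2 + x2^2  (cost function over two decision variables)
    return x[0] ** 2 + x[1] ** 2

def equality_constraint(x):
    # h(x) = x1 + x2 - 1 = 0
    return x[0] + x[1] - 1

def inequality_constraint(x):
    # g(x) = x1 - 0.8 <= 0
    return x[0] - 0.8

def is_feasible(x, tol=1e-9):
    # A point in the search space is feasible if it satisfies all constraints.
    return abs(equality_constraint(x)) <= tol and inequality_constraint(x) <= 0

# The design space is all (x1, x2); the feasible region is the line
# x1 + x2 = 1 restricted to x1 <= 0.8.
print(is_feasible((0.5, 0.5)))  # True
print(objective((0.5, 0.5)))    # 0.5
```

Here the constrained minimum lies at (0.5, 0.5), where the objective value 0.5 is a point of the solution space.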

Optimization Methods
Classical (deterministic) optimization methods
Stochastic (probabilistic) optimization methods
Classical optimization methods include, for example:
gradient descent (following the steepest slope)
Newton's method
penalty methods, Lagrange multiplier methods, etc.
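As a concrete illustration of the first classical method, a minimal gradient descent sketch; the objective, step size, and iteration count are illustrative choices:

```python
# Gradient descent ("following the steepest slope") on a convex quadratic
# f(x) = (x - 3)^2, whose minimum is at x = 3.

def grad(x):
    # derivative of (x - 3)^2
    return 2 * (x - 3)

x = 0.0       # starting point
lr = 0.1      # learning rate (step size)
for _ in range(100):
    x = x - lr * grad(x)   # step against the gradient

print(round(x, 4))  # 3.0
```

Because the function is convex, the method reaches the global minimum regardless of the starting point; this is exactly the setting where classical methods shine.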

Types of problems (or functions)
Classical methods are particularly useful for convex problems, e.g. a quadratic function or the exponential function, where any local minimum is also a global minimum. Linear functions are convex, so linear programming problems are convex problems.

Deterministic Methods
Merits:
Give exact solutions
Do not use any stochastic technique
Rely on a thorough search of the feasible domain
Demerits:
Not robust: can be applied only to a restricted class of problems
Often too time-consuming, or sometimes unable to solve real-world problems
Classical methods are less useful in cases with:
non-differentiable objective functions
objective functions whose values can be obtained only as the result of a (lengthy) simulation
a varying number of variables (as in the optimization of neural networks)

Stochastic Optimization
Merits:
Applicable to a wider set of problems, i.e. the function need not be convex, continuous, or explicitly defined
Uses a stochastic or probabilistic (i.e. random) approach
Demerits:
Converges to the global optimum only probabilistically
Sometimes gets stuck at a local optimum
Many (but not all) stochastic optimization methods are inspired by nature. Nature is all about adaptation, which can be seen as a kind of optimization.
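The simplest stochastic optimizer is pure random search, sketched below; the toy objective, search range, and sample budget are illustrative choices:

```python
import random

# Pure random search: it only needs to evaluate f, so f need not be
# convex, differentiable, or even given in closed form.

def f(x):
    return (x - 2) ** 2   # toy objective with minimum at x = 2

random.seed(0)                            # fixed seed for reproducibility
best_x = random.uniform(-10, 10)
for _ in range(5000):
    candidate = random.uniform(-10, 10)   # random point in the search space
    if f(candidate) < f(best_x):          # keep the best solution seen so far
        best_x = candidate

print(abs(best_x - 2) < 0.1)  # True: finds the optimum probabilistically
```

With enough samples the best-so-far approaches the optimum with probability tending to one, which is exactly the "converges probabilistically" merit-and-demerit noted above.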

Nature Inspired Algorithms
Nature provides some of the most efficient ways to solve problems.
These algorithms imitate processes found in nature.
Example problems: aircraft wing design, bullet train, robotic spy plane.

The No Free Lunch Theorem
One of the more interesting developments in optimization theory was the publication of the No Free Lunch (NFL) theorem (Wolpert and Macready, 1995; Wolpert and Macready, 1997). This theorem states that the performance of all optimization (search) algorithms, averaged over the set of all possible functions, is equivalent.

Nature Inspired Algorithms for Optimization

Evolution

Evolutionary Algorithms
Natural selection: a guided search procedure.
Individuals suited to the environment survive, reproduce, and pass their genetic traits to offspring (survival of the fittest).
Offspring are created by reproduction, mutation, etc.
Populations adapt to their environment; variations accumulate over time to generate new species.

Evolutionary Algorithms: Terminologies
Individual: carrier of the genetic information (chromosome). It is characterized by its state in the search space and its fitness (objective function value).
Population: pool of individuals, which allows the application of genetic operators.
Fitness function: often used as a synonym for the objective function.
Generation: the (natural) time unit of the EA; one iteration step of an evolutionary algorithm.
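These terms map naturally onto a small data representation. A sketch using a bit-string encoding and the OneMax fitness function (both illustrative choices, not from the slides):

```python
import random

# Individual: a chromosome (bit string) = its state in the search space.
# Population: a pool of individuals.
# Fitness function: the objective function evaluated on an individual.

random.seed(1)

def fitness(chromosome):
    # OneMax: fitness is the number of 1-bits in the chromosome
    return sum(chromosome)

def random_individual(length=8):
    # a random point in the search space of 8-bit strings
    return [random.randint(0, 1) for _ in range(length)]

population = [random_individual() for _ in range(10)]  # pool of 10 individuals
best = max(population, key=fitness)                    # fittest individual
print(len(population), 0 <= fitness(best) <= 8)
```

One generation would consist of applying genetic operators to this pool and replacing it with the resulting offspring.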

Evolutionary Algorithms

Evolutionary Algorithms

Evolutionary Algorithms
Selection:
Survival of the fittest: the highest-quality chromosomes and their characteristics stay within the population
The motivation is to preserve the best (make multiple copies) and eliminate the worst
Crossover:
Creates new solutions by considering more than one individual
Recombines two parent chromosomes (solutions) by exchanging part of one chromosome with a corresponding part of another so as to produce offspring (new solutions)
Searches for new and hopefully better solutions
Mutation:
Changes part of a chromosome (a bit or several bits) to generate new genetic characteristics
Keeps diversity in the population
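The three operators can be sketched on a bit-string population evolving toward the OneMax optimum; tournament selection, one-point crossover, and the parameter values below are all illustrative choices:

```python
import random

random.seed(42)
fitness = sum   # OneMax: maximize the number of 1-bits

def select(population):
    # tournament selection: the fitter of two random individuals survives
    a, b = random.sample(population, 2)
    return a if fitness(a) >= fitness(b) else b

def crossover(p1, p2):
    # one-point crossover: exchange the tails of the two parent chromosomes
    point = random.randint(1, len(p1) - 1)
    return p1[:point] + p2[point:]

def mutate(chromosome, rate=0.05):
    # flip each bit with small probability to keep diversity
    return [1 - g if random.random() < rate else g for g in chromosome]

# random initial population: 20 individuals of 16 bits each
population = [[random.randint(0, 1) for _ in range(16)] for _ in range(20)]
for _ in range(50):   # 50 generations
    population = [mutate(crossover(select(population), select(population)))
                  for _ in range(20)]

print(max(fitness(ind) for ind in population))  # approaches 16
```

Selection copies good material forward, crossover combines it, and mutation keeps injecting the diversity that selection removes.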

Concept of Exploration vs Exploitation
Exploration:
Search for promising solutions
Generate solutions with enough diversity, far from the current solutions
Mutation operators
The search is typically on a global scale
Exploitation:
Preferring the good solutions
Generate new solutions that are better than existing solutions
Crossover and selection operators
This process is typically local
Excessive exploration leads to random search; excessive exploitation leads to premature convergence.
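The trade-off can be demonstrated with a toy search: pure exploitation (greedy local moves) stalls in a local basin, while mixing in exploration escapes it. The two-minimum objective and the jump probability below are invented for the illustration:

```python
import random

def f(x):
    # two basins: a local minimum near x = 4 (f = 1), global minimum at x = 0
    return min(x * x, (x - 4) ** 2 + 1)

def search(explore_prob, start=4.0, steps=2000, seed=3):
    rng = random.Random(seed)
    x = start
    for _ in range(steps):
        if rng.random() < explore_prob:
            cand = rng.uniform(-6, 6)       # exploration: global random jump
        else:
            cand = x + rng.gauss(0, 0.1)    # exploitation: small local move
        if f(cand) < f(x):                  # greedy acceptance
            x = cand
    return x

print(f(search(0.0)) > 0.5)  # True: pure exploitation stays in the local basin
print(f(search(0.3)) < 0.5)  # True: exploration reaches the global basin
```

With explore_prob = 1.0 the search degenerates into random search, matching the "excessive exploration" remark above.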

Evolutionary Algorithms

Evolutionary Algorithms
Classical gradient-based algorithms:
Convergence to an optimal solution usually depends on the starting solution.
Most algorithms tend to get stuck at a locally optimal solution.
An algorithm efficient at solving one class of optimization problems may not be efficient at solving others.
The algorithms cannot be easily parallelized.
Evolutionary algorithms:
Convergence to an optimal solution is designed to be independent of the initial population.
A search-based algorithm; the population helps it avoid getting stuck at a locally optimal solution.
Can be applied to a wide class of problems without major changes to the algorithm.
Can be easily parallelized.

Dependency on the starting solution for gradient-based algorithms: Newton's Method
The iteration procedure starts from an initial guess x0 and continues until a certain criterion is met.
For a nonlinear function f(x), a first-order Taylor expansion about the current iterate x_n gives
f(x) ≈ f(x_n) + f'(x_n)(x - x_n),
and setting f(x) = 0 yields the update x_{n+1} = x_n - f(x_n)/f'(x_n).
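The dependence on the starting solution is easy to demonstrate: applied to a function with two roots, Newton's iteration converges to whichever root the initial guess lies nearer. The function x^2 - 4 and the iteration budget are illustrative choices:

```python
# Newton's iteration x_{n+1} = x_n - f(x_n)/f'(x_n) applied to
# f(x) = x^2 - 4, which has roots at +2 and -2.

def newton(x0, steps=50):
    x = x0
    for _ in range(steps):
        fx = x * x - 4          # f(x)
        dfx = 2 * x             # f'(x)
        x = x - fx / dfx        # Newton update
    return x

print(round(newton(5.0), 6))    # 2.0  (positive start -> positive root)
print(round(newton(-5.0), 6))   # -2.0 (negative start -> negative root)
```

Two different starting guesses, two different answers: exactly the starting-solution dependency that population-based evolutionary algorithms are designed to avoid.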

Newton's Method

Fitness Landscapes