
Meta-Heuristic Algorithms 16B1NCI637


1 Meta-Heuristic Algorithms 16B1NCI637
Raju Pal, Assistant Professor, JIIT, Sector 128, Noida

2 Course Objectives

3 Why?
Used for solving many real-world, complex, or large problems where classical methods do not work
The number of application areas is increasing substantially
These methods are easy to apply and simple to learn

4 Optimization Introduction
Optimization is the process of making something (such as the design of a system, or a decision) as fully perfect, functional, or effective as possible
It is used almost everywhere: science, economics, finance ... every area of engineering and industry
The objective is to save time, money, energy, and resources, and to maximize efficiency, performance, and quality
Meta-heuristics form a subfield of stochastic optimization
Stochastic optimization is an optimization approach in which input parameters are subject to randomness (stochastic = probabilistic)

5 Examples of Optimization
Routing and scheduling: transport routes, bus/train/airline routes, project scheduling ...
Planning: use of resources, time, and money for optimal output, services, and performance
Design optimization: structural optimization, shape optimization, parameter optimization ... better products
Optimal control: optimal control of dynamic systems (e.g., spacecraft, moon landers, cars, airplanes ...)
Economics: portfolios, financial derivatives, ..., banking
Optimization is like treasure hunting: how do you find a hidden treasure of 1 million dollars? What is your best strategy?

6 Mathematical Representation of Optimization Problem
A general optimization problem can be written as
\[
\begin{aligned}
\text{minimize}\quad & f_i(\mathbf{x}), \quad i = 1, 2, \dots, M,\\
\text{subject to}\quad & h_j(\mathbf{x}) = 0, \quad j = 1, 2, \dots, J,\\
& g_k(\mathbf{x}) \le 0, \quad k = 1, 2, \dots, K,
\end{aligned}
\]
where \(\mathbf{x} = (x_1, x_2, \dots, x_n)\).
The components \(x_i\) of \(\mathbf{x}\) are called design or decision variables; they can be real continuous, discrete, or a mix of the two
The functions \(f_i(\mathbf{x})\) are called the objective functions or simply cost functions
The space spanned by the decision variables is called the design space or search space
The space formed by the objective function values is called the solution space or response space
The equalities \(h_j\) and inequalities \(g_k\) are called constraints
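
As a concrete illustration of this general form, here is a minimal sketch using SciPy's minimize; the objective, constraints, and starting point below are made-up examples, not from the course:

```python
# A sketch of the general form above in code, assuming SciPy is available.
# The objective, constraints, and starting point are illustrative examples.
import numpy as np
from scipy.optimize import minimize

def f(x):                          # objective (cost) function
    return x[0]**2 + x[1]**2

def h(x):                          # equality constraint: h(x) = 0
    return x[0] + x[1] - 1.0

def g(x):                          # inequality constraint: g(x) <= 0
    return x[0] - 0.8

# SciPy's 'ineq' convention is fun(x) >= 0, so g(x) <= 0 is passed as -g(x).
constraints = [{'type': 'eq', 'fun': h},
               {'type': 'ineq', 'fun': lambda x: -g(x)}]

result = minimize(f, x0=np.array([2.0, -1.0]), constraints=constraints)
print(result.x)                    # approximately [0.5, 0.5]
```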

7 Optimization Methods
Classical (deterministic) optimization methods
Stochastic (probabilistic) optimization methods
Classical optimization methods include, for example:
gradient descent (following the steepest slope; sketched below)
Newton’s method
penalty methods, Lagrange multiplier methods, etc.
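
As a quick illustration of the first classical method listed above, a minimal gradient descent sketch; the function, learning rate, and step count are illustrative choices:

```python
# Minimal gradient descent on f(x) = (x - 3)^2, whose gradient is 2(x - 3).
# The function, learning rate, and step count are illustrative assumptions.
def gradient_descent(grad, x0, lr=0.1, steps=100):
    x = x0
    for _ in range(steps):
        x -= lr * grad(x)          # step against the steepest slope
    return x

print(gradient_descent(lambda x: 2 * (x - 3), x0=0.0))  # approaches 3.0
```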

8 Types of problems (or functions)
Classical methods are particularly useful in convex problems, e.g., the quadratic function and the exponential function, where any local minimum is also a global minimum. Linear functions are convex, so linear programming problems are convex problems.
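
For reference, the standard textbook definition behind this claim: a function \(f\) is convex if, for all \(x, y\) in its domain and all \(\theta \in [0, 1]\),
\[
f(\theta x + (1 - \theta) y) \le \theta f(x) + (1 - \theta) f(y).
\]
Linear functions satisfy this with equality, which is why linear programming problems are convex.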

9 Deterministic Methods: Merits and Demerits
Merits
Give exact solutions
Do not use any stochastic technique
Rely on a thorough search of the feasible domain
Demerits
Not robust: can only be applied to a restricted class of problems
Often too time-consuming, or sometimes unable to solve real-world problems
Classical methods are less useful in cases with:
non-differentiable objective functions
objective functions whose values can only be obtained as the result of a (lengthy) simulation
a varying number of variables (as in optimization of neural networks)

10 Stochastic optimization
Merits
Applicable to a wider set of problems, i.e., the function need not be convex, continuous, or explicitly defined
Use a stochastic or probabilistic, i.e., random, approach (a minimal sketch follows)
Demerits
Converge to the global optimum only probabilistically
Sometimes get stuck at local optima
Many (but not all) stochastic optimization methods are inspired by nature
Nature is all about adaptation, which can be seen as a kind of optimization
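
A minimal sketch of the random approach mentioned above, pure random search; the test function and bounds are illustrative assumptions, and this is the simplest possible stochastic optimizer rather than a specific course algorithm:

```python
import math
import random

def random_search(f, lower, upper, evaluations=10_000):
    """Pure random search: keep the best of many random candidates."""
    best_x = random.uniform(lower, upper)
    best_f = f(best_x)
    for _ in range(evaluations - 1):
        x = random.uniform(lower, upper)   # random candidate solution
        fx = f(x)
        if fx < best_f:                    # keep the best seen so far
            best_x, best_f = x, fx
    return best_x, best_f

# A non-convex test function with many local minima.
print(random_search(lambda x: x**2 + 10 * math.sin(x), -10.0, 10.0))
```

It needs no gradient, convexity, or explicit formula, only the ability to evaluate f, and it finds the global optimum only probabilistically, as the slide notes.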

11 Nature Inspired Algorithms
Nature provides some of the most efficient ways to solve problems
Nature-inspired algorithms imitate these natural processes
Example problems: aircraft wing design, bullet train (nose) design, robotic spy planes

12 The No Free Lunch Theorem
One of the more interesting developments in optimization theory was the publication of the No Free Lunch (NFL) theorem (Wolpert and Macready, 1995; Wolpert and Macready, 1997). This theorem states that the performance of all optimization (search) algorithms, averaged over the set of all possible functions, is equivalent.
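
Formally, in the notation of Wolpert and Macready (1997): for any two algorithms \(a_1\) and \(a_2\),
\[
\sum_{f} P(d_m^y \mid f, m, a_1) = \sum_{f} P(d_m^y \mid f, m, a_2),
\]
where \(d_m^y\) is the sequence of objective values observed after \(m\) evaluations and the sum runs over all possible objective functions \(f\). In other words, superior performance on one class of problems is necessarily paid for by inferior performance on another.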

13 Nature Inspired Algorithms for Optimization

14 Evolution

15 Evolutionary Algorithms
Natural selection: a guided search procedure
Individuals suited to the environment survive, reproduce, and pass their genetic traits to offspring (survival of the fittest)
Offspring are created by reproduction, mutation, etc.
Populations adapt to their environment; variations accumulate over time to generate new species
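
A minimal skeleton of this evolutionary loop; the function and parameter names are illustrative assumptions, not a specific named EA, and concrete operators are sketched after the operators slide below:

```python
# Generic evolutionary loop: select parents, recombine, mutate, replace.
# All names and parameters here are illustrative assumptions.
def evolve(init_population, fitness, select, crossover, mutate, generations=100):
    population = init_population()
    for _ in range(generations):              # one iteration = one generation
        offspring = []
        while len(offspring) < len(population):
            p1 = select(population, fitness)  # survival of the fittest
            p2 = select(population, fitness)
            offspring.append(mutate(crossover(p1, p2)))  # reproduce, then vary
        population = offspring                # the next generation
    return max(population, key=fitness)       # best individual found
```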

16 Evolutionary Algorithms
Terminologies
Individual: carrier of the genetic information (chromosome), characterized by its state in the search space and its fitness (objective function value)
Population: a pool of individuals, which allows the application of genetic operators
Fitness function: often used as a synonym for the objective function
Generation: the (natural) time unit of the EA, one iteration step of an evolutionary algorithm

17 Evolutionary Algorithms

18 Evolutionary Algorithms

19 Evolutionary Algorithms
Selection
Survival of the fittest: the highest-quality chromosomes and/or characteristics stay within the population
The motivation is to preserve the best (make multiple copies) and eliminate the worst
Crossover
Creates new solutions by combining more than one individual
The recombination of two parent chromosomes (solutions) by exchanging part of one chromosome with a corresponding part of another so as to produce offspring (new solutions)
Searches for new and hopefully better solutions
Mutation
The change of part of a chromosome (a bit or several bits) to generate new genetic characteristics
Keeps diversity in the population
(All three operators are sketched below.)
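
A minimal sketch of these three operators on bit-string chromosomes; the particular variants chosen here (tournament selection, one-point crossover, bit-flip mutation) and the rates are illustrative assumptions, not prescribed by the slide:

```python
import random

def tournament_select(population, fitness, k=3):
    """Selection: the fittest of k randomly sampled individuals is preserved."""
    return max(random.sample(population, k), key=fitness)

def one_point_crossover(p1, p2):
    """Crossover: exchange the tail of one parent with the other's."""
    point = random.randint(1, len(p1) - 1)    # cut point inside the chromosome
    return p1[:point] + p2[point:]

def bit_flip_mutation(chromosome, rate=0.01):
    """Mutation: flip each bit with a small probability to keep diversity."""
    return [1 - bit if random.random() < rate else bit for bit in chromosome]
```

With a toy fitness such as fitness = sum (the OneMax problem), plugging these into the loop skeleton sketched earlier drives a population of 0/1 lists toward the all-ones string.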

20 Concept of Exploration vs Exploitation
Exploration
Search for promising solutions
Generate solutions with enough diversity, far from the current solutions
Mutation operators
The search is typically on a global scale
Exploitation
Prefer the good solutions
Generate new solutions that are better than existing solutions
Crossover and selection operators
This process is typically local
Excessive exploration: random search. Excessive exploitation: premature convergence.

21 Evolutionary Algorithms

22 Evolutionary Algorithms
Classical gradient-based algorithms:
Convergence to an optimal solution usually depends on the starting solution
Most algorithms tend to get stuck at a locally optimal solution
An algorithm efficient in solving one class of optimization problems may not be efficient in solving others
Algorithms cannot be easily parallelized
Evolutionary algorithms:
Convergence to an optimal solution is designed to be independent of the initial population
Search-based: the population helps the algorithm avoid getting stuck at a locally optimal solution
Can be applied to a wide class of problems without major changes in the algorithm
Can be easily parallelized

23 Dependency on the starting solution for gradient-based algorithms
Newton’s Method
The iteration starts from an initial guess \(x_0\) and continues until a stopping criterion is met.
Let \(f(x) = 0\) be a nonlinear equation. Around the current iterate \(x_n\), the function can be written as the first-order expansion
\[
f(x) \approx f(x_n) + f'(x_n)\,(x - x_n),
\]
and setting this approximation to zero gives the iteration
\[
x_{n+1} = x_n - \frac{f(x_n)}{f'(x_n)}.
\]
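
A minimal sketch of the iteration, chosen to show the dependence on the starting solution; the test function, with three roots, is an illustrative assumption:

```python
# Newton's method for f(x) = 0. Different starting guesses x0 converge
# to different roots, illustrating the dependence on the starting solution.
def newton(f, df, x0, tol=1e-10, max_iter=100):
    x = x0
    for _ in range(max_iter):
        step = f(x) / df(x)        # Newton update: x <- x - f(x)/f'(x)
        x -= step
        if abs(step) < tol:        # stopping criterion
            return x
    return x

f  = lambda x: x**3 - 2 * x        # roots at -sqrt(2), 0, +sqrt(2)
df = lambda x: 3 * x**2 - 2        # derivative

print(newton(f, df, x0=1.0))       # converges to  1.414...
print(newton(f, df, x0=-1.0))      # converges to -1.414...
print(newton(f, df, x0=0.1))       # converges to  0.0
```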

24 Newton’s Method

25 Fitness Landscapes

