Meta-heuristics Introduction - Fabien Tricoire


Meta-heuristics
- Introduction - Fabien Tricoire
- Simulated Annealing - Fabien Tricoire
- Tabu Search - Fabien Tricoire
- Genetic Algorithms - Fabien Tricoire
- Memetic Algorithms - Fabien Tricoire
- Ant Colony Optimization - Fabien Tricoire
- Particle Swarm Optimization - Varadarajan Komandur
- Scatter Search - Manuel Laguna

Particle Swarm Optimization (PSO) PSO is a robust stochastic optimization technique based on the movement and intelligence of swarms. PSO applies the concept of social interaction to problem solving. It was developed in 1995 by James Kennedy (social psychologist) and Russell Eberhart (electrical engineer). It uses a number of agents (particles) that constitute a swarm moving around in the search space looking for the best solution. Each particle is treated as a point in an N-dimensional space that adjusts its "flying" according to its own flying experience as well as the flying experience of the other particles.

Particle Swarm Optimization (PSO) Each particle keeps track of its coordinates in the solution space that are associated with the best solution (fitness) it has achieved so far. This value is called the personal best, pbest. Another best value tracked by PSO is the best value obtained so far by any particle in the neighborhood of that particle. This value is called gbest. The basic concept of PSO lies in accelerating each particle toward its pbest and gbest locations, with a randomly weighted acceleration at each time step, as shown in Fig. 1.

Particle Swarm Optimization (PSO) Fig. 1: Concept of modification of a searching point by PSO. Legend: $s^k$: current searching point; $s^{k+1}$: modified searching point; $v^k$: current velocity; $v^{k+1}$: modified velocity; $v_{pbest}$: velocity based on pbest; $v_{gbest}$: velocity based on gbest.

Particle Swarm Optimization (PSO) Each particle tries to modify its position using the following information: its current position, its current velocity, the distance between the current position and pbest, and the distance between the current position and gbest. The particle's velocity is updated according to the following equation:

$$v_i^{k+1} = w\,v_i^k + c_1\,\mathrm{rand}_1\,(pbest_i - s_i^k) + c_2\,\mathrm{rand}_2\,(gbest - s_i^k) \qquad (1)$$

where $v_i^k$ is the velocity of agent i at iteration k, $w$ is the inertia weighting function, $c_1$ and $c_2$ are weighting factors, $\mathrm{rand}_1$ and $\mathrm{rand}_2$ are uniformly distributed random numbers between 0 and 1, $s_i^k$ is the current position of agent i at iteration k, $pbest_i$ is the pbest of agent i, and $gbest$ is the gbest of the group.
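As an illustration, here is a minimal sketch of this velocity update in Python for a single particle; the function name, the NumPy representation, and the default values of c1 and c2 are assumptions for the example, not part of the original slides:

```python
import numpy as np

def update_velocity(v, s, pbest, gbest, w, c1=2.0, c2=2.0, rng=None):
    """Velocity update of eq. (1) for a single particle.

    v, s, pbest, gbest are NumPy arrays of the same dimension N.
    """
    rng = rng or np.random.default_rng()
    r1 = rng.random(s.shape)   # rand1, uniform in [0, 1)
    r2 = rng.random(s.shape)   # rand2, uniform in [0, 1)
    return w * v + c1 * r1 * (pbest - s) + c2 * r2 * (gbest - s)
```

With c1 = c2, the cognitive (pbest) and social (gbest) attraction terms are weighted equally, which is a common default choice.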

Particle Swarm Optimization (PSO) The following weighting function is usually used in (1):

$$w = w_{max} - \frac{(w_{max} - w_{min}) \cdot iter}{maxIter} \qquad (2)$$

where $w_{max}$ is the initial weight, $w_{min}$ is the final weight, $maxIter$ is the maximum iteration number, and $iter$ is the current iteration number. The position is then updated with

$$s_i^{k+1} = s_i^k + v_i^{k+1} \qquad (3)$$
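A minimal sketch of the linearly decreasing inertia weight of eq. (2) and the position update of eq. (3); the default values 0.9 and 0.4 are illustrative assumptions, not taken from the slides:

```python
def inertia_weight(iteration, max_iter, w_max=0.9, w_min=0.4):
    """Eq. (2): linearly decrease w from w_max to w_min over the run."""
    return w_max - (w_max - w_min) * iteration / max_iter

def update_position(s, v_next):
    """Eq. (3): move the particle by its updated velocity."""
    return s + v_next
```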

Particle Swarm Optimization (PSO) Comments on the inertia weight factor: A large inertia weight (w) facilitates a global search, while a small inertia weight facilitates a local search. Linearly decreasing the inertia weight from a relatively large value to a small value over the course of the PSO run gives better performance than fixed inertia weight settings.
- Larger w: greater global search ability
- Smaller w: greater local search ability

Particle Swarm Optimization (PSO) Flow chart of the general PSO algorithm:
1. Start.
2. Initialize the particles with random position and velocity vectors.
3. For each particle's position p, evaluate the fitness.
4. If fitness(p) is better than fitness(pbest), set pbest = p; repeat for all particles.
5. Set the best of the pbests as gbest.
6. Update each particle's velocity (eq. 1) and position (eq. 3).
7. Loop back to step 3 until the maximum number of iterations is reached.
8. Stop, returning gbest as the best solution found.
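As an illustration, a compact Python sketch of this loop for minimizing an objective function; the sphere test function, the swarm size, the bounds, and the parameter values are assumptions for the example, not part of the original slides:

```python
import numpy as np

def pso(objective, dim, n_particles=30, max_iter=100,
        w_max=0.9, w_min=0.4, c1=2.0, c2=2.0, bounds=(-5.0, 5.0), seed=0):
    """Minimal global-best PSO following the flow chart above (minimization)."""
    rng = np.random.default_rng(seed)
    lo, hi = bounds
    s = rng.uniform(lo, hi, (n_particles, dim))               # positions
    v = rng.uniform(-(hi - lo), hi - lo, (n_particles, dim))  # velocities
    pbest = s.copy()
    pbest_val = np.array([objective(x) for x in s])
    g_idx = pbest_val.argmin()
    gbest, gbest_val = pbest[g_idx].copy(), pbest_val[g_idx]

    for it in range(max_iter):
        w = w_max - (w_max - w_min) * it / max_iter                 # eq. (2)
        r1 = rng.random((n_particles, dim))
        r2 = rng.random((n_particles, dim))
        v = w * v + c1 * r1 * (pbest - s) + c2 * r2 * (gbest - s)   # eq. (1)
        s = s + v                                                   # eq. (3)
        vals = np.array([objective(x) for x in s])
        improved = vals < pbest_val                 # update personal bests
        pbest[improved] = s[improved]
        pbest_val[improved] = vals[improved]
        g_idx = pbest_val.argmin()                  # update global best
        if pbest_val[g_idx] < gbest_val:
            gbest, gbest_val = pbest[g_idx].copy(), pbest_val[g_idx]
    return gbest, gbest_val

# Usage: minimize the 5-dimensional sphere function.
best_x, best_f = pso(lambda x: float(np.sum(x ** 2)), dim=5)
```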

Comparison with other evolutionary computation techniques: Unlike genetic algorithms, evolutionary programming, and evolution strategies, PSO has no selection operation. All particles are kept as members of the population through the course of the run; PSO is the only one of these algorithms that does not implement survival of the fittest. There is no crossover operation in PSO. The randomly weighted attraction terms of eq. (1) resemble mutation in EP. In EP, the balance between global and local search can be adjusted through the strategy parameter, while in PSO the balance is achieved through the inertia weight factor w of eq. (1).

Variants of PSO
- Discrete PSO: can handle discrete binary variables.
- MINLP PSO: can handle both discrete binary and continuous variables.
- Hybrid PSO: utilizes the basic mechanism of PSO together with the natural selection mechanism usually employed by EC methods such as GAs.
- Cyber Swarm: applies Scatter Search and Path Relinking (Glover).
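As a hedged illustration of the first variant, here is a sketch of the position update commonly used in binary (discrete) PSO, where the velocity is passed through a sigmoid to give the probability of a bit being 1; the function name and defaults are assumptions for this example:

```python
import numpy as np

def binary_position_update(v, rng=None):
    """Discrete (binary) PSO position update: each bit is set to 1 with
    probability sigmoid(velocity), following the classic binary PSO scheme."""
    rng = rng or np.random.default_rng()
    prob_one = 1.0 / (1.0 + np.exp(-v))           # sigmoid of the velocity
    return (rng.random(v.shape) < prob_one).astype(int)
```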

Scatter Search and Path Relinking: Methodology and Applications - Manuel Laguna

Metaheuristic A metaheuristic refers to a master strategy that guides and modifies other heuristics to produce solutions beyond those that are normally generated in a quest for local optimality. A metaheuristic is a procedure that has the ability to escape local optimality.

Typical Search Trajectory

Metaheuristic Classification
x/y/z classification:
- x = A (adaptive memory) or M (memoryless)
- y = N (systematic neighborhood search) or S (random sampling)
- z = 1 (one current solution) or P (population of solutions)
Some classifications:
- Tabu Search (A/N/1)
- Genetic Algorithms (M/S/P)
- Scatter Search (M/N/P)

Scatter Search Overview
1. Diversification Generation Method: generate diverse trial solutions, apply the Improvement Method to each, and add the result to P; repeat until |P| = PSize.
2. Reference Set Update Method: build the reference set RefSet from P.
3. Subset Generation Method: generate subsets of RefSet solutions.
4. Solution Combination Method: combine the solutions in each subset.
5. Improvement Method: improve each combined solution.
6. Reference Set Update Method: update RefSet with the improved solutions.
7. Repeat steps 3-6 until no more new solutions enter RefSet; then restart with the Diversification Generation Method.
8. Stop if MaxIter is reached.
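A schematic Python sketch of this loop, assuming user-supplied diversification, improvement, combination, reference set update, and evaluation routines; all function names and the pairwise subset generation are placeholders for illustration, not from Laguna's slides:

```python
import itertools

def scatter_search(diversify, improve, combine, update_refset, evaluate,
                   p_size=100, ref_size=10, max_iter=50):
    """Schematic scatter search loop; the five methods are supplied by the user."""
    # Diversification Generation + Improvement: build the initial population P.
    P = [improve(diversify()) for _ in range(p_size)]
    refset = update_refset(P, [], ref_size, evaluate)          # initial RefSet

    for _ in range(max_iter):
        new_solutions = []
        # Subset Generation Method: here, all pairs of reference solutions.
        for a, b in itertools.combinations(refset, 2):
            for candidate in combine(a, b):                    # Solution Combination Method
                new_solutions.append(improve(candidate))       # Improvement Method
        updated = update_refset(new_solutions, refset, ref_size, evaluate)
        if updated == refset:
            # No new solutions entered RefSet: restart with diversification.
            P = [improve(diversify()) for _ in range(p_size)]
            updated = update_refset(P, refset, ref_size, evaluate)
        refset = updated
    return min(refset, key=evaluate)
```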

GA vs. SS
- Population: GA: large (~100); SS: small (~10)
- Reproduction: GA: probabilistic selection of parents; SS: deterministic selection of reference solutions
- Combination: GA: crossover and mutation; SS: structured combinations
- Evolution: GA: survival of the fittest; SS: strategic updating to preserve quality and diversity
- Local search: GA: recently added as a mutation mechanism; SS: integral part of the procedure