Particle Swarm Optimization (PSO) Algorithm and Its Application in Engineering Design Optimization  By Sushanta Kumar Mandal, Research Scholar, School of Information Technology, Indian Institute of Technology Kharagpur  September 9, 2005

Outline  Introduction to Optimization  Optimization Procedure  Different Optimization Algorithms  Different Global Optimization Algorithms  Particle Swarm Optimization (PSO) Algorithm  Application of PSO in Design Optimization Problems

Optimization As ageless as time

Calculus  The maximum and minimum of a smooth function are attained at stationary points, where its gradient vanishes.

Optimization is Everywhere  The more we know about something, the more we see where optimization can be applied.  Some personal decision-making examples: - Finding the fastest route home or to class - Optimal allocation of time for homework - Optimal budgeting

Goal of Optimization Find values of the variables that minimize or maximize the objective function while satisfying the constraints.

Component of Optimization Problem  Objective Function: An objective function which we want to minimize or maximize.  For example, in a manufacturing process, we might want to maximize the profit or minimize the cost.  In fitting experimental data to a user-defined model, we might minimize the total deviation of observed data from predictions based on the model.  In designing an inductor, we might want to maximize the Quality Factor and minimize the area.

Component of Optimization Problem  Design Variables: A set of unknowns or variables which affect the value of the objective function.  In the manufacturing problem, the variables might include the amounts of different resources used or the time spent on each activity.  In fitting-the-data problem, the unknowns are the parameters that define the model.  In the inductor design problem, the variables used define the layout geometry of the panel.

Component of Optimization Problem  Constraints : A set of constraints that allow the unknowns to take on certain values but exclude others.  For the manufacturing problem, it does not make sense to spend a negative amount of time on any activity, so we constrain all the "time" variables to be non-negative.  In the inductor design problem, we would probably want to limit the upper and lower value of layout parameters and to target an inductance value within the tolerance level.

Are all these ingredients necessary?  Almost all optimization problems have an objective function.  No objective function: in some cases (for example, the design of integrated circuit layouts), the goal is to find a set of variables that satisfies the constraints of the model. The user does not particularly want to optimize anything, so there is no reason to define an objective function. This type of problem is usually called a feasibility problem.

Are All these ingredients necessary?  Variables are essential. If there are no variables, we cannot define the objective function and the problem constraints.  Constraints are not essential. In fact, the field of unconstrained optimization is a large and important one for which a lot of algorithms and software are available. It's been argued that almost all problems really do have constraints.

What We Need for Optimization  Models: Modeling is the process of identifying the objective function, variables, and constraints. The goal of a model is "insight", not numbers. A good mathematical model of the optimization problem is needed.  Algorithms: Typically, an interesting model is too complicated to solve with paper and pencil, so an effective and reliable numerical algorithm is needed. There is no universal optimization algorithm. An algorithm should have robustness (good performance for a wide class of problems), efficiency (not too much computer time), and accuracy (it can identify the error).

Flowchart of Optimal Design Procedure: Need for optimization → Choose design variables → Formulate constraints → Formulate objective function → Set up variable bounds → Select an optimization algorithm → Obtain solution(s)

Mathematical Formulation of Optimization Problems

Constraints  Inequality constraints: x 1 2 – x 2 2  0  Equality constraints: x 1 = 2

Variable Bounds  Maximum and minimum bounds on each design variable.  Without variable bounds the constraints completely surround the feasible region.  Variable bounds are used to confine the search algorithm within these bounds.  Ex:

Classification of Optimization Methods  Single-variable vs. multi-variable  Constrained vs. unconstrained  Single-objective vs. multi-objective  Linear vs. non-linear

Classifications of Optimization Methods

Local and Global Optimizers  A local minimizer, x B *, of the region B, is defined so that: f(x B * )  f(x),  x  B.  Ex: Gradient based search methods, Newton-Rapson algorithms, Steepest Decent, Conjugate-Gradient algorithms, Levenberg-Marquardt algorithm etc.  Shortcomings: 1) One requires an initial guess to start with. 2) Convergence to an optimal solution depends on the chosen initial guess. 3) Most algorithms tend to get stuck to a sub-optimal solution. 4) An algorithm efficient in solving one optimization problem may not be efficient in solving another one. 5) These are useful over a relatively narrow range.

Local and Global Optimizers  The global optimizer, x *, is defined so that f(x * )  f(x),  x  S where S is the search space.  Ex. Simulated Annealing algorithm, Genetic Algorithm, Ant Colony, Geometric Programming, Particle Swarm Optimization etc.

Local and Global Optimizers

Particle Swarm Optimization  Evolutionary computational technique based on the movement and intelligence of swarms looking for the most fertile feeding location  It was developed in 1995 by James Kennedy and Russell Eberhart  Simple algorithm, easy to implement and few parameters to adjust mainly the velocity  A “swarm” is an apparently disorganized collection (population) of moving individuals that tend to cluster together while each individual seems to be moving in a random direction

Continued…  It uses a number of agents (particles) that constitute a swarm moving around in the search space looking for the best solution.  Each particle is treated as a point in a D- dimensional space which adjusts its “flying” according to its own flying experience as well as the flying experience of other particles  Each particle keeps track of its coordinates in the problem space which are associated with the best solution (fitness) that has achieved so far. This value is called pbest.

Continued…  Another best value that is tracked by the PSO is the best value obtained so far by any particle in the neighbors of the particle. This value is called gbest.  The PSO concept consists of changing the velocity(or accelerating) of each particle toward its pbest and the gbest position at each time step.

Continued…  Each particle tries to modify its current position and velocity according to the distance between its current position and pbest, and the distance between its current position and gbest:

    v[n+1] = v[n] + c1*rand1()*(pbest – currentPosition[n]) + c2*rand2()*(gbest – currentPosition[n])
    currentPosition[n+1] = currentPosition[n] + v[n+1]

currentPosition[n]: position of the particle at the nth iteration
v[n]: velocity of the particle at the nth iteration
c1: acceleration factor related to pbest
c2: acceleration factor related to gbest
rand1(), rand2(): random numbers between 0 and 1
pbest: best position found so far by the particle
gbest: best position found so far by the swarm
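A minimal Python sketch of this per-particle update (the function name and the default values c1 = c2 = 2.0 are assumptions for illustration; they follow the constants used in the gbest/lbest pseudocode later in the slides):

    import random

    def update_particle(position, velocity, pbest, gbest, c1=2.0, c2=2.0):
        # One basic PSO update for a single particle, dimension by dimension.
        new_velocity = []
        new_position = []
        for d in range(len(position)):
            r1, r2 = random.random(), random.random()
            v = (velocity[d]
                 + c1 * r1 * (pbest[d] - position[d])
                 + c2 * r2 * (gbest[d] - position[d]))
            new_velocity.append(v)
            new_position.append(position[d] + v)
        return new_position, new_velocity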

PSO Algorithm

    For each particle
        Initialize particle with feasible random values
    End

    Do
        For each particle
            Calculate the fitness value
            If the fitness value is better than the best fitness value (pbest) in history
                Set current value as the new pbest
        End

        Choose the particle with the best fitness value of all the particles as the gbest

        For each particle
            Calculate particle velocity according to the velocity update equation
            Update particle position according to the position update equation
        End
    While maximum iterations or minimum error criteria is not attained
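A small, self-contained Python sketch of this loop, minimizing the sphere function f(x) = Σ xi² as a stand-in objective (the objective, bounds, and parameter values are illustrative assumptions, not from the slides):

    import random

    def sphere(x):
        # Example objective: sum of squares (illustrative stand-in).
        return sum(xi * xi for xi in x)

    def pso(objective, dim=2, n_particles=30, n_iters=100, bounds=(-10.0, 10.0), c1=2.0, c2=2.0):
        lo, hi = bounds
        # Initialize particles with feasible random positions and zero velocities.
        positions = [[random.uniform(lo, hi) for _ in range(dim)] for _ in range(n_particles)]
        velocities = [[0.0] * dim for _ in range(n_particles)]
        pbest = [p[:] for p in positions]
        pbest_val = [objective(p) for p in positions]
        g_index = min(range(n_particles), key=lambda i: pbest_val[i])
        gbest, gbest_val = pbest[g_index][:], pbest_val[g_index]

        for _ in range(n_iters):
            for i in range(n_particles):
                # Velocity and position updates (basic form of the update equations).
                for d in range(dim):
                    r1, r2 = random.random(), random.random()
                    velocities[i][d] += (c1 * r1 * (pbest[i][d] - positions[i][d])
                                         + c2 * r2 * (gbest[d] - positions[i][d]))
                    positions[i][d] += velocities[i][d]
                # Update personal best (pbest) and global best (gbest).
                val = objective(positions[i])
                if val < pbest_val[i]:
                    pbest_val[i], pbest[i] = val, positions[i][:]
                    if val < gbest_val:
                        gbest_val, gbest = val, positions[i][:]
        return gbest, gbest_val

    best_x, best_f = pso(sphere)
    print(best_x, best_f)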

gbest and lbest  global version: vx[ ][ ] = vx[ ][ ] + 2*rand( )*(pbest[ ][ ] – presentx[ ][ ]) +2*rand( )*(pbestx[ ][gbest] – presentx[ ][ ])  local version: vx[ ][ ] = vx[ ][ ] + 2*rand( )*(pbest[ ][ ] – presebtx[ ][ ]) + 2*rand( )*(pbestx[ ][lbest] – presentx[ ][ ])

Particle Swarm Optimization: Swarm Topology  Two basic topologies have been used in the PSO literature: Ring topology (neighborhood of 3) and Star topology (global neighborhood). [Figures: ring and star topologies over particles I0–I4]
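A small illustrative helper (hypothetical, not from the slides) showing how the local best for a ring topology with a neighborhood of 3 could be selected:

    def ring_lbest(index, pbest, pbest_val, n_particles):
        # Neighborhood of 3: the particle itself plus its two ring neighbors.
        neighbors = [(index - 1) % n_particles, index, (index + 1) % n_particles]
        best = min(neighbors, key=lambda j: pbest_val[j])
        return pbest[best]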

PSO Parameters: Velocity  An important parameter in PSO; typically the only one adjusted.  Clamps particles' velocities on each dimension.  Determines the "fineness" with which regions are searched.  If set too high, particles can fly past optimal solutions.  If set too low, particles can get stuck in local minima.
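A minimal sketch of this clamping step (the parameter name v_max is an illustrative assumption):

    def clamp_velocity(velocity, v_max):
        # Clamp each dimension of the velocity to the range [-v_max, +v_max].
        return [max(-v_max, min(v_max, v)) for v in velocity]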

Flow Chart for Extraction Procedure using PSO

Comparison of Genetic Algorithm and PSO  Tested in a MATLAB program on a P4 1.7 GHz CPU with 256 MB RAM.  No. of particles / population size = 100  No. of simulation runs:  Crossover probability = 0.9  Mutation probability = 0.01

Model Fitting

Inductor Optimization

Constrained PSO Optimization

Thank You