Particle Swarm Optimization † Spencer Vogel † This presentation contains cheesy graphics and animations and they will be awesome.

Particle Swarm Basics

✘ Each particle is trying to find the global optimum ✘ Each particle is moving ✘ Each particle remembers where its local optimum (its best position so far) was Basic Idea

✘ Each particle in the swarm cooperates with all of the other particles  Each particle has a neighborhood associated with it Neighborhoods

Social vs. Geographical neighborhoods (illustration)

✘ Each particle in the swarm cooperates with all of the other particles  Each particle has a neighborhood associated with it  Each particle knows the fitness of all other particles in its neighborhood ҂ The best position from its neighborhood is used to adjust the particle’s velocity Neighborhoods

✘ Each particle moves to a new position at each time step  It does this by adjusting its velocity  Its new velocity is a randomly weighted combination of: ҂ Its current velocity ҂ A random portion in the direction of its personal best ҂ A random portion in the direction of the neighborhood best Particle Action
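
The velocity rule described above is the standard PSO update. Below is a minimal sketch of one particle's update step in C++ (the language the project is written in); the inertia weight w, the acceleration coefficients c1/c2, and all names are illustrative assumptions, not values taken from this presentation.

```cpp
// Minimal sketch of the standard PSO velocity/position update for one particle.
// w, c1, c2 and the random-number setup are illustrative assumptions.
#include <random>
#include <vector>

struct Particle {
    std::vector<double> position;
    std::vector<double> velocity;
    std::vector<double> bestPosition;   // the particle's own best so far
};

void updateParticle(Particle& p,
                    const std::vector<double>& neighborhoodBest,
                    std::mt19937& rng)
{
    const double w  = 0.7;   // weight on the current velocity
    const double c1 = 1.5;   // pull toward the particle's personal best
    const double c2 = 1.5;   // pull toward the neighborhood best
    std::uniform_real_distribution<double> unit(0.0, 1.0);

    for (std::size_t d = 0; d < p.position.size(); ++d) {
        const double r1 = unit(rng);   // random portion toward the personal best
        const double r2 = unit(rng);   // random portion toward the neighborhood best
        p.velocity[d] = w * p.velocity[d]
                      + c1 * r1 * (p.bestPosition[d]   - p.position[d])
                      + c2 * r2 * (neighborhoodBest[d] - p.position[d]);
        p.position[d] += p.velocity[d];   // move to the new position
    }
}
```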

Swarm Dynamics (vector diagram): the current motion influence, particle memory influence, and swarm influence combine into the resulting vector, giving the projected motion.

Common Test Functions: Griewank, Rastrigin, Rosenbrock, Sinenvsin
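
As a concrete example of these benchmarks, here is a sketch of the Rastrigin function, a highly multimodal test function whose global minimum is 0 at the origin; the other test functions would plug into the same interface. The signature is an assumption for illustration.

```cpp
// Sketch of the Rastrigin benchmark: highly multimodal, global minimum
// f(0, ..., 0) = 0.  Used here only as an example of a PSO test function.
#include <cmath>
#include <vector>

double rastrigin(const std::vector<double>& x)
{
    const double A  = 10.0;
    const double PI = std::acos(-1.0);
    double sum = A * static_cast<double>(x.size());
    for (double xi : x)
        sum += xi * xi - A * std::cos(2.0 * PI * xi);
    return sum;
}
```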

Parallelization

✘ Distributes all data to be processed amongst all processors ✘ Allows individual processors to complete all tasks for one set of data ✘ Applied widely to genetic algorithms Data Parallelism

✘ All particles are sent to the evaluation environment and the algorithm waits for all analyses to complete before continuing ✘ Usually results in poor efficiency, since no particle can be updated until the slowest evaluation finishes Synchronous
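
A minimal sketch of this synchronous pattern using std::async: every particle's fitness evaluation is launched in parallel, and the update loop blocks until the slowest one has returned. The function names are illustrative placeholders, not the presentation's actual code.

```cpp
// Sketch of synchronous parallel evaluation: all fitness evaluations are
// launched, then the algorithm blocks until every one has finished before
// any particle is updated.  Function names are illustrative placeholders.
#include <functional>
#include <future>
#include <vector>

double evaluateFitness(const std::vector<double>& position);   // cost function
void   updateParticle(std::size_t index, double fitness);      // velocity/position update

void synchronousIteration(const std::vector<std::vector<double>>& positions)
{
    std::vector<std::future<double>> pending;
    pending.reserve(positions.size());

    // Send every particle out for evaluation.
    for (const auto& pos : positions)
        pending.push_back(std::async(std::launch::async, evaluateFitness, std::cref(pos)));

    // Implicit barrier: .get() waits for the slowest evaluation
    // before any particle is updated.
    for (std::size_t i = 0; i < pending.size(); ++i)
        updateParticle(i, pending[i].get());
}
```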

Synchronous flowchart: each particle's cycle is Evaluate Current Fitness, Check Convergence, Update Particle Position, Update Particle Velocity, Report Results; all particles complete the cycle before the swarm's current state is evaluated again.

✘ Separates update actions for each point from update actions for the swarm ✘ Point-level updates are applied as soon as each point is analyzed; swarm-level updates are applied at the end of each iteration Asynchronous

Asynchronous flowchart: each particle runs the same Evaluate Current Fitness, Check Convergence, Update Particle Position, Update Particle Velocity, Report Results cycle, but particles proceed independently rather than waiting on one another.

✘ Master  Initialize all optimization parameters, positions, and velocities  Holds a queue of particles for slave processors to evaluate  Updates global positions and velocities  Sends the position of the next particle in the que to an available slave processor  Receives cost function values from slave processors  Checks convergence Asynchronous master/slave paradigm

✘ Slave  Receives particle position from master processor  Evaluates the fitness function at the given particle  Updates particle position  Updates particle velocity  Sends a cost function to the master processor Asynchronous master/slave paradigm

Asynchronous vs. Synchronous: (a) asynchronous implementation, (b) synchronous implementation (comparison figure)

✘ H(x) denotes the different fitness functions ✘ Values are the mean (standard deviation) number of function evaluations required to reach the final solution Performance Comparison

Efficiency Comparison

Thread Communication: loosely correlated parameters vs. strongly correlated parameters (topology diagrams)

Thread Communication: unknown parameter correlation (topology diagram)
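
These communication styles come from the parallel PSO paper by Chu, Roddick, and Pan listed in the references. The sketch below shows only the generic ingredient the styles share: periodically copying the best position found by one sub-swarm (thread) into the other sub-swarms, replacing their worst particles. The migration period and the exact replacement rules per strategy are assumptions here, not the paper's precise definitions.

```cpp
// Generic sketch of inter-thread communication in a multi-swarm PSO: each
// sub-swarm broadcasts its best-known position, and every other sub-swarm
// overwrites its worst particle with it (minimization assumed).  The exact
// migration rules of the referenced communication strategies differ.
#include <algorithm>
#include <vector>

struct SubSwarm {
    std::vector<std::vector<double>> positions;   // one row per particle
    std::vector<double>              costs;       // matching cost values
    std::vector<double>              bestPosition;
    double                           bestCost;
};

void exchangeBests(std::vector<SubSwarm>& swarms)
{
    for (std::size_t src = 0; src < swarms.size(); ++src) {
        for (std::size_t dst = 0; dst < swarms.size(); ++dst) {
            if (src == dst) continue;
            SubSwarm& s = swarms[dst];
            if (s.costs.empty()) continue;
            // Locate the worst (highest-cost) particle in the destination...
            auto worst = std::max_element(s.costs.begin(), s.costs.end());
            std::size_t w = static_cast<std::size_t>(worst - s.costs.begin());
            // ...and replace it with the source sub-swarm's best position.
            s.positions[w] = swarms[src].bestPosition;
            s.costs[w]     = swarms[src].bestCost;
        }
    }
}
```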

Performance Comparison: communication style 1 with Rosenbrock, communication style 1 with Rastrigin, communication style 2 with Griewank, communication style 3 (result charts)

My Project

✘ Written in C++ using OpenGL and Boost  Uses a combination of GLEW and freeglut ✘ Allows variable initialization via a configuration file ✘ Each particle’s “best fitness” value decreases over time ✘ Provides visual feedback of particles in the search space My Implementation
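
The “best fitness decreases over time” idea is worth a note: if a remembered personal best slowly goes stale, the particle will eventually accept a merely decent new position and keep exploring. A hypothetical sketch of such a decay step is below; the decay rate and the exact rule used in the actual project are not stated on the slides, so these are assumptions.

```cpp
// Hypothetical sketch of letting each particle's remembered best decay: every
// iteration the stored personal-best cost is inflated slightly (minimization
// assumed), so an old best is eventually beaten and the particle keeps moving.
// The decay rate and the exact rule used in the real project are assumptions.
#include <vector>

struct ParticleMemory {
    std::vector<double> bestPosition;
    double              bestCost;        // lower is better
};

void decayPersonalBests(std::vector<ParticleMemory>& swarm, double decayRate = 1.01)
{
    for (auto& p : swarm)
        p.bestCost *= decayRate;         // remembered best slowly gets "worse"
}
```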

✘ Parameter tuning ✘ Getting social neighborhoods set up correctly ✘ The swarm currently has moments of “convergence” followed by moments of “oscillation”  This bug is not yet fixed Challenges

Things still to implement

✘ All points seek out the green dot (user controllable via the mouse) ✘ 3-axis search ✘ Individual particle “depth” represented by red, purple, and blue dots  Red represents the +z axis  Blue represents the -z axis  Purple shades represent a neutral z position Live Demo
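
For reference, a hypothetical sketch of the red/purple/blue depth coloring described above, assuming z has been normalized to [-1, 1]; the project's actual mapping is not shown on the slides.

```cpp
// Hypothetical sketch of the depth-to-color mapping: +z tends toward red,
// -z toward blue, and values near zero toward purple.  Assumes z is already
// normalized to [-1, 1]; the project's real mapping may differ.
#include <algorithm>

struct Color { float r, g, b; };

Color depthColor(float z)
{
    z = std::max(-1.0f, std::min(1.0f, z));   // clamp to [-1, 1]
    float t = 0.5f * (z + 1.0f);              // 0 at the -z extreme, 1 at the +z extreme
    // Blend from blue (0,0,1) through purple (0.5,0,0.5) to red (1,0,0).
    return Color{ t, 0.0f, 1.0f - t };
}
```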

Questions?

✘ Chu, Shu-Chuan, John F. Roddick, and Jeng-Shyang Pan. "A Parallel Particle Swarm Optimization Algorithm with Communication Strategies." Harbin Institute of Technology. Web. ✘ Venter, Gerhard, and Jaroslaw Sobieszczanski-Sobieski. "A Parallel Particle Swarm Optimization Algorithm Accelerated by Asynchronous Evaluations." EngOpt. Web. ✘ Pinto, Andry, Hugo Alves, Inês Domingues, Luís Rocha, and Susana Cruz. "The Particle Swarm Optimization Algorithm." University of Porto. Web. ✘ "Particle Swarm Optimization Demo." YouTube. Web. References