Dr. Ashraf Abdelbar, American University in Cairo


PSO Variations

No Free Lunch Theorem In a controversial 1997 paper (available at the AUC library), Wolpert and Macready proved that "averaged over all possible problems or cost functions, the performance of all search algorithms is exactly the same." This includes random search: no algorithm is better, on average, than blind guessing.

Cooperative PSO The solution vector being optimized is divided into k parts, each assigned to a separate sub-swarm. Taken to the extreme, k can equal n, the dimensionality of the problem. To evaluate the fitness of a component in a sub-swarm, a context vector is used, into which the component being evaluated is inserted. One approach to forming the context vector is to take the current global-best component from each sub-swarm.
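The context-vector evaluation can be sketched as follows. This is a minimal illustration only, assuming a simple sphere objective and k = n (one sub-swarm per dimension); the function names are illustrative, not from the original deck.

```python
import random

def sphere(x):
    """Toy objective to minimize: sum of squared components."""
    return sum(v * v for v in x)

def evaluate_component(context, j, value, fitness=sphere):
    """Fitness of a single sub-swarm component: insert it at position j
    of the context vector (built from the other sub-swarms' global bests)
    and evaluate the full vector."""
    trial = list(context)
    trial[j] = value
    return fitness(trial)

# Hypothetical 3-dimensional problem split into k = n = 3 sub-swarms.
context = [random.uniform(-5, 5) for _ in range(3)]
candidate = 0.1  # one particle's position in sub-swarm 0
f = evaluate_component(context, 0, candidate)
```

A component is thus never evaluated in isolation; its fitness always depends on the cooperating components contributed by the other sub-swarms.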

Guaranteed-Convergence PSO

Using PSO to train a game player

Hierarchical PSO

Hierarchical PSO

Fully-Informed PSO

Fully-Informed PSO Variants

Clerc’s Type 1” Constriction

Adaptive swarm size The slide's particle metaphors: "I try to kill myself — there has been enough improvement, although I'm the worst," and "I try to generate a new particle — I'm the best, but there has been not enough improvement." This is a more recent and sophisticated attempt; the precise formulas are not given here, just the underlying metaphors. On this slide, "improvement" means improvement in the particle's neighbourhood, and "I try" means that success depends on the current swarm size, according to a probability rule. The good point is that you no longer have to guess the best swarm size, or launch many runs to find it: you can begin with a very small swarm and let it grow and shrink by itself. The same kind of metaphor can also be used to adapt the neighbourhood size. (2002-04-24, Maurice.Clerc@WriteMe.com. This slide is taken from a presentation by M. Clerc.)
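The two metaphors can be sketched as probabilistic decisions. Since the slide deliberately omits the formulas, the probability rules below are invented placeholders chosen only so that the success chance depends on swarm size, as the slide describes; they are not Clerc's actual rules.

```python
import random

def try_remove(swarm_size, min_size=3):
    """Worst particle 'tries to kill itself' after enough improvement.
    Assumed rule: success is more likely the larger the swarm,
    and impossible at the minimum size."""
    p = 1.0 - min_size / max(swarm_size, min_size)
    return random.random() < p

def try_spawn(swarm_size, max_size=40):
    """Best particle tries to generate a new particle after too little
    improvement.  Assumed rule: success is less likely as the swarm
    grows, and impossible at the maximum size."""
    p = 1.0 - swarm_size / max_size
    return random.random() < p
```

Starting from a very small swarm, repeated applications of these two rules let the population size drift toward whatever the problem rewards.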

Cluster Centers The c-means algorithm is used to cluster the x vectors. The resulting cluster-center vectors are then used in place of either the personal-best vectors or the neighborhood-best vectors.
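One hard c-means (i.e., k-means) iteration over the particle vectors can be sketched as below; the slide gives no implementation details, so this is a generic k-means step, not the deck's specific procedure.

```python
def nearest(point, centers):
    """Index of the closest center (squared Euclidean distance)."""
    return min(range(len(centers)),
               key=lambda i: sum((p - c) ** 2
                                 for p, c in zip(point, centers[i])))

def kmeans_step(points, centers):
    """One hard c-means iteration: assign each point to its nearest
    center, then recompute each center as its cluster's mean.
    Empty clusters keep their old center."""
    clusters = [[] for _ in centers]
    for p in points:
        clusters[nearest(p, centers)].append(p)
    return [[sum(p[d] for p in cl) / len(cl) for d in range(len(cl[0]))]
            if cl else list(centers[i])
            for i, cl in enumerate(clusters)]
```

In the PSO variant, the center of the cluster a particle belongs to would then stand in for that particle's personal-best (or neighborhood-best) vector in the velocity update.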

Angeline’s Adaptation In each iteration, the worst half of the population is replaced by mutated clones of the better half.
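One iteration of this selection step can be sketched as follows, assuming a minimization problem and Gaussian mutation (the mutation operator and its strength are assumptions for illustration, not taken from the slide):

```python
import random

def angeline_step(swarm, fitness, sigma=0.1):
    """Sort particles by fitness (ascending, for minimization), then
    replace the worst half with Gaussian-mutated clones of the better
    half."""
    swarm = sorted(swarm, key=fitness)
    half = len(swarm) // 2
    clones = [[x + random.gauss(0.0, sigma) for x in p]
              for p in swarm[:half]]
    return swarm[:half] + clones
```

The effect is a GA-like selection pressure layered on top of the usual PSO dynamics: poor particles are restarted near good ones instead of continuing their own trajectories.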

Adaptive Noise

Breeding Swarms

Statistical Significance When comparing two or more techniques, variations, or parameter settings on a given problem, it is important to make more than one run. You should at least report the mean and standard deviation (for a normal distribution, the mean ±2σ covers about 95% of outcomes). Ideally, you should run a test of statistical significance such as ANOVA. Such tests are standard in the natural sciences, but sadly they are less common in CS.
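For example, given several groups of final fitness values (one group per technique, one value per run), the one-way ANOVA F statistic can be computed with the standard library alone; this is a textbook formula, with the resulting F to be compared against an F(k−1, N−k) critical value from a table or a statistics package.

```python
from statistics import mean

def one_way_anova_F(groups):
    """One-way ANOVA F statistic for k groups of run results:
    between-group mean square over within-group mean square."""
    k = len(groups)
    N = sum(len(g) for g in groups)
    grand = mean(x for g in groups for x in g)
    ss_between = sum(len(g) * (mean(g) - grand) ** 2 for g in groups)
    ss_within = sum((x - mean(g)) ** 2 for g in groups for x in g)
    return (ss_between / (k - 1)) / (ss_within / (N - k))
```

A large F relative to the critical value indicates that at least one technique's mean result genuinely differs from the others, rather than the difference being run-to-run noise.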

Topics
Binary PSO
Guaranteed-Convergence PSO
Continuous PSO
Cooperative PSO
No Free Lunch theorem
Game playing
Inertia
Hierarchical PSO
Constriction
Fully-Informed PSO
Adaptive swarm size
Statistical significance
Cluster centers
Angeline's adaptation