PSO Variations
Dr. Ashraf Abdelbar, American University in Cairo
No Free Lunch Theorem
In a controversial 1997 paper (available at the AUC library), Wolpert and Macready proved that, "averaged over all possible problems or cost functions, the performance of all search algorithms is exactly the same." This includes such things as random search: no algorithm is better, on average, than blind guessing.
Cooperative PSO
The solution vector being optimized is divided into k parts, each part given to a separate sub-swarm. Taken to the extreme, k can equal n, the dimensionality of the solution vector. To evaluate the fitness of a component in a sub-swarm, a context vector is used, into which the component being evaluated is inserted. One approach to forming the context vector is to take the current global-best component from each sub-swarm.
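The context-vector evaluation described above can be sketched as follows. This is a minimal illustration, not the full CPSO algorithm; the sphere function and the particular context values are assumptions for the example.

```python
def evaluate_component(fitness, context, part_index, candidate):
    """Evaluate one sub-swarm's component by splicing it into the context vector."""
    trial = list(context)
    trial[part_index] = candidate  # replace this sub-swarm's slice only
    return fitness(trial)

# Toy fitness to minimize: the sphere function
sphere = lambda x: sum(v * v for v in x)

# Context vector assembled from each sub-swarm's current global best (here k = n = 4)
context = [0.5, -1.0, 2.0, 0.1]
# Fitness of candidate component 0.0 for sub-swarm 2, in the shared context
print(evaluate_component(sphere, context, 2, 0.0))
```

Because every sub-swarm shares the same context vector, each component is always judged by how well it cooperates with the other sub-swarms' best components.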
Guaranteed-Convergence PSO
Using PSO to train a game player
Hierarchical PSO
Hierarchical PSO
Fully-Informed PSO
Fully-Informed PSO Variants
Clerc’s Type 1″ Constriction
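The slide does not include the formula, but the widely used Clerc–Kennedy constriction coefficient is χ = 2κ / |2 − φ − √(φ² − 4φ)| with φ = c₁ + c₂ > 4; for the common choice c₁ = c₂ = 2.05 this gives the familiar χ ≈ 0.7298. A minimal computation:

```python
import math

def constriction_coefficient(c1, c2, kappa=1.0):
    """Clerc-Kennedy constriction coefficient; requires phi = c1 + c2 > 4."""
    phi = c1 + c2
    assert phi > 4, "constriction requires phi > 4"
    return 2.0 * kappa / abs(2.0 - phi - math.sqrt(phi * phi - 4.0 * phi))

chi = constriction_coefficient(2.05, 2.05)
print(round(chi, 4))  # the widely quoted value ~0.7298
```

The constricted velocity update then multiplies the whole standard update by χ, which guarantees convergence of the velocity dynamics without explicit velocity clamping.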
Adaptive swarm size
"I'm the worst, and there has been enough improvement: I try to remove myself."
"I'm the best, but there has not been enough improvement: I try to generate a new particle."
[Speaker notes] This is a more recent and sophisticated attempt. The precise formulas are not given here, just the underlying metaphors. On this slide, "improvement" means improvement in the particle's neighbourhood, and "I try" means that success depends on the current swarm size, according to a probability rule. The good point is that you no longer have to guess the best swarm size, or launch many runs to find it: you can begin with a very small swarm and let it grow and shrink by itself. Note that, using the same kind of metaphor, it is also possible to adapt the neighbourhood size. This slide is taken from a presentation by M. Clerc.
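Since the slide gives only the metaphors, here is one hypothetical way the rule could look. The 1/n and 1 − 1/n success probabilities are assumptions standing in for Clerc's unstated probability rule; only the overall shape (worst particle tries to die after improvement, best particle tries to spawn after stagnation, success depending on swarm size) comes from the slide.

```python
import random

def adapt_swarm_size(swarm, fitness, improved, rng=random.random):
    """Hypothetical sketch of size adaptation (minimisation).

    swarm is a list of position vectors. If the neighbourhood has not
    improved, the best particle tries to generate a new one; if it has,
    the worst particle tries to remove itself. Success probabilities are
    tied to the current size n, so small swarms tend to grow and large
    swarms tend to shrink.
    """
    n = len(swarm)
    if not improved and rng() < 1.0 / n:            # small swarm: likely to grow
        best = min(swarm, key=fitness)
        swarm.append([x + random.uniform(-1, 1) for x in best])
    elif improved and n > 2 and rng() < 1.0 - 1.0 / n:  # large swarm: likely to shrink
        swarm.remove(max(swarm, key=fitness))
    return swarm
```

As the notes say, the same self-regulation idea can be applied to the neighbourhood size instead of the swarm size.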
Cluster Centers
The c-means algorithm is used to cluster the particle position vectors x. The cluster-center vectors are then used in place of either the personal-best vectors or the neighborhood-best vectors in the velocity update.
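A minimal hard c-means (k-means) pass over the particle positions might look like this; the resulting center of a particle's cluster would then stand in for its personal-best (or neighborhood-best) term in the velocity update. The clustering details (iteration count, seeding) are assumptions for the sketch.

```python
import random

def c_means(points, k, iters=10, seed=0):
    """Minimal hard c-means (k-means) over particle position vectors."""
    rng = random.Random(seed)
    centers = rng.sample(points, k)
    for _ in range(iters):
        # assign each point to its nearest center (squared Euclidean distance)
        clusters = [[] for _ in range(k)]
        for p in points:
            j = min(range(k),
                    key=lambda i: sum((a - b) ** 2 for a, b in zip(p, centers[i])))
            clusters[j].append(p)
        # recompute each center as its cluster mean (keep old center if empty)
        centers = [[sum(col) / len(c) for col in zip(*c)] if c else centers[i]
                   for i, c in enumerate(clusters)]
    return centers

# Four 1-D particle positions forming two obvious clusters
print(c_means([[0.0], [0.1], [10.0], [10.1]], k=2))
```

The intuition is that a cluster center summarises a group of good positions, so it can be a less noisy attractor than any single particle's memory.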
Angeline’s Adaptation
In each iteration, the worst half of the population was replaced by mutated clones of the better half.
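The selection step described above can be sketched as follows for minimisation; the Gaussian mutation width sigma is an assumption, not something the slide specifies.

```python
import random

def angeline_selection(swarm, fitness, sigma=0.1, rng=random.Random(0)):
    """Replace the worst half of the swarm with mutated clones of the better half
    (sketch of Angeline's selection-based adaptation for PSO)."""
    ranked = sorted(swarm, key=fitness)   # best first, for minimisation
    half = len(ranked) // 2
    clones = [[x + rng.gauss(0, sigma) for x in p] for p in ranked[:half]]
    return ranked[:half] + clones

# Four 1-D particles on the sphere function: the two worst are replaced
sphere = lambda p: sum(v * v for v in p)
print(angeline_selection([[3.0], [1.0], [2.0], [0.0]], sphere))
```

This injects evolutionary selection pressure into PSO: poor particles restart near good ones instead of having to fly there.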
Adaptive Noise
Breeding Swarms
Statistical Significance
When comparing two or more techniques, variations, or parameter settings on a given problem, it is important to make more than one run. You should at least report the mean and standard deviation (for a normal distribution, roughly 95% of values lie within 2σ of the mean). Ideally, you should run a test of statistical significance, such as ANOVA. Such tests are standard in the natural sciences, but sadly they are less common in CS.
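For the two-variant case, Welch's t statistic is a lightweight alternative to a full ANOVA; a small sketch using only the standard library (the run results below are made-up numbers for illustration):

```python
import math
from statistics import mean, stdev

def welch_t(a, b):
    """Welch's t statistic for two independent samples of run results."""
    va, vb = stdev(a) ** 2 / len(a), stdev(b) ** 2 / len(b)
    return (mean(a) - mean(b)) / math.sqrt(va + vb)

# Hypothetical best-fitness values from 5 independent runs of two PSO variants
runs_a = [0.12, 0.10, 0.15, 0.11, 0.13]
runs_b = [0.25, 0.22, 0.30, 0.27, 0.24]
print(f"A: {mean(runs_a):.3f} +/- {stdev(runs_a):.3f}")
print(f"B: {mean(runs_b):.3f} +/- {stdev(runs_b):.3f}")
print(f"t = {welch_t(runs_a, runs_b):.2f}")  # large |t| suggests a real difference
```

For more than two groups, or to obtain a proper p-value, a statistics package (e.g. one providing one-way ANOVA) is the better tool.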
Topics
Binary PSO; Continuous PSO; NFL; Inertia; Constriction; Guaranteed Convergence; Cooperative; Game playing; Hierarchical; Fully informed; Adapting swarm size; Cluster centers; Angeline's adaptation; Statistical significance