1
March 26, 2008
2
Particle Swarm Optimization (PSO). Kennedy, J., Eberhart, R. C. (1995). Particle swarm optimization. Proc. IEEE International Conference on Neural Networks (Perth, Australia), IEEE Service Center, Piscataway, NJ, pp. IV: 1942-1948.
3
Behavior of a Flock of Birds
8
v_id = w * v_id + c1 * rand() * (p_id - x_id) + c2 * Rand() * (p_gd - x_id)
x_id = x_id + v_id
(The p_id term reflects the particle's self-experience; the p_gd term reflects the success of others.)
9
PSO Equation
v_id = w * v_id + c1 * rand() * (p_id - x_id) + c2 * Rand() * (p_gd - x_id)
x_id = x_id + v_id
For the i-th particle: position x_i, velocity v_i, previous best position p_i, global best position p_g. The w * v_id term is the inertia, the p_id term the self-experience, and the p_gd term the success of others.
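As a minimal sketch of this update rule (the function name and the coefficient values w = 0.7, c1 = c2 = 2.0 are illustrative assumptions, not values given in the slides):

```python
import random

def update_particle(x, v, p_best, g_best, w=0.7, c1=2.0, c2=2.0):
    """One PSO update for a single particle in d dimensions.

    x, v   : current position and velocity (lists of floats)
    p_best : this particle's previous best position
    g_best : the swarm's global best position
    """
    for d in range(len(x)):
        v[d] = (w * v[d]
                + c1 * random.random() * (p_best[d] - x[d])   # self-experience
                + c2 * random.random() * (g_best[d] - x[d]))  # success of others
        x[d] = x[d] + v[d]
    return x, v
```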
13
Optimization Problem: block diagram of input -> system -> output with a parameter-adjustment loop. In PSO, n copies of the system (System_1, System_2, ..., System_n) are evaluated in parallel, one per particle.
14
Particle Swarm Optimization: cost-versus-iteration illustration of a particle moving from x_(k-1) to x_k to x_(k+1). The new velocity combines the inertia term (w * v_id), the pull toward the personal best (Vp), and the pull toward the global best (Vg):
v_id = w * v_id + c1 * rand() * (p_id - x_id) + c2 * Rand() * (p_gd - x_id)
x_id = x_id + v_id
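A compact sketch of the full iteration loop over n particles, minimizing an example cost function (the swarm size, iteration count, initialization range, and cost function are illustrative assumptions, not from the slides):

```python
import random

def pso_minimize(cost, dim, n_particles=20, iters=100, w=0.7, c1=2.0, c2=2.0):
    """Minimize `cost` over `dim` dimensions with a basic PSO loop."""
    X = [[random.uniform(-5, 5) for _ in range(dim)] for _ in range(n_particles)]
    V = [[0.0] * dim for _ in range(n_particles)]
    P = [x[:] for x in X]                    # personal best positions
    P_cost = [cost(x) for x in X]            # personal best costs
    g_idx = min(range(n_particles), key=lambda i: P_cost[i])
    g, g_cost = P[g_idx][:], P_cost[g_idx]   # global best position and cost

    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                V[i][d] = (w * V[i][d]
                           + c1 * random.random() * (P[i][d] - X[i][d])  # self-experience
                           + c2 * random.random() * (g[d] - X[i][d]))    # success of others
                X[i][d] += V[i][d]
            c = cost(X[i])
            if c < P_cost[i]:                # update personal best
                P[i], P_cost[i] = X[i][:], c
                if c < g_cost:               # update global best
                    g, g_cost = X[i][:], c
    return g, g_cost

# Example usage: minimize the 3-dimensional sphere function (illustrative)
best, best_cost = pso_minimize(lambda x: sum(xi * xi for xi in x), dim=3)
```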
15
Inertia Weight (w in the velocity update above): the figures contrast a large inertia weight with a small one. With a large w the inertia term w * v_id dominates, so the step from x_k to x_(k+1) mostly continues the previous direction of motion; with a small w the velocity components toward the personal best (Vp) and the global best (Vg) dominate the update.
16
Inertia Weight: a large inertia weight favors global search (wide exploration of the cost surface), while a small inertia weight favors local search (fine-tuning around the current best region).
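One common way to exploit this trade-off, and the baseline compared against in the experiments below, is to decrease the inertia weight linearly over the run; a small sketch (the start and end values 0.9 and 0.4 are a frequently used choice, assumed here rather than taken from the slides):

```python
def linear_inertia(k, k_max, w_start=0.9, w_end=0.4):
    """Linearly decrease w from w_start to w_end over k_max iterations,
    shifting the swarm from global search (large w) to local search (small w)."""
    return w_start - (w_start - w_end) * k / k_max
```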
17
Fuzzy Adaptive PSO. Shi, Y., Eberhart, R. C. (2001). "Fuzzy adaptive particle swarm optimization," Proc. IEEE Congr. Evolutionary Computation, vol. 1, pp. 101-106. Instead of a fixed schedule, a fuzzy system adapts the inertia weight between global search (large w) and local search (small w). Its input is the normalized current best performance evaluation (NCBPE), obtained by normalizing the cost CBPE of the current global best between CBPE_min and CBPE_max:
NCBPE = (CBPE - CBPE_min) / (CBPE_max - CBPE_min)
18
Fuzzy Adaptive PSO: the fuzzy system takes NCBPE and the current inertia weight as inputs and outputs a change in the inertia weight (W_Change). Each variable is covered by three membership functions, Low (L), Medium (M), and High (H), on [0, 1], and a fuzzy rule base maps the inputs to W_Change.
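A simplified sketch of the idea (the triangular membership functions and the tiny rule base below are illustrative assumptions; the actual system in Shi and Eberhart (2001) also uses the current inertia weight as a second input and has a larger rule base):

```python
def tri(x, a, b, c):
    """Triangular membership function with peak at b on support [a, c]."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def ncbpe(cbpe, cbpe_min, cbpe_max):
    """Normalized current best performance evaluation in [0, 1]."""
    return (cbpe - cbpe_min) / (cbpe_max - cbpe_min)

def fuzzy_w_change(ncbpe_val):
    """Toy rule base: good performance (low NCBPE) -> decrease w (local search),
    poor performance (high NCBPE) -> increase w (global search)."""
    low  = tri(ncbpe_val, -0.5, 0.0, 0.5)
    med  = tri(ncbpe_val,  0.0, 0.5, 1.0)
    high = tri(ncbpe_val,  0.5, 1.0, 1.5)
    # Weighted-average (centroid-style) defuzzification of the rule outputs
    outputs = {-0.1: low, 0.0: med, 0.1: high}
    total = sum(outputs.values())
    return sum(v * mu for v, mu in outputs.items()) / total if total else 0.0
```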
19
Experimental Results: function-minimization benchmarks comparing a linearly decreasing inertia weight with the fuzzy adaptive inertia weight. The performance of PSO is not sensitive to the population size, and the scalability of PSO is acceptable.
20
Application Example 1: feature training for face detection, illustrated over iteration 1, iteration 2, ..., iteration k.
21
Application Example 2: Neural Network Training. Gudise, V. G., Venayagamoorthy, G. K. (2003). Comparison of particle swarm optimization and backpropagation as training algorithms for neural networks. Proc. IEEE Swarm Intelligence Symposium 2003 (SIS 2003), Indianapolis, IN, pp. 110-117.
22
Introduction of Neural Network: the hidden activations are a_i = W_ij X_j for i = 1..4, j = 1, 2, where X = [x 1]^T (the input plus a bias of 1); each hidden unit applies the sigmoid d_i = 1 / (1 + e^(-a_i)); the output is y = [V_1 V_2 V_3 V_4][d_1 d_2 d_3 d_4]^T.
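A minimal sketch of this 2x4x1 forward pass (the weight values are placeholders chosen only to show the shapes; the slides do not give trained weights):

```python
import math

def forward(x, W, V):
    """Forward pass of the 2x4x1 network: input [x, 1] (with bias),
    4 sigmoid hidden units, 1 linear output unit."""
    X = [x, 1.0]                                   # input plus bias
    a = [sum(W[i][j] * X[j] for j in range(2)) for i in range(4)]
    d = [1.0 / (1.0 + math.exp(-ai)) for ai in a]  # sigmoid hidden activations
    return sum(V[i] * d[i] for i in range(4))      # output y = V . d

# Placeholder weights: 4x2 hidden matrix W and 4 output weights V
W = [[0.1, 0.2], [0.3, -0.1], [-0.2, 0.4], [0.5, 0.0]]
V = [1.0, -0.5, 0.3, 0.8]
y = forward(2.0, W, V)
```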
23
Neural Network Training: Backpropagation vs. PSO.
24
Neural Network Training: Backpropagation vs. PSO.
25
Neural Network Training: Backpropagation vs. PSO; parameter set of PSO.
26
Training Results: training a 2x4x1 neural network to fit y = 2x^2 + 1. Figures show the mean-square-error curve of the neural networks during training with BP and PSO (bias 1), and the test curve for the trained networks with fixed weights obtained from the BP and PSO training algorithms (bias 1).
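As a rough sketch of how PSO replaces backpropagation here, each particle's position is the flattened weight vector of the 2x4x1 network and the cost is the mean square error on samples of y = 2x^2 + 1 (the sampling range and swarm settings are illustrative assumptions; `pso_minimize` and `forward` refer to the earlier sketches):

```python
# Training data for y = 2x^2 + 1 (illustrative sampling of the target function)
xs = [i / 10.0 for i in range(-20, 21)]
ys = [2 * x * x + 1 for x in xs]

def unpack(weights):
    """Split a 12-element position vector into the 4x2 hidden matrix W and output weights V."""
    W = [weights[2 * i:2 * i + 2] for i in range(4)]
    V = weights[8:12]
    return W, V

def mse(weights):
    """Mean square error of the 2x4x1 network encoded by `weights`."""
    W, V = unpack(weights)
    return sum((forward(x, W, V) - y) ** 2 for x, y in zip(xs, ys)) / len(xs)

best_weights, best_cost = pso_minimize(mse, dim=12, n_particles=25, iters=500)
```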
27
Conclusions: The concept of PSO has been introduced. PSO is an extremely simple algorithm for global optimization problems, with low memory cost and low computational cost. A fuzzy system is implemented to dynamically adjust the inertia weight and improve the performance of PSO.