Particle swarm optimization

Presentation transcript:

Particle swarm optimization or Genetic algorithm?
Yuzhao Wang, Zhibin Yu, Qixiao Liu, JunQing Yu
SIAT, CAS; HUST

Hello everyone, it is a great honor to be here to present my work. I am Wang Yuzhao. In this work, we focus on comparing two heuristics for Spark parameter tuning, namely the genetic algorithm (GA) and particle swarm optimization (PSO).

Motivation
Big Data framework — parameter tuning
Hundreds of configuration parameters (multivalued)
Critical for performance, so optimization is needed
Tuning parameters is time-consuming and requires expert knowledge
Pipeline: data collecting, model training, optimum searching
Searching heuristics: Genetic Algorithm (GA), Particle Swarm Optimization (PSO)
Performance models: Random Forest, Support Vector Machine, ...

As we all know, Spark is a well-known in-memory computing framework with about one hundred and sixty configuration parameters, and these parameters are critical for performance. So we need to tune them before running different kinds of programs. Manual tuning is time-consuming and, besides, requires expert knowledge. We therefore propose a formalism to tune the parameters automatically. This formalism consists of three parts. First, data collecting, which gathers the training data. Second, model training, which uses the data collected in the first phase to construct the model. Third, optimum searching, which searches the configuration space for the best configuration parameters for a given application. In this work, we focus on the third phase, optimum searching, and conduct several experiments to compare the performance of two searching algorithms, PSO and GA.

Outline
Background
Proposal and implementation
Experimental evaluation

Background
Natural, efficient representation for heuristics
GA: floating-point representation (feature → parameter → gene → chromosome)
PSO: location vector

To solve a real-world problem with these heuristics, we first need to transform it into a representation the heuristics can operate on. For GA, we use a floating-point representation, as the picture depicts: each block represents a feature. We call a block a gene, and several genes make up a chromosome. For PSO, we directly use the vector of feature parameters as the location vector. Thus we build a bridge between the problem space and the heuristic solution. [Figure: performance vs. parameter values]
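A minimal sketch of this encoding, assuming a handful of illustrative Spark tunables (the parameter names and ranges below are assumptions, not the talk's actual parameter set). Each candidate configuration is a flat float vector: a chromosome for GA, a location for PSO.

```python
# Hypothetical tunables, each with a (low, high) bound.
PARAM_BOUNDS = {
    "spark.executor.memory.gb": (1.0, 8.0),
    "spark.executor.cores":     (1.0, 4.0),
    "spark.shuffle.partitions": (8.0, 512.0),
}

def decode(vector):
    """Map a float vector (one gene/coordinate per parameter) back to a
    named configuration, clipping each value into its legal range."""
    config = {}
    for (name, (lo, hi)), value in zip(PARAM_BOUNDS.items(), vector):
        config[name] = min(max(value, lo), hi)
    return config

print(decode([4.0, 2.0, 200.0]))
```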

Genetic Algorithm
A genetic analogy
Selection: choose superior chromosomes according to the fitness function
Crossover: generate offspring from parent chromosomes
Mutation: inject randomness into the evolution

The genetic algorithm is an analogy of genetic evolution: the algorithm maintains a population of chromosomes, as the picture shows. To evolve the population, three operators are applied, namely selection, crossover, and mutation.
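A self-contained sketch of one GA generation over float chromosomes, using the elitism size and mutation chance that the tuning experiments later vary. The quadratic objective here is a stand-in; in the talk the fitness would come from the random-forest performance model.

```python
import random

def fitness(chrom):                       # stand-in objective (minimize)
    return sum((g - 0.5) ** 2 for g in chrom)

def evolve(pop, elitism=2, mutation_chance=0.01):
    pop = sorted(pop, key=fitness)        # rank chromosomes by fitness
    nxt = pop[:elitism]                   # elitism: keep the best unchanged
    while len(nxt) < len(pop):
        a, b = random.sample(pop[: len(pop) // 2], 2)   # select fit parents
        cut = random.randrange(1, len(a))               # one-point crossover
        child = a[:cut] + b[cut:]
        child = [g + random.gauss(0, 0.1)               # mutation
                 if random.random() < mutation_chance else g
                 for g in child]
        nxt.append(child)
    return nxt

pop = [[random.random() for _ in range(6)] for _ in range(25)]
for _ in range(100):
    pop = evolve(pop)
print(min(map(fitness, pop)))
```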

Particle Swarm Optimization
Swarm intelligence
Location: a vector representing a feasible solution
Velocity: an addend used to update the location

In PSO, each particle has two components, namely a location and a velocity. As the animation shows, the update combines three kinds of swarm information: the inertia velocity, the local (personal best) optimum, and the global optimum. Together they move each individual toward a better location.
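A minimal sketch of that update rule. The new velocity blends inertia, attraction toward the particle's own best (weighted by C1), and attraction toward the swarm's best (weighted by C2) — the two learning factors tuned later in the talk. The inertia weight and defaults are illustrative assumptions.

```python
import random

def step(x, v, pbest, gbest, w=0.7, c1=2.5, c2=2.5):
    """One velocity/location update for a single particle."""
    new_v = [w * vi                                   # inertia velocity
             + c1 * random.random() * (pb - xi)       # pull toward local best
             + c2 * random.random() * (gb - xi)       # pull toward global best
             for xi, vi, pb, gb in zip(x, v, pbest, gbest)]
    new_x = [xi + vi for xi, vi in zip(x, new_v)]     # location += velocity
    return new_x, new_v
```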

Outline
Background
Model and implementation
Experimental evaluation

Performance model
Random Forest (RF)
1) Collect data to form the searching space
2) Train the RF

As we know, the heuristics need a fitness function, and we choose the random forest algorithm to build the model due to its better performance, as practice suggests. The training process is the same as on the previous slide.
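A sketch of such a model using scikit-learn (the talk names Random Forest but not a specific library, and the synthetic data below is purely illustrative). X holds sampled configuration vectors, y the runtimes they produced; the model's prediction then serves as the heuristics' fitness function.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)
X = rng.uniform(0.0, 1.0, size=(200, 6))        # 200 sampled 6-d configs
y = X.sum(axis=1) + rng.normal(0, 0.1, 200)     # stand-in measured runtimes

model = RandomForestRegressor(n_estimators=100, random_state=0).fit(X, y)

def fitness(config_vector):
    """Predicted runtime of a configuration; the objective PSO/GA minimize."""
    return model.predict(np.asarray(config_vector).reshape(1, -1))[0]
```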

Searching heuristics
Implementation details
GA: port of the 'genalg' library of R to Python
PSO: Python package 'pyswarm'

For the implementations of PSO and GA, we make use of existing packages, namely the R library 'genalg' and the Python package 'pyswarm'. Both algorithms were rewritten in Python.
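A sketch of driving the search through pyswarm's pso() entry point, minimizing the predicted runtime from the random-forest sketch above over the parameter bounds. The swarm size and learning factors mirror the settings discussed later; the specific bounds are assumptions.

```python
from pyswarm import pso   # assumes `fitness` from the previous sketch

lb = [0.0] * 6                     # lower bounds of the 6-d config space
ub = [1.0] * 6                     # upper bounds

best_config, best_runtime = pso(fitness, lb, ub,
                                swarmsize=25,    # popsize chosen in the talk
                                phip=2.5,        # local learning factor C1
                                phig=2.5,        # global learning factor C2
                                maxiter=200)
print(best_config, best_runtime)
```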

Outline
Background
Model and implementation
Experimental evaluation

Experimental setup
Platform
Desktop: AMD quad-core processor, 4 GB DDR4 memory
Benchmark: PageRank
Metric
Evaluation of the convergent state: standard deviation

Our experimental platform is equipped with an AMD quad-core processor and 4 GB of DDR4 memory. As a case study, we use PageRank as our benchmark throughout the experiments. For the metric, we use the standard deviation to evaluate the convergent state of the swarm, since it is more sensitive in describing the variance than the mean value.
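One way to operationalize that metric, as a sketch: declare the swarm converged once the standard deviation of its fitness values falls below a threshold. The threshold itself is an assumption, not from the talk.

```python
import statistics

def converged(fitness_values, tol=1e-3):
    """True once the swarm's fitness spread is below tol (assumed cutoff)."""
    return statistics.stdev(fitness_values) < tol
```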

GA & PSO settings
Find the optimal parameter set for GA and PSO respectively
Case study with the 6-dimension model
GA landscape: z axis: convergence time (higher = worse); x axis: mutation chance; y axis: elitism size; chosen point: (16, 0.01)
PSO landscape: z axis: convergence time; x axis: local learning factor C1; y axis: global learning factor C2; chosen point: (2.5, 2.5)

As we know, GA and PSO themselves have several parameters that affect their searching ability. To ensure a fair comparison, we first try to find the best parameters for GA and PSO respectively. As an example, we analyze the 6-dimensional model. For GA, the x and y axes are the mutation chance and the elitism size respectively; as the landscape depicts, the convergence time decreases markedly as the elitism size increases. For PSO, the x and y axes are the two learning factors C1 and C2, and C2 has more influence on the convergence time. After this analysis, we are able to fix the optimal parameter set for GA and PSO respectively.

GA & PSO settings
Find the optimal popsize for GA and PSO
Plots: convergence time; solution fitness

As for the population size, we study its impact on the convergence time and the solution quality. As the pictures show, both heuristics' convergence time grows linearly with the swarm size, while the searching accuracy, measured by the fitness value, follows an exponential form. Finally, we choose 25 as the popsize. With the best parameters chosen for GA and PSO, we compare their performance as searching heuristics.

GA & PSO evaluation
Convergence time comparison
1) In convergence time, PSO is faster: ga.mean = 250, pso.mean = 210
2) Robustness: steadiness across 50 runs

We compare the performance from two aspects: the convergence time and the solution quality. As the first picture shows, PSO's mean convergence time is lower than GA's. From the second picture, GA's fitness curve overlaps with PSO's, which indicates that the two heuristics' robustness is similar to a degree. However, the mean convergence time of PSO is 210 ms versus 250 ms for GA, which means PSO is slightly better.

GA & PSO evaluation
Scalability when the dimension increases
1) PSO is faster across all tested dimensions
2) The convergence time of PSO shows less variation across dimensions, hence better scalability

As for scalability, as the picture shows, PSO is faster across all tested dimensions, and the variation of its convergence time across dimensions is smaller than GA's. Thus, as a conclusion, PSO is better in terms of scalability.

GA & PSO evaluation
Solution quality
Dev = (pso.s − ga.s) / ga.s, where the quality s is the fitness value
PSO is slightly poorer in solution quality than GA

From the table, it is easy to see that PSO's solution quality is only slightly poorer than GA's. So, above all, we recommend using PSO in the Spark performance-tuning context, since it runs much faster while its solutions are only slightly poorer than those found by GA.

Conclusion
PSO converges much faster than GA, while its solution quality is only slightly poorer.