Advances in Fireworks Algorithm and Its Applications
Ying Tan (谭营), Peking University



Outline (5/12/2015)
1. Brief Introduction to Swarm Intelligence
2. Basic Fireworks Algorithm (FA)
3. FA Variants
4. Latest Applications of FA: FA for NMF, FA for Document Clustering
5. Concluding Remarks

1. Brief Introduction to Swarm Intelligence
1.1 Swarm Intelligence (SI) refers to:
- simple individuals or information-processing units
- interaction between individuals or with the environment
- a path from natural to artificial systems

1.2 Some famous SI algorithms:
- Particle Swarm Optimization (PSO)
- Ant Colony Optimization (ACO)
- Artificial Immune System (AIS)
- Bee Colony Optimization (BCO)
- Bacterial Foraging Optimization (BFO)
- Fish School Search (FSS)
- Seeker Optimization Algorithm (SOA)
- to name a few

Motivation:
- biological populations
- social phenomena
- other laws of swarms in nature

Particle Swarm Optimization (PSO)
Inspired by the food-searching behavior of bird flocks.

Particle Swarm Optimization
A flock of birds is searching for food, and no bird knows where the food is; however, each bird knows its current distance to the food. This seeking behavior can be mapped onto optimization: what strategy lets the birds reach the food fastest?

PSO principle: candidate solutions. How to choose among them?
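The strategy above can be sketched as a minimal PSO in Python. This is an illustrative sketch, not code from the slides; the function name `pso_minimize` and all parameter defaults (inertia `w`, acceleration coefficients `c1`, `c2`) are assumptions chosen as common textbook values.

```python
import random

def pso_minimize(f, dim, n_particles=20, iters=200, w=0.7, c1=1.5, c2=1.5, seed=0):
    """Minimal PSO sketch: each particle is pulled toward its own best
    position (pbest) and the swarm's best position (gbest)."""
    rng = random.Random(seed)
    xs = [[rng.uniform(-5.0, 5.0) for _ in range(dim)] for _ in range(n_particles)]
    vs = [[0.0] * dim for _ in range(n_particles)]
    pbest = [x[:] for x in xs]
    pbest_f = [f(x) for x in xs]
    g = min(range(n_particles), key=lambda i: pbest_f[i])
    gbest, gbest_f = pbest[g][:], pbest_f[g]
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = rng.random(), rng.random()
                # velocity update: inertia + cognitive pull + social pull
                vs[i][d] = (w * vs[i][d]
                            + c1 * r1 * (pbest[i][d] - xs[i][d])
                            + c2 * r2 * (gbest[d] - xs[i][d]))
                xs[i][d] += vs[i][d]
            fx = f(xs[i])
            if fx < pbest_f[i]:
                pbest[i], pbest_f[i] = xs[i][:], fx
                if fx < gbest_f:
                    gbest, gbest_f = xs[i][:], fx
    return gbest, gbest_f

# demo on the 2-D sphere function, whose minimum is 0 at the origin
best, val = pso_minimize(lambda x: sum(t * t for t in x), dim=2)
```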

Visual demonstration of PSO.

Ant Colony Optimization (ACO)
An ant system searches for food from the nest (video demo: ANT.EXE).

Double Bridge Experiment: an auto-catalytic (positive-feedback) process.

Welcome to ICSI 2012, Shenzhen, China.

Introduction to Fireworks Algorithm
When a firework is set off, a shower of sparks fills the local space around it. The explosion can be viewed as a search in the local space around the firework.


From Nature to Mathematical Modeling
Definition of a firework:
- Good firework: one that generates a large population of sparks within a small range.
- Bad firework: one that generates a small population of sparks within a big range.
The basic FA simulates the process of a firework explosion illuminating the night sky.

2. Basic Fireworks Algorithm (FA)
2.1 Problem description
2.2 Flowchart of FA
2.3 Design of basic FA: selection operators; number of child sparks
2.4 Experimental results

2.1 Problem description

2.2 FA's Flowchart
1. Set N fireworks.
2. Obtain the sparks.
3. Evaluate the sparks; select N fireworks for the next generation.
4. If the termination criterion is not met, repeat from step 2.
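The flowchart can be sketched as a toy loop, specialized to one dimension for brevity. The selection rule here is plain elitism; the original FA uses a distance-based, diversity-keeping selection, so this is a simplified assumption, as are all parameter values.

```python
import random

def fireworks_minimize(f, n=5, sparks_per_fw=10, iters=100, amp=1.0, seed=0):
    """Toy 1-D sketch of the FA loop: explode each firework into sparks,
    evaluate everything, keep the best n points as the next generation."""
    rng = random.Random(seed)
    fireworks = [rng.uniform(-10.0, 10.0) for _ in range(n)]
    for _ in range(iters):
        # explosion: sparks land within the amplitude around each firework
        sparks = [fw + rng.uniform(-amp, amp)
                  for fw in fireworks for _ in range(sparks_per_fw)]
        pool = fireworks + sparks
        pool.sort(key=f)                 # evaluate (minimization)
        fireworks = pool[:n]             # elitist selection for simplicity
    return fireworks[0]

# demo: minimum of (x - 3)^2 is at x = 3
best = fireworks_minimize(lambda x: (x - 3) ** 2)
```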

2.3 Design of basic Fireworks Algorithm

Design of basic FA: a bad firework gets a big range and few sparks; a good firework gets a small range and more sparks.
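This good/bad allocation follows the proportional formulas of the original FWA paper (Tan & Zhu, 2010); the sketch below omits the paper's upper/lower clipping of spark counts, and the constants `m` and `a_hat` are illustrative defaults.

```python
def spark_allocation(fitness, m=50, a_hat=40.0, eps=1e-12):
    """Spark counts and explosion amplitudes in the spirit of basic FWA:
    better (lower-fitness) fireworks get more sparks over a smaller
    amplitude; worse fireworks get fewer sparks over a larger amplitude."""
    y_max, y_min = max(fitness), min(fitness)
    sn = sum(y_max - fi + eps for fi in fitness)   # normalizer for counts
    sa = sum(fi - y_min + eps for fi in fitness)   # normalizer for amplitudes
    sparks = [max(1, round(m * (y_max - fi + eps) / sn)) for fi in fitness]
    amps = [a_hat * (fi - y_min + eps) / sa for fi in fitness]
    return sparks, amps

# the best firework (fitness 0.1) should get the most sparks and the
# smallest amplitude; the worst (fitness 10.0) the opposite
sparks, amps = spark_allocation([0.1, 1.0, 10.0])
```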

Generating sparks: in an explosion, sparks undergo the effects of displacement along z randomly chosen directions (dimensions).

Design of basic FA, selection: crowded vs. sparse regions. Keep diversity!

To keep the diversity of sparks, we design another way of generating sparks: the Gaussian explosion.
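The two spark generators can be sketched as follows. This is a simplified reading of the operators: out-of-bound values are handled here by plain clipping, whereas the original FWA maps them back into the feasible range differently.

```python
import random

def explosion_spark(x, amp, bounds, rng):
    """Uniform explosion spark: shift a random subset of z dimensions by
    offsets drawn within the firework's explosion amplitude."""
    z = rng.randint(1, len(x))             # number of affected dimensions
    dims = rng.sample(range(len(x)), z)
    spark = x[:]
    for d in dims:
        spark[d] += rng.uniform(-amp, amp)
        spark[d] = min(max(spark[d], bounds[0]), bounds[1])  # clip (sketch)
    return spark

def gaussian_spark(x, rng):
    """Gaussian explosion spark: scale a random subset of dimensions by a
    factor drawn from N(1, 1) to keep the swarm diverse."""
    z = rng.randint(1, len(x))
    dims = rng.sample(range(len(x)), z)
    spark = x[:]
    for d in dims:
        spark[d] *= rng.gauss(1.0, 1.0)
    return spark
```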

2.4 Experimental results of basic FA

Explosion search demonstrations of FA: Sphere function (video demo); benchmark function f18 (video demo).

3. FA Variants
3.1 Motivation of FA variants
3.2 Definitions for FA variants
3.3 FA with fixed minimal explosion amplitude
3.4 FA with dynamic minimal explosion amplitude

3.1 Motivation of FA variants
A comprehensive analysis of the fireworks explosion mechanism suggests that some generated sparks hardly contribute to the optimization: they never reach a promising local space, yet they consume a lot of "resource". Too many sparks in such a small range means low diversity!

3.2 Definitions for FA variants
Core firework: in the iterative process, a fixed number of fireworks generate sparks; among the fireworks in each iteration, the one with the minimal fitness (the best one, for minimization) is defined as the core firework.

Significant improvement: to determine whether the core firework can generate better sparks, we say a significant improvement occurs when the core firework generates a spark whose fitness is the best fitness of the next iteration.


3.3 FA with fixed minimal explosion amplitude

As can be seen, a bigger fixed minimal amplitude accelerates the search for the optimal solution in the first iterations; however, once the swarm reaches a locally minimal region, the larger the fixed minimal amplitude, the harder it is to refine the optimal solution. Here the Ackley function behaves as a uni-modal function; multi-modal functions show the same pattern, except that the swarm may come to a local minimal region rather than the global one.

3.4 FA with dynamic minimal explosion amplitude
The experiments with a fixed minimal explosion amplitude suggest that FA should use a larger explosion amplitude while it has not yet reached a locally minimal region, to raise the chance of finding a better solution; once it reaches a locally minimal region, the fireworks should search within a small range around the good solution found.

What if FA has not found any better position in N consecutive steps?

Two cases of low diversity:
- the population of the swarm (excluding the core firework) is too small;
- the search range of the core firework is too small.

Dynamic minimal explosion amplitude strategy (illustrated with points A, B, C): starting from A, the algorithm finds a better solution at B, so it loads the smallest explosion amplitude; then, iteration by iteration, it cannot obtain any improvement, so a bigger explosion amplitude should be exploited until it reaches point C.
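The A-B-C behavior can be sketched as a one-line adaptation rule. The shrink/grow factors below are illustrative assumptions, not constants from the slides; only the direction of the rule (shrink on improvement, grow on stagnation) follows the strategy described above.

```python
def update_min_amplitude(amp, improved, a_init, a_final, shrink=0.5, grow=2.0):
    """Sketch of a dynamic minimal-amplitude rule: shrink the minimal
    explosion amplitude while the core firework keeps improving (exploit
    locally, as at point B), and enlarge it again after stagnation so
    the search can escape (explore, moving toward point C)."""
    if improved:
        amp = max(a_final, amp * shrink)   # exploit: search a small range
    else:
        amp = min(a_init, amp * grow)      # explore: widen the range again
    return amp
```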

FA with dynamic minimal explosion amplitude (flowchart):
1. Set N fireworks.
2. Minimal explosion amplitude checking.
3. Obtain the sparks.
4. Evaluate the sparks; select N fireworks.
5. If the termination criterion is not met, repeat from step 2.

Evaluation on 25 benchmark test functions.

Experimental results of FA-DEA on f1-f14.

Experimental results of FA-DEA on f15-f25.

Performance comparison on various dimensions.

4. Application Research
4.1 FA for NMF
4.2 FA for Document Clustering

4.1 FA for NMF
- NMF description
- Algorithms for NMF computing
- Experimental results

NMF description
Lee and Seung published a paper in Nature in 1999 on non-negative matrix factorization. Low-rank approximations are utilized in several content-based retrieval and data mining applications.

Figure: scheme of a very coarse NMF approximation with very low rank k. Although k is significantly smaller than m and n, the typical structure of the original data matrix is retained (note the three different groups of data objects in the left, middle, and right parts of A).

Mathematically, we consider the problem of finding a "good" (ideally the global) solution of an optimization problem with bound constraints.

The nonlinear optimization problem underlying NMF can generally be stated as: given a non-negative matrix A (m x n) and a rank k, find

    min over W, H of (1/2) ||A - W H||_F^2, subject to W >= 0 (m x k) and H >= 0 (k x n).

Figure: illustration of the optimization process for row l of the NMF factor W. The l-th row of A (a_l^r) and all columns of H0 are the input for the optimization algorithms. The output is a row vector w_l^r (the l-th row of W) which minimizes the norm of d_l^r, the l-th row of the distance matrix D. The norm of d_l^r is the fitness function for the optimization algorithms (a minimization problem).

General structure of NMF algorithms.

NMF initialization.

Iterative optimization.
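For reference, the "basic MU" baseline that the later runtime comparisons mention is the Lee-Seung multiplicative update. Below is a pure-Python toy sketch under the objective stated earlier (real implementations use numpy; the helper names `matmul`, `transpose`, `nmf_mu` are mine, not from the slides).

```python
import random

def matmul(A, B):
    """Naive matrix product of nested-list matrices."""
    return [[sum(a * b for a, b in zip(row, col)) for col in zip(*B)] for row in A]

def transpose(A):
    return [list(r) for r in zip(*A)]

def nmf_mu(A, k, iters=200, eps=1e-9, seed=0):
    """Lee-Seung multiplicative updates for min ||A - WH||_F^2 with
    W, H >= 0. Non-negativity is preserved automatically because every
    update multiplies by a non-negative ratio."""
    rng = random.Random(seed)
    m, n = len(A), len(A[0])
    W = [[rng.random() for _ in range(k)] for _ in range(m)]
    H = [[rng.random() for _ in range(n)] for _ in range(k)]
    for _ in range(iters):
        WT = transpose(W)
        num, den = matmul(WT, A), matmul(matmul(WT, W), H)
        H = [[H[i][j] * num[i][j] / (den[i][j] + eps) for j in range(n)]
             for i in range(k)]
        HT = transpose(H)
        num, den = matmul(A, HT), matmul(W, matmul(H, HT))
        W = [[W[i][j] * num[i][j] / (den[i][j] + eps) for j in range(k)]
             for i in range(m)]
    return W, H
```

Swarm-based approaches such as FA can replace either the random initialization of W and H or (per row/column) the iterative optimization step itself.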

Datasets for NMF computing by FA:
- DS-RAND is a randomly created, fully dense matrix, used to provide unbiased results. To evaluate the proposed methods in a classification context we further used two data sets from the area of classification (spam/phishing detection).
- Data set DS-SPAM1 consists of e-mail messages described by 133 features, divided into three groups: spam, phishing, and legitimate e-mail.
- Data set DS-SPAM2 is the spambase data set taken from (Kjellerstrand 2011), which consists of 1813 spam and 2788 non-spam messages.
DS-SPAM1 represents a ternary classification problem; DS-SPAM2 represents a typical binary classification problem.

Figure: left-hand side, average approximation error per row (after initializing the rows of W); right-hand side, average approximation error per column (after initializing H). NMF rank k = 5. Legends are ordered by approximation error (top = worst, bottom = best).

Figure: accuracy per iteration when updating only the rows of W (m = 2, c = 20). Left: k = 2; right: k = 5.

Experimental results of NMF. Figure: proportional runtimes for achieving the same accuracy as basic MU after 30 iterations, for different values of k, when updating only the rows of W (m = 2, c = 20).

4.2 FA for Document Clustering
- Document clustering description
- Dataset
- Experimental results

Document clustering description
Automatically group related documents into clusters, e.g. medical documents, legal documents, financial documents. If a collection is well clustered, it is much more efficient to search only the cluster that will contain the relevant documents.

Dataset: Newsgroups
- Predecessor: in the original Newsgroups collection, 4.5% of the documents belong to two or more newsgroups; the remaining documents belong to only one newsgroup.
- Modification: Jason Rennie (MIT) did some necessary processing on the collection so that each document belongs to only one newsgroup.
- Characteristics: all documents belong to 20 different newsgroups; the collection is widely used in document classification and clustering.

Experimental Result

Concluding Remarks
- By mimicking the explosion process of festival fireworks, the so-called FA is proposed and implemented for function optimization, with promising performance against Clonal PSO and Standard PSO.
- FA-DEA, which introduces an explosion amplitude control strategy, shows great advantages on multi-modal problems, achieving the optima 50% of the time on average.
- The use of the Fireworks Algorithm to solve practical engineering and scientific problems such as clustering and NMF has achieved great success compared with traditional methods.

Acknowledgment
This work was supported by the National Natural Science Foundation of China (NSFC), and partially supported by the 863 Program under Grant No. 2007AA01Z453. I greatly appreciate my Ph.D. students, Zhu Yuanchun and Zheng Shaoqiu, for their excellent contributions and extensive experiments. Many thanks go to my postdoc, Dr. Andreas Janecek from the University of Vienna, and my M.S. students, Yang Xiang and Chao Rui, for their significant application research on FA.

References
[1] Y. Tan and Y. Zhu (2010). Fireworks algorithm for optimization. ICSI 2010, Part I, Springer LNCS 6145.
[2] S.Q. Zheng and Y. Tan (2011). Fireworks algorithm with dynamic explosion amplitude. Submitted to IEEE CEC.
[3] A. Janecek and Y. Tan, "Using Population Based Algorithms for Initializing Nonnegative Matrix Factorization," Proc. of the Second International Conference on Swarm Intelligence (ICSI 2011), June 12-15, Chongqing, China, Part II, LNCS 6729, Springer-Verlag, 2011.
[4] A. Janecek and Y. Tan, "Iterative Improvement of the Multiplicative Update NMF Algorithm Using Nature-inspired Optimization," The 7th International Conference on Natural Computation (ICNC 2011), July 2011, Shanghai, China, Springer LNCS.
[5] A. Janecek and Y. Tan, "Swarm Intelligence for Non-negative Matrix Factorization," International Journal of Swarm Intelligence Research (IJSIR), November 2011 (in press).
[6] X. Yang, Using Fireworks Algorithm for Document Clustering, Bachelor Thesis, Peking University, 2011.

Thank you! Any questions?