Feature-Based Diversity Optimization


Feature-Based Diversity Optimization Wanru Gao^, Samadhi Nallaperuma* and Frank Neumann^ ^University of Adelaide, *University of Sheffield Presentation at Dagstuhl seminar, 11 October 2016

Introduction Understanding the behavior of heuristic search methods is a challenge, and this understanding is essential for performance prediction. We present a general framework to construct a diverse set of instances that are hard or easy for a given search heuristic and problem. The framework characterizes algorithms and their performance for a given problem based on features of problem instances, and provides a tool for bridging the gap between purely experimental investigations and mathematical methods for analysing the performance of search algorithms.

Feature-Based Diversity Optimization Let I_1, ..., I_k be the elements of the population P, and let f(I_i) be their feature values with f(I_i) ∈ [0, R] and f(I_1) ≤ f(I_2) ≤ ... ≤ f(I_k). The diversity contribution of an instance I to a population of instances P is d(I, P) = c(I, P), where c(I, P) is a contribution based on the other individuals in the population. Let I_i be an individual for which f(I_i) ≠ f(I_1) and f(I_i) ≠ f(I_k). Then c(I_i, P) = (f(I_i) − f(I_{i−1})) · (f(I_{i+1}) − f(I_i)). If f(I_i) = f(I_1) or f(I_i) = f(I_k), we set c(I_i, P) = R² if there is no other individual I ≠ I_i in P with f(I) = f(I_i), and c(I_i, P) = 0 otherwise.
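The contribution c(I_i, P) above can be sketched in Python (a minimal illustration, not the authors' code; `feature_values` and `R` are assumed inputs):

```python
def diversity_contributions(feature_values, R):
    """Diversity contribution c(I_i, P) for each instance, given its
    feature value f(I_i) in [0, R]. Returns contributions in sorted order."""
    f = sorted(feature_values)
    k = len(f)
    contrib = []
    for i in range(k):
        if f[i] == f[0] or f[i] == f[k - 1]:
            # Extreme value: R^2 if no other individual in the population
            # shares this feature value, 0 otherwise.
            contrib.append(R ** 2 if f.count(f[i]) == 1 else 0.0)
        else:
            # Interior value: gap to the left neighbour times gap to the right.
            contrib.append((f[i] - f[i - 1]) * (f[i + 1] - f[i]))
    return contrib
```

Instances lying in sparse regions of the feature range get large contributions, which is what drives the population toward feature diversity.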

Experimental Setup

Experimental Setup A (μ+λ)-EA with μ = 30 and λ = 5. We consider the TSP and the 2-OPT algorithm as a case study. Hardness thresholds: α_easy = 1 for instances of size 25 and 50, and α_easy = 1.03 for instances of size 100; α_hard = 1.15, 1.18, 1.2 for instances of size n = 25, 50, 100, respectively. For n = 25, 50, 100 and each of the 7 features, easy and hard instances are generated. Normal mutation with σ = 0.025 with probability 0.9 and σ = 0.05 with probability 0.1 in a mutation step.
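The mutation step described above can be sketched as follows (an illustrative sketch only, assuming each TSP instance is a list of city coordinates in [0, 1]²; clipping coordinates back into the unit square is one plausible handling not specified on the slide):

```python
import random

def mutate(instance, p_small=0.9, sigma_small=0.025, sigma_large=0.05):
    """One mutation step: perturb every coordinate with Gaussian noise,
    using sigma = 0.025 with probability 0.9 and sigma = 0.05 otherwise."""
    sigma = sigma_small if random.random() < p_small else sigma_large
    return [(min(1.0, max(0.0, x + random.gauss(0.0, sigma))),
             min(1.0, max(0.0, y + random.gauss(0.0, sigma))))
            for (x, y) in instance]
```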

Results The boxplots show two example feature values for populations of 100 different hard or easy TSP instances with different numbers of cities, without (a) or with (b) the diversity mechanism.

Results Some pairs of features provide a good classification between hard and easy instances.

Results Other pairs of features do not provide a clear separation between hard and easy instances.

Results Some combinations of three features provide a good classification between hard and easy instances.

Classification based on multiple feature combinations The weighted population diversity for a given set of features {f1, f2, ..., fk} is defined as the weighted sum of the normalised population diversities over these k features.
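The weighted population diversity can be sketched as (a minimal illustration; `diversities` is assumed to hold the normalised population diversity per feature, already scaled to [0, 1], with weight vectors such as those listed on a later slide):

```python
def weighted_population_diversity(diversities, weights):
    """Weighted sum of the normalised per-feature population diversities
    for a feature set {f1, ..., fk}."""
    assert len(diversities) == len(weights)
    return sum(w * d for w, d in zip(weights, diversities))
```

For example, with weights {1, 1, 2} the third feature's diversity counts twice as much as the others.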

Results of multiple feature combinations

Experimental setup – multiple feature combinations μ = 30 and λ = 5. For the instance sizes 25, 50 and 100, some of the good three-feature combinations are chosen for exploration. The weight distributions for {f1, f2, f3} are {1, 1, 1}, {2, 1, 1}, {1, 2, 1}, {1, 1, 2}, {2, 2, 1}, {2, 1, 2}, {1, 2, 2}. The same hardness thresholds are used as in the previous experiments.

Results of multiple feature combinations

Summary A new methodology for evolving easy/hard instances that are diverse with respect to feature sets of the optimization problem at hand. It covers a much wider range of the feature space than previous methods, provides instances which are diverse with respect to the investigated features, and allows evaluation of diversity over single or multiple feature values. The large set of diverse instances can be classified quite well into easy and hard instances when a suitable combination of three features is considered.

Acknowledgements European Union Seventh Framework Programme (FP7/2007-2013) under grant agreement no. 618091 (SAGE); Australian Research Council under grant agreement DP140103400.

References O. Mersmann, B. Bischl, H. Trautmann, M. Wagner, J. Bossek, and F. Neumann. A novel feature-based approach to characterize algorithm performance for the traveling salesperson problem. Annals of Mathematics and Artificial Intelligence, 69(2):151–182, 2013. K. Smith-Miles and L. Lopes. Measuring instance difficulty for combinatorial optimization problems. Computers & OR, 39(5):875–889, 2012. K. Smith-Miles, J. van Hemert, and X. Y. Lim. Understanding TSP difficulty by learning from evolved instances. In 4th International Conference on Learning and Intelligent Optimization (LION), LION'10, pages 266–280. Springer, 2010. J. I. van Hemert. Evolving combinatorial problem instances that are difficult to solve. Evolutionary Computation, 14(4):433–462, Dec. 2006.

Thank You!