Evolutionary multi-objective algorithm design issues
Karthik Sindhya, PhD
Postdoctoral Researcher, Industrial Optimization Group
Department of Mathematical Information Technology

Objectives
The objectives of this lecture are to:
– Address the design issues of evolutionary multi-objective optimization algorithms:
  - Fitness assignment
  - Diversity preservation
  - Elitism
– Explore ways to handle constraints

References
K. Deb. Multi-Objective Optimization using Evolutionary Algorithms. Wiley, Chichester, 2001.
E. Zitzler, M. Laumanns, S. Bleuler. A Tutorial on Evolutionary Multiobjective Optimization. In Metaheuristics for Multiobjective Optimisation, 3-38, Springer-Verlag, 2004.

Algorithm design issues
Approximating the Pareto front is itself a multi-objective task:
– Convergence: compute solutions as close as possible to the Pareto front, quickly.
– Diversity: maximize the diversity of the obtained Pareto solutions.
It is impossible to state in general:
– What a good approximation of a Pareto optimal front is.
– How close a given set is to the Pareto optimal front, since the front is usually unknown.

Fitness assignment
Unlike the single-objective case, multiple objectives exist, so fitness is not simply the objective value.
– Fitness assignment and selection go hand in hand.
Fitness assignment schemes can be classified into the following categories:
– Scalarization based, e.g. weighted sum, MOEA/D
– Objective based, e.g. VEGA
– Dominance based, e.g. NSGA-II

Fitness assignment
Scalarization based (aggregation based):
– Aggregate the objective functions f_1(x), f_2(x), …, f_k(x) into a single objective F.
– Vary the parameters (the weights w_i) in the single objective to generate multiple Pareto optimal solutions.
Examples: the weighted sum
F(x) = w_1 f_1(x) + w_2 f_2(x) + … + w_k f_k(x),
or an achievement scalarizing function with reference point z,
F(x) = max_i w_i (f_i(x) - z_i).
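The two scalarizations above can be sketched in a few lines of Python; this is a minimal illustration (function names are my own, not from any library), assuming all objectives are minimized:

```python
def weighted_sum(objectives, weights):
    # F(x) = w_1*f_1(x) + ... + w_k*f_k(x)
    return sum(w * f for w, f in zip(weights, objectives))

def achievement(objectives, weights, z):
    # Achievement scalarizing function: F(x) = max_i w_i * (f_i(x) - z_i)
    # where z is a reference point.
    return max(w * (f - zi) for w, f, zi in zip(weights, objectives, z))

# Varying the weights (or the reference point) yields different
# Pareto optimal solutions of the underlying multi-objective problem.
print(weighted_sum([2.0, 4.0], [0.5, 0.5]))             # 3.0
print(achievement([2.0, 4.0], [0.5, 0.5], [0.0, 0.0]))  # 2.0
```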

Fitness assignment
Advantages – weighted sum:
– Easy to understand and implement.
– Fitness assignment is computationally efficient.
– If the available time is short, it can be used to quickly provide a single Pareto optimal solution.
Disadvantages – weighted sum:
– Solutions on non-convex parts of the Pareto optimal front cannot be obtained, whatever weights are chosen.

Fitness assignment
Objective based:
– Switch between objectives in the selection phase: every time an individual is chosen for reproduction, a different objective decides.
– E.g. the vector evaluated genetic algorithm (VEGA) proposed by David Schaffer, the first implementation of an evolutionary multi-objective optimization algorithm. Subpopulations of the mating pool are created, and each subpopulation is selected using a different objective.
(Figure: population → selection by f_1, f_2, f_3 into the mating pool → reproduction → new population.)
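A minimal sketch of VEGA-style mating-pool construction (my own simplification: truncation selection per objective instead of the original proportional selection):

```python
import random

def vega_select(population, objective_fns, pool_size):
    """Fill the mating pool in k subpopulations, each selected by a
    different objective (minimization). The pool is shuffled so that
    reproduction mixes individuals from different subpopulations."""
    k = len(objective_fns)
    per_obj = pool_size // k  # remainder ignored in this sketch
    pool = []
    for f in objective_fns:
        ranked = sorted(population, key=f)  # best-by-this-objective first
        pool.extend(ranked[:per_obj])
    random.shuffle(pool)
    return pool

# Two objectives: minimize the first and the second coordinate.
pop = [(1.0, 5.0), (3.0, 3.0), (5.0, 1.0), (4.0, 4.0)]
pool = vega_select(pop, [lambda x: x[0], lambda x: x[1]], 4)
```

Note the behavior the slide criticizes below: the individual best of each objective, (1.0, 5.0) and (5.0, 1.0), is always selected.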

Fitness assignment
Advantages:
– Simple idea and easy to implement; a single-objective genetic algorithm can easily be extended to handle multi-objective optimization problems.
– Tends to produce solutions near the individual optimum of each objective; an advantage when this property is desirable.
Disadvantages:
– Each solution is evaluated with respect to only one objective, whereas in a multi-objective optimization algorithm all objectives matter for every solution.
– Individuals may get stuck at local optima of individual objectives.

Fitness assignment
Dominance based:
– Pareto dominance based fitness ranking, proposed by Goldberg in 1989. Different variants exist:
– Dominance rank: the number of individuals by which an individual is dominated. E.g. MOGA, SPEA2.
– Dominance depth: fitness is based on the front to which an individual belongs. E.g. NSGA-II.
– Dominance count: the number of individuals dominated by an individual. E.g. SPEA2.

(Figure: illustrations of dominance count, dominance rank, and dominance depth, with fronts labelled 1, 2, 3.)
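Two of the three dominance measures can be computed directly from the Pareto dominance relation; a minimal sketch (my own helper names, minimization assumed):

```python
def dominates(a, b):
    # a Pareto-dominates b: a is no worse in every objective
    # and strictly better in at least one.
    return (all(x <= y for x, y in zip(a, b))
            and any(x < y for x, y in zip(a, b)))

def dominance_rank(pop):
    # Number of individuals by which each individual is dominated (MOGA-style).
    return [sum(dominates(q, p) for q in pop) for p in pop]

def dominance_count(pop):
    # Number of individuals each individual dominates (as in SPEA2 strength).
    return [sum(dominates(p, q) for q in pop) for p in pop]

pop = [(1.0, 4.0), (2.0, 2.0), (3.0, 3.0)]
print(dominance_rank(pop))   # [0, 0, 1]: only (3,3) is dominated, by (2,2)
print(dominance_count(pop))  # [0, 1, 0]
```

Dominance depth (as in NSGA-II) would additionally require peeling off successive non-dominated fronts.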

Diversity preservation
The chance of an individual being selected:
– Increases when there is a low number of solutions in its neighborhood.
– Decreases when there is a high number of solutions in its neighborhood.
There are at least three types of density estimation:
– Kernel methods
– Nearest neighbor
– Histogram

Diversity preservation
Kernel methods:
– The density estimate is a sum of f(d) values, where f is a function of the distance d to each other individual.
– E.g. fitness sharing in NSGA.
Nearest neighbor:
– The crowding distance of individual i: the perimeter of the cuboid formed with its nearest neighbors i-1 and i+1 (in each objective) as vertices.
– E.g. NSGA-II.
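The nearest-neighbor estimate used by NSGA-II can be sketched as follows; a minimal crowding-distance implementation for one non-dominated front (my own code, not NSGA-II's reference implementation):

```python
def crowding_distance(front):
    """For each individual, sum over objectives the normalized gap between
    its two neighbors (i-1 and i+1) in the per-objective sorted order.
    Boundary individuals get infinity so they are always preferred."""
    n = len(front)
    k = len(front[0])
    dist = [0.0] * n
    for m in range(k):
        order = sorted(range(n), key=lambda i: front[i][m])
        dist[order[0]] = dist[order[-1]] = float('inf')
        span = front[order[-1]][m] - front[order[0]][m]
        if span == 0.0:
            continue  # all values equal in this objective
        for j in range(1, n - 1):
            gap = front[order[j + 1]][m] - front[order[j - 1]][m]
            dist[order[j]] += gap / span
    return dist

front = [(1.0, 5.0), (2.0, 3.0), (4.0, 1.0)]
print(crowding_distance(front))  # [inf, 2.0, inf]
```

A larger crowding distance means a sparser neighborhood, hence a higher chance of selection.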

Diversity preservation
Histogram:
– The density estimate is the number of elements in a hyperbox.
– E.g. PAES.

Elitism
Elitism is needed to preserve the promising solutions found so far. Two common schemes:
– No-archive strategy: the old population and the offspring are merged, and the best individuals form the new population.
– Archive strategy: the best solutions are kept in a separate archive; each generation the offspring update the archive, and the new population is formed from the old population, offspring, and archive.
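The no-archive scheme amounts to a (mu+lambda)-style survival step; a minimal sketch (my own function, single fitness value per individual, minimization assumed):

```python
def elitist_survival(old_pop, offspring, fitness, size):
    """Merge the old population with the offspring and keep the `size`
    best individuals, so a promising parent can never be lost to a
    worse child. Archive-based schemes instead maintain the best
    solutions found so far in a separate external archive."""
    merged = old_pop + offspring
    merged.sort(key=fitness)
    return merged[:size]

pop = [5.0, 3.0]
kids = [4.0, 1.0]
print(elitist_survival(pop, kids, lambda x: x, 2))  # [1.0, 3.0]
```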

Constraint handling
Penalty function approach:
– For every solution x_i, calculate the overall constraint violation OCV (the sum of all constraint violations).
– F_m(x_i) = f_m(x_i) + OCV, where f_m(x_i) is the m-th objective value of x_i and F_m(x_i) is the penalized m-th objective value; the OCV is added to each of the objective function values.
Use constraints as additional objectives:
– Usually used when the feasible search space is very narrow.
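The penalty step above can be sketched as follows; a minimal illustration (my own function name), assuming constraints in the standard form g_j(x) <= 0 and minimized objectives:

```python
def penalized_objectives(objectives, constraints):
    """Overall constraint violation (OCV) = sum of positive violations
    of g_j(x) <= 0; the same OCV is added to every objective value."""
    ocv = sum(max(0.0, g) for g in constraints)
    return [f + ocv for f in objectives]

# A feasible solution is unchanged; an infeasible one is penalized
# identically in every objective.
print(penalized_objectives([1.0, 2.0], [-0.5, 0.0]))    # [1.0, 2.0]
print(penalized_objectives([1.0, 2.0], [0.25, 0.25]))   # [1.5, 2.5]
```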

Constraint handling
Deb's constraint domination strategy:
– A solution x_i constraint-dominates a solution x_j if any of the following is true:
  - x_i is feasible and x_j is not.
  - x_i and x_j are both infeasible, but x_i has a smaller constraint violation.
  - x_i and x_j are both feasible and x_i dominates x_j.
– Advantages:
  - Penalty-parameter-less approach; easy to implement, and clearly distinguishes good solutions from bad ones.
  - Works even when the population contains only infeasible solutions.
– Disadvantages:
  - Maintaining diversity of solutions can be a problem: feasible solutions far from the optima are always preferred over slightly infeasible solutions near the optima.
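The three rules translate directly into a comparison function; a minimal sketch (my own naming, minimization assumed, `viol` = overall constraint violation, 0 meaning feasible):

```python
def constraint_dominates(a, b, viol_a, viol_b):
    """Deb's constraint domination: a beats b if
    (1) a is feasible and b is not, or
    (2) both are infeasible and a violates less, or
    (3) both are feasible and a Pareto-dominates b."""
    if viol_a == 0 and viol_b > 0:
        return True
    if viol_a > 0 and viol_b > 0:
        return viol_a < viol_b
    if viol_a == 0 and viol_b == 0:
        return (all(x <= y for x, y in zip(a, b))
                and any(x < y for x, y in zip(a, b)))
    return False  # a infeasible, b feasible

# A feasible solution beats an infeasible one regardless of objectives:
print(constraint_dominates((2.0, 2.0), (1.0, 1.0), 0.0, 0.4))  # True
# Between feasible solutions, ordinary Pareto dominance decides:
print(constraint_dominates((1.0, 2.0), (2.0, 3.0), 0.0, 0.0))  # True
```

Note that no penalty parameter appears anywhere, which is the "penalty-parameter-less" property claimed above.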