
Neural and Evolutionary Computing - Lecture 9
Evolutionary Multiobjective Optimization
- Particularities of multiobjective optimization
- Multiobjective optimization methods
- Evolutionary multiobjective optimization

Particularities
Multiobjective optimization = simultaneous optimization of several criteria.
Examples:
1. Find the parameters of a product which simultaneously maximize the reliability and minimize the cost.
2. Solve a routing problem in a communication network such that both the cost and the network congestion are minimized.

Particularities
Problem: f: R^n -> R^r, f(x) = (f_1(x), ..., f_r(x))
Find x* which satisfies:
(i) Inequality constraints: g_i(x*) >= 0, i = 1..p
(ii) Equality constraints: h_i(x*) = 0, i = 1..q
(iii) Optimize (maximize or minimize) each criterion
Remark: the criteria can be conflicting (e.g. the quality and the price of a product: a higher quality usually implies a higher price).

Particularities
Example (r = 2): f_1(x) = x^2, f_2(x) = (x-2)^2 for x in [-2, 4]
Problem: minimize both functions.
Remark: there is no x* which minimizes both functions simultaneously.
Idea: find compromise solutions, e.g. any x in [0, 2]: for such an x there is no x' such that f_1(x') < f_1(x) and f_2(x') < f_2(x).
Such a compromise solution is called a Pareto solution.

Basic notions in Pareto optimization
1. Domination: y dominates y' (for a minimization problem) if y_i <= y'_i for each i and the inequality is strict for at least one component.
[Figure: two plots in the (f_1, f_2) plane; in the first, y dominates y'; in the second, y does not dominate y' and y' does not dominate y.]
The domination relation is a partial order relation.
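The domination test can be written directly from this definition; a minimal Python sketch (for minimization):

```python
def dominates(y, y2):
    """True if y dominates y2 (minimization): y is no worse in every
    component and strictly better in at least one."""
    return (all(a <= b for a, b in zip(y, y2))
            and any(a < b for a, b in zip(y, y2)))

# The relation is only a partial order: some pairs are incomparable.
print(dominates((1, 2), (2, 3)))  # (1, 2) dominates (2, 3)
print(dominates((1, 3), (3, 1)))  # incomparable: neither dominates
```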

Basic notions in Pareto optimization
2. Non-dominated element (with respect to a set): y is non-dominated with respect to V if there is no element in V which dominates y.
[Figure: points in the (f_1, f_2) plane; the red elements are globally non-dominated, while the green elements are non-dominated with respect to the set of green and blue elements.]
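Extracting the non-dominated subset of a finite set V is then a simple filter; a minimal self-contained sketch:

```python
def dominates(y, y2):
    """True if y dominates y2 (minimization)."""
    return (all(a <= b for a, b in zip(y, y2))
            and any(a < b for a, b in zip(y, y2)))

def nondominated(V):
    """Elements of V that no other element of V dominates."""
    return [y for y in V if not any(dominates(z, y) for z in V)]

front = nondominated([(1, 4), (2, 2), (4, 1), (3, 3), (2, 4)])
print(front)  # [(1, 4), (2, 2), (4, 1)]
```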

Basic notions in Pareto optimization
3. Pareto optimal solution: an element x is called a Pareto optimal solution if there is no element x' such that f(x') dominates f(x).
The set of all Pareto optimal elements of a multiobjective optimization problem is called the Pareto optimal set.
For the previous example the Pareto optimal set is [0, 2].

Basic notions in Pareto optimization
4. Pareto front: the set of values of the objective functions for all elements of the Pareto optimal set is called the Pareto front.
[Figure: the Pareto front of the example, in the (f_1, f_2) plane.]
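For the earlier example, f_1(x) = x^2 and f_2(x) = (x-2)^2 on [-2, 4], the Pareto optimal set and front can be approximated numerically by sampling x on a grid and keeping only the non-dominated objective vectors; a minimal sketch:

```python
def dominates(y, y2):
    """True if y dominates y2 (minimization)."""
    return (all(a <= b for a, b in zip(y, y2))
            and any(a < b for a, b in zip(y, y2)))

# Sample x on a grid over [-2, 4] and evaluate both objectives.
xs = [-2 + 6 * i / 120 for i in range(121)]
points = [(x, (x ** 2, (x - 2) ** 2)) for x in xs]

# Keep each x whose objective vector no other sample dominates.
pareto = [(x, f) for x, f in points
          if not any(dominates(g, f) for _, g in points)]

# The surviving x values approximate the Pareto optimal set [0, 2].
print(min(x for x, _ in pareto), max(x for x, _ in pareto))
```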

Solving methods
1. Transform the multicriterial problem into a one-criterial problem: combine all criteria into a single criterion.
- Aggregation method (e.g. a weighted sum of the criteria, with a weight vector w)
Advantages: the problem to be solved is easier.
Disadvantages:
- For a given set of weights w one obtains just one solution
- The user has to specify appropriate values for w
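As an illustration of the aggregation idea, a weighted-sum scalarization of the earlier example; note how each choice of the weight vector w yields a single compromise solution:

```python
def f1(x): return x ** 2
def f2(x): return (x - 2) ** 2

def aggregated(x, w1, w2):
    """Single criterion obtained as a weighted sum of the objectives."""
    return w1 * f1(x) + w2 * f2(x)

# Minimize the aggregated criterion over a grid on [-2, 4].
xs = [-2 + 6 * i / 600 for i in range(601)]
for w1, w2 in [(0.5, 0.5), (0.9, 0.1)]:
    best = min(xs, key=lambda x: aggregated(x, w1, w2))
    print(w1, w2, round(best, 2))   # different weights, different solutions
```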

Solving methods
1. Transform the multicriterial problem into a one-criterial problem: combine all criteria into a single criterion.
- Goal attainment method: minimize the deviation of the criteria from user-specified goal values
Advantages: the problem to be solved is easier.
Disadvantages: the goal values should be known.
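A common goal attainment style scalarization minimizes the largest weighted deviation of the criteria from the goal values; a sketch on the earlier example (the goal values and weights below are illustrative assumptions, not taken from the lecture):

```python
def attainment(x, goals, w):
    """Largest weighted deviation of the objectives from the goals."""
    fs = (x ** 2, (x - 2) ** 2)
    return max((f - g) / wi for f, g, wi in zip(fs, goals, w))

# Minimize the attainment criterion over a grid on [-2, 4].
xs = [-2 + 6 * i / 600 for i in range(601)]
best = min(xs, key=lambda x: attainment(x, goals=(0.0, 0.0), w=(1.0, 1.0)))
print(round(best, 2))   # with equal weights and zero goals: x = 1.0
```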

Solving methods
2. Approximate the Pareto optimal set by using a population of candidates.
- Apply an EA in order to generate elements approaching the Pareto optimal set (and the corresponding Pareto front)
- The Pareto front approximation should satisfy at least two properties:
  - Closeness to the real Pareto front
  - Uniform spread of points

Pareto Evolutionary Optimization
There are several techniques which can be used in order to obtain Pareto fronts satisfying the two previous conditions:
- Use a selection process based on the non-dominance relation
- Use a crowding factor to favor elements in less crowded regions
- Change the fitness function by using a sharing mechanism
- Restrict mating
- Ensure elitism (e.g. use an archive)

Pareto Evolutionary Optimization
Selection criteria:
- Use a nondomination rank (e.g. NSGA - Nondominated Sorting GA)
- The population is organized in nondomination layers:
  - The first layer consists of the nondominated elements
  - The nondominated elements of the set obtained by removing the first layer form the second layer, etc.
[Figure: a bi-objective population split into the first layer (rank = 1), the second layer (rank = 2) and the third layer (rank = 3).]
- An element is better if its nondomination rank is smaller
- During the selection process the parent population is joined with the offspring population and the resulting set is sorted by increasing rank.
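The layer construction (peel off the nondominated elements, then repeat on the remaining set) can be sketched as:

```python
def dominates(y, y2):
    """True if y dominates y2 (minimization)."""
    return (all(a <= b for a, b in zip(y, y2))
            and any(a < b for a, b in zip(y, y2)))

def nondomination_layers(points):
    """Split objective vectors into layers: the first layer holds the
    nondominated points, the next layer the nondominated points of the
    remaining set, and so on."""
    layers, rest = [], list(points)
    while rest:
        layer = [p for p in rest if not any(dominates(q, p) for q in rest)]
        layers.append(layer)
        rest = [p for p in rest if p not in layer]
    return layers

layers = nondomination_layers([(1, 4), (2, 2), (4, 1), (3, 3), (4, 4)])
print(layers)  # [[(1, 4), (2, 2), (4, 1)], [(3, 3)], [(4, 4)]]
```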

Pareto Evolutionary Optimization
Selection criteria:
- The fitness value of an element depends on:
  - The number of elements which it dominates (direct relationship)
  - The number of elements which dominate it (inverse relationship)
Example: SPEA - Strength Pareto EA
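A simplified SPEA2-style sketch of such a fitness: each element's strength counts the elements it dominates, and the raw fitness of an element sums the strengths of its dominators, so nondominated elements get the best (lowest) value:

```python
def dominates(y, y2):
    """True if y dominates y2 (minimization)."""
    return (all(a <= b for a, b in zip(y, y2))
            and any(a < b for a, b in zip(y, y2)))

def spea_fitness(points):
    """Strength = number of dominated elements; raw fitness of p =
    sum of the strengths of the elements dominating p (lower is better)."""
    strength = [sum(dominates(p, q) for q in points) for p in points]
    return [sum(s for q, s in zip(points, strength) if dominates(q, p))
            for p in points]

fit = spea_fitness([(1, 4), (2, 2), (3, 3), (4, 4)])
print(fit)  # [0, 0, 2, 4] -- the nondominated elements get fitness 0
```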

Pareto Evolutionary Optimization
Using a crowding factor:
- Aim: stimulate the diversity of the Pareto front
- Idea: between two elements belonging to the same layer, the one located in the less crowded region (i.e. having a larger crowding factor) is preferred
- The crowding factor of an element is computed from the distances between that element and its closest neighbours.
[Figure: an element with distances a and b to its two closest neighbours; the value of its crowding factor is (a+b)/2.]
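Following the (a+b)/2 definition, a simplified sketch for a bi-objective layer sorted along the first objective; giving the extreme elements an infinite factor so they are always kept is a common convention assumed here, not stated in the lecture:

```python
import math

def crowding_factors(layer):
    """Crowding factor = average distance to the two neighbours along
    the front; the extreme elements get +inf so they are never discarded."""
    pts = sorted(layer)                      # sort along the first objective
    factors = {pts[0]: math.inf, pts[-1]: math.inf}
    for prev, cur, nxt in zip(pts, pts[1:], pts[2:]):
        factors[cur] = (math.dist(prev, cur) + math.dist(cur, nxt)) / 2
    return factors

layer = [(0, 4), (1, 2), (2, 1), (4, 0)]
print(crowding_factors(layer))
```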

Pareto Evolutionary Optimization
Sharing mechanism:
- Idea: if a group of individuals shares the same resource, their survival chance increases with the size of the resource and decreases with the size of the group
- The fitness of an element is adjusted by dividing its value by a quantity (based on a so-called sharing function) which depends on the distances between the elements of the group.
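A minimal sketch with a triangular sharing function, sh(d) = 1 - d/σ_s for d < σ_s and 0 otherwise; the population, fitness values and niche radius below are illustrative assumptions:

```python
import math

def shared_fitness(population, fitness, sigma_s):
    """Divide each fitness value by the niche count sum_j sh(d(i, j)),
    with sh(d) = 1 - d/sigma_s for d < sigma_s and 0 otherwise."""
    def sh(d):
        return max(0.0, 1.0 - d / sigma_s)
    out = []
    for i, x in enumerate(population):
        niche = sum(sh(math.dist(x, y)) for y in population)
        out.append(fitness[i] / niche)   # niche >= 1 since sh(0) = 1
    return out

pop = [(0.0, 0.0), (0.1, 0.0), (5.0, 5.0)]
# The two crowded elements are penalized; the isolated one is not.
print(shared_fitness(pop, fitness=[1.0, 1.0, 1.0], sigma_s=1.0))
```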

Pareto Evolutionary Optimization
Sharing mechanism:
- Allows differentiating between elements which are mutually non-dominated
- The main disadvantage is the need to specify the niche radius (σ_s)
[Figure: the effect of different niche radii: σ_s = 1, σ_s = 2, σ_s = 3.]
- It is also beneficial in the case of multimodal optimization (when the goal is to approximate several optima, or even all local optima).

Pareto Evolutionary Optimization
Restricted mating:
- Idea: crossover is accepted only for elements which satisfy some restrictions
- Goal: avoid the generation of low-fitness offspring.
Examples:
1. Accept as parents only elements which are similar enough
2. Accept as parents only nondominated elements
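The two example restrictions can be expressed as simple predicates over a candidate parent pair; the distance threshold below is an illustrative assumption:

```python
import math

def dominates(y, y2):
    """True if y dominates y2 (minimization)."""
    return (all(a <= b for a, b in zip(y, y2))
            and any(a < b for a, b in zip(y, y2)))

def similar_enough(x1, x2, threshold=1.0):
    """Restriction 1: mate only elements closer than a threshold."""
    return math.dist(x1, x2) < threshold

def both_nondominated(f1, f2, objectives):
    """Restriction 2: mate only elements nondominated in the population."""
    return (not any(dominates(g, f1) for g in objectives)
            and not any(dominates(g, f2) for g in objectives))

objs = [(1, 4), (2, 2), (3, 3)]
print(similar_enough((0.0, 0.0), (0.5, 0.5)))    # within the radius
print(both_nondominated((1, 4), (2, 2), objs))   # both on the first layer
```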

Pareto Evolutionary Optimization
Archive:
- Aim: elitism (preserve the nondominated elements discovered during the evolution)
- The archive will be the approximation of the Pareto optimal set
- Disadvantage: a mechanism to maintain the archive is necessary:
  - A new element is accepted into the archive only if it is nondominated
  - If the new element dominates some elements of the archive, then these elements are eliminated
  - In order to limit the growth of the archive, it should be periodically reorganized (e.g. by clustering or by eliminating some of the elements which are too similar to other elements)
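These maintenance rules can be sketched as a small update function; the truncation step (remove one element of the most similar pair) is one simple option among those mentioned:

```python
import math

def dominates(y, y2):
    """True if y dominates y2 (minimization)."""
    return (all(a <= b for a, b in zip(y, y2))
            and any(a < b for a, b in zip(y, y2)))

def update_archive(archive, new, max_size):
    """Insert `new` if nondominated; drop archive members it dominates;
    if the archive overflows, remove one element of the closest pair."""
    if any(dominates(a, new) for a in archive):
        return archive                       # rejected: new is dominated
    archive = [a for a in archive if not dominates(new, a)] + [new]
    if len(archive) > max_size:
        # Truncation: drop one element of the most similar (closest) pair.
        i, j = min(((i, j) for i in range(len(archive))
                    for j in range(i + 1, len(archive))),
                   key=lambda ij: math.dist(archive[ij[0]], archive[ij[1]]))
        archive.pop(j)
    return archive

arch = [(1, 4), (2, 2)]
arch = update_archive(arch, (4, 1), max_size=3)  # accepted
arch = update_archive(arch, (3, 3), max_size=3)  # rejected: (2, 2) dominates it
arch = update_archive(arch, (1, 3), max_size=3)  # accepted; it removes (1, 4)
print(arch)  # [(2, 2), (4, 1), (1, 3)]
```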

Pareto Evolutionary Optimization
Using an archive:
[Diagram: the archive and the population are evaluated; selection and generation of new elements produce the new population; archive update followed by archive truncation produces the new archive.]

Pareto Evolutionary Optimization
Bibliographical resource:
(2940 references – September 2007)
(3519 references – November 2008)
(4388 references – October 2009)
(4861 references – February 2010)