Constraint Optimization We are interested in the general non-linear programming problem: find x which optimizes f(x) subject to gi(x) ≤ 0, i = 1, …, n, and hj(x) = 0, j = 1, …, p.

How to solve: penalty functions, special representations and operators, repair algorithms, separation of objectives and constraints, and hybrid methods.

Penalty functions The most common approach to handling constraints (particularly inequality constraints) is to use penalties. The basic idea is to transform a constrained optimization problem into an unconstrained one by adding or subtracting a certain value to/from the objective function, based on the amount of constraint violation present in a given solution.

General formulation of the penalty function The general formulation of the penalty function is φ(x) = f(x) ± (Σ ri·Gi + Σ cj·Lj), where φ(x) is the new objective function to be optimized, Gi and Lj are functions of the constraints gi(x) and hj(x), respectively, and ri and cj are positive constants normally called “penalty factors”. The most common forms of Gi and Lj are Gi = max(0, gi(x))^β and Lj = |hj(x)|^γ, where β and γ are normally 1 or 2.
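
A minimal Python sketch of this formulation for the minimization case, assuming exterior penalties are simply added to f(x); the names f, g_list, h_list, r and c are hypothetical placeholders, not from the slides.

```python
# Sketch of the general exterior penalty (minimization case), assuming
# G_i = max(0, g_i(x))^beta and L_j = |h_j(x)|^gamma as above.
def penalized_objective(x, f, g_list, h_list, r, c, beta=2, gamma=2):
    """f: objective; g_list: inequality constraints g_i(x) <= 0;
    h_list: equality constraints h_j(x) = 0; r, c: penalty factors."""
    ineq_pen = sum(ri * max(0.0, gi(x)) ** beta for ri, gi in zip(r, g_list))
    eq_pen = sum(cj * abs(hj(x)) ** gamma for cj, hj in zip(c, h_list))
    return f(x) + ineq_pen + eq_pen   # '+' because we minimize

# Hypothetical usage: minimize (x1-2)^2 + (x2-1)^2 subject to x1 + x2 - 2 <= 0.
f = lambda x: (x[0] - 2) ** 2 + (x[1] - 1) ** 2
g_list = [lambda x: x[0] + x[1] - 2]
print(penalized_objective([1.5, 1.0], f, g_list, [], r=[10.0], c=[]))
```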

Minimum penalty rule The penalty should be kept as low as possible, just above the limit below which infeasible solutions become optimal. If the penalty is too high and the optimum lies at the boundary of the feasible region, the GA will be pushed inside the feasible region very quickly and will not be able to move back towards the boundary with the infeasible region; if there are several disjoint feasible regions in the search space, the GA will tend to stay in one of them and will not be able to move to a different feasible region. If the penalty is too low, a lot of the search time will be spent exploring the infeasible region, because the penalty will be negligible with respect to the objective function.

Static penalties The penalty factors remain constant during the entire evolutionary process: Fitness(x) = f(x) ± Σi Ri·max[0, gi(x)]²

Dynamic penalties The current generation number t is involved in the computation of the corresponding penalty factors: Fitness(x) = f(x) ± Σi (Ri·t)^α·max[0, gi(x)]²
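
A small sketch of how the generation counter can enter the penalty term, written for minimization with the (Ri·t)^α form above; g_list and R are hypothetical names.

```python
# Sketch of a dynamic penalty: the penalty pressure grows with the
# generation counter t, so early generations tolerate infeasibility
# and later ones do not.
def dynamic_fitness(x, t, f, g_list, R, alpha=2):
    penalty = sum((Ri * t) ** alpha * max(0.0, gi(x)) ** 2
                  for Ri, gi in zip(R, g_list))
    return f(x) + penalty
```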

Annealing penalties Penalty factors are changed only once every several generations (after the algorithm has been trapped in a local optimum): Fitness(x) = f(x) ± Σi max[0, gi(x)]²/τ, where τ is the cooling schedule (temperature).

Adaptive penalties Use a penalty function that takes feedback from the search process: Fitness(x) = f(x) ± λ(t)·Σi max[0, gi(x)]², where λ(t) is updated at every generation t in the following way. If the best individual in the last k generations was always feasible: λ(t+1) = λ(t)/β1, with β1 > 1. If the best individual in the last k generations was never feasible: λ(t+1) = β2·λ(t), with β2 > 1. Otherwise (there are some feasible and some infeasible individuals tied as best in the population), the penalty factor does not change.
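
The update rule can be sketched directly; beta1, beta2 and k are hypothetical parameter names.

```python
# Sketch of the adaptive update of the penalty factor lambda(t): relax it
# when the recent best individuals were all feasible, tighten it when they
# were all infeasible, and leave it unchanged otherwise.
def update_lambda(lam, best_was_feasible, beta1=2.0, beta2=2.0, k=10):
    recent = best_was_feasible[-k:]      # feasibility flags of the best individual
    if all(recent):                      # always feasible -> relax the penalty
        return lam / beta1
    if not any(recent):                  # never feasible -> tighten the penalty
        return lam * beta2
    return lam                           # mixed case -> unchanged
```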

Segregated GA Use two penalty parameters instead of one; the two values aim at achieving a balance between heavy and moderate penalties. A population of size 2m is generated and divided into two groups, each group using one of the two penalty parameters. The m best individuals from these two groups become parents for the next generation, and these m parents produce m offspring. The resulting 2m individuals are combined to form the new generation.
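
One way to read the scheme in code, for minimization; fitness_light, fitness_heavy and reproduce are hypothetical callbacks (the two penalized fitness functions and the variation step).

```python
# Sketch of one generation of a segregated GA: the 2m individuals are split
# into two groups, each evaluated with its own penalty factor, and the m
# best overall become the parents of m offspring.
def segregated_generation(population, fitness_light, fitness_heavy, reproduce, m):
    group_a, group_b = population[:m], population[m:]
    scored = ([(fitness_light(x), x) for x in group_a] +
              [(fitness_heavy(x), x) for x in group_b])
    scored.sort(key=lambda pair: pair[0])        # minimization
    parents = [x for _, x in scored[:m]]         # m best individuals
    offspring = reproduce(parents, m)            # m offspring from m parents
    return parents + offspring                   # new population of size 2m
```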

Death penalty The rejection of infeasible individuals is probably the easiest way to handle constraints.
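
A minimal sketch of the death penalty as rejection sampling; random_candidate and is_feasible are hypothetical helpers.

```python
# Sketch of the death penalty: infeasible candidates are simply rejected
# and regenerated until a feasible one is found.
def sample_feasible(random_candidate, is_feasible, max_tries=10000):
    for _ in range(max_tries):
        x = random_candidate()
        if is_feasible(x):
            return x
    raise RuntimeError("no feasible candidate found; the feasible region may be very small")
```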

Special representations and operators Used to solve a certain problem for which a generic representation scheme might not be appropriate, and to simplify the shape of the search space. It can be very difficult to locate even a single feasible solution.

Linear constraints Eliminate the equality constraints together with an equal number of problem variables. The resulting search space is a convex set. Crossover: take linear combinations of individuals. Mutation: re-compute the feasible domain of the mutated variable.

Linear constraints: example Optimize f(x1, x2, x3, x4, x5, x6) subject to 2x1 + x2 + x3 = 6, x3 + x5 − 3x6 = 10, x1 + 4x4 = 3, x2 + x5 ≤ 120, −40 ≤ x1 ≤ 20, 50 ≤ x2 ≤ 75, 0 ≤ x3 ≤ 10, 5 ≤ x4 ≤ 15, 0 ≤ x5 ≤ …, … ≤ x6 ≤ 5

Eliminate equality constraints x1 = 3 − 4x4, x2 = −10 + 8x4 + x5 − 3x6, x3 = 10 − x5 + 3x6

New problem Optimize g(x4, x5, x6) = f(3 − 4x4, −10 + 8x4 + x5 − 3x6, 10 − x5 + 3x6, x4, x5, x6) subject to −10 + 8x4 + 2x5 − 3x6 ≤ 120 (from x2 + x5 ≤ 120), −40 ≤ 3 − 4x4 ≤ 20 (from −40 ≤ x1 ≤ 20), 50 ≤ −10 + 8x4 + x5 − 3x6 ≤ 75 (from 50 ≤ x2 ≤ 75), 0 ≤ 10 − x5 + 3x6 ≤ 10 (from 0 ≤ x3 ≤ 10), 5 ≤ x4 ≤ 15, 0 ≤ x5 ≤ …, … ≤ x6 ≤ 5. Combining the 2nd and 5th constraints: 5 ≤ x4 ≤ 10.75

Mutation If (x4, x5, x6) = (10, 8, 2), then, holding the other two variables fixed in each case, x4 ∈ [7.25, 10.375], x5 ∈ [6, 11], x6 ∈ [1, 2.666]
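
The mutation interval of a single variable is obtained by intersecting the intervals implied by each remaining linear constraint with the other variables held fixed. The sketch below transcribes the reduced constraint set from the previous slide and reproduces the intervals above; the two domain bounds left blank on the slide are treated as unbounded.

```python
import math

# Each constraint is (coeffs, const, lo, hi), meaning lo <= coeffs.x + const <= hi,
# for x = (x4, x5, x6) after eliminating x1, x2, x3.
CONSTRAINTS = [
    ([8, 2, -3], -10, -math.inf, 120),   # x2 + x5 <= 120
    ([-4, 0, 0],   3, -40,        20),   # -40 <= x1 <= 20
    ([8, 1, -3], -10,  50,        75),   # 50 <= x2 <= 75
    ([0, -1, 3],  10,   0,        10),   # 0 <= x3 <= 10
    ([1, 0, 0],    0,   5,        15),   # 5 <= x4 <= 15
    ([0, 1, 0],    0,   0,  math.inf),   # 0 <= x5 (upper bound not given above)
    ([0, 0, 1],    0, -math.inf,   5),   # x6 <= 5 (lower bound not given above)
]

def mutation_interval(x, j, constraints=CONSTRAINTS):
    """Feasible interval for variable j when the other variables stay fixed."""
    lo, hi = -math.inf, math.inf
    for coeffs, const, c_lo, c_hi in constraints:
        a_j = coeffs[j]
        if a_j == 0:
            continue                     # constraint does not involve x_j
        rest = const + sum(a * v for k, (a, v) in enumerate(zip(coeffs, x)) if k != j)
        bounds = sorted(((c_lo - rest) / a_j, (c_hi - rest) / a_j))
        lo, hi = max(lo, bounds[0]), min(hi, bounds[1])
    return lo, hi

print(mutation_interval((10, 8, 2), 0))  # ~(7.25, 10.375) for x4
print(mutation_interval((10, 8, 2), 1))  # ~(6.0, 11.0)    for x5
print(mutation_interval((10, 8, 2), 2))  # ~(1.0, 2.666..) for x6
```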

Heuristic crossover Suppose f(x2) is better than f(x1). The offspring is x3 = r·(x2 − x1) + x2 with r ∈ [0, 1]. Check whether x3 satisfies all the constraints.
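
A sketch of this crossover, assuming x2 is the better parent; is_feasible is a hypothetical predicate that checks all constraints.

```python
import random

# Sketch of heuristic crossover: the offspring is pushed beyond the better
# parent x2, in the direction away from the worse parent x1, and is kept
# only if it satisfies the constraints.
def heuristic_crossover(x1, x2, is_feasible, max_retries=10):
    for _ in range(max_retries):
        r = random.random()                           # r in [0, 1]
        child = [r * (b - a) + b for a, b in zip(x1, x2)]
        if is_feasible(child):
            return child
    return None                                       # give up after a few tries
```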

Decoder A chromosome “gives instructions” on how to build a feasible solution. For each feasible solution there must be a chromosome that decodes to it, and each chromosome must decode to a feasible solution. The transformation should be computationally fast and should have the locality feature, in the sense that small changes in the chromosome result in small changes in the solution itself.

TSP and ordinal expression The i-th gene takes a value in the range [1, n − i + 1]; it determines which city is chosen next from the (shrinking) city list.

Ordinal expression City list ( ) Chromosome ( ) Then the path is ( )

Ordinal expression City list ( ) Chromosome ( ) Then the path is (1) City list ( ) Chromosome ( ) Then the path is (1 2) City list ( ) Chromosome ( ) Then the path is (1 2 4) City list ( ) Chromosome ( ) Then the path is ( ) City list ( ) Chromosome ( ) Then the path is ( ) …...
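
A sketch of the decoding loop, written directly from the rule above; since the slide's concrete city list and chromosome are not reproduced here, the 9-city chromosome below is a purely hypothetical illustration.

```python
# Sketch of decoding an ordinal-expression chromosome into a tour: the i-th
# gene (1-based, in [1, n-i+1]) picks a position in the shrinking city list.
def decode_ordinal(chromosome, n_cities):
    city_list = list(range(1, n_cities + 1))
    path = []
    for gene in chromosome:
        path.append(city_list.pop(gene - 1))      # remove the chosen city
    return path

def encode_ordinal(path, n_cities):
    """Inverse transformation, handy for checking the decoder."""
    city_list = list(range(1, n_cities + 1))
    chromosome = []
    for city in path:
        idx = city_list.index(city)
        chromosome.append(idx + 1)
        city_list.pop(idx)
    return chromosome

chrom = [1, 1, 2, 1, 4, 1, 3, 1, 1]                           # hypothetical example
print(decode_ordinal(chrom, 9))                               # a permutation of 1..9
print(encode_ordinal(decode_ordinal(chrom, 9), 9) == chrom)   # True
```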

Crossover The traditional one-point crossover can be used p1=( ) p2=( ) which correspond to the paths ( ) ( )

Crossover After crossover c1=( ) c2=( ) The new paths are ( ) ( )
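
One-point crossover always yields valid ordinal chromosomes, because the range of the i-th gene depends only on the position i, not on the other genes. A small sketch, with hypothetical parents that can be decoded with the sketch above:

```python
import random

# Sketch of one-point crossover on ordinal chromosomes.
def one_point_crossover(p1, p2):
    cut = random.randint(1, len(p1) - 1)          # crossover point
    return p1[:cut] + p2[cut:], p2[:cut] + p1[cut:]

c1, c2 = one_point_crossover([1, 1, 2, 1, 4, 1, 3, 1, 1],   # hypothetical parents
                             [5, 1, 5, 5, 5, 3, 3, 2, 1])
```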

Repair algorithms Repair an infeasible individual, i.e., transform it into a feasible one. The repaired version can be used either for evaluation only, or it can also replace (with some probability) the original individual in the population.

Basic rule There are no standard heuristics for the design of repair algorithms; the success of this approach relies mainly on the ability of the user to come up with such a heuristic. It is possible to use a greedy algorithm (i.e., an optimization algorithm that proceeds through a series of alternatives by making the best decision, as computed locally, at each point in the series), a random algorithm, or any other heuristic that would guide the repair process.

Repair by random evolution The main idea is to use random evolutionary search combined with a mathematical programming technique for unconstrained optimization. Whenever a solution is not feasible, the following constraint functional is minimized (here the inequality constraints are written as gj(x) ≥ 0): C(x) = Σi∈C1 hi²(x) − Σj∈C2 gj(x), where C1 = {i = 1, …, n : |hi(x)| > ε} and C2 = {j = 1, …, q : gj(x) < 0}, with ε a small tolerance.
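
A sketch of this constraint functional, assuming the gj(x) ≥ 0 convention and a small tolerance eps on the equalities; h_list and g_list are hypothetical lists of constraint functions.

```python
# Sketch of the constraint functional C(x) minimized during repair: squared
# violations of the equalities plus the magnitudes of the violated
# inequalities (convention: g_j(x) >= 0 means the constraint is satisfied).
def repair_functional(x, h_list, g_list, eps=1e-4):
    c1 = [h(x) for h in h_list if abs(h(x)) > eps]    # violated equalities
    c2 = [g(x) for g in g_list if g(x) < 0]           # violated inequalities
    return sum(h * h for h in c1) - sum(c2)
```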

Separation of constraints and objectives Handle constraints and objectives separately: superiority of feasible points; behavioral memory.

Superiority of feasible points Evaluations of feasible solutions are mapped into a better interval, and those of infeasible solutions into a worse interval. Modified tournament selection: a feasible solution is always better than an infeasible one; between two feasible solutions, the one having a better objective function value is preferred; between two infeasible solutions, the one having the smaller constraint violation is preferred.
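
The modified tournament rule can be written as a comparison function; total_violation is a hypothetical helper that sums the constraint violations of a solution.

```python
# Sketch of the modified tournament comparison (minimization): feasible beats
# infeasible; ties are broken by objective value or by total violation.
def better(a, b, f, total_violation):
    va, vb = total_violation(a), total_violation(b)
    if va == 0 and vb == 0:           # both feasible: compare objective values
        return a if f(a) <= f(b) else b
    if va == 0:                       # only a is feasible
        return a
    if vb == 0:                       # only b is feasible
        return b
    return a if va <= vb else b       # both infeasible: smaller violation wins
```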

Behavioral memory Constraints are handled in a particular order. Start with a random population of individuals and set j = 1 (j is the constraint counter). Evolve this population to minimize the violation of the j-th constraint, until a given percentage of the population is feasible for this constraint; points that do not satisfy at least one of the 1st, 2nd, …, (j−1)-th constraints are eliminated from the population. Set j = j + 1; if j ≤ m, repeat the previous step. Finally, optimize the objective function while rejecting infeasible individuals.
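
A schematic reading of this procedure; evolve and optimize_f are hypothetical callbacks standing in for the GA runs, and violations[j](x) is a non-negative measure of how much x violates the j-th constraint.

```python
# Sketch of behavioral memory: satisfy the constraints one at a time, then
# optimize the objective while rejecting infeasible points.
def behavioral_memory(init_population, violations, evolve, optimize_f,
                      target_fraction=0.9, tol=1e-9):
    pop = init_population
    for j, viol_j in enumerate(violations):
        # Drop points violating any of the previously handled constraints,
        # then drive the population towards satisfying constraint j as well.
        pop = [x for x in pop if all(v(x) < tol for v in violations[:j])]
        pop = evolve(pop, viol_j, target_fraction)
    return optimize_f(pop, reject=lambda x: any(v(x) >= tol for v in violations))
```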

Hybrid methods In this section we consider methods in which the evolutionary algorithm is coupled with another technique (normally a numerical optimization approach) to handle constraints.

Fuzzy logic Replace constraints of the form gi(x) ≤ bi by a set of fuzzy membership functions ci(x): if gi(x) ≤ bi, then ci(x) = 1; if bi < gi(x) ≤ bi + s, then ci(x) = (exp(−p·((gi(x) − bi)/s)²) − exp(−p)) / (1 − exp(−p)); if gi(x) > bi + s, then ci(x) = 0. This allows a degree of tolerance when gi(x) exceeds bi but stays close to it; the tolerance decreases rapidly as the violation grows. The fitness is fitness(x) = f(x) × min(c1(x), …, cm(x)).
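
The membership function can be coded directly from the three cases; s is the width of the tolerance band and p its shape parameter.

```python
import math

# Sketch of the fuzzy satisfaction degree of a constraint g(x) <= b, with a
# tolerance band of width s and shape parameter p.
def fuzzy_degree(g_value, b, s, p):
    if g_value <= b:
        return 1.0
    if g_value <= b + s:
        u = (g_value - b) / s
        return (math.exp(-p * u * u) - math.exp(-p)) / (1.0 - math.exp(-p))
    return 0.0

def fuzzy_fitness(x, f, g_funcs, bounds, s, p):
    """fitness(x) = f(x) * min_i c_i(x); assumes a non-negative f to maximize."""
    degrees = [fuzzy_degree(g(x), b, s, p) for g, b in zip(g_funcs, bounds)]
    return f(x) * min(degrees)
```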

Reliability of a Communication System

R(k): probability that component k works (its reliability); Q(k): probability that component k fails, so Q(k) = 1 − R(k); Rs: probability that the system works; Qs: probability that the system fails. Qs = (Q(1)·Q(4))²·R(3) + (Q(2) + R(2)·Q(1)·Q(4))²·Q(3). C(k) = K(k)·R(k)^a(k): cost of component k. Cs = 2C(1) + 2C(2) + C(3) + 2C(4): cost of the system.

Minimum cost problem Min Cs s.t. Rmin ≤ Rs and Rmin(k) ≤ R(k) ≤ 1.0. Using a penalty function: Min Cs + λ·max[0, Rmin − Rs] s.t. Rmin(k) ≤ R(k) ≤ 1.0, where λ is a penalty factor.
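
A sketch of the penalized minimum-cost objective, transcribing Qs and Cs from the previous slide; K, a and Rmin stand for the (here hypothetical) problem data, with components 1..4 mapped to Python indices 0..3.

```python
# Sketch of the penalized minimum-cost objective for the communication
# system: cost plus a penalty whenever system reliability falls below Rmin.
def system_cost(R, K, a):
    C = [K[k] * R[k] ** a[k] for k in range(4)]       # component costs C(k)
    return 2 * C[0] + 2 * C[1] + C[2] + 2 * C[3]      # Cs

def system_reliability(R):
    Q = [1.0 - r for r in R]                          # component failure probabilities
    Qs = (Q[0] * Q[3]) ** 2 * R[2] + (Q[1] + R[1] * Q[0] * Q[3]) ** 2 * Q[2]
    return 1.0 - Qs                                   # Rs = 1 - Qs

def penalized_cost(R, K, a, Rmin, lam=1e3):
    violation = max(0.0, Rmin - system_reliability(R))
    return system_cost(R, K, a) + lam * violation
```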

Maximum reliability Min Qs s.t. Cs ≤ Cmax and Rmin(k) ≤ R(k) ≤ 1.0. We use the variable replacement X(k) = R(k)^a(k); then the constraints become 2K(1)X(1) + 2K(2)X(2) + K(3)X(3) + 2K(4)X(4) ≤ Cmax and Rmin(k)^a(k) ≤ X(k) ≤ 1.0, which are linear in X(k).