Chapter 7 Handling Constraints


Nonlinear Programming Problem
NLP with linear constraints:
Optimize f(x1, ..., xn), x = (x1, ..., xn) in R^n
Domain constraints: l(i) <= x(i) <= u(i), i = 1, ..., n
Equalities: a set of linear equations A x = b
Inequalities: a set of linear inequalities C x <= d

GENOCOP I
The original GENOCOP (GEnetic algorithm for Numerical Optimization of COnstrained Problems):
handles linear constraints only;
eliminates the equality constraints, which leaves a convex feasible region;
uses special "genetic" operators designed to keep offspring inside that region.

Example: optimize a function of six variables subject to the constraints given below. Because the equality constraints allow it, four variables can be expressed as functions of the remaining two:

This reduces the original problem to the optimization of a function of two variables, subject to constraints that are inequalities only. These inequalities can then be simplified further.

Elimination of Equalities
Equality constraint set: A x = b.
Split A into (A1, A2), where A1 acts on the variables to be eliminated and is nonsingular; the eliminated variables are then x_dep = A1^(-1) (b - A2 x_ind).
Split C of the inequality system C x <= d in the same way into (C1, C2).
New set of inequalities (after removal of the eliminated variables): (C2 - C1 A1^(-1) A2) x_ind <= d - C1 A1^(-1) b.

Final set of constraints:
- the original domain constraints on the remaining variables;
- new inequalities, obtained from the domain constraints of the eliminated variables rewritten in terms of the remaining ones;
- the original inequalities, after removal of the eliminated variables.
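As a concrete illustration of this elimination step (not part of the original slides), the NumPy sketch below assumes the equalities are written as A1 x_dep + A2 x_ind = b with A1 nonsingular, and substitutes x_dep = A1^(-1)(b - A2 x_ind) into the inequalities C1 x_dep + C2 x_ind <= d. All function names and the toy numbers are illustrative.

```python
import numpy as np

def eliminate_equalities(A1, A2, b, C1, C2, d):
    """Eliminate linear equalities A1 @ x_dep + A2 @ x_ind = b.

    Returns (recover, C_new, d_new) where
      x_dep = recover(x_ind)       reconstructs the eliminated variables,
      C_new @ x_ind <= d_new       is the reduced inequality system
    obtained by substituting x_dep into C1 @ x_dep + C2 @ x_ind <= d.
    """
    A1_inv = np.linalg.inv(A1)                    # A1 must be nonsingular
    recover = lambda x_ind: A1_inv @ (b - A2 @ x_ind)
    C_new = C2 - C1 @ A1_inv @ A2                 # coefficients on the remaining variables
    d_new = d - C1 @ A1_inv @ b                   # shifted right-hand side
    return recover, C_new, d_new

# Toy usage: x1 + x2 = 3 with the inequality x1 - x2 <= 1, eliminating x1.
A1 = np.array([[1.0]]); A2 = np.array([[1.0]]); b = np.array([3.0])
C1 = np.array([[1.0]]); C2 = np.array([[-1.0]]); d = np.array([1.0])
recover, C_new, d_new = eliminate_equalities(A1, A2, b, C1, C2, d)
x_ind = np.array([2.0])                           # choose x2 = 2
print(recover(x_ind), C_new, d_new)               # x1 = 1; reduced inequality -2*x2 <= -2
```

The returned pair (C_new, d_new) plays the role of the "new inequalities" listed above.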

Example (1): optimize a function of six variables subject to three groups of constraints: domain constraints, equalities, and inequalities.

Example (2) Transportation problem:

Initialization process
Representation: floating-point representation.
Initialization: a subset of the potential solutions is generated randomly from the whole feasible region; the remaining subset is generated on the boundary of the solution space (see the sketch below).
Genetic operators: dynamic and non-uniform.
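A minimal sketch of such an initialization, assuming the feasible region is described by box bounds plus linear inequalities C x <= d. The boundary part simply bisects the segment between a feasible and an infeasible point, which is one crude way to land near the boundary; function names, parameters, and the toy constraint are all illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

def feasible(x, C, d):
    """Check the linear inequalities C @ x <= d."""
    return np.all(C @ x <= d + 1e-12)

def random_point(lower, upper):
    return lower + rng.random(lower.size) * (upper - lower)

def init_population(size, lower, upper, C, d, boundary_fraction=0.5):
    pop = []
    # Interior part: rejection sampling inside the feasible region.
    while len(pop) < size * (1 - boundary_fraction):
        x = random_point(lower, upper)
        if feasible(x, C, d):
            pop.append(x)
    # Boundary part: bisect between a feasible point and an infeasible one.
    while len(pop) < size:
        inside, outside = pop[rng.integers(len(pop))], random_point(lower, upper)
        if feasible(outside, C, d):
            continue
        for _ in range(30):                       # binary search toward the boundary
            mid = 0.5 * (inside + outside)
            if feasible(mid, C, d):
                inside = mid
            else:
                outside = mid
        pop.append(inside)                        # feasible point close to the boundary
    return np.array(pop)

# Toy usage: 2-D box [0, 5]^2 with the single inequality x1 + x2 <= 6.
lower, upper = np.zeros(2), np.full(2, 5.0)
C, d = np.array([[1.0, 1.0]]), np.array([6.0])
print(init_population(6, lower, upper, C, d))
```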

Mutation
- Uniform mutation
- Boundary mutation
- Non-uniform mutation
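For a real-coded individual with box bounds, the three operators can be sketched roughly as follows. The non-uniform mutation uses the usual shrinkage Delta(t, y) = y (1 - r^((1 - t/T)^b)); parameter names such as b and max_gen are illustrative.

```python
import numpy as np

rng = np.random.default_rng(1)

def uniform_mutation(x, lower, upper):
    """Replace one randomly chosen gene with a uniform random value from its range."""
    y = x.copy()
    k = rng.integers(x.size)
    y[k] = rng.uniform(lower[k], upper[k])
    return y

def boundary_mutation(x, lower, upper):
    """Set one randomly chosen gene to either its lower or its upper bound."""
    y = x.copy()
    k = rng.integers(x.size)
    y[k] = lower[k] if rng.random() < 0.5 else upper[k]
    return y

def nonuniform_mutation(x, lower, upper, t, max_gen, b=2.0):
    """Perturb one gene by an amount that shrinks as generation t approaches max_gen."""
    def delta(y):
        # Large early in the run, close to zero near the end.
        return y * (1.0 - rng.random() ** ((1.0 - t / max_gen) ** b))
    y = x.copy()
    k = rng.integers(x.size)
    if rng.random() < 0.5:
        y[k] = x[k] + delta(upper[k] - x[k])
    else:
        y[k] = x[k] - delta(x[k] - lower[k])
    return y

x = np.array([1.0, 4.0])
lo, hi = np.zeros(2), np.full(2, 5.0)
print(uniform_mutation(x, lo, hi), boundary_mutation(x, lo, hi),
      nonuniform_mutation(x, lo, hi, t=10, max_gen=100))
```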

Crossover
- Arithmetical crossover
- Simple crossover
- Heuristic crossover
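A matching sketch of the three crossover operators. The heuristic variant assumes we already know which parent is better, and none of the operators below checks feasibility of the offspring (GENOCOP's own operators include extra provisions to keep offspring feasible, omitted here).

```python
import numpy as np

rng = np.random.default_rng(2)

def arithmetical_crossover(x, y):
    """Whole arithmetical crossover: two convex combinations of the parents."""
    a = rng.random()
    return a * x + (1 - a) * y, a * y + (1 - a) * x

def simple_crossover(x, y):
    """One-point crossover on the real vectors (a contraction toward the parents
    may be needed near the cut point to stay inside a convex region; omitted)."""
    k = rng.integers(1, x.size)
    return (np.concatenate([x[:k], y[k:]]),
            np.concatenate([y[:k], x[k:]]))

def heuristic_crossover(better, worse):
    """One offspring beyond the better parent, along the direction worse -> better."""
    r = rng.random()
    return better + r * (better - worse)

x, y = np.array([1.0, 2.0]), np.array([3.0, 0.0])
print(arithmetical_crossover(x, y))
print(simple_crossover(x, y))
print(heuristic_crossover(better=x, worse=y))
```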

GENOCOP II
Handles nonlinear constraints as well:
- distinguishes between linear and nonlinear constraints;
- uses a single starting point;
- applies a quadratic penalty function to the violated nonlinear constraints;
- executes GENOCOP I iteratively.

Algorithm
procedure GENOCOP II
begin
  split the set of constraints C into linear constraints L and nonlinear constraints N
  select a starting point s (it need not be feasible)
  set the set of active constraints A to V, the nonlinear constraints violated at s
  set the penalty parameter r to its initial value
  while (not termination-condition) do
  begin
    execute GENOCOP I for the objective augmented with a quadratic penalty on the
      active constraints in A, with the linear constraints L and the starting point s
    save the best individual s*; set s to s*
    update A from the nonlinear constraints violated at s*
    decrease the penalty parameter r
  end
end
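A compact sketch of this outer loop. The GENOCOP I call is replaced by a plain random search inside the box so the example runs standalone, the quadratic penalty is taken to be (1/(2r)) times the sum of squared violations of the active constraints (one common choice consistent with "decrease penalty r"), and all concrete names, the update rule for A, and the toy problem are assumptions rather than the original algorithm's exact details.

```python
import numpy as np

rng = np.random.default_rng(3)

def violation(c, x):
    """Violation of a constraint written as c(x) <= 0 (zero when satisfied)."""
    return max(0.0, c(x))

def random_search(F, lower, upper, evals=2000):
    """Stand-in for GENOCOP I: best of `evals` random points in the box
    (a real run would also enforce the linear constraints L)."""
    best, best_val = None, np.inf
    for _ in range(evals):
        x = lower + rng.random(lower.size) * (upper - lower)
        val = F(x)
        if val < best_val:
            best, best_val = x, val
    return best

def genocop2(f, nonlinear, lower, upper, r0=1.0, iterations=5):
    s = lower + rng.random(lower.size) * (upper - lower)        # starting point, may be infeasible
    active = [c for c in nonlinear if violation(c, s) > 0]      # A <- constraints violated at s
    r = r0
    for _ in range(iterations):
        def F(x, active=tuple(active), r=r):                    # quadratic penalty on A
            return f(x) + (1.0 / (2.0 * r)) * sum(violation(c, x) ** 2 for c in active)
        s = random_search(F, lower, upper)                      # "execute GENOCOP I", keep best
        active = list({*active, *(c for c in nonlinear if violation(c, s) > 0)})  # update A (simplified: only adds)
        r /= 10.0                                               # decrease the penalty parameter
    return s

# Toy problem: minimize (x1 - 2)^2 + (x2 - 3)^2 subject to x1 + x2 <= 4 on [0, 5]^2.
f = lambda x: (x[0] - 2.0) ** 2 + (x[1] - 3.0) ** 2
c1 = lambda x: x[0] + x[1] - 4.0
print(genocop2(f, [c1], np.zeros(2), np.full(2, 5.0)))          # approaches roughly (1.5, 2.5)
```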

Example
Minimize f subject to nonlinear constraints c1 and c2.

Iteration   Best point            Active constraints
0           (0, 0)                (starting point)
1           (3, 4)                none
2           (2.06, 3.98)          c2
3           (2.3298, 3.1839)      c1, c2
4           (2.3295, 3.1790)

Other Techniques
- Homaifar et al.: static penalties
- Joines and Houck: dynamic penalties (see the sketch below)
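As an example of the dynamic-penalty idea attributed above to Joines and Houck, the evaluation below adds a penalty whose coefficient (C t)^alpha grows with the generation number t. The commonly quoted defaults C = 0.5, alpha = beta = 2 are used, but treat the exact form and values as a sketch rather than a definitive statement of their method.

```python
def dynamic_penalty_eval(f, constraints, x, t, C=0.5, alpha=2.0, beta=2.0):
    """Dynamic-penalty evaluation: infeasibility is tolerated early in the run and
    punished late. `constraints` are functions g with g(x) <= 0 when satisfied."""
    svc = sum(max(0.0, g(x)) ** beta for g in constraints)
    return f(x) + (C * t) ** alpha * svc

# Toy usage: the same infeasible point is penalised more at generation 50 than at 5.
f = lambda x: x[0] ** 2
g = lambda x: 1.0 - x[0]          # requires x[0] >= 1
print(dynamic_penalty_eval(f, [g], [0.5], t=5),
      dynamic_penalty_eval(f, [g], [0.5], t=50))
```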

Schoenauer and Xanthakis (behavioral memory)
1. Start with a random population of individuals (feasible or infeasible).
2. Set j = 1 (j is a constraint counter).
3. Evolve this population to reduce the violation of the j-th constraint, until a given percentage of the population (the flip threshold) is feasible for this constraint.
4. Set j = j + 1. The current population is the starting point for the next phase of the evolution, in which individuals violating any of the first j - 1 constraints are eliminated.
5. If j does not exceed the number of constraints, repeat the last two steps; otherwise optimize the objective function, rejecting infeasible individuals.
A rough sketch of this loop follows below.
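A rough, self-contained sketch of the phase-by-phase loop. The "evolution" inside each phase is just a mutate-and-replace-the-worst loop standing in for a real GA, the final phase is collapsed to picking the best feasible survivor, and the population size, flip threshold, and toy constraints are all illustrative.

```python
import numpy as np

rng = np.random.default_rng(4)

def behavioral_memory(f, constraints, lower, upper, pop_size=40, flip=0.8, steps=200):
    """Phase j reduces the violation of constraint j while individuals breaking any
    earlier constraint are discarded; the last phase optimizes f over the survivors."""
    viol = lambda g, x: max(0.0, g(x))
    pop = [lower + rng.random(lower.size) * (upper - lower) for _ in range(pop_size)]
    for j, g in enumerate(constraints):
        satisfies_earlier = lambda x, j=j: all(viol(constraints[i], x) == 0 for i in range(j))
        for _ in range(steps):
            if np.mean([viol(g, x) == 0 for x in pop]) >= flip:   # flip threshold reached
                break
            parent = pop[rng.integers(len(pop))]
            child = np.clip(parent + rng.normal(0.0, 0.3, parent.size), lower, upper)
            if satisfies_earlier(child):                          # earlier constraints: death penalty
                worst = max(range(len(pop)), key=lambda i: viol(g, pop[i]))
                if viol(g, child) < viol(g, pop[worst]):
                    pop[worst] = child                            # replace the worst violator
    feasible = [x for x in pop if all(viol(g, x) == 0 for g in constraints)]
    return min(feasible, key=f) if feasible else None             # "optimize f", simplified

# Toy usage: minimize x1 + x2 subject to x1 >= 1 and x2 >= 2 on the box [0, 5]^2.
best = behavioral_memory(lambda x: x[0] + x[1],
                         [lambda x: 1.0 - x[0], lambda x: 2.0 - x[1]],
                         np.zeros(2), np.full(2, 5.0))
print(best)
```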

Powell and Skolnick
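Powell and Skolnick's technique is usually summarized by the rule that any feasible solution is ranked better than any infeasible one. The sketch below realizes that ordering directly with a sort key; this is a simplification of their mapping, not its exact form.

```python
def powell_skolnick_rank(population, f, constraints):
    """Sort a population so that feasible solutions (ordered by f, minimization)
    always precede infeasible ones (ordered by total constraint violation)."""
    viol = lambda x: sum(max(0.0, g(x)) for g in constraints)
    key = lambda x: (0, f(x)) if viol(x) == 0 else (1, viol(x))
    return sorted(population, key=key)

# Toy usage: minimize f(x) = x with the constraint x >= 2 (written as 2 - x <= 0).
pop = [0.5, 1.9, 2.1, 4.0]
print(powell_skolnick_rank(pop, f=lambda x: x, constraints=[lambda x: 2 - x]))
# -> [2.1, 4.0, 1.9, 0.5]: feasible points first, infeasible ones ranked by violation.
```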

Bean and Hadj-Alouane
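Bean and Hadj-Alouane use an adaptive penalty whose coefficient is revised according to whether the best individual of the last k generations was feasible. The update rule below follows the commonly cited form (divide by beta1 if always feasible, multiply by beta2 if never feasible, otherwise keep it); the constants are illustrative.

```python
def update_penalty(lmbda, recent_best_feasible, beta1=2.0, beta2=3.0):
    """Adapt the penalty coefficient from the feasibility of the best individual
    over the last k generations (`recent_best_feasible` is a list of booleans)."""
    if all(recent_best_feasible):          # best was always feasible: relax the penalty
        return lmbda / beta1
    if not any(recent_best_feasible):      # best was never feasible: strengthen it
        return lmbda * beta2
    return lmbda                           # mixed history: leave it unchanged

# Toy usage over three hypothetical windows of k = 4 generations.
lam = 1.0
for window in ([True] * 4, [False] * 4, [True, False, True, True]):
    lam = update_penalty(lam, window)
    print(lam)    # 0.5, then 1.5, then 1.5
```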

GENOCOP III
Two separate populations:
- search points, which satisfy the linear constraints only;
- reference points, which satisfy all constraints.
Repair of an infeasible search point s:
- select a reference point r and sample points z on the segment between s and r until a fully feasible z is found;
- if z is better than r, it replaces r (better reference points are kept);
- z replaces the search point s with some probability of replacement.
A minimal sketch of the repair step is given below.
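A minimal sketch of the repair step only, assuming the fully feasible set can be tested with a predicate. The bookkeeping of the two populations and the probabilistic replacement of search and reference points are omitted; all names and the toy feasible set are illustrative.

```python
import numpy as np

rng = np.random.default_rng(5)

def repair(search_pt, reference_pt, fully_feasible, tries=20):
    """Pull an infeasible search point toward a fully feasible reference point:
    sample z = a*s + (1 - a)*r for random a until z satisfies all constraints."""
    for _ in range(tries):
        a = rng.random()
        z = a * search_pt + (1 - a) * reference_pt
        if fully_feasible(z):
            return z
    return reference_pt.copy()      # fall back to the reference point itself

# Toy usage: the feasible set is the unit disc; s lies outside it, r at the origin.
feasible = lambda x: float(np.dot(x, x)) <= 1.0
s, r = np.array([3.0, 0.0]), np.array([0.0, 0.0])
print(repair(s, r, feasible))       # a point on the segment, inside the disc
```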

Extending GENOCOP III
Nonlinear equality constraints h(x) = 0 are handled by replacing each of them with the inequality |h(x)| <= epsilon, for a small tolerance epsilon.