Ch. 9: Direction Generation Method Based on Linearization – Generalized Reduced Gradient Method. Mohammad Farhan Habib, NetLab, CS, UC Davis. July 30, 2010.

Objective. Methods to solve general NLP problems with both equality constraints and inequality constraints.

Implicit Variable Elimination. Eliminate variables by solving the equality constraints, which reduces the problem dimension. Explicit elimination is not always possible, so the elimination is carried out implicitly through a linear approximation of the constraints.

Implicit Variable Elimination. Suppose x^(1) satisfies the constraints of the equality-constrained problem, and form a linear approximation to the problem constraints at x^(1). This system of equations has more unknowns than equations, so K of the variables can be solved for in terms of the other N - K. A reconstruction of the linearization is given below.
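
The formula on the original slide is an image that did not survive; the following is a standard reconstruction for the problem minimize f(x) subject to h_k(x) = 0, k = 1, ..., K, with x in R^N. The linearization of the constraints at x^(1) is

    \tilde{h}_k(x; x^{(1)}) = h_k(x^{(1)}) + \nabla h_k(x^{(1)})^T (x - x^{(1)}) = 0, \qquad k = 1, \dots, K.

Because x^(1) is feasible, h_k(x^{(1)}) = 0, so this is a system of K linear equations in N > K unknowns.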

Implicit Variable Elimination. Label the first K variables basic and the remaining N - K variables non-basic, and partition the gradient row vector and the constraint Jacobian accordingly. Equation 9.14 then takes the partitioned form reconstructed below.
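
As a reconstruction (the slide's own symbols were images): write x = (\hat{x}, \bar{x}) with \hat{x} the K basic variables and \bar{x} the N - K non-basic variables, and partition the constraint Jacobian at x^(1) as J = [J_B | J_N]. The linearized constraints then read

    J_B (\hat{x} - \hat{x}^{(1)}) + J_N (\bar{x} - \bar{x}^{(1)}) = 0,

so, provided J_B is nonsingular,

    \hat{x} = \hat{x}^{(1)} - J_B^{-1} J_N (\bar{x} - \bar{x}^{(1)}),

which expresses the basic variables in terms of the non-basic ones.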

Implicit Variable Elimination. After this substitution, the objective appears to be an unconstrained function involving only the N - K non-basic variables; this reduced objective is written out below.
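
In symbols (a reconstruction of the slide's missing formula), the reduced objective is

    \tilde{f}(\bar{x}) = f\big(\hat{x}(\bar{x}), \bar{x}\big),

where \hat{x}(\bar{x}) is the implicit elimination of the basic variables from the previous slide.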

Implicit Variable Elimination. The first-order necessary condition for x^(1) to be a local minimum of the reduced objective is that its gradient with respect to the non-basic variables, the reduced gradient, is zero; a small numerical sketch follows.
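
The formula on the slide is not reproduced here; in the row-vector convention, the reduced gradient is \tilde{\nabla} f = \bar{\nabla} f - \hat{\nabla} f \, J_B^{-1} J_N. Below is a minimal sketch of this computation on a small hypothetical problem; the objective, constraint, feasible point, and function names are all invented for illustration and are not taken from the slides.

```python
import numpy as np

# Hypothetical illustration:
#   minimize   f(x) = x1^2 + 2*x2^2 + 3*x3^2
#   subject to h(x) = x1^2 + x2^2 + x3^2 - 1 = 0      (N = 3 variables, K = 1 constraint)

def grad_f(x):
    return np.array([2.0 * x[0], 4.0 * x[1], 6.0 * x[2]])   # objective gradient, shape (N,)

def jac_h(x):
    return 2.0 * x.reshape(1, -1)                            # constraint Jacobian, shape (K, N)

def reduced_gradient(x, basic, nonbasic):
    """Gradient of the reduced objective with respect to the non-basic variables."""
    g, J = grad_f(x), jac_h(x)
    J_B, J_N = J[:, basic], J[:, nonbasic]
    return g[nonbasic] - g[basic] @ np.linalg.solve(J_B, J_N)

x1 = np.array([0.6, 0.8, 0.0])                               # a feasible point, h(x1) = 0
print(reduced_gradient(x1, basic=[0], nonbasic=[1, 2]))      # nonzero, so x1 is not optimal
```

If the printed reduced gradient were (numerically) zero, the first-order condition above would already be satisfied at this point.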

Basic Generalized Reduced Gradient (GRG) algorithm. Suppose that at iteration t a feasible point x^(t) and the basic/non-basic partition are available.

Basic GRG algorithm. The direction d is a descent direction, which follows from a first-order Taylor expansion of equation 9.16; the corresponding basic-variable component is implicit in the above construction.

Basic GRG algorithm – Example 1. In the linear approximation, most points do not satisfy the nonlinear equality constraints: d is a descent direction, but in general it leads to infeasible points.

Basic GRG algorithm. More precisely, the non-basic component of d is a descent direction in the space of the non-basic variables, but the composite direction vector in general yields infeasible points, as the sketch below illustrates.
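
To make this concrete, here is a sketch on the same hypothetical problem and naming as above: the non-basic part of d is the negative reduced gradient, and the basic part is chosen so that the linearized constraints stay satisfied, J_B d_B + J_N d_N = 0. The printout shows that d is a descent direction and satisfies the linearized constraint, while the true nonlinear constraint is violated along it.

```python
import numpy as np

def grad_f(x):
    return np.array([2.0 * x[0], 4.0 * x[1], 6.0 * x[2]])

def jac_h(x):
    return 2.0 * x.reshape(1, -1)     # Jacobian of h(x) = x1^2 + x2^2 + x3^2 - 1

def composite_direction(x, basic, nonbasic):
    g, J = grad_f(x), jac_h(x)
    J_B, J_N = J[:, basic], J[:, nonbasic]
    d = np.zeros_like(x)
    d[nonbasic] = -(g[nonbasic] - g[basic] @ np.linalg.solve(J_B, J_N))  # descent in reduced space
    d[basic] = -np.linalg.solve(J_B, J_N @ d[nonbasic])                  # keep linearization feasible
    return d

x1 = np.array([0.6, 0.8, 0.0])
d = composite_direction(x1, basic=[0], nonbasic=[1, 2])
x_trial = x1 + 0.2 * d
print(grad_f(x1) @ d)                 # negative: d is a descent direction for f
print(jac_h(x1) @ d)                  # ~0: the linearized constraint is satisfied along d
print(x_trial @ x_trial - 1.0)        # nonzero: the nonlinear constraint is violated at x_trial
```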

Basic GRG algorithm – Example 2

For every trial value of α that is selected, the constraint equations must be solved for the values of the dependent (basic) variables that make the resulting point feasible. Newton's iteration formula is used to solve this set of equations; in the slides it is then applied to the example problem, and an illustrative sketch follows below.
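
A sketch of this restoration step on the same hypothetical problem (every name below is illustrative): for a trial α the non-basic variables are fixed at their new values, and Newton's method is applied to the constraint equations, updating only the basic variables until the point is feasible again.

```python
import numpy as np

def h(x):
    return np.array([x @ x - 1.0])            # constraint residuals, shape (K,)

def jac_h(x):
    return 2.0 * x.reshape(1, -1)             # constraint Jacobian, shape (K, N)

def restore_feasibility(x_trial, basic, tol=1e-10, max_iter=25):
    """Newton iteration on the basic variables only: x_B <- x_B - inv(J_B) @ h(x)."""
    x = x_trial.copy()
    for _ in range(max_iter):
        r = h(x)
        if np.linalg.norm(r) < tol:
            break
        J_B = jac_h(x)[:, basic]
        x[basic] -= np.linalg.solve(J_B, r)
    return x

x1 = np.array([0.6, 0.8, 0.0])                # current feasible point
d = np.array([32.0 / 15.0, -1.6, 0.0])        # composite direction from the previous sketch
alpha = 0.2                                   # trial step size
x_new = restore_feasibility(x1 + alpha * d, basic=[0])
print(x_new, h(x_new))                        # h(x_new) ~ 0: feasibility restored
```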

GRG Algorithm
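
The step-by-step statement of the algorithm was on the original slide and is not reproduced here. As one possible end-to-end reading of it, the sketch below strings the previous pieces together on the same hypothetical problem: compute the reduced gradient, stop if it (nearly) vanishes, otherwise build the composite direction, and backtrack on α while restoring feasibility with Newton's method at every trial point.

```python
import numpy as np

# Minimal GRG loop for the hypothetical problem used above:
#   minimize f(x) = x1^2 + 2*x2^2 + 3*x3^2   subject to   x1^2 + x2^2 + x3^2 = 1.

def f(x):      return x[0]**2 + 2.0 * x[1]**2 + 3.0 * x[2]**2
def grad_f(x): return np.array([2.0 * x[0], 4.0 * x[1], 6.0 * x[2]])
def h(x):      return np.array([x @ x - 1.0])
def jac_h(x):  return 2.0 * x.reshape(1, -1)

def restore(x, basic, tol=1e-12, max_iter=50):
    x = x.copy()
    for _ in range(max_iter):
        r = h(x)
        if np.linalg.norm(r) < tol:
            return x, True                                  # feasible again
        x[basic] -= np.linalg.solve(jac_h(x)[:, basic], r)  # Newton step on basic variables
    return x, False

def grg(x, basic, nonbasic, tol=1e-8, max_outer=50):
    for _ in range(max_outer):
        g, J = grad_f(x), jac_h(x)
        J_B, J_N = J[:, basic], J[:, nonbasic]
        g_red = g[nonbasic] - g[basic] @ np.linalg.solve(J_B, J_N)   # reduced gradient
        if np.linalg.norm(g_red) < tol:                              # first-order optimality
            break
        d = np.zeros_like(x)
        d[nonbasic] = -g_red
        d[basic] = -np.linalg.solve(J_B, J_N @ d[nonbasic])
        alpha = 1.0
        while alpha > 1e-12:                                         # backtracking line search
            x_new, ok = restore(x + alpha * d, basic)
            if ok and f(x_new) < f(x):
                x = x_new
                break
            alpha *= 0.5
        else:
            break                                                    # no acceptable step: give up
    return x

print(grg(np.array([0.6, 0.8, 0.0]), basic=[0], nonbasic=[1, 2]))    # tends to (1, 0, 0)
```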

GRG Algorithm – Example 3

GRG Algorithm - Example

Extension of GRG – Inequality Constraints and Bounds on Variables. Upper and lower variable bounds: a check must be made to ensure that only variables that are not on or very near their bounds are labeled as basic variables. The direction vector is modified so that the bounds on the independent (non-basic) variables are not violated if movement is undertaken along the direction; this is accomplished by setting the offending components to zero (one common rule is sketched below). Checks must also be inserted in step 3 of the basic GRG algorithm to ensure that the bounds are not exceeded either during the search on α or during the Newton iterations.
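
The specific formula on the slide is not reproduced here. One common rule, shown below purely as an assumed illustration, zeroes the direction component of any non-basic variable that sits on a bound and would otherwise move outside it.

```python
import numpy as np

def bounded_nonbasic_direction(reduced_grad, x_N, lower, upper, tol=1e-8):
    """Negative reduced gradient, with components frozen at active bounds."""
    d_N = -reduced_grad                               # unrestricted descent direction
    at_lower = (x_N - lower <= tol) & (d_N < 0.0)     # at lower bound, wants to decrease
    at_upper = (upper - x_N <= tol) & (d_N > 0.0)     # at upper bound, wants to increase
    d_N[at_lower | at_upper] = 0.0
    return d_N

# Hypothetical data: the second non-basic variable is at its upper bound and is frozen.
print(bounded_nonbasic_direction(np.array([1.6, -0.5]),
                                 x_N=np.array([0.8, 1.0]),
                                 lower=np.zeros(2),
                                 upper=np.ones(2)))    # -> [-1.6  0. ]
```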

Extension of GRG – Inequality Constraints and Bounds on Variables. Inequality constraints can be handled either explicitly, by writing them as equalities using slack variables, or implicitly, using the concept of an active constraint set as in feasible direction methods.
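
For example, the explicit route introduces a nonnegative slack variable s_j for each inequality:

    g_j(x) \ge 0 \quad \Longleftrightarrow \quad g_j(x) - s_j = 0, \qquad s_j \ge 0,

after which s_j is treated like any other bounded variable in the GRG machinery.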

Extension of GRG - Example

Summary. Linearization of the nonlinear problem functions is used to generate good search directions. Two types of algorithms were discussed: feasible direction methods, which require the solution of an LP sub-problem, and the GRG algorithm, which solves a set of linear equations to determine a good descent direction.