Part 4 Nonlinear Programming


Part 4 Nonlinear Programming 4.3 Successive Linear Programming

Approach 3: Cutting Plane Method

Basic Strategy We seek an algorithm that solves this problem via a sequence of intermediate problems: start with a rough polyhedral approximation of the feasible region, then successively improve the approximation by adding constraint estimates (cuts) updated at each intermediate solution.

Basic Strategy

Basic Strategy

Basic Strategy Case (ii) gives an indication of the possible location of the optimum. To improve our approximation to F in the vicinity of x^(1), we need to modify the boundary of Z^(0) near x^(1). This is achieved by imposing additional constraints that exclude from Z^(0) the region in the vicinity of x^(1).
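As a concrete illustration (the slide's constraint set is not shown, so a unit-disk constraint g(x) = x1^2 + x2^2 - 1 <= 0 is assumed here), the cut is the linearization of g at the infeasible point: it cuts off x^(1) while, by convexity of g, retaining every feasible point:

```python
# assumed example constraint g(x) = x1^2 + x2^2 - 1 <= 0
def g(x):
    return x[0]**2 + x[1]**2 - 1.0

def grad_g(x):
    return (2.0 * x[0], 2.0 * x[1])

xk = (1.0, 1.0)            # infeasible LP solution x^(1): g(xk) = 1 > 0
gk, grk = g(xk), grad_g(xk)

def cut(x):
    """Linearized cut g(xk) + grad g(xk) . (x - xk), required to be <= 0."""
    return gk + grk[0] * (x[0] - xk[0]) + grk[1] * (x[1] - xk[1])

print(cut(xk))          # 1.0 > 0: x^(1) itself is excluded from Z^(1)
print(cut((0.6, 0.0)))  # negative: this feasible point satisfies the cut
```

Because g is convex, the linearization underestimates g everywhere, so no point of F is ever cut away.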

Example (figure) Minimize F(x1, x2) = -x1 - x2; the sketch shows the feasible region F and the iterates P1, P2 in the (x1, x2) plane.

Example

Example If the computations are continued in this fashion, and if each set of cuts is guaranteed to eliminate a nonempty portion of the remaining part of Z^(0), then it is reasonable to expect that eventually a point will be reached that is feasible and that, consequently, is the minimum of f(x) over F.
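The whole iteration can be sketched in a few lines. Since the slides give only the objective F(x1, x2) = -x1 - x2, the feasible region F = {x : x1^2 + x2^2 <= 1} and the starting polytope Z^(0) = {0 <= x1, x2 <= 2} are assumptions here, and the toy LP solver just enumerates vertex candidates instead of calling a real simplex code:

```python
import itertools, math

def solve_lp_2d(c, A, b):
    """Minimize c.x over {x : A x <= b} by enumerating intersection
    points of constraint pairs (adequate only for a tiny 2-D demo)."""
    best, best_val = None, math.inf
    for (a1, b1), (a2, b2) in itertools.combinations(list(zip(A, b)), 2):
        det = a1[0] * a2[1] - a1[1] * a2[0]
        if abs(det) < 1e-12:              # parallel constraints: no vertex
            continue
        x = ((b1 * a2[1] - b2 * a1[1]) / det,
             (a1[0] * b2 - a2[0] * b1) / det)
        if all(ai[0]*x[0] + ai[1]*x[1] <= bi + 1e-9 for ai, bi in zip(A, b)):
            val = c[0] * x[0] + c[1] * x[1]
            if val < best_val:
                best, best_val = x, val
    return best

def g(x):                                  # assumed constraint g(x) <= 0
    return x[0]**2 + x[1]**2 - 1.0

def grad_g(x):
    return (2.0 * x[0], 2.0 * x[1])

c = (-1.0, -1.0)                           # objective f(x) = -x1 - x2
A = [[1, 0], [-1, 0], [0, 1], [0, -1]]     # Z^(0): 0 <= x1, x2 <= 2
b = [2.0, 0.0, 2.0, 0.0]

for _ in range(50):
    x = solve_lp_2d(c, A, b)               # minimize over current Z^(k)
    if g(x) <= 1e-3:                       # (near-)feasible: stop
        break
    gr = grad_g(x)                         # cut: g(x_k) + grad.(x - x_k) <= 0
    A.append([gr[0], gr[1]])
    b.append(gr[0] * x[0] + gr[1] * x[1] - g(x))
# x approaches the true optimum (1/sqrt(2), 1/sqrt(2))
```

Each LP value -(x1 + x2) is a valid lower bound on the true minimum -sqrt(2), since every Z^(k) contains F; this is exactly why the iterates stay infeasible until the end.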

Basic Ideas of Kelley’s Algorithm

Nonlinear Objective Function

Generation of Cuts

Generation of Cuts

Generation of Cuts Kelley proposed that only the linearization of the most violated constraint be used to construct a cut.
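A sketch of this rule (the constraint functions below are made up for illustration): evaluate every g_i at the current LP solution and linearize only the maximizer:

```python
# assumed constraints g_i(x) <= 0, evaluated at the current LP point
x_k = (1.5, 0.5)
constraints = [
    ("g1", lambda x: x[0]**2 + x[1]**2 - 1.0),  # violated: value 1.5
    ("g2", lambda x: x[0] - 1.0),               # violated: value 0.5
    ("g3", lambda x: x[1] - 2.0),               # satisfied: value -1.5
]
name, viol = max(((n, f(x_k)) for n, f in constraints),
                 key=lambda t: t[1])
# only g1, the most violated constraint, is linearized into a cut
```

Generating a single cut per iteration keeps the LP small, at the cost of more major iterations.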

Kelley’s Algorithm

Kelley’s Algorithm – Step 1

Kelley’s Algorithm – Step 2

Kelley’s Algorithm – Step 3

Kelley’s Algorithm – Step 4

Advantages Any linearity or near-linearity in the original problem is preserved and directly exploited. The subproblem solved at each major iteration is one to which the powerful techniques of LP apply.

Disadvantages The algorithm generates a sequence of infeasible points, so it cannot be terminated early with a “good” but possibly suboptimal point. The size of the LP subproblem grows continuously as cuts accumulate. The feasible region F must be convex.

Requirement of Convexity

Remark

Cut-Deletion Procedure
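The slides' cut-deletion details are not reproduced here; one common scheme, sketched purely as an assumption, is to drop cuts that are inactive (have positive slack) at the current LP solution, which keeps the LP from growing without bound:

```python
def delete_inactive_cuts(A, b, x, n_keep, slack_tol=1e-6):
    """Return (A, b) with every cut whose slack b_i - a_i.x exceeds
    slack_tol removed; the first n_keep rows (the original constraints
    defining Z^(0)) are always retained."""
    kept = [i for i, (a, bi) in enumerate(zip(A, b))
            if i < n_keep or bi - (a[0]*x[0] + a[1]*x[1]) <= slack_tol]
    return [A[i] for i in kept], [b[i] for i in kept]

# example: at x = (0.5, 0.5) the cut x1 + x2 <= 1 is active (slack 0)
# and survives, while x1 + x2 <= 5 has slack 4 and is dropped
A = [[1, 0], [0, 1], [1, 1], [1, 1]]
b = [2.0, 2.0, 1.0, 5.0]
A2, b2 = delete_inactive_cuts(A, b, (0.5, 0.5), n_keep=2)
```

Deleting cuts can, in principle, slow convergence, which is why Steps 4a and 4b spell out when deletion is safe.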

Step 4a

Step 4b