
Nonlinear Programming I Li Xiaolei

Introductory concepts A general nonlinear programming problem (NLP) can be expressed as follows:

max (or min) z = f(x1, x2, …, xn)     (objective function)
s.t.  g1(x1, x2, …, xn) (≤, =, or ≥) b1
      g2(x1, x2, …, xn) (≤, =, or ≥) b2
      …
      gm(x1, x2, …, xn) (≤, =, or ≥) bm     (1)     (constraints)

Introductory concepts DEFINITION The feasible region for NLP (1) is the set of points (x1, x2, …, xn) that satisfy the m constraints in (1). A point in the feasible region is a feasible point, and a point that is not in the feasible region is an infeasible point.

Introductory concepts DEFINITION For a maximization problem, any point x* in the feasible region for which f(x*) ≥ f(x) holds for all points x in the feasible region is an optimal solution to the NLP. For a minimization problem, x* is an optimal solution if f(x*) ≤ f(x) for all feasible x.

Examples of NLPs EXAMPLE 1 It costs a company c dollars per unit to manufacture a product. If the company charges p dollars per unit for the product, customers demand D(p) units. To maximize profits, what price should the firm charge? Solution The firm’s decision variable is p. Since the firm’s profit is (p-c)D(p), the firm wants to solve the following unconstrained maximization problem: max (p-c)D(p).
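As a rough numerical sketch (not part of the original example, which leaves D(p) unspecified), suppose a hypothetical linear demand curve D(p) = 100 − 2p and unit cost c = 10; scipy can then maximize the profit directly:

```python
# A sketch of Example 1, assuming a hypothetical demand curve D(p) = 100 - 2p
# and unit cost c = 10 (neither is given in the slides).
from scipy.optimize import minimize_scalar

c = 10.0                       # unit production cost (assumed)

def D(p):                      # demand at price p (assumed linear)
    return 100.0 - 2.0 * p

def neg_profit(p):             # scipy minimizes, so negate the profit
    return -(p - c) * D(p)

res = minimize_scalar(neg_profit, bounds=(c, 50.0), method="bounded")
print(f"optimal price p* = {res.x:.2f}, profit = {-res.fun:.2f}")
# With these assumed numbers, calculus gives p* = 30 and profit 800.
```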

Examples of NLPs EXAMPLE 2 If K units of capital and L units of labor are used, a company can produce KL units of a manufactured good. Capital can be purchased at $4/unit and labor can be purchased at $1/unit. A total of $8 is available to purchase capital and labor. How can the firm maximize the quantity of the good that is manufactured?

Solution Let K = units of capital purchased and L = units of labor purchased. Then K and L must satisfy 4K + L ≤ 8, K ≥ 0, and L ≥ 0. Thus, the firm wants to solve the following constrained maximization problem:

max z = KL
s.t. 4K + L ≤ 8
     K, L ≥ 0
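A short numerical sketch of this NLP (our addition; the slides solve it graphically below), using scipy's SLSQP solver:

```python
# Maximize K*L subject to 4K + L <= 8, K >= 0, L >= 0.
from scipy.optimize import minimize

def neg_output(x):
    K, L = x
    return -K * L                      # negate: scipy minimizes

budget = {"type": "ineq", "fun": lambda x: 8.0 - 4.0 * x[0] - x[1]}
res = minimize(neg_output, x0=[1.0, 1.0], method="SLSQP",
               bounds=[(0, None), (0, None)], constraints=[budget])
K, L = res.x
print(f"K = {K:.3f}, L = {L:.3f}, KL = {K * L:.3f}")   # ~ K=1, L=4, z=4
```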

Differences Between NLPs and LPs The feasible region for any LP is a convex set. If an LP has an optimal solution, there is an extreme point of the feasible region that is optimal. Even if the feasible region for an NLP is a convex set, the optimal solution need not be an extreme point of the NLP’s feasible region.

Figure for Example 2: the feasible region bounded by the line 4K + L = 8, with the isoquants KL = 1, KL = 2, and KL = 4; the optimal point D is where KL = 4 touches the line 4K + L = 8.

Thus, the optimal solution to the example is z = 4, K = 1, L = 4 (point D). Point D is not an extreme point of the NLP's feasible region. In fact, the optimal solution for an NLP may not even be on the boundary of the feasible region.

Local Extremum DEFINITION 1 For any NLP (maximization), a feasible point x = (x1, x2, …, xn) is a local maximum if, for sufficiently small ε, any feasible point x' = (x1', x2', …, xn') having |xi − xi'| < ε (i = 1, 2, …, n) satisfies f(x) ≥ f(x'). Analogously, for a minimization problem, a point x is a local minimum if f(x) ≤ f(x') holds for all feasible x' that are close to x.

For an NLP, a local maximum need not be an optimal solution.

Convex and Concave Functions Let f(x1, x2, …, xn) be a function that is defined for all points (x1, x2, …, xn) in a convex set S. DEFINITION 2 A function f(x1, x2, …, xn) is a convex function on a convex set S if for any x' ∈ S and x'' ∈ S, f(cx' + (1−c)x'') ≤ cf(x') + (1−c)f(x'') holds for 0 ≤ c ≤ 1.

Figure: for a convex function y = f(x), the point B = (cx' + (1−c)x'', f(cx' + (1−c)x'')) on the graph lies below the point C = (cx' + (1−c)x'', cf(x') + (1−c)f(x'')) on the chord joining A = (x', f(x')) and D = (x'', f(x'')); hence f(cx' + (1−c)x'') ≤ cf(x') + (1−c)f(x'').

DEFINITION 3 A function f(x1, x2, …, xn) is a concave function on a convex set S if for any x' ∈ S and x'' ∈ S, f(cx' + (1−c)x'') ≥ cf(x') + (1−c)f(x'') holds for 0 ≤ c ≤ 1. From Definitions 2 and 3, we see that f(x1, x2, …, xn) is a convex function if and only if −f(x1, x2, …, xn) is a concave function.
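These defining inequalities can be spot-checked numerically. The sketch below is our illustration (seems_convex is a hypothetical helper, not part of the slides): it samples random x', x'', and c and tests Definition 2's inequality. One violation disproves convexity; passing every trial only suggests it.

```python
# Numerical spot-check of the convexity inequality on an interval [lo, hi].
import random

def seems_convex(f, lo, hi, trials=10000):
    """Return False if a sampled triple violates f(cx'+(1-c)x'') <= cf(x')+(1-c)f(x'')."""
    for _ in range(trials):
        x1, x2 = random.uniform(lo, hi), random.uniform(lo, hi)
        c = random.random()
        if f(c * x1 + (1 - c) * x2) > c * f(x1) + (1 - c) * f(x2) + 1e-9:
            return False
    return True

print(seems_convex(lambda x: x * x, -10, 10))    # True: x^2 is convex
print(seems_convex(lambda x: x ** 3, -10, 10))   # False: x^3 is not convex on R
```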

Figure: for a concave function y = f(x), the chord from A = (x', f(x')) to D = (x'', f(x'')) lies below the graph, so f(cx' + (1−c)x'') ≥ cf(x') + (1−c)f(x'').

Figure: a function y = f(x) that is neither convex nor concave; the chord between two points on the graph lies above the graph on part of [x', x''] and below it elsewhere.

A linear function of the form f(x)=ax+b is both a convex and a concave function. f(cx’+(1-c)x”)=a[cx’+(1-c)x”]+b =c(ax’+b)+(1-c)(ax”+b) =cf(x’)+(1-c)f(x”)

Theorem 1 Consider NLP(1) and assume it is a maximization problem. Suppose the feasible region S for NLP (1) is a convex set. If f(x) is concave on S, then any local maximum for NLP(1) is an optimal solution to this NLP. Theorem 1’ Consider NLP(1) and assume it is a minimization problem. Suppose the feasible region S for NLP(1) is a convex set. If f(x) is convex on S, then any local minimum for NLP(1) is an optimal solution to this NLP.

Theorems 1 and 1' demonstrate that if we are maximizing a concave function (or minimizing a convex function) over a convex feasible region S, then any local maximum (or local minimum) solves NLP (1). We will apply Theorems 1 and 1' repeatedly. We now explain how to determine whether a function f(x) of a single variable is convex or concave.

Theorem 2 Suppose f’’(x) exists for all x in a convex set S. Then f(x) is a convex function on S if and only if f’’(x) ≥0 for all x in S. Since f(x) is convex if and only if - f(x) is concave, Theorem 2’ must also be true. Theorem 2’ Suppose f’’(x) exists for all x in a convex set S. Then f(x) is a concave function on S if and only if f’’(x) ≤0 for all x in S.

Example
1. Show that f(x) = x² is a convex function on S = R.
2. Show that f(x) = x^(1/2) is a concave function on S = (0, ∞).
Solution
1. f''(x) = 2 ≥ 0, so f(x) is convex on S.
2. f''(x) = −(1/4)x^(−3/2) ≤ 0 on (0, ∞), so f(x) is a concave function on S.
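The same second-derivative checks can be run symbolically; a minimal sketch with sympy (our tooling choice, not the text's):

```python
# Second-derivative tests of Theorems 2 and 2', done symbolically.
import sympy as sp

x = sp.symbols("x", positive=True)   # positive=True models S = (0, inf)
print(sp.diff(x**2, x, 2))           # 2: nonnegative, so x**2 is convex
print(sp.diff(sp.sqrt(x), x, 2))     # -1/(4*x**(3/2)): nonpositive on (0, inf)
```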

How can we determine whether a function f(x1, x2, …, xn) of n variables is convex or concave on a set S ⊆ R^n? We assume that f(x1, x2, …, xn) has continuous second-order partial derivatives. We require three definitions.

Definition The Hessian of f(x1, x2, …, xn) is the n×n matrix whose ijth entry is ∂²f/∂xi∂xj. We let H(x1, x2, …, xn) denote the value of the Hessian at (x1, x2, …, xn). For example, if f(x1, x2) = x1³ + 2x1x2 + x2², then

H(x1, x2) = | 6x1  2 |
            |  2   2 |

Definition An ith principal minor of an n×n matrix is the determinant of any i×i matrix obtained by deleting n−i rows and the corresponding n−i columns of the matrix. Thus, for the matrix

| −2  −1 |
| −1  −4 |

the first principal minors are −2 and −4, and the second principal minor is −2(−4) − (−1)(−1) = 7. For any matrix, the first principal minors are just the diagonal entries of the matrix.

Definition The kth leading principal minor of an n×n matrix is the determinant of the k×k matrix obtained by deleting the last n−k rows and columns of the matrix. We let Hk(x1, x2, …, xn) denote the kth leading principal minor of the Hessian evaluated at the point (x1, x2, …, xn). Thus, if f(x1, x2) = x1³ + 2x1x2 + x2², then H1(x1, x2) = 6x1 and H2(x1, x2) = 6x1(2) − 2(2) = 12x1 − 4.

Theorem 3 Suppose f(x1, x2, …, xn) has continuous second-order partial derivatives for each point x = (x1, x2, …, xn) ∈ S. Then f(x1, x2, …, xn) is a convex function on S if and only if for each x ∈ S, all principal minors of H(x) are nonnegative.

Example: Show that f(x1, x2) = x1² + 2x1x2 + x2² is a convex function on S = R².

Solution The Hessian of f is

H(x1, x2) = | 2  2 |
            | 2  2 |

The first principal minors are the diagonal entries (both equal 2 ≥ 0). The second principal minor is 2(2) − 2(2) = 0 ≥ 0. Since all principal minors of H are nonnegative at every point, Theorem 3 shows that f(x1, x2) is a convex function on R².

Theorem 3' Suppose f(x1, x2, …, xn) has continuous second-order partial derivatives for each point x ∈ S. Then f(x1, x2, …, xn) is a concave function on S if and only if for each x ∈ S and k = 1, 2, …, n, all nonzero principal minors of order k have the same sign as (−1)^k.
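When the Hessian is constant, as in the example above, Theorems 3 and 3' reduce to a finite check. The sketch below (our illustration; principal_minors is a hypothetical helper) enumerates every principal minor with numpy; for a non-constant Hessian the same test would have to hold at every point of S.

```python
# Enumerate every i x i principal minor of a numeric Hessian and apply
# the sign tests of Theorems 3 (convex) and 3' (concave).
import itertools
import numpy as np

def principal_minors(H):
    """Yield (order i, determinant) for every principal minor of H."""
    n = H.shape[0]
    for i in range(1, n + 1):
        for rows in itertools.combinations(range(n), i):
            yield i, np.linalg.det(H[np.ix_(rows, rows)])

H = np.array([[2.0, 2.0], [2.0, 2.0]])   # Hessian of x1^2 + 2*x1*x2 + x2^2
convex = all(d >= -1e-9 for _, d in principal_minors(H))
concave = all(abs(d) < 1e-9 or (-1) ** i * d > 0
              for i, d in principal_minors(H))
print(convex, concave)   # True False: f is convex but not concave
```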

Solving NLPs with one variable In this section, we explain how to solve the NLP

max (or min) f(x)
s.t. a ≤ x ≤ b        (4)

(If b = ∞, the feasible region for NLP (4) is x ≥ a, and if a = −∞, the feasible region for (4) is x ≤ b.)

Solving NLPs with one variable To find the optimal solution to (4), we find all local maxima (or minima). A point that is a local maximum or a local minimum for (4) is called a local extremum. The optimal solution to (4) is then the local maximum (minimum) having the largest (smallest) value of f(x). Of course, if a = −∞ or b = ∞, (4) may have no optimal solution.

NLPs with no solution

Extremum candidates
Case 1: points where a < x < b and f'(x) = 0.
Case 2: points where f'(x) does not exist.
Case 3: the endpoints a and b of the interval [a, b].

Case 1: Points where a < x0 < b and f'(x0) = 0

Theorem 4: If f'(x0) = 0 and f''(x0) > 0, then x0 is a local minimum; if f'(x0) = 0 and f''(x0) < 0, then x0 is a local maximum.

Theorem 5: If f'(x0) = 0 and f''(x0) = 0, then (see the sketch after this list):
1. If the first non-vanishing (nonzero) derivative at x0 is an odd-order derivative (f'''(x0), f^(5)(x0), and so on), then x0 is not a local maximum or a local minimum.
2. If the first non-vanishing derivative at x0 is positive and is an even-order derivative, then x0 is a local minimum.
3. If the first non-vanishing derivative at x0 is negative and is an even-order derivative, then x0 is a local maximum.
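Theorem 5's recipe mechanizes nicely in sympy. The sketch below is our illustration (classify_stationary is a hypothetical helper, not part of the slides): at a stationary point x0 it walks up the derivatives until one is nonzero, then classifies by that derivative's order and sign.

```python
# Classify a stationary point via the first non-vanishing derivative.
import sympy as sp

def classify_stationary(f, x, x0, max_order=8):
    """Assumes f'(x0) = 0; applies the higher-order derivative test."""
    for k in range(2, max_order + 1):
        d = sp.diff(f, x, k).subs(x, x0)
        if d != 0:
            if k % 2 == 1:
                return "not a local extremum"
            return "local minimum" if d > 0 else "local maximum"
    return "inconclusive"

x = sp.symbols("x")
print(classify_stationary(x**3, x, 0))   # odd first nonzero derivative
print(classify_stationary(x**4, x, 0))   # even and positive -> local minimum
```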

Figure 14

Case 2: Points where f'(x) does not exist

If f(x) does not have a derivative at x0, then x0 may be a local maximum, a local minimum, or neither (see Figure 15). In this case, we determine whether x0 is a local maximum or a local minimum by checking the values of f(x) at points near x0. The four possible cases that can occur are summarized in Table 7.

Figure 15 (a)–(d): four functions with no derivative at x0; x0 is not a local extremum in (a) and (b), is a local maximum in (c), and is a local minimum in (d).

Table 7: If f'(x0) does not exist

Relationship near x0                                          x0                     Figure
f(x) < f(x0) on one side of x0, f(x) > f(x0) on the other     Not a local extremum   15a, 15b
f(x) ≤ f(x0) for all x near x0                                Local maximum          15c
f(x) ≥ f(x0) for all x near x0                                Local minimum          15d

Case 3: Endpoints a and b of [a, b] (Figure 16)

At an endpoint, the sign of the derivative decides: if f'(a) > 0, then a is a local minimum, and if f'(a) < 0, then a is a local maximum; similarly, if f'(b) > 0, then b is a local maximum, and if f'(b) < 0, then b is a local minimum.

Example 15 It costs a monopolist $5/unit to produce a product. If he produces x units of the product, each can be sold for 10 − x dollars (0 ≤ x ≤ 10). To maximize profit, how much should the monopolist produce?

Solution Let P(x) be the monopolist's profit if he produces x units. Then P(x) = x(10 − x) − 5x = 5x − x² (0 ≤ x ≤ 10). Thus, the monopolist wants to solve the following NLP: max P(x) s.t. 0 ≤ x ≤ 10.

Classify all extremum candidates:
Case 1: P'(x) = 5 − 2x, so P'(2.5) = 0. Since P''(x) = −2 < 0, x = 2.5 is a local maximum, yielding a profit of P(2.5) = 6.25.
Case 2: P'(x) exists for all points in [0, 10], so there are no Case 2 candidates.
Case 3: a = 0 has P'(0) = 5 > 0, so a = 0 is a local minimum; b = 10 has P'(10) = −15 < 0, so b = 10 is a local minimum.
Thus, x = 2.5 is the only local maximum: the monopolist should produce 2.5 units, for a profit of 6.25.
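As a quick numerical cross-check (our addition), a bounded scalar minimizer reproduces this answer:

```python
# Maximize P(x) = 5x - x^2 on [0, 10] by minimizing -P.
from scipy.optimize import minimize_scalar

res = minimize_scalar(lambda x: -(5 * x - x * x),
                      bounds=(0.0, 10.0), method="bounded")
print(f"x* = {res.x:.2f}, profit = {-res.fun:.2f}")   # x* = 2.50, profit = 6.25
```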

Example 16 Let

f(x) = 1 + 2x − x²,   0 ≤ x < 3
f(x) = x² − 8x + 13,  3 ≤ x ≤ 6

Find max f(x) s.t. 0 ≤ x ≤ 6.

Solution
Case 1: For 0 ≤ x < 3, f'(x) = 2 − 2x, so f'(1) = 0. Since f''(x) = −2 < 0, x = 1 is a local maximum. For 3 < x ≤ 6, f'(x) = 2x − 8, so f'(4) = 0. Since f''(x) = 2 > 0, x = 4 is a local minimum.
Case 2: f(x) has no derivative at x = 3 (for x slightly less than 3, f'(x) is near −4, and for x slightly bigger than 3, f'(x) is near −2). Since f(2.9) = −1.61, f(3) = −2, and f(3.1) = −2.19, x = 3 is not a local extremum.
Case 3: Since f'(0) = 2 > 0, x = 0 is a local minimum. Since f'(6) = 4 > 0, x = 6 is a local maximum.
Thus, on [0, 6], f(x) has a local maximum at x = 1 and at x = 6. Since f(1) = 2 and f(6) = 1, the optimal solution to the NLP occurs at x = 1.
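Because all extremum candidates are now known, solving the NLP reduces to evaluating f at each candidate. A short sketch (our addition, using the piecewise f written out above):

```python
# Example 16 as a candidate check: stationary points (x=1, x=4),
# the kink where f' does not exist (x=3), and the endpoints (x=0, x=6).
def f(x):
    return 1 + 2 * x - x * x if x < 3 else x * x - 8 * x + 13

candidates = [0, 1, 3, 4, 6]
best = max(candidates, key=f)
print(best, f(best))    # 1 2 -> the optimal solution is x = 1 with f(1) = 2
```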