Presentation on theme: "Nonlinear Programming I Li Xiaolei. Introductory concepts A general nonlinear programming problem (NLP) can be expressed as follows: objective function."— Presentation transcript:

1 Nonlinear Programming I Li Xiaolei

2 Introductory concepts A general nonlinear programming problem (NLP) can be expressed as follows:

max (or min) z = f(x1, x2, …, xn)                ← objective function
s.t.  g1(x1, x2, …, xn) (≤, =, or ≥) b1
      g2(x1, x2, …, xn) (≤, =, or ≥) b2          (1)
      …
      gm(x1, x2, …, xn) (≤, =, or ≥) bm          ← constraints

3 Introductory concepts DEFINITION The feasible region for NLP (1) is the set of points (x1,x2,…,xn) that satisfy the m constraints in (1). A point in the feasible region is a feasible point, and a point that is not in the feasible region is an infeasible point.

4 Introductory concepts DEFINITION For a maximization problem, any point x* in the feasible region for which f(x*) ≥ f(x) holds for all points x in the feasible region is an optimal solution to the NLP. For a minimization problem, x* is an optimal solution if f(x*) ≤ f(x) for all feasible x.

5 Examples of NLPs EXAMPLE 1 It costs a company c dollars per unit to manufacture a product. If the company charges p dollars per unit for the product, customers demand D(p) units. To maximize profits, what price should the firm charge? Solution The firm’s decision variable is p. Since the firm’s profit is (p-c)D(p), the firm wants to solve the following unconstrained maximization problem: max (p-c)D(p).
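To make Example 1 concrete, here is a minimal sketch in Python assuming a hypothetical linear demand curve D(p) = a − b·p (the parameters a, b and the helper names are illustrative, not from the slides). Setting the derivative of (p − c)(a − bp), which is a − 2bp + bc, to zero gives p* = (a + bc)/(2b).

```python
def profit(p, a, b, c):
    """Profit (p - c) * D(p) under the assumed linear demand D(p) = a - b*p."""
    return (p - c) * (a - b * p)

def optimal_price(a, b, c):
    """Maximizer of profit: solve a - 2*b*p + b*c = 0 for p."""
    return (a + b * c) / (2 * b)
```

For instance, with a = 100, b = 2, c = 10 this gives p* = 30, and nearby prices earn strictly less profit, consistent with the unconstrained problem max (p−c)D(p).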

6 Examples of NLPs EXAMPLE 2 If K units of capital and L units of labor are used, a company can produce KL units of a manufactured good. Capital can be purchased at $4/unit and labor can be purchased at $1/unit. A total of $8 is available to purchase capital and labor. How can the firm maximize the quantity of the good that can be manufactured?

7 Solution Let K = units of capital purchased and L = units of labor purchased. Then K and L must satisfy 4K + L ≤ 8, K ≥ 0, and L ≥ 0. Thus, the firm wants to solve the following constrained maximization problem:

max z = KL
s.t. 4K + L ≤ 8
     K ≥ 0, L ≥ 0

8 Differences Between NLPs and LPs The feasible region for any LP is a convex set. If an LP has an optimal solution, there is an extreme point of the feasible region that is optimal. Even if the feasible region for an NLP is a convex set, the optimal solution need not be an extreme point of the NLP’s feasible region.

9 Figure for EXAMPLE 2: the feasible region is bounded by the line 4K + L = 8 and the axes; level curves KL = 1, KL = 2, and KL = 4 are shown, and the optimal point D is where the curve KL = 4 touches the line 4K + L = 8.

10 Thus, the optimal solution to the example is z = 4, K = 1, L = 4 (point D). Point D is not an extreme point of the NLP’s feasible region. In fact, the optimal solution for an NLP may not be on the boundary of the feasible region.
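This claim is easy to verify numerically (a sketch, not part of the slides): since the optimum spends the whole budget, substitute L = 8 − 4K and search over K.

```python
def best_capital_labor(budget=8.0, price_k=4.0, price_l=1.0, steps=2000):
    """Grid search for max K*L subject to price_k*K + price_l*L <= budget.

    At an optimum the budget constraint binds, so we set
    L = (budget - price_k*K) / price_l and scan K over [0, budget/price_k].
    """
    best_z, best_k, best_l = 0.0, 0.0, 0.0
    for i in range(steps + 1):
        k = (budget / price_k) * i / steps
        l = (budget - price_k * k) / price_l
        if k * l > best_z:
            best_z, best_k, best_l = k * l, k, l
    return best_z, best_k, best_l
```

The search lands on K = 1, L = 4, z = 4, the interior point D described above.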

11 Local Extremum DEFINITION 1 For any NLP (maximization), a feasible point x=(x 1,x 2,…,x n ) is a local maximum if, for some sufficiently small ε > 0, any feasible point x’=(x 1 ’,x 2 ’,…,x n ’) having |x i -x i ’|<ε (i=1,2,…,n) satisfies f(x)≥f(x’). Analogously, for a minimization problem, a point x is a local minimum if f(x)≤f(x’) holds for all feasible x’ that are close to x.

12 For an NLP, a local maximum may not be an optimal solution.

13 Convex and Concave Function Let f(x 1,x 2,…,x n ) be a function that is defined for all points (x 1,x 2,…,x n ) in a convex set S. DEFINITION 2 A function f(x 1,x 2,…,x n ) is a convex function on a convex set S if for any x’ ∈ S and x’’ ∈ S, f(cx’+(1-c)x’’) ≤ cf(x’)+(1-c)f(x’’) holds for 0 ≤ c ≤ 1.

14 Figure: for a convex function, the chord joining A=(x’,f(x’)) and D=(x”,f(x”)) lies on or above the graph y=f(x). With C=(cx’+(1-c)x”, cf(x’)+(1-c)f(x”)) on the chord and B=(cx’+(1-c)x”, f(cx’+(1-c)x”)) on the graph, the figure shows f(cx’+(1-c)x”) ≤ cf(x’)+(1-c)f(x”).

15 DEFINITION 3 A function f(x 1,x 2,…,x n ) is a concave function on a convex set S if for any x’ ∈ S and x’’ ∈ S, f(cx’+(1-c)x’’) ≥ cf(x’)+(1-c)f(x’’) holds for 0 ≤ c ≤ 1. From Definitions 2 and 3, we see that f(x 1,x 2,…,x n ) is a convex function if and only if -f(x 1,x 2,…,x n ) is a concave function.

16 Figure: for a concave function, the chord joining A=(x’,f(x’)) and D=(x”,f(x”)) lies on or below the graph y=f(x). With C=(cx’+(1-c)x”, cf(x’)+(1-c)f(x”)) on the chord and B=(cx’+(1-c)x”, f(cx’+(1-c)x”)) on the graph, the figure shows f(cx’+(1-c)x”) ≥ cf(x’)+(1-c)f(x”).

17 Figure: a graph y=f(x) with points A, B, C at x’, cx’+(1-c)x”, and x”; the chord lies above the graph on part of the interval and below it elsewhere, so f(x) is not a convex or a concave function.

18 A linear function of the form f(x)=ax+b is both a convex and a concave function. f(cx’+(1-c)x”)=a[cx’+(1-c)x”]+b =c(ax’+b)+(1-c)(ax”+b) =cf(x’)+(1-c)f(x”)
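Definitions 2 and 3 can be spot-checked numerically on sample points. The following sketch (the helper name and test points are ours) tests the convexity inequality for c = 0, 0.1, …, 1 over all pairs drawn from a list of points:

```python
def convex_on_pairs(f, xs, num_c=11, tol=1e-12):
    """Return True if f(c*x' + (1-c)*x'') <= c*f(x') + (1-c)*f(x'')
    holds for every pair of points from xs and c = 0, 0.1, ..., 1."""
    for xp in xs:
        for xpp in xs:
            for i in range(num_c):
                c = i / (num_c - 1)
                mid = c * xp + (1 - c) * xpp
                if f(mid) > c * f(xp) + (1 - c) * f(xpp) + tol:
                    return False
    return True
```

As the slide states, a linear function such as 3x + 1 passes this test both for f and for −f (both convex and concave), x² passes only for f, and x³ fails on points straddling 0.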

19 Theorem 1 Consider NLP(1) and assume it is a maximization problem. Suppose the feasible region S for NLP (1) is a convex set. If f(x) is concave on S, then any local maximum for NLP(1) is an optimal solution to this NLP. Theorem 1’ Consider NLP(1) and assume it is a minimization problem. Suppose the feasible region S for NLP(1) is a convex set. If f(x) is convex on S, then any local minimum for NLP(1) is an optimal solution to this NLP.

20 Theorems 1 and 1’ demonstrate that if we are maximizing a concave function (or minimizing a convex function) over a convex feasible region S, then any local maximum (or local minimum) will solve NLP (1). We will repeatedly apply Theorems 1 and 1’. We now explain how to determine whether a function f(x) of a single variable is convex or concave.

21 Theorem 2 Suppose f’’(x) exists for all x in a convex set S. Then f(x) is a convex function on S if and only if f’’(x) ≥0 for all x in S. Since f(x) is convex if and only if - f(x) is concave, Theorem 2’ must also be true. Theorem 2’ Suppose f’’(x) exists for all x in a convex set S. Then f(x) is a concave function on S if and only if f’’(x) ≤0 for all x in S.

22 Example 1. Show that f(x) = x² is a convex function on S = R 1. 2. Show that f(x) = x^(1/2) is a concave function on S = (0,∞). Solution 1. f’’(x) = 2 ≥ 0, so f(x) is convex on S. 2. f’’(x) = -x^(-3/2)/4 ≤ 0 for x > 0, so f(x) is a concave function on S.
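Theorems 2 and 2’ can also be checked with a central-difference estimate of f’’(x). A numeric sketch (the helper name is ours):

```python
import math

def second_derivative(f, x, h=1e-4):
    """Central-difference approximation of f''(x)."""
    return (f(x + h) - 2.0 * f(x) + f(x - h)) / (h * h)
```

The estimate for x² is about 2 at every point (convex), and for the square root it is about −1/4 at x = 1 and negative throughout (0,∞) (concave), matching the example.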

23 How can we determine whether a function f(x 1,x 2,…,x n ) of n variables is convex or concave on a set S ⊆ R n ? We assume that f(x 1,x 2,…,x n ) has continuous second-order partial derivatives. We require three definitions.

24 Definition The Hessian of f(x 1,x 2,…,x n ) is the n×n matrix whose ij-th entry is ∂²f/∂x i ∂x j. We let H(x 1,x 2 ,…,x n ) denote the value of the Hessian at (x 1,x 2,…,x n ). For example, if f(x 1,x 2 ) = x 1 ³ + 2x 1 x 2 + x 2 ², then

H(x 1,x 2 ) =
| 6x 1   2 |
|  2     2 |

25 Definition An i-th principal minor of an n×n matrix is the determinant of any i×i matrix obtained by deleting n–i rows and the corresponding n–i columns of the matrix. Thus, for the matrix

| -2  -1 |
| -1  -4 |

the first principal minors are -2 and -4, and the second principal minor is -2(-4)-(-1)(-1)=7. For any matrix, the first principal minors are just the diagonal entries of the matrix.

26 Definition The k-th leading principal minor of an n×n matrix is the determinant of the k×k matrix obtained by deleting the last n-k rows and columns of the matrix. We let H k (x 1,x 2,…,x n ) be the k-th leading principal minor of the Hessian matrix evaluated at the point (x 1,x 2,…,x n ). Thus, if f(x 1,x 2 ) = x 1 ³ + 2x 1 x 2 + x 2 ², then H 1 (x 1,x 2 ) = 6x 1, and H 2 (x 1,x 2 ) = 6x 1 (2)-2(2) = 12x 1 -4.
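These minors can be computed mechanically. A small sketch (helper names are ours; cofactor expansion is fine at this size), applied to the 2×2 matrix with entries -2, -1, -1, -4 from the previous definition:

```python
from itertools import combinations

def det(m):
    """Determinant by cofactor expansion along the first row (small matrices)."""
    n = len(m)
    if n == 1:
        return m[0][0]
    total = 0
    for j in range(n):
        minor = [row[:j] + row[j + 1:] for row in m[1:]]
        total += (-1) ** j * m[0][j] * det(minor)
    return total

def principal_minors(m, i):
    """All i-th principal minors: keep the same i rows and columns."""
    n = len(m)
    return [det([[m[r][c] for c in idx] for r in idx])
            for idx in combinations(range(n), i)]

def leading_principal_minor(m, k):
    """k-th leading principal minor: keep the first k rows and columns."""
    return det([row[:k] for row in m[:k]])
```

For that matrix the first principal minors come out as -2 and -4 and the second principal minor as 7, matching the text.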

27 Theorem 3 Suppose f(x 1,x 2,…,x n ) has continuous second-order partial derivatives for each point x = (x 1,x 2,…,x n ) ∈ S. Then f(x 1,x 2,…,x n ) is a convex function on S if and only if for each x ∈ S, all principal minors of H are nonnegative. Example: Show that f(x 1,x 2 ) = x 1 ² + 2x 1 x 2 + x 2 ² is a convex function on S = R 2.

28 Solution Here

H(x 1,x 2 ) =
| 2  2 |
| 2  2 |

The first principal minors are the diagonal entries (both equal 2 ≥ 0). The second principal minor is 2(2)-2(2) = 0 ≥ 0. Since for all points (x 1,x 2 ), all principal minors of H are nonnegative, Theorem 3 shows that f(x 1,x 2 ) is a convex function on R 2.

29 Theorem 3’ Suppose f(x 1,x 2,…,x n ) has continuous second-order partial derivatives for each point x ∈ S. Then f(x 1,x 2,…,x n ) is a concave function on S if and only if for each x ∈ S and i = 1,2,…,n, all nonzero i-th principal minors have the same sign as (-1)^i.
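Theorems 3 and 3’ together give a mechanical test for a function with a constant Hessian. A self-contained sketch (function names are our own choosing):

```python
from itertools import combinations

def _det(m):
    """Determinant by cofactor expansion along the first row."""
    if len(m) == 1:
        return m[0][0]
    return sum((-1) ** j * m[0][j] * _det([r[:j] + r[j + 1:] for r in m[1:]])
               for j in range(len(m)))

def hessian_class(h):
    """Classify a constant Hessian h via its principal minors.

    Theorem 3: convex iff every principal minor is >= 0.
    Theorem 3': concave iff every nonzero i-th principal minor
    has the same sign as (-1)**i.
    """
    n = len(h)
    convex = concave = True
    for i in range(1, n + 1):
        for idx in combinations(range(n), i):
            d = _det([[h[r][c] for c in idx] for r in idx])
            if d < 0:
                convex = False
            if d != 0 and (d > 0) != (i % 2 == 0):
                concave = False
    if convex and concave:
        return "both"
    if convex:
        return "convex"
    if concave:
        return "concave"
    return "neither"
```

For example, the Hessian with all entries 2 (from the Theorem 3 example) is classified convex, while the matrix with rows (-2, -1) and (-1, -4) from the principal-minor definition is classified concave: its first principal minors are negative and its second principal minor, 7, is positive, matching the (-1)^i sign pattern.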

30 Solving NLPs with one variable In this section, we explain how to solve the NLP:

max (or min) f(x)
s.t. x ∈ [a, b]        (4)

(If b = ∞, the feasible region for NLP (4) is x ≥ a, and if a = -∞, the feasible region for (4) is x ≤ b.)

31 Solving NLPs with one variable To find the optimal solution to (4), we find all local maxima (or minima). A point that is a local maximum or minimum for (4) is called a local extremum. Then the optimal solution to (4) is the local maximum (minimum) having the largest (smallest) value of f(x). Of course, if a = -∞ or b = ∞, (4) may have no optimal solution.

32 NLPs with no solution

33 Extremum candidates
Case 1 Points where a < x < b and f’(x) = 0
Case 2 Points where f’(x) does not exist
Case 3 Endpoints a and b of the interval [a,b]

34 Case 1. Points where a < x 0 < b and f’(x 0 ) = 0
Theorem 4: If f’(x 0 ) = 0 and f’’(x 0 ) > 0, then x 0 is a local minimum.
Theorem 5: If f’(x 0 ) = 0 and f’’(x 0 ) < 0, then x 0 is a local maximum.
If f’(x 0 ) = 0 and f’’(x 0 ) = 0, we look at higher-order derivatives:
1 If the first non-vanishing (nonzero) derivative at x 0 is an odd-order derivative (the third, fifth, and so on), then x 0 is not a local maximum or a local minimum.

35 2 If the first non-vanishing derivative at x 0 is positive and is an even-order derivative, then x 0 is a local minimum. 3 If the first non-vanishing derivative at x 0 is negative and is an even-order derivative, then x 0 is a local maximum.
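The three rules for a vanishing second derivative amount to scanning the higher derivatives in order. A sketch (the function and its input convention are ours): pass the values f’’(x0), f’’’(x0), … as a list.

```python
def classify_stationary(derivs):
    """Classify a stationary point (f'(x0) = 0) from higher derivatives.

    derivs[k] holds the value of the (k+2)-th derivative at x0,
    so derivs = [f''(x0), f'''(x0), ...].
    """
    for k, d in enumerate(derivs):
        if d != 0:
            order = k + 2
            if order % 2 == 1:               # rule 1: odd order
                return "not a local extremum"
            # rules 2 and 3: even order, sign decides
            return "local minimum" if d > 0 else "local maximum"
    return "inconclusive"
```

For f(x) = x⁴ at x0 = 0 the first nonzero derivative is the fourth, f⁗(0) = 24 (even, positive), so x0 is a local minimum; for f(x) = x³ at 0 it is the third, f’’’(0) = 6 (odd), so x0 is neither.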

36 Figure 14

39 Case 2. Points where f’(x) does not exist If f(x) does not have a derivative at x 0, then x 0 may be a local maximum, a local minimum, or neither (see Figure 15). In this case, we determine whether x 0 is a local maximum or a local minimum by checking the values of f(x) at points x 1 < x 0 and x 2 > x 0 near x 0. The four possible cases that can occur are summarized in Table 7.

40 Figure 15 (a) (b)

41 Figure 15 (c) (d)

42 Table 7 If f’(x 0 ) does not exist (x 1 and x 2 are feasible points near x 0, with x 1 < x 0 < x 2 )

Relationship                              x 0                    Figure
f(x 1 ) < f(x 0 ) < f(x 2 )               Not a local extremum   15a
f(x 1 ) > f(x 0 ) > f(x 2 )               Not a local extremum   15b
f(x 0 ) > f(x 1 ) and f(x 0 ) > f(x 2 )   Local maximum          15c
f(x 0 ) < f(x 1 ) and f(x 0 ) < f(x 2 )   Local minimum          15d
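Table 7 translates directly into a comparison of nearby function values. A numeric sketch (the helper name is ours), assuming ε is small enough that f is monotone on each side of x0:

```python
def classify_by_neighbors(f, x0, eps=0.1):
    """Table 7 as code: compare f at x0 with values just left and right."""
    f_left, f_mid, f_right = f(x0 - eps), f(x0), f(x0 + eps)
    if f_mid > f_left and f_mid > f_right:
        return "local maximum"
    if f_mid < f_left and f_mid < f_right:
        return "local minimum"
    return "not a local extremum"
```

For instance, |x| has no derivative at 0 yet is classified a local minimum there, and −|x| a local maximum.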

43 Case 3. Endpoints a and b of [a,b] (Figure 16) If f’(a) > 0, then f(x) is increasing near a, so the endpoint a is a local minimum; if f’(a) < 0, then a is a local maximum.

44 Case 3. Endpoints a and b of [a,b] (Figure 16) If f’(b) > 0, then f(x) is increasing as x approaches b, so the endpoint b is a local maximum; if f’(b) < 0, then b is a local minimum.

45 Example 15 It costs a monopolist $5/unit to produce a product. If he produces x units of the product, each can be sold for 10-x dollars (0 ≤ x ≤ 10). To maximize profit, how much should the monopolist produce?

46 Solution Let P(x) be the monopolist’s profit if he produces x units. Then P(x) = x(10-x) - 5x = 5x - x² (0 ≤ x ≤ 10). Thus, the monopolist wants to solve the following NLP: max P(x) s.t. 0 ≤ x ≤ 10.

47 Classify all extremum candidates: Case 1 P’(x) = 5-2x, so P’(2.5) = 0. Since P’’(x) = -2 < 0, x = 2.5 is a local maximum yielding a profit of P(2.5) = 6.25. Case 2 P’(x) exists for all points in [0,10], so there are no Case 2 candidates. Case 3 a = 0 has P’(0) = 5 > 0, so a = 0 is a local minimum; b = 10 has P’(10) = -15 < 0, so b = 10 is a local minimum. Thus, x = 2.5 is the only local maximum, so the monopolist should produce 2.5 units.
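A brute-force check of this answer (a sketch; grid search stands in for the three-case analysis):

```python
def solve_monopolist(c=5.0, a=10.0, steps=1000):
    """Maximize P(x) = x*(a - x) - c*x on [0, a] by grid search."""
    best_x, best_p = 0.0, float("-inf")
    for i in range(steps + 1):
        x = a * i / steps
        p = x * (a - x) - c * x
        if p > best_p:
            best_x, best_p = x, p
    return best_x, best_p
```

The search returns x = 2.5 with profit 6.25, agreeing with the Case 1 candidate.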

48 Example 16 Let

f(x) = -x² + 2x + 1 for 0 ≤ x < 3
f(x) = x² - 8x + 13 for 3 ≤ x ≤ 6

Find max f(x) s.t. 0 ≤ x ≤ 6.

49 Solution Case 1 For 0 ≤ x < 3, f’(x) = -2x + 2, so f’(1) = 0. Since f’’(1) = -2 < 0, x = 1 is a local maximum, with f(1) = 2. For 3 < x ≤ 6, f’(x) = 2x - 8, so f’(4) = 0. Since f’’(4) = 2 > 0, x = 4 is a local minimum. Case 2 We see that f(x) has no derivative at x = 3 (for x slightly less than 3, f’(x) is near -4, and for x slightly bigger than 3, f’(x) is near -2). Since f(2.9) = -1.61, f(3) = -2, and f(3.1) = -2.19, x = 3 is not a local extremum. Case 3 Since f’(0) = 2 > 0, x = 0 is a local minimum. Since f’(6) = 4 > 0, x = 6 is a local maximum. Thus, on [0,6], f(x) has local maxima at x = 1 and x = 6. Since f(1) = 2 and f(6) = 1, the optimal solution to the NLP occurs at x = 1.
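The same brute-force idea confirms this solution. The piecewise formula below is our reconstruction of the Example 16 objective, inferred from the values quoted in the solution (f(2.9) = -1.61, f(3) = -2, f(3.1) = -2.19, f(1) = 2, f(6) = 1); the helper names are ours.

```python
def f16(x):
    """Reconstructed piecewise objective of Example 16."""
    if x < 3:
        return -x * x + 2 * x + 1
    return x * x - 8 * x + 13

def maximize_on_grid(f, a=0.0, b=6.0, steps=600):
    """Grid search for max f(x) on [a, b]."""
    best_x, best_v = a, f(a)
    for i in range(1, steps + 1):
        x = a + (b - a) * i / steps
        v = f(x)
        if v > best_v:
            best_x, best_v = x, v
    return best_x, best_v
```

The search lands on x = 1 with f(1) = 2, the optimal solution found by the case analysis.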

