Nonlinear programming (presentation transcript)

1 Nonlinear programming
Course in spring 2019: Nonlinear Programming (非線性規劃). Offered by Shun-Feng Su (蘇順豐), Department of Electrical Engineering, National Taiwan University of Science and Technology.

2 Preface
Preface: Optimization is central to decision making and to finding good solutions in many research problems. In this course, I shall provide fundamental concepts and ideas about optimization, especially for nonlinear problems. Such a topic is usually called nonlinear programming.

3 Fundamentals of Optimization
Fundamentals of Optimization: In general, an optimization problem requires finding a setting of the variable vector (or parameters) of the system such that an objective function is optimized. Sometimes, the variable vector may have to satisfy some constraints. Alternatives are chosen among values → a numerical approach. This is why optimization is considered one part of computational intelligence.

4 Traditional Optimization
Traditional Optimization: A traditional optimization problem can be expressed as finding x that Min (or Max) f(x) subject to x ∈ Ω. f(·) is the objective function to be optimized, also called the performance index, cost function, fitness function, etc.

5 Fundamentals of Optimization
Fundamentals of Optimization: If some constraint like x ∈ Ω is specified, the problem is referred to as a constrained optimization problem; otherwise it is called an unconstrained optimization problem. If f(·) is linear and Ω is polyhedral, the problem is a linear programming problem; otherwise it is a nonlinear programming problem.

6 Nonlinear Programming
Course details: Nonlinear Programming, Spring 2019. Prerequisite: Basic Engineering Mathematics. Instructor: Shun-Feng Su. Office: T. Phone: ext. 6704. Classroom: TBD. Time: Monday 09:10~12:10.

7 Course details
Course details: References: E. K. P. Chong and S. H. Żak, An Introduction to Optimization. Class notes: available for download; select the course information and then click the nonlinear programming icon. Tests: one midterm and one final.

8 Tentative Outline** Preface Fundamentals of Optimization
Tentative Outline**: Preface; Fundamentals of Optimization; Unconstrained Optimization; Linear Programming; Constrained Optimization; Non-derivative Approaches. ** The contents actually covered will depend on the available time.

10 Fundamentals of Optimization
Fundamentals of Optimization: Optimization is to find the best one among all possible alternatives. It is easy to see that optimization is always a good means of demonstrating your research results. But the trick is: what do you mean by "better"?

11 Fundamentals of Optimization
Fundamentals of Optimization: Optimization is to find the best one among all possible alternatives. It is easy to see that optimization is always a good means of demonstrating your research results. But the trick is: what do you mean by "better"? Why is the optimal one better than the others? In other words, based on which criterion is the evaluation conducted?

12 Fundamentals of Optimization
Fundamentals of Optimization: The measure of goodness of alternatives is described by a so-called objective function or performance index. Thus, when you see "optimal", you should first check what objective function is used. Optimization then is to maximize or minimize the objective function considered. Other terms used are cost function (to be minimized), fitness function (to be maximized), etc.

13 Fundamentals of Optimization
Fundamentals of Optimization: In general, an optimization problem requires finding a setting of the variable vector (or parameters) of the system such that an objective function is optimized. Sometimes, the variable vector may have to satisfy some constraints. Alternatives are chosen among values → a numerical approach. This is why optimization is considered one part of computational intelligence.

14 Fundamentals of Optimization
Fundamentals of Optimization: A traditional optimization problem can be expressed as finding x that Min (or Max) f(x) subject to x ∈ Ω (Ω ⊆ Rⁿ). f(·) (f: Rⁿ → R) is the objective function to be optimized, also called the performance index, cost function, fitness function, etc.

15 Fundamentals of Optimization
Fundamentals of Optimization: If some constraint like x ∈ Ω is specified, the problem is referred to as a constrained optimization problem; otherwise it is called an unconstrained optimization problem. If f(·) is linear and Ω is polyhedral, the problem is a linear programming problem; otherwise it is a nonlinear programming problem.
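As an aside (not part of the original slides), the unconstrained case of the problem statement above can be posed numerically in a few lines. The sketch below uses scipy.optimize.minimize on an arbitrary illustrative objective; the function f and the starting point x0 are my own choices.

    # Minimal sketch (illustrative only): min f(x), f: R^n -> R, unconstrained (Omega = R^n).
    import numpy as np
    from scipy.optimize import minimize

    def f(x):
        # arbitrary illustrative objective: (x1 - 1)^2 + 4*(x2 + 2)^2
        return (x[0] - 1.0) ** 2 + 4.0 * (x[1] + 2.0) ** 2

    x0 = np.array([0.0, 0.0])   # initial guess
    res = minimize(f, x0)       # default quasi-Newton (BFGS) solver for unconstrained problems
    print(res.x, res.fun)       # minimizer near [1, -2], minimum near 0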

16 Fundamentals of Optimization
Fundamentals of Optimization: The constraint x ∈ Ω is called a set constraint. Usually, Ω = {x | h(x) = 0, g(x) ≤ 0}, where h and g are given functions. This kind of description of Ω is referred to as functional constraints. When Ω = Rⁿ, as mentioned, it becomes an unconstrained optimization problem.

17 Fundamentals of Optimization
Fundamentals of Optimization: The optimization problem is to find the "best" vector x over all possible vectors in Ω. This point (vector) x* is called the minimizer of f(x) over Ω. Note that minimizers may not be unique. A similar idea applies to the maximization problem; a possible way is max f = −min(−f).

18 Fundamentals of Optimization
Fundamentals of Optimization: Definitions. Local minimizer: suppose that f: Rⁿ → R is a real-valued function defined on some Ω ⊆ Rⁿ. A point x* ∈ Ω is a local minimizer of f over Ω if there exists ε > 0 such that f(x) ≥ f(x*) for all x ∈ Ω \ {x*} with ||x − x*|| < ε. Global minimizer: a point x* is a global minimizer if f(x) ≥ f(x*) for all x ∈ Ω \ {x*}. If f(x) ≥ f(x*) is replaced by f(x) > f(x*), x* is a strict local (or global) minimizer.

19 Fundamentals of Optimization
(figure omitted)

20 Fundamentals of Optimization
Fundamentals of Optimization: Given an optimization problem with a constraint set Ω, a minimizer may lie either in the interior or on the boundary of Ω. To handle both cases, the notion of feasible directions is needed. Definition: Feasible direction. A vector d ∈ Rⁿ, d ≠ 0, is a feasible direction at x ∈ Ω if there exists α₀ > 0 such that x + αd ∈ Ω for all α ∈ [0, α₀].

21 Fundamentals of Optimization
Fundamentals of Optimization: Necessary conditions for a local minimizer. First-order necessary condition (FONC): for any feasible direction d at x*, dᵀ∇f(x*) ≥ 0. It is easy to understand: all feasible points around x* have values no smaller than f(x*), which is why x* is called a local minimizer. For an interior point (or the unconstrained case), the FONC becomes ∇f(x*) = 0: zero slope at a local minimum x*.

22 Fundamentals of Optimization
Fundamentals of Optimization: Example: Min f(x1, x2) subject to x1 ≥ 0, x2 ≥ 0, where the gradient is ∇f(x) = [2x1, x2 + 3]ᵀ. Do the following points satisfy the FONC? [1, 3]ᵀ, [0, 3]ᵀ, [1, 0]ᵀ, and [0, 0]ᵀ. Ans: [1, 3]ᵀ is an interior point and ∇f(x) = [2, 6]ᵀ ≠ 0, so no. At [0, 3]ᵀ, ∇f(x) = [0, 6]ᵀ; feasible directions have d1 ≥ 0 and d2 arbitrary, so dᵀ∇f(x) = 6d2 may not always be nonnegative: no. Similarly for [1, 0]ᵀ: no. For [0, 0]ᵀ, ∇f(x) = [0, 3]ᵀ; feasible directions have d1 ≥ 0 and d2 ≥ 0, so dᵀ∇f(x) = 3d2 is always nonnegative: yes.
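A small numerical spot-check of this example (a sketch, not from the slides): it assumes the gradient ∇f(x) = [2x1, x2 + 3]ᵀ reported in the answer and the feasible set x1, x2 ≥ 0, and samples feasible directions at each candidate point.

    # FONC spot-check (sketch): sample feasible directions d at each point and
    # test whether d^T grad_f(x) >= 0 for all of them.
    import numpy as np

    def grad_f(x):
        return np.array([2.0 * x[0], x[1] + 3.0])   # gradient assumed from the example

    def satisfies_fonc(x, trials=100_000, seed=0):
        g = grad_f(x)
        d = np.random.default_rng(seed).standard_normal((trials, 2))
        for i in range(2):
            if x[i] == 0.0:            # active bound x_i = 0: feasible directions need d_i >= 0
                d[:, i] = np.abs(d[:, i])
        return bool(np.all(d @ g >= -1e-12))

    for p in ([1, 3], [0, 3], [1, 0], [0, 0]):
        print(p, satisfies_fonc(np.array(p, dtype=float)))
    # expected: only [0, 0] passes, matching the slide's answer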

23 Fundamentals of Optimization
Fundamentals of Optimization: Necessary conditions for a local minimizer. If a point satisfies the FONC, it can further be checked whether it satisfies the SONC. Second-order necessary condition (SONC): dᵀ∇²f(x*)d ≥ 0 for every feasible direction d. ∇²f(x*) is called the Hessian matrix of f, also written H(x*). In the unconstrained case, dᵀ∇²f(x*)d ≥ 0 for all d means that ∇²f(x*) is positive semidefinite (f has nonnegative curvature).

24 Fundamentals of Optimization
Fundamentals of Optimization: Example: Min f(x) = x1² − x2², no constraint. The FONC requires ∇f(x) = [2x1, −2x2]ᵀ = 0, thus x* = [0, 0]ᵀ. The Hessian is ∇²f(x*) = [[2, 0], [0, −2]]. It can easily be checked that the Hessian matrix is not positive semidefinite (for example, selecting d = [0, 1]ᵀ gives dᵀ∇²f(x*)d = −2 < 0). Thus x* = [0, 0]ᵀ is not a local minimizer.
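The Hessian check for this example can also be done numerically in a couple of lines (a sketch; the matrix is the Hessian of f(x) = x1² − x2² stated above).

    # SONC check at x* = [0, 0] for f(x) = x1^2 - x2^2 (sketch).
    import numpy as np

    H = np.array([[2.0, 0.0],
                  [0.0, -2.0]])        # Hessian of f
    print(np.linalg.eigvalsh(H))       # [-2.  2.]: a negative eigenvalue, so not positive semidefinite

    d = np.array([0.0, 1.0])
    print(d @ H @ d)                   # -2.0 < 0, so the SONC fails and x* is not a local minimizer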

25 Fundamentals of Optimization
Fundamentals of Optimization: The figure of f(x) = x1² − x2² (figure omitted).

26 Fundamentals of Optimization
Fundamentals of Optimization: Necessary conditions for a local minimizer. If a point is a local minimizer, it must satisfy the FONC and the SONC; in other words, ∇f(x*) = 0 and dᵀ∇²f(x*)d ≥ 0. But a point that satisfies the FONC and the SONC may still not be a local minimizer.

27 Fundamentals of Optimization
Fundamentals of Optimization: Necessary conditions for a local minimum in an unconstrained optimization problem. 1st-order condition: zero slope at a local minimum x*, i.e., ∇f(x*) = 0. 2nd-order condition: nonnegative curvature at a local minimum x*, i.e., ∇²f(x*) is positive semidefinite. There may exist points that satisfy the above 1st- and 2nd-order conditions but are not local minima.

28 Fundamentals of Optimization
(figure omitted)

29 Fundamentals of Optimization
Fundamentals of Optimization: Proofs of necessary conditions. 1st-order condition, ∇f(x*) = 0 (unconstrained case): fix any direction d ∈ Rⁿ. Since x* is a local minimizer, f(x* + αd) ≥ f(x*) for all sufficiently small α ≥ 0. From the first-order Taylor expansion, f(x* + αd) = f(x*) + α dᵀ∇f(x*) + o(α), so dᵀ∇f(x*) ≥ 0. Replace d with −d to obtain dᵀ∇f(x*) ≤ 0. Hence dᵀ∇f(x*) = 0 for every d, and therefore ∇f(x*) = 0.

30 Fundamentals of Optimization
(figure omitted)

31 Fundamentals of Optimization
Fundamentals of Optimization: Sufficient conditions for a local minimizer in the interior case (or the unconstrained case). First-order sufficient condition (FOSC): ∇f(x*) = 0. Second-order sufficient condition (SOSC): dᵀ∇²f(x*)d > 0 for all d ≠ 0, i.e., ∇²f(x*) is positive definite. Together, these conditions imply that x* is a strict local minimizer.

32 Fundamentals of Optimization
(figure omitted)

33 Fundamentals of Optimization
Fundamentals of Optimization: Convex sets and convex functions. In optimization, convex sets and convex functions are usually considered. Definition: a set Ω is convex if, for any two points x, y ∈ Ω, w = αx + (1 − α)y is also in Ω for any α ∈ [0, 1]. Definition: a function f(x) is convex if, for any two points x, y ∈ Ω (a convex set), f(αx + (1 − α)y) ≤ αf(x) + (1 − α)f(y). Note: αx + (1 − α)y for α ∈ [0, 1] can be interpreted as any point on the line segment between x and y.
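The defining inequality can be spot-checked numerically; the sketch below uses an arbitrary convex quadratic as an illustration (the function is my own choice, not from the slides).

    # Spot-check of f(a*x + (1-a)*y) <= a*f(x) + (1-a)*f(y) on random samples (sketch).
    import numpy as np

    def f(x):
        return x[0] ** 2 + 0.5 * x[1] ** 2      # an illustrative convex function on R^2

    rng = np.random.default_rng(1)
    ok = True
    for _ in range(10_000):
        x, y = rng.standard_normal(2), rng.standard_normal(2)
        a = rng.uniform(0.0, 1.0)
        ok &= f(a * x + (1 - a) * y) <= a * f(x) + (1 - a) * f(y) + 1e-12
    print(ok)                                   # True: the inequality holds on all samples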

34 Fundamentals of Optimization
(figure omitted)

35 Fundamentals of Optimization
Fundamentals of Optimization: f is a convex function defined on a convex set Ω if and only if f(y) ≥ f(x) + ∇f(x)ᵀ(y − x) for all x, y ∈ Ω. Let f be a convex function defined on a convex set; then any (local) minimizer of f is a global minimizer. If f is a convex function defined on a convex set, then −f is a concave function.
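The first-order characterization above can also be spot-checked numerically; this sketch reuses the same illustrative convex quadratic and its gradient (both are my own assumptions, not from the slides).

    # Spot-check of f(y) >= f(x) + grad_f(x)^T (y - x) on random samples (sketch).
    import numpy as np

    def f(x):
        return x[0] ** 2 + 0.5 * x[1] ** 2

    def grad_f(x):
        return np.array([2.0 * x[0], x[1]])

    rng = np.random.default_rng(2)
    ok = True
    for _ in range(10_000):
        x, y = rng.standard_normal(2), rng.standard_normal(2)
        ok &= f(y) >= f(x) + grad_f(x) @ (y - x) - 1e-12
    print(ok)                                   # True for this convex f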

36 Fundamentals of Optimization
Fundamentals of Optimization: Selected homework for Prob 1: 6.1, 6.5, 6.7, 6.14, 6.18, and 6.22. For your exercise only; no turn-in is required.

37 Outline Preface Fundamentals of Optimization
Outline: Preface; Fundamentals of Optimization; Unconstrained Optimization: ideas of finding solutions, One-Dimensional Search, Gradient Methods, Newton's Method and Its Variations.

38 Unconstrained Optimization
Unconstrained Optimization: If the objective function can be explicitly expressed as a function of the parameters, traditional mathematical approaches can be employed to solve the optimization. Traditional optimization approaches can be classified into two categories: the direct approach and the incremental approach.

39 Unconstrained Optimization
Unconstrained Optimization: Direct approaches can be said to find the solution mathematically (to find a solution with certain properties). In a direct approach, the idea is to directly find x such that df(x)/dx = 0 or ∇f(x) = 0. This kind of approach is the Newton kind of approach. In optimization, the equation to solve is ∇f(x) = 0. Newton's method is a way of solving ∇f(x) = 0, and the approach used can also be iterative.

40 Unconstrained Optimization
Unconstrained Optimization: The incremental approach is to find which way can improve the current situation based on the current error (a feedback-like approach). Usually, an incremental approach updates the parameter vector as x(k+1) = x(k) + Δx. In fact, such an approach is usually realized as a gradient approach; that is, Δx = −η ∂f(x)/∂x, a step against the gradient with step size η. The key is to find a relationship between the current error and the change of the variable considered; that is why Δx = −η ∂f(x)/∂x is employed.
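A minimal incremental-update (gradient-descent) sketch of x(k+1) = x(k) + Δx with Δx = −η ∂f/∂x; the quadratic objective and the fixed step size η = 0.1 are illustrative assumptions, not from the slides.

    # Gradient-descent sketch: repeat x <- x - eta * grad_f(x).
    import numpy as np

    def grad_f(x):
        # gradient of the illustrative objective (x1 - 1)^2 + 4*(x2 + 2)^2
        return np.array([2.0 * (x[0] - 1.0), 8.0 * (x[1] + 2.0)])

    x = np.array([0.0, 0.0])
    eta = 0.1                          # step size
    for k in range(200):
        x = x - eta * grad_f(x)        # x(k+1) = x(k) + dx, dx = -eta * grad_f(x)
    print(x)                           # approaches the minimizer [1, -2]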

41 Unconstrained Optimization
Unconstrained Optimization: One-Dimensional Search (find solutions through pure search); Gradient Methods (find solutions through the incremental approach); Newton's Method and Its Variations (find solutions through the direct approach).

42 One-Dimensional Search
One-Dimensional Search: A very simple approach for finding the minimizer of a function of one variable (with only one local minimizer, i.e., a unimodal function) is to evaluate the function at different points and then progressively narrow the range down to a sufficient accuracy. This can be called line search or one-dimensional search.

43 One-Dimensional Search
One-Dimensional Search: Consider the interval [a0, b0]. Choose the intermediate points a1 and b1 so that a1 − a0 = b0 − b1 = ρ(b0 − a0), where ρ < 1/2.

44 One-Dimensional Search
One-Dimensional Search: Since f(x) is unimodal, it can be shown that when f(a1) < f(b1), the minimizer must lie in [a0, b1]; if not (i.e., f(a1) ≥ f(b1)), the minimizer must lie in [a1, b0]. The search then repeats the above process using the same ρ and makes one new intermediate point coincide with the point already evaluated in the previous step. For example, suppose x* ∈ [a0, b1]. Then, when selecting a2 and b2, make b2 coincide with a1.

45 One-Dimensional Search
One-Dimensional Search: With the same ρ (now searching in [a0, b1]), we have a2 − a0 = b1 − b2 = ρ(b1 − a0), where b2 = a1. Without loss of generality, assume the length of [a0, b0] is 1; then b1 − a0 = 1 − ρ and b1 − b2 = b1 − a1 = 1 − 2ρ. Since b1 − b2 = ρ(1 − ρ), we obtain ρ² − 3ρ + 1 = 0.

46 One-Dimensional Search
One-Dimensional Search: Then ρ = (3 − √5)/2 (taking the root with ρ < 1/2), i.e., ρ ≈ 0.382. Dividing a segment into the ratio ρ to 1 − ρ is called the Golden section (a famous ancient Greek geometry rule). Using the Golden section, after N steps the range will be reduced by the factor (1 − ρ)^N ≈ (0.618)^N. Thus, we can decide how many search steps are needed for a prescribed accuracy.

47 One-Dimensional Search
One-Dimensional Search: Example: consider f(x) = x⁴ − 14x³ + 60x² − 70x in the range [0, 2]. Find the minimizer within a range of 0.3. Ans: (0.618)^N ≤ 0.3/2 ⟹ N ≥ 4. Iteration 1: a1 = a0 + ρ(b0 − a0) = 0.7639, f(a1) = −24.36; b1 = a0 + (1 − ρ)(b0 − a0) = 1.2361, f(b1) = −18.96. Since f(a1) < f(b1), the next interval is [0, b1]. Continuing this process, the minimizer lies within [a4, b3] = [0.6525, 0.9443] (length 0.2918).
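A runnable sketch of the golden-section search applied to this example (the bracketing logic follows the slides; the function and variable names in the code are my own):

    # Golden-section search (sketch) for f(x) = x^4 - 14x^3 + 60x^2 - 70x on [0, 2].
    import math

    def golden_section(f, a, b, n_iters):
        rho = (3.0 - math.sqrt(5.0)) / 2.0          # ~0.382
        a1 = a + rho * (b - a)
        b1 = a + (1.0 - rho) * (b - a)
        fa1, fb1 = f(a1), f(b1)
        for _ in range(n_iters - 1):
            if fa1 < fb1:                           # minimizer in [a, b1]; reuse a1 as the new b1
                b, b1, fb1 = b1, a1, fa1
                a1 = a + rho * (b - a)
                fa1 = f(a1)
            else:                                   # minimizer in [a1, b]; reuse b1 as the new a1
                a, a1, fa1 = a1, b1, fb1
                b1 = a + (1.0 - rho) * (b - a)
                fb1 = f(b1)
        return (a, b1) if fa1 < fb1 else (a1, b)    # final bracketing interval

    f = lambda x: x**4 - 14*x**3 + 60*x**2 - 70*x
    print(golden_section(f, 0.0, 2.0, 4))           # ~(0.6525, 0.9443), length ~0.2918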

48 One-Dimensional Search
One-Dimensional Search: What if different ρ values are used in the process? With the same idea (only one new point evaluated at each step) but a different ρ_k at each step, the ratios must satisfy ρ_{k+1}(1 − ρ_k) = 1 − 2ρ_k, or equivalently ρ_{k+1} = 1 − ρ_k/(1 − ρ_k).

49 One-Dimensional Search
One-Dimensional Search: There are many sequences that can satisfy the above requirement. The reduction factor now is (1 − ρ1)(1 − ρ2)…(1 − ρN). Which sequence gives the maximal reduction? Min (1 − ρ1)(1 − ρ2)…(1 − ρN) subject to ρ_{k+1} = 1 − ρ_k/(1 − ρ_k), k = 1, …, N − 1, and 0 ≤ ρ_k ≤ 1/2, k = 1, …, N.

50 Fibonacci Search
Fibonacci Search: Fibonacci sequence F1, F2, …, Fk, …: with F₋₁ = 0 and F₀ = 1, and for k ≥ 0, F_{k+1} = F_k + F_{k−1}. Thus F1, F2, …, Fk, … = 1, 2, 3, 5, 8, 13, 21, 34, …. The solution to the above optimization problem is ρ1 = 1 − F_N/F_{N+1}, ρ2 = 1 − F_{N−1}/F_N, …, ρ_k = 1 − F_{N−k+1}/F_{N−k+2}, …, ρ_N = 1 − F1/F2.

51 One-Dimensional Search
One-Dimensional Search: The reduction factor is (1 − ρ1)(1 − ρ2)…(1 − ρN) = 1/F_{N+1}. However, note that F1 = 1 and F2 = 2, so ρ_N = 1 − F1/F2 = 1/2, which means the two intermediate points of the last step coincide and that step gives no reduction at all. We therefore use ρ_N = 1/2 − ε, where ε is a small number. The reduction factor then becomes (1 + 2ε)/F_{N+1}.

52 One-Dimensional Search
One-Dimensional Search: Example: consider f(x) = x⁴ − 14x³ + 60x² − 70x in the range [0, 2]. Find the minimizer within a range of 0.3. Ans: (1 + 2ε)/F_{N+1} ≤ 0.3/2 ⟹ N ≥ 4. Iteration 1: ρ1 = 1 − F4/F5 = 3/8; a1 = a0 + ρ1(b0 − a0) = 3/4, f(a1) = −24.34; b1 = a0 + (1 − ρ1)(b0 − a0) = 5/4, f(b1) = −18.65. Since f(a1) < f(b1), the next interval is [0, b1]. Continuing this process, the minimizer lies within [a4, b3] = [0.725, 1] (length 0.275).
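A runnable sketch of the Fibonacci search for the same example; for brevity it re-evaluates both intermediate points at every step instead of reusing the coincident one, and it assumes ε = 0.05 in the final step (the slides do not state ε explicitly).

    # Fibonacci search (sketch) for f(x) = x^4 - 14x^3 + 60x^2 - 70x on [0, 2], N = 4.
    def fibonacci_search(f, a, b, n_iters, eps=0.05):
        fib = [1, 2]                                 # F_1, F_2, ... = 1, 2, 3, 5, 8, ...
        while len(fib) < n_iters + 1:
            fib.append(fib[-1] + fib[-2])
        for k in range(1, n_iters + 1):
            if k < n_iters:
                rho = 1.0 - fib[n_iters - k] / fib[n_iters - k + 1]   # rho_k = 1 - F_{N-k+1}/F_{N-k+2}
            else:
                rho = 0.5 - eps                      # last step: avoid coincident points
            a_k = a + rho * (b - a)
            b_k = a + (1.0 - rho) * (b - a)
            if f(a_k) < f(b_k):
                b = b_k                              # minimizer lies in [a, b_k]
            else:
                a = a_k                              # minimizer lies in [a_k, b]
        return a, b

    f = lambda x: x**4 - 14*x**3 + 60*x**2 - 70*x
    print(fibonacci_search(f, 0.0, 2.0, 4))          # ~(0.725, 1.0), length 0.275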

53 One-Dimensional Search
One-Dimensional Search: Another approach to finding the minimizer is to use the function's quadratic approximation. This kind of approach is called Newton's method. Assume we can calculate f(x), f'(x), and f''(x). The quadratic approximation of f around x(k) is q(x) = f(x(k)) + f'(x(k))(x − x(k)) + (1/2)f''(x(k))(x − x(k))². It is easy to verify that q(x(k)) = f(x(k)), q'(x(k)) = f'(x(k)), and q''(x(k)) = f''(x(k)).

54 One-Dimensional Search
One-Dimensional Search: Applying the FONC to q, set q'(x) = 0, where q'(x) = f'(x(k)) + f''(x(k))(x − x(k)). Taking the solution as x(k+1), we get x(k+1) = x(k) − f'(x(k))/f''(x(k)). We can then continue the iterations as above. This kind of iteration for finding the next point is called Newton's method, which can also be used for multivariable functions (a direct approach).

55 One-Dimensional Search
One-Dimensional Search: Example: consider f(x) = (1/2)x² − sin(x). Ans: let x(0) = 0.5; the required accuracy is 10⁻⁵. f'(x) = x − cos(x) and f''(x) = 1 + sin(x). x(1) = 0.5 − (0.5 − cos(0.5))/(1 + sin(0.5)) = 0.7552. Similarly, x(2) = x(1) − f'(x(1))/f''(x(1)) = 0.7391, x(3) = 0.7390, and x(4) = 0.7390. Since |x(4) − x(3)| = 0 < 10⁻⁵, we can stop. Furthermore, f'(x(4)) = 8.6×10⁻⁶ ≈ 0 and f''(x(4)) = 1.673 > 0. It can be concluded that x* ≈ x(4) is a strict minimizer.
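A runnable sketch of this Newton iteration, using the same f'(x) = x − cos(x) and f''(x) = 1 + sin(x) as the example:

    # Newton's method (sketch) for f(x) = 0.5*x**2 - sin(x), starting from x0 = 0.5.
    import math

    def newton_1d(df, d2f, x, tol=1e-5, max_iters=50):
        for _ in range(max_iters):
            x_new = x - df(x) / d2f(x)       # x(k+1) = x(k) - f'(x(k)) / f''(x(k))
            if abs(x_new - x) < tol:
                return x_new
            x = x_new
        return x

    df  = lambda x: x - math.cos(x)          # f'(x)
    d2f = lambda x: 1.0 + math.sin(x)        # f''(x)
    x_star = newton_1d(df, d2f, 0.5)
    print(x_star, d2f(x_star))               # ~0.7391 with f''(x*) > 0: a strict local minimizer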

56 One-Dimensional Search
One-Dimensional Search: Newton's method works well if f''(x) > 0 everywhere, and it may fail if f''(x) < 0 for some x. Newton's method can also be viewed as a way to find the location where f'(x) = 0. This can be seen in traditional numerical approaches, and in our later study it is also used for multivariable functions. Another problem arises when the second derivative is not available.

57 One-Dimensional Search
One-Dimensional Search: When the second derivative is not available, f''(x(k)) can be approximated by the finite difference (f'(x(k)) − f'(x(k−1)))/(x(k) − x(k−1)). Then x(k+1) = x(k) − f'(x(k)) (x(k) − x(k−1))/(f'(x(k)) − f'(x(k−1))). This is called the secant method. Note that the secant method needs two initial points.
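A runnable sketch of the secant iteration, reusing f'(x) = x − cos(x) from the Newton example and two arbitrary starting points of my own choosing:

    # Secant method (sketch): approximate f'' by a finite difference of f'.
    import math

    def secant_min(df, x_prev, x, tol=1e-5, max_iters=50):
        for _ in range(max_iters):
            x_new = x - df(x) * (x - x_prev) / (df(x) - df(x_prev))
            if abs(x_new - x) < tol:
                return x_new
            x_prev, x = x, x_new
        return x

    df = lambda x: x - math.cos(x)           # same f'(x) as in the Newton example
    print(secant_min(df, 0.0, 0.5))          # ~0.7391, the same minimizer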

58 One-Dimensional Search
One-Dimensional Search: Selected homework for Prob 2: 7.2 (all).

