Nonlinear programming

Nonlinear Programming (非線性規劃)
Spring 2019
Offered by Shun-Feng Su (蘇順豐), E-mail: sfsu@mail.ntust.edu.tw
Department of Electrical Engineering, National Taiwan University of Science and Technology
© Shun-Feng Su

Preface
Optimization is central to many situations that involve making decisions or finding good solutions in research problems. In this course, I shall provide the fundamental concepts and ideas of optimization, especially for nonlinear problems. Such a topic is usually called nonlinear programming.

Course Details
Nonlinear Programming, Spring 2019
Prerequisite: Basic Engineering Mathematics
Instructor: Shun-Feng Su, Office: T2 502-3, Phone: ext. 6704, E-mail: sfsu@mail.ntust.edu.tw
Classroom: TBD
Time: 09:10~12:10, Monday

Course Details
Reference: E. K. P. Chong and S. H. Żak, An Introduction to Optimization.
Class notes: available at http://intelligence.ee.ntust.edu.tw/su (select the course information and then click the nonlinear programming icon to download).
Tests: one midterm and one final.

Tentative Outline**
Preface
Fundamentals of Optimization
Unconstrained Optimization
Linear Programming
Constrained Optimization
Non-derivative Approaches
** The actual contents will depend on the available time.

Fundamentals of Optimization
Optimization is to find the best one among all possible alternatives. Optimization is therefore always a good way to demonstrate your research results. But the catch is what you mean by "better": why is the optimal one better than the others? In other words, on which criterion is the evaluation based?

Fundamentals of Optimization
The measure of goodness of the alternatives is described by a so-called objective function or performance index. Thus, when you see "optimal", you should first check which objective function is used. Optimization then is to maximize or minimize the objective function considered. Other terms used are cost function (to be minimized), fitness function (to be maximized), etc.

Fundamentals of Optimization
In general, an optimization problem requires finding a setting of the variable vector (or parameters) of the system such that an objective function is optimized. Sometimes the variable vector also has to satisfy some constraints. The alternatives to choose among are numerical values, so a numerical approach is needed; this is why optimization is considered part of computational intelligence.

Fundamentals of Optimization
A traditional optimization problem can be expressed as: find x that achieves
Min (or Max) f(x) subject to x ∈ Ω (Ω ⊆ Rn).
f(·) (f: Rn → R) is the objective function to be optimized, also called the performance index, cost function, fitness function, etc.

Fundamentals of Optimization
If some constraint like x ∈ Ω is specified, the problem is referred to as a constrained optimization problem; otherwise it is called an unconstrained optimization problem. If f(·) is linear and Ω is polyhedral, the problem is a linear programming problem; otherwise it is a nonlinear programming problem.

Fundamentals of Optimization
The constraint x ∈ Ω is called a set constraint. Usually, Ω = {x | h(x) = 0, g(x) ≤ 0}, where h and g are given functions. This kind of description of Ω is referred to as functional constraints. When Ω = Rn, as mentioned, the problem becomes an unconstrained optimization problem.

Fundamentals of Optimization
The optimization problem is to find the "best" vector x over all possible vectors in Ω. Such a point (vector) x* is called a minimizer of f(x) over Ω. Note that the minimizer may not be unique. A similar idea applies to the maximization problem; a possible way is Max f = −Min(−f).

Fundamentals of Optimization
Definition: Local minimizer -- Suppose that f: Rn → R is a real-valued function defined on some Ω ⊆ Rn. A point x* ∈ Ω is a local minimizer of f over Ω if there exists ε > 0 such that f(x) ≥ f(x*) for all x ∈ Ω \ {x*} with ||x − x*|| < ε.
Global minimizer -- A point x* is a global minimizer if f(x) ≥ f(x*) for all x ∈ Ω \ {x*}.
If f(x) ≥ f(x*) is replaced by f(x) > f(x*), x* is a strict local (or global) minimizer.

Fundamentals of Optimization
Given an optimization problem with a constraint set Ω, a minimizer may lie either in the interior or on the boundary of Ω. To handle the boundary case, the notion of feasible directions must be given.
Definition: Feasible direction -- a vector d ∈ Rn, d ≠ 0, is a feasible direction at x ∈ Ω if there exists α0 > 0 such that x + αd ∈ Ω for all α ∈ [0, α0].

Fundamentals of Optimization
Necessary conditions for a local minimizer
First-order necessary condition (FONC): for any feasible direction d at x*, dT∇f(x*) ≥ 0.
Intuitively, all feasible points around x* have values no smaller than f(x*), which is what makes x* a local minimizer; the FONC reflects this to first order. If an interior point or the unconstrained case is considered, the FONC becomes ∇f(x*) = 0: zero slope at a local minimum x*.

Fundamentals of Optimization
Example: Min f(x) = x1² + (1/2)x2² + 3x2 subject to x1 ≥ 0, x2 ≥ 0, so ∇f(x) = [2x1, x2 + 3]T. Do the following points satisfy the FONC: [1, 3]T, [0, 3]T, [1, 0]T, and [0, 0]T?
Ans:
[1, 3]T: interior point, ∇f(x) = [2, 6]T ≠ 0. No.
[0, 3]T: ∇f(x) = [0, 6]T; feasible directions have d1 ≥ 0 and d2 arbitrary, so dT∇f(x) = 6d2 may not always be nonnegative. No.
Similarly for [1, 0]T: No.
[0, 0]T: ∇f(x) = [0, 3]T; feasible directions have d1 ≥ 0 and d2 ≥ 0, so dT∇f(x) = 3d2 is always nonnegative. Yes.
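The check above can be mirrored numerically. The Python sketch below is only an illustration: it uses the objective and the feasible set x1 ≥ 0, x2 ≥ 0 from the example, and it merely samples feasible directions at random, so it can disprove the FONC but only suggest that it holds.

```python
import numpy as np

def grad_f(x):
    # Gradient of f(x) = x1^2 + 0.5*x2^2 + 3*x2 (the objective in the example above).
    return np.array([2.0 * x[0], x[1] + 3.0])

def sample_feasible_dirs(x, n=2000, seed=0):
    # Sample unit directions d; whenever x_i sits on the boundary x_i = 0,
    # force d_i >= 0 so that x + alpha*d stays feasible for small alpha > 0.
    rng = np.random.default_rng(seed)
    d = rng.normal(size=(n, 2))
    for i in range(2):
        if np.isclose(x[i], 0.0):
            d[:, i] = np.abs(d[:, i])
    return d / np.linalg.norm(d, axis=1, keepdims=True)

for point in ([1, 3], [0, 3], [1, 0], [0, 0]):
    x = np.array(point, dtype=float)
    worst = np.min(sample_feasible_dirs(x) @ grad_f(x))  # smallest d^T grad f found
    print(point, "FONC holds on sampled directions:", bool(worst >= -1e-12))
```

Only [0, 0]T passes, matching the answer above.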

Fundamentals of Optimization
Necessary conditions for a local minimizer
If a point satisfies the FONC, it can then be checked whether it satisfies the SONC.
Second-order necessary condition (SONC): for any feasible direction d at x*, dT∇²f(x*)d ≥ 0.
∇²f(x*) is called the Hessian matrix of f, also written H(x*). When dT∇²f(x*)d ≥ 0 holds for all d, ∇²f(x*) is said to be positive semidefinite (f has nonnegative curvature).

Fundamentals of Optimization
Example: Min f(x) = x1² − x2², no constraint. The FONC requires ∇f(x) = [2x1, −2x2]T = 0, thus x* = [0, 0]T.
∇²f(x*) = diag(2, −2). It can easily be checked that this Hessian matrix is not positive semidefinite (for example, selecting d = [0, 1]T gives dT∇²f(x*)d = −2 < 0). Thus x* = [0, 0]T is not a local minimizer.
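The same conclusion can be reached by inspecting the Hessian eigenvalues; the short sketch below is only an illustration of that check.

```python
import numpy as np

def hessian(x):
    # Hessian of f(x) = x1^2 - x2^2 is constant: diag(2, -2).
    return np.array([[2.0, 0.0],
                     [0.0, -2.0]])

x_star = np.zeros(2)                        # the only point satisfying the FONC
eigvals = np.linalg.eigvalsh(hessian(x_star))
print("Hessian eigenvalues:", eigvals)      # [-2.  2.]
print("SONC satisfied?", bool(np.all(eigvals >= 0)))   # False -> not a local minimizer
```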

Fundamentals of Optimization
(Figure: surface plot of f(x) = x1² − x2² from the previous example.)

Fundamentals of Optimization
Necessary conditions for a local minimizer
If a point is a local minimizer, it needs to satisfy the FONC and the SONC; in other words (for the interior or unconstrained case), ∇f(x*) = 0 and dT∇²f(x*)d ≥ 0. But if a point satisfies the FONC and SONC, it may still not be a local minimizer.

Fundamentals of Optimization
Necessary conditions for a local minimum in an unconstrained optimization problem:
1st-order condition: zero slope at a local minimum x*, i.e., ∇f(x*) = 0.
2nd-order condition: nonnegative curvature at a local minimum x*, i.e., ∇²f(x*) is positive semidefinite.
There may exist points that satisfy the above 1st- and 2nd-order conditions but are not local minima.

Fundamentals of Optimization
Proofs of necessary conditions
• 1st-order condition ∇f(x*) = 0 (unconstrained case). Fix a direction d. Then, since x* is a local minimizer, f(x* + αd) ≥ f(x*) for all sufficiently small α > 0. From the 1st-order Taylor expansion, f(x* + αd) = f(x*) + αdT∇f(x*) + o(α), so dT∇f(x*) ≥ 0. Replace d with −d to obtain dT∇f(x*) ≤ 0; hence ∇f(x*) = 0.

Fundamentals of Optimization
Sufficient conditions for a local minimizer in the interior case (or unconstrained case):
• First-order sufficient condition (FOSC): ∇f(x*) = 0.
• Second-order sufficient condition (SOSC): dT∇²f(x*)d > 0 for all d ≠ 0, i.e., ∇²f(x*) is positive definite.
Together, these conditions guarantee that x* is a strict local minimizer.

Fundamentals of Optimization
Convex sets and convex functions: in optimization, convex sets and convex functions are usually considered.
Definition: A set Ω is convex if, for any two points x, y ∈ Ω, the point w = αx + (1 − α)y is also in Ω for any α ∈ [0, 1].
Definition: A function f(x) is convex if, for any two points x, y ∈ Ω (a convex set), f(αx + (1 − α)y) ≤ αf(x) + (1 − α)f(y).
Note: αx + (1 − α)y for α ∈ [0, 1] can be interpreted as any point on the line segment between x and y.
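The defining inequality can be probed numerically. The Python sketch below is my own illustration (the two test functions are simply chosen to echo earlier examples); it performs a randomized, necessary-only check of convexity.

```python
import numpy as np

def looks_convex(f, dim=2, n_pairs=500, n_alphas=11, seed=0):
    # Randomized check of f(a*x + (1-a)*y) <= a*f(x) + (1-a)*f(y).
    # It can only disprove convexity; passing the test does not prove it.
    rng = np.random.default_rng(seed)
    for _ in range(n_pairs):
        x, y = rng.normal(size=dim), rng.normal(size=dim)
        for a in np.linspace(0.0, 1.0, n_alphas):
            if f(a * x + (1 - a) * y) > a * f(x) + (1 - a) * f(y) + 1e-9:
                return False
    return True

print(looks_convex(lambda x: float(x @ x)))                # x1^2 + x2^2 -> True
print(looks_convex(lambda x: float(x[0]**2 - x[1]**2)))    # the earlier saddle example -> False
```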

Fundamentals of Optimization
• f is a (differentiable) convex function defined on a convex set Ω if and only if f(y) ≥ f(x) + ∇f(x)T(y − x) for all x, y ∈ Ω.
• Let f be a convex function defined on a convex set. Then any local minimizer of f is a global minimizer.
• f is a convex function defined on a convex set if and only if −f is a concave function.

Fundamentals of Optimization
Selected homework for Prob 1: 6.1, 6.5, 6.7, 6.14, 6.18, and 6.22. Only for your own exercise; no turn-in required.

Outline
Preface
Fundamentals of Optimization
Unconstrained Optimization
Ideas of finding solutions
One-Dimensional Search
Gradient Methods
Newton's Method and Its Variations

Unconstrained Optimization
If the objective function can be explicitly expressed as a function of the parameters, traditional mathematical approaches can be employed to solve the optimization problem. Traditional optimization approaches can be classified into two categories: the direct approach and the incremental approach.

Unconstrained Optimization
Direct approaches find the solution mathematically, that is, they look for a point with certain properties. In a direct approach, the idea is to directly find x such that df(x)/dx = 0 or ∇f(x) = 0. These are Newton-type approaches: Newton's method is a way of solving an equation g(x) = 0, which in optimization becomes ∇f(x) = 0, and the approach used can also be iterative.

Unconstrained Optimization
The incremental approach finds which direction can improve the current situation based on the current error (a feedback approach). Usually, an incremental approach updates the parameter vector as x(k+1) = x(k) + Δx. In fact, such an approach is usually realized as a gradient approach, that is, Δx ∝ −∂f(x)/∂x. We need a relationship between the current error and the change of the variable considered; that is why Δx ∝ −∂f(x)/∂x is employed.
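A minimal sketch of this incremental update follows. It is only an illustration: the fixed step size eta and the example objective (the earlier FONC example, taken here without constraints) are assumptions, not part of the slides.

```python
import numpy as np

def gradient_descent(grad, x0, eta=0.1, tol=1e-8, max_iter=1000):
    # Incremental update x(k+1) = x(k) + dx with dx = -eta * grad f(x(k)).
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        dx = -eta * grad(x)
        x = x + dx
        if np.linalg.norm(dx) < tol:
            break
    return x

# Unconstrained example: f(x) = x1^2 + 0.5*x2^2 + 3*x2, grad f = [2*x1, x2 + 3].
print(gradient_descent(lambda x: np.array([2 * x[0], x[1] + 3.0]), [1.0, 3.0]))  # -> about [0, -3]
```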

Unconstrained Optimization
One-Dimensional Search: find solutions through pure search.
Gradient Methods: find solutions through the incremental approach.
Newton's Method and Its Variations: find solutions through the direct approach.

One-Dimensional Search
A very simple approach for finding the minimizer of a one-variable function with only one local minimizer (a unimodal function) is to evaluate the function at different points and then progressively narrow the range containing the minimizer to a certain sufficient accuracy. This is called line search or one-dimensional search.

One-Dimensional Search
Consider the interval [a0, b0]. Choose the intermediate points a1 and b1 so that a1 − a0 = b0 − b1 = ρ(b0 − a0), where ρ < 1/2.

One-Dimensional Search
Since f(x) is unimodal, when f(a1) < f(b1) the minimizer must lie in [a0, b1]; if not (i.e., f(a1) ≥ f(b1)), the minimizer must lie in [a1, b0]. The search repeats the above process using the same ρ and makes one intermediate point coincide with a point already used in the previous step. For example, let x* ∈ [a0, b1]. Then, while selecting a2 and b2, make a1 coincide with b2.

One-Dimensional Search
With the same ρ, searching in [a0, b1], we have a2 − a0 = b1 − b2 = ρ(b1 − a0), where a1 = b2. Without loss of generality, assume the length of [a0, b0] is 1; then b1 − a0 = 1 − ρ and b1 − b2 = 1 − 2ρ. We thus obtain ρ² − 3ρ + 1 = 0.

One-Dimensional Search
Then ρ = (3 − √5)/2 ≈ 0.382 (taking the root with ρ < 1/2). Dividing a segment in the ratio ρ to 1 − ρ in this way is called the golden section (a famous ancient Greek geometry rule). Using the golden section, after N steps the range is reduced by the factor (1 − ρ)^N ≈ (0.61803)^N. Thus, we can decide how many search steps are needed for a prescribed accuracy.

One-Dimensional Search
Example: consider f(x) = x⁴ − 14x³ + 60x² − 70x on the range [0, 2]. Find the minimizer within a range of 0.3.
Ans: (0.61803)^N ≤ 0.3/2 ⇒ N ≥ 4.
Iteration 1: a1 = a0 + ρ(b0 − a0) = 0.7639, f(a1) = −24.36; b1 = a0 + (1 − ρ)(b0 − a0) = 1.2361, f(b1) = −18.96. Since f(a1) < f(b1), the next interval is [0, b1].
Continuing this process, the minimizer lies within [a4, b3] = [0.6525, 0.9443] (length 0.2918).
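A compact Python sketch of the procedure (an illustration, not the course's code; it stops once the interval length drops below the requested tolerance):

```python
import math

def golden_section(f, a, b, tol):
    # Golden-section search for a unimodal f on [a, b]: each step reuses one old
    # interior point and shrinks the interval by the factor 1 - rho ~ 0.618.
    rho = (3 - math.sqrt(5)) / 2                     # ~0.382
    a1, b1 = a + rho * (b - a), a + (1 - rho) * (b - a)
    fa1, fb1 = f(a1), f(b1)
    while b - a > tol:
        if fa1 < fb1:                                # minimizer in [a, b1]
            b, b1, fb1 = b1, a1, fa1
            a1 = a + rho * (b - a)
            fa1 = f(a1)
        else:                                        # minimizer in [a1, b]
            a, a1, fa1 = a1, b1, fb1
            b1 = a + (1 - rho) * (b - a)
            fb1 = f(b1)
    return a, b

f = lambda x: x**4 - 14*x**3 + 60*x**2 - 70*x
print(golden_section(f, 0.0, 2.0, 0.3))              # an interval of length <= 0.3, as in the example
```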

One-Dimensional Search
What if different values of ρ are used in the process? With the same idea (only one new point at each step) but a different ρ_k at each step:
ρ_{k+1}(1 − ρ_k) = 1 − 2ρ_k, or ρ_{k+1} = 1 − ρ_k/(1 − ρ_k).

One-Dimensional Search
There are many sequences that can satisfy the above requirement. The reduction factor now is (1 − ρ_1)(1 − ρ_2)···(1 − ρ_N). Which sequence gives the maximal reduction?
Min (1 − ρ_1)(1 − ρ_2)···(1 − ρ_N)
subject to ρ_{k+1} = 1 − ρ_k/(1 − ρ_k), k = 1, …, N − 1,
0 ≤ ρ_k ≤ 1/2, k = 1, …, N.

Fibonacci Search
Fibonacci sequence F_1, F_2, …, F_k, …: F_{−1} = 0 and F_0 = 1; also, for k ≥ 0, F_{k+1} = F_k + F_{k−1}, giving F_1, F_2, …, F_k, … = 1, 2, 3, 5, 8, 13, 21, 34, …
The solution to the above optimization problem is
ρ_1 = 1 − F_N/F_{N+1}
ρ_2 = 1 − F_{N−1}/F_N
…
ρ_k = 1 − F_{N−k+1}/F_{N−k+2}
…
ρ_N = 1 − F_1/F_2

One-Dimensional Search
The reduction factor is (1 − ρ_1)(1 − ρ_2)···(1 − ρ_N) = 1/F_{N+1}. However, be aware that F_1 = 1 and F_2 = 2, so ρ_N = 1 − F_1/F_2 = 1/2, which means the two intermediate points of the last step coincide and that step gives no reduction at all. We therefore use ρ_N = 1/2 − ε, where ε is a small number. The reduction factor then becomes (1 + 2ε)/F_{N+1}.

One-Dimensional Search
Example: consider f(x) = x⁴ − 14x³ + 60x² − 70x on the range [0, 2]. Find the minimizer within a range of 0.3.
Ans: (1 + 2ε)/F_{N+1} ≤ 0.3/2 ⇒ N ≥ 4.
Iteration 1: ρ_1 = 1 − F_4/F_5 = 3/8; a1 = a0 + ρ_1(b0 − a0) = 3/4, f(a1) = −24.34; b1 = a0 + (1 − ρ_1)(b0 − a0) = 5/4, f(b1) = −18.65. Since f(a1) < f(b1), the next interval is [0, b1].
Continuing this process, the minimizer lies within [a4, b3] = [0.725, 1] (length 0.275).
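The sketch below follows the same recipe (again only an illustration; it uses the slides' Fibonacci indexing, assumes ε = 0.05, and for simplicity re-evaluates both interior points at each stage instead of reusing one, which changes the cost but not the intervals):

```python
def fib_numbers(n):
    # Fibonacci numbers with the slides' indexing: F0 = F1 = 1, F2 = 2, F3 = 3, ...
    F = {0: 1, 1: 1}
    for k in range(2, n + 1):
        F[k] = F[k - 1] + F[k - 2]
    return F

def fibonacci_search(f, a, b, n_steps, eps=0.05):
    # Shrink [a, b] over n_steps stages using rho_k = 1 - F_{N-k+1}/F_{N-k+2};
    # the last stage uses rho_N = 1/2 - eps so the two points do not coincide.
    F = fib_numbers(n_steps + 1)
    for k in range(1, n_steps + 1):
        rho = 0.5 - eps if k == n_steps else 1 - F[n_steps - k + 1] / F[n_steps - k + 2]
        a_k = a + rho * (b - a)
        b_k = a + (1 - rho) * (b - a)
        if f(a_k) < f(b_k):
            b = b_k                         # minimizer in [a, b_k]
        else:
            a = a_k                         # minimizer in [a_k, b]
    return a, b

f = lambda x: x**4 - 14*x**3 + 60*x**2 - 70*x
print(fibonacci_search(f, 0.0, 2.0, 4))     # -> about (0.725, 1.0), as in the example
```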

One-Dimensional Search
Another approach to finding the minimizer is to use the function's quadratic approximation. This kind of approach is called Newton's method. Assume we can calculate f(x), f′(x), and f″(x). The quadratic approximation of f(x) around x(k) is
q(x) = f(x(k)) + f′(x(k))(x − x(k)) + (1/2)f″(x(k))(x − x(k))².
It is easy to verify that q(x(k)) = f(x(k)), q′(x(k)) = f′(x(k)), and q″(x(k)) = f″(x(k)).

One-Dimensional Search
Applying the FONC to q, we set q′(x) = 0, where q′(x) = f′(x(k)) + f″(x(k))(x − x(k)). Taking the solution as x(k+1) gives
x(k+1) = x(k) − f′(x(k))/f″(x(k)).
We can then continue the iterations as above. This kind of iteration for finding the next point is called Newton's method, which can also be used for multivariable functions (a direct approach).

One-Dimensional Search
Example: consider f(x) = (1/2)x² − sin(x).
Ans: Let x(0) = 0.5. The required accuracy is 10⁻⁵.
f′(x) = x − cos(x) and f″(x) = 1 + sin(x).
x(1) = 0.5 − (0.5 − cos(0.5))/(1 + sin(0.5)) = 0.7552.
Similarly, x(2) = x(1) − f′(x(1))/f″(x(1)) = 0.7391, x(3) = 0.7390, and x(4) = 0.7390.
Since |x(4) − x(3)| = 0 < 10⁻⁵, we can stop. Furthermore, f′(x(4)) ≈ 8.6×10⁻⁶ ≈ 0 and f″(x(4)) = 1.673 > 0, so we can conclude that x* ≈ x(4) is a strict minimizer.
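A minimal Python sketch of this iteration (an illustration; the stopping rule on successive iterates and the maximum iteration count are my additions):

```python
import math

def newton_1d(df, d2f, x0, tol=1e-5, max_iter=50):
    # Newton's method for minimization: x(k+1) = x(k) - f'(x(k)) / f''(x(k)).
    x = x0
    for _ in range(max_iter):
        x_new = x - df(x) / d2f(x)
        if abs(x_new - x) < tol:
            return x_new
        x = x_new
    return x

# Example from the slides: f(x) = 0.5*x**2 - sin(x), f'(x) = x - cos(x), f''(x) = 1 + sin(x).
print(newton_1d(lambda x: x - math.cos(x), lambda x: 1 + math.sin(x), 0.5))   # ~0.7391
```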

One-Dimensional Search
Newton's method works well if f″(x) > 0 everywhere, but it may fail if f″(x) < 0 for some x. Newton's method can also be viewed as a way to find the location where f′(x) = 0; this view appears in traditional numerical approaches and, in our later study, it is also used for multivariable functions. Another problem arises when the second derivative is not available.

One-Dimensional Search
When the second derivative is not available, f″(x(k)) can be approximated by
f″(x(k)) ≈ (f′(x(k)) − f′(x(k−1)))/(x(k) − x(k−1)).
Then
x(k+1) = x(k) − f′(x(k))(x(k) − x(k−1))/(f′(x(k)) − f′(x(k−1))).
This is called the secant method. Note that the secant method needs two initial points.
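A corresponding sketch (again an illustration, with the same example objective and a simple stopping rule assumed):

```python
import math

def secant_min(df, x0, x1, tol=1e-5, max_iter=50):
    # Secant method for minimization: replace f''(x(k)) in Newton's update by
    # the difference quotient (f'(x(k)) - f'(x(k-1))) / (x(k) - x(k-1)).
    g_prev, g = df(x0), df(x1)
    for _ in range(max_iter):
        x_new = x1 - g * (x1 - x0) / (g - g_prev)
        if abs(x_new - x1) < tol:
            return x_new
        x0, x1 = x1, x_new
        g_prev, g = g, df(x1)
    return x1

# Same example as above, using only the first derivative f'(x) = x - cos(x).
print(secant_min(lambda x: x - math.cos(x), 0.0, 0.5))   # ~0.7391
```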

One-Dimensional Search
Selected homework for Prob 2: 7.2 (all).