Slide 1: OR II GSLM 52800

Slide 2: Outline
- classical optimization: unconstrained optimization
- dimensions of optimization
- feasible direction

Slide 3: Classical Optimization Results
- unconstrained optimization
- different dimensions of optimization conditions
  - nature of conditions
    - necessary conditions: satisfied by any minimum (and possibly by some non-minimum points)
    - sufficient conditions: if satisfied by a point, the point is a minimum (though some minima may not satisfy the conditions)
  - order of conditions
    - first-order conditions: stated in terms of the first derivatives of f and the g_j
    - second-order conditions: stated in terms of the second derivatives of f and the g_j
- general assumptions: f and the g_j are C¹ (i.e., once continuously differentiable) or C² (i.e., twice continuously differentiable), as required by the conditions

Slide 4: Feasible Direction
- S ⊆ ℝⁿ: the feasible region
- x ∈ S: a feasible point
- d is a feasible direction at x if there exists ᾱ > 0 such that x + αd ∈ S for all 0 < α < ᾱ
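
As a concrete illustration (ours, not from the slides), a minimal numerical sketch of this definition for S = {x ∈ ℝ² : x ≥ 0}; the function names and the sampled step sizes are our own choices, and sampling a few small steps only illustrates the definition rather than proving an ᾱ exists.

```python
import numpy as np

def is_feasible_direction(x, d, in_S, alphas=np.geomspace(1e-8, 1e-2, 7)):
    """Heuristic check: does x + alpha*d stay in S for several small alpha > 0?"""
    return all(in_S(x + a * d) for a in alphas)

# S = {x in R^2 : x1 >= 0, x2 >= 0}
in_S = lambda x: np.all(x >= 0)

x = np.array([0.0, 1.0])                                      # on the boundary x1 = 0
print(is_feasible_direction(x, np.array([1.0, 0.0]), in_S))   # True: points into S
print(is_feasible_direction(x, np.array([-1.0, 0.0]), in_S))  # False: leaves S immediately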

Slide 5: Two Key Concepts for the Classical Results
- ∇f: the direction of steepest ascent
- the gradient of f at x₀ is orthogonal to the tangent of the contour f(x) = c at x₀

Slide 6: The Direction of Steepest Ascent
- contours of f(x₁, x₂) = c
- ∇f: the direction of steepest ascent
- the increment from a unit move depends on the angle the move makes with ∇f
  - within ±90° of ∇f: f increases; the closer the angle is to 0°, the more it increases
  - beyond ±90° of ∇f: f decreases; the closer the angle is to 180°, the more it decreases
- these results hold generally for any differentiable f
[Figure: contour plot of f(x₁, x₂) with the gradient vector ∇f]
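
A short numerical sketch (ours, not from the slides) of this claim: the first-order change of f along a unit direction d is ∇ᵀf d = ‖∇f‖ cos θ, so it is positive within 90° of ∇f, largest at 0°, and negative beyond 90°. The example function is our own choice.

```python
import numpy as np

# our own example function; any smooth f behaves the same way locally
f = lambda x: x[0]**2 + 3 * x[1]**2
grad = lambda x: np.array([2 * x[0], 6 * x[1]])

x0 = np.array([1.0, 1.0])
g = grad(x0)

for deg in [0, 45, 89, 91, 135, 180]:
    t = np.deg2rad(deg)
    # unit direction at angle `deg` from the gradient
    c, s = np.cos(t), np.sin(t)
    d = np.array([[c, -s], [s, c]]) @ (g / np.linalg.norm(g))
    # first-order change of f along d: grad^T d = ||grad|| * cos(angle)
    fd = (f(x0 + 1e-6 * d) - f(x0)) / 1e-6   # finite-difference check
    print(f"{deg:3d} deg: grad^T d = {g @ d: .4f}, finite diff = {fd: .4f}")
```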

Slide 7: Gradient of f at x₀ Orthogonal to the Tangent of the Contour f(x) = c at x₀
- take the contour f(x₁, x₂) = c through x₀, i.e., f(x₁₀, x₂₀) = c
- let d lie on the tangent plane of the contour at x₀
- then f(x₀ + αd) ≈ c for small α
- roughly speaking: when f(x₀) = c and d is on the tangent plane at x₀, f(x₀ + αd) = c to first order for small α

Slide 8: First-Order Necessary Condition (FONC)
- let f ∈ C¹ on S and let x* be a local minimum of f
- then for any feasible direction d at x*: ∇ᵀf(x*)d ≥ 0
- i.e., f is non-decreasing along every feasible direction at x*
- examples: f(x) = x² for 2 ≤ x ≤ 5; f(x, y) = x² + y² for 0 ≤ x, y ≤ 2; f(x, y) = x² + y² for x ≥ 3, y ≥ 3

Slide 9: FONC for Unconstrained NLP
- let f ∈ C¹ on S and let x* be an interior local minimum (i.e., not touching any boundary)
- then ∇f(x*) = 0
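
For an interior minimum the condition ∇f(x*) = 0 is a system of equations in x; a minimal sketch of finding the candidate points symbolically (the objective below is our own example, not from the slides):

```python
import sympy as sp

x, y = sp.symbols('x y', real=True)
f = x**2 + x*y + y**2 - 3*x      # our own example objective

# FONC for an interior point: both partial derivatives vanish
grad = [sp.diff(f, v) for v in (x, y)]
candidates = sp.solve(grad, [x, y], dict=True)
print(candidates)                 # [{x: 2, y: -1}] -- the only stationary point
```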

Slide 10: FONC Not Sufficient
- Example 3.2.2: f(x, y) = −(x² + y²) for 0 ≤ x, y
  - ∇ᵀf((0, 0))d = 0 for every feasible direction d, so the FONC holds
  - yet (0, 0) is a maximum point
- Example 3.2.3: f(x) = x³
  - f′(0) = 0, so x = 0 is a stationary point, but not a minimum
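
Both counterexamples are easy to reproduce numerically; this sketch (ours) just evaluates the gradient at the stationary point and compares nearby function values.

```python
import numpy as np

# Example 3.2.3: f(x) = x^3, f'(0) = 0, yet 0 is not a minimum
f1 = lambda t: t**3
print(f1(-0.1) < f1(0.0) < f1(0.1))          # True: f decreases to the left of 0

# Example 3.2.2: f(x, y) = -(x^2 + y^2); the gradient vanishes at the origin
grad = lambda v: np.array([-2 * v[0], -2 * v[1]])
print(grad(np.array([0.0, 0.0])))            # [-0. -0.] -- FONC holds
# ...but every nearby feasible point has a smaller value: (0, 0) is a maximum
print(-(0.1**2 + 0.1**2) < 0.0)              # True
```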

Slide 11: Feasible Region with Non-negativity Constraints
- Example (Example 10.8 of JB): find the candidate minimum points by the FONC
- min f(x) = … subject to x₁ ≥ 0, x₂ ≥ 0
- or, equivalently, …

Slide 12: Second-Order Conditions
- another form of Taylor's Theorem:
  f(x) = f(x*) + ∇ᵀf(x*)(x − x*) + ½(x − x*)ᵀH(x*)(x − x*) + ε,
  where ε is small and dominated by the other terms
- if ∇ᵀf(x*)(x − x*) = 0, then f(x) ≥ f(x*) ⟺ (x − x*)ᵀH(x*)(x − x*) ≥ 0
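
A quick numerical sketch (the function is our own example) showing that the quadratic model above tracks f near x* and that the remainder ε vanishes faster than the quadratic term:

```python
import numpy as np

f = lambda v: np.exp(v[0]) * np.cos(v[1])    # our example; f is C^2
grad = lambda v: np.array([np.exp(v[0]) * np.cos(v[1]),
                           -np.exp(v[0]) * np.sin(v[1])])
def hess(v):
    e, c, s = np.exp(v[0]), np.cos(v[1]), np.sin(v[1])
    return np.array([[e * c, -e * s], [-e * s, -e * c]])

x_star = np.array([0.3, -0.2])
d = np.array([1.0, 2.0]) / np.sqrt(5.0)      # unit direction
for t in [1e-1, 1e-2, 1e-3]:
    dx = t * d
    model = f(x_star) + grad(x_star) @ dx + 0.5 * dx @ hess(x_star) @ dx
    err = abs(f(x_star + dx) - model)
    print(f"t = {t:.0e}: remainder = {err:.3e}")   # shrinks roughly like t**3
```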

Slide 13: Second-Order Necessary Condition (SONC)
- let f ∈ C² on S
- if x* is a local minimum of f, then for any feasible direction d ∈ ℝⁿ at x*:
  (i) ∇ᵀf(x*)d ≥ 0, and
  (ii) if ∇ᵀf(x*)d = 0, then dᵀH(x*)d ≥ 0

Slide 14: Example 3.3.1(a)
- the SONC is satisfied in each case:
  f(x) = x² for 2 ≤ x ≤ 5; f(x, y) = x² + y² for 0 ≤ x, y ≤ 2; f(x, y) = x² + y² for x ≥ 3, y ≥ 3

Slide 15: Example 3.3.1(b)
- the SONC is more discriminative than the FONC
- f(x, y) = −(x² + y²) for 0 ≤ x, y, as in Example 3.2.2
- (0, 0), a maximum point, satisfies the FONC but fails the SONC

Slide 16: SONC for Unconstrained NLP
- let f ∈ C² on S and let x* be an interior local minimum of f; then
  (i) ∇f(x*) = 0, and
  (ii) dᵀH(x*)d ≥ 0 for all d, i.e., H(x*) is positive semi-definite
- a convex f satisfies (ii) (and in fact more)
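
Positive semi-definiteness of H(x*) is checked in practice through its eigenvalues (all ≥ 0); a minimal sketch with hand-picked matrices of our own:

```python
import numpy as np

def is_psd(H, tol=1e-10):
    """H is positive semi-definite iff all eigenvalues of the symmetric H are >= 0."""
    return bool(np.all(np.linalg.eigvalsh(H) >= -tol))

H1 = np.array([[2.0, 0.0], [0.0, 0.0]])   # PSD (eigenvalues 2, 0)
H2 = np.array([[1.0, 2.0], [2.0, 1.0]])   # indefinite (eigenvalues 3, -1)
print(is_psd(H1), is_psd(H2))             # True False
```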

Slide 17: Example 3.2.4
- identify the candidate minimum points of f(x) = …
- ∇ᵀf(x*) = 0 gives x = (1, −1) or (−1, −1)
- H(x) = …
- (1, −1) satisfies the SONC; (−1, −1) does not

Slide 18: SONC Not Sufficient
- f(x, y) = −(x⁴ + y⁴)
- ∇ᵀf((0, 0))d = 0 for all d, and H((0, 0)) = 0, so dᵀH((0, 0))d = 0 ≥ 0 for all d: the SONC holds
- yet (0, 0) is a maximum

Slide 19: SOSC for Unconstrained NLP
- let f ∈ C² on S ⊆ ℝⁿ and let x* be an interior point
- if (i) ∇f(x*) = 0, and (ii) H(x*) is positive definite,
- then x* is a strict local minimum of f
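
Putting (i) and (ii) together gives a standard classification routine; a sketch under the assumption that the gradient and Hessian are available in closed form (the routine and test case are ours, not from the slides):

```python
import numpy as np

def classify(grad_val, H, tol=1e-10):
    """Classify a candidate point by the FONC plus Hessian definiteness (SOSC)."""
    if np.linalg.norm(grad_val) > tol:
        return "not stationary"
    eig = np.linalg.eigvalsh(H)
    if np.all(eig > tol):
        return "strict local minimum (SOSC holds)"
    if np.all(eig >= -tol):
        return "inconclusive (SONC holds, SOSC does not)"
    return "not a local minimum (SONC fails)"

# f(x, y) = x^2 + y^2 at (0, 0): zero gradient, Hessian 2I (positive definite)
print(classify(np.zeros(2), 2 * np.eye(2)))   # strict local minimum (SOSC holds)
```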

Slide 20: SOSC Not Necessary
- Example: x = 0 is a minimum of f(x) = x⁴
- but f″(0) = 0, so H(0) is not positive definite and the SOSC is not satisfied
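
In one dimension the gap is easy to see numerically: f(x) = x⁴ has f′(0) = 0 and f″(0) = 0, so the 1×1 Hessian is only semi-definite, yet 0 is plainly the global minimum. A small check of that claim:

```python
import numpy as np

f = lambda t: t**4
# FONC holds and the Hessian is only semi-definite at x = 0:
print(4 * 0.0**3, 12 * 0.0**2)          # f'(0) = 0.0, f''(0) = 0.0
# yet x = 0 is the global minimum:
ts = np.linspace(-1.0, 1.0, 201)
print(bool(np.all(f(ts) >= f(0.0))))    # True
```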

Slide 21: Example
- in Example 3.2.4, is (1, 1, 1) a minimum?
- checking H((1, 1, 1)): … 6 > 0; …
- the Hessian is positive definite, i.e., the SOSC is satisfied

Slide 22: Effect of Convexity
- suppose ∇ᵀf(x*)(y − x*) ≥ 0 for all y in a neighborhood of x* ∈ S
- convexity of f implies f(y) ≥ f(x*) + ∇ᵀf(x*)(y − x*) ≥ f(x*)
- so x* is a local minimum of f in that neighborhood
- and in fact x* is a global minimum of f
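
A small numerical sketch (ours) of the gradient inequality used above: for a convex f, the first-order Taylor approximation at any point underestimates f everywhere. The example function and sample points are our own choices.

```python
import numpy as np

# convex example (ours): f(x) = exp(x1) + x2^2
f = lambda v: np.exp(v[0]) + v[1]**2
grad = lambda v: np.array([np.exp(v[0]), 2 * v[1]])

x_star = np.array([0.5, -1.0])
rng = np.random.default_rng(1)
ys = rng.normal(size=(1000, 2)) * 3

# gradient inequality: f(y) >= f(x*) + grad f(x*)^T (y - x*) for all y
lhs = np.array([f(y) for y in ys])
rhs = np.array([f(x_star) + grad(x_star) @ (y - x_star) for y in ys])
print(bool(np.all(lhs >= rhs - 1e-12)))   # True
```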

Slide 23: Effect of Convexity
- f ∈ C² convex ⟺ H positive semi-definite everywhere
- by Taylor's Theorem, when ∇ᵀf(x*)(x − x*) = 0, for some λ ∈ (0, 1):
  f(x) = f(x*) + ∇ᵀf(x*)(x − x*) + ½(x − x*)ᵀH(λx* + (1 − λ)x)(x − x*)
       = f(x*) + ½(x − x*)ᵀH(λx* + (1 − λ)x)(x − x*)
       ≥ f(x*)
- so for convex f, a local minimum is a global minimum

Slide 24: Effect of Convexity
- facts about convex functions:
  (i) a local minimum is a global minimum
  (ii) f ∈ C² is convex iff H(x) is positive semi-definite everywhere
  (iii) if H(x) is positive definite everywhere, f is strictly convex (the converse can fail: f(x) = x⁴ is strictly convex but f″(0) = 0)
- implications:
  - for a convex f ∈ C², the FONC ∇f(x*) = 0 is sufficient for x* to be a global minimum
  - if f is strictly convex, x* is the unique global minimum
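
A closing sketch (our own example) of the implication: for a convex C² function, solving ∇f = 0 already yields a global minimum, which is why first-order conditions suffice in convex optimization.

```python
import numpy as np

# convex quadratic f(x) = 0.5 x^T Q x - b^T x with Q positive definite (our example)
Q = np.array([[4.0, 1.0], [1.0, 3.0]])
b = np.array([1.0, 2.0])
f = lambda x: 0.5 * x @ Q @ x - b @ x

# FONC: grad f(x*) = Q x* - b = 0 -- a linear system, sufficient by convexity
x_star = np.linalg.solve(Q, b)

# spot-check global optimality against random points
rng = np.random.default_rng(0)
samples = rng.normal(size=(1000, 2)) * 10
print(bool(np.all([f(x) >= f(x_star) - 1e-12 for x in samples])))   # True
```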