Slide 1: OR II GSLM 52800
Slide 2: Outline
- some terminology
- differences between LP and NLP
- basic questions in NLP
- gradient and Hessian
- quadratic form
- contour, graph, and tangent plane
Slide 3: Feasible Points, Solution Set, and Neighborhood
- feasible point: a point that satisfies all the constraints
- solution set (feasible set, feasible region): the collection of all feasible points
- neighborhood of $x_0$: $\{x : \|x - x_0\| < \varepsilon\}$ for some pre-specified $\varepsilon > 0$
- [figure: a feasible region containing points A, B, C, and D with their neighborhoods; only the neighborhood of D lies completely inside the feasible region]
Slide 4: Weak and Strong; Local and Global
- local minima: $x_1$, any point in $[s, t]$, $x_3$
- strict (strong) local minima: $x_1$, $x_3$
- weak local minima: any point in $[s, t]$
- strict global minimum: $x_1$
- weak local maxima: any point in $[s, t]$
- [figure: graph of $f(x)$ marking $x_1$, $x_2$, $x_3$ and the flat segment over $[s, t]$]
Slide 5: Differences Between Linear and Non-Linear Programming
- in linear programming:
  - there exists an optimal extreme point (a corner point)
  - a direction of improvement remains a direction of improvement until it hits a constraint
  - a local optimum is also a global optimum
- [figure: an LP feasible region with a direction of improvement leading to the optimal corner point]
Slide 6: Differences Between Linear and Non-Linear Programming
- none of these properties necessarily holds for a non-linear program
- example: $\min\; x^2 + y^2$ s.t. $-2 \le x, y \le 2$, whose optimum is the interior point $(0, 0)$, not a corner point
- [figure: the graph of $f(x)$ from Slide 4, with its several local minima]
Slide 7: Basic Questions in Non-Linear Programming
- main question: given an initial location $x_0$, how to get to a local minimum or, better, a global minimum
- (a) the direction of improvement?
- (b) the necessary conditions of an optimal point?
- (c) the sufficient conditions of an optimal point?
- (d) any conditions to simplify the processes in (a), (b), and (c)?
- (e) any algorithmic procedures to solve an NLP problem?
Slide 8: Basic Questions in Non-Linear Programming
- calculus is required for (a) to (e)
- the direction of improvement of f is determined by the gradient of f, shaped by the constraints
- convexity for (d), and also for (b) and (c)
- identification of convexity: definiteness of matrices, especially of Hessians
Slide 9: Gradient and Hessian
- gradient of f: $\nabla f(x) = \left( \frac{\partial f}{\partial x_1}, \dots, \frac{\partial f}{\partial x_n} \right)^T$, written $\nabla f$ in short
- Hessian: $H(x) = \nabla^2 f(x) = \left[ \frac{\partial^2 f}{\partial x_i \, \partial x_j} \right]_{n \times n}$
- f and the constraint functions $g_j$ are usually assumed to be twice differentiable
- the Hessian is a symmetric matrix
Slide 10: Gradient and Hessian
- $e_j = (0, \dots, 0, 1, 0, \dots, 0)^T$, with the "1" at the j-th position
- for small $\varepsilon$: $f(x + \varepsilon e_j) \approx f(x) + \varepsilon \, \frac{\partial f(x)}{\partial x_j}$
- in general, for a step $\Delta x = (\Delta x_1, \dots, \Delta x_n)^T$ from x: $f(x + \Delta x) \approx f(x) + \nabla f(x)^T \Delta x$
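As a quick numerical illustration of this first-order approximation, a minimal sketch; the test function $f(x, y) = x^2 + y^2$ (also used in Example 1.6.1 below) and the step sizes are illustrative assumptions, not from the slides:

```python
import numpy as np

# Illustrative test function (assumption): f(x, y) = x^2 + y^2
def f(v):
    x, y = v
    return x**2 + y**2

def grad_f(v):
    x, y = v
    return np.array([2.0 * x, 2.0 * y])  # analytic gradient of x^2 + y^2

x0 = np.array([1.0, 1.0])
for eps in (1e-1, 1e-2, 1e-3):
    dx = eps * np.array([1.0, -0.5])        # a small, arbitrary step
    exact = f(x0 + dx)                      # true value f(x0 + dx)
    approx = f(x0) + grad_f(x0) @ dx        # first-order estimate
    print(eps, exact, approx, abs(exact - approx))
# The gap shrinks roughly like eps^2, as expected for a first-order expansion.
```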
Slide 11: Example 1.6.1
- (a) $f(x) = x^2$; $f(3.5 + \varepsilon) \approx\,?$ for small $\varepsilon$
- (b) $f(x, y) = x^2 + y^2$; $f((1, 1) + (\Delta x, \Delta y)) \approx\,?$ for small $\Delta x, \Delta y$
- the gradient $\nabla f$ points in the direction of steepest ascent of the objective function
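A worked sketch of the two approximations, applying the first-order expansion from Slide 10:

```latex
\begin{align*}
\text{(a)}\;\; & f(x) = x^2,\; f'(x) = 2x: &
  f(3.5 + \varepsilon) &\approx f(3.5) + f'(3.5)\,\varepsilon = 12.25 + 7\varepsilon \\
\text{(b)}\;\; & f(x, y) = x^2 + y^2,\; \nabla f(x, y) = (2x,\, 2y)^T: &
  f\big((1, 1) + (\Delta x, \Delta y)\big) &\approx 2 + 2\Delta x + 2\Delta y
\end{align*}
```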
Slide 12: Example 1.6.2
- find the Hessian of
- (a) $f(x, y) = x^2 + 7y^2$
- (b) $f(x, y) = x^2 + 5xy + 7y^2$
- (c) $f(x, y) = x^3 + 7y^2$
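One way to check the three Hessians is symbolically; a minimal sketch using SymPy (the tooling is my choice, not part of the slides):

```python
import sympy as sp

x, y = sp.symbols('x y')

# The three functions from Example 1.6.2
functions = {
    '(a)': x**2 + 7*y**2,
    '(b)': x**2 + 5*x*y + 7*y**2,
    '(c)': x**3 + 7*y**2,
}

for label, f in functions.items():
    H = sp.hessian(f, (x, y))   # matrix of second partial derivatives
    print(label, H.tolist())
# Expected: (a) [[2, 0], [0, 14]]
#           (b) [[2, 5], [5, 14]]
#           (c) [[6*x, 0], [0, 14]]  -- this Hessian depends on x
```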
Slide 13: Quadratic Form
- general form: $f(x) = \tfrac{1}{2} x^T Q x + c^T x + a$, where x is an n-dimensional vector, Q is an $n \times n$ square matrix, c is an n-dimensional vector, and a is a scalar
- how to derive the gradient and Hessian?
- gradient: $\nabla f(x) = Qx + c$ (for symmetric Q)
- Hessian: $H = Q$
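A short derivation of the stated gradient and Hessian, under the standard assumption that Q is symmetric:

```latex
f(x) = \tfrac{1}{2} x^T Q x + c^T x + a
\;\Longrightarrow\;
\nabla f(x) = \tfrac{1}{2}(Q + Q^T)\, x + c = Qx + c
\quad (\text{since } Q^T = Q),
\qquad
H(x) = \nabla^2 f(x) = Q .
```

If Q is not symmetric, it can always be replaced by the symmetric matrix $\tfrac{1}{2}(Q + Q^T)$ without changing the value of $f$.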
Slide 14: Quadratic Form
- Example 1.6.3: relate the two forms $\tfrac{1}{2} x^T Q x + c^T x + a$ and $f(x, y) = a_1 x^2 + a_2 xy + a_3 y^2 + a_4 x + a_5 y + a_6$
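Matching coefficients between the two forms (the labels $a_1, \dots, a_6$ follow the reconstruction above; the original slide may have used different symbols):

```latex
a_1 x^2 + a_2 xy + a_3 y^2 + a_4 x + a_5 y + a_6
= \tfrac{1}{2}
\begin{pmatrix} x & y \end{pmatrix}
\begin{pmatrix} 2a_1 & a_2 \\ a_2 & 2a_3 \end{pmatrix}
\begin{pmatrix} x \\ y \end{pmatrix}
+ \begin{pmatrix} a_4 \\ a_5 \end{pmatrix}^T
\begin{pmatrix} x \\ y \end{pmatrix}
+ a_6 ,
\quad\text{i.e.,}\quad
Q = \begin{pmatrix} 2a_1 & a_2 \\ a_2 & 2a_3 \end{pmatrix},\;
c = \begin{pmatrix} a_4 \\ a_5 \end{pmatrix},\;
a = a_6 .
```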
Slide 15: Example 1.6.4
- find the first two derivatives of the following f(x):
- $f(x) = x^2$ for $x \in [-2, 2]$
- $f(x) = -x^2$ for $x \in [-2, 2]$
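A worked sketch of the derivatives, with the sign pattern that makes the example interesting:

```latex
f(x) = x^2:\quad f'(x) = 2x,\;\; f''(x) = 2 > 0 \text{ on } [-2, 2];
\qquad
f(x) = -x^2:\quad f'(x) = -2x,\;\; f''(x) = -2 < 0 \text{ on } [-2, 2].
```

The constant sign of the second derivative is the one-variable counterpart of the definiteness conditions on Hessians mentioned on Slide 8.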
Slide 16: Contour and Graph (i.e., Surface) of Function f
- Example 1.7.1: $f(x_1, x_2) = \ldots$ (function not shown)
Slide 17: Contour and Graph (i.e., Surface) of Function f
- for an n-dimensional function f:
- a contour of f: the set $\{x : f(x) = c\}$ in the n-dimensional space, for a given value c
- the graph (surface) of f: the set $\{(x, z) : z = f(x)\}$ in the (n+1)-dimensional space, traced out as x and z vary
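A minimal plotting sketch of a contour diagram next to the corresponding graph (surface); since the function of Example 1.7.1 is not reproduced here, $f(x, y) = x^2 + y^2$ is used as an assumed stand-in:

```python
import numpy as np
import matplotlib.pyplot as plt
from mpl_toolkits.mplot3d import Axes3D  # only needed on older Matplotlib versions

# Assumed illustrative function; Example 1.7.1's actual f is not shown in the extract.
def f(x, y):
    return x**2 + y**2

x = np.linspace(-2.0, 2.0, 200)
y = np.linspace(-2.0, 2.0, 200)
X, Y = np.meshgrid(x, y)
Z = f(X, Y)

fig = plt.figure(figsize=(10, 4))

# Contours: curves f(x, y) = c in the (x, y)-plane for several values of c
ax1 = fig.add_subplot(1, 2, 1)
cs = ax1.contour(X, Y, Z, levels=[0.5, 1, 2, 3])
ax1.clabel(cs)
ax1.set_title('contours f(x, y) = c')

# Graph (surface): z = f(x, y), one dimension higher
ax2 = fig.add_subplot(1, 2, 2, projection='3d')
ax2.plot_surface(X, Y, Z, alpha=0.7)
ax2.set_title('graph z = f(x, y)')

plt.show()
```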
Slide 18: Contour and Graph (i.e., Surface) of Function f
- what do the contours of the one-dimensional function $f(x) = x^2$ look like?
Slide 19: An Important Property Between the Gradient and the Tangent Plane at a Contour
- the gradient of f at a point $x_0$ is orthogonal to the tangent of the contour $f(x) = c$ at $x_0$
- many optimization results are related to this property
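A short justification of this property, included here as the standard chain-rule argument: if $x(t)$ is a smooth curve lying in the contour, so that $f(x(t)) = c$ for all t and $x(0) = x_0$, then differentiating at $t = 0$ gives

```latex
\frac{d}{dt}\, f(x(t)) \Big|_{t=0} = \nabla f(x_0)^T x'(0) = 0 ,
```

so $\nabla f(x_0)$ is orthogonal to every tangent direction $x'(0)$ of the contour at $x_0$.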
Slide 20: Gradient of f at $x_0$ Being Orthogonal to the Tangent of the Contour f(x) = c at $x_0$
- Example 1.7.3: $f(x_1, x_2) = x_1 + 2x_2$
- the gradient at $(4, 2)$?
- the tangent of the contour at $(4, 2)$?
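A worked sketch of Example 1.7.3, using only the definitions above:

```latex
\nabla f(x_1, x_2) = (1,\; 2)^T \text{ everywhere, in particular at } (4, 2);
\qquad
\text{the contour through } (4, 2) \text{ is the line } x_1 + 2x_2 = 8,
\text{ whose tangent (the line itself) has direction } (2,\; -1)^T;
\qquad
(1,\; 2)(2,\; -1)^T = 2 - 2 = 0 .
```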
Slide 21: Gradient of f at $x_0$ Being Orthogonal to the Tangent of the Contour f(x) = c at $x_0$
- Example 1.7.2: $f(x_1, x_2) = \ldots$ (function not shown)
- take a point $(x_{10}, x_{20})$ on a contour $f(x_1, x_2) = c$
Slide 22: Tangent at a Contour and the Corresponding Tangent Plane at a Surface
- the two are related
- for a contour of $f(x, y) = x^2 + y^2$, the tangent at $(x_0, y_0)$ satisfies $(x - x_0,\; y - y_0)^T (2x_0,\; 2y_0) = 0$
- recall that two vectors u and v are orthogonal when $u^T v = 0$
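As a concrete instance of this tangent condition, take the assumed sample point $(x_0, y_0) = (1, 1)$, which lies on the contour $x^2 + y^2 = 2$:

```latex
(x - 1,\; y - 1)^T (2,\; 2) = 0
\;\Longleftrightarrow\;
2(x - 1) + 2(y - 1) = 0
\;\Longleftrightarrow\;
x + y = 2 ,
```

the tangent line of the circle $x^2 + y^2 = 2$ at $(1, 1)$; the gradient $(2, 2)^T$ is normal to it.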
Slide 23: Tangent at a Contour and the Corresponding Tangent Plane at a Surface
- the tangent plane at $(x_0, y_0)$ for the surface of $f(x, y) = x^2 + y^2$
- the surface: $z = x^2 + y^2$
- defining a contour in a higher dimension: $F(x, y, z) = x^2 + y^2 - z$
- tangent plane of the surface at $(x_0, y_0, x_0^2 + y_0^2)$: what happens when $z = x_0^2 + y_0^2$?
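A sketch of the computation this slide points to, writing $z_0 = x_0^2 + y_0^2$ for the third coordinate (reconstructed from context):

```latex
\nabla F(x_0, y_0, z_0) = (2x_0,\; 2y_0,\; -1)^T
\;\Longrightarrow\;
\text{tangent plane: } 2x_0 (x - x_0) + 2y_0 (y - y_0) - (z - z_0) = 0 ;
```

setting $z = z_0$ in the tangent plane recovers $2x_0(x - x_0) + 2y_0(y - y_0) = 0$, the tangent of the contour $x^2 + y^2 = x_0^2 + y_0^2$ from Slide 22.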