Optimality Conditions for Unconstrained Optimization: One-dimensional optimization – necessary and sufficient conditions; Multidimensional optimization – classification of stationary points.


2 Optimality Conditions for Unconstrained Optimization
– One-dimensional optimization: necessary and sufficient conditions
– Multidimensional optimization: classification of stationary points; necessary and sufficient conditions for local optima
– Convexity and global optimality

3 One dimensional optimization: We are accustomed to thinking that if f(x) has a minimum then f'(x) = 0, but a zero derivative by itself does not guarantee a minimum.

4 1D optimization jargon: A point with zero derivative is a stationary point (for example, x = 5 in the plotted functions). A stationary point can be a minimum, a maximum, or an inflection point.

5 Optimality criteria for smooth 1D functions at a point x*: f'(x*) = 0 is the condition for stationarity and a necessary condition for a minimum or a maximum. f''(x*) > 0 is sufficient for a minimum; f''(x*) < 0 is sufficient for a maximum. When f''(x*) = 0, classification needs information from higher derivatives. Example?
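One possible answer to the prompt: f(x) = x^3 has f''(0) = 0 and x* = 0 is an inflection point, while f(x) = x^4 also has f''(0) = 0 yet x* = 0 is a minimum; the order and sign of the first non-vanishing higher derivative decide.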

6 Problems 1D
Classify the stationary points of the following functions from the optimality conditions, then check by plotting them:
1. f = 2x^3 + 3x^2
2. f = 3x^4 + 4x^3 - 12x^2
3. f = x^5
4. f = x^4 + 4x^3 + 6x^2 + 4x
Answer true or false (solutions in the notes page):
– A function can have a negative value at its maximum point.
– If a constant is added to a function, the location of its minimum point can change.
– If the curvature of a function is negative at a stationary point, then the point is a maximum.
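A minimal SymPy sketch (not part of the original slides) for checking such problems, shown here on the first function:

import sympy as sp

x = sp.symbols('x', real=True)
f = 2*x**3 + 3*x**2                      # first problem function

fp, fpp = sp.diff(f, x), sp.diff(f, x, 2)
for xs in sp.solve(sp.Eq(fp, 0), x):     # stationary points satisfy f'(x*) = 0
    curv = fpp.subs(x, xs)
    if curv > 0:
        kind = "minimum"
    elif curv < 0:
        kind = "maximum"
    else:
        kind = "inconclusive: higher derivatives needed"
    print(f"x* = {xs}: f'' = {curv}, {kind}")

For f = 2x^3 + 3x^2 this reports a maximum at x* = -1 and a minimum at x* = 0.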

7 Taylor series expansion in n dimensions: Expanding about a candidate minimum x*, f(x) ≈ f(x*) + ∇f(x*)^T (x - x*) + ½ (x - x*)^T H(x*) (x - x*), where H is the matrix of second derivatives. The condition for stationarity is ∇f(x*) = 0.

8 Conditions for minimum: A sufficient condition for a minimum is that Δx^T H(x*) Δx > 0 for all Δx ≠ 0, that is, the matrix of second derivatives (Hessian) is positive definite. The simplest way to check positive definiteness is through the eigenvalues: all eigenvalues need to be positive. The necessary condition is that the matrix is positive semi-definite, i.e., all eigenvalues are non-negative.

9 Types of stationary points
– Positive definite Hessian: minimum
– Positive semi-definite: possibly a minimum
– Indefinite: saddle point
– Negative semi-definite: possibly a maximum
– Negative definite: maximum
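A minimal NumPy sketch (an illustration, not from the slides) that classifies a stationary point from the eigenvalues of its Hessian:

import numpy as np

def classify_stationary_point(H, tol=1e-8):
    """Classify a stationary point from the eigenvalues of its (symmetric) Hessian."""
    eig = np.linalg.eigvalsh(H)              # eigenvalues of a symmetric matrix
    if np.all(eig > tol):
        return "positive definite: minimum"
    if np.all(eig < -tol):
        return "negative definite: maximum"
    if np.all(eig >= -tol):
        return "positive semi-definite: possibly a minimum"
    if np.all(eig <= tol):
        return "negative semi-definite: possibly a maximum"
    return "indefinite: saddle point"

# Hessian of f(x, y) = x**2 - y**2 at its stationary point (0, 0)
print(classify_stationary_point(np.diag([2.0, -2.0])))   # indefinite: saddle point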

10 Example
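As an illustration (not necessarily the slide's own example): for f(x, y) = 2x^2 + 2xy + y^2 the gradient (4x + 2y, 2x + 2y) vanishes only at (0, 0); the Hessian [[4, 2], [2, 2]] has det = 4 > 0 and trace = 6 > 0, so both eigenvalues are positive, the Hessian is positive definite, and the origin is a minimum.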

11 Problems n-dimensional: Find the stationary points of the following functions and classify them.

12 Global optimization: The function x + sin(2x) has an infinite number of local minima, so conditions based on local derivatives cannot by themselves identify the global minimum.
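Worked here as a check (not taken from the slide): f'(x) = 1 + 2cos(2x) = 0 gives cos(2x) = -1/2, i.e., x = ±π/3 + kπ for integer k. With f''(x) = -4sin(2x), the points x = -π/3 + kπ have f'' > 0 and are local minima, while x = π/3 + kπ are local maxima. On a bounded interval, finding the global minimum therefore requires comparing the values at all local minima (and at the interval ends); the local optimality conditions alone cannot single it out.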

13 Convex function: A straight line segment connecting any two points on the graph never dips below the graph. For a convex function, any local minimum is also a global minimum. Sufficient condition: a positive semi-definite Hessian everywhere.
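For example, f(x) = x^4 has f''(x) = 12x^2 ≥ 0 everywhere and is convex on the whole line, while f(x) = x^3 has f''(x) = 6x, so it is convex only on x ≥ 0.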

14 Problems convexity: Check the following functions for convexity. If a function is not convex everywhere, find its domain of convexity. See the notes page.

15 Reciprocal approximation: The reciprocal approximation (linear in the reciprocals of the variables) is desirable in many cases because it captures decreasing-returns behavior. The linear and reciprocal approximations about a point x0 are given below.
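In standard notation (the usual forms; the notation here is chosen for this transcript, not taken from the slide), expanding about x0:
Linear:      f_L(x) = f(x0) + Σ_i ∂f/∂x_i(x0) · (x_i - x_i0)
Reciprocal:  f_R(x) = f(x0) + Σ_i ∂f/∂x_i(x0) · (x_i0/x_i)(x_i - x_i0)
The reciprocal form is obtained by linearizing in the variables 1/x_i, and it is exact for any function that is linear in the reciprocals 1/x_i.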

16 Conservative-convex approximation: At times we benefit from conservative approximations, ones that overestimate the true function. The approximation f_C is constructed so that all of its second derivatives are non-negative, making it convex. It is called convex linearization (CONLIN) and is due to Claude Fleury.
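In its standard form (assuming positive variables, x_i > 0), CONLIN keeps the linear term for every variable whose derivative at x0 is positive and the reciprocal term for every variable whose derivative is negative:
f_C(x) = f(x0) + Σ_{i: ∂f/∂x_i > 0} ∂f/∂x_i(x0) · (x_i - x_i0) + Σ_{i: ∂f/∂x_i < 0} ∂f/∂x_i(x0) · (x_i0/x_i)(x_i - x_i0)
With this choice every second derivative ∂²f_C/∂x_i² is non-negative, so f_C is convex, and for each variable the retained term is the larger (more conservative) of the linear and reciprocal terms.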

17 Problems approximations
1. Construct the linear, reciprocal, and convex (CONLIN) approximations at (1,1) to the given function.
2. Plot and compare the function and the approximations.
3. Check their properties of convexity and conservativeness.
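A minimal NumPy sketch of how the three approximations can be constructed and compared; the function f below is a hypothetical stand-in, not the function from the problem:

import numpy as np

# Hypothetical test function (a stand-in, NOT the function from the problem slide)
def f(x):
    x1, x2 = x
    return x1**2 + 1.0 / x2

def grad_f(x):
    x1, x2 = x
    return np.array([2.0 * x1, -1.0 / x2**2])

x0 = np.array([1.0, 1.0])                 # expansion point from the problem statement
f0, g0 = f(x0), grad_f(x0)

def linear_approx(x):                     # f_L: linear in x
    return f0 + g0 @ (x - x0)

def reciprocal_approx(x):                 # f_R: linear in 1/x
    return f0 + np.sum(g0 * (x0 / x) * (x - x0))

def conlin_approx(x):                     # f_C: linear term if df/dx_i > 0, reciprocal otherwise
    terms = np.where(g0 > 0, g0 * (x - x0), g0 * (x0 / x) * (x - x0))
    return f0 + np.sum(terms)

xt = np.array([2.0, 0.5])                 # a trial point for comparison
print(f(xt), linear_approx(xt), reciprocal_approx(xt), conlin_approx(xt))
# CONLIN gives the largest (most conservative) value of the three approximations at xt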

