Non-Linear Programming
© 2011 Daniel Kirschen and University of Washington
Motivation

The method of Lagrange multipliers:
- gives very useful insight into the nature of solutions
- yields an analytical solution that is practical only for small problems
- is not practical to apply directly to real-life problems, because these problems are too large
- runs into difficulties when the problem is non-convex

In practice, we often need to search for the solution of an optimization problem using:
- the objective function only, or
- the objective function and its first derivative, or
- the objective function and its first and second derivatives
Naïve One-Dimensional Search

Suppose:
- we want to find the value of x that minimizes f(x)
- the only thing we can do is calculate the value of f(x) for any value of x

We could calculate f(x) for a range of values of x and choose the one that minimizes f(x).
(Figure: f(x) evaluated at many sample points over a range of x.)

This requires a considerable amount of computing time if the range is large and good accuracy is needed.
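The naïve approach can be sketched as a simple grid search. A minimal sketch; the test function and range are illustrative assumptions, not from the lecture:

```python
import numpy as np

def naive_search(f, lo, hi, n_points=1001):
    """Evaluate f on an evenly spaced grid and return the best x."""
    xs = np.linspace(lo, hi, n_points)
    values = [f(x) for x in xs]
    return xs[int(np.argmin(values))]

# Illustrative example: minimize f(x) = (x - 2)^2 over [0, 5].
x_best = naive_search(lambda x: (x - 2.0) ** 2, 0.0, 5.0)
# Accuracy is limited by the grid spacing: halving the error requires
# doubling n_points, which is why this approach is so expensive.
```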
One-Dimensional Search

(Figure sequence: f(x) evaluated at successive points x0, x1, x2, ...)
- Evaluate f at x0, x1, and x2. If the function is convex, we have bracketed the optimum; the current search range is [x0, x2].
- Evaluate f at x3: the optimum is now between x1 and x2, so we do not need to consider x0 anymore.
- Repeat the process (x4, ...) until the range is sufficiently small.
- The procedure is valid only if the function is convex!
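This bracket-shrinking procedure is, in essence, golden-section search. A minimal sketch under the slide's convexity assumption; the test function is illustrative:

```python
import math

def golden_section_min(f, a, b, tol=1e-6):
    """Repeatedly shrink the bracket [a, b] around the minimum of a
    convex (unimodal) function f."""
    invphi = (math.sqrt(5.0) - 1.0) / 2.0   # inverse golden ratio ~0.618
    c = b - invphi * (b - a)
    d = a + invphi * (b - a)
    while b - a > tol:
        if f(c) < f(d):
            b, d = d, c                     # optimum lies in [a, d]
            c = b - invphi * (b - a)
        else:
            a, c = c, d                     # optimum lies in [c, b]
            d = a + invphi * (b - a)
    return 0.5 * (a + b)

# Illustrative example: minimize (x - 2)^2, initially bracketed by [0, 5].
x_opt = golden_section_min(lambda x: (x - 2.0) ** 2, 0.0, 5.0)
```

Each iteration shrinks the bracket by a constant factor, so the cost grows only logarithmically with the required accuracy, unlike the naïve grid search.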
Multi-Dimensional Search

- The one-dimensional search above is not directly applicable
- A naïve search becomes totally impossible as the dimension of the problem increases
- If we can calculate the first derivatives of the objective function, much more efficient searches can be developed
- The gradient of a function gives the direction in which it increases/decreases fastest
Steepest Ascent/Descent Algorithm

(Figure sequence: contour plot of the objective function in the (x1, x2) plane. From a starting point, the algorithm computes the gradient, performs a unidirectional search along the gradient direction, and repeats from the new point until it reaches the optimum.)

Unidirectional Search

(Figure: a one-dimensional search for the best point along the gradient direction, i.e. minimizing the objective function along that line.)
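A minimal sketch of steepest descent with a crude unidirectional search (step halving until the objective decreases); the quadratic test function and the halving rule are illustrative assumptions, not the lecture's exact method:

```python
import numpy as np

def steepest_descent(f, grad, x0, tol=1e-6, max_iter=500):
    """Repeatedly move along the negative gradient, choosing the step
    length by a one-dimensional search along that direction."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:
            break                      # gradient ~ 0: at an optimum
        # Crude unidirectional search: halve the step until f decreases.
        t = 1.0
        while f(x - t * g) >= f(x) and t > 1e-12:
            t *= 0.5
        x = x - t * g
    return x

# Illustrative example: f(x1, x2) = (x1 - 1)^2 + 4*(x2 + 2)^2,
# minimized at (1, -2).
f = lambda x: (x[0] - 1.0) ** 2 + 4.0 * (x[1] + 2.0) ** 2
grad = lambda x: np.array([2.0 * (x[0] - 1.0), 8.0 * (x[1] + 2.0)])
x_opt = steepest_descent(f, grad, [0.0, 0.0])
```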
Choosing a Direction

- The direction of steepest ascent/descent is not always the best choice
- Other techniques have been used, with varying degrees of success
- In particular, the direction chosen must be consistent with the equality constraints
How far to go in that direction?

- Unidirectional searches can be time-consuming
- Second-order techniques, which use information about the second derivative of the objective function, can speed up the process
- Problem with the inequality constraints:
  - there may be a lot of inequality constraints
  - checking all of them every time we move in one direction can take an enormous amount of computing time
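Newton's method is the classic second-order technique: the curvature sets the step length directly, removing the need for a unidirectional search. A sketch for the one-dimensional case; the test function exp(x) - 2x is an illustrative assumption:

```python
import math

def newton_min(fprime, fsecond, x0, tol=1e-10, max_iter=50):
    """Find a stationary point of f by applying Newton's method to
    f'(x) = 0; the second derivative scales each step."""
    x = x0
    for _ in range(max_iter):
        step = fprime(x) / fsecond(x)
        x -= step
        if abs(step) < tol:
            break
    return x

# Illustrative example: f(x) = exp(x) - 2x, minimized at x = ln 2.
x_opt = newton_min(lambda x: math.exp(x) - 2.0,   # f'(x)
                   lambda x: math.exp(x),         # f''(x) > 0 (convex)
                   x0=0.0)
```

Convergence near the optimum is quadratic, which is why second-order methods need far fewer iterations than gradient-only searches when second derivatives are available.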
Handling of Inequality Constraints

(Figures: in the (x1, x2) plane, the search moves in the direction of the gradient until it hits the boundary of the feasible region.)
- Move in the direction of the gradient
- How do I know that I have to stop at the constraint boundary?
Penalty Functions

(Figure: a penalty term, e.g. K(x - xmax)², that is zero inside [xmin, xmax] and grows steeply outside the limits.)

Replace the enforcement of inequality constraints by the addition of penalty terms to the objective function.
Problem with Penalty Functions

- The stiffness of the penalty function must be increased progressively to enforce the constraints tightly enough
- Not a very efficient method
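The stiffness problem can be seen in a small sketch. Minimizing f(x) = (x - 3)² subject to x ≤ 2 with a quadratic penalty gives the penalized minimizer x = 2 + 1/(1 + K), so the constraint is only enforced tightly as K grows. The example and the plain gradient-descent solver are illustrative assumptions:

```python
def solve_with_penalty(fprime, x_max, K, x0=0.0, lr=0.005, iters=20000):
    """Minimize f(x) + K*(x - x_max)^2 by gradient descent; the penalty
    term is active only where the constraint x <= x_max is violated."""
    x = x0
    for _ in range(iters):
        g = fprime(x)
        if x > x_max:
            g += 2.0 * K * (x - x_max)
        x -= lr * g
    return x

# Illustrative example: minimize (x - 3)^2 subject to x <= 2.
# The penalized minimizer is x = 2 + 1/(1 + K): the constraint is
# always violated slightly, and K must be increased progressively
# to reduce the violation.
for K in (1.0, 10.0, 100.0):
    x_K = solve_with_penalty(lambda x: 2.0 * (x - 3.0), 2.0, K)
```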
Barrier Functions

(Figure: a barrier cost that grows without bound as x approaches xmin or xmax from inside the feasible range.)
- The barrier must be moved progressively closer to the limit
- Works better than the penalty function approach
- This is the basis of interior point methods
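A minimal sketch of the logarithmic barrier, the form used by interior point methods. The example problem and the brute-force grid minimizer (used only to keep the sketch short) are illustrative assumptions:

```python
import math

def minimize_with_barrier(f, x_min, x_max, mu, n=20000):
    """Minimize f(x) - mu*[ln(x - x_min) + ln(x_max - x)] over a grid of
    points strictly inside (x_min, x_max). The logarithms blow up at the
    limits, so the minimizer always stays interior."""
    best_x, best_v = None, float("inf")
    for i in range(1, n):
        x = x_min + (x_max - x_min) * i / n
        v = f(x) - mu * (math.log(x - x_min) + math.log(x_max - x))
        if v < best_v:
            best_x, best_v = x, v
    return best_x

# Illustrative example: minimize (x - 3)^2 on (0, 2). The constrained
# optimum is at the limit x = 2; shrinking mu moves the barrier
# minimizer progressively closer to that limit.
for mu in (0.1, 0.01):
    x_mu = minimize_with_barrier(lambda x: (x - 3.0) ** 2, 0.0, 2.0, mu)
```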
Non-Robustness

(Figure: contours in the (x1, x2) plane with two local optima, X and Y; searches started from points A, B, C, and D do not all reach the same one.)

Different starting points may lead to different solutions if the problem is not convex.
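This non-robustness is easy to reproduce. A sketch with an illustrative one-dimensional non-convex function (not from the lecture):

```python
def descend(fprime, x0, lr=0.01, iters=2000):
    """Plain gradient descent from a given starting point."""
    x = x0
    for _ in range(iters):
        x -= lr * fprime(x)
    return x

# Illustrative non-convex example: f(x) = x^4 - 2x^2 has two local
# minima, at x = -1 and x = +1. The starting point decides which
# one the search finds.
fprime = lambda x: 4.0 * x ** 3 - 4.0 * x
x_left = descend(fprime, -0.5)    # converges to the minimum near -1
x_right = descend(fprime, 0.5)    # converges to the minimum near +1
```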
Conclusions

- Very sophisticated non-linear programming methods have been developed
- They can be difficult to use:
  - different starting points may lead to different solutions
  - some problems require a lot of iterations
  - they may require a lot of "tuning"