MATH 685 / CSI 700 / OR 682 Lecture Notes. Lecture 9: Optimization problems.
Optimization
Optimization problems
Examples
Global vs. local optimization
Global optimization
Finding, or even verifying, a global minimum is difficult in general. Most optimization methods are designed to find a local minimum, which may or may not be the global minimum. If the global minimum is desired, one can try several widely separated starting points and check whether they all produce the same result. For some problems, such as linear programming, global optimization is more tractable.
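This multistart strategy is easy to prototype. Below is a minimal sketch using SciPy's minimize; the objective function and the starting points are illustrative assumptions, not taken from the notes.

```python
import numpy as np
from scipy.optimize import minimize

def f(x):
    # Illustrative nonconvex objective with several local minima
    return np.sin(3 * x[0]) + (x[0] - 0.5) ** 2 + (x[1] - 1.0) ** 2

# Run a local method from widely separated starting points
starts = [np.array([-3.0, -3.0]), np.array([0.0, 0.0]), np.array([3.0, 3.0])]
results = [minimize(f, x0, method="BFGS") for x0 in starts]

for x0, r in zip(starts, results):
    print(f"start {x0} -> x* = {r.x}, f(x*) = {r.fun:.6f}")
best = min(results, key=lambda r: r.fun)
print("best local minimum found:", best.x, best.fun)
```

If all starts agree, that is evidence (not proof) that the local minimum found is global.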
Existence of minimum
Level sets
Uniqueness of minimum
First-order optimality condition
Second-order optimality condition
Constrained optimality
If inequality constraints are present, then the KKT optimality conditions additionally require nonnegativity of the Lagrange multipliers corresponding to the inequalities, together with a complementarity condition.
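For reference, a standard statement of these conditions for minimizing f(x) subject to g(x) = 0 and h(x) ≤ 0, with multipliers λ for the equalities and μ for the inequalities, is:

```latex
\begin{aligned}
\nabla f(x^*) + J_g(x^*)^T \lambda^* + J_h(x^*)^T \mu^* &= 0 \\
g(x^*) = 0, \qquad h(x^*) &\le 0 \\
\mu^* \ge 0, \qquad \mu_i^* \, h_i(x^*) &= 0 \quad \text{for each } i
\end{aligned}
```

The last line is the complementarity condition: for each inequality, either the constraint is active or its multiplier vanishes.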
Sensitivity and conditioning
Unimodality
Golden section search
Example
Example (cont.)
Successive parabolic interpolation
Example
Newton’s method
Newton’s method for finding a minimum normally has a quadratic convergence rate, but it must be started close enough to the solution to converge.
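A minimal sketch of the one-dimensional iteration x ← x − f′(x)/f″(x); the test function and starting point here are illustrative assumptions.

```python
import math

def newton_minimize(fprime, fsecond, x0, tol=1e-10, max_iter=50):
    """Newton's method for a 1-D minimum: iterate x <- x - f'(x)/f''(x)."""
    x = x0
    for _ in range(max_iter):
        step = fprime(x) / fsecond(x)
        x -= step
        if abs(step) < tol:
            break
    return x

# Illustrative objective f(x) = x**2 - 4*cos(x), minimized at x = 0
xstar = newton_minimize(lambda x: 2 * x + 4 * math.sin(x),   # f'
                        lambda x: 2 + 4 * math.cos(x),       # f''
                        x0=1.0)
print(xstar)  # quadratic convergence when started near the minimum
```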
Example
Safeguarded methods
Multidimensional optimization. Direct search methods
Steepest descent method
Example
Example (cont.)
Newton’s method
Example
Newton’s method
Trust region methods
Quasi-Newton methods
Secant updating methods
BFGS method
Example
For a quadratic objective function, BFGS with exact line search finds the exact solution in at most n iterations, where n is the dimension of the problem.
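A quick numerical check of this property, sketched with SciPy (whose BFGS uses an inexact Wolfe line search, so the iteration count is only roughly n); the random quadratic is an illustrative assumption.

```python
import numpy as np
from scipy.optimize import minimize

n = 5
rng = np.random.default_rng(0)
M = rng.standard_normal((n, n))
A = M @ M.T + n * np.eye(n)        # symmetric positive definite Hessian
b = rng.standard_normal(n)

f = lambda x: 0.5 * x @ A @ x - b @ x
grad = lambda x: A @ x - b

res = minimize(f, np.zeros(n), jac=grad, method="BFGS")
print("iterations:", res.nit)
print("error:", np.linalg.norm(res.x - np.linalg.solve(A, b)))
```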
Conjugate gradient method
CG method
CG method example
Example (cont.)
Truncated Newton methods
Another way to reduce the work in Newton-like methods is to solve the linear system for the Newton step by an iterative method. A small number of iterations may suffice to produce a step as useful as the true Newton step, especially far from the solution, where the true Newton step may be unreliable anyway. A good choice for the iterative linear solver is the CG method, which gives a step intermediate between the steepest descent direction and the Newton-like step. Since only matrix-vector products are required, explicit formation of the Hessian matrix can be avoided by using a finite difference of the gradient along a given vector.
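A sketch of these two ingredients, assuming a user-supplied gradient callable; the function names and tolerances are illustrative assumptions.

```python
import numpy as np

def hessvec(grad, x, v, eps=1e-6):
    """Approximate H(x) @ v by a finite difference of the gradient along v."""
    return (grad(x + eps * v) - grad(x)) / eps

def truncated_newton_step(grad, x, cg_iters=10, tol=1e-8):
    """Approximately solve H p = -g by CG, using only Hessian-vector products."""
    g = grad(x)
    p = np.zeros_like(x)
    r = -g.copy()                  # residual of H p = -g at p = 0
    d = r.copy()
    rs = r @ r
    for _ in range(cg_iters):
        Hd = hessvec(grad, x, d)
        alpha = rs / (d @ Hd)
        p += alpha * d
        r -= alpha * Hd
        rs_new = r @ r
        if np.sqrt(rs_new) < tol:  # stop early: a few iterations often suffice
            break
        d = r + (rs_new / rs) * d
        rs = rs_new
    return p
```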
Nonlinear least squares
Gauss-Newton method
Example
Example (cont.)
Gauss-Newton method
Levenberg-Marquardt method
With a suitable strategy for choosing μ_k, this method can be very robust in practice, and it forms the basis for several effective software packages.
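A sketch of a single step, assuming hypothetical residual and jac callables; in practice one would typically call an established implementation such as scipy.optimize.least_squares with method="lm".

```python
import numpy as np

def lm_step(residual, jac, x, mu):
    """One Levenberg-Marquardt step: solve (J^T J + mu*I) s = -J^T r."""
    r = residual(x)                # residual vector at current iterate
    J = jac(x)                     # Jacobian of the residual
    s = np.linalg.solve(J.T @ J + mu * np.eye(x.size), -J.T @ r)
    return x + s
```

Increasing μ pushes the step toward steepest descent, while μ → 0 recovers the Gauss-Newton step, which is what makes a good μ-selection strategy so effective.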
Equality-constrained optimization
Sequential quadratic programming
Merit function
Inequality-constrained optimization
Penalty methods
This enables the use of unconstrained optimization methods, but the problem becomes ill-conditioned for large ρ, so we solve a sequence of problems with gradually increasing values of ρ, with the minimizer for each problem used as the starting point for the next.
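A minimal sketch of the quadratic-penalty loop for equality constraints g(x) = 0; the helper name, penalty schedule, and test problem are illustrative assumptions.

```python
import numpy as np
from scipy.optimize import minimize

def penalty_solve(f, g, x0, rho0=1.0, factor=10.0, n_outer=6):
    """Minimize f(x) + 0.5*rho*||g(x)||^2 for a sequence of increasing rho,
    warm-starting each subproblem at the previous minimizer."""
    x, rho = np.asarray(x0, dtype=float), rho0
    for _ in range(n_outer):
        phi = lambda x, rho=rho: f(x) + 0.5 * rho * np.sum(np.asarray(g(x)) ** 2)
        x = minimize(phi, x).x     # unconstrained subproblem
        rho *= factor
    return x

# Example: minimize x0^2 + x1^2 subject to x0 + x1 = 1 (exact solution [0.5, 0.5])
print(penalty_solve(lambda x: x @ x,
                    lambda x: np.array([x[0] + x[1] - 1.0]),
                    x0=[0.0, 0.0]))
```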
Barrier methods
Example: constrained optimization
Example (cont.)
Linear programming
Example: linear programming