Inexact SQP Methods for Equality Constrained Optimization
Frank Edward Curtis, Department of IE/MS, Northwestern University
with Richard Byrd and Jorge Nocedal
INFORMS Annual Meeting, November 6, 2006
Outline
- Introduction
  - Problem formulation
  - Motivation for inexactness
  - Unconstrained optimization and nonlinear equations
- Algorithm Development
  - Step computation
  - Step acceptance
- Global Analysis
  - Merit function and sufficient decrease
  - Satisfying first-order conditions
- Conclusions/Final remarks
Equality constrained optimization
Goal: solve the equality constrained problem
Define: the Lagrangian and its derivatives
Goal: solve the KKT conditions
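The formulas on this slide did not survive transcription; the standard formulation of this problem class, in our own notation, is:

```latex
% Equality constrained problem (notation ours; slide equations lost):
\min_{x \in \mathbb{R}^n} \; f(x)
  \quad \text{subject to} \quad c(x) = 0
% Lagrangian:
\mathcal{L}(x,\lambda) = f(x) + \lambda^T c(x)
% First-order (KKT) conditions:
\nabla_x \mathcal{L}(x,\lambda)
  = \nabla f(x) + \nabla c(x)\,\lambda = 0,
\qquad c(x) = 0
```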
Equality constrained optimization
Two "equivalent" step computation techniques:
- Algorithm: Newton's method
- Algorithm: the SQP subproblem
KKT matrix: cannot be formed; cannot be factored
Linear system solve: iterative method — inexactness
Unconstrained optimization
Goal: minimize a nonlinear objective
Algorithm: Newton's method with CG
Note: choosing any intermediate CG step ensures global convergence to a local solution of the NLP (Steihaug, 1983)
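As an illustration of the Steihaug idea, here is a minimal truncated-CG sketch; the function names and parameter values are our own, not from the slides. Any iterate produced inside the loop (including early exits at the trust-region boundary or on negative curvature) is an acceptable inexact Newton-CG step.

```python
# Minimal Steihaug truncated CG for the trust-region Newton subproblem
#   min_d  g^T d + 0.5 d^T H d   subject to  ||d|| <= delta.
# Sketch only; names and tolerances are our own choices.

def cg_steihaug(H, g, delta, tol=1e-10, max_iter=50):
    n = len(g)
    d = [0.0] * n
    r = g[:]                    # residual r = H d + g (d = 0 initially)
    p = [-ri for ri in r]       # first direction: steepest descent
    for _ in range(max_iter):
        Hp = [sum(H[i][j] * p[j] for j in range(n)) for i in range(n)]
        pHp = sum(p[i] * Hp[i] for i in range(n))
        if pHp <= 0:            # negative curvature: step to the boundary
            tau = _to_boundary(d, p, delta)
            return [d[i] + tau * p[i] for i in range(n)]
        alpha = sum(ri * ri for ri in r) / pHp
        d_new = [d[i] + alpha * p[i] for i in range(n)]
        if sum(x * x for x in d_new) ** 0.5 >= delta:
            tau = _to_boundary(d, p, delta)   # truncate at the boundary
            return [d[i] + tau * p[i] for i in range(n)]
        r_new = [r[i] + alpha * Hp[i] for i in range(n)]
        if sum(x * x for x in r_new) ** 0.5 < tol:
            return d_new
        beta = sum(x * x for x in r_new) / sum(x * x for x in r)
        d, r = d_new, r_new
        p = [-r[i] + beta * p[i] for i in range(n)]
    return d

def _to_boundary(d, p, delta):
    # Positive root tau of ||d + tau * p|| = delta.
    a = sum(x * x for x in p)
    b = 2 * sum(d[i] * p[i] for i in range(len(d)))
    c = sum(x * x for x in d) - delta ** 2
    return (-b + (b * b - 4 * a * c) ** 0.5) / (2 * a)
```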
Nonlinear equations
Goal: solve a nonlinear system
Algorithm: Newton's method
Note: choosing any step satisfying the inexact Newton residual condition ensures global convergence (Dembo, Eisenstat, and Steihaug, 1982; Eisenstat and Walker, 1994)
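The residual condition the slide cites is the Dembo–Eisenstat–Steihaug test ‖F(x) + J(x)d‖ ≤ η‖F(x)‖ with η ∈ [0, 1). A minimal sketch, where the driver, the 1-D example, and all constants are our own illustration:

```python
# Inexact Newton for F(x) = 0: accept any step d with
#   ||F(x) + J(x) d|| <= eta * ||F(x)||,   0 <= eta < 1.
# Sketch only; the driver and example are ours, not the authors'.

def inexact_newton(F, step, x, tol=1e-10, max_iter=100):
    """step(x, Fx) must return d with ||F(x) + J(x) d|| <= eta ||F(x)||."""
    for _ in range(max_iter):
        Fx = F(x)
        if max(abs(v) for v in Fx) < tol:
            break
        x = [xi + di for xi, di in zip(x, step(x, Fx))]
    return x

# 1-D example: F(x) = x^2 - 2. Taking d = -(1 - eta/2) * F / F' leaves
# a linear-model residual of (eta/2)*|F|, so the test holds for eta = 0.5.
eta = 0.5
root = inexact_newton(
    lambda x: [x[0] ** 2 - 2.0],
    lambda x, Fx: [-(1.0 - eta / 2.0) * Fx[0] / (2.0 * x[0])],
    [1.5],
)
```

Even with this deliberately inexact step, the iteration still converges to √2; the forcing parameter η only slows the local rate.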
Outline
- Introduction/Motivation
  - Unconstrained optimization
  - Nonlinear equations
  - Constrained optimization
- Algorithm Development
  - Step computation
  - Step acceptance
- Global Analysis
  - Merit function and sufficient decrease
  - Satisfying first-order conditions
- Conclusions/Final remarks
Equality constrained optimization
Two "equivalent" step computation techniques: Newton's method; the SQP subproblem
Question: can we ensure convergence to a local solution by choosing any step in the ball?
Step computation: inexact SQP step
Globalization strategy: exact merit function … with Armijo line search condition
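The merit function and line search condition were lost in transcription; a standard exact penalty merit function with an Armijo condition of the kind described (notation ours) is:

```latex
% Exact (nonsmooth) penalty merit function, penalty parameter pi > 0:
\phi_\pi(x) = f(x) + \pi \, \| c(x) \|
% Armijo line search condition on step d_k, with eta in (0, 1):
\phi_\pi(x_k + \alpha_k d_k)
  \le \phi_\pi(x_k) + \eta \, \alpha_k \, D\phi_\pi(x_k; d_k)
% where D\phi_\pi(x_k; d_k) is the directional derivative along d_k.
```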
First attempt
Proposition: sufficiently small residual
Test: 61 problems from the CUTEr test set

Residual tol   1e-8   1e-7   1e-6   1e-5   1e-4   1e-3   1e-2   1e-1
Success        100%   100%   100%    97%    90%    85%    72%    38%
Failure          0%     0%     0%     3%    10%    15%    28%    62%
First attempt… not robust
Proposition: a sufficiently small residual is not enough for complete robustness:
- We have multiple goals (feasibility and optimality)
- The Lagrange multipliers may be completely off
Second attempt
Step computation: inexact SQP step
Recall the line search condition; we can show … but how negative should this be?
Quadratic/linear model of merit function
Create the model; quantify the reduction obtained from the step
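The model itself was lost with the slide graphics; a standard quadratic/linear model of the penalty merit function, in our own notation (W_k a Hessian approximation of the Lagrangian, A_k the constraint Jacobian), is:

```latex
% Quadratic/linear model of phi_pi at x_k (notation ours):
m_k(d;\pi) = f_k + \nabla f_k^T d + \tfrac{1}{2} d^T W_k d
             + \pi \, \| c_k + A_k d \|
% Reduction in the model obtained from step d:
\Delta m_k(d;\pi) = m_k(0;\pi) - m_k(d;\pi)
  = -\nabla f_k^T d - \tfrac{1}{2} d^T W_k d
    + \pi \left( \| c_k \| - \| c_k + A_k d \| \right)
```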
Exact case
The exact step minimizes the objective on the linearized constraints… which may lead to an increase in the objective (but that's ok)
Inexact case
Option #1: current penalty parameter
Step is acceptable if the model reduction condition holds for the current penalty parameter
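The acceptance test itself did not survive transcription; one representative form of a sufficient model reduction condition — our sketch, which may differ in detail from the talk's — is:

```latex
% Accept d_k under the current penalty parameter pi_k if the model
% reduction is a positive fraction of the infeasibility, sigma in (0,1):
\Delta m_k(d_k;\pi_k) \ge \sigma \, \pi_k \, \| c_k \|
```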
Option #2: new penalty parameter
Step is acceptable if the model reduction condition holds for a new (increased) penalty parameter
Algorithm outline
for k = 0, 1, 2, …
  Iteratively solve the step equations until the acceptance condition of Option #1 or Option #2 holds
  Update the penalty parameter
  Perform a backtracking line search
  Update the iterate
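The outline above can be sketched on a toy problem, min ½(x₁² + x₂²) s.t. x₁ + x₂ = 1. For simplicity the step equations are solved exactly in closed form here (the talk solves them inexactly with a residual test), and every constant below is our own choice, not the authors':

```python
# Outer loop sketch: compute SQP step, update penalty parameter,
# backtrack on an exact penalty merit function, update the iterate.
# Toy problem: min 0.5*(x1^2 + x2^2)  s.t.  x1 + x2 = 1.

def merit(x, pi):
    f = 0.5 * (x[0] ** 2 + x[1] ** 2)
    return f + pi * abs(x[0] + x[1] - 1.0)   # phi_pi = f + pi*|c|

def sqp_toy(x, lam=0.0, pi=1.0, tol=1e-8, max_iter=50):
    for _ in range(max_iter):
        grad_L = [x[0] + lam, x[1] + lam]    # gradient of the Lagrangian
        c = x[0] + x[1] - 1.0                # constraint value
        if max(abs(grad_L[0]), abs(grad_L[1]), abs(c)) < tol:
            break                            # KKT conditions hold
        # SQP step: with Hessian W = I and Jacobian A = [1, 1], the KKT
        # system yields lam_new = -1/2 in closed form for this toy.
        lam_new = -0.5
        d = [-x[0] - lam_new, -x[1] - lam_new]
        # Keep the penalty parameter above the multiplier size
        pi = max(pi, abs(lam_new) + 0.1)
        # Backtracking (Armijo-style) line search on the merit function
        alpha = 1.0
        while alpha > 1e-12:
            trial = [x[0] + alpha * d[0], x[1] + alpha * d[1]]
            if merit(trial, pi) <= merit(x, pi) - 1e-4 * alpha:
                break
            alpha *= 0.5
        x, lam = trial, lam_new
    return x, lam
```

On this toy the method reaches the solution x = (½, ½), λ = −½ in one full step; the loop structure, not the toy, is the point.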
Termination test: observe the KKT conditions
Global Analysis
Assumptions
The sequence of iterates is contained in a convex set over which the following hold:
- the objective function is bounded below
- the objective and constraint functions and their first and second derivatives are uniformly bounded in norm
- the constraint Jacobian has full row rank and its smallest singular value is bounded below by a positive constant
- the Hessian of the Lagrangian is positive definite with smallest eigenvalue bounded below by a positive constant
Sufficient reduction to sufficient decrease
A Taylor expansion of the merit function yields …
The accepted step satisfies …
Intermediate results
… is bounded below by a positive constant
… is bounded above
Sufficient decrease in merit function
Step in dual space (for sufficiently small … and …)
Therefore, we converge to an optimal primal solution, and …
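The limits themselves were lost with the slide graphics; the first-order convergence result being described, in our notation, is that the optimality and feasibility residuals vanish along the iterates:

```latex
\lim_{k \to \infty}
  \left\| \nabla f(x_k) + \nabla c(x_k)\,\lambda_k \right\| = 0,
\qquad
\lim_{k \to \infty} \| c(x_k) \| = 0
```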
Conclusion/Final remarks
Review:
- Defined a globally convergent inexact SQP algorithm
- Requires only inexact solutions of the KKT system
- Requires only matrix-vector products involving objective and constraint function derivatives
- Results also apply when only the reduced Hessian of the Lagrangian is assumed to be positive definite
Future challenges:
- Implementation and appropriate parameter values
- Nearly singular constraint Jacobian
- Inexact derivative information
- Negative curvature
- etc.