
1 Inexact SQP methods for equality constrained optimization Frank Edward Curtis Department of IE/MS, Northwestern University with Richard Byrd and Jorge Nocedal August 1, 2006 ISMP 2006

2 Outline
Introduction/Motivation
 Unconstrained optimization
 Nonlinear equations
 Constrained optimization
Algorithm Development
 Step computation
 Step acceptance
Global Analysis
 Merit function and sufficient decrease
 Satisfying first-order conditions
Conclusions/Final remarks

3 Outline
Introduction/Motivation
 Unconstrained optimization
 Nonlinear equations
 Constrained optimization
Algorithm Development
 Step computation
 Step acceptance
Global Analysis
 Merit function and sufficient decrease
 Satisfying first-order conditions
Conclusions/Final remarks

4 Unconstrained optimization
Goal: minimize a single nonlinear objective
Algorithm: Newton’s method (CG)

5 Unconstrained optimization
Goal: minimize a single nonlinear objective
Algorithm: Newton’s method (CG)
Note: choosing any intermediate step ensures global convergence to a local solution of the NLP (Steihaug, 1983)
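The equations on these two slides do not survive in the transcript. As a hedged reconstruction in standard notation, the setting is a truncated Newton-CG iteration:

```latex
\min_{x \in \mathbb{R}^n} f(x),
\qquad
\nabla^2 f(x_k)\, d_k = -\nabla f(x_k),
```

where CG is applied to the Newton system and may be stopped early; Steihaug's result is that stopping CG at any intermediate iterate (within a trust region) still yields global convergence.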

6 Nonlinear equations
Goal: solve a single nonlinear system
Algorithm: Newton’s method

7 Nonlinear equations
Goal: solve a single nonlinear system
Algorithm: Newton’s method
Note: choosing any step whose linear-model residual is a sufficiently small fraction of the current residual ensures global convergence (Dembo, Eisenstat, and Steihaug, 1982; Eisenstat and Walker, 1994)
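The residual condition this slide refers to is lost in the transcript; the standard inexact Newton condition of Dembo, Eisenstat, and Steihaug is:

```latex
\|F(x_k) + F'(x_k)\, d_k\| \;\le\; \eta_k \,\|F(x_k)\|,
\qquad \eta_k \le \eta < 1,
```

i.e., the Newton system may be solved inexactly as long as the relative residual stays bounded away from 1.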

8 Equality constrained optimization
Goal: solve the problem
Define: the Lagrangian
Define: the derivatives
Goal: solve the KKT conditions
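The definitions named on this slide are reconstructed here in standard notation (the transcript drops the equations). For the problem of minimizing f(x) subject to c(x) = 0:

```latex
\mathcal{L}(x,\lambda) = f(x) + \lambda^T c(x),
\qquad
g(x) = \nabla f(x), \quad A(x) = \nabla c(x)^T,
```

and the first-order (KKT) conditions are

```latex
\nabla_x \mathcal{L}(x,\lambda) = g(x) + A(x)^T \lambda = 0,
\qquad
c(x) = 0.
```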

9 Equality constrained optimization
Two “equivalent” step computation techniques: Newton’s method and the SQP subproblem
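The two "equivalent" techniques, reconstructed in standard notation since the slide's equations are lost: a Newton step on the KKT conditions solves the primal-dual system, which is also the optimality system of the SQP subproblem,

```latex
\begin{bmatrix} W_k & A_k^T \\ A_k & 0 \end{bmatrix}
\begin{bmatrix} d_k \\ \delta_k \end{bmatrix}
= -\begin{bmatrix} g_k + A_k^T \lambda_k \\ c_k \end{bmatrix}
\;\Longleftrightarrow\;
\min_d \; g_k^T d + \tfrac{1}{2} d^T W_k d
\ \ \text{s.t.}\ \ A_k d + c_k = 0,
```

where W_k denotes the Hessian of the Lagrangian at (x_k, λ_k).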

10 Equality constrained optimization
Two “equivalent” step computation techniques: Newton’s method and the SQP subproblem
Question: can we ensure convergence to a local solution by choosing any step into the ball?

11 Equality constrained optimization
Two “equivalent” step computation techniques: Newton’s method and the SQP subproblem
Question: can we ensure convergence with a step toward the constraints? with a step that reduces the objective? Preferably both, but… (Heinkenschloss and Vicente, 2001)

12 Equality constrained optimization
Two “equivalent” step computation techniques: Newton’s method and the SQP subproblem
… what if we can’t do both?

13 Equality constrained optimization
Two “equivalent” step computation techniques: Newton’s method and the SQP subproblem
The exact solution minimizes the objective subject to satisfying the constraints (but this can be expensive to find)
Question: what inexact solutions are acceptable?

14 Outline
Introduction/Motivation
 Unconstrained optimization
 Nonlinear equations
 Constrained optimization
Algorithm Development
 Step computation
 Step acceptance
Global Analysis
 Merit function and sufficient decrease
 Satisfying first-order conditions
Conclusions/Final remarks

15 Proposed algorithm
Step computation: inexact SQP step
Globalization strategy: exact merit function … with Armijo line search condition
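In standard notation (a reconstruction; the slide's formulas are lost), the exact penalty merit function and the Armijo condition are:

```latex
\phi_\pi(x) = f(x) + \pi \|c(x)\|,
\qquad
\phi_\pi(x_k + \alpha_k d_k) \;\le\; \phi_\pi(x_k) - \eta\, \alpha_k\, \Delta q_\pi(d_k),
```

where π > 0 is the penalty parameter, η ∈ (0,1), and Δq_π(d_k) is the reduction in a local model of the merit function.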

16 First attempt
Proposition: a sufficiently small residual suffices
Test: 61 problems from the CUTEr test set

Residual tolerance:  1e-8  1e-7  1e-6  1e-5  1e-4  1e-3  1e-2  1e-1
Success:             100%  100%  100%   97%   90%   85%   72%   38%
Failure:               0%    0%    0%    3%   10%   15%   28%   62%

17 First attempt… not robust
Proposition: a sufficiently small residual … is not enough for complete robustness
 We have multiple goals (feasibility and optimality)
 The Lagrange multipliers may be completely off

18 Quadratic/linear model of merit function
Create model
Quantify reduction obtained from step

19 Quadratic/linear model of merit function
Create model
Quantify reduction obtained from step
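A hedged reconstruction of the quadratic/linear model and its reduction, in the notation of the earlier slides:

```latex
q_\pi(d) = f_k + g_k^T d + \tfrac{1}{2} d^T W_k d + \pi \|c_k + A_k d\|,
```

```latex
\Delta q_\pi(d) = q_\pi(0) - q_\pi(d)
= -\,g_k^T d - \tfrac{1}{2} d^T W_k d
+ \pi \left( \|c_k\| - \|c_k + A_k d\| \right),
```

so the reduction splits into an objective part and a (penalized) constraint part.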

20 What are acceptable steps?
The model’s objective reduction and constraint reduction can each be positive or negative: 4 possibilities

21 What are acceptable steps?

22 Option #1: “constraint reduction”

23 Penalty parameter can be increased to ensure sufficiently large model reduction
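One common rule of this kind (a sketch, not necessarily the talk's exact formula): choose π at least large enough that the penalized constraint-reduction term dominates the objective terms,

```latex
\pi \;\ge\; \frac{g_k^T d_k + \tfrac{1}{2} d_k^T W_k d_k}
{(1 - \tau)\left( \|c_k\| - \|c_k + A_k d_k\| \right)},
\qquad \tau \in (0,1),
```

which guarantees Δq_π(d_k) ≥ τ π ( ‖c_k‖ − ‖c_k + A_k d_k‖ ) > 0 whenever the step reduces the linearized constraints.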

24 Option #2: “quadratic objective reduction”

25 Question: if model reduction is positive, is it large enough?

26 Split the reduction into two parts

27 Central idea: “sufficient reduction”
Sufficient reduction condition:
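The inequality itself is lost in the transcript; a representative form of such a condition, hedged as a sketch, is

```latex
\Delta q_\pi(d_k) \;\ge\; \sigma\, \pi\, \|c_k\|,
\qquad \sigma \in (0,1),
```

i.e., the model reduction must be at least proportional to the current constraint violation.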

28 Option #1: “constraint reduction”

29

30 Step is acceptable if:

31 Option #1: “constraint reduction” Step is acceptable if:

32 Option #2: “quadratic objective reduction”

33 Step is acceptable if:

34 Option #2: “quadratic objective reduction” Step is acceptable if:

35 Algorithm outline
for k = 0, 1, 2, …
 Iteratively solve the primal-dual (KKT) system
 Until one of the two step acceptance conditions holds
 Update penalty parameter
 Perform backtracking line search
 Update iterate
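The outline above can be sketched in code. This is my own illustrative toy, not the talk's implementation: the problem (minimize x1² + x2² subject to x1 + x2 = 1), the solver names, and the parameter values are all assumptions, and the inner KKT solve is a plain direct solve where the real algorithm would use a truncated iterative method with the residual tests of the earlier slides.

```python
def solve3(M, b):
    """Tiny Gaussian elimination with partial pivoting for a 3x3 system."""
    M = [row[:] for row in M]; b = b[:]
    n = 3
    for k in range(n):
        p = max(range(k, n), key=lambda i: abs(M[i][k]))
        M[k], M[p] = M[p], M[k]; b[k], b[p] = b[p], b[k]
        for i in range(k + 1, n):
            m = M[i][k] / M[k][k]
            for j in range(k, n):
                M[i][j] -= m * M[k][j]
            b[i] -= m * b[k]
    x = [0.0] * n
    for i in reversed(range(n)):
        x[i] = (b[i] - sum(M[i][j] * x[j] for j in range(i + 1, n))) / M[i][i]
    return x

def inexact_sqp(x, lam, pi=1.0, tol=1e-8, sigma=0.1, tau=1e-4):
    """SQP with an l1-type merit function on:
    minimize x1^2 + x2^2  subject to  x1 + x2 = 1 (hypothetical example)."""
    for _ in range(50):
        g = [2 * x[0], 2 * x[1]]               # gradient of f
        c = x[0] + x[1] - 1.0                  # constraint value
        grad_L = [g[0] + lam, g[1] + lam]      # Lagrangian gradient
        if max(abs(grad_L[0]), abs(grad_L[1]), abs(c)) < tol:
            break
        # Primal-dual system [W A^T; A 0][d; dlam] = -[grad_L; c], W = 2I, A = [1 1]
        K = [[2.0, 0.0, 1.0], [0.0, 2.0, 1.0], [1.0, 1.0, 0.0]]
        d0, d1, dlam = solve3(K, [-grad_L[0], -grad_L[1], -c])
        # Model reduction for the merit function phi(x) = f(x) + pi*|c(x)|
        dq = -(g[0]*d0 + g[1]*d1) - (d0*d0 + d1*d1) + pi * abs(c)
        if dq < sigma * pi * abs(c):           # reduction too small: raise penalty
            pi *= 10.0
            dq = -(g[0]*d0 + g[1]*d1) - (d0*d0 + d1*d1) + pi * abs(c)
        # Backtracking Armijo line search on the merit function
        phi = x[0]**2 + x[1]**2 + pi * abs(c)
        alpha = 1.0
        for _ in range(30):
            xn = [x[0] + alpha * d0, x[1] + alpha * d1]
            cn = xn[0] + xn[1] - 1.0
            if xn[0]**2 + xn[1]**2 + pi * abs(cn) <= phi - tau * alpha * dq:
                break
            alpha *= 0.5
        x = xn
        lam += alpha * dlam                    # update iterate (primal and dual)
    return x, lam

x, lam = inexact_sqp([2.0, -1.0], 0.0)
print(x, lam)  # x = (0.5, 0.5), lam = -1
```

Because the toy objective is quadratic and the constraint linear, one full Newton/SQP step lands on the solution; the penalty update and line search only become active on genuinely nonlinear problems.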

36 Termination test
Observe the KKT conditions

37 Outline
Introduction/Motivation
 Unconstrained optimization
 Nonlinear equations
 Constrained optimization
Algorithm Development
 Step computation
 Step acceptance
Global Analysis
 Merit function and sufficient decrease
 Satisfying first-order conditions
Conclusions/Final remarks

38 Assumptions
The sequence of iterates is contained in a convex set over which the following hold:
 the objective function is bounded below
 the objective and constraint functions and their first and second derivatives are uniformly bounded in norm
 the constraint Jacobian has full row rank, with smallest singular value bounded away from zero
 the Hessian of the Lagrangian is positive definite, with eigenvalues bounded above

39 Sufficient reduction to sufficient decrease
Taylor expansion of the merit function yields a bound on its change along the step
The accepted step satisfies the Armijo (sufficient decrease) condition
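The two displayed inequalities, reconstructed in the notation of the earlier slides (hedged): a Taylor expansion bounds the merit-function change by the model reduction,

```latex
\phi_\pi(x_k + \alpha d_k) - \phi_\pi(x_k)
\;\le\; -\,\alpha\, \Delta q_\pi(d_k) + O(\alpha^2 \|d_k\|^2),
```

so backtracking terminates with a steplength α_k satisfying the Armijo condition

```latex
\phi_\pi(x_k + \alpha_k d_k) - \phi_\pi(x_k)
\;\le\; -\,\eta\, \alpha_k\, \Delta q_\pi(d_k).
```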

40 Intermediate results
The model reduction is positive and bounded away from zero
The penalty parameter is bounded above

41 Sufficient decrease in merit function

42 Step in dual space
For sufficiently small primal and dual residuals, the multiplier step is well behaved; therefore, we converge to an optimal primal solution, and the Lagrange multiplier estimates converge as well

43 Outline
Introduction/Motivation
 Unconstrained optimization
 Nonlinear equations
 Constrained optimization
Algorithm Development
 Step computation
 Step acceptance
Global Analysis
 Merit function and sufficient decrease
 Satisfying first-order conditions
Conclusions/Final remarks

44 Conclusion/Final remarks
Review
 Defined a globally convergent inexact SQP algorithm
 Requires only inexact solutions of the KKT system
 Requires only matrix-vector products involving objective and constraint function derivatives
 Results also apply when only the reduced Hessian of the Lagrangian is assumed to be positive definite
Future challenges
 Implementation and appropriate parameter values
 Nearly singular constraint Jacobian
 Inexact derivative information
 Negative curvature
 etc., etc., etc.

