Inexact Methods for PDE-Constrained Optimization
Frank Edward Curtis, Northwestern University
Joint work with Richard Byrd and Jorge Nocedal
Emory University, February 12, 2007
Nonlinear Optimization “One” problem
Circuit Tuning (A. Wächter, C. Visweswariah, and A. R. Conn, 2005)
Building blocks: transistors (switches) and gates (logic units)
Improve aspects of the circuit (speed, area, power) by choosing transistor widths
Formulate an optimization problem
Strategic Bidding in Electricity Markets (Pereira, Granville, Dix, and Barroso, 2004)
Electricity production companies "bid" on how much they will charge for one unit of electricity
Independent operator collects bids and sets production schedule and "spot price" to minimize cost to consumers
Bilevel problem, equivalent to an MPCC: hard geometry!
Challenges for NLP algorithms
- Very large problems
- Numerical noise
- Availability of derivatives
- Degeneracies
- Difficult geometries
- Expensive function evaluations
- Real-time solutions needed
- Integer variables
- Negative curvature
Outline
- Problem Formulation: equality constrained optimization; sequential quadratic programming
- Inexact Framework: unconstrained optimization and nonlinear equations; stopping conditions for the linear solver
- Global Behavior: merit function and sufficient decrease; satisfying first-order conditions
- Numerical Results: model inverse problem; accuracy tradeoffs
- Final Remarks: future work; negative curvature
Equality constrained optimization e.g., minimize the difference between observed and expected behavior, subject to atmospheric flow equations (Navier-Stokes) Goal: solve the problem
Equality constrained optimization Define: the Lagrangian Define: the derivatives Goal: solve KKT conditions
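The formulas on this slide were images in the original deck and were lost in transcription; in the standard notation of this line of work (a reconstruction, not a verbatim copy of the slide), the problem, Lagrangian, derivatives, and first-order conditions read:

```latex
% Problem:
\min_{x}\; f(x) \quad \text{s.t.} \quad c(x) = 0
% Lagrangian and derivatives:
\mathcal{L}(x,\lambda) = f(x) + \lambda^{T} c(x), \qquad
g(x) = \nabla f(x), \qquad A(x) = \text{Jacobian of } c(x)
% KKT conditions:
g(x) + A(x)^{T}\lambda = 0, \qquad c(x) = 0
```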
Sequential Quadratic Programming (SQP)
Two "equivalent" step computation techniques: Newton's method applied to the KKT conditions, and the SQP subproblem.
For very large problems the KKT matrix cannot be formed or factored, so the linear system is solved with an iterative method, which introduces inexactness.
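The two step computations (reconstructed in standard SQP notation; the slide's own formulas were images) are Newton's method on the KKT system and a QP subproblem whose optimality conditions give the same step:

```latex
% Newton's method on the KKT conditions:
\begin{bmatrix} W_k & A_k^{T} \\ A_k & 0 \end{bmatrix}
\begin{bmatrix} d_k \\ \delta_k \end{bmatrix}
= - \begin{bmatrix} g_k + A_k^{T}\lambda_k \\ c_k \end{bmatrix}
% The SQP subproblem with the same solution:
\min_{d}\; g_k^{T} d + \tfrac{1}{2} d^{T} W_k d
\quad \text{s.t.} \quad c_k + A_k d = 0
```

Here W_k denotes the Hessian of the Lagrangian at the current iterate.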
Unconstrained optimization
Goal: minimize a nonlinear objective
Algorithm: Newton's method with conjugate gradients (Newton-CG)
Note: choosing any intermediate CG step ensures global convergence to a local solution of the NLP (Steihaug, 1983)
Nonlinear equations
Goal: solve a nonlinear system
Algorithm: inexact Newton's method; any step with a sufficiently small relative linear residual ensures descent (Dembo, Eisenstat, and Steihaug, 1982; Eisenstat and Walker, 1994)
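The residual condition itself was an image on this slide; in standard notation it is ||F(x) + F'(x) d|| <= eta * ||F(x)||. A minimal Python sketch (illustrative only, scalar case; the deliberately perturbed step imitates the residual a truncated iterative solver would leave):

```python
def inexact_newton(F, Fp, x, eta=0.5, tol=1e-12, max_iter=100):
    """Solve F(x) = 0, accepting any step d whose linear residual
    satisfies |F(x) + F'(x) d| <= eta * |F(x)|
    (Dembo, Eisenstat, and Steihaug, 1982)."""
    for _ in range(max_iter):
        Fx = F(x)
        if abs(Fx) <= tol:
            break
        d = -Fx / Fp(x)       # exact Newton step
        d *= 1.0 + 0.4 * eta  # perturb it: residual becomes 0.4*eta*|Fx|
        # the inexact-Newton acceptance test still holds:
        assert abs(Fx + Fp(x) * d) <= eta * abs(Fx) + 1e-15
        x += d
    return x
```

Even with the step overshooting by 20% on every iteration, the residual test guarantees progress toward the root; in the talk's setting the "perturbation" is whatever residual the Krylov solver has when it is stopped early.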
Line Search SQP Framework Define “exact” penalty function
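The penalty function was displayed as an image; a standard exact penalty of the kind referred to (up to the choice of norm) is:

```latex
\phi(x;\pi) = f(x) + \pi\,\|c(x)\|
```

with penalty parameter pi > 0; for pi sufficiently large, minimizers of the constrained problem are minimizers of phi.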
Algorithm Outline (exact steps)
for k = 0, 1, 2, …
- Compute step by…
- Set penalty parameter to ensure descent on…
- Perform backtracking line search to satisfy…
- Update iterate
Exact Case Exact step minimizes the objective on the linearized constraints … which may lead to an increase in the model objective
Quadratic/linear model of merit function Create model Quantify reduction obtained from step
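In the notation above (a reconstruction; the slide's formulas were images), a quadratic model of the objective plus a linear model of the constraints gives the merit model and its reduction:

```latex
% Quadratic/linear model of the merit function at x_k:
m_k(d;\pi) = f_k + g_k^{T} d + \tfrac{1}{2} d^{T} W_k d + \pi\,\|c_k + A_k d\|
% Reduction obtained from step d:
\Delta m_k(d;\pi) = m_k(0;\pi) - m_k(d;\pi)
= -g_k^{T} d - \tfrac{1}{2} d^{T} W_k d + \pi\left(\|c_k\| - \|c_k + A_k d\|\right)
```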
Exact Case Exact step minimizes the objective on the linearized constraints … which may lead to an increase in the model objective … but this is ok since we can account for this conflict by increasing the penalty parameter
First attempt
Proposition: require a sufficiently small residual
Test: 61 problems from the CUTEr test set

  Residual tol.  1e-8  1e-7  1e-6  1e-5  1e-4  1e-3  1e-2  1e-1
  Success        100%  100%  100%  97%   90%   85%   72%   38%
  Failure        0%    0%    0%    3%    10%   15%   28%   62%
First attempt… not robust
Proposition: a sufficiently small residual is not enough for complete robustness:
- We have multiple goals (feasibility and optimality)
- Lagrange multipliers may be completely off
- … we may not have descent!
Second attempt
Step computation: inexact SQP step
Recall the line search condition. We can show… but how negative should this be?
Algorithm Outline (exact steps)
for k = 0, 1, 2, …
- Compute step
- Set penalty parameter to ensure descent
- Perform backtracking line search
- Update iterate
Algorithm Outline (inexact steps)
for k = 0, 1, 2, …
- Compute step and set penalty parameter to ensure descent and a stable algorithm
- Perform backtracking line search
- Update iterate
Inexact Case
Inexact Case
Step is acceptable if … for …
Algorithm Outline
for k = 0, 1, 2, …
- Iteratively solve… until… or…
- Update penalty parameter
- Perform backtracking line search
- Update iterate
Termination Test
Observe KKT conditions
Assumptions
The sequence of iterates is contained in a convex set and the following conditions hold:
- the objective and constraint functions and their first and second derivatives are bounded
- the multiplier estimates are bounded
- the constraint Jacobians have full row rank and their smallest singular values are bounded below by a positive constant
- the Hessian of the Lagrangian is positive definite with smallest eigenvalue bounded below by a positive constant
Sufficient Reduction to Sufficient Decrease Taylor expansion of merit function yields Accepted step satisfies
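The accepted-step condition can be sketched as a backtracking line search on the merit function (a minimal illustration, not the authors' code; `phi` is the merit function and `delta_m` the model reduction used in the sufficient-decrease test):

```python
def backtrack(phi, x, d, delta_m, sigma=1e-4, rho=0.5, max_backtracks=30):
    """Find alpha with phi(x + alpha*d) <= phi(x) - sigma*alpha*delta_m
    by halving alpha (Armijo-style sufficient decrease)."""
    alpha = 1.0
    phi0 = phi(x)
    trial = x
    for _ in range(max_backtracks):
        trial = [xi + alpha * di for xi, di in zip(x, d)]
        if phi(trial) <= phi0 - sigma * alpha * delta_m:
            return alpha, trial  # sufficient decrease achieved
        alpha *= rho             # otherwise shrink the step
    return alpha, trial
```

With sigma small, the full step alpha = 1 is accepted whenever the actual merit decrease is even a small fraction of the model reduction, which is what the Taylor-expansion argument on this slide guarantees near a solution.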
Intermediate Results
… is bounded below by a positive constant
… is bounded above
Sufficient Decrease in Merit Function
Step in Dual Space
(for sufficiently small … and …)
Therefore, we converge to an optimal primal solution, and …
Problem Formulation (Curtis and Haber, 2007)
Tikhonov-style regularized inverse problem
Want to solve for a reasonably large mesh size and for a small regularization parameter
SymQMR for the linear system solves
Input parameters: … or … Recall: …
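A generic form of such a Tikhonov-regularized, PDE-constrained inverse problem (an illustrative sketch, not necessarily the exact model of Curtis and Haber) is:

```latex
\min_{u,\,m}\; \tfrac{1}{2}\,\|Q u - d^{\mathrm{obs}}\|^{2}
            + \tfrac{\beta}{2}\,\|L m\|^{2}
\quad \text{s.t.} \quad A(m)\,u = q
```

where u is the state, m the model parameters, Q the observation operator, A(m)u = q the discretized PDE, and beta the regularization parameter.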
Numerical Results (Curtis and Haber, 2007): n = 1024, m = 512, 1e-6

         Iters.  Time    Total LS Iters.  Avg. LS Iters.  Avg. Rel. Res.
  0.5    29      29.5s   1452             50.1            3.12e-1
  0.1    12      11.37s  654              54.5            6.90e-2
  0.01   9       11.60s  681              75.7            6.27e-3
Numerical Results (Curtis and Haber, 2007): n = 1024, m = 512, 1e-1

         Iters.  Time    Total LS Iters.  Avg. LS Iters.  Avg. Rel. Res.
  1e-6   12      11.40s  654              54.5            6.90e-2
  1e-7   11      14.52s  840              76.4            6.99e-2
  1e-8   8       10.57s  639              79.9            6.15e-2
  1e-9   11      18.52s  1139             104             8.65e-2
  1e-10  19      44.41s  2708             143             8.90e-2
Numerical Results (Curtis and Haber, 2007): n = 8192, m = 4096, 1e-1

         Iters.  Time     Total LS Iters.  Avg. LS Iters.  Avg. Rel. Res.
  1e-6   15      264.47s  1992             133             8.13e-2
  1e-7   11      236.51s  1776             161             6.89e-2
  1e-8   9       204.51s  1567             174             6.77e-2
  1e-9   11      347.66s  2681             244             8.29e-2
  1e-10  16      805.14s  6249             391             8.93e-2
Numerical Results (Curtis and Haber, 2007): n = 65536, m = 32768, 1e-1

         Iters.  Time     Total LS Iters.  Avg. LS Iters.  Avg. Rel. Res.
  1e-6   15      5055.9s  4365             291             8.46e-2
  1e-7   10      4202.6s  3630             363             8.87e-2
  1e-8   12      5686.2s  4825             402             7.96e-2
  1e-9   12      6678.7s  5633             469             8.77e-2
  1e-10  14      14783s   12525            895             8.63e-2
Review and Future Challenges
Review:
- Defined a globally convergent inexact SQP algorithm
- Requires only inexact solutions of the primal-dual system
- Requires only matrix-vector products involving objective and constraint function derivatives
- Results also apply when only the reduced Hessian of the Lagrangian is assumed to be positive definite
- Numerical experience on a model problem is promising
Future challenges:
- (Nearly) singular constraint Jacobians
- Inexact derivative information
- Negative curvature
- etc., etc., etc.…
Negative Curvature
Big question: What is the best way to handle negative curvature (i.e., when the reduced Hessian may be indefinite)?
Small question: What is the best way to handle negative curvature in the context of our inexact SQP algorithm? We have no inertia information!
Smaller question: When can we handle negative curvature in the context of our inexact SQP algorithm with NO algorithmic modifications? When do we know that a given step is OK?
Our analysis of the inexact case leads to a few observations…
Why Quadratic Models?
Provides a good… direction? Yes. Step length? Yes.
Provides a good… direction? Maybe. Step length? Maybe.
Why Quadratic Models? One can use our stopping criteria as a mechanism for determining which directions are good; all that remains to be determined is whether the step lengths are acceptable.
Unconstrained Optimization
Direct method (step quality): the angle test
Indirect method (step quality and step length): check the conditions … or …
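Reconstructed in standard notation (the slide's formulas were images): the direct test bounds the angle between the step and the negative gradient, while the indirect tests bound the linear-system residual:

```latex
% Angle test (step quality):
-\nabla f_k^{T} d_k \ge \theta\,\|\nabla f_k\|\,\|d_k\|, \qquad \theta \in (0,1)
% Residual condition (inexact Newton / truncated CG):
\|\nabla f_k + H_k d_k\| \le \eta_k\,\|\nabla f_k\|, \qquad \eta_k \in [0,1)
```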
Constrained Optimization
Step quality determined by …
Step length determined by … or …
Thanks!
Actual Stopping Criteria
Stopping conditions: model reduction condition … or …
Constraint Feasible Case
If feasible, conditions reduce to …
Constraint Feasible Case
If feasible, conditions reduce to …
Some region around the exact solution
Constraint Feasible Case
If feasible, conditions reduce to …
Ellipse distorted toward the linearized constraints