CS5321 Numerical Optimization


1 CS5321 Numerical Optimization
18 Sequential Quadratic Programming (Active Set methods) 12/6/2018

2 Local SQP model
The problem min_x f(x) subject to c(x) = 0 can be modeled as a quadratic program at x = x_k:

  min_p  m_k(p) = f_k + ∇f_k^T p + (1/2) p^T ∇²_xx L_k p
  s.t.   A_k p + c_k = 0

Assumptions: A(x), the constraint Jacobian, has full row rank; ∇²_xx L(x, λ) is positive definite on the tangent space of the constraints, that is, d^T ∇²_xx L(x, λ) d > 0 for all d ≠ 0 with A d = 0.
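This equality-constrained QP can be solved directly from its KKT system. A minimal sketch (the function name and the dense `numpy.linalg.solve` approach are illustrative choices, not from the slides):

```python
import numpy as np

def solve_eq_qp(g, H, A, c):
    """Solve  min_p g^T p + 0.5 p^T H p  s.t.  A p + c = 0
    by forming the KKT system  [H A^T; A 0][p; lam] = [-g; -c]."""
    n, m = H.shape[0], A.shape[0]
    K = np.block([[H, A.T], [A, np.zeros((m, m))]])
    rhs = np.concatenate([-g, -c])
    sol = np.linalg.solve(K, rhs)
    return sol[:n], sol[n:]   # step p and multipliers lam
```

For example, with H = I, g = 0, and the single constraint p_1 + p_2 = 2, the step is p = (1, 1) with multiplier λ = −1.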

3 Inequality constraints
For the problem

  min_x  f(x)   s.t.  c_i(x) = 0, i ∈ E;  c_i(x) ≥ 0, i ∈ I

the local quadratic model is

  min_p  m_k(p) = f_k + ∇f_k^T p + (1/2) p^T ∇²_xx L_k p
  s.t.   ∇c_i(x_k)^T p + c_i(x_k) = 0, i ∈ E
         ∇c_i(x_k)^T p + c_i(x_k) ≥ 0, i ∈ I

4 Theorem of active set methods
Theorem 18.1 (Robinson, 1974). If x* is a local solution of the original problem with some λ*, and the pair (x*, λ*) satisfies the KKT conditions, the LICQ condition, and the second-order sufficient conditions, then for (x_k, λ_k) sufficiently close to (x*, λ*), the local quadratic model has a solution whose active set is the same as the active set of the original problem at x*.

5 Sequential QP method
1. Choose an initial guess x_0, λ_0.
2. For k = 1, 2, 3, …
  (a) Evaluate f_k, ∇f_k, ∇²_xx L_k, c_k, and ∇c_k (= A_k).
  (b) Solve the local quadratic program for p_k and λ_{k+1}.
  (c) Set x_{k+1} = x_k + p_k.
How to choose the active set? How to solve 2(b)? Haven't we solved that in chap 16? (Yes and no.)
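The loop above can be sketched in a few lines for the equality-constrained case, taking full steps and no globalization (function names and the dense KKT solve are illustrative assumptions, not from the slides):

```python
import numpy as np

def sqp_equality(f_grad, lagr_hess, c_fun, c_jac, x0, lam0, iters=20):
    """Basic SQP iteration: evaluate data, solve the local QP via its
    KKT system, take the full step x_{k+1} = x_k + p_k."""
    x, lam = x0.copy(), lam0.copy()
    for _ in range(iters):
        g, H = f_grad(x), lagr_hess(x, lam)      # step 2(a)
        c, A = c_fun(x), c_jac(x)
        n, m = len(x), len(c)
        K = np.block([[H, A.T], [A, np.zeros((m, m))]])
        sol = np.linalg.solve(K, np.concatenate([-g, -c]))
        p, lam = sol[:n], sol[n:]                # step 2(b)
        x = x + p                                # step 2(c)
    return x, lam
```

On min x_1 + x_2 subject to x_1² + x_2² = 2, started near the solution, this converges to x* = (−1, −1) with λ* = 1/2.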

6 Algorithms
Two types of algorithms for choosing the active set:
Inequality-constrained QP (IQP): solve QPs with inequality constraints and take the local active set as the optimal one.
Equality-constrained QP (EQP): select constraints as the active set and solve equality-constrained QPs.
Basic algorithms to solve 2(b):
Line search methods
Trust region methods
Nonlinear gradient projection

7 Solving SQP
All techniques in chap 16 can be applied.
But there are additional problems that need to be solved:
Linearized constraints may not be consistent.
Convergence guarantees (the Hessian may not be positive definite).
And some useful properties can be exploited:
The Hessian can be updated by a quasi-Newton method.
Solutions can be used as an initial guess (warm start).
The exact solution is not required.

8 Inconsistent linearizations
Consider the constraints x ≤ 1 and x² ≥ 4. Linearizing at x_k = 1 gives p ≤ 0 and 1 + 2p ≥ 4 (i.e., p ≥ 3/2), which cannot both hold. The linearized constraints cannot be enforced, since they may not be exact or consistent. Use a penalty function:

  min_{x,v,w,t}  f(x) + μ Σ_{i∈E} (v_i + w_i) + μ Σ_{i∈I} t_i
  subject to  c_i(x) = v_i − w_i, i ∈ E
              c_i(x) ≥ −t_i, i ∈ I
              v, w, t ≥ 0
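The inconsistency in the slide's one-variable example (constraints x ≤ 1 and x² ≥ 4, linearized at x_k = 1) can be checked numerically; variable names here are illustrative:

```python
# Constraints: x <= 1 and x**2 >= 4, linearized at x_k = 1.
x_k = 1.0
p_upper = 1.0 - x_k                      # x_k + p <= 1        ->  p <= 0
p_lower = (4.0 - x_k**2) / (2.0 * x_k)   # x_k^2 + 2 x_k p >= 4 ->  p >= 3/2
consistent = p_lower <= p_upper          # no p satisfies both bounds
```

The feasible interval [p_lower, p_upper] = [1.5, 0] is empty, which is exactly why the penalized reformulation is needed.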

9 Quasi-Newton approximations
Let s_k = x_{k+1} − x_k and y_k = ∇_x L(x_{k+1}, λ_{k+1}) − ∇_x L(x_k, λ_{k+1}).
Recall the BFGS update of the Hessian (chap 6):

  B_{k+1} = B_k − (B_k s_k s_k^T B_k)/(s_k^T B_k s_k) + (y_k y_k^T)/(s_k^T y_k)

If the updated Hessian B_{k+1} is not positive definite, the condition s_k^T y_k > 0 fails. Define, for θ_k ∈ (0, 1),

  r_k = θ_k y_k + (1 − θ_k) B_k s_k

and use r_k in place of y_k:

  B_{k+1} = B_k − (B_k s_k s_k^T B_k)/(s_k^T B_k s_k) + (r_k r_k^T)/(s_k^T r_k)
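The damped update above can be sketched as follows; the usual choice of θ_k keeps s^T r ≥ η s^T B s with η = 0.2 (the function name and the η default are illustrative):

```python
import numpy as np

def damped_bfgs_update(B, s, y, eta=0.2):
    """Damped BFGS: replace y by r = theta*y + (1-theta)*B s so that
    s^T r stays sufficiently positive, keeping B positive definite."""
    Bs = B @ s
    sBs = s @ Bs
    sy = s @ y
    if sy >= eta * sBs:
        theta = 1.0                               # ordinary BFGS is safe
    else:
        theta = (1.0 - eta) * sBs / (sBs - sy)    # forces s^T r = eta * s^T B s
    r = theta * y + (1.0 - theta) * Bs
    return B - np.outer(Bs, Bs) / sBs + np.outer(r, r) / (s @ r)
```

Even with s^T y < 0 (where plain BFGS would lose definiteness), the damped update stays positive definite.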

10 BFGS for reduced-Hessian
Let A_k^T = [Q_1 Q_2][R; 0] (a QR factorization) and write p = Q_1 p_1 + Q_2 p_2.
Need to solve R^T p_1 = −c_k.
Solve for λ first to obtain an active set.
Ignore the cross term (Q_2^T G Q_1) p_1 and solve

  (Q_2^T G Q_2) p_2 = −Q_2^T g

The reduced secant formula is

  (Q_2^T G Q_2)_{k+1} Q_2^T s_k = Q_2^T [∇_x L(x_{k+1}, λ) − ∇_x L(x_k, λ)]

Use BFGS on this equation.

11 1. Line search SQP method
Set the step length α_k such that the merit function

  φ_1(x; μ) = f(x) + μ ‖c(x)‖_1

is sufficiently decreased, using one of the Wolfe conditions (chap 3):

  φ_1(x_k + α p_k; μ) ≤ φ_1(x_k; μ) + η α D(φ_1(x_k; μ); p_k)

where D(φ_1; p_k) is the directional derivative of φ_1 in the direction p_k. By Theorem 18.2,

  D(φ_1(x_k; μ); p_k) = ∇f_k^T p_k − μ ‖c_k‖_1

Let α_k = 1 and decrease it until the condition is satisfied.
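A backtracking search on the ℓ1 merit function can be sketched as below (the function name and the defaults η = 1e−4, shrink factor 0.5 are illustrative assumptions):

```python
import numpy as np

def merit_line_search(f, c, x, p, gf, mu, eta=1e-4, rho=0.5):
    """Backtracking on phi1(x; mu) = f(x) + mu * ||c(x)||_1, using the
    directional-derivative formula D(phi1; p) = gf^T p - mu*||c(x)||_1."""
    phi = lambda z: f(z) + mu * np.sum(np.abs(c(z)))
    D = gf @ p - mu * np.sum(np.abs(c(x)))   # descent measure along p
    alpha = 1.0                              # start from the full SQP step
    while phi(x + alpha * p) > phi(x) + eta * alpha * D:
        alpha *= rho                         # shrink until sufficient decrease
    return alpha
```

A step that overshoots the constraint gets cut back, while a good full step is accepted with α = 1.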

12 2. Trust region SQP method
Problem modification:
Relax the constraints (r_k = 0 in the original algorithm).
Add the trust region radius as a constraint:

  min_p  m_k(p) = f_k + ∇f_k^T p + (1/2) p^T ∇²_xx L_k p
  s.t.   A_k p + c_k = r_k,  ‖p‖ ≤ Δ_k

There are smart ways to choose r_k. For example, set r_k = A_k v_k + c_k, where v_k solves

  min_v  ‖A_k v + c_k‖²  s.t.  ‖v‖ ≤ 0.8 Δ_k
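One way to compute such an r_k is sketched below. Note this replaces the exact solution of the constrained least-squares subproblem with a simple heuristic (take the least-norm least-squares v and scale it back into the ball), and the function name is illustrative:

```python
import numpy as np

def relaxation_vector(A, c, delta):
    """Approximate  min_v ||A v + c||  s.t. ||v|| <= 0.8*delta:
    least-norm least-squares solution, scaled back into the ball,
    then r = A v + c."""
    v, *_ = np.linalg.lstsq(A, -c, rcond=None)  # least-norm solution of A v = -c
    nv = np.linalg.norm(v)
    if nv > 0.8 * delta:
        v *= 0.8 * delta / nv                   # pull v onto the trust boundary
    r = A @ v + c
    return r, v
```

With a large radius the linearized constraints are satisfiable exactly (r = 0); with a small radius v is truncated and r absorbs the leftover infeasibility.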

13 3. Nonlinear gradient projection
Let B_k be an s.p.d. approximation to ∇²f(x_k). Solve

  min_x  q_k(x) = f_k + ∇f_k^T (x − x_k) + (1/2)(x − x_k)^T B_k (x − x_k)
  s.t.   l ≤ x ≤ u

The step direction is p_k = x̄ − x_k, where x̄ solves the subproblem.
Combined with a line search: x_{k+1} = x_k + α_k p_k, choosing α_k s.t. f(x_{k+1}) ≤ f(x_k) + η α_k ∇f_k^T p_k.
Combined with the trust region bounds ‖p_k‖_∞ ≤ Δ_k:

  min_x  q_k(x) = f_k + ∇f_k^T (x − x_k) + (1/2)(x − x_k)^T B_k (x − x_k)
  s.t.   max(l, x_k − Δ_k) ≤ x ≤ min(u, x_k + Δ_k)
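The essential operation is moving along a descent direction and projecting back onto the box. A minimal sketch that uses a plain gradient step in place of the B_k model step (function name and fixed step size are illustrative simplifications):

```python
import numpy as np

def projected_gradient_step(grad_f, x, l, u, alpha):
    """One gradient-projection step on the bounds l <= x <= u:
    move along -grad f(x), then project onto the box coordinatewise."""
    x_new = x - alpha * grad_f(x)
    return np.clip(x_new, l, u)   # projection onto [l, u] per coordinate
```

Coordinates that would leave the box are clipped to the nearest bound; the rest take the unconstrained step.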

