CS5321 Numerical Optimization
14 Linear Programming: The Interior Point Method

Interior point methods for LP

There are many variations of IPM for LP:
- Primal IPM, dual IPM, primal-dual IPM
- Potential reduction methods
- Path-following methods
  - Short-step methods
  - Long-step methods
  - Predictor-corrector methods

We will cover the path-following method and the long-step method.

The dual problem of LP

Primal:  min_x  z = c^T x   subject to  Ax = b,  x ≥ 0
The dual problem is (see chap 12)
Dual:    max_λ  b^T λ       subject to  A^T λ + s = c,  s ≥ 0

Weak duality: for any x and (λ, s) that are feasible in the primal and the dual problems, b^T λ ≤ c^T x.
Strong duality: if x* and (λ*, s*) are solutions to the primal and the dual problems, b^T λ* = c^T x*.
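As a quick illustration of weak and strong duality (not part of the original slides), here is a small NumPy sketch on a made-up 2-variable LP; the data A, b, c and the chosen feasible points are assumptions for illustration only.

```python
# Illustrative data: min c^T x s.t. Ax = b, x >= 0 with A = [1 1], b = 1, c = (1, 2).
import numpy as np

A = np.array([[1.0, 1.0]])
b = np.array([1.0])
c = np.array([1.0, 2.0])

# A primal-feasible point (Ax = b, x >= 0) and a dual-feasible point
# (A^T lam + s = c, s >= 0); neither is optimal.
x = np.array([0.5, 0.5])              # primal objective c^T x = 1.5
lam = np.array([0.5])
s = c - A.T @ lam                     # s = [0.5, 1.5] >= 0, dual objective b^T lam = 0.5

assert np.allclose(A @ x, b) and (x >= 0).all() and (s >= 0).all()
print(b @ lam, "<=", c @ x)           # weak duality: 0.5 <= 1.5

# At the optima x* = (1, 0), lam* = 1, s* = (0, 1) the gap closes (strong duality)
# and x*_i s*_i = 0 for every i (complementarity).
x_opt, lam_opt = np.array([1.0, 0.0]), np.array([1.0])
s_opt = c - A.T @ lam_opt
print(b @ lam_opt, "==", c @ x_opt, "and x* s* =", x_opt * s_opt)
```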

Optimality conditions of LP

min_x  z = c^T x   subject to  Ax = b,  x ≥ 0
The Lagrangian function: L(x, λ, s) = c^T x − λ^T(Ax − b) − s^T x
The optimality conditions (KKT, see chap 12):
  A^T λ + s = c
  Ax = b
  (x, s) ≥ 0
  x_i s_i = 0,  i = 1, 2, …, n
The last one is the complementarity condition.

Primal-dual method

The optimality conditions can be expressed as
  F(x, λ, s) = [ A^T λ + s − c ]
               [ Ax − b        ] = 0,   (x, s) ≥ 0,
               [ XSe           ]
where X = diag(x), S = diag(s) and e = (1, 1, …, 1)^T.
F is a nonlinear equation F: R^(2n+m) → R^(2n+m).
It is solved by Newton's method (J is the Jacobian of F):
  J(x, λ, s) (Δx, Δλ, Δs) = −F(x, λ, s),
  (x, λ, s)^+ = (x, λ, s) + (Δx, Δλ, Δs).
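A minimal sketch of the residual F(x, λ, s), assuming NumPy; the example data reuse the illustrative LP from the earlier duality sketch and a strictly interior (x, s) > 0.

```python
import numpy as np

def kkt_residual(A, b, c, x, lam, s):
    """F(x, lam, s) = [A^T lam + s - c;  A x - b;  X S e] stacked into one vector."""
    return np.concatenate([
        A.T @ lam + s - c,   # dual feasibility residual
        A @ x - b,           # primal feasibility residual
        x * s,               # complementarity residual XSe (elementwise product)
    ])

# Illustrative LP and a strictly interior point (x, s > 0):
A = np.array([[1.0, 1.0]]); b = np.array([1.0]); c = np.array([1.0, 2.0])
x = np.array([0.6, 0.6]); lam = np.array([0.0]); s = np.array([1.0, 1.0])
print(kkt_residual(A, b, c, x, lam, s))   # zero only at a primal-dual solution
```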

Jacobian of F

Let r_c = A^T λ + s − c,  r_b = Ax − b,  r_XS = XSe, so that F(x, λ, s) = (r_c, r_b, r_XS).
The Jacobian J(x, λ, s) of F(x, λ, s) is
  J = [ 0   A^T  I ]
      [ A   0    0 ]
      [ S   0    X ]
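Continuing the sketch, one full Newton step can be formed by assembling J with np.block and solving J (Δx, Δλ, Δs) = −F; the data are again the illustrative LP, not from the slides.

```python
import numpy as np

def newton_step(A, b, c, x, lam, s):
    """Solve J(x, lam, s) (dx, dlam, ds) = -F(x, lam, s) with the full (2n+m) x (2n+m) Jacobian."""
    m, n = A.shape
    F = np.concatenate([A.T @ lam + s - c, A @ x - b, x * s])
    J = np.block([
        [np.zeros((n, n)), A.T,              np.eye(n)       ],   # derivative of r_c
        [A,                np.zeros((m, m)), np.zeros((m, n))],   # derivative of r_b
        [np.diag(s),       np.zeros((n, m)), np.diag(x)      ],   # derivative of r_XS
    ])
    d = np.linalg.solve(J, -F)
    return d[:n], d[n:n + m], d[n + m:]

A = np.array([[1.0, 1.0]]); b = np.array([1.0]); c = np.array([1.0, 2.0])
x = np.array([0.6, 0.6]); lam = np.array([0.0]); s = np.array([1.0, 1.0])
dx, dlam, ds = newton_step(A, b, c, x, lam, s)
print(dx, dlam, ds)
```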

Solving the linear systems

The Newton direction is obtained by solving
  [ 0   A^T  I ] [Δx]   [ −r_c  ]
  [ A   0    0 ] [Δλ] = [ −r_b  ]
  [ S   0    X ] [Δs]   [ −r_XS ]
Δs can be eliminated: Δs = −X^(−1)(r_XS + S Δx).
Let D = S^(−1/2) X^(1/2). The remaining system is
  [ −D^(−2)  A^T ] [Δx]   [ −r_c + X^(−1) r_XS ]
  [  A        0  ] [Δλ] = [ −r_b               ]
The left hand side is symmetric, but indefinite.
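A sketch of this reduction, assuming NumPy: eliminate Δs, solve the smaller symmetric indefinite system for (Δx, Δλ), and recover Δs; the example data are the same illustrative LP.

```python
import numpy as np

def newton_step_augmented(A, b, c, x, lam, s):
    """Same step as the full Jacobian, computed via the symmetric indefinite augmented system."""
    m, n = A.shape
    r_c, r_b, r_xs = A.T @ lam + s - c, A @ x - b, x * s
    Dinv2 = s / x                                 # D^-2 = X^-1 S, stored as a diagonal vector
    K = np.block([[-np.diag(Dinv2), A.T],
                  [A, np.zeros((m, m))]])         # symmetric, indefinite
    rhs = np.concatenate([-r_c + r_xs / x, -r_b])
    sol = np.linalg.solve(K, rhs)
    dx, dlam = sol[:n], sol[n:]
    ds = -(r_xs + s * dx) / x                     # recover ds = -X^-1 (r_XS + S dx)
    return dx, dlam, ds

A = np.array([[1.0, 1.0]]); b = np.array([1.0]); c = np.array([1.0, 2.0])
x = np.array([0.6, 0.6]); lam = np.array([0.0]); s = np.array([1.0, 1.0])
print(newton_step_augmented(A, b, c, x, lam, s))  # matches the full-system Newton step
```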

Centering

In practice, r_XS = XSe − σμe is used rather than r_XS = XSe.
Duality measure:  μ = (1/n) Σ_i x_i s_i = x^T s / n
Centering parameter σ ∈ [0, 1]: decreasing as the iterates approach a solution.
This is related to the barrier method (chap 15).
(Figure: the centered direction versus the direction without centering.)
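A small helper sketch (NumPy assumed) for the duality measure μ and the centered complementarity residual; the value σ = 0.5 below is only an example.

```python
import numpy as np

def duality_measure(x, s):
    """mu = x^T s / n."""
    return x @ s / x.size

def centered_r_xs(x, s, sigma):
    """r_XS = XSe - sigma*mu*e; sigma = 0 gives the pure Newton (affine) step, sigma = 1 pure centering."""
    return x * s - sigma * duality_measure(x, s) * np.ones_like(x)

x = np.array([0.6, 0.6]); s = np.array([1.0, 1.0])
print(duality_measure(x, s))            # 0.6
print(centered_r_xs(x, s, sigma=0.5))   # [0.3, 0.3]
```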

The path following algorithm

Given (x_0, λ_0, s_0) with (x_0, s_0) > 0
For k = 0, 1, 2, … until (x_k)^T s_k < ε
  - Choose σ_k ∈ [0, 1] and let μ_k = (x_k)^T s_k / n.
  - Solve for the Newton direction (Δx_k, Δλ_k, Δs_k).
  - Set (x_(k+1), λ_(k+1), s_(k+1)) = (x_k, λ_k, s_k) + α_k(Δx_k, Δλ_k, Δs_k),
    where α_k is chosen to make (x_(k+1), s_(k+1)) > 0.
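Putting the pieces together, here is an illustrative NumPy sketch of the loop above, not production code: the centering parameter σ, the tolerance ε, and the 0.95 fraction-to-the-boundary factor used to keep (x, s) > 0 are all assumed values.

```python
import numpy as np

def path_following_lp(A, b, c, x, lam, s, sigma=0.2, eps=1e-8, max_iter=100):
    m, n = A.shape
    for _ in range(max_iter):
        if x @ s < eps:                       # stop when x^T s is small enough
            break
        mu = x @ s / n
        # Newton system for F with the centered complementarity residual XSe - sigma*mu*e
        F = np.concatenate([A.T @ lam + s - c,
                            A @ x - b,
                            x * s - sigma * mu])
        J = np.block([
            [np.zeros((n, n)), A.T,              np.eye(n)       ],
            [A,                np.zeros((m, m)), np.zeros((m, n))],
            [np.diag(s),       np.zeros((n, m)), np.diag(x)      ],
        ])
        d = np.linalg.solve(J, -F)
        dx, dlam, ds = d[:n], d[n:n + m], d[n + m:]
        # Largest alpha <= 1 keeping x + alpha*dx > 0 and s + alpha*ds > 0 (0.95 safety factor)
        alpha = 1.0
        for v, dv in ((x, dx), (s, ds)):
            neg = dv < 0
            if neg.any():
                alpha = min(alpha, 0.95 * np.min(-v[neg] / dv[neg]))
        x, lam, s = x + alpha * dx, lam + alpha * dlam, s + alpha * ds
    return x, lam, s

A = np.array([[1.0, 1.0]]); b = np.array([1.0]); c = np.array([1.0, 2.0])
x, lam, s = path_following_lp(A, b, c, np.array([0.5, 0.5]), np.array([0.0]), np.array([1.0, 1.0]))
print(x, lam, s)   # should approach x* = (1, 0), lam* = 1, s* = (0, 1)
```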

Central path and neighborhood

Feasible set: F0 = {(x, λ, s) | Ax = b, A^T λ + s = c, (x, s) > 0}
A point (x, λ, s) ∈ F0 is on the central path C if x_i s_i = τ for all i = 1, 2, …, n.
An example of a neighborhood of the central path:
  N_−∞(γ) = {(x, λ, s) ∈ F0 | x_i s_i ≥ γμ for all i}
μ is the duality measure (two slides before). Typically γ = 10^−3.
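A small sketch (NumPy assumed) of a membership test for N_−∞(γ); the feasibility tolerance is an assumption.

```python
import numpy as np

def in_wide_neighborhood(A, b, c, x, lam, s, gamma=1e-3, tol=1e-8):
    """Check (x, lam, s) in N_-inf(gamma): strictly feasible and x_i s_i >= gamma * mu for all i."""
    mu = x @ s / x.size
    strictly_feasible = (np.linalg.norm(A @ x - b) <= tol and
                         np.linalg.norm(A.T @ lam + s - c) <= tol and
                         (x > 0).all() and (s > 0).all())
    return strictly_feasible and (x * s >= gamma * mu).all()

A = np.array([[1.0, 1.0]]); b = np.array([1.0]); c = np.array([1.0, 2.0])
x = np.array([0.5, 0.5]); lam = np.array([-1.0]); s = c - A.T @ lam   # s = [2, 3]
print(in_wide_neighborhood(A, b, c, x, lam, s))                       # True for this point
```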

Long-step path-following

Given (x_0, λ_0, s_0) ∈ N_−∞(γ) and γ, σ_min, σ_max.
For k = 0, 1, 2, … until (x_k)^T s_k < ε
  - Choose σ_k ∈ [σ_min, σ_max] and let μ_k = (x_k)^T s_k / n.
  - Solve for the Newton direction (Δx_k, Δλ_k, Δs_k).
  - Set (x_(k+1), λ_(k+1), s_(k+1)) = (x_k, λ_k, s_k) + α_k(Δx_k, Δλ_k, Δs_k),
    where α_k is chosen to make (x_(k+1), λ_(k+1), s_(k+1)) ∈ N_−∞(γ).
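For the step length α_k, one simple choice (an assumption for illustration, not the rule stated on the slides) is to backtrack from α = 1 until the trial point stays in N_−∞(γ); for a strictly feasible iterate the equality constraints are preserved along the Newton direction, so only positivity and the x_i s_i ≥ γμ condition need to be checked.

```python
import numpy as np

def long_step_alpha(x, s, dx, ds, gamma=1e-3, shrink=0.9):
    """Backtrack from alpha = 1 until (x, s) stays positive and x_i s_i >= gamma * mu holds."""
    alpha = 1.0
    while alpha > 1e-12:
        xn, sn = x + alpha * dx, s + alpha * ds
        if (xn > 0).all() and (sn > 0).all() and \
           (xn * sn >= gamma * (xn @ sn / xn.size)).all():
            return alpha
        alpha *= shrink                 # assumed backtracking factor
    return alpha
```

This α_k rule would replace the positivity-only step-length rule used in the earlier path-following sketch.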

Complexity

The long-step path-following method converges in O(n log(1/ε)) iterations (Theorem 14.4).
The most effective interior point method for LP is the predictor-corrector algorithm (Mehrotra, 1992).