Linear Programming Piyush Kumar Welcome to COT 5405

Optimization

For example, consider the small problem sketched below; a problem of this shape is what is known as a standard linear program.
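An illustrative instance (hypothetical, not necessarily the one pictured on the original slide), written in the minimization form used later in these slides:

\[
\begin{aligned}
\text{minimize}\quad & 2x_1 + 3x_2 \\
\text{subject to}\quad & x_1 + x_2 \ge 4, \\
& x_1 + 3x_2 \ge 6, \\
& x_1, x_2 \ge 0.
\end{aligned}
\]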

Linear Programming Significance: A lot of problems can be converted to an LP formulation: perceptrons (learning), shortest path, max flow, MST, matching, … LP accounts for a major proportion of all scientific computation. It also helps in finding quick-and-dirty solutions to NP-hard optimization problems, both optimal (B&B) and approximate (rounding).

Graphing 2-Dimensional LPs, Example 1: Maximize x + y subject to x + 2y >= 2, x <= 3, y <= 4, x >= 0, y >= 0. [Figure: the feasible region in the (x, y) plane with the optimal solution marked.] These LP animations were created by Keely Crowston.
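If you want to check Example 1 numerically, here is a minimal sketch using scipy's linprog (the solver call is my own addition, not part of the slides); the maximization is converted to a minimization and the ">=" constraint is flipped:

```python
# Example 1 with an off-the-shelf LP solver (a sketch, not from the slides).
from scipy.optimize import linprog

# Maximize x + y  <=>  minimize -(x + y).
c = [-1, -1]

# linprog expects A_ub @ [x, y] <= b_ub, so x + 2y >= 2 becomes -x - 2y <= -2.
A_ub = [[-1, -2],   # -(x + 2y) <= -2
        [ 1,  0],   #  x        <=  3
        [ 0,  1]]   #       y   <=  4
b_ub = [-2, 3, 4]

# Default bounds are x, y >= 0, which matches the example.
res = linprog(c, A_ub=A_ub, b_ub=b_ub, method="highs")
print(res.x, -res.fun)   # expected: corner point (3, 4) with objective value 7
```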

Graphing 2-Dimensional LPs, Example 2 (multiple optimal solutions!): Minimize x - y subject to 1/3 x + y <= 4, -2x + 2y <= 4, x <= 3, x >= 0, y >= 0. [Figure: the feasible region; the optimum is attained along an entire edge.]

Graphing 2-Dimensional LPs, Example 3: Minimize x + 1/3 y subject to x + y >= 20, -2x + 5y <= 150, x >= 5, x >= 0, y >= 0. [Figure: the feasible region with the optimal solution marked.]

Do We Notice Anything From These 3 Examples? [Figure: the three feasible regions from Examples 1-3, with an optimal extreme point highlighted in each.]

A Fundamental Point: If an optimal solution exists, there is always a corner point optimal solution! [Figure: the three feasible regions again, with their corner points marked.]

Graphing 2-Dimensional LPs, Example 1 again: Maximize x + y subject to x + 2y >= 2, x <= 3, y <= 4, x >= 0, y >= 0. [Figure: the path from an initial corner point, to a second corner point, to the optimal solution.]

And We Can Extend this to Higher Dimensions

Then How Might We Solve an LP? The constraints of an LP give rise to a geometrical shape - we call it a polyhedron. If we can determine all the corner points of the polyhedron, then we can calculate the objective value at these points and take the best one as our optimal solution. The Simplex Method intelligently moves from corner to corner until it can prove that it has found the optimal solution.
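The corner-point idea can be illustrated by brute force on Example 1 (a sketch of the enumeration argument only, not of the Simplex method; the code and tolerances are my own):

```python
# Enumerate candidate corner points of Example 1 (maximize x + y), keep the
# feasible ones, and take the best objective value. For intuition only; the
# Simplex method avoids enumerating all corners.
import itertools
import numpy as np

# All constraints written as a_i . (x, y) <= b_i.
A = np.array([[-1.0, -2.0],   # x + 2y >= 2  ->  -x - 2y <= -2
              [ 1.0,  0.0],   # x <= 3
              [ 0.0,  1.0],   # y <= 4
              [-1.0,  0.0],   # x >= 0
              [ 0.0, -1.0]])  # y >= 0
b = np.array([-2.0, 3.0, 4.0, 0.0, 0.0])

best, best_val = None, -np.inf
for i, j in itertools.combinations(range(len(b)), 2):
    M = A[[i, j]]
    if abs(np.linalg.det(M)) < 1e-9:      # parallel boundaries: no corner here
        continue
    p = np.linalg.solve(M, b[[i, j]])     # intersection of the two boundaries
    if np.all(A @ p <= b + 1e-9):         # keep it only if it is feasible
        val = p[0] + p[1]                 # objective x + y
        if val > best_val:
            best, best_val = p, val

print(best, best_val)   # expected: corner (3, 4) with value 7
```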

But an Integer Program is Different: The feasible region is a set of discrete points. We can't be assured a corner point solution. There are no “efficient” ways to solve an IP. Solving it as an LP provides a relaxation and a bound on the solution. [Figure: the integer points inside a two-dimensional feasible region.]

Linear Programs in higher dimensions: minimize z = 7x1 + x2 + 5x3 subject to x1 - x2 + 3x3 >= 10; 5x1 + 2x2 - x3 >= 6; x1, x2, x3 >= 0. What happens at (2, 1, 3)? What does it tell us about z*, the optimal value of z?
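A quick check (worked out here rather than on the slide): the point (2, 1, 3) satisfies both constraints, so it is feasible, and its objective value is 30:

\[
z(2,1,3) = 7\cdot 2 + 1 + 5\cdot 3 = 30,
\qquad 2 - 1 + 3\cdot 3 = 10 \ge 10,
\qquad 5\cdot 2 + 2\cdot 1 - 3 = 9 \ge 6.
\]

Since z* is a minimum, any feasible point gives an upper bound: z* <= 30.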

LP Upper bounds: Any feasible solution to the LP gives an upper bound on z*. So now we know z* <= 30. How do we construct a lower bound? Is z* >= 16? [Y/N]

Lower bounding an LP: 7x1 + x2 + 5x3 >= (x1 - x2 + 3x3) + (5x1 + 2x2 - x3) >= 10 + 6 = 16, where the first inequality holds coefficient by coefficient because x1, x2, x3 >= 0. Find suitable multipliers (> 0?) to construct lower bounds. How do we choose the multipliers?
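Spelling the idea out (a reconstruction of the standard argument, not verbatim from the slide): pick multipliers y1, y2 >= 0, multiply the two constraints by them, and add:

\[
y_1(x_1 - x_2 + 3x_3) + y_2(5x_1 + 2x_2 - x_3) \;\ge\; 10y_1 + 6y_2 .
\]

If, in addition, the combined coefficient of each variable is at most its coefficient in the objective,

\[
y_1 + 5y_2 \le 7, \qquad -y_1 + 2y_2 \le 1, \qquad 3y_1 - y_2 \le 5,
\]

then, since x1, x2, x3 >= 0, the left-hand side is at most 7x1 + x2 + 5x3, so every such (y1, y2) certifies z* >= 10y1 + 6y2. Taking y1 = y2 = 1 gives the bound of 16 above; choosing the multipliers that make this bound as large as possible is exactly the dual problem on the next slide.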

The Dual: maximize z' = 10y1 + 6y2 subject to y1 + 5y2 <= 7; -y1 + 2y2 <= 1; 3y1 - y2 <= 5; y1, y2 >= 0 (one dual constraint per primal variable). What is the dual of a dual? Every feasible solution of the dual gives a lower bound on z*.

The Primal: minimize z = 7x1 + x2 + 5x3 subject to x1 - x2 + 3x3 >= 10; 5x1 + 2x2 - x3 >= 6; x1, x2, x3 >= 0. Every feasible solution of the primal gives an upper bound on the optimal value of the dual.
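A numerical sanity check of this primal/dual pair (again with scipy, which is my own choice of tool and not part of the slides): solving both problems should give the same optimal value, as strong duality promises.

```python
# Solving the primal and its dual with scipy's linprog (a sketch).
from scipy.optimize import linprog

# Primal: minimize 7x1 + x2 + 5x3  s.t.  x1 - x2 + 3x3 >= 10,  5x1 + 2x2 - x3 >= 6,  x >= 0.
# linprog wants "<=" rows, so the ">=" constraints are negated.
primal = linprog(c=[7, 1, 5],
                 A_ub=[[-1, 1, -3], [-5, -2, 1]],
                 b_ub=[-10, -6],
                 method="highs")

# Dual: maximize 10y1 + 6y2  s.t.  y1 + 5y2 <= 7,  -y1 + 2y2 <= 1,  3y1 - y2 <= 5,  y >= 0.
dual = linprog(c=[-10, -6],                 # maximize <=> minimize the negation
               A_ub=[[1, 5], [-1, 2], [3, -1]],
               b_ub=[7, 1, 5],
               method="highs")

print("primal optimum:", primal.fun)        # both should print 26 for this example
print("dual   optimum:", -dual.fun)
```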

Primal-Dual picture: [Figure: the values of dual solutions approach z* from below and the values of primal solutions approach z* from above; strong optimality: Primal = Dual at the optimum z*.]

Duality: A variable in the dual is paired with a constraint in the primal. The objective function of the dual is determined by the right-hand sides of the primal constraints. The constraint matrix of the dual is the transpose of the constraint matrix of the primal.

Duality Properties: Some relationships between the primal and dual problems: If one problem has feasible solutions and a bounded objective function (and so has an optimal solution), then so does the other, and both the weak and the strong duality properties apply. If the optimal value of the primal is unbounded, then the dual is infeasible. If the optimal value of the dual is unbounded, then the primal is infeasible.

In Matrix terms
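In symbols (a sketch consistent with the example above and with the notation cx = yb used on the later slides; here c and y are row vectors, x and b are column vectors, and A is m by n):

\[
\text{(P)}\;\; \min_x \; cx \;\text{ s.t. } Ax \ge b,\; x \ge 0
\qquad\qquad
\text{(D)}\;\; \max_y \; yb \;\text{ s.t. } yA \le c,\; y \ge 0.
\]

Each dual variable yi is paired with primal constraint i, and each dual constraint j with primal variable xj.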

LP Geometry: The feasible region forms an n-dimensional polyhedron. It is convex: if z1 and z2 are two feasible solutions, then λz1 + (1 - λ)z2 is also feasible for every λ in [0, 1]. Extreme points cannot be written as a convex combination of two other feasible points.

LP Geometry: The normals to the halfspaces defining the polyhedron are given by the coefficients of the constraints. The rows of A are the normals to the hyperplanes defining the primal LP, pointing into the polyhedron (for constraints of the form Ax >= b).

LP Geometry: Extreme point theorem: If there exists an optimal solution to an LP problem, then there exists an extreme point where the optimum is achieved. Local optimum = global optimum.

LP: Algorithms Simplex. (Dantzig 1947) Developed shortly after WWII in response to logistical problems: used for 1948 Berlin airlift. Practical solution method that moves from one extreme point to a neighboring extreme point. Finite (exponential) complexity, but no polynomial implementation known. As a graduate student at the University of California Berkeley in 1939, Dantzig arrived late to class one day and copied two problems from a blackboard. After struggling with what he thought was a difficult homework assignment, he submitted his work to the eminent statistician Jerzy Neyman. Six weeks later on a Sunday at 8 AM, Neyman excitedly awoke Dantzig to say he had written an introduction to Dantzig's paper. It turned out that Dantzig had found solutions to two famous, previously unsolved statistical problems. Moral of the Story: if you come in late to class, you must solve a previously unsolved problem. Courtesy Kevin Wayne

LP: Polynomial Algorithms. Ellipsoid (Khachian 1979, 1980): solvable in polynomial time, O(n^4 L) bit operations, where n = # variables and L = # bits in the input. A theoretical tour de force, but not remotely practical. Karmarkar's algorithm (Karmarkar 1984): O(n^3.5 L); polynomial, and reasonably efficient implementations are possible. Interior point algorithms: O(n^3 L); competitive with simplex, dominates simplex on large problems, and extends to even more general problems. The celebrated result of Karmarkar was not so much the reduction in complexity, but that his algorithm could be implemented with reasonable efficiency. liberti.dhs.org/liberti/phd/doc/wos-search/potra-interior_point_methods.pdf

Ellipsoid Method: LP feasibility and LP optimization are equivalent. Courtesy S. Boyd.

Barrier Algorithms (Interior Point Methods): [Figure: the Simplex solution path along the boundary versus the barrier central path through the interior (predictor-corrector steps), both ending at the optimum.] IP methods have proven to be much better at solving very large LPs.

Back to LP Basics

Standard form of LP

Standard form of the Dual

Weak Duality We will not prove strong duality in this class but assume it.
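The statement itself, for the primal/dual pair above (a sketch in the notation of the following slides): if x is primal feasible and y is dual feasible, then

\[
yb \;\le\; y(Ax) \;=\; (yA)x \;\le\; cx,
\]

the first inequality because Ax >= b and y >= 0, the second because yA <= c and x >= 0. So every dual feasible y gives a lower bound on z*, every primal feasible x gives an upper bound on the dual optimum, and strong duality (assumed here) says the two optimal values are equal.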

Complementary solutions: For any primal feasible (but suboptimal) x, its complementary solution y is dual infeasible, with cx = yb. For any primal optimal x*, its complementary solution y* is dual optimal, with cx* = y*b = z*. Duality gap = cx - yb.

Complementary slackness: If x*, y* are feasible, then they are optimal for (P) and (D) iff: for i = 1..m, if yi* > 0 then ai x* = bi; and for j = 1..n, if xj* > 0 then y* Aj = cj. Here the ai are the rows of A and the Aj are the columns of A.

Complementary slackness: Feasible x*, y* are simultaneously optimal for (P) and (D) iff y*(Ax* - b) = 0 and (y*A - c)x* = 0. Summary: if a variable is positive, its dual constraint is tight; if a constraint is loose, its dual variable is zero.

Complementary Slackness Proof? y*(Ax* - b) + (c - y*A)x* = y*Ax* - y*b + cx* - y*Ax* = cx* - y*b = 0 by strong duality. But both terms are non-negative (by primal and dual feasibility), hence both must be zero!
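A concrete illustration on the earlier primal/dual pair (the values are computed here, not taken from the slides): the optima are x* = (7/4, 0, 11/4) and y* = (2, 1), both with value 26. Both primal constraints are tight at x* (x1 - x2 + 3x3 = 10 and 5x1 + 2x2 - x3 = 6), consistent with y1*, y2* > 0; the first and third dual constraints are tight (y1 + 5y2 = 7, 3y1 - y2 = 5), consistent with x1*, x3* > 0; and the second dual constraint is slack (-y1 + 2y2 = 0 < 1), which forces x2* = 0.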

Primal-Dual Algorithms Find a feasible solution for both P and D. Try to satisfy the complementary slackness conditions.

Algorithm Design Techniques: LP relaxation and rounding: round the fractional solution obtained by solving the LP relaxation (runs fast). Primal-dual schema: iteratively constructs primal and dual solutions.

Linear Program: [Figure: the feasible solutions in the (x, y) plane, the objective direction, and the LP optimum.]

Integer Program: [Figure: the same picture with the feasible solutions restricted to integer points; the optimum of the LP relaxation, the IP optimum, and the point obtained by rounding down the optimum of the LP relaxation.]

Linear Relaxations: What happens if the optimum of an LP relaxation is integral? There is a class of IPs for which this is guaranteed to happen: transportation problems, max-flow problems, and, in general, problems with totally unimodular constraint matrices (unimodularity). Such a relaxation is called an exact relaxation.

Lower Bounds: Assume a minimization problem. Any relaxation of the original IP has an optimal objective function value less than or equal to the optimal objective function value of the original IP: z*relaxation <= z*. z*relaxation is called a lower bound on z*. The difference between these two values is called the relaxation gap.

Upper Bounds: Any feasible solution to the original IP has an objective function value greater than or equal to the optimal objective function value of the original IP: zfeasible >= z*. zfeasible is called an upper bound on z*. Heuristic techniques can be used to find “good” feasible solutions; they are efficient and may be beneficial if optimality can be sacrificed, and are usually application- or problem-specific.

Vertex Cover: Introduction to LP Rounding. A simple 2-approximation using LP. Better than a 2-factor approximation?
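As a preview of the LP-rounding idea, here is a minimal sketch of the classic 2-approximation for (weighted) vertex cover; the relaxation and the rounding rule are the standard ones the slide refers to, while the function name and the use of scipy are my own:

```python
# LP-rounding 2-approximation for weighted Vertex Cover (a sketch using scipy).
from scipy.optimize import linprog

def vertex_cover_lp_rounding(n, edges, weights):
    """n vertices 0..n-1, edges = list of (u, v), weights = list of length n."""
    # LP relaxation: minimize sum_v w_v * x_v
    #                subject to x_u + x_v >= 1 for every edge, 0 <= x_v <= 1.
    A_ub, b_ub = [], []
    for (u, v) in edges:
        row = [0.0] * n
        row[u] = row[v] = -1.0      # -(x_u + x_v) <= -1
        A_ub.append(row)
        b_ub.append(-1.0)
    res = linprog(c=weights, A_ub=A_ub, b_ub=b_ub,
                  bounds=[(0, 1)] * n, method="highs")
    # Rounding: x_u + x_v >= 1 forces max(x_u, x_v) >= 1/2, so taking every
    # vertex with x_v >= 1/2 covers all edges and at most doubles the LP cost.
    cover = [v for v in range(n) if res.x[v] >= 0.5]
    return cover, res.fun   # res.fun is a lower bound on the optimal cover weight

# Tiny usage example: a path on 4 vertices with unit weights.
print(vertex_cover_lp_rounding(4, [(0, 1), (1, 2), (2, 3)], [1, 1, 1, 1]))
```

Since the LP optimum is a lower bound on the optimal integer cover, the rounded cover costs at most twice the optimum, which is the 2-approximation claimed on the slide.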