Submodularity Reading Group Submodular Function Minimization via Linear Programming M. Pawan Kumar

Outline
Separation and Optimization
Submodular Function Minimization

Solving Linear Programs
Optimization: max_x c^T x s.t. Ax ≤ b
Feasibility: for a given K, does there exist an x such that c^T x ≥ K and Ax ≤ b?
Optimization can be carried out via binary search on K, testing at each step whether a feasible solution of value at least K exists
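As an illustration of this reduction, here is a minimal sketch (my own, not from the slides) of optimization via binary search on K. The routine `feasible(K)` is an assumed oracle that reports whether {x : Ax ≤ b, c^T x ≥ K} is non-empty, e.g. implemented with the ellipsoid method described next.

```python
# Hedged sketch: reducing LP optimization to repeated feasibility tests.
def optimize_via_feasibility(feasible, lo, hi, tol=1e-6):
    """Binary search for (approximately) the largest K that is still feasible."""
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if feasible(mid):
            lo = mid   # some x achieves c^T x >= mid
        else:
            hi = mid   # no feasible x reaches value mid
    return lo

# Toy usage: maximise x subject to x <= 3, so feasible(K) holds iff K <= 3.
print(optimize_via_feasibility(lambda K: K <= 3.0, lo=0.0, hi=10.0))  # ~3.0
```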

Feasibility via Ellipsoid Method (figure sequence): start with an ellipsoid containing the feasible region of the LP; take the centroid of the ellipsoid; if the centroid is infeasible, find a separating hyperplane for it; replace the ellipsoid by the smallest ellipsoid containing the “truncated” half-ellipsoid; take the centroid of the new ellipsoid and repeat; terminate when a feasible solution (a feasible centroid) is found.

Ellipsoid Method
Separating hyperplane in polynomial time – check each of the m LP constraints in O(n) time
New ellipsoid in polynomial time – Shor (1971), Nemirovsky and Yudin (1972)
Polynomial number of iterations (Khachiyan 1979, 1980) – the volume of the ellipsoid reduces exponentially
Only requires a separation oracle – useful since the constraint matrix A can be very large
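For concreteness, here is a minimal central-cut ellipsoid sketch for LP feasibility (an illustrative toy of my own, assuming the feasible region, if non-empty, lies inside the starting ball of radius R and has positive volume); it is not a robust solver.

```python
import numpy as np

def ellipsoid_feasibility(A, b, R=10.0, max_iter=1000):
    """Look for x with A x <= b using central-cut ellipsoid updates."""
    n = A.shape[1]
    z = np.zeros(n)              # centre of the current ellipsoid
    E = (R ** 2) * np.eye(n)     # shape matrix: {x : (x - z)^T E^{-1} (x - z) <= 1}
    for _ in range(max_iter):
        violated = np.flatnonzero(A @ z > b)
        if violated.size == 0:
            return z             # centre satisfies all constraints: feasible point found
        a = A[violated[0]]       # violated row acts as the separating hyperplane
        Ea = E @ a
        g = Ea / np.sqrt(a @ Ea)
        z = z - g / (n + 1)      # move the centre toward the feasible half-space
        E = (n ** 2 / (n ** 2 - 1.0)) * (E - (2.0 / (n + 1)) * np.outer(g, g))
    return None                  # gave up: region may be empty (or too small)

# Toy usage: x1 >= 1, x2 >= 1, x1 + x2 <= 3 (the origin is infeasible).
A = np.array([[-1.0, 0.0], [0.0, -1.0], [1.0, 1.0]])
b = np.array([-1.0, -1.0, 3.0])
print(ellipsoid_feasibility(A, b))
```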

Optimization vs. Separation
Separation implies easy optimization (via the ellipsoid method). What about the reverse?
Polymatroids admit greedy optimization. Do they allow easy separation?
Why are we even interested in this?

Outline
Separation and Optimization – Polar Polyhedron – Separation via Optimization – Poly-time Equivalence
Submodular Function Minimization

Polar Polyhedron
Polyhedron P = {x : Ax ≤ b}
Polar polyhedron P* = {y : x^T y ≤ 1 for all x ∈ P}
Assume 0 is in the interior of P, i.e. b > 0. This is no loss of generality, as P can be translated.
Claim: (P*)* = P. Proof?
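A small worked example (my own, not from the slides): for the box P = [−1, 1]², the polar P* is the diamond {y : |y1| + |y2| ≤ 1}, and taking the polar again recovers the box. Since both are bounded polytopes, membership in the polar can be tested by maximising y^T x over the vertices of the original set.

```python
import numpy as np
from itertools import product

box_vertices = np.array(list(product([-1.0, 1.0], repeat=2)))             # vertices of P
diamond_vertices = np.array([[1.0, 0], [-1.0, 0], [0, 1.0], [0, -1.0]])   # vertices of P*

# y lies in the polar of a polytope iff the maximum of y^T x over its vertices is at most 1.
in_polar = lambda y, verts: float(np.max(verts @ y)) <= 1.0 + 1e-9

print(in_polar(np.array([0.5, 0.4]), box_vertices))      # True:  0.5 + 0.4 <= 1, so y in P*
print(in_polar(np.array([0.8, 0.8]), box_vertices))      # False: 1.6 > 1
print(in_polar(np.array([0.9, 0.9]), diamond_vertices))  # True:  (0.9, 0.9) is in (P*)* = P
print(in_polar(np.array([1.2, 0.0]), diamond_vertices))  # False: (1.2, 0) lies outside the box
```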

P is a subset of (P*)*
If x ∈ P, then for all y ∈ P* we have x^T y ≤ 1
(P*)* = {z : z^T y ≤ 1 for all y ∈ P*}
Therefore, x ∈ (P*)*

(P*)* is a subset of P
Suppose there is an x ∉ P. There must exist a separating hyperplane: c^T x > d and c^T z ≤ d for all z ∈ P
Since 0 is in the interior of P, d > 0
Without loss of generality (rescale c), d = 1

(P*)* is a subset of P
Suppose there is an x ∉ P, with separating hyperplane c^T x > 1 and c^T z ≤ 1 for all z ∈ P
Then c ∈ P* (by the definition of the polar)
Hence x ∉ (P*)*. Why? Because c ∈ P* and c^T x > 1, so x violates a defining inequality of (P*)*
Hence proved

Outline
Separation and Optimization – Polar Polyhedron – Separation via Optimization – Poly-time Equivalence
Submodular Function Minimization

Optimization Problem over P
Polyhedron P = {x : Ax ≤ b}
max c^T x s.t. x ∈ P

Separation Problem over P*
Polar polyhedron P* = {y : x^T y ≤ 1 for all x ∈ P}
Given y, return ‘YES’ if y ∈ P*; otherwise, return a separating hyperplane

Using Optimization for Separation
Set c = y and compute C* = max {c^T x : x ∈ P}, with optimal solution x*
If C* ≤ 1, then return ‘YES’ (every x ∈ P satisfies x^T y ≤ 1, so y ∈ P*)
If C* > 1, then return x*, which defines a separating hyperplane: x*^T y > 1, while x*^T z ≤ 1 for all z ∈ P* because x* ∈ P
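A minimal sketch of this reduction (hypothetical code of my own; `optimize(c)` is an assumed optimisation oracle for P returning the optimal value and an optimal point):

```python
import numpy as np

def separate_polar(y, optimize):
    """Separation oracle for P*, implemented with an optimisation oracle for P."""
    C_star, x_star = optimize(y)       # set c = y and optimise over P
    if C_star <= 1.0:
        return "YES", None             # every x in P has x^T y <= 1, so y is in P*
    # Otherwise x* separates y from P*: x*^T y > 1, while x*^T z <= 1 for all z in P*.
    return "NO", x_star

# Toy usage with P = [-1, 1]^2, whose optimisation oracle just picks the best vertex.
def box_optimize(c):
    x = np.where(np.asarray(c) >= 0, 1.0, -1.0)
    return float(np.asarray(c) @ x), x

print(separate_polar(np.array([0.4, 0.4]), box_optimize))  # ('YES', None): y is in P*
print(separate_polar(np.array([2.0, 0.0]), box_optimize))  # ('NO', array([1., 1.]))
```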

Outline
Separation and Optimization – Polar Polyhedron – Separation via Optimization – Poly-time Equivalence
Submodular Function Minimization

Poly-Time Equivalence
Optimization on P ⇒ Separation on P* (polarity, previous slide)
Separation on P* ⇒ Optimization on P* (ellipsoid method)
Optimization on P* ⇒ Separation on (P*)* = P (polarity)
Separation on P ⇒ Optimization on P (ellipsoid method)
So optimization and separation are polynomial-time equivalent

Outline
Separation and Optimization
Submodular Function Minimization

Submodular Function Minimization
min_{T ⊆ S} f(T)
We will assume f(∅) = 0. If not, we can add a constant (namely −f(∅)) to every value

Submodular Function Minimization
min_{T ⊆ S} f(T)
Brute-force search is exponential in |S|
We will prove that SFM is easy. First, we need two properties

Property 1
Let f be a submodular function over S. Then f(U) = max {x(U) | x ∈ EP_f}
Proof? Run the greedy algorithm with the weight vector w = v_U, the characteristic vector of U
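A small sketch of this argument, assuming the greedy rule from the earlier polymatroid lectures (visit elements in non-increasing weight order and set x(s_i) = f(S_i) − f(S_{i−1})), with w = v_U. Here EP_f = {x : x(U) ≤ f(U) for all U ⊆ S}, and the toy coverage function is my own example.

```python
# Hedged sketch: greedy evaluation illustrating f(U) = max {x(U) : x in EP_f}.
def greedy_value(f, S, U):
    order = sorted(S, key=lambda s: s not in U)   # elements of U first (weight 1, then 0)
    prefix, prev, value = set(), 0.0, 0.0
    for s in order:
        prefix.add(s)
        gain = f(prefix) - prev                   # greedy coordinate x(s) = f(S_i) - f(S_{i-1})
        prev += gain
        if s in U:                                # only coordinates in U contribute to x(U)
            value += gain
    return value

# Toy submodular function: f(T) = number of items covered by the sets indexed by T.
cover = {1: {"a", "b"}, 2: {"b", "c"}, 3: {"c"}}
f = lambda T: float(len(set().union(*[cover[s] for s in T]))) if T else 0.0
S, U = {1, 2, 3}, {1, 2}
print(greedy_value(f, S, U), f(U))   # both 3.0, matching Property 1
```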

Property 2
Let f be a submodular function over S. Define f'(U) = min_{T ⊆ U} f(T)
Then f' is also submodular. Proof?

Proof Sketch
We have to prove the following: f'(T) + f'(U) ≥ f'(T ∪ U) + f'(T ∩ U) for all T, U ⊆ S
f'(T) = f(X) for some X ⊆ T
f'(U) = f(Y) for some Y ⊆ U

Proof Sketch
f'(T) + f'(U) = f(X) + f(Y) ≥ f(X ∪ Y) + f(X ∩ Y) ≥ f'(T ∪ U) + f'(T ∩ U)
The first inequality is submodularity of f; the second holds because X ∪ Y ⊆ T ∪ U and X ∩ Y ⊆ T ∩ U, and f' is a minimum over subsets
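A brute-force sanity check of this property on a small example (my own toy function: coverage minus a modular cost, so the inner minimisation is non-trivial):

```python
from itertools import combinations

# Toy submodular f with f(empty) = 0 that can take negative values.
cover = {1: {"a", "b"}, 2: {"b", "c"}, 3: {"c"}}
f = lambda T: len(set().union(*[cover[s] for s in T])) - 1.5 * len(T) if T else 0.0

def subsets(U):
    U = list(U)
    return [frozenset(c) for r in range(len(U) + 1) for c in combinations(U, r)]

f_prime = lambda U: min(f(T) for T in subsets(U))   # f'(U) = min over T subseteq U of f(T)

S = frozenset({1, 2, 3})
print(all(f_prime(T) + f_prime(U) >= f_prime(T | U) + f_prime(T & U)
          for T in subsets(S) for U in subsets(S)))  # True: f' is submodular
```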

Property 2 Continued
Let f be a submodular function over S, and define f'(U) = min_{T ⊆ U} f(T); f' is also submodular
Moreover, EP_f' = {x ∈ EP_f : x ≤ 0}. Proof?

Proof Sketch
Let P = {x ∈ EP_f : x ≤ 0}. We can show that P ⊆ EP_f'
If x ∈ P, then for all U ⊆ T ⊆ S: x(T) = x(T\U) + x(U) ≤ x(U) ≤ f(U)
(the first inequality because x ≤ 0, the second because x ∈ EP_f, so x(U) ≤ f(U) for all U ⊆ S)
Taking the minimum over U ⊆ T gives x(T) ≤ f'(T) for all T ⊆ S, that is, x ∈ EP_f'

Proof Sketch
We can also show that P = {x ∈ EP_f : x ≤ 0} ⊇ EP_f'
If x ∈ EP_f', then x ∈ EP_f. Why? Because x(U) ≤ f'(U) ≤ f(U) for all U ⊆ S
For any s ∈ S, x_s ≤ 0. Why? Because x_s = x({s}) ≤ f'({s}) ≤ f(∅) = 0
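A randomised sanity check of EP_f' = {x ∈ EP_f : x ≤ 0} on the same toy function (my own example; EP_f = {x : x(U) ≤ f(U) for all U ⊆ S} as in the earlier lectures):

```python
import numpy as np
from itertools import combinations

S = [1, 2, 3]
cover = {1: {"a", "b"}, 2: {"b", "c"}, 3: {"c"}}
f = lambda T: len(set().union(*[cover[s] for s in T])) - 1.5 * len(T) if T else 0.0

nonempty = [set(c) for r in range(1, len(S) + 1) for c in combinations(S, r)]
f_prime = lambda U: min([0.0] + [f(T) for T in nonempty if T <= U])

x_of = lambda x, U: sum(x[i] for i, s in enumerate(S) if s in U)        # x(U)
in_EP = lambda x, g: all(x_of(x, U) <= g(U) + 1e-9 for U in nonempty)   # is x in EP_g?

rng = np.random.default_rng(0)
for _ in range(1000):
    x = rng.uniform(-2.0, 2.0, size=len(S))
    assert in_EP(x, f_prime) == (in_EP(x, f) and bool(np.all(x <= 1e-9)))
print("EP_f' = {x in EP_f : x <= 0} held on all random samples")
```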

Submodular Function Minimization
min_{T ⊆ S} f(T) = f'(S), where f'(U) = min_{T ⊆ U} f(T)
= max {x(S) | x ∈ EP_f'} (Property 1 applied to the submodular function f')
= max {x(S) | x ∈ EP_f, x ≤ 0} (Property 2)
Optimization over EP_f is easy (greedy), so separation over EP_f is easy; hence separation over EP_f' is easy (additionally check x ≤ 0), and therefore optimization over EP_f' is easy (by the equivalence of separation and optimization)
Hence proved
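Finally, a brute-force check of this chain on a tiny ground set (my own toy example, assuming scipy is available for the LP): the minimum of f coincides with max {x(S) : x ∈ EP_f, x ≤ 0}.

```python
import numpy as np
from itertools import combinations
from scipy.optimize import linprog

S = [1, 2, 3]
cover = {1: {"a", "b"}, 2: {"b", "c"}, 3: {"c"}}
f = lambda T: len(set().union(*[cover[s] for s in T])) - 1.5 * len(T) if T else 0.0

nonempty = [set(c) for r in range(1, len(S) + 1) for c in combinations(S, r)]

# Left-hand side: exhaustive minimisation of f over all subsets (including the empty set).
lhs = min([0.0] + [f(T) for T in nonempty])

# Right-hand side: the LP max {sum(x) : x(U) <= f(U) for all U, x <= 0}.
A_ub = np.array([[1.0 if s in T else 0.0 for s in S] for T in nonempty])
b_ub = np.array([f(T) for T in nonempty])
res = linprog(c=-np.ones(len(S)), A_ub=A_ub, b_ub=b_ub,
              bounds=[(None, 0.0)] * len(S), method="highs")

print(lhs, -res.fun)   # both -1.5: min f(T) equals max {x(S) : x in EP_f, x <= 0}
```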