A Linear Round Lower Bound for Lovász-Schrijver SDP Relaxations of Vertex Cover
Grant Schoenebeck, Luca Trevisan, Madhur Tulsiani (UC Berkeley)

Minimum Vertex Cover: for G = (V,E), find the smallest subset of vertices containing at least one endpoint of every edge.

LP relaxation:
  Minimize Σ_{u∈V} x_u
  subject to  x_u ∈ [0,1] ∀ u ∈ V;  x_u + x_v ≥ 1 ∀ (u,v) ∈ E

SDP relaxation (Lovász θ-function):
  Minimize Σ_{u∈V} z_0 · z_u
  subject to  ‖z_u‖ ≤ 1 ∀ u ∈ V;  ‖z_0‖ ≤ 1;  (z_0 − z_u)·(z_0 − z_v) = 0 ∀ (u,v) ∈ E

Integrality Gap = max over graphs G of (Integer Optimum / Optimum of Program) = 2 − o(1) for both.
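To make the LP relaxation concrete, here is a minimal sketch (my own illustration, not part of the talk) that solves the vertex-cover LP for a small graph with scipy and compares it to the brute-force integer optimum; the 5-cycle and all identifiers are assumptions chosen for the example.

```python
# Solve the vertex-cover LP relaxation for a 5-cycle and compare it with the
# integer optimum.  Illustrative sketch only.
from itertools import combinations
from scipy.optimize import linprog

n = 5
edges = [(i, (i + 1) % n) for i in range(n)]  # the 5-cycle C_5

# LP: minimize sum_u x_u  subject to  x_u + x_v >= 1 for every edge, 0 <= x_u <= 1.
c = [1.0] * n
A_ub = [[-1.0 if i in (u, v) else 0.0 for i in range(n)] for (u, v) in edges]
b_ub = [-1.0] * len(edges)
lp = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, 1)] * n, method="highs")

# Integer optimum by brute force (fine for 5 vertices).
def is_cover(S):
    return all(u in S or v in S for (u, v) in edges)

int_opt = min(k for k in range(n + 1)
              for S in combinations(range(n), k) if is_cover(set(S)))

print("LP optimum:", lp.fun)        # 2.5 (the all-1/2 solution)
print("Integer optimum:", int_opt)  # 3
```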

Integrality Gaps with more constraints. For the complete graph K_n on n vertices: LP optimum = n/2; integer optimum = n − 1; integrality gap = 2 − 2/n. What if we add the constraint x_u + x_v + x_w ≥ 2 for every triangle (u,v,w) in G? The performance ratio for K_n improves to 3/2, but the integrality gap still remains 2 − o(1). What if we add analogous constraints for every odd cycle? What if we add x_u + x_v + x_w + x_z ≥ 3 for every clique of size 4? Of size 5? One needs to prove integrality gaps from scratch every time new constraints are added.
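A hypothetical check (mine, not from the talk) of the K_n numbers above: once the triangle constraints are added, the LP optimum on K_n rises from n/2 to 2n/3, so the ratio against the integer optimum n − 1 tends to 3/2. The choice n = 9 is arbitrary.

```python
# LP for vertex cover on K_n with both edge and triangle constraints.
from itertools import combinations
from scipy.optimize import linprog

n = 9
V = range(n)
c = [1.0] * n

rows, rhs = [], []
for u, v in combinations(V, 2):          # edge constraints: x_u + x_v >= 1
    rows.append([-1.0 if i in (u, v) else 0.0 for i in V]); rhs.append(-1.0)
for u, v, w in combinations(V, 3):       # triangle constraints: x_u + x_v + x_w >= 2
    rows.append([-1.0 if i in (u, v, w) else 0.0 for i in V]); rhs.append(-2.0)

lp = linprog(c, A_ub=rows, b_ub=rhs, bounds=[(0, 1)] * n, method="highs")
print("LP optimum with triangles:", lp.fun)  # 2n/3 = 6.0 (the all-2/3 solution)
print("Integer optimum:", n - 1)             # 8
print("Ratio:", (n - 1) / lp.fun)            # ~1.33, tending to 3/2 as n grows
```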

Automatically generating “natural” constraints. The LS/LS+ hierarchies define “cut operators” applied to a (convex) solution space. The operators can be applied iteratively to generate tighter and tighter LP/SDP relaxations. The relaxation obtained after r cuts (rounds) is solvable in n^{O(r)} time, and a constant number of rounds already produces most known LP/SDP relaxations.

From polytopes to cones. Allow scaling of solutions to convert the solution set into a cone. For Vertex Cover, define the cone VC(G):
  Minimize Σ_{u∈V} y_u
  subject to  y ∈ R^{n+1};  0 ≤ y_u ≤ y_0 ∀ u ∈ V;  y_u + y_v ≥ y_0 ∀ (u,v) ∈ E
The original relaxation is recovered by setting y_0 = 1.

The Lovász-Schrijver Hierarchy. Goal: only allow convex combinations of 0/1 solutions, i.e. probability distributions! Given y = (1, y_1, y_2, …, y_n) ∈ K, view y as a marginal distribution and ask for conditionals: for every coordinate i there should be
  z^(i) = (1, z^(i)_1, …, z^(i)_n) with z^(i)_i = 1,
  w^(i) = (1, w^(i)_1, …, w^(i)_n) with w^(i)_i = 0,
such that y_i·z^(i) + (1 − y_i)·w^(i) = y.
Packaging the columns y_i·z^(i) (together with y itself as the 0-th column) into a matrix Y, the conditions become:
  1. Y = Y^T
  2. Diagonal(Y) = y
  3. Y_i ∈ K and y − Y_i ∈ K for every i   (LS)
  4. Y is p.s.d.   (LS+)
Intuitively, Y_ij = Pr[i = 1 ∧ j = 1].

The Lovász-Schrijver Hierarchy (working directly with the set S rather than the cone K). For y = (y_1, y_2, …, y_n) ∈ S, ask for conditionals
  z^(i) = (z^(i)_1, …, z^(i)_n) with z^(i)_i = 1,
  w^(i) = (w^(i)_1, …, w^(i)_n) with w^(i)_i = 0,
such that y_i·z^(i) + (1 − y_i)·w^(i) = y. In terms of the matrix Y this reads:
  1. Y = Y^T
  2. Diagonal(Y) = y
  3. Y_i / y_i ∈ S and (y − Y_i) / (1 − y_i) ∈ S for every i   (LS)
  4. Y is p.s.d.   (LS+)
Again, y is the marginal distribution, Y_ij = Pr[i = 1 ∧ j = 1], and we are asking for the conditionals.
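As a sanity check on these conditions, here is a hedged numpy sketch (an illustrative example of mine, not from the talk): build Y from an explicit distribution over 0/1 vertex covers of a triangle and verify symmetry, the diagonal condition, the conditional columns, and positive semidefiniteness.

```python
import numpy as np

# Uniform distribution over the three minimum vertex covers of a triangle:
# {0,1}, {1,2}, {0,2}, each written as a 0/1 vector on the 3 vertices.
covers = np.array([[1, 1, 0],
                   [0, 1, 1],
                   [1, 0, 1]], dtype=float)
probs = np.array([1/3, 1/3, 1/3])

y = probs @ covers                      # marginals: Pr[u in cover] = 2/3 each
Y = covers.T @ np.diag(probs) @ covers  # Y_ij = Pr[i in cover and j in cover]

assert np.allclose(Y, Y.T)              # 1. symmetry
assert np.allclose(np.diag(Y), y)       # 2. Diagonal(Y) = y
cond = Y / y                            # 3. column j is E[x | x_j = 1]
edges = [(0, 1), (1, 2), (0, 2)]
assert all(cond[u, j] + cond[v, j] >= 1 - 1e-9
           for (u, v) in edges for j in range(3))   # conditionals stay feasible
assert np.min(np.linalg.eigvalsh(Y)) >= -1e-9       # 4. Y is p.s.d. (the LS+ condition)
print("y =", y)
print("Y =\n", Y)
```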

Prover-Adversary Game. Showing y ∈ LS^r(VC) can be viewed as (“almost”) a 2-player game. The Prover claims y ∈ LS^r(VC); the Adversary picks a coordinate u with 0 < y_u < 1; the Prover must respond with conditional solutions z^(u), w^(u) ∈ LS^{r−1}(VC); the Adversary then picks one of them (say z^(u)) together with a new coordinate v, and the game continues for r rounds.
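Here is a small skeleton (my own framing, not code from the paper) of this game-based view: to survive r rounds, whenever the adversary queries a fractional coordinate u, the prover must split y into conditional solutions z (with z_u = 1) and w (with w_u = 0) averaging back to y, and both branches must survive r − 1 more rounds. The callbacks split and feasible are assumptions standing in for the prover's strategy and for membership in the relaxation VC(G); note the skeleton ignores the requirement that all conditionals come from a single symmetric (and, for LS+, p.s.d.) matrix Y.

```python
# Brute-force verifier for the prover-adversary game; exponential, for intuition only.
def survives(y, r, split, feasible, eps=1e-9):
    if not feasible(y):
        return False
    if r == 0:
        return True
    for u, yu in y.items():
        if eps < yu < 1 - eps:            # the adversary may query any fractional coordinate
            z, w = split(y, u)            # the prover's conditional solutions
            if abs(z[u] - 1) > eps or abs(w[u]) > eps:
                return False
            if any(abs(yu * z[v] + (1 - yu) * w[v] - y[v]) > eps for v in y):
                return False              # they must average back to y
            # the adversary may continue with either branch, so both must survive
            if not (survives(z, r - 1, split, feasible) and
                    survives(w, r - 1, split, feasible)):
                return False
    return True
```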

Prover-Adversary Game (contd.)
[Figure: a worked example of a few rounds of the game on a small graph, with vertex values such as 2/3, 1/2 and 1/3 being conditioned on until a contradiction (“Ha!”) is reached.]

Work on LS+ for Vertex Cover
- AAT'05: gap k − 1 − ε after Ω(n) rounds (VC in k-uniform hypergraphs)
- GMPT'06: Ω(√(log n / log log n)) rounds
- This paper: gap 7/6 − ε after Ω(n) rounds
- FO'06: gap 7/6 − ε after 1 round (random 3XOR)
- Charikar'02: gap 2 − ε after 1 round + triangle inequalities
- GK'98: gap 2 − ε after 1 round

The Graphs. Start from a random 3XOR formula with m = cn clauses (FO'06). We exhibit solutions for Independent Set and use that (y_1, …, y_n) ∈ LS^r(VC) ⟺ (1 − y_1, …, 1 − y_n) ∈ LS^r(IS).
[Figure: two example clauses, x_1 + x_2 + x_3 = … and x_3 + x_4 + x_5 = …, with every vertex of the graph assigned the fractional value 1/4.]
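A sketch of the graph construction as I understand it from the talk (an assumption spelled out for concreteness, not verbatim from the paper): one vertex per (clause, satisfying assignment) pair of the random 3XOR instance, with edges between pairs that set some shared variable inconsistently, so that independent sets correspond to consistent partial assignments.

```python
import itertools
import random

def random_3xor(n, m, seed=0):
    """m random equations x_a + x_b + x_c = rhs over GF(2)."""
    rng = random.Random(seed)
    return [(tuple(rng.sample(range(n), 3)), rng.randint(0, 1)) for _ in range(m)]

def fo_graph(clauses):
    vertices = []                        # one vertex per (clause index, satisfying assignment)
    for idx, (vars_, rhs) in enumerate(clauses):
        for bits in itertools.product([0, 1], repeat=3):
            if sum(bits) % 2 == rhs:     # 4 of the 8 assignments satisfy an XOR clause
                vertices.append((idx, dict(zip(vars_, bits))))
    edges = []
    for i, (_, a) in enumerate(vertices):
        for j in range(i + 1, len(vertices)):
            b = vertices[j][1]
            if any(a[v] != b[v] for v in a.keys() & b.keys()):  # inconsistent on a shared variable
                edges.append((i, j))
    return vertices, edges

clauses = random_3xor(n=20, m=30)
vertices, edges = fo_graph(clauses)
print(len(vertices), "vertices (= 4m) and", len(edges), "conflict edges")
```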

Properties of random 3XOR formulas. Any assignment satisfies at most (1/2 + ε)m clauses. For VC this means the integer optimum is ≥ 4m − (1/2 + ε)m = (7/2 − ε)m (independent sets correspond to assignments, and the vertices included correspond to clauses satisfied). The fractional optimum is 4m − ¼·4m = 3m, so the integrality gap is ≥ 7/6 − ε. Moreover, two clauses share at most one variable (this holds with constant probability), and any k clauses involve at least 1.9k variables for k ≤ εn (AAT'05, and other works in proof complexity); this expansion implies there is no small unsatisfiable subset.
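The expansion property is easy to check by brute force on toy instances; the following sketch (illustrative, not the paper's machinery) just verifies that every set of at most max_k clauses touches at least 1.9 times as many distinct variables.

```python
from itertools import combinations

def expansion_ok(clauses, max_k, rate=1.9):
    """clauses: list of (variable-tuple, rhs) pairs; checks (max_k, rate)-expansion."""
    for k in range(1, max_k + 1):
        for subset in combinations(clauses, k):
            variables = {v for (vars_, _) in subset for v in vars_}
            if len(variables) < rate * k:
                return False
    return True

toy = [((0, 1, 2), 1), ((2, 3, 4), 0), ((4, 5, 6), 1), ((6, 7, 8), 0)]
print(expansion_ok(toy, max_k=4))   # True: any k of these clauses touch >= 1.9k variables
```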

The conditional distributions. Expansion prevents propagation of effects from conditioning.
[Figure: conditioning on one vertex only changes values (0, 1/2, 1) in a small local neighborhood, illustrated with two regions A and B.]

The conditional distributions (contd.). The conditional distribution for y_i = 0 is a convex combination of the distributions for y_j = 1 over other j's. All values are 0, 1/4, 1/2 or 1, and neighbors of a 1 are 0, so the solutions are fractional independent sets. We need to maintain expansion even after variables are assigned.
[Figure: a value of 1/2 written as a convex combination, with weights 1/3 + 1/3 + 1/3, of conditioned solutions.]

Maintaining the expansion. Problem: the adversary fixes variables, which may cause a loss of expansion. Solution: we fix more variables ourselves, namely all variables in a maximal non-expanding subset of clauses, say B. If the adversary fixes t (= 1 or 2) clauses and we fix |B| more, a (k, 1.9)-expanding formula remains (k − t − |B|, 1.9)-expanding. Since y ∈ S whenever y is a convex combination y = Σ_i α_i y^(i) with every y^(i) ∈ S, we express y as the uniform distribution over consistent assignments to the clauses in B.
[Figure: the non-expanding set of clauses B.]
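A brute-force toy version of this fixing step (my own sketch, not the paper's algorithm) is shown below: given the variables already fixed, look for a set of clauses that brings in fewer than 1.9 new variables per clause, which is what the talk calls a non-expanding subset.

```python
from itertools import combinations

def non_expanding_subset(clauses, fixed_vars, rate=1.9, max_k=4):
    """Return a set of clause indices whose clauses touch fewer than rate*k new variables."""
    for k in range(1, max_k + 1):
        for subset in combinations(range(len(clauses)), k):
            new_vars = {v for i in subset for v in clauses[i][0]} - fixed_vars
            if len(new_vars) < rate * k:
                return set(subset)
    return set()

toy = [((0, 1, 2), 1), ((2, 3, 4), 0), ((1, 3, 5), 1)]
print(non_expanding_subset(toy, fixed_vars={1, 2, 3}))  # {0}: clause 0 brings only 1 new variable
```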

Maintaining the expansion (contd.). If B is consistent, then with k clauses and l variables it has exactly 2^(l−k) solutions. For a clause C in B (with 3 variables), if every satisfying assignment to C is consistent with B, then each such assignment appears in 2^(l−k−2) of these solutions (the solutions form an affine subspace). And B is indeed consistent with any satisfying assignment to C, because a contradiction would yield a set of clauses with very low expansion.
[Figure: each of the 4 satisfying assignments of C therefore gets probability 2^(l−k−2) / 2^(l−k) = 1/4.]
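The affine-subspace counting can be verified directly on a tiny consistent system (an illustrative example of mine): enumerate all assignments, check that there are 2^(l−k) solutions, and check that each of the 4 satisfying assignments of a chosen clause C appears in exactly 2^(l−k−2) of them.

```python
from itertools import product
from collections import Counter

# A hypothetical consistent system: k = 3 clauses on l = 6 variables.
system = [((0, 1, 2), 0), ((2, 3, 4), 1), ((3, 4, 5), 0)]
l, k = 6, len(system)

def satisfies(assign, clause):
    vars_, rhs = clause
    return sum(assign[v] for v in vars_) % 2 == rhs

solutions = [a for a in product([0, 1], repeat=l)
             if all(satisfies(a, c) for c in system)]
print(len(solutions), "==", 2 ** (l - k))            # 8 == 8 solutions

C = system[0]                                         # the chosen clause
patterns = Counter(tuple(a[v] for v in C[0]) for a in solutions)
print(patterns)   # each of the 4 even-parity patterns on (x0, x1, x2) appears 2^(l-k-2) = 2 times
```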

Assigning clauses consistently. Let S ⊆ B be a minimal unsatisfiable subset after fixing an assignment to a clause C in B. By minimality, S involves at most 1.5|S| variables, since each of its variables must occur at least twice. Counting variables: the adversary's 1 (or 2) clauses contribute at most 3 (or 4), C contributes at most 3, and S contributes at most 1.5|S|. Expansion of these |S| + 2 clauses then forces 1.9(|S| + 2) ≤ 1.5|S| + 6, i.e. |S| ≤ 5 (and similarly when the adversary fixes 2 clauses).
[Figure: the sets B, C and S together with the adversary's clauses.]

How many rounds? Start with (εn, 1.95)-expansion. At the i-th round the adversary fixes a set S_i of clauses and we fix a set T_i. As long as the total number of fixed clauses stays below εn, expansion gives 1.95(Σ_i |S_i| + Σ_i |T_i|) ≤ #(fixed variables), while the adversary's clauses contribute at most 3 variables each and our non-expanding sets fewer than 1.9 per clause, so #(fixed variables) ≤ 3 Σ_i |S_i| + 1.9 Σ_i |T_i|. Hence Σ_i |T_i| ≤ 21 Σ_i |S_i|, and since |S_i| ≤ 2 per round, Σ_i (|S_i| + |T_i|) ≤ 44r. We can continue after r rounds as long as εn − Σ_i (|S_i| + |T_i|) ≥ 4, which holds whenever εn − 44r ≥ 4, i.e. for r = Ω(n) rounds.

But the title says LS+! We still need to show that the matrix Y at each round is p.s.d. A matrix Y ∈ R^{m×m} is p.s.d. iff there exist v_1, …, v_m ∈ R^m such that Y_ij = v_i · v_j for all i, j. So we need to exhibit a vector for every vertex with the above property (this also shows symmetry).
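A quick illustrative check (not from the talk) of the fact just stated: a matrix is p.s.d. exactly when it is a Gram matrix of vectors. The vectors below are arbitrary placeholders.

```python
import numpy as np

# From vectors to a p.s.d. matrix: Y = V V^T is automatically a Gram matrix.
V = np.array([[0.5,  0.0, 0.5],
              [0.5,  0.5, 0.0],
              [0.5, -0.5, 0.0]])
Y = V @ V.T
print(np.all(np.linalg.eigvalsh(Y) >= -1e-12))   # True: a Gram matrix is p.s.d.

# From a p.s.d. matrix back to vectors, via an eigen-decomposition.
w, U = np.linalg.eigh(Y)
vectors = U @ np.diag(np.sqrt(np.clip(w, 0, None)))
print(np.allclose(vectors @ vectors.T, Y))       # True: Y_ij = v_i . v_j again
```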

The vector solutions (FO'06). Divide the variables into equivalence classes: variables appearing together in a 3-variable equation go to different classes, while variables appearing together in a 2-variable equation go to the same class. There is one coordinate for each class: v_i has ±y_i in the coordinates of the classes its variables fall in (sign according to the assignment) and 0 elsewhere, plus one extra coordinate equal to y_i. Example: for x_1 + x_2 + x_3 = … and x_3 + x_4 = 0, the classes are (x_1), (x_2), (x_3 & x_4). Then v_i · v_j = y_i y_j (1 + n_agree − n_disagree), which equals 0 if the two assignments contradict, y_i y_j if they share no variables, and 2 y_i y_j if they share variables and agree.
[Figure: example vectors such as (¼, ¼, −¼, −¼), (¼, ¼, ¼, ¼), (¼, −¼, ¼, −¼), (½, 0, 0, −½), (½, 0, 0, ½).]

Conclusions / Open Problems
- A 2 − ε gap for Ω(n) rounds? Our reduction from CSPs relates LS+ rounds to partial assignments; can a 2 − ε gap be obtained using other CSPs?
- The interpretation as probability distributions still does not give a natural interpretation of the vector solutions. Are there other natural ways of looking at the hierarchy?
- Similar results for the Sherali-Adams hierarchy?
- LS+ results for other problems, e.g. Sparsest Cut, or Khot's SDP for Unique Games?

Thank You Questions?