Approximation Schemes via Sherali-Adams Hierarchy for Dense Constraint Satisfaction Problems and Assignment Problems
Yuichi Yoshida (NII & PFI), Yuan Zhou (CMU)

Constraint satisfaction problems (CSPs)
In Max-kCSP, we are given:
– a set of variables: V = {v1, v2, v3, …, vn}
– the domain of the variables: D
– a set of arity-k "local" constraints: C
Goal: find an assignment α : V → D maximizing the number of satisfied constraints in C.
Example: MaxCut
– D = {0, 1}
– p(i,j) = 1[vi ≠ vj]
Other examples: Max-3SAT, UniqueGames, …
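
To make the definition concrete, here is a minimal Python sketch (illustrative only, not part of the paper): a Max-kCSP instance represented as a list of (scope, predicate) pairs and evaluated on an assignment, with MaxCut on a 4-cycle as the example.

```python
# Illustrative sketch: a Max-kCSP instance as a list of local constraints.
# Each constraint is (scope, predicate): scope is a k-tuple of variables and
# predicate maps the k domain values on that scope to True/False.

def num_satisfied(constraints, assignment):
    """Count constraints satisfied by assignment (a dict: variable -> value)."""
    return sum(pred(*(assignment[v] for v in scope)) for scope, pred in constraints)

# Example: MaxCut on a 4-cycle as a Max-2CSP with D = {0, 1} and
# predicates p(i,j)(x, y) = 1[x != y].
edges = [(0, 1), (1, 2), (2, 3), (3, 0)]
maxcut = [((u, v), lambda x, y: x != y) for u, v in edges]

alpha = {0: 0, 1: 1, 2: 0, 3: 1}      # a candidate assignment
print(num_satisfied(maxcut, alpha))   # 4: every edge is cut
```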

Assignment problems (APs)
In Max-kAP, we are given:
– a set of variables: V = {v1, v2, v3, …, vn}
– a set of arity-k "local" constraints: C
Goal: find a bijection π : V → {1, 2, …, n} (i.e. a permutation) maximizing the number of satisfied constraints in C.

Assignment problems (APs)
Examples:
– MaxAcyclicSubgraph (MAS): π(u) < π(v)
– Betweenness: π(u) < π(v) < π(w) or π(w) < π(v) < π(u)
– MaxGraphIsomorphism (Max-GI): (π(u), π(v)) ∈ E(H), where H is a fixed graph
– Dense-k-Subgraph (DkS): (π(u), π(v)) ∈ E(Kk), where Kk is a k-clique
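
A small sketch (again illustrative, not from the paper) of how such constraints are evaluated under a bijection π, for MAS and Betweenness:

```python
# Max-kAP: local constraints over the positions pi(v) of a bijection
# pi : V -> {1, ..., n}.

def mas_value(arcs, pi):
    """MaxAcyclicSubgraph: count arcs (u, v) with pi(u) < pi(v)."""
    return sum(pi[u] < pi[v] for u, v in arcs)

def betweenness_value(triples, pi):
    """Betweenness: count triples (u, v, w) with v placed between u and w."""
    return sum(pi[u] < pi[v] < pi[w] or pi[w] < pi[v] < pi[u]
               for u, v, w in triples)

arcs = [(0, 1), (1, 2), (2, 0)]   # a directed triangle
pi = {0: 1, 1: 2, 2: 3}           # one of the 3! permutations
print(mas_value(arcs, pi))        # 2: any ordering of a directed cycle breaks one arc
```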

Approximation schemes
Max-kCSP and Max-kAP are NP-hard in general.
Polynomial-time approximation scheme (PTAS): for every constant ε > 0, the algorithm runs in n^O(1) time and gives a (1-ε)-approximation.
Quasi-PTAS: the algorithm runs in n^O(log n) time.
Max-kCSP / Max-kAP admits a PTAS or quasi-PTAS when the instance is "dense" or "metric".

PTAS for dense/metric Max-kCSP
Max-kCSP is dense: the instance has Ω(n^k) constraints.
– PTAS for dense MaxCut [dlV96]
– PTAS for dense Max-kCSP [AKK99, FK96, AdlVKK03]
Max-2CSP is metric: the edge weight ω satisfies ω(u, v) ≤ ω(u, w) + ω(w, v).
– PTAS for metric MaxCut [dlVK01]
– PTAS for metric MaxBisection [FdlVKK04]
– PTAS for locally dense Max-kCSP (a generalized definition of "metric") [dlVKKV05]

Quasi-PTAS for dense Max-kAP
Max-kAP is dense: roughly speaking, the instance has Ω(n^k) constraints.
In [AFK02]:
– (1-ε)-approximation for dense MAS and Betweenness in n^O(1/ε^2) time
– (1-ε)-approximation for dense DkS, Max-GI and Max-kAP in n^O(log n/ε^2) time

Previous techniques
– Exhaustive search on a small set of variables [AKK99]
– Weak Szemerédi regularity lemma [FK96]
– Copying important variables [dlVK01]
– A variant of SVD [dlVKKV05]
– Linear programming relaxation for "assignment problems with extra constraints" [AFK02]
In this paper, we show: the standard Sherali-Adams LP relaxation hierarchy is a unified approach to all these results!

Sherali-Adams LP relaxation hierarchy
A systematic way to write tighter and tighter LP relaxations [SA90].
In an r-round SA LP relaxation:
– for each set S = {v1, …, vr} of r variables, we have a distribution μ_S = μ_{v1, …, vr} over assignments to S
– for any two sets S and T, the marginal distributions are consistent: μ_S(S∩T) = μ_T(S∩T)
Solving the r-round LP relaxation takes n^O(r) time.
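
For concreteness, here is a minimal sketch (illustrative encoding over a Boolean domain; not the paper's formulation) that only enumerates the LP variables and consistency equalities of the r-round relaxation, without an objective or a solver call:

```python
from itertools import combinations, product

def sherali_adams_constraints(n, r, domain=(0, 1)):
    """Enumerate the variables and equalities of the r-round Sherali-Adams
    relaxation: one distribution mu_S over assignments to each set S of at
    most r variables, with consistent marginals.  Each equality is
    (lhs_terms, rhs), where rhs is either the constant 1 or a list of LP
    variables; an LP variable mu_S(a) is represented as the pair (S, a)."""
    variables, equalities = [], []
    sets = [S for size in range(1, r + 1) for S in combinations(range(n), size)]
    for S in sets:
        assignments = list(product(domain, repeat=len(S)))
        variables += [(S, a) for a in assignments]
        # mu_S is a probability distribution: sum_a mu_S(a) = 1.
        equalities.append(([(S, a) for a in assignments], 1))
    # Consistency: for T a subset of S, marginalizing mu_S onto T gives mu_T.
    # (Consistency of mu_S and mu_T on S ∩ T follows by marginalizing both.)
    for S in sets:
        for T in (T for size in range(1, len(S)) for T in combinations(S, size)):
            idx = [S.index(v) for v in T]
            for b in product(domain, repeat=len(T)):
                lhs = [(S, a) for a in product(domain, repeat=len(S))
                       if tuple(a[i] for i in idx) == b]
                equalities.append((lhs, [(T, b)]))
    return variables, equalities

vars_, eqs = sherali_adams_constraints(n=4, r=2)
print(len(vars_), len(eqs))   # number of LP variables and equalities
```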

Our results
Sherali-Adams LP-based proofs of known results:
– The O(1/ε^2)-round SA LP relaxation gives a (1-ε)-approximation to dense or locally dense Max-kCSP, and to Max-kCSP with global cardinality constraints such as MaxBisection.
– The O(log n/ε^2)-round SA LP relaxation gives a (1-ε)-approximation to dense or locally dense Max-kAP.
New algorithms:
– A quasi-PTAS for Max-k-HypergraphIsomorphism when one hypergraph is dense and the other one is locally dense.

Our techniques
– Solve the Sherali-Adams LP relaxation for sufficiently many rounds (Ω(1/ε^2) or Ω((log n)/ε^2)).
– Apply a randomized conditioning operation to bring down the pairwise correlations.
– Independent rounding for Max-kCSP.
– A special rounding for Max-kAP.

Conditioning operation
Randomly choose v from V and sample a ~ μ_{v}.
For each local distribution μ_{v1, …, vr}, generate the new local distribution μ_{v1, …, vr} | v=a.
An r-round SA solution becomes an (r-1)-round SA solution.
Essentially from [RT12]: after t = Ω(1/ε^2) steps of conditioning, on average, μ_{v1, …, vk} is only ε-far from the product distribution μ_{v1} × … × μ_{vk}.
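
A sketch of one conditioning step under an illustrative data layout (each local distribution stored as a dict keyed by a sorted tuple of variables; this representation is an assumption made for the example, not the paper's):

```python
import random

def condition_once(mu, rng=random):
    """One conditioning step on a Sherali-Adams solution.

    mu maps each sorted tuple of variables S to a dict sending an assignment
    tuple over S to its probability mu_S(assignment).  Pick a random variable
    v, sample a ~ mu_{v}, and condition every larger local distribution on
    v = a.  The returned distributions cover all sets not containing v, i.e.
    an (r-1)-round solution with v fixed to a."""
    variables = sorted({u for S in mu for u in S})
    v = rng.choice(variables)
    values, probs = zip(*mu[(v,)].items())
    a = rng.choices(values, weights=probs)[0][0]     # sample a ~ mu_{v}
    pa = mu[(v,)][(a,)]                              # P[v = a], assumed > 0

    new_mu = {}
    for S, dist in mu.items():
        if v not in S or len(S) == 1:
            continue
        i = S.index(v)
        T = S[:i] + S[i + 1:]                        # S with v removed
        new_mu[T] = {x[:i] + x[i + 1:]: p / pa
                     for x, p in dist.items() if x[i] == a}
    return v, a, new_mu
```

Repeating this step Ω(1/ε^2) times is the correlation-reduction phase the next slide relies on.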

Independent rounding for Max-kCSP
After Ω(1/ε^2) steps of conditioning, on average, μ_{v1, …, vk} is only ε-far from μ_{v1} × … × μ_{vk}.
Sample each v independently from μ_{v}: for each constraint, the probability of satisfaction under this product distribution is (on average) within ε of its LP value, so the expected number of satisfied constraints is at least [LP value] – ε·|C|.
Because the instance is dense, |C| = O(n^k) while the optimum is Ω(n^k), so this is a (1-O(ε))-(multiplicative) approximation.
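
The rounding itself is just independent sampling from the conditioned singleton marginals; a short sketch using the same illustrative representation as above:

```python
import random

def independent_rounding(mu, variables, rng=random):
    """Independent rounding for Max-kCSP: sample each variable independently
    from its singleton marginal mu_{v} of the conditioned SA solution."""
    assignment = {}
    for v in variables:
        values, probs = zip(*mu[(v,)].items())
        assignment[v] = rng.choices(values, weights=probs)[0][0]
    return assignment
```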

Rounding for Max-kAP
Independent sampling does not work: the objective value is good, but the resulting assignment might not be a permutation because of collisions.
Our special rounding:
– View {μ_{v}(w)}_{v,w} as a doubly stochastic matrix, and therefore as a distribution over permutations (see the decomposition sketch below).
– Distribution supported on one permutation? Done ✔
– Two permutations? Merge them.
– Even more permutations? Pick an arbitrary two, merge them, and iterate.
A similar operation appears in [AFK02].
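
The first step above can be made concrete with a standard greedy Birkhoff–von Neumann decomposition (a generic sketch, not the paper's procedure), which writes the matrix of singleton marginals as an explicit convex combination of permutations using SciPy's assignment solver:

```python
import numpy as np
from scipy.optimize import linear_sum_assignment

def birkhoff_decompose(M, tol=1e-9):
    """Greedily write a doubly stochastic matrix M (M[v][w] = mu_{v}(w)) as a
    convex combination of permutation matrices.  Returns a list of
    (weight, perm) pairs with perm[v] = position assigned to v."""
    M = np.array(M, dtype=float)
    n = M.shape[0]
    pieces, remaining = [], 1.0
    while remaining > tol:
        # Find a permutation supported on the positive entries of M
        # (one exists by Birkhoff's theorem); zero entries get a huge cost.
        cost = np.where(M > tol, -M, 1e6)
        rows, cols = linear_sum_assignment(cost)
        perm = np.empty(n, dtype=int)
        perm[rows] = cols
        w = M[rows, cols].min()          # peel off the bottleneck weight
        pieces.append((w, perm))
        M[rows, cols] -= w
        remaining -= w
    return pieces

# Example: a 3x3 doubly stochastic matrix decomposed into two permutations.
M = [[0.5, 0.5, 0.0], [0.5, 0.0, 0.5], [0.0, 0.5, 0.5]]
for w, perm in birkhoff_decompose(M):
    print(w, perm)
```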

Merging two permutations
1. View the two permutations as disjoint cycles.
2. Break long cycles (length > n^{1/2}) into short ones (length ≤ n^{1/2}).
3. In each cycle, choose Permutation 1 or Permutation 2 independently.
Analysis:
– Step 2 modifies O(n^{1/2}) entries of Permutation 2, affecting only an O(n^{-1/2})-fraction of the constraints.
– Step 3: by independence, the expected value of every constraint whose variables come from distinct cycles is preserved – and all but an n^{-1/2}-fraction of the constraints are of this kind.
– Conclusion: we obtain a permutation whose objective value is at least (1 – O(n^{-1/2})) · [value of independent sampling] ≥ (1 – O(n^{-1/2}))(1 – O(ε)) · [LP value].
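
The fact behind step 3 — decomposing the positions into the cycles of π2 ∘ π1^{-1} and letting every cycle take π1 or π2 wholesale keeps the result a bijection — can be sketched as follows (the cycle-breaking of step 2 is omitted; this is an illustration, not the paper's code):

```python
import random

def merge_permutations(pi1, pi2, rng=random):
    """Randomly combine two permutations pi1, pi2 (lists with pi[v] = the
    0-indexed position of v) into one permutation.  Decompose the positions
    into the cycles of sigma = pi2 o pi1^{-1}; within each cycle, all
    elements take pi1 or all take pi2 (chosen independently), which keeps
    the result a bijection."""
    n = len(pi1)
    inv1 = [0] * n
    for v, p in enumerate(pi1):
        inv1[p] = v
    sigma = [pi2[inv1[p]] for p in range(n)]     # sigma acts on positions
    merged, seen = [0] * n, [False] * n
    for start in range(n):
        if seen[start]:
            continue
        cycle, p = [], start                     # trace one cycle of sigma
        while not seen[p]:
            seen[p] = True
            cycle.append(p)
            p = sigma[p]
        use_pi2 = rng.random() < 0.5             # pick pi1 or pi2 for this cycle
        for p in cycle:
            v = inv1[p]                          # element placed at p by pi1
            merged[v] = pi2[v] if use_pi2 else pi1[v]
    return merged

# Example: the identity and a product of two swaps; each 2-cycle is decided
# independently, and every outcome is a valid permutation.
print(merge_permutations([0, 1, 2, 3], [1, 0, 3, 2]))
```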

Future directions
Can we solve the Sherali-Adams LP faster (as in [GS12]) to get a PTAS for dense assignment problems?

Thanks