Finding Almost-Perfect Graph Bisections Venkatesan Guruswami (CMU) Yury Makarychev (TTI-C) Prasad Raghavendra (Georgia Tech) David Steurer (MSR) Yuan Zhou (CMU)

Bipartite graph recognition Depth-first search / breadth-first search. With some noise? Given a bipartite graph with 1% noisy edges, can we remove a small fraction of edges (say 10%) to get a bipartite graph? That is, can we divide the vertices into two parts so that 90% of the edges go across the two parts?
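The noiseless version of the problem is exactly 2-coloring by BFS/DFS. A minimal sketch (the helper name `two_color` and the adjacency-dict format are illustrative, not from the talk):

```python
from collections import deque

def two_color(adj):
    """Try to 2-color an undirected graph given as {vertex: set(neighbors)}.
    Returns a color map if the graph is bipartite, else None."""
    color = {}
    for start in adj:
        if start in color:
            continue
        color[start] = 0
        queue = deque([start])
        while queue:
            u = queue.popleft()
            for v in adj[u]:
                if v not in color:
                    color[v] = 1 - color[u]
                    queue.append(v)
                elif color[v] == color[u]:
                    return None  # odd cycle found: not bipartite
    return color
```

With noisy edges this breaks down immediately: a single bad edge creates an odd cycle, which is why the robust version needs entirely different techniques.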

MaxCut G=(V,E). cut(A, B) = edges(A, B) / |E|, where B = V - A. An edge (i, j) is "on the cut" if exactly one of i, j is in A. MaxCut: find A and B = V - A such that cut(A, B) is maximized. Bipartite graph recognition asks: is MaxCut = 1? Robust bipartite graph recognition: given MaxCut ≥ 0.99, find cut(A, B) ≥ 0.9. (Figure: a five-edge example with cut(A, B) = 4/5.)

c vs. s approximation for MaxCut: given a graph with MaxCut value at least c, can we find a cut of value at least s? Robust bipartite graph recognition, i.e. given MaxCut ≥ 0.99, find cut(A, B) ≥ 0.9, is exactly 0.99 vs. 0.9 approximation: "approximating almost-perfect MaxCut".

Robust bipartite graph recognition Task: given MaxCut ≥ 0.99, find cut(A, B) ≥ 0.9. We can always find cut(A, B) ≥ 1/2: assign each vertex +1 or -1 uniformly at random; for any edge (i, j), E[(1 - x_i x_j)/2] = 1/2.
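The expectation argument can be checked exhaustively on tiny graphs: averaging the cut fraction over all ±1 assignments gives exactly 1/2, since each edge is cut in half of the assignments. A small illustrative script (the function name is ours, not the talk's):

```python
from itertools import product

def average_cut_fraction(n, edges):
    """Average, over all 2^n assignments of ±1 to n vertices, of the
    fraction of cut edges.  Each edge (i, j) is cut in exactly half of
    the assignments, so this average is always 1/2."""
    total = 0.0
    for x in product((-1, 1), repeat=n):
        cut = sum(1 for i, j in edges if x[i] != x[j])
        total += cut / len(edges)
    return total / 2 ** n
```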

Robust bipartite graph recognition (cont'd) Task: given MaxCut ≥ 0.99, find cut(A, B) ≥ 0.9. We can always find cut(A, B) ≥ 1/2. Better than 1/2? DFS/BFS/greedy? Linear programming? No combinatorial algorithm was known until very recently [KS11], and natural LPs have big integrality gaps [VK07, STT07, CMM09].

Robust bipartite graph recognition (cont'd) Task: given MaxCut ≥ 0.99, find cut(A, B) ≥ 0.9. We can always find cut(A, B) ≥ 1/2. Better than 1/2? The GW semidefinite programming relaxation [GW95]: a 0.878-approximation. Given MaxCut ≥ 1 - ε, it finds a cut of value ≥ 1 - O(√ε), a (1 - ε) vs. (1 - O(√ε)) approximation, tight under the Unique Games Conjecture [Kho02, KKMO07, MOO10].

Robust satisfiability algorithms Given an instance that becomes satisfiable after removing an ε fraction of constraints, make the instance satisfiable by removing only a g(ε) fraction of constraints, where g(ε) -> 0 as ε -> 0. Examples: a (1 - ε) vs. (1 - O(√ε)) algorithm for MaxCut [GW95], and (1 - ε) vs. (1 - g(ε)) algorithms for Max2SAT and MaxHorn3SAT [Zwick98].

MaxBisection G = (V, E). Objective: maximize cut(A, B), subject to B = V - A and |A| = |B|.

MaxBisection (cont'd) Approximating MaxBisection? No easier than MaxCut. Reduction: take two copies of the MaxCut instance; a max bisection of the doubled graph induces a max cut of the original.

MaxBisection (cont'd) Approximating MaxBisection? No easier than MaxCut. Strictly harder than MaxCut? Approximation ratios: 0.6514 [FJ97], 0.699 [Ye01], 0.7016 [HZ02], 0.7027 [FL06]. Approximating almost-perfect solutions? Not known.

Finding almost-perfect MaxBisection Question. Is there a (1 - ε) vs. (1 - g(ε)) approximation algorithm for MaxBisection, where g(ε) -> 0 as ε -> 0? Answer: yes. Our results. Theorem. There is a (1 - ε) vs. (1 - g(ε)) approximation algorithm for MaxBisection. Theorem. Given a satisfiable MaxBisection instance, it is easy to find an almost-perfect (.49, .51)-balanced cut.

Extension to MinBisection Minimize edges(A, B)/|V|, subject to B = V - A and |B| = |A|. Our results. Theorem. There is an analogous approximation algorithm for MinBisection. Theorem. Given a MinBisection instance of small value, it is easy to find a (.49, .51)-balanced cut of comparably small value.

The rest of this talk... Previous algorithms for MaxBisection, and the proof of: Theorem. There is a (1 - ε) vs. (1 - g(ε)) approximation algorithm for MaxBisection.

Previous algorithms for MaxBisection

The GW algorithm for (almost-perfect) MaxCut [GW95] MaxCut objective: maximize (1/|E|) Σ_(i,j)∈E (1 - x_i x_j)/2 over x ∈ {-1, 1}^V. SDP relaxation: maximize (1/|E|) Σ_(i,j)∈E (1 - ⟨v_i, v_j⟩)/2 subject to ‖v_i‖ = 1. Hence SDP ≥ MaxCut. Example: on the triangle, MaxCut = 2/3, while SDP = 3/4 > MaxCut.

The "rounding" algorithm Lemma. We can (in poly time) get a cut of value 1 - O(√ε) when SDP ≥ 1 - ε. Algorithm. Choose a random hyperplane through the origin; the hyperplane divides the vectors, and hence the vertices, into two parts.

The "rounding" algorithm (cont'd) Lemma. We can (in poly time) get a cut of value 1 - O(√ε) when SDP ≥ 1 - ε. Analysis. SDP ≥ 1 - ε implies that for most edges (i, j), the SDP contribution (1 - ⟨v_i, v_j⟩)/2 is large. Claim. If (1 - ⟨v_i, v_j⟩)/2 ≥ 1 - δ, then Pr[the hyperplane separates v_i, v_j] ≥ 1 - O(√δ). Therefore the random hyperplane cuts many edges in expectation.

The "rounding" algorithm (cont'd) Claim. If (1 - ⟨v_i, v_j⟩)/2 ≥ 1 - δ, then Pr[the hyperplane separates v_i, v_j] ≥ 1 - O(√δ). Proof. v_i and v_j are separated by the hyperplane with probability θ_ij/π, where θ_ij is the angle between them; the hypothesis gives ⟨v_i, v_j⟩ ≤ -1 + 2δ, so θ_ij ≥ π - O(√δ), and v_i, v_j are not separated with probability at most O(√δ).
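Hyperplane rounding itself is easy to sketch once the SDP vectors are in hand. The snippet below skips the SDP solve and hand-codes an optimal embedding of a 4-cycle (adjacent vertices get antipodal vectors), for which a random hyperplane almost surely separates every edge; the function names are illustrative:

```python
import random

def hyperplane_round(vectors):
    """Round unit vectors to a ±1 cut by the sign of a random Gaussian projection."""
    dim = len(next(iter(vectors.values())))
    g = [random.gauss(0.0, 1.0) for _ in range(dim)]
    return {v: 1 if sum(gi * vi for gi, vi in zip(g, vec)) >= 0 else -1
            for v, vec in vectors.items()}

def cut_value(assignment, edges):
    """Fraction of edges whose endpoints get different signs."""
    return sum(1 for i, j in edges if assignment[i] != assignment[j]) / len(edges)

# Optimal SDP embedding of a 4-cycle: adjacent vertices get antipodal vectors,
# so a random hyperplane almost surely separates every edge.
vecs = {0: (1.0, 0.0), 1: (-1.0, 0.0), 2: (1.0, 0.0), 3: (-1.0, 0.0)}
edges = [(0, 1), (1, 2), (2, 3), (3, 0)]
```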

Known algorithms for MaxBisection The standard SDP (used by all the previous algorithms): the MaxCut SDP plus the bisection condition Σ_i v_i = 0. It gives non-trivial approximation guarantees, but does not help find almost-perfect MaxBisections.

Known algorithms for MaxBisection (cont'd) The standard SDP (used by all the previous algorithms) has an "integrality gap": there are instances with OPT < 0.9 but SDP = 1.

Known algorithms for MaxBisection (cont'd) The standard SDP has an "integrality gap": instances with OPT < 0.9 but SDP = 1. Why is this bad news for the SDP? There are instances with OPT > 1 - ε and SDP > 1 - ε, and also instances with OPT < 0.9 and SDP > 1 - ε. So the SDP cannot tell whether an instance is almost satisfiable (OPT > 1 - ε) or not.

Our approach

Theorem. There is a (1 - ε) vs. (1 - g(ε)) approximation algorithm for MaxBisection.

A simple fact Fact. A (1/2 - γ, 1/2 + γ)-balanced cut of value 1 - δ yields a bisection of value 1 - δ - O(γ). Proof. Get the bisection by moving a γ fraction of random vertices from the large side to the small side; the fraction of cut edges affected is at most O(γ) in expectation. Consequence: we only need to find almost-bisections.
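The rebalancing step of the proof can be sketched directly (assuming |V| is even; the function name is ours):

```python
import random

def rebalance(A, B):
    """Move random vertices from the larger side until |A| == |B|.
    Each moved vertex can lose at most its own incident cut edges, so a
    gamma-imbalanced cut loses only an O(gamma) fraction of cut value
    in expectation."""
    A, B = set(A), set(B)
    while len(A) > len(B):
        v = random.choice(sorted(A))
        A.remove(v); B.add(v)
    while len(B) > len(A):
        v = random.choice(sorted(B))
        B.remove(v); A.add(v)
    return A, B
```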

Almost-perfect MaxCuts on expanders λ-expander: for each S ⊆ V with vol(S) ≤ vol(V)/2, we have edges(S, V - S) ≥ λ · vol(S), where vol(S) is the total degree of the vertices in S.

Almost-perfect MaxCuts on expanders (cont'd) λ-expander: for each S ⊆ V with vol(S) ≤ vol(V)/2, we have edges(S, V - S) ≥ λ · vol(S). Key Observation. The (volume of the) difference between two cuts of value ≥ 1 - ε on a λ-expander is at most an O(ε/λ) fraction. Proof idea. Let the cuts be (A, B) and (C, D), and let S = (A ∩ D) ∪ (B ∩ C) be their difference. Every edge leaving S is uncut by one of the two cuts, so edges(S, V - S) ≤ 2ε|E|, and expansion forces vol(S) ≤ 2ε|E|/λ.

Almost-perfect MaxCuts on expanders (cont'd) λ-expander: for each S ⊆ V with vol(S) ≤ vol(V)/2, we have edges(S, V - S) ≥ λ · vol(S). Key Observation. The (volume of the) difference between two almost-perfect cuts on a λ-expander is at most O(ε/λ). Hence approximating almost-perfect MaxBisection on expanders is easy: just run the GW algorithm to find a near-maximal cut, which can differ only slightly from the optimal bisection.

The algorithm (sketch) Step 1: decompose the graph into expanders, discarding all the inter-expander edges. Step 2: approximate OPT's behavior on each expander by finding a MaxCut (GW), discarding all the uncut edges. Step 3: combine the cuts on the expanders, taking one side from each cut (subset sum), to get an almost-bisection.

Expander decomposition Cheeger's inequality. If the graph is not a λ-expander, we can efficiently find a cut of sparsity O(√λ). Corollary. A graph can be efficiently decomposed into λ-expanders by removing only a small fraction of the edges. Proof. If the graph is not an expander, split it into smaller parts by a sparse cut (Cheeger's inequality), and process the parts recursively.

The algorithm Decompose the graph into λ-expanders, losing a small fraction of the edges. Apply the GW algorithm on each expander to approximate OPT; the cuts GW finds on these expanders differ little from the behavior of OPT, so we lose another small fraction of the edges. Combine the cuts on the expanders (subset sum): an almost-balanced cut of value close to OPT, and hence (by the simple fact) a bisection of value close to OPT.
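The subset-sum combination in the last step can be sketched as a standard dynamic program over achievable left-side sizes, with each piece's cut given as a pair (|A_i|, |B_i|) that may be oriented either way (names are ours):

```python
def combine_cuts(pieces):
    """Orient each piece's two-sided cut (a_i or b_i goes to the left side)
    so that the left side's total size is as close as possible to half of
    all vertices.  Standard subset-sum DP over reachable left-side sizes."""
    n = sum(a + b for a, b in pieces)
    reachable = {0: []}                   # left size -> chosen orientations
    for a, b in pieces:
        nxt = {}
        for s, choice in reachable.items():
            for side, extra in ((0, a), (1, b)):
                t = s + extra
                if t not in nxt:
                    nxt[t] = choice + [side]
        reachable = nxt
    best = min(reachable, key=lambda s: abs(2 * s - n))
    return best, reachable[best]
```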

Short story. Proved so far: the Theorem with a weaker bound g(ε). Will prove next: Theorem. There is a (1 - ε) vs. (1 - g(ε)) approximation algorithm for MaxBisection, with the stronger bound.

Eliminating the factor Recall. We only need to find almost-bisections (cuts close to a bisection). Observation. Subset sum is "flexible with small items": making the small items more biased does not change the solution too much. Example: the pieces (101, 304), (397, 201), (8, 0), (3, 5), (8, 0), (6, 2), (6, 0), (5, 1), (5, 0), (3, 2) can be combined into the balanced sum (515, 515); after replacing the small pieces by more biased versions such as (8, 0), (0, 8), (6, 0), (5, 0), we can still reach a nearly balanced sum such as (517, 513).

Eliminating the factor (cont'd) However, making small items more balanced might be a bad idea: from (200, 0) plus 100 copies of (0, 2) we can reach the balanced sum (200, 200), but after replacing the small items by the balanced (1, 1) the only reachable sum is (300, 100).
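The "flexible with small items" observation corresponds to a simple greedy rule: orient each small piece toward whichever side is currently lighter, so the final imbalance is at most the size of the largest small piece. A sketch on the slide's own numbers (the function name is ours):

```python
def greedy_balance(start, small_items):
    """Starting from the fixed totals of the large pieces, orient each
    small piece (a, b) so that its bigger half lands on the currently
    lighter side.  Final imbalance <= size of the largest small piece."""
    left, right = start
    for a, b in small_items:
        hi, lo = max(a, b), min(a, b)
        if left <= right:
            left, right = left + hi, right + lo
        else:
            left, right = left + lo, right + hi
    return left, right
```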

Eliminating the factor (cont'd) Idea. Terminate the decomposition process early: decompose the graph into λ-expanders (large items) or subgraphs on few vertices (small items). Corollary. Only a small fraction of the edges need to be discarded. Lemma. We can find an almost-bisection provided the MaxCuts we get for the small sets are more biased than OPT's cuts on them.

Finding a biased MaxCut Goal: find a cut that is as biased as OPT and (almost) as good as OPT in terms of cut value. Lemma. Given G=(V,E), if there exists a cut (X, Y) of value 1 - ε, then one can find a cut (A, B) of comparable value such that (A, B) is at least as biased as (X, Y).

The algorithm Decompose the graph into λ-expanders or small parts, losing a small fraction of the edges. Apply the GW algorithm on each expander to approximate OPT, losing a small fraction of the edges that differ from OPT. Find biased MaxCuts in the small parts, losing a small fraction of the edges while ending up at most slightly less biased than OPT. Combine the cuts on the expanders and small parts (subset sum): an almost-balanced cut of value close to OPT, and hence a bisection of value close to OPT.

Finding a biased MaxCut -- A simpler task Lemma. Given G=(V,E), if there exists a cut (X, Y) of value 1 - ε, then one can find a cut (A, B) of comparable value such that (A, B) is at least as biased as (X, Y). SDP. Maximize the bias (1/|V|) Σ_i (1 + ⟨v_0, v_i⟩)/2, subject to the cut value (1/|E|) Σ_(i,j)∈E (1 - ⟨v_i, v_j⟩)/2 being at least 1 - ε, with ‖v_i‖ = ‖v_0‖ = 1. Claim. SDP ≥ |X|/|V|.

Rounding algorithm (sketch) Goal: given the SDP solution, find a cut (A, B) matching the SDP's bias and cut value. For most edges (i, j), the SDP contribution is close to 1, so v_i and v_j are almost opposite to each other: v_i ≈ -v_j.

Rounding algorithm (sketch) (cont'd) For most edges (i, j), v_i ≈ -v_j. Project all vectors onto v_0, and divide the v_0-axis into short intervals ..., I(-2), I(-1), I(1), I(2), .... Since v_i ≈ -v_j implies ⟨v_0, v_i⟩ ≈ -⟨v_0, v_j⟩, for most edges the two incident vertices fall into opposite intervals (good edges). Discard all the bad edges.

Rounding algorithm (sketch) (cont'd) Let the cut (A, B) be as follows: for each pair of opposite intervals I(k) and I(-k), let A include the one with more vertices and B include the other. Then (A, B) cuts all the good edges.

Rounding algorithm (sketch) (cont'd) Let the cut (A, B) be as above: for each pair of intervals I(k) and I(-k), A includes the one with more vertices and B includes the other. Bias analysis: for each i in I(k) and each i in I(-k), the projections ⟨v_0, v_i⟩ are nearly opposite, so putting the larger interval of each pair into A makes (A, B) at least as biased as the SDP solution.
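The interval-pairing step can be sketched as follows; the bucketing by floor(t/δ), the pairing of index k with -k-1, and the "larger bucket goes to A" rule are our rendering of the slide's I(k)/I(-k) picture:

```python
import math

def interval_round(projections, delta):
    """Bucket vertices by which length-delta interval of the v0-axis their
    projection falls into; for each pair of intervals that are mirror
    images about 0, put the larger bucket into A and the smaller into B."""
    buckets = {}
    for v, t in projections.items():
        k = math.floor(t / delta)   # interval index: I(k) = [k*delta, (k+1)*delta)
        buckets.setdefault(k, []).append(v)
    A, B = [], []
    seen = set()
    for k in list(buckets):
        if k in seen:
            continue
        opp = -k - 1                # the interval symmetric to I(k) about 0
        seen.update({k, opp})
        us, them = buckets.get(k, []), buckets.get(opp, [])
        if len(us) >= len(them):
            A += us; B += them
        else:
            A += them; B += us
    return A, B
```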

Finding a biased MaxCut Lemma. Given G=(V,E), if there exists a cut (X, Y) of value 1 - ε, then one can find a cut (A, B) of comparable value such that (A, B) is at least as biased as (X, Y). SDP. Maximize the bias, subject to the cut-value constraint and the ℓ₂²-triangle inequalities.

Future directions A (1 - ε) vs. (1 - O(√ε)) approximation, matching MaxCut? "Global conditions" for other CSPs. Balanced Unique Games?

The End. Any questions?

Eliminating the factor: another key step. Idea. Terminate the decomposition process early: decompose the graph into λ-expanders or subgraphs on few vertices. Corollary. Only a small fraction of the edges need to be discarded. Lemma. We can find an almost-bisection if the MaxCuts for the small sets are more biased than those in OPT.

Finding a biased MaxCut Lemma. Given G=(V,E), if there exists a cut (X, Y) of value 1 - ε, then one can find a cut (A, B) of comparable value such that (A, B) is at least as biased as (X, Y). SDP. Maximize the bias, subject to the cut-value constraint and the ℓ₂²-triangle inequalities. Rounding. A hybrid of hyperplane and threshold rounding.
