Guy Kindler, Weizmann Institute

About this talk. We'll try to understand some notions and their relations: combinatorial optimization problems; approximation: relaxation by semi-definite programs; integrality gaps; hardness of approximation. Main example: the Max-Cut problem.

Combinatorial optimization problems: input, search space, objective function. Example: MAX-CUT.

Example: MAX-CUT. Input: a graph G = (V,E). Search space: partitions V = (C, C^c). Objective function: w(C) = fraction of cut edges. The Max-Cut problem: find mc(G) = max_C {w(C)}. [Karp 72]: MAX-CUT is NP-complete.

Example: MAX-CUT (continued). Input: G = (V,E); search space: partitions V = (C, C^c); objective: w(C) = fraction of cut edges; goal: find mc(G) = max_C {w(C)}. An α-approximation algorithm outputs a value S such that mc(G) ≥ S ≥ α · mc(G). History: a ½-approximation is easy (a uniformly random partition cuts half the edges in expectation), and for a long time it was the best known factor.
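
To make these objects concrete, here is a minimal Python sketch (not from the talk): brute-force mc(G) on a tiny graph, plus the easy randomized ½-approximation mentioned above.

```python
# Max-Cut made concrete, a minimal sketch (not part of the original talk):
# exact mc(G) by brute force on a tiny graph, plus the easy randomized
# 1/2-approximation (a uniformly random partition cuts each edge w.p. 1/2).
from itertools import product
import random

def w(cut, edges):
    """Objective: fraction of edges crossing the partition (C, C^c)."""
    return sum(cut[u] != cut[v] for u, v in edges) / len(edges)

def mc(n, edges):
    """mc(G) = max_C w(C), by enumerating all 2^n partitions (tiny graphs only)."""
    return max(w(bits, edges) for bits in product([0, 1], repeat=n))

def random_cut(n, edges, seed=0):
    """The easy 1/2-approximation: E[w(C)] = 1/2 >= mc(G)/2."""
    random.seed(seed)
    return w([random.randint(0, 1) for _ in range(n)], edges)

edges = [(0, 1), (1, 2), (2, 3), (3, 4), (4, 0)]   # the 5-cycle
print("mc(C_5) =", mc(5, edges))                   # 0.8 (4 of 5 edges)
print("one random cut:", random_cut(5, edges))
```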

Semi-definite Relaxation Introducing geometry into combinatorial optimization [GW 95]

Arithmetization. Assign a variable x_u ∈ {-1, 1} to each vertex u of G = (V,E); an edge (u,v) is cut exactly when x_u x_v = -1, so mc(G) is the maximum of a quadratic function of the x_u's. Problem: we can't maximize quadratic functions, even over convex domains.

Relaxation by geometric embedding. Replace each variable x_u ∈ {-1, 1} by a unit vector x_u on the sphere, and each product x_u x_v by the inner product ⟨x_u, x_v⟩. Problem: we can't maximize quadratic functions, even over convex domains.

Relaxation by geometric embedding (unit sphere in R^n). Now we're maximizing a linear function over a convex domain! This is the semi-definite relaxation; call its optimum rmc(G).

Relaxation by geometric embedding (unit sphere in R^n). Is this really a relaxation? Yes: any cut (C, C^c) corresponds to assigning one of two antipodal unit vectors to each vertex, so every combinatorial solution is also a vector solution and rmc(G) ≥ mc(G).
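
Written out in formulas (a reconstruction of what the figures show, in the slides' notation):

```latex
% Arithmetization over {-1,1} and its vector (semi-definite) relaxation
\[
  mc(G) \;=\; \max_{x\in\{-1,1\}^V}\;
     \frac{1}{|E|}\sum_{(u,v)\in E}\frac{1-x_u x_v}{2},
  \qquad
  rmc(G) \;=\; \max_{\substack{x_u\in\mathbb{R}^n\\ \|x_u\|=1}}\;
     \frac{1}{|E|}\sum_{(u,v)\in E}\frac{1-\langle x_u,x_v\rangle}{2}.
\]
% Every +-1 assignment is a unit-vector assignment along a fixed axis,
% so rmc(G) >= mc(G).
```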

Analysis by randomized rounding (unit sphere in R^n): cut the vector solution {x_u} by a uniformly random hyperplane through the origin; each vertex joins the side on which its vector falls.

Analysis by randomized rounding. An edge (u,v) is cut with probability arccos(⟨x_u, x_v⟩)/π, its "donation" to the expected cut value (*). So E[w(C)] = (1/|E|) Σ_(u,v)∈E arccos(⟨x_u, x_v⟩)/π ≥ α_GW · (1/|E|) Σ_(u,v)∈E (1 - ⟨x_u, x_v⟩)/2 = α_GW · rmc(G).

Analysis by randomized rounding. The l.h.s. is tight iff all inner products equal ρ*. So mc(G) ≥ α_GW · rmc(G) ≥ α_GW · mc(G).
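
For reference, α_GW here is the Goemans-Williamson constant, the worst case of the per-edge comparison above (not spelled out on the slide):

```latex
% The Goemans-Williamson constant: worst-case ratio of the rounding
% probability arccos(rho)/pi to the relaxed edge value (1-rho)/2.
\[
  \alpha_{GW} \;=\; \min_{-1 \le \rho < 1}
     \frac{\arccos(\rho)/\pi}{(1-\rho)/2} \;\approx\; 0.87856,
  \qquad
  \rho^{*} \;=\; \operatorname*{arg\,min}_{\rho}\,
     \frac{\arccos(\rho)/\pi}{(1-\rho)/2} \;\approx\; -0.689 .
\]
```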

An approximation for Max-Cut. The [GW 95] algorithm: given G, compute rmc(G); let S = α_GW · rmc(G); output S. Then mc(G) ≥ S ≥ α_GW · mc(G). Is α_GW the best constant here? The analysis is tight iff all inner products equal ρ*; is there a graph where this occurs?
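
A minimal NumPy sketch of the rounding step (not from the talk); it assumes the SDP vectors are already available, here hand-picked for a 5-cycle:

```python
# Hyperplane rounding from [GW 95], a minimal NumPy sketch.
# It assumes the semi-definite program has already been solved and that
# vecs[u] is the unit vector assigned to vertex u; here the vectors are
# hand-picked for a 5-cycle just to illustrate the rounding step.
import numpy as np

def cut_fraction(edges, side):
    """w(C): fraction of edges whose endpoints lie on different sides."""
    return sum(side[u] != side[v] for u, v in edges) / len(edges)

def gw_round(vecs, edges, trials=100, seed=0):
    """Cut the sphere by random hyperplanes through the origin; each
    edge (u, v) is cut with probability arccos(<x_u, x_v>)/pi."""
    rng = np.random.default_rng(seed)
    best = 0.0
    for _ in range(trials):
        r = rng.standard_normal(vecs.shape[1])  # normal of the random hyperplane
        side = vecs @ r >= 0                    # which side each vertex falls on
        best = max(best, cut_fraction(edges, side))
    return best

edges = [(0, 1), (1, 2), (2, 3), (3, 4), (4, 0)]   # the 5-cycle
angles = 4 * np.pi * np.arange(5) / 5              # vectors at angles 4*pi*k/5
vecs = np.stack([np.cos(angles), np.sin(angles)], axis=1)
sdp_value = np.mean([(1 - vecs[u] @ vecs[v]) / 2 for u, v in edges])
print("relaxed value of these vectors:", round(sdp_value, 3))      # ~0.905
print("best rounded cut over 100 trials:", gw_round(vecs, edges))  # 0.8
```

Note how the example already shows a gap: the vectors give relaxed value about 0.905, while no actual cut of the 5-cycle exceeds 4/5.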

Integrality gap Measuring the quality of the geometric approximation [FS 96]

The integrality gap of Max-Cut. On instance G: [figure: the value axis w, on which the search space S(G) reaches up to mc(G) while the relaxed search space r-S(G) reaches up to rmc(G)].

The integrality gap of Max-Cut. On every instance G, mc(G) ≥ α_GW · rmc(G). The [GW 95]-style algorithm: given G, output IG · rmc(G), where IG = inf_G mc(G)/rmc(G) is the integrality gap; the ratio equals α_GW for some graph G*.

The integrality gap of Max-Cut. Using α · rmc(G) to approximate mc(G), the factor α = IG cannot be improved! And yet on G* itself the algorithm computes mc(G*) perfectly (it outputs IG · rmc(G*) = mc(G*)).

The Feige-Schechtman graph G* (unit sphere in R^q). Vertices: the sphere S^n; edges: u ~ v iff ⟨u, v⟩ = ρ*. Then rmc(G*) ≥ (1 - ρ*)/2 (the vertices themselves form a feasible vector solution), and [FS]: mc(G*) = arccos(ρ*)/π, so mc(G*)/rmc(G*) = α_GW.
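
Written out, the gap computation on this slide (with α_GW and ρ* as defined earlier):

```latex
% The Feige-Schechtman instance matches the GW constant exactly
\[
  rmc(G^*) \;\ge\; \frac{1-\rho^*}{2}, \qquad
  mc(G^*) \;=\; \frac{\arccos(\rho^*)}{\pi} \;\;\text{[FS 96]},
\]
\[
  \text{so}\quad
  \frac{mc(G^*)}{rmc(G^*)} \;\le\;
  \frac{\arccos(\rho^*)/\pi}{(1-\rho^*)/2} \;=\; \alpha_{GW},
  \qquad\text{hence } IG = \alpha_{GW}.
\]
```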

From IG to hardness A geometric trick may actually be inherent [KKMO 05]

Thoughts about integrality gaps. The integrality gap is an artifact of the relaxation: an algorithm does great on an instance for which the integrality gap is achieved. And yet, sometimes the integrality gap is provably* the best approximation factor achievable: [KKMO 04]: under the UGC, α_GW is optimal for Max-Cut; see also [HK 03], [HKKSW], [KO 06], [ABHS 05], [AN 02]. And the IG instance G* is used in the hardness proof. (*Under some reasonable complexity-theoretic assumptions.)

A recipe for proving hardness. Take the instance G*: its search space S(G*) reaches value mc(G*), while its relaxed search space r-S(G*) reaches rmc(G*).

A recipe for proving hardness. Add teeth to S(G*) which achieve rmc(G*).

A recipe for proving hardness. Now combine several instances, such that finding a point which belongs to all teeth becomes a hard combinatorial optimization problem.

Determining whether mc(G') = mc(G*) or whether mc(G') = rmc(G*) is intractable. Factor of hardness: mc(G*)/rmc(G*) = IG!

Adding teeth to Feige-Schechtman (unit sphere in R^q). Vertices: S^n; edges: u ~ v iff ⟨u, v⟩ = ρ*.

Adding teeth to Feige-Schechtman. Replace the sphere by the discrete cube: vertices {-1,1}^n, with edges again corresponding to inner product ρ*. Concretely, a random edge (x, y) takes x uniform in {-1,1}^n and y_i = x_i with probability ½ + ½ρ, y_i = -x_i with probability ½ - ½ρ, so that E[x_i y_i] = ρ for every i.

Adding teeth to Feige-Schechtman. Is mc = arccos(ρ)/π? For the dictator cut C(x) = x_7, w(C) = P_edge[x_7 ≠ y_7] = (1-ρ)/2 !! But for C(x) = sign(Σ_i x_i) = Maj(x), which has no influential coordinates, w(C) = P[Maj(x) ≠ Maj(y)] ≈ (arccos ρ)/π.

Adding teeth to Feige-Schechtman. Is mc = arccos(ρ)/π? For an axis-parallel (dictator) cut, w(C) = (1-ρ)/2. [MOO 05]: if every coordinate influence is small, I_i(C) < τ for all i, then w(C) ≤ (arccos ρ)/π + o_τ(1). The ratio between the weight of teeth cuts and regular cuts is α_GW (for ρ = ρ*).
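
A quick Monte-Carlo sanity check of these two cut weights on the ρ-correlated cube (a sketch, not from the talk; ρ* ≈ -0.689 is the GW minimizer mentioned earlier):

```python
# Monte-Carlo check on the rho-correlated cube (sketch, not from the slides):
# the dictator cut x_7 has weight (1-rho)/2, while Majority has weight
# about arccos(rho)/pi (up to a small finite-n correction).
import numpy as np

rng = np.random.default_rng(1)
n, samples, rho = 101, 100_000, -0.689          # rho* ~ -0.689 for Max-Cut

x = rng.choice([-1, 1], size=(samples, n))       # uniform x in {-1,1}^n
flip = rng.random((samples, n)) < (0.5 - 0.5 * rho)   # P[y_i = -x_i]
y = np.where(flip, -x, x)                        # rho-correlated partner

dictator = np.mean(x[:, 7] != y[:, 7])                       # cut C(x) = x_7
majority = np.mean(np.sign(x.sum(1)) != np.sign(y.sum(1)))   # cut C(x) = Maj(x)

print("dictator cut:", dictator, " vs (1-rho)/2 =", (1 - rho) / 2)
print("majority cut:", majority, " vs arccos(rho)/pi =", np.arccos(rho) / np.pi)
```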

Conclusion. We tried to understand some notions and their relations: combinatorial optimization problems; approximation: relaxation by semi-definite programs; integrality gaps; hardness of approximation.