1
Conditional Inapproximability and Limited Independence
a doctoral thesis by Per Austrin, KTH School of Computer Science and Communication. Opponent: Ryan O’Donnell, Carnegie Mellon University.
5
Theoretical Computer Science:
[photo montage of researchers with their honors: a Turing Award, a Gödel Prize ×2, and a Gödel Prize / Royal Swedish Acad. of Sci.; plus a “?” next to the opponent, Ryan O’Donnell, Carnegie Mellon University]
6
Theoretical Computer Science:
Which algorithmic problems can be solved efficiently?
7
Problem: 3Sat. Input: a collection of constraints (“clauses”), each an OR of 3 variables or their negations. Alg’s goal: an assignment satisfying as many constraints as possible.
8
“Efficient” = “polynomial time” = # steps always ≤ n^C (for some constant C, where n is the input size).
9
Input: a 3Sat instance on n variables. Obvious algorithm: try all assignments, ≈ 2^n steps. Question: Doable in n^C steps?
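For concreteness, here is a minimal Python sketch of that obvious algorithm: brute force over all 2^n assignments of a toy 3Sat instance (the clause encoding and the instance itself are illustrative choices, not taken from the thesis).

    import itertools

    # Toy 3Sat instance: each clause is a triple of literals;
    # literal +i means x_i and literal -i means NOT x_i (variables 1..n).
    clauses = [(1, 2, 3), (-1, 2, -3), (1, -2, 3), (-1, -2, -3)]
    n = 3

    def satisfied(clause, assignment):
        # A clause is satisfied if at least one of its literals is true.
        return any(assignment[abs(l)] == (1 if l > 0 else 0) for l in clause)

    best = 0
    for bits in itertools.product((0, 1), repeat=n):     # all 2^n assignments
        assignment = dict(zip(range(1, n + 1), bits))
        best = max(best, sum(satisfied(c, assignment) for c in clauses))

    print(best, "of", len(clauses), "clauses satisfiable")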
10
Answer: No. Cook’s Theorem: 3Sat is “NP-hard”.
NP-hard = not doable in polynomial time, assuming “P ≠ NP”.
“P ≠ NP”: everyone knows it’s true.
11
Polynomial Time: Maximum Matching, Linear Programming, Primality, ····· (1000’s of problems)
NP-hard: 3Sat, Traveling Salesperson, Chromatic Number, ····· (1000’s of problems)
Any natural problems in between?
12
Not known to be in P or NP-hard
NP-Completeness Column [Joh05]:
1. Factoring
2. Graph Isomorphism
3. Precedence Constrained Processor Scheduling
· · · · · ?
Handbook on Algorithms and Theory of Computation [ALR99]: “The vast majority of natural problems in NP have resolved themselves as being either in P or NP-complete. Unless you uncover a specific connection to one of [the above] intermediate problems, it is more likely offhand that your problem simply needs more work.”
13
NP-hard for exact optimization: 3Sat, Traveling Salesperson, Chromatic Number, ····· (1000’s of problems)
Approximation?
14
Approximation?
95%-approximating 2Sat?
90%-approximating 2CSP?
15%-approximating 6CSP?
Not known to be in P or to be NP-hard.
15
Results from Austrin’s Thesis
95%-approximating 2Sat? Hard.
90%-approximating 2CSP? Hard.
15%-approximating 6CSP? Hard.
16
Results from Austrin’s Thesis
95%-approximating 2Sat? Hard.*
90%-approximating 2CSP? Hard.*
15%-approximating 6CSP? Hard.*
* Not “NP-hard”, only “UG-hard”.
17
Results from Austrin’s Thesis
95%-approximating 2Sat? Hard.*
(αLLZ + ε)-approximating 2Sat? Hard.* (for any constant ε > 0)
Theorem [LLZ’02]: αLLZ-approximating 2Sat can be done in polynomial time.
18
Definition of αLLZ = …
19
This is* the approximability threshold of efficient algorithms for 2Sat!
20
Remainder of the talk:
1. Definitions
2. Statements of main results
3. Remarks about proof techniques
22
Max-CSP(P): Constraint Satisfaction Problem
(in Swedish: villkorssatisfieringsproblem). P is a predicate on k binary inputs.
24
Max-CSP(P): P : {0,1}^k → {acc, rej}, a predicate on k binary inputs.
Input: a list of “constraints”, each applying P to k of the variables (possibly negated).
25
Max-CSP(P): P : {0,1}^k → {acc, rej}.
Examples:
Max-kSat: P = “OR_k”
Max-kLin: P = “XOR_k”
Max-kAND: P = “AND_k”
Max-kCSP: any mix of k-ary predicates
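A small Python sketch making these predicates concrete; the constraint encoding (tuples of signed variable indices) and the helper value() are illustrative assumptions, not notation from the thesis.

    # The predicates behind the examples above, as Boolean functions on k bits.
    OR_k  = lambda x: any(x)                 # Max-kSat
    XOR_k = lambda x: sum(x) % 2 == 1        # Max-kLin
    AND_k = lambda x: all(x)                 # Max-kAND

    def value(P, constraints, assignment):
        # Fraction of constraints accepted by P.  A constraint is a tuple of
        # signed variable indices; -i means variable x_i appears negated.
        def lit(l):
            v = assignment[abs(l)]
            return v if l > 0 else 1 - v
        return sum(P(tuple(lit(l) for l in c)) for c in constraints) / len(constraints)

    # A toy Max-2Sat instance on x1, x2, x3:
    constraints = [(1, 2), (-1, 3), (2, -3), (-2, -3)]
    print(value(OR_k, constraints, {1: 0, 2: 1, 3: 0}))   # 1.0 (all constraints satisfied)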
26
Max-CSP(P): Many many many natural variants exist:
- constraints have different “weights”
- negated variables not allowed
- variables are {0, 1, 2, …, q-1}-valued
- have to use values {0, …, q-1} “frugally”
Max-Cut, Vertex-Cover, Graph-Coloring, Sparsest-Cut, Max-Clique, …
27
Approximation Algorithms
An α-approximation algorithm: on input I, guaranteed to output an assignment satisfying ≥ α · Opt(I) constraints. Goal: find such poly-time algorithms, or prove that doing so is NP-hard.
28
Approximation Algorithms
Trivial approximation for Max-CSP(P): α-approximation, where α = |P⁻¹(acc)| / 2^k.
(Because choosing x1, …, xn uniformly at random satisfies an α fraction of all constraints in expectation.)
E.g.: (3/4)-approximation for Max-2Sat.
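A quick Python check of the trivial ratio α = |P⁻¹(acc)| / 2^k for a few of the predicates above (a sketch; the lambda encodings are illustrative).

    import itertools

    def trivial_ratio(P, k):
        # alpha = |P^{-1}(acc)| / 2^k: the fraction of constraints that a
        # uniformly random assignment satisfies in expectation.
        accepted = sum(P(x) for x in itertools.product((0, 1), repeat=k))
        return accepted / 2 ** k

    print(trivial_ratio(lambda x: any(x), 2))            # OR_2  -> 0.75  (Max-2Sat)
    print(trivial_ratio(lambda x: any(x), 3))            # OR_3  -> 0.875 (Max-3Sat)
    print(trivial_ratio(lambda x: sum(x) % 2 == 1, 3))   # XOR_3 -> 0.5   (Max-3Lin)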
29
Approximation Algorithms
“Max-CSP(P) is approximation-resistant” = “non-trivial approximation is NP-hard”.
E.g.: Max-3Sat is approximation-resistant [Håstad’97].
30
Pairwise Independence
Let μ be a probability distribution on {0,1}^k. We say μ is pairwise independent if the marginal on (X_i, X_j) is uniform on {0,1}^2, for all 1 ≤ i < j ≤ k, when (X_1, …, X_k) ~ μ.
(Generalizations: t-wise independence, non-uniform marginals, etc.)
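A small Python check of the definition on a standard example: the uniform distribution over the even-parity points of {0,1}^3 is pairwise independent but not 3-wise independent (the dictionary encoding below is just for illustration).

    import itertools

    def is_pairwise_independent(dist, k):
        # dist maps points of {0,1}^k to probabilities; check that every
        # pair of coordinates has the uniform marginal on {0,1}^2.
        for i, j in itertools.combinations(range(k), 2):
            for a, b in itertools.product((0, 1), repeat=2):
                p = sum(pr for x, pr in dist.items() if (x[i], x[j]) == (a, b))
                if abs(p - 0.25) > 1e-12:
                    return False
        return True

    # Uniform on the even-parity points of {0,1}^3:
    mu = {(0, 0, 0): 0.25, (0, 1, 1): 0.25, (1, 0, 1): 0.25, (1, 1, 0): 0.25}
    print(is_pairwise_independent(mu, 3))   # True (yet X3 = X1 XOR X2, so not 3-wise independent)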
31
UG-hard: A problem is said to be “UG-hard” if it is at least as hard as the “Unique-Label-Cover Problem”.
UG Conjecture [Khot’02]: “The Unique-Label-Cover Problem is NP-hard.”
Outstanding open problem in TCS, b/c we don’t “know” the answer.
(Defining the “Unique-Label-Cover Problem” now would kind of kill the rhetorical flow.)
32
Remainder of the talk:
1. Definitions
2. Statements of main results
3. Remarks about proof techniques
34
Thesis Main Results
1. 2-CSP hardness
2. Approximation-resistant k-CSPs
3. Randomly supported pairwise independence
4. A technical result I’ll mention only briefly
35
2-CSP Hardness [Result 1]
Let P : {0,1}^2 → {acc, rej}.
Let β(P) = min over positive configuration families Θ of [a somewhat complicated numerical program].
Then β(P)-approximating Max-CSP(P) is UG-hard.
36
2-CSP Hardness [Result 1]. In particular:
β(OR_2) = αLLZ = … (matching the [LLZ’02] algorithm)
β(AND_2) ≤ … (nearly matching the …-approx. algorithm for Max-CSP(AND_2) [LLZ’02])
37
More on 2-ary Max-CSP(P) [Result 1]
Let β(P) = min over positive configuration families Θ of [a somewhat complicated numerical program].
Let α(P) = min over all configuration families Θ of [a somewhat complicated numerical program].
Theorem: ∃ poly-time α(P)-approximation algorithm.
Conjecture: α(P) = β(P) for all 2-ary P.
(Recall: assuming the UG Conjecture, β(P)-approximating Max-CSP(P) is hard.)
38
Presaged… [Raghavendra’08]:
Let γ(P) = min [a very complicated numerical program]; α(P) ≤ γ(P) ≤ β(P).
Theorem: ∃ poly-time γ(P)-approximation algorithm, and also (γ(P) + ε)-approximating is UG-hard.
39
Approximation-resistant k-CSPs [Result 2, with Mossel]
Let P : {0,1}^k → {acc, rej}. Suppose ∃ a pairwise independent distribution μ on {0,1}^k such that supp(μ) ⊆ P⁻¹(acc).
Then, assuming the UG Conjecture, Max-CSP(P) is approximation-resistant.
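To see the criterion in action: the odd-parity points of {0,1}^3 support a pairwise independent distribution and all of them satisfy OR_3, so the hypothesis holds for P = OR_3, consistent with Max-3Sat being approximation-resistant [Håstad’97]. A short Python sketch checking both conditions (the encoding is illustrative).

    import itertools

    # Candidate support: the odd-parity points of {0,1}^3.
    support = [(0, 0, 1), (0, 1, 0), (1, 0, 0), (1, 1, 1)]

    def uniform_pair_marginals(pts):
        # Uniform distribution on pts: every pair of coordinates should hit
        # each of the four patterns in {0,1}^2 equally often.
        k, n = len(pts[0]), len(pts)
        return all(sum((x[i], x[j]) == (a, b) for x in pts) * 4 == n
                   for i, j in itertools.combinations(range(k), 2)
                   for a, b in itertools.product((0, 1), repeat=2))

    OR_3 = lambda x: any(x)                   # the 3Sat predicate (rejects only 000)

    print(uniform_pair_marginals(support))    # True: the uniform mu on support is pairwise independent
    print(all(OR_3(x) for x in support))      # True: supp(mu) is contained in OR_3's accepting set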
40
Approximation-resistant k-CSPs [Result 2]
Q: How small a subset of {0,1}^k can support a pairwise independent distribution?
A: RoundUp4(k) points suffice (assuming the Hadamard Conjecture).
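The small supports come from Hadamard matrices: drop the all-ones column of a Hadamard matrix of order n ≥ k+1, and the uniform distribution over its rows is pairwise independent on {0,1}^k with only n support points. Below is a hedged Python sketch using Sylvester’s construction, which only produces orders that are powers of two, so for general k it gives somewhat more than RoundUp4(k) points; the bound on the slide needs general Hadamard matrices, hence the Hadamard Conjecture.

    import itertools
    import numpy as np

    def sylvester_hadamard(m):
        # A 2^m x 2^m Hadamard matrix via Sylvester's doubling construction.
        H = np.array([[1]])
        for _ in range(m):
            H = np.block([[H, H], [H, -H]])
        return H

    def pairwise_independent_support(k):
        # Support (uniform weights) of a pairwise independent distribution on
        # {0,1}^k, of size = the smallest power of 2 that is >= k + 1.
        m = 0
        while 2 ** m < k + 1:
            m += 1
        H = sylvester_hadamard(m)
        # Drop the all-ones column, keep k columns, map +1 -> 0 and -1 -> 1.
        return [tuple(row) for row in (1 - H[:, 1:k + 1]) // 2]

    def pairwise_independent(pts):
        k, n = len(pts[0]), len(pts)
        return all(sum((x[i], x[j]) == (a, b) for x in pts) * 4 == n
                   for i, j in itertools.combinations(range(k), 2)
                   for a, b in itertools.product((0, 1), repeat=2))

    S = pairwise_independent_support(6)
    print(len(S), pairwise_independent(S))    # 8 points, True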
41
Approximation-resistant k-CSPs [Result 2]. Corollaries:
(… + ε)-UG-hardness for some 6-ary CSP
(… + ε)-UG-hardness for some 7-ary CSP
(…)-UG-hardness for some k-ary CSP
Previous best: …, …, … (NP-hardness) [ST’00]. Best algorithm: …-approximation for Max-kCSP [CMM’07].
42
Randomly supported pairwise independence [Result 3, with Håstad]
Q: Does a random subset of {0,1}^k of size S support a pairwise indep. distr.?
Thm: Yes, whp, if S ≥ C · k². No, whp, if S ≤ c · k².
43
More generally… [Result 3]
Q: Does a random subset of {0, 1, …, q-1}^k of size S support a pairwise indep. distr.?
Thm: Yes, whp, if S ≥ C(q) · k². No, whp, if S ≤ c(q) · k².
(& slightly weaker results for t-wise independence)
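Whether a given point set supports a pairwise independent distribution is a linear-programming feasibility question, so the q = 2 statement can be illustrated empirically. The sketch below assumes scipy’s LP solver is available, and the two sample sizes are arbitrary illustrative choices, not the thesis’s constants c and C.

    import itertools
    import numpy as np
    from scipy.optimize import linprog

    def supports_pairwise_independence(points):
        # LP feasibility: is there a distribution on `points` (a multiset in
        # {0,1}^k) whose marginal on every pair of coordinates is uniform?
        k = len(points[0])
        A, b = [[1.0] * len(points)], [1.0]                          # probabilities sum to 1
        for i in range(k):
            A.append([float(x[i]) for x in points]); b.append(0.5)   # E[X_i] = 1/2
        for i, j in itertools.combinations(range(k), 2):
            A.append([float(x[i] * x[j]) for x in points]); b.append(0.25)  # E[X_i X_j] = 1/4
        res = linprog(c=np.zeros(len(points)), A_eq=np.array(A), b_eq=np.array(b),
                      bounds=[(0, None)] * len(points), method="highs")
        return res.success

    k, trials, rng = 8, 20, np.random.default_rng(0)
    for S in (k * k // 4, 2 * k * k):        # well below vs. well above ~k^2
        hits = sum(supports_pairwise_independence(
                       [tuple(row) for row in rng.integers(0, 2, size=(S, k))])
                   for _ in range(trials))
        print(f"S = {S}: supported pairwise independence in {hits}/{trials} samples")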
44
Remainder of the talk:
1. Definitions
2. Statements of main results
3. Remarks about proof techniques
46
Proof remarks for Result 3
Thm: C(q) · k² random points in {0, 1, …, q-1}^k whp support a pairwise indep. distr.
Pf sketch: Need to show that a certain random convex body in ℝ^(q²k²) contains the origin whp.
Uses “hypercontractivity” to show that quadratic polynomials of discrete random variables are fairly concentrated around their expectation.
47
Proofs for Hardness Results, 1 & 2
[Håstad’97] method for showing hardness:
PCP Technology (“Label-Cover” is NP-hard) + Discrete Fourier Analysis Wizardry
48
Proofs for Hardness Results, 1 & 2
Post-2002 method for showing hardness*:
UG Conjecture [Khot’02] (“Unique-L-C is NP-hard”) + “Invariance Principle” [MOO’05, Mos’08]
(in place of PCP Technology (“Label-Cover” is NP-hard) + Discrete Fourier Analysis Wizardry)
49
Proofs for Hardness Results, 1 & 2
Post-2002 method has led to some new results:
• … UG-hardness for “Max-Cut”
• UG-hardness of C-coloring 3-colorable graphs (for all constant C)
Based on “straightforward” use of the Invariance Principle.
50
Proofs for Hardness Results, 1 & 2
Key to Austrin’s new hardness results: Heroically exploit the somewhat scary Invariance Principle to its ultimate limits. (Thesis Result 4: Preliminary work on Invariance Principle generalization.)
51
Thanks for your attention.
Time for questions?