An Algorithmic Proof of the Lopsided Lovász Local Lemma — Nick Harvey (University of British Columbia), Jan Vondrák (IBM Almaden)


1 An Algorithmic Proof of the Lopsided Lovász Local Lemma. Nick Harvey, University of British Columbia; Jan Vondrák, IBM Almaden.

2 Lovász Local Lemma. Let E_1, E_2, …, E_n be events in a discrete probability space. Let G be a graph on {1, …, n}. Let Γ(i) be the neighbors of i, and let Γ⁺(i) = Γ(i) ∪ {i}. G is called a dependency graph if, for all i, E_i is jointly independent of {E_j : j ∉ Γ⁺(i)}. Erdős-Lovász '75: Suppose |Γ⁺(i)| ≤ d and P[E_i] ≤ p for all i. If epd ≤ 1 then P[∩_i E_i^c] ≥ (1 - 1/d)^n > 0. [Figure: a dependency graph on events E_1, …, E_4, e.g., pairwise independent events.]

3 Example: k-SAT. Let φ be a k-SAT formula. Pick a uniformly random assignment to the variables, and let E_i be the event that clause i is unsatisfied. Note P[E_i] = 2^{-k} = p. Suppose that each variable appears in at most 2^k/(ek) clauses. Dependency graph: clauses that share a variable are neighbors, so |Γ⁺(i)| ≤ k · 2^k/(ek) = 2^k/e = d. Since epd ≤ 1, the LLL implies P[∩_i E_i^c] > 0, so φ is satisfiable. Eg: φ = (x_1 ∨ x_2 ∨ x_3) ∧ (x_1 ∨ x_4 ∨ x_5) ∧ (x_4 ∨ x_5 ∨ x_6) ∧ (x_2 ∨ x_3 ∨ x_6), with E_1 = "clause 1 unsatisfied", …, E_4 = "clause 4 unsatisfied".
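The arithmetic on this slide can be checked mechanically. A minimal sketch, assuming the crude neighborhood bound d = k·m when every variable appears in at most m clauses (the function name and bound are ours, not from the talk):

```python
import math

def lll_satisfiable(k: int, max_var_occurrences: int) -> bool:
    """Symmetric LLL check for a k-SAT formula in which every variable
    appears in at most max_var_occurrences clauses.
    p = 2^-k; each clause shares a variable with at most
    d = k * max_var_occurrences clauses (a crude bound, including itself)."""
    p = 2.0 ** (-k)
    d = k * max_var_occurrences
    return math.e * p * d <= 1.0

# With m = floor(2^k / (e*k)) occurrences per variable, d <= 2^k/e,
# so e*p*d <= 1 and the LLL guarantees satisfiability.
k = 10
m = math.floor(2 ** k / (math.e * k))
print(lll_satisfiable(k, m))  # True
```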

4 Stronger forms of LLL. Symmetric LLL: P[E_i] ≤ 1/(e · max_j |Γ⁺(j)|) for all i. Asymmetric LLL: Σ_{j∈Γ⁺(i)} P[E_j] ≤ 1/4 for all i. General LLL: ∃ x_1, …, x_n ∈ (0,1) such that P[E_i] ≤ x_i · Π_{j∈Γ(i)} (1 - x_j) for all i.

5 Stronger forms of LLL. Symmetric LLL: P[E_i] ≤ 1/(e · max_j |Γ⁺(j)|) for all i. Asymmetric LLL: Σ_{j∈Γ⁺(i)} P[E_j] ≤ 1/4 for all i. General LLL (equivalent form, with y_i = x_i/(1 - x_i)): ∃ y_1, …, y_n > 0 such that P[E_i] ≤ y_i / Π_{j∈Γ⁺(i)} (1 + y_j) for all i.
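The two forms of the General LLL criterion are related by the substitution x_i = y_i/(1 + y_i), under which 1 - x_j = 1/(1 + y_j). A quick numerical sanity check on a made-up 3-event path graph (the graph and the y values are arbitrary, chosen only for illustration):

```python
import math

# Γ(i) for a 3-event path graph 1 - 0 - 2 (illustrative example)
neighbors = {0: [1, 2], 1: [0], 2: [0]}
y = {0: 0.5, 1: 0.25, 2: 0.25}
x = {i: yi / (1 + yi) for i, yi in y.items()}   # x_i = y_i / (1 + y_i)

for i in neighbors:
    lhs = x[i] * math.prod(1 - x[j] for j in neighbors[i])        # x-form bound
    rhs = y[i] / math.prod(1 + y[j] for j in neighbors[i] + [i])  # y-form bound
    assert math.isclose(lhs, rhs)
print("both forms of the General LLL bound agree")
```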

6 Stronger forms of LLL. Symmetric LLL: P[E_i] ≤ 1/(e · max_j |Γ⁺(j)|) for all i. Asymmetric LLL: Σ_{j∈Γ⁺(i)} P[E_j] ≤ 1/4 for all i. General LLL: ∃ y_1, …, y_n > 0 such that P[E_i] ≤ y_i / Σ_{J⊆Γ⁺(i)} y_J for all i, where y_J = Π_{j∈J} y_j. (This is the previous form with the product expanded: Π_{j∈Γ⁺(i)} (1 + y_j) = Σ_{J⊆Γ⁺(i)} y_J.) Let Ind be the independent sets of the dependency graph. Cluster Expansion: ∃ y_1, …, y_n > 0 such that P[E_i] ≤ y_i / Σ_{J⊆Γ⁺(i), J∈Ind} y_J for all i.

7 Stronger forms of LLL. Let Ind be the independent sets of the dependency graph, and let p_i = P[E_i]. For S ⊆ V, let q_S = Σ_{I∈Ind, I⊇S} (-1)^{|I∖S|} Π_{i∈I} p_i (the "multivariate independence polynomial"). Shearer '85: If q_I > 0 for all I ∈ Ind, then P[∩_i E_i^c] ≥ q_∅ > 0. Moreover, for every dependency graph, this is the optimal criterion under which the LLL is true.
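For intuition, q_S can be evaluated by brute force on small instances. An exponential-time sketch, for illustration only (the helper names are ours):

```python
from itertools import combinations

def shearer_q(S, p, edges, n):
    """q_S = sum over independent sets I with S ⊆ I of
    (-1)^{|I \\ S|} * prod_{i in I} p_i."""
    def independent(I):
        return not any(u in I and v in I for (u, v) in edges)
    total = 0.0
    rest = [v for v in range(n) if v not in S]
    for r in range(len(rest) + 1):
        for extra in combinations(rest, r):
            I = set(S) | set(extra)
            if independent(I):
                weight = 1.0
                for i in I:
                    weight *= p[i]
                total += (-1) ** len(extra) * weight
    return total

# Path 0 - 1 - 2, each p_i = 1/4: the independent sets are
# {}, {0}, {1}, {2}, {0,2}, so q_{} = 1 - 3/4 + 1/16 = 0.3125 > 0.
print(shearer_q(set(), [0.25, 0.25, 0.25], [(0, 1), (1, 2)], 3))  # 0.3125
```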

8 Algorithmic Results. The LLL gives a non-constructive proof that ∩_i E_i^c ≠ ∅. Can we efficiently find a point in that set? Long history: Beck '91, Alon '91, Molloy-Reed '98, Czumaj-Scheideler '00, Srinivasan '08. These can find a satisfying assignment for k-SAT instances where each variable appears in ≤ 2^{k/7} clauses.

9 Algorithmic Results. Efficiently finding a point in ∩_i E_i^c. Moser-Tardos '10: Algorithm for the General LLL in the "variable model". – The probability space has a product distribution on variables. – Each event is a function of some subset of the variables. – The dependency graph connects events that share variables. Many extensions (but still in the variable model): – Cluster Expansion criterion [Pegden '13] – Shearer's criterion [Kolipaka-Szegedy '11]

10 Algorithmic Results. Efficiently finding a point in ∩_i E_i^c. Moser-Tardos '10: Algorithm for the General LLL in the "variable model". – The probability space has a product distribution on variables. – Each event is a function of some subset of the variables. – The dependency graph connects events that share variables. Many extensions in other directions: – Derandomization [Chandrasekaran et al. '10] – Exponentially many events [Haeupler, Saha, Srinivasan '10] – Column-sparse IPs [Harris, Srinivasan '13] – Distributed algorithms [Chung, Pettie, Su '14]

11 Extending LLL Beyond Dependency Graphs. The definition of a dependency graph is equivalent to: P[E_i | ∩_{j∈J} E_j^c] = P[E_i] for all J ⊆ [n] ∖ Γ⁺(i), i.e., for J = any set of non-neighbors. Define a lopsidependency graph to be one satisfying P[E_i | ∩_{j∈J} E_j^c] ≤ P[E_i] for all J ⊆ [n] ∖ Γ⁺(i). Intuitively, edges go between events that are "negatively dependent". (Easy) Theorem (Erdős-Spencer '91): All existential forms of the LLL remain true with lopsidependency graphs.

12 Extending LLL Beyond Dependency Graphs. The definition of a dependency graph is equivalent to: P[E_i | ∩_{j∈J} E_j^c] = P[E_i] for all J ⊆ [n] ∖ Γ⁺(i), i.e., for J = any set of non-neighbors. Define a lopsidependency graph to be one satisfying P[E_i | ∩_{j∈J} E_j^c] ≤ P[E_i] for all J ⊆ [n] ∖ Γ⁺(i). Intuitively, edges go between events that are "negatively dependent". Eg: Let π be a uniformly random permutation on {1, 2}. Let E_1 be "π(1) = 1" and E_2 be "π(2) = 2". Then P[E_1 | E_2^c] = 0 < 1/2 = P[E_1]: E_1 and E_2 are "positively dependent", so the lopsidependency condition holds for them even though they are far from independent.
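The permutation example can be verified by exhaustive enumeration. A small sketch (variable names are ours; permutations are 0-indexed here, so E_i is "π(i) = i" over {0, 1}):

```python
from fractions import Fraction
from itertools import permutations

# E_i is the event "π(i) = i" for a uniform permutation π of {0, 1}.
perms = list(permutations(range(2)))
E1 = [p for p in perms if p[0] == 0]
E2_complement = [p for p in perms if p[1] != 1]

p_E1 = Fraction(len(E1), len(perms))
p_E1_given_E2c = Fraction(len([p for p in E2_complement if p[0] == 0]),
                          len(E2_complement))
print(p_E1, p_E1_given_E2c)  # 1/2 0
```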

13 Extending LLL Beyond Dependency Graphs. The definition of a dependency graph is equivalent to: P[E_i | ∩_{j∈J} E_j^c] = P[E_i] for all J ⊆ [n] ∖ Γ⁺(i), i.e., for J = any set of non-neighbors. Define a lopsidependency graph to be one satisfying P[E_i | ∩_{j∈J} E_j^c] ≤ P[E_i] for all J ⊆ [n] ∖ Γ⁺(i). Intuitively, edges go between events that are "negatively dependent". Eg [Erdős-Spencer '91, Harris-Srinivasan '14]: Let π be a uniformly random permutation on [n], and let E_i be "π(a_i) = b_i". Add edges between E_i and E_j if a_i = a_j or b_i = b_j. This yields a lopsidependency graph.

14 Lopsided Association. Recall: a lopsidependency graph is one satisfying P[E_i | ∩_{j∈J} E_j^c] ≤ P[E_i] for all J ⊆ [n] ∖ Γ⁺(i). Definition: A lopsided association graph is a graph where P[E_i | F] ≥ P[E_i] for all F ∈ F_i, where F_i contains all monotone functions of {E_j : j ∉ Γ⁺(i)}. Easy: Lopsided association ⇒ Lopsidependency. (Analogous to "negative association" vs. "negatively cylinder dependent".)

15 Lopsided Association. Recall: a lopsidependency graph is one satisfying P[E_i | ∩_{j∈J} E_j^c] ≤ P[E_i] for all J ⊆ [n] ∖ Γ⁺(i). Definition: A lopsided association graph is a graph where P[E_i | F] ≥ P[E_i] for all F ∈ F_i, where F_i contains all monotone functions of {E_j : j ∉ Γ⁺(i)}. Easy: Lopsided association ⇒ Lopsidependency. Main Result: For every probability space with a lopsided association graph, all existential forms of the LLL have an algorithmic proof.

16 Algorithmic LLL. [Table: rows are the LLL criteria (Symmetric, General, Cluster Expansion, Shearer); columns are the settings (variable model with a dependency graph; lopsidependency graph; lopsided association graph; other distributions). Entries so far: Moser-Tardos (variable model, General LLL), Pegden (Cluster Expansion), Kolipaka-Szegedy (Shearer).]

17 Algorithmic LLL beyond the variable model. [Same table, now adding a column for permutations, handled by Harris-Srinivasan.]

18 Algorithmic LLL beyond the variable model. [Same table, adding Achlioptas-Iliopoulos (needs slack), covering permutations, perfect matchings in K_n, Hamilton cycles in K_n, …; spanning trees in K_n are marked as open (?).]

19 Algorithmic LLL beyond the variable model. [Same table as the previous slide.]

20 Our Results. [Table: the Symmetric, General, Cluster Expansion, and Shearer* criteria become algorithmic for all probability spaces with a lopsided association graph, covering the variable model, permutations, perfect matchings in K_n, and spanning trees in K_n. *Need slack in Shearer's criterion; intractability?]

21 Resampling Oracles. We need some algorithmic access to the probability space μ on a state space Ω. Def: A resampling oracle for E_i is a (randomized) map r_i : Ω → Ω satisfying – (R1) If ω ~ μ|E_i then r_i(ω) ~ μ. (Removes the conditioning on E_i.) – (R2) If j ∉ Γ⁺(i) and E_j does not occur in ω, then E_j does not occur in r_i(ω). (Cannot cause non-neighbors to occur.) Roughly, r_i implements a coupling between μ|E_i and μ such that (R2) holds.

22 Resampling Oracles. We need some algorithmic access to the probability space μ. Def: A resampling oracle for E_i is r_i : Ω → Ω satisfying – (R1) If ω ~ μ|E_i then r_i(ω) ~ μ. – (R2) If j ∉ Γ⁺(i) and E_j does not occur in ω, then E_j does not occur in r_i(ω). Eg (Moser-Tardos variable model): for φ = (x_1 ∨ x_2 ∨ x_3) ∧ (x_1 ∨ x_4 ∨ x_5) ∧ (x_4 ∨ x_5 ∨ x_6) ∧ (x_2 ∨ x_3 ∨ x_6), with E_i = "clause i unsatisfied", r_i(ω) resamples all variables in clause i.
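In the variable model this oracle is simple to implement. A sketch, assuming clauses are lists of signed integers (+v for x_v, -v for ¬x_v) and an assignment is a dict from variable index to bool (this encoding is ours, not from the talk):

```python
import random

def clause_unsatisfied(clause, assignment):
    """E_i occurs iff every literal of clause i evaluates to false."""
    return all(assignment[abs(lit)] != (lit > 0) for lit in clause)

def resample_clause(clause, assignment, rng=random):
    """r_i(ω): redraw the variables of clause i from the product distribution."""
    new = dict(assignment)
    for lit in clause:
        new[abs(lit)] = rng.random() < 0.5
    return new

clause = [1, 2, 3]                      # x_1 ∨ x_2 ∨ x_3
omega = {1: False, 2: False, 3: False}  # clause unsatisfied: E_1 occurs
print(clause_unsatisfied(clause, omega))  # True
omega = resample_clause(clause, omega)    # fresh coins for x_1, x_2, x_3
```

Property (R2) holds because only the variables of clause i are touched, so any clause sharing no variable with it is unaffected.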

23 Resampling Oracles. We need some algorithmic access to the probability space μ. Def: A resampling oracle for E_i is r_i : Ω → Ω satisfying – (R1) If ω ~ μ|E_i then r_i(ω) ~ μ. – (R2) If j ∉ Γ⁺(i) and E_j does not occur in ω, then E_j does not occur in r_i(ω). Theorem: For any probability space and any events, the following are equivalent: – existence of a lopsided association graph; – existence of resampling oracles. There is no guarantee that the oracle is compactly representable or efficient. (What if the events involve some NP-hard problem?)

24 Resampling Oracles. Def: A resampling oracle for E_i is r_i : Ω → Ω satisfying – (R1) If ω ~ μ|E_i then r_i(ω) ~ μ. – (R2) If j ∉ Γ⁺(i) and E_j does not occur in ω, then E_j does not occur in r_i(ω). Theorem: For any probability space and any events, the following are equivalent: – existence of a lopsided association graph; – existence of resampling oracles. There is no guarantee that the oracle is compactly representable or efficient. (What if the events involve some NP-hard problem?)

25 Spanning Trees in K_n. Let T be a uniformly random spanning tree of K_n. For an edge set A, let E_A = {A ⊆ T}. Lopsidependency graph: make E_A a neighbor of E_B unless A and B are vertex-disjoint. Resampling oracle r_A: – If T is uniform conditioned on E_A, we want r_A(T) to be uniform. – But it should not disturb edges that are vertex-disjoint from A.

26 Spanning Trees in K_n. E_A = {A ⊆ T}. Resampling oracle r_A(T): – If T is uniform conditioned on E_A, we want r_A(T) to be uniform. – But it should not disturb edges that are vertex-disjoint from A. Construction: – Contract the edges of T that are vertex-disjoint from A. – Delete the remaining edges vertex-disjoint from A. – Let r_A(T) be a uniformly random spanning tree in the resulting graph. [Figure: A, T, and r_A(T).]
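The last step needs a uniform spanning tree sampler for the contracted graph. One classical primitive that could serve here is the Aldous-Broder random walk, sketched below for an arbitrary connected graph (this is our illustration, not the paper's implementation):

```python
import random

def aldous_broder(adj, rng=random):
    """Return a uniformly random spanning tree of a connected graph.
    adj: dict mapping each vertex to a list of its neighbours.
    The tree consists of the first-entry edges of a random walk."""
    start = next(iter(adj))
    visited = {start}
    tree = set()
    u = start
    while len(visited) < len(adj):
        v = rng.choice(adj[u])
        if v not in visited:            # first visit to v: keep edge {u, v}
            visited.add(v)
            tree.add(frozenset((u, v)))
        u = v
    return tree

# K_4 as an adjacency dict
adj = {i: [j for j in range(4) if j != i] for i in range(4)}
tree = aldous_broder(adj)
print(len(tree))  # 3 — a spanning tree of K_4 has n - 1 edges
```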

27 Main Theorem. Given – any probability space and events, – a lopsided association graph, – resampling oracles for the events. Suppose that the General LLL, Cluster Expansion, or Shearer* criterion holds. Then there is an algorithm to find a point in ∩_i E_i^c, running in oracle-polynomial† time, whp. (*Need slack for Shearer's condition. †Polynomial actually depending on the parameters in the General LLL, etc.)

28 Main Theorem. Given – any probability space and events, – a lopsided association graph, – resampling oracles for the events. General LLL: ∃ x_1, …, x_n ∈ (0,1) such that P[E_i] ≤ x_i · Π_{j∈Γ(i)} (1 - x_j) for all i. There is an algorithm to find a point in ∩_i E_i^c using a bounded number of oracle calls, whp.

29 Main Theorem. Given – any probability space and events, – a lopsided association graph, – resampling oracles for the events. Cluster Expansion: ∃ y_1, …, y_n > 0 such that P[E_i] ≤ y_i / Σ_{J⊆Γ⁺(i), J∈Ind} y_J for all i. There is an algorithm to find a point in ∩_i E_i^c using a bounded number of oracle calls, whp.

30 The Algorithm.
MaximalSetResample
  Draw ω from μ
  Repeat:
    J ← ∅
    While there is i ∉ Γ⁺(J) such that E_i occurs in ω:
      Pick the smallest such i
      J ← J ∪ {i}
      ω ← r_i(ω)
  Until J = ∅
Similar to the parallel form of Moser-Tardos.
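A direct sketch of MaximalSetResample in Python, with illustrative interfaces of our own (sample_mu draws ω ~ μ, events[i] tests whether E_i occurs in ω, resample[i] is the oracle r_i, gamma_plus[i] is Γ⁺(i)):

```python
import random

def maximal_set_resample(sample_mu, events, resample, gamma_plus, rng=random):
    omega = sample_mu(rng)
    while True:
        J = set()
        while True:
            # candidates: i outside Γ+(J) whose event currently occurs in ω
            candidates = [i for i in sorted(events)
                          if all(i not in gamma_plus[j] for j in J)
                          and events[i](omega)]
            if not candidates:
                break
            i = candidates[0]              # pick the smallest such i
            J.add(i)
            omega = resample[i](omega, rng)
        if not J:
            return omega                   # ω avoids every event E_i

# Toy instance: one fair coin x0; E_0 = "x0 is heads"; r_0 reflips x0.
result = maximal_set_resample(
    sample_mu=lambda rng: {0: rng.random() < 0.5},
    events={0: lambda w: w[0]},
    resample={0: lambda w, rng: {0: rng.random() < 0.5}},
    gamma_plus={0: {0}},
)
print(result)  # {0: False}
```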

31 Analysis: Review of Moser-Tardos. [Figure: a "log" or "history" of resampling operations, E_3, E_9, E_2, E_5, E_1, E_7, E_1, E_3, E_9, …, read from the start, with dependency-graph edges drawn between entries.]

32 Analysis: Review of Moser-Tardos. [Figure: the same log of resampling operations, with a witness tree extracted from it.]

33 MaximalSetResample
  Draw ω from μ
  Repeat:
    J ← ∅
    While there is i ∉ Γ⁺(J) such that E_i occurs in ω:
      Pick the smallest such i
      J ← J ∪ {i}
      ω ← r_i(ω)
  Until J = ∅
The algorithm produces a resampling sequence J_1, J_2, J_3, … with J_i ∈ Ind and J_{i+1} ⊆ Γ⁺(J_i).

34 Analysis. Trivially, E[# iterations] = Σ over sequences I_1, I_2, … of Pr[I_1, I_2, … is a prefix of the resampling sequence]. Claim: Pr[I_1, …, I_k is a prefix] ≤ Π_{1≤i≤k} p_{I_i}, where p_J = Π_{j∈J} p_j. Like Moser-Tardos, this is a coupling argument; but instead of using the fact that variables have "fresh samples" whenever examined, we use the fact that r_i returns the state to the underlying distribution μ.

35 Analysis. Trivially, E[# iterations] = Σ over sequences I_1, I_2, … of Pr[I_1, I_2, … is a prefix of the resampling sequence]. Claim: Pr[I_1, …, I_k is a prefix] ≤ Π_{1≤i≤k} p_{I_i}. Proof: Like Moser-Tardos, using the fact that r_i returns the state to μ. Claim: Suppose P[E_i] ≤ (1 - ε) x_i · Π_{j∈Γ(i)} (1 - x_j) (General LLL with slack). Then Σ_{I_1,…,I_k} Π_{1≤i≤k} p_{I_i} ≤ … . Proof: Key idea: inductively show that Σ over sequences with I_1 = J of Π_{1≤i≤k} p_{I_i} ≤ … .

36 Comparison to Fotis' work. Pros for Fotis [Achlioptas-Iliopoulos]: – No need to sample from μ (there is no μ). – Can handle scenarios that do not even form a lopsidependency graph, e.g., Hamilton cycles, vertex coloring. Pros for us: – We characterize when resampling oracles exist. – Gives a new proof of the General LLL with dependency graphs, in full generality (even for Shearer's criterion and lopsided association graphs). – No slack needed in the General LLL or Cluster Expansion conditions. – "Fractional transitions" seem easier in our setting.

