1
The satisfiability threshold and clusters of solutions in the 3-SAT problem
Elitza Maneva, IBM Almaden Research Center
2
3-SAT
Variables: x_1, x_2, …, x_n take values in {TRUE, FALSE}.
Constraints (clauses): (x_1 ∨ x_2 ∨ ¬x_3), (¬x_2 ∨ x_4 ∨ ¬x_6), …
The formula is the conjunction (x_1 ∨ x_2 ∨ ¬x_3) ∧ (¬x_2 ∨ x_4 ∨ ¬x_6) ∧ …
[Figure: factor graph of a small formula on variables x_1, …, x_8, with an example assignment written next to the variables.]
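To make the encoding concrete, here is a minimal sketch (not from the talk) of how such a formula can be represented and checked in Python, using DIMACS-style signed integers: literal v stands for x_v and -v for ¬x_v; the example formula and assignment are illustrative.

formula = [[1, 2, -3], [-2, 4, -6]]   # (x1 v x2 v -x3) and (-x2 v x4 v -x6)

def satisfies(assignment, formula):
    """assignment maps variable index -> bool; True iff every clause
    contains at least one true literal."""
    return all(
        any(assignment[abs(lit)] == (lit > 0) for lit in clause)
        for clause in formula
    )

assignment = {1: True, 2: True, 3: False, 4: False, 5: False, 6: False}
print(satisfies(assignment, formula))   # True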
3
Random 3-SAT
[Figure: a random formula on variables x_1, …, x_8 with n variables and m = αn clauses, and a number line of clause densities α marked at 0, 1.63, 3.52, 3.95, 4.27, 4.51, showing the ranges where PLR, Myopic, Random walk, Belief propagation and Survey propagation succeed, and where the formula is satisfiable / not satisfiable. Red = proved, green = unproved.]
4
Rigorous bounds for random 3-SAT
1999 [Friedgut]: there is a sharp threshold of satisfiability α_c(n).
2002 [Kaporis, Kirousis, Lalas] and [Hajiaghayi, Sorkin]: satisfiable with high probability for α < 3.52.
5
Rigorous bounds for random 3-SAT
[Number line of α marked at 0, 1.63, 3.52, 4.51, 5.19.]
Pure Literal Rule (PLR) algorithm (sketched below):
- If any variable appears only positively or only negatively, assign it 1 or 0 respectively.
- Simplify the formula by removing the satisfied clauses.
- Repeat.
[Worked example on a four-clause formula over x_1, …, x_5, assigning the pure literals step by step.]
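A minimal Python sketch of the pure literal rule as just described, reusing the clause encoding from the earlier snippet; the data structures and stopping condition are my own choices, not the talk's.

def pure_literal_rule(formula):
    """Repeatedly assign pure variables (those appearing with only one sign)
    and remove the clauses they satisfy; returns the partial assignment and
    whatever is left of the formula."""
    formula = [list(c) for c in formula]
    assignment = {}
    while True:
        signs = {}
        for clause in formula:
            for lit in clause:
                signs.setdefault(abs(lit), set()).add(lit > 0)
        pure = [v for v, s in signs.items() if len(s) == 1]
        if not pure:
            return assignment, formula
        for v in pure:
            assignment[v] = (True in signs[v])   # 1 if only positive, 0 if only negative
        formula = [c for c in formula
                   if not any(abs(l) in assignment and assignment[abs(l)] == (l > 0)
                              for l in c)]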
6
Rigorous bounds for random 3-SAT
[Number line of α marked at 0, 1.63, 3.52, 4.51, 5.19.]
Myopic algorithms (a sketch follows below):
- Choose a variable according to its numbers of positive and negative occurrences.
- Assign the variable its more popular value.
- Simplify the formula by 1. removing the satisfied clauses, 2. removing the FALSE literals, 3. assigning variables in unit clauses, 4. assigning pure variables.
- Repeat.
Best rule: maximize |# positive occurrences − # negative occurrences|.
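The following Python sketch shows one step of a myopic rule under the same clause encoding; it implements only the variable choice, the assignment, and simplification steps 1-2, leaving the unit-clause and pure-literal cleanup (steps 3-4) out for brevity.

from collections import Counter

def myopic_step(formula, assignment):
    """Pick the variable maximizing |#positive - #negative| occurrences,
    give it its more popular value, then drop satisfied clauses and
    false literals.  Returns the simplified formula."""
    counts = Counter(lit for clause in formula for lit in clause)
    variables = {abs(lit) for lit in counts}
    v = max(variables, key=lambda u: abs(counts[u] - counts[-u]))
    value = counts[v] >= counts[-v]            # the more popular value
    assignment[v] = value
    sat_lit = v if value else -v
    simplified = []
    for clause in formula:
        if sat_lit in clause:
            continue                           # clause satisfied, remove it
        simplified.append([l for l in clause if l != -sat_lit])   # drop the false literal
    return simplified                          # an empty clause here signals a contradiction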
7
Rigorous bounds for random 3-SAT
[Number line of α marked at 0, 1.63, 3.52, 4.51, 5.19.]
E[# solutions] = 2^n · Pr[00…0 is a solution] = 2^n (1 − 1/8)^m = (2 (7/8)^α)^n.
For α > 5.191, E[# solutions] → 0, so Pr[satisfiable] → 0.
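Spelling out the first-moment computation behind the 5.19 bound:

\[
  \mathbb{E}[\#\text{solutions}]
  = 2^{n}\,\Pr[\,00\cdots0 \text{ is a solution}\,]
  = 2^{n}\Bigl(1-\tfrac{1}{8}\Bigr)^{m}
  = \Bigl(2\bigl(\tfrac{7}{8}\bigr)^{\alpha}\Bigr)^{n},
  \qquad m=\alpha n,
\]
\[
  2\bigl(\tfrac{7}{8}\bigr)^{\alpha} < 1
  \;\Longleftrightarrow\;
  \alpha > \frac{\ln 2}{\ln(8/7)} \approx 5.191,
\]
so above this density the expected number of solutions vanishes and, by Markov's inequality, Pr[satisfiable] → 0.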
8
Rigorous bounds for random 3-SAT
[Number line of α marked at 0, 1.63, 3.52, 4.51, 4.67, 5.19.]
E[# positively prime solutions] → 0 for α > 4.67.
Positively prime solution: a solution in which no variable assigned 1 can be converted to 0 (while keeping all clauses satisfied).
Fact: if there exists a solution, there exists a positively prime solution.
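A small Python check of this definition (a sketch; the satisfies helper from the first snippet is restated so the block stands alone): a solution is positively prime iff no single 1 can be lowered to 0 without breaking a clause.

def satisfies(assignment, formula):
    return all(any(assignment[abs(l)] == (l > 0) for l in c) for c in formula)

def is_positively_prime(assignment, formula):
    """True iff no variable currently assigned 1 (True) can be switched to 0
    while every clause stays satisfied."""
    for v in [u for u, val in assignment.items() if val]:
        assignment[v] = False                  # tentatively lower the 1 to 0
        still_sat = satisfies(assignment, formula)
        assignment[v] = True                   # restore
        if still_sat:
            return False                       # a 1 could be lowered, so not prime
    return True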
9
Rigorous bounds for random 3-SAT
[Number line of α marked at 0, 1.63, 3.52, 4.51, 4.67, 5.19.]
E[# symmetrically prime solutions] → 0: a further improvement of the upper bound.
10
Random 3-SAT (recap)
[Figure: the same algorithm/threshold number line as on slide 3. Red = proved, green = unproved.]
11
Random walk algorithms (a sketch of both walks appears below)
[Alekhnovich, Ben-Sasson '03] Simple Random Walk:
- Pick an unsatisfied clause.
- Pick a variable in the clause.
- Flip the variable.
Theorem: finds a solution in O(n) steps for α < 1.63.
[Seitz, Alava, Orponen '05], [Ardelius, Aurell '06] ASAT:
- Pick an unsatisfied clause.
- Pick a variable in the clause.
- Flip it; a flip that increases the number of unsatisfied clauses is accepted only with probability p.
Experiment: takes O(n) steps for α < 4.21.
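A hedged Python sketch of the two walks, with the same clause encoding as before; the acceptance probability p = 0.21 and the step cap are placeholder values I chose for illustration, not the parameters used in the cited experiments. Setting p = 1 recovers the simple random walk (every flip accepted).

import random

def asat(formula, n_vars, p=0.21, max_steps=100_000):
    """ASAT-style walk: always accept flips that do not increase the number of
    unsatisfied clauses; accept uphill flips only with probability p.
    Returns a satisfying assignment, or None if the step budget runs out."""
    assign = {v: random.random() < 0.5 for v in range(1, n_vars + 1)}

    def unsat_clauses():
        return [c for c in formula
                if not any(assign[abs(l)] == (l > 0) for l in c)]

    for _ in range(max_steps):
        unsat = unsat_clauses()
        if not unsat:
            return assign
        clause = random.choice(unsat)          # pick an unsatisfied clause
        v = abs(random.choice(clause))         # pick a variable in it
        assign[v] = not assign[v]              # flip it
        if len(unsat_clauses()) > len(unsat) and random.random() > p:
            assign[v] = not assign[v]          # uphill move rejected: undo the flip
    return None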
12
Random 3-SAT (recap)
[Figure: the same algorithm/threshold number line as on slide 3. Red = proved, green = unproved.]
13
We can find solutions via inference
Suppose the formula is satisfiable, and consider the uniform distribution over satisfying assignments:
Pr[x_1, x_2, …, x_n] ∝ φ(x_1, x_2, …, x_n).
Simple claim: if we can compute Pr[x_i = 1], then we can find a solution fast.
Decimation: assign variables one by one, each to the value that has the highest probability (a sketch follows below).
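In code, decimation looks roughly as follows; marginal(formula, v) is a hypothetical oracle (exact counting on small instances, or one of the heuristics on the next slide) returning an estimate of Pr[x_v = 1] over satisfying assignments of the current formula. Picking the most biased free variable follows the belief-propagation heuristic mentioned on the next slide.

def decimate(formula, n_vars, marginal):
    """Assign variables one by one to their most probable value, simplifying
    the formula after each assignment.  Variables that drop out of all clauses
    are left free and can be set arbitrarily at the end."""
    assignment = {}
    while True:
        free = {abs(l) for c in formula for l in c} - set(assignment)
        if not free:
            break
        v = max(free, key=lambda u: abs(marginal(formula, u) - 0.5))   # most biased
        value = marginal(formula, v) >= 0.5                            # most probable value
        assignment[v] = value
        sat_lit = v if value else -v
        formula = [[l for l in c if l != -sat_lit]
                   for c in formula if sat_lit not in c]               # simplify
    return assignment, formula       # an empty clause left in `formula` means we got stuck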
14
Fact: we cannot hope to compute Pr[x_i = 1] exactly.
Heuristics for guessing the best variable to assign:
1. Pure Literal Rule (PLR): choose a variable that appears always positively / always negatively.
2. Myopic rule: choose a variable based on its numbers of positive and negative occurrences.
3. Belief Propagation: estimate Pr[x_i = 1] by belief propagation and choose the variable with the largest estimated bias.
15
Computing Pr[x_1 = 0] on a tree formula
[Figure: a tree formula rooted at x_1; each subtree is labeled with the number of its solutions in which the adjacent variable is 0 and the number in which it is 1, and the counts are combined up the tree.]
16
Vectors can be normalized
[Figure: the same tree with each pair of counts normalized to a probability vector, e.g. (.36, .64), (.5, .5), (.43, .57).]
17
… and thought of as messages
[Figure: the normalized vectors drawn as messages passed along the edges towards x_1.]
18
What if the graph is not a tree? Belief propagation
19
[Figure: a factor graph on variables x_1, …, x_11.]
Pr[x_1, …, x_n] ∝ Π_a ψ_a(x_{N(a)}), e.g. a factor ψ(x_1, x_2, x_3) for the clause on x_1, x_2, x_3.
20
Belief Propagation [Pearl '88]
Given: Pr[x_1 … x_7] ∝ ψ_a(x_1, x_3) ψ_b(x_1, x_2) ψ_c(x_1, x_4) …, i.e. a Markov Random Field (MRF) with n variables and m factors.
Goal: compute Pr[x_1] (i.e. the marginal).
Message passing rules:
M_{i→c}(x_i) = Π_{b ∈ N(i)\c} M_{b→i}(x_i)
M_{c→i}(x_i) = Σ_{x_j : j ∈ N(c)\i} ψ_c(x_{N(c)}) Π_{j ∈ N(c)\i} M_{j→c}(x_j)
Estimated marginals: μ_i(x_i) ∝ Π_{c ∈ N(i)} M_{c→i}(x_i)
Belief propagation is a dynamic programming algorithm. It is exact only when the recurrence relation holds, i.e.:
1. if the graph is a tree, or
2. if the graph behaves like a tree (only large cycles).
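To make the update rules concrete, here is a self-contained sum-product sketch for binary variables (a toy implementation of my own, not code from the talk); clauses are turned into 0/1 factor tables, and the parallel message schedule and iteration count are arbitrary choices.

import itertools

def clause_factor(literals):
    """Factor table for a clause given as (variable name, sign) pairs,
    e.g. [("x1", 1), ("x2", -1)] for (x1 OR NOT x2): value 1 if satisfied, else 0."""
    scope = tuple(v for v, _ in literals)
    table = {}
    for vals in itertools.product((0, 1), repeat=len(scope)):
        sat = any((val == 1) == (sign > 0) for val, (_, sign) in zip(vals, literals))
        table[vals] = 1.0 if sat else 0.0
    return scope, table

def belief_propagation(variables, factors, iters=50):
    """Sum-product BP for binary variables: returns estimates of Pr[x_v = 1].
    Exact on trees; only a heuristic on graphs with cycles, as discussed above."""
    def normalize(m):
        s = m[0] + m[1]
        return {0: m[0] / s, 1: m[1] / s} if s > 0 else {0: 0.5, 1: 0.5}

    v2f = {(v, a): {0: 0.5, 1: 0.5} for a, (scope, _) in enumerate(factors) for v in scope}
    f2v = {(a, v): {0: 0.5, 1: 0.5} for a, (scope, _) in enumerate(factors) for v in scope}
    for _ in range(iters):
        # variable -> factor: product of the other factors' messages
        for (v, a) in v2f:
            msg = {0: 1.0, 1: 1.0}
            for b, (scope, _) in enumerate(factors):
                if b != a and v in scope:
                    for x in (0, 1):
                        msg[x] *= f2v[(b, v)][x]
            v2f[(v, a)] = normalize(msg)
        # factor -> variable: sum the factor against the other variables' messages
        for (a, v) in f2v:
            scope, table = factors[a]
            others = [u for u in scope if u != v]
            msg = {0: 0.0, 1: 0.0}
            for x in (0, 1):
                for rest in itertools.product((0, 1), repeat=len(others)):
                    assign = dict(zip(others, rest)); assign[v] = x
                    w = table[tuple(assign[u] for u in scope)]
                    for u, xu in zip(others, rest):
                        w *= v2f[(u, a)][xu]
                    msg[x] += w
            f2v[(a, v)] = normalize(msg)

    marginals = {}
    for v in variables:
        m = {0: 1.0, 1: 1.0}
        for a, (scope, _) in enumerate(factors):
            if v in scope:
                for x in (0, 1):
                    m[x] *= f2v[(a, v)][x]
        marginals[v] = normalize(m)[1]
    return marginals

# Toy tree formula (x1 OR x2) AND (NOT x2 OR x3): exact answers are 0.75, 0.5, 0.75.
factors = [clause_factor([("x1", 1), ("x2", 1)]),
           clause_factor([("x2", -1), ("x3", 1)])]
print(belief_propagation(["x1", "x2", "x3"], factors))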
21
Applications of belief propagation
- Statistical learning theory
- Vision
- Error-correcting codes (Turbo, LDPC, LT)
- Lossy data compression
- Computational biology
- Sensor networks
22
Limitations of BP
[Figure: the same algorithm/threshold number line as on slide 3.]
23
Reason for failure of Belief Propagation
Messages from different neighbors are assumed to be almost independent, i.e. there are no long-range correlations.
[Figure: the threshold number line, split into a region with no long-range correlations and a region where long-range correlations exist.]
24
Reason for failure of Belief Propagation
Messages from different neighbors are assumed to be almost independent, i.e. there are no long-range correlations.
Fix: 1-step Replica Symmetry Breaking ansatz
- The distribution can be decomposed into "phases".
- There are no long-range correlations within a phase.
- Each phase consists of similar assignments – "clusters".
- Messages become distributions of distributions.
- An approximation yields 3-dimensional messages: Survey Propagation [Mezard, Parisi, Zecchina '02].
- Survey propagation finds a phase, then WalkSAT is used to find a solution in the phase.
25
Reason for failure of Belief Propagation
Messages from different neighbors are assumed to be almost independent, i.e. there are no long-range correlations.
Fix: 1-step Replica Symmetry Breaking ansatz
The distribution can be decomposed into "phases":
Pr[x_1, x_2, …, x_n] = Σ_{phases P} p_P · Pr_P[x_1, x_2, …, x_n]
26
[Figure: fixed variables within a phase.]
27
Space of solutions
[Figure: the set of satisfying assignments in {0, 1}^n, drawn as well-separated groups of nearby assignments – the phases.]
28
Survey propagation
[Figure: survey-propagation messages on the factor graph; an example message has components 0.12, 0.81, 0.07.]
29
Survey propagation message update (a sketch of one sweep follows below)
For variable i and clause c, let N^s_c(i) (resp. N^u_c(i)) denote the clauses other than c in which i appears with the same (resp. opposite) sign as in c. The messages are:
M^u_{i→c} = (1 − Π_{b ∈ N^u_c(i)} (1 − M_{b→i})) · Π_{b ∈ N^s_c(i)} (1 − M_{b→i})
M^s_{i→c} = (1 − Π_{b ∈ N^s_c(i)} (1 − M_{b→i})) · Π_{b ∈ N^u_c(i)} (1 − M_{b→i})
M^∗_{i→c} = Π_{b ∈ N(i)\c} (1 − M_{b→i})
M_{c→i} = Π_{j ∈ N(c)\i} M^u_{j→c} / (M^u_{j→c} + M^s_{j→c} + M^∗_{j→c})
Intuition, as exchanged on the factor graph over x_1, …, x_8:
Clause to variable: "You have to satisfy me with probability 60%."
Variable to clause: "I'm 0 with probability 10%, 1 with probability 70%, whichever (i.e. ∗) 20%."
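A Python sketch of one sweep of these updates, using the clause encoding from the earlier snippets; this is my reconstruction of the equations above, not the authors' reference implementation, and the initialization shown in the closing comment is an assumed convention.

def sp_iteration(formula, eta):
    """One sweep of survey propagation.  formula: clauses as lists of signed ints;
    eta[(c, v)]: current survey (warning probability) from clause index c to variable v.
    Returns the updated surveys."""
    occ = {}                                   # variable -> [(clause index, sign), ...]
    for c, clause in enumerate(formula):
        for lit in clause:
            occ.setdefault(abs(lit), []).append((c, 1 if lit > 0 else -1))

    def pis(v, c, sign_in_c):
        """(M^u, M^s, M^*) components of the message from v towards clause c."""
        prod_same = prod_opp = prod_all = 1.0
        for b, sign_b in occ[v]:
            if b == c:
                continue
            f = 1.0 - eta[(b, v)]
            prod_all *= f
            if sign_b == sign_in_c:
                prod_same *= f
            else:
                prod_opp *= f
        pi_u = (1.0 - prod_opp) * prod_same    # v pushed to violate c
        pi_s = (1.0 - prod_same) * prod_opp    # v pushed to satisfy c
        return pi_u, pi_s, prod_all            # prod_all = no warnings at all (the * case)

    new_eta = {}
    for c, clause in enumerate(formula):
        for lit in clause:
            prod = 1.0
            for other in clause:
                if other == lit:
                    continue
                pi_u, pi_s, pi_star = pis(abs(other), c, 1 if other > 0 else -1)
                denom = pi_u + pi_s + pi_star
                prod *= pi_u / denom if denom > 0 else 0.0
            new_eta[(c, abs(lit))] = prod
    return new_eta

# Assumed usage: eta = {(c, abs(l)): random.random() for c, cl in enumerate(formula) for l in cl},
# then iterate sp_iteration until the surveys stop changing, and decimate the most biased variables.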
30
Combinatorial interpretation
Can survey propagation be thought of as inference on cluster assignments? Not precisely, but close.
We define a related concept of core / cover assignments.
Assignments in the same cluster share the same core; however, different clusters may have the same core.
31
Finding the core of a solution
[Animation: a satisfying assignment 0 0 0 1 1 1 0 0 on the factor graph.]
32
[Animation frame: the assignment 0 0 0 1 1 1 0 0.]
33
[Animation frame: the unconstrained variables are marked; shown values 0 0 0 1 0 1 0 0.]
34
Finding the core of a solution
[Animation frame: an unconstrained variable has been replaced by ∗; shown values 0 0 0 1 1 0 0.]
35
[Animation frame: another unconstrained variable replaced by ∗; shown values 0 0 0 1 1 0.]
36
[Animation frame: shown values 0 0 0 1 0.]
37
[Animation frame: only the constrained variables remain assigned; shown values 0 0 0 1.]
Such a fully constrained partial assignment is called a cover (peeling is sketched below).
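A Python sketch of the peeling procedure just illustrated (my own rendering, with the same clause encoding as before); assignment is assumed to cover every variable of the formula.

def peel_to_core(formula, assignment):
    """Turn unconstrained variables into '*' until every assigned variable is
    the unique satisfier of some clause; the result is the core of the solution."""
    partial = dict(assignment)                 # values: True, False, or '*'

    def is_constrained(v):
        """v is constrained iff some clause is satisfied only by v."""
        for clause in formula:
            sat = [l for l in clause
                   if partial[abs(l)] != '*' and partial[abs(l)] == (l > 0)]
            if len(sat) == 1 and abs(sat[0]) == v:
                return True
        return False

    changed = True
    while changed:
        changed = False
        for v in partial:
            if partial[v] != '*' and not is_constrained(v):
                partial[v] = '*'               # unconstrained: star it out
                changed = True
    return partial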
38
Extending the space of assignments
[Figure: partial assignments in {0,1,∗}^n drawn above the full assignments in {0,1}^n, arranged by the number of stars; the cores sit at the top of the clusters.]
39
Survey propagation is a belief propagation algorithm
Theorem [Maneva, Mossel, Wainwright '05], [Braunstein, Zecchina '05]: survey propagation is equivalent to belief propagation on the uniform distribution over cover assignments.
But we still need to look at all partial assignments.
40
Peeling experiment for 3-SAT, n = 10^5
41
Clusters and partial assignments
[Figure: the space of partial assignments (arranged by number of stars) above the space {0,1}^n of full assignments; clusters of solutions, e.g. the one containing 01101, and their cores are marked.]
42
Definition of the new distribution
1. It includes all partial assignments σ ∈ {0,1,∗}^n without contradictions or implications.
2. Weight of a partial assignment: Pr[σ] ∝ ρ^{n_∗(σ)} (1 − ρ)^{n_o(σ)}, where n_∗(σ) is the number of variables assigned ∗ and n_o(σ) is the number of unconstrained (assigned but not constrained) variables.
3. This gives a family of belief propagation algorithms parameterized by ρ ∈ [0, 1]: ρ = 0 is vanilla BP, ρ = 1 is SP.
[Figure: an example formula and partial assignment illustrating n_∗ and n_o.]
(A sketch of this weight function follows below.)
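A short Python sketch of this weight, under the definitions as reconstructed above (valid means no clause has all its literals false, and no clause is left with a single ∗ variable as its only way out); partial maps every variable to True, False or '*'.

def weight(formula, partial, rho):
    """Return rho**n_star * (1 - rho)**n_o for a valid partial assignment,
    and 0 for an invalid one (a contradiction or an implication)."""
    n_star = sum(1 for v in partial if partial[v] == '*')
    constrained = set()
    for clause in formula:
        sat = [l for l in clause if partial[abs(l)] != '*' and partial[abs(l)] == (l > 0)]
        stars = [l for l in clause if partial[abs(l)] == '*']
        if not sat and len(stars) == 0:
            return 0.0                         # contradiction: every literal is false
        if not sat and len(stars) == 1:
            return 0.0                         # implication: the starred variable is forced
        if len(sat) == 1:
            constrained.add(abs(sat[0]))       # unique satisfier => constrained variable
    n_o = (len(partial) - n_star) - len(constrained)   # assigned but never a unique satisfier
    return rho ** n_star * (1 - rho) ** n_o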
43
[Figure: the space of partial assignments above {0,1}^n, arranged by number of stars, with the cores of the clusters marked; the weight Pr[σ] ∝ ρ^{n_∗(σ)} (1 − ρ)^{n_o(σ)} interpolates from vanilla BP at ρ = 0 to SP at ρ = 1.]
This is the correct picture for 9-SAT and above. [Achlioptas, Ricci-Tersenghi '06]
44
Clustering for k-SAT: what is known?
- 2-SAT: a single cluster.
- 3-SAT to 7-SAT: not known.
- 8-SAT and above: exponential number of clusters (by the second moment method) [Mezard, Mora, Zecchina '05], [Achlioptas, Ricci-Tersenghi '06].
- 9-SAT and above: clusters have non-trivial cores (by the differential equations method) [Achlioptas, Ricci-Tersenghi '06].
45
Convex geometry / antimatroid
[Figure: the lattice of valid partial assignments above a single satisfying assignment.]
Total weight is 1 for every ρ.
46
Rigorous bounds for random 3-SAT
[Number line of α marked at 0, 3.52, 4.51, 4.9, 5.19.]
E[total weight of partial assignments] → 0 for α > 4.9 (taking ρ = 0.8).
Fact: if there exists a solution, the total weight of partial assignments is at least 1.
47
Rigorous bounds for random 3-SAT
[Number line of α marked at 0, 3.52, 4.453, 4.51, 5.19.]
Theorem [Maneva, Sinclair]: for α > 4.453 one of the following holds:
1. there are no satisfying assignments, with high probability;
2. the core of every satisfying assignment is (∗, ∗, …, ∗).
48
Random 3-SAT (recap)
[Figure: the same algorithm/threshold number line as on slide 3. Red = proved, green = unproved.]
49
Challenges
- Improve the bounds on the threshold.
- Prove that the algorithms work with high probability.
- Find an algorithm for certifying that a formula with αn clauses, for large α, has no solution.
50
Thank you