
1 Using Problem Structure for Efficient Clause Learning
Ashish Sabharwal, Paul Beame, Henry Kautz
University of Washington, Seattle
April 23, 2003

2 The SAT Approach
Input p ∈ D (p: instance; D: domain, e.g. graph problems, AI planning, model checking)
→ CNF encoding f
→ SAT solver
→ f ∈ SAT or f ∉ SAT, i.e. p good or p bad

3 Key Facts
Problem instances typically have structure
–Graphs, precedence relations, causes and effects
–Translation to CNF flattens this structure
The best complete SAT solvers are
–DPLL-based clause learners; they branch and backtrack
–Critical: the variable order used for branching

4 Natural Questions
Can we extract structure efficiently?
–In the translation to CNF itself
–From the CNF formula
–From the higher-level description
How can we exploit this auxiliary information?
–Tweak the SAT solver for each domain
–Tweak the SAT solver to use general “guidance”

5 Our Approach
Same pipeline: input p ∈ D → CNF encoding f → SAT solver → f ∈ SAT or f ∉ SAT → p good or p bad
New: also encode the problem’s “structure” as a branching sequence and feed it to the SAT solver

6 Related Work
Exploiting structure in the CNF formula
–[GMT’02] Dependent variables
–[OGMS’02] LSAT (blocked/redundant clauses)
–[B’01] Binary clauses
–[AM’00] Partition-based reasoning
Exploiting domain knowledge
–[S’00] Model checking
–[KS’96] Planning (cause vars / effect vars)

7 Our Result, Informally
–Structure can be efficiently retrieved from the high-level description (the pebbling graph)
–A branching sequence, as auxiliary information, can be easily exploited
Given a pebbling graph G, we can efficiently generate a branching sequence B_G that dramatically improves the performance of the current best SAT solvers on f_G.

8 Preliminaries: CNF Formula
A CNF formula is a conjunction of clauses:
f = (x1 OR x2 OR ¬x9) AND (¬x3 OR x9) AND (¬x1 OR ¬x4 OR ¬x5 OR ¬x6)
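
As an aside, a common machine representation of such a formula is a DIMACS-style list of clauses, where each clause is a list of nonzero integers, v standing for x_v and -v for ¬x_v. This tiny Python sketch (my own illustration, not something shown on the slides) encodes the formula above:

# The formula f above, one list per clause.
f = [
    [1, 2, -9],        # (x1 OR x2 OR ¬x9)
    [-3, 9],           # (¬x3 OR x9)
    [-1, -4, -5, -6],  # (¬x1 OR ¬x4 OR ¬x5 OR ¬x6)
]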

9 Preliminaries: DPLL
DPLL(CNF formula f) {
  Simplify(f);
  If (conflict) return UNSAT;
  If (all-vars-assigned) { return SAT assignment; exit }
  Pick unassigned variable x;
  Try DPLL(f|x=0), DPLL(f|x=1)
}
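
A minimal runnable sketch of this procedure in Python, using the clause-list representation above; the names unit_propagate and dpll, and the choice of unit propagation as the Simplify step, are my own, not the authors' implementation:

def unit_propagate(clauses, assignment):
    # Repeatedly drop satisfied clauses, remove falsified literals, and assign
    # literals forced by unit clauses; return None on a conflict (empty clause).
    changed = True
    while changed:
        changed = False
        remaining = []
        for clause in clauses:
            if any(lit in assignment for lit in clause):
                continue                      # clause already satisfied
            reduced = [lit for lit in clause if -lit not in assignment]
            if not reduced:
                return None                   # conflict
            if len(reduced) == 1:
                assignment.add(reduced[0])    # unit clause forces this literal
                changed = True
            else:
                remaining.append(reduced)
        clauses = remaining
    return clauses

def dpll(clauses, assignment=None):
    assignment = set() if assignment is None else set(assignment)
    clauses = unit_propagate(clauses, assignment)
    if clauses is None:
        return None                           # UNSAT on this branch
    if not clauses:
        return assignment                     # every clause satisfied
    x = abs(clauses[0][0])                    # pick an unassigned variable
    for lit in (x, -x):                       # try x = 1, then x = 0
        result = dpll(clauses, assignment | {lit})
        if result is not None:
            return result
    return None

f = [[1, 2, -9], [-3, 9], [-1, -4, -5, -6]]
print(dpll(f))                                # a satisfying set of literals, or None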

10 Prelim: Clause Learning
DPLL: change “if (conflict) return UNSAT” to “if (conflict) { learn conflict clause; return UNSAT }”
Example: x2 = 1, x3 = 0, x6 = 0 ⇒ conflict
“Learn” the clause (¬x2 OR x3 OR x6)
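
In the clause-list sketch above, the learned clause from the slide's example is just the negation of the assignments blamed for the conflict. The helper below is my own simplified illustration of that idea only; real learning schemes (such as 1UIP, used later) derive the blamed literals by analyzing an implication graph.

def conflict_clause(blamed_literals):
    # x2=1, x3=0, x6=0 corresponds to the literal set {2, -3, -6};
    # negating it yields the learned clause (¬x2 OR x3 OR x6).
    return [-lit for lit in blamed_literals]

print(conflict_clause([2, -3, -6]))           # [-2, 3, 6]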

11 Prelim: Branching Sequence
B = (x1, x4, ¬x3, x1, ¬x8, ¬x2, ¬x4, x7, ¬x1, x2)
DPLL: change “pick unassigned var x” to “pick the next literal x from B and delete it from B; if x is already assigned, repeat”
How “good” is B?
–Depends on the backtracking process and the learning scheme
Note: a branching sequence is different from a “branching order”
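
A minimal sketch of that modified branching step (the function name and signature are my own): consume literals from B until one mentions a still-unassigned variable, and fall back to the solver's default heuristic once B is exhausted.

def pick_from_sequence(B, assignment):
    # B: list of literals (signed ints); assignment: set of assigned literals.
    while B:
        lit = B.pop(0)                        # take the next literal and delete it from B
        if lit not in assignment and -lit not in assignment:
            return lit                        # branch on this literal next
    return None                               # B exhausted: use the default heuristic

B = [1, 4, -3, 1, -8]
print(pick_from_sequence(B, {1, -4}))         # skips 1 and 4 (already assigned), returns -3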

12 Prelim: Pebbling Formulas
[Figure: a pebbling DAG with sources A, B, C labeled (a1 OR a2), (b1 OR b2), (c1 OR c2 OR c3), internal nodes E and F labeled (e1 OR e2) and (f1), and target T labeled (t1 OR t2)]
Node E is pebbled if (e1 OR e2) = 1
f_G = Pebbling(G):
–Source axioms: A, B, C are pebbled
–Pebbling axioms: A and B are pebbled ⇒ E is pebbled, …
–Target axioms: T is not pebbled
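
A hedged sketch of how these axioms could be turned into clauses (the graph representation, function name, and variable numbering are my own illustration, not the authors' encoder): a node is pebbled when the OR of its label variables is true, so each pebbling axiom expands into one clause per choice of one label variable from each predecessor.

from itertools import product

def pebbling_cnf(node_vars, preds, targets):
    # node_vars: node -> list of label variables ("node is pebbled" = OR of them)
    # preds: node -> list of predecessor nodes (empty for sources)
    # targets: nodes asserted NOT pebbled
    clauses = []
    for node, labels in node_vars.items():
        if not preds[node]:
            clauses.append(list(labels))                  # source axiom: node is pebbled
        else:
            # pebbling axiom: all predecessors pebbled ⇒ node pebbled
            for choice in product(*(node_vars[p] for p in preds[node])):
                clauses.append([-v for v in choice] + list(labels))
    for t in targets:
        for v in node_vars[t]:
            clauses.append([-v])                          # target axiom: t is not pebbled
    return clauses

# Tiny example in the spirit of the slide: sources A, B; node E has predecessors
# A and B and is also the target.
node_vars = {"A": [1, 2], "B": [3, 4], "E": [5, 6]}
preds = {"A": [], "B": [], "E": ["A", "B"]}
print(pebbling_cnf(node_vars, preds, ["E"]))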

13 Prelim: Pebbling Formulas
Can have
–Multiple targets
–Unbounded fan-in
–Large clause labels
Pebbling(G) is unsatisfiable
Removing any clause from the subgraph of each target makes it satisfiable

14 Grid vs. Randomized Pebbling
[Figure: two example pebbling graphs. The randomized pebbling graph has varying fan-in and label sizes, mixing single-variable labels such as b1, e1, f1, l1, m1 with larger labels such as (c1 ∨ c2 ∨ c3), (d1 ∨ d2 ∨ d3), and (i1 ∨ i2 ∨ i3 ∨ i4). The grid pebbling graph has uniform two-variable labels, (a1 ∨ a2), (b1 ∨ b2), …, up to the target (t1 ∨ t2).]

15 Why Pebbling?
Practically useful
–Precedence relations in tasks, fault propagation in circuits, restricted planning problems
Theoretically interesting
–Used earlier for separating proof complexity classes
–“Easy” to analyze
Hard for current best SAT solvers like zChaff
–Shown by our experiments

16 Our Result, Again
–Efficient: O(|f_G|)
–zChaff: one of the current best SAT solvers
Given a pebbling graph G, we can efficiently generate a branching sequence B_G such that zChaff(f_G, B_G) is empirically exponentially faster than zChaff(f_G).

17 The Algorithm
Input:
–Pebbling graph G
Output:
–Branching sequence B_G, with |B_G| = O(|f_G|), that works well with the 1UIP learning scheme and fast backtracking
[f_G: CNF encoding of Pebbling(G)]

18 The Algorithm: GenSeq(G)
1. Compute node heights
2. Foreach u ∈ {unit-clause-labeled nodes}, bottom up:
   Add u to G.sources
   GenSubseq(u)
3. Foreach t ∈ {targets}, bottom up:
   GenSubseq(t)

19 The Algorithm: GenSubseq(v) // trivial wrapper
1. If (|v.preds| > 0)
   GenSubseq(v, |v.preds|)

20 The Algorithm: GenSubseq(v, i)
1. u = v.preds[i]              // predecessors ordered by increasing height
2. If i = 1                    // lowest predecessor
   a. GenSubseq(u) if unvisited non-source
   b. return
3. Output u.labels             // higher predecessor
4. GenSubseq(u) if unvisitedHigh non-source
5. GenSubseq(v, i-1)           // recurse on i-1
6. GenPattern(u, v, i-1)       // repetitive pattern
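
To make the control flow of slides 18-20 concrete, here is a structural Python skeleton; the Node class, the height computation, and the handling of the unvisited/unvisitedHigh flags are my own scaffolding, and GenPattern is left as a stub because its definition is not shown on these slides.

class Node:
    def __init__(self, name, labels, preds=()):
        self.name = name
        self.labels = list(labels)       # variables labeling this node
        self.preds = list(preds)         # predecessor Nodes (empty for a source)
        self.height = None
        self.visited = False             # "unvisited" flag in GenSubseq step 2
        self.visited_high = False        # "unvisitedHigh" flag in GenSubseq step 4

def compute_height(node):
    if node.height is None:
        node.height = 1 + max((compute_height(p) for p in node.preds), default=-1)
    return node.height

def gen_pattern(u, v, i, out):
    pass                                 # repetitive pattern; defined in the paper, omitted here

def gen_subseq(v, out, i=None):
    if i is None:                        # trivial wrapper (slide 19)
        if v.preds:
            gen_subseq(v, out, len(v.preds))
        return
    preds = sorted(v.preds, key=compute_height)
    u = preds[i - 1]                     # i-th predecessor by increasing height
    if i == 1:                           # lowest predecessor
        if not u.visited and u.preds:    # unvisited non-source
            u.visited = True
            gen_subseq(u, out)
        return
    out.extend(u.labels)                 # output the higher predecessor's labels
    if not u.visited_high and u.preds:   # unvisitedHigh non-source
        u.visited_high = True
        gen_subseq(u, out)
    gen_subseq(v, out, i - 1)            # recurse on i-1
    gen_pattern(u, v, i - 1, out)        # repetitive pattern (stubbed)

def gen_seq(unit_nodes, targets):
    out = []                             # the branching sequence, as a flat list of variables
    for u in sorted(unit_nodes, key=compute_height):   # bottom up
        # (slide 18 also adds u to G.sources; omitted in this sketch)
        gen_subseq(u, out)
    for t in sorted(targets, key=compute_height):      # bottom up
        gen_subseq(t, out)
    return out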

21 Results: Grid Pebbling
–Pure DPLL: up to 60 variables
–DPLL + branching seq: up to 60 variables
–Clause learning (original zChaff): up to 4,000 variables
–Clause learning + branching seq: up to 2,000,000 variables

22 Results: Randomized Pebbling
–Pure DPLL: up to 35 variables
–DPLL + branching seq: up to 50 variables
–Clause learning (original zChaff): up to 350 variables
–Clause learning + branching seq: up to 1,000,000 variables

23 Summary
High-level problem description is useful
–Domain knowledge can help SAT solvers
Branching sequence
–One good way to encode structure
Pebbling problems: proof of concept
–Can efficiently generate a good branching sequence
–Using structure improves performance dramatically

24 Open Problems
Other domains?
–STRIPS planning problems (layered structure)
–Bounded model checking
Variable-ordering strategies from BDDs?
Other ways of exploiting structure?
–Branching “order”
–Something to guide learning?
–Domain-based tweaking of SAT algorithms

