“Using Weighted MAX-SAT Engines to Solve MPE” -- paper by James D. Park; presented by Shuo (Olivia) Yang.

Presentation transcript:

“Using Weighted MAX-SAT Engines to Solve MPE” -- paper by James D. Park; presented by Shuo (Olivia) Yang

MPE and MAX-SAT
MPE (Most Probable Explanation)
– The problem of finding the variable instantiation of a Bayesian network that has the highest probability given some evidence.
– An NP-complete problem.
Weighted MAX-SAT
– The problem of taking a set of clauses with associated weights and finding the instantiation that produces the largest sum of the weights of the satisfied clauses.
– Used, for example, to resolve conflicts in a knowledge base.

Using Weighted MAX-SAT to Solve MPE
Reduce MPE to weighted MAX-SAT.
Apply incomplete (local search) methods for MAX-SAT to produce a solution:
– Discrete Lagrangian Multipliers
– Guided Local Search
MAX-SAT algorithms proved to be more powerful in solving large problems.

MPE and MAX-SAT
Instantiation
– An instantiation of a set of variables is a function that assigns a value to each variable in the set.
Compatible
– Two instantiations are compatible if they agree on the assignment of all the variables that they have in common.
Eg:
– Variable A has values {a1, a2, a3}; variable B has values {b1, b2}.
– The instantiation (a1) is compatible with the instantiations (a1, b2), (a1, b1), (b1), (b2), but not with (a2, b1) or (a2, b2).

MPE and MAX-SAT
Conditional probability table (CPT)
– A conditional probability table T for a variable V with a set of parent variables P is a function that maps each instantiation of V ∪ P to a real value in [0,1] such that for any instantiation p of P, ∑_v T({v} ∪ p) = 1, where v ranges over the values of V.
Eg: the CPTs for the Bayesian network A -> B are

A   Pr(A)
a1  0.3
a2  0.5
a3  0.2

A   B   Pr(B|A)
a1  b1  0.2
a1  b2  0.8
a2  b1  1
a2  b2  0
a3  b1  0.6
a3  b2  0.4

MPE and MAX-SAT
Bayesian network
– A pair (G, P), where G is a directed acyclic graph whose nodes are variables, and P is a set consisting of the CPT of each variable in G, where the parents in each CPT correspond to the parents of the corresponding variable in the graph.
– The probability of a complete instantiation is the product of the one entry of each CPT that is compatible with the instantiation.
Eg: the probability of the instantiation (a3, b1) in the example above is Pr(a3) * Pr(b1|a3) = 0.2 * 0.6 = 0.12.
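To make the "product of compatible CPT entries" rule concrete, here is a minimal Python sketch (illustrative only; the dictionary-based CPT layout and the function name are assumptions, not taken from the paper) that scores a complete instantiation of the A -> B network above.

```python
# Each CPT is keyed by an assignment to its scope (parents first, then the variable).
cpt_A = {("a1",): 0.3, ("a2",): 0.5, ("a3",): 0.2}
cpt_B = {("a1", "b1"): 0.2, ("a1", "b2"): 0.8,
         ("a2", "b1"): 1.0, ("a2", "b2"): 0.0,
         ("a3", "b1"): 0.6, ("a3", "b2"): 0.4}

def probability(instantiation, cpts):
    """Probability of a complete instantiation: the product of the single
    entry of each CPT that is compatible with the instantiation."""
    p = 1.0
    for scope, table in cpts:
        key = tuple(instantiation[var] for var in scope)
        p *= table[key]
    return p

cpts = [(("A",), cpt_A), (("A", "B"), cpt_B)]
print(probability({"A": "a3", "B": "b1"}, cpts))  # 0.2 * 0.6 = 0.12
```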

MPE and MAX-SAT
Literal
– A variable or its negation.
Clause
– A disjunction of literals; a weighted clause is a clause with an associated weight.
Weighted CNF formula
– A set of weighted clauses.
– The weight of a complete instantiation of a weighted CNF formula is the sum of the weights of the satisfied clauses.
Eg:
– Given the weighted CNF (x v ~y v ~z) with weight 3, (~x) with weight 10.1, and (y) with weight 0.5,
– the instantiation (x, y, ~z) has weight 3 + 0.5 = 3.5.
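The weight of an instantiation under a weighted CNF formula can be computed the same way; the sketch below (again an illustrative layout, not the paper's code) encodes each literal as a (variable, polarity) pair and reproduces the 3.5 of the example.

```python
# A weighted clause is (list of literals, weight); a literal is (variable, polarity).
clauses = [
    ([("x", True), ("y", False), ("z", False)], 3.0),  # (x v ~y v ~z), weight 3
    ([("x", False)], 10.1),                            # (~x), weight 10.1
    ([("y", True)], 0.5),                              # (y), weight 0.5
]

def weight(assignment, clauses):
    """Sum of the weights of the clauses the assignment satisfies."""
    total = 0.0
    for literals, w in clauses:
        if any(assignment[var] == polarity for var, polarity in literals):
            total += w
    return total

print(weight({"x": True, "y": True, "z": False}, clauses))  # 3 + 0.5 = 3.5
```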

MPE
Given a Bayesian network and an instantiation of a subset of the variables (the evidence), find the (not necessarily unique) complete instantiation with the highest probability that is compatible with the evidence.
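In symbols, writing Pr for the network distribution and e for the evidence, this is:

```latex
\mathrm{MPE}(e) \;=\; \operatorname*{arg\,max}_{x \text{ compatible with } e} \Pr(x)
```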

Reduce MPE to MAX-SAT
Weighted clauses
– The only instantiations that do not satisfy the clause l1 v l2 v … v ln are the instantiations in which each literal in the clause evaluates to false.
– Each row in a CPT generates one weighted clause, which contains the negation of each variable value in the row and is weighted with the negative log of that row's conditional probability.

Reduce MPE to MAX-SAT
Theorem 1
– For any instantiation I of a positive Bayesian network (one with no zero CPT entries) that contains only binary variables, the sum of the weights of the clauses that I leaves unsatisfied in the induced weighted CNF is the negative log of the probability of I.
– Maximizing the weight of the satisfied clauses minimizes the sum of the weights of the unsatisfied clauses, which is equivalent to maximizing the probability in the original network.
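Writing unsat(I) for the set of clauses I leaves unsatisfied and w_C for a clause's weight, the conclusion of Theorem 1 reads:

```latex
\sum_{C \,\in\, \mathrm{unsat}(I)} w_C \;=\; -\log \Pr(I)
```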

Reduce MPE to MAX-SAT
Weighted clauses (eg.)
– Given a network C -> D with CPTs:

C   Pr(C)
c   0.3
~c  0.7

C   D   Pr(D|C)
c   d   0.2
c   ~d  0.8
~c  d   0.1
~c  ~d  0.9

– It induces the weighted CNF expression (~c) with weight -log .3, (c) with weight -log .7, (~c v ~d) with weight -log .2, (~c v d) with weight -log .8, (c v ~d) with weight -log .1, and (c v d) with weight -log .9.
– Given the instantiation (c, ~d), the unsatisfied clauses are (~c), with weight -log .3, and (~c v d), with weight -log .8.
– The sum of the weights of the unsatisfied clauses is -log .24, the negative log of the probability of (c, ~d).
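A minimal Python sketch of this row-by-row reduction (an illustration under the clause representation used in the earlier sketches; BIG_WEIGHT is a hypothetical stand-in for the w0 introduced on the next slide):

```python
import math

BIG_WEIGHT = 1e6  # assumed placeholder for w0, used when a CPT entry is zero

def cpt_to_clauses(scope, table):
    """Each CPT row (an assignment over `scope` with probability p) becomes a
    clause containing the negation of every literal in the row, weighted with
    -log p (natural log here; any base works for the argument)."""
    clauses = []
    for row, p in table.items():
        literals = [(var, not value) for var, value in zip(scope, row)]
        w = BIG_WEIGHT if p == 0 else -math.log(p)
        clauses.append((literals, w))
    return clauses

# The C -> D example: the row (c, d) with probability 0.2 yields the clause
# (~c v ~d) with weight -log 0.2, and so on.
cpt_C = {(True,): 0.3, (False,): 0.7}
cpt_D = {(True, True): 0.2, (True, False): 0.8,
         (False, True): 0.1, (False, False): 0.9}
induced = cpt_to_clauses(("C",), cpt_C) + cpt_to_clauses(("C", "D"), cpt_D)
```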

Reduce MPE to MAX-SAT
Handling zeros
– The log of zero is undefined.
– Give clauses whose CPT entry is zero a sufficiently large weight w0 in place of -log 0.
– Theorem 2: Let w0 be greater than the sum of the negative logs of the non-zero CPT entries of a Bayesian network with binary variables. Let the clauses whose associated CPT entries are zero have weight w0, with the other weights as before. Then any positive probability instantiation I has a higher score for the weighted CNF expression than any zero probability instantiation. Additionally, the sum of the weights of the unsatisfied clauses for I remains -log Pr(I).
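In symbols, with T ranging over the network's CPTs and p over their rows, the condition on w0 is:

```latex
w_0 \;>\; \sum_{T}\ \sum_{p \,:\, T(p) > 0} -\log T(p)
```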

Reduce MPE to MAX-SAT
Beyond binary variables
– Each value of a multi-valued network variable is represented by its own propositional indicator variable.
– Any locally maximal instantiation of the induced weighted CNF expression satisfies the constraint that exactly one of the indicator variables is true for each network variable. Additionally, the weight of the unsatisfied clauses for a positive probability instantiation I remains -log Pr(I).

Reduce MPE to MAX-SAT
Entering evidence
– Replace the propositional variables that correspond to the evidence with their appropriate values.
– Drop any clause that contains true, and remove false from all clauses.
– Eg: entering the evidence (a1) sets a1 = true and a2 = a3 = false, leaving the weighted CNF formula () with weight -log .3, (~b1) with weight -log .2, and (~b2) with weight -log .8.
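A small Python sketch of this conditioning step (illustrative, reusing the clause representation from the sketches above; the function name is an assumption):

```python
def condition(clauses, evidence):
    """Enter evidence: clauses containing a literal made true are dropped;
    literals made false are removed from the remaining clauses."""
    simplified = []
    for literals, w in clauses:
        kept, satisfied = [], False
        for var, polarity in literals:
            if var in evidence:
                if evidence[var] == polarity:
                    satisfied = True   # clause contains "true": drop the clause
                    break
                # literal is "false": omit it from the clause
            else:
                kept.append((var, polarity))
        if not satisfied:
            simplified.append((kept, w))  # may become the empty clause ()
    return simplified
```

Applied to the induced clauses of the A -> B example with evidence {a1: True, a2: False, a3: False}, this yields the three weighted clauses listed on the slide.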

MAX-SAT Algorithms
Local search
– A general optimization technique.
Hill climbing
– Works by repeatedly improving the current solution by moving to a better neighboring solution.
Neighbors of an instantiation
– The instantiations produced by changing which indicator corresponding to a particular network variable is set.
Eg:
– The neighbors of (a1, ~a2, ~a3, b1, ~b2) are (a1, ~a2, ~a3, ~b1, b2), (~a1, a2, ~a3, b1, ~b2), and (~a1, ~a2, a3, b1, ~b2).
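The neighborhood is easy to enumerate; a sketch (assuming the indicator variables are grouped by network variable, as in the example):

```python
def neighbors(assignment, variable_groups):
    """All assignments obtained by changing which indicator is set for
    exactly one network variable."""
    result = []
    for group in variable_groups:               # e.g. ["a1", "a2", "a3"]
        current = next(v for v in group if assignment[v])
        for other in group:
            if other != current:
                flipped = dict(assignment)
                flipped[current], flipped[other] = False, True
                result.append(flipped)
    return result

start = {"a1": True, "a2": False, "a3": False, "b1": True, "b2": False}
groups = [["a1", "a2", "a3"], ["b1", "b2"]]
# neighbors(start, groups) yields the three neighbors listed on the slide.
```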

MAX-SAT Algorithms
p: the problem instance. hillClimb: a routine that iterates through the variables, selecting the best change for each, until a local minimum is reached.
It also computes the true score at each step and remembers the best state encountered.
The MAX-SAT algorithms considered differ only in the way in which they generate their objective functions.
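A simplified sketch of such a hill-climbing loop (an assumption about its structure, not the paper's code: `neighbors_fn` could be the neighbor generator from the previous sketch, `cost` is whatever objective DLM or GLS supplies, and `true_score` is the plain weighted MAX-SAT score used to remember the best state):

```python
def hill_climb(assignment, neighbors_fn, cost, true_score):
    """Greedy local search: move to the lowest-cost neighbor until a local
    minimum of the cost is reached, tracking the best state seen under the
    true weighted MAX-SAT score."""
    best_state, best_score = dict(assignment), true_score(assignment)
    while True:
        best_neighbor = min(neighbors_fn(assignment), key=cost)
        if cost(best_neighbor) >= cost(assignment):
            return best_state, best_score        # local minimum of the cost
        assignment = best_neighbor
        if true_score(assignment) > best_score:
            best_state, best_score = dict(assignment), true_score(assignment)
```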

MAX-SAT Algorithms
Discrete Lagrangian Multipliers
– Cost function: a discrete Lagrangian summed over the unsatisfied clauses C (the formula is sketched below).
– Initially, each Lagrange multiplier is zero; each time a local minimum is encountered, the multipliers corresponding to the unsatisfied clauses are increased by adding a constant.
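The formula itself did not survive the transcript; assuming the standard discrete Lagrangian formulation for weighted MAX-SAT, the cost being minimized has the form below, where unsat(I) is the set of clauses I leaves unsatisfied, w_C is the clause weight, and lambda_C is the clause's multiplier.

```latex
\mathcal{L}(I, \lambda) \;=\; \sum_{C \,\in\, \mathrm{unsat}(I)} \bigl( w_C + \lambda_C \bigr)
```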

MAX-SAT Algorithms
Guided Local Search
– A heuristically developed method for solving combinatorial optimization problems.
– Initial cost function: the number of unsatisfied clauses.
– Objective function: the base cost augmented with penalties on the unsatisfied clauses C (a reconstruction follows).
– When a local minimum is reached, the penalties of some of the unsatisfied clauses are increased.
– GLS provided the best overall performance in the experimental tests, compared with SLS and DLM.
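The GLS objective is likewise missing from the transcript; a plausible reconstruction, assuming the usual GLS penalty scheme, adds a penalty lambda_C for each unsatisfied clause on top of the base count:

```latex
h(I) \;=\; \sum_{C \,\in\, \mathrm{unsat}(I)} \bigl( 1 + \lambda_C \bigr)
```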