Logic Programming & LPMLN


1 Logic Programming & LPMLN

2 Table of Contents
Logic programming
Answer set programming
Markov logic network
LPMLN

3 Logic Programming

4 Definitions
Term: a constant c, a variable x, or a function term f(t1, ..., tn)
Atom: P(t1, ..., tn)
Literal: A | ¬A
Substitution: ΞΈ = {x/a, y/z}

5 Rule & Fact
A rule encodes the implication p ∧ q ∧ … ∧ t → u, i.e. the clause Β¬p ∨ Β¬q ∨ … ∨ Β¬t ∨ u.
u :- p, q, ..., t. (Rule)
u. (Fact: a rule with an empty body)
head :- body.

6 Basic Datalog
L0 :- L1, L2, ..., Ln
Li is a literal of the form pi(t1, t2, ..., tki)
tj is either a constant or a variable
Safety condition for program P:
Each fact of P is ground
Each variable which occurs in the head of a rule of P must also occur in the body of the same rule
These conditions guarantee that the set of all facts that can be derived from a Datalog program is finite

7 Datalog Rule Example
If X is a parent of Y and Y is a parent of Z, then X is a grandparent of Z.
grandpar(Z, X) :- par(Y, X), par(Z, Y).
Same Generation Cousin:
sgc(X, X) :- person(X).
sgc(X, Y) :- par(X, X1), sgc(X1, Y1), par(Y, Y1).

8 Herbrand Model
A set of ground facts I that satisfies a clause C or a set of clauses S is called a Herbrand model for C or S.
Intersection property: if I1 and I2 are Herbrand models, then I1 ∩ I2 is also a Herbrand model.
Least Herbrand model: the intersection of all Herbrand models.

9 Proof Theory
Elementary Production Principle: given a rule L0 :- L1, L2, ..., Ln and facts F1, F2, ..., Fn, if there exists a substitution ΞΈ such that LiΞΈ = Fi for each i, then L0ΞΈ can be inferred as a new fact.
A proof tree records the rule applications that derive a fact.

10 Bottom-up Evaluation
r1: sgc(X, X) :- person(X).
r2: sgc(X, Y) :- par(X, X1), sgc(X1, Y1), par(Y, Y1).
person(a). person(b). person(c).
par(a, c). par(b, c).
Derived by r1: sgc(a, a). sgc(b, b). sgc(c, c).
Derived by r2, e.g.: sgc(a, b) :- par(a, c), sgc(c, c), par(b, c).
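As an illustration, the naΓ―ve bottom-up evaluation above can be sketched in Python. This is a hand-coded engine for just these two rules, not a general Datalog evaluator:

```python
# NaΓ―ve bottom-up evaluation for the sgc program: apply every rule to the
# whole fact set, over and over, until a fixpoint is reached.

def naive_eval(facts):
    facts = set(facts)
    while True:
        new = set()
        # r1: sgc(X, X) :- person(X).
        for fact in facts:
            if fact[0] == "person":
                new.add(("sgc", fact[1], fact[1]))
        # r2: sgc(X, Y) :- par(X, X1), sgc(X1, Y1), par(Y, Y1).
        pars = [f for f in facts if f[0] == "par"]
        sgcs = [f for f in facts if f[0] == "sgc"]
        for (_, x, x1) in pars:
            for (_, sx, sy) in sgcs:
                if sx == x1:
                    for (_, y, y1) in pars:
                        if y1 == sy:
                            new.add(("sgc", x, y))
        if new <= facts:          # fixpoint: nothing new was derived
            return facts
        facts |= new

facts = {("person", "a"), ("person", "b"), ("person", "c"),
         ("par", "a", "c"), ("par", "b", "c")}
result = naive_eval(facts)
print(sorted(f for f in result if f[0] == "sgc"))
```

On the facts above this derives sgc(a, a), sgc(a, b), sgc(b, a), sgc(b, b), and sgc(c, c).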

11 Top-down Evaluation
?- sgc(a, Y).
  person(a).      -> Y = {a}
  par(a, X1).     -> X1 = {c}
  sgc(c, Y1).
    person(c).    -> Y1 = {c}
    par(c, X1).   -> X1 = {}
  ...             -> Y1 = {c}
  par(Y, Y1).     -> Y = {b}
-> Y = {a, b}

12 Problem Solving Example
word(d,o,g). word(r,u,n). word(t,o,p).
word(f,i,v,e). word(f,o,u,r). word(l,o,s,t). word(m,e,s,s). word(u,n,i,t).
word(b,a,k,e,r). word(f,o,r,u,m). word(g,r,e,e,n). word(s,u,p,e,r).
word(p,r,o,l,o,g). word(v,a,n,i,s,h). word(w,o,n,d,e,r). word(y,e,l,l,o,w).

13 Problem Solving Example
solution(L1,L2,L3,L4,L5,L6,L7,L8,L9,L10,L11,L12,L13,L14,L15,L16) :-
    word(L1,L2,L3,L4,L5),
    word(L9,L10,L11,L12,L13,L14),
    word(L1,L6,L9,L15),
    word(L3,L7,L11),
    word(L5,L8,L13,L16).

14 Bottom-up Evaluation Optimization
NaΓ―ve approach: iteratively apply rules to all facts until no new fact can be derived.
Problems:
The same fact is derived many times.
Facts irrelevant to the goal are also derived.
Solutions:
Semi-naΓ―ve approach
Magic set rewriting
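The semi-naΓ―ve idea can be sketched for the sgc program: in each round, the recursive subgoal is joined only against the facts derived in the previous round (the "delta"). This is hand-coded for these rules; a real engine rewrites the rules with delta relations automatically:

```python
# Semi-naΓ―ve evaluation for the sgc program: each round only re-derives
# through facts that are new since the last round, avoiding repeated work.

def semi_naive(persons, pars):
    # r1: sgc(X, X) :- person(X).  fires once, from the base facts.
    sgc = {(x, x) for x in persons}
    delta = set(sgc)
    while delta:
        new = set()
        # r2: sgc(X, Y) :- par(X, X1), sgc(X1, Y1), par(Y, Y1),
        # with the recursive sgc subgoal restricted to the delta.
        for (x, x1) in pars:
            for (sx, sy) in delta:
                if sx == x1:
                    for (y, y1) in pars:
                        if y1 == sy:
                            new.add((x, y))
        delta = new - sgc     # keep only genuinely new facts
        sgc |= delta
    return sgc

print(sorted(semi_naive({"a", "b", "c"}, {("a", "c"), ("b", "c")})))
```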

15 Answer Set Programming

16 Rules with Negation
Closed world assumption: if a fact does not logically follow from a set of Datalog clauses, then we conclude that the negation of this fact is true.
Problem: multiple minimal Herbrand models.
boring(chess) :- not interesting(chess).
Ha = {interesting(chess)}, Hb = {boring(chess)}

17 Stable Model
For a ground program Ξ  and a candidate set of atoms M, the reduct Ξ ^M is defined as follows:
Drop every rule whose body contains a negative literal not L with L ∈ M.
Drop all remaining negative literals from the bodies of the surviving rules.
The reduct thus contains no negation. If the minimal Herbrand model of Ξ ^M coincides with M, then M is a stable model of Ξ .

18 Examples
p :- not q.
q :- not p.
Candidate M -> minimal model of reduct:
{} -> {p, q}
{p} -> {p}
{q} -> {q}
{p, q} -> {}
Stable models are {p} and {q}.

a :- not a.
{} -> {a}
{a} -> {}
No stable model.
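The reduct-based definition can be checked mechanically for small ground programs. The sketch below enumerates every candidate set and compares it with the minimal model of its reduct (the rule encoding is my own, not from the slides):

```python
# Stable models by brute force. A rule is (head, positive_body, negative_body).

from itertools import chain, combinations

def minimal_model(pos_rules):
    """Least Herbrand model of a negation-free ground program."""
    m = set()
    changed = True
    while changed:
        changed = False
        for head, body in pos_rules:
            if set(body) <= m and head not in m:
                m.add(head)
                changed = True
    return m

def stable_models(rules, atoms):
    models = []
    subsets = chain.from_iterable(
        combinations(atoms, r) for r in range(len(atoms) + 1))
    for cand in map(set, subsets):
        # Gelfond-Lifschitz reduct: drop rules containing "not a" with
        # a in the candidate; delete the remaining negative literals.
        reduct = [(h, pb) for h, pb, nb in rules if not (set(nb) & cand)]
        if minimal_model(reduct) == cand:
            models.append(cand)
    return models

# p :- not q.   q :- not p.
rules = [("p", (), ("q",)), ("q", (), ("p",))]
print(stable_models(rules, ["p", "q"]))             # [{'p'}, {'q'}]
print(stable_models([("a", (), ("a",))], ["a"]))    # a :- not a.  -> []
```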

19 Constraints
a :- B, not a.
If B is true, the rule becomes a :- not a., so the program has no stable model. We can therefore write such a rule as the constraint :- B.

20 Multiple Literals in Rule Head
For disjunctive rules, M is a stable model of Π if M is among the minimal Herbrand models of the reduct Π^M.
For example:
a; b :- c, d.
c.
d.
Only {a, c, d} and {b, c, d} are stable models; {a, b, c, d} is not.

21 Example: Solving Hamiltonian Cycle
vertex(a). vertex(b). vertex(c). vertex(d).
edge(a, b). edge(b, c). edge(c, d). edge(d, a). edge(b, d).
in(X, Y) :- edge(X, Y), not nin(X, Y).
nin(X, Y) :- edge(X, Y), not in(X, Y).
:- in(X, Y), in(X1, Y), X != X1.
:- in(X, Y), in(X, Y1), Y != Y1.
reachable(X, X) :- vertex(X).
reachable(X, Y) :- vertex(X), vertex(Y), reachable(X, X1), in(X1, Y).
:- vertex(X), vertex(Y), not reachable(X, Y).
Online ASP Solver:
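What the ASP program computes can be cross-checked by brute force: keep exactly one incoming and one outgoing chosen edge per vertex and require that following the chosen edges visits every vertex. This is a plain enumeration in Python, not an ASP solver:

```python
# Brute-force Hamiltonian-cycle search on the graph from the ASP program.

from itertools import combinations

vertices = ["a", "b", "c", "d"]
edges = [("a", "b"), ("b", "c"), ("c", "d"), ("d", "a"), ("b", "d")]

def is_cycle(chosen):
    outs = sorted(x for x, _ in chosen)
    ins = sorted(y for _, y in chosen)
    # the :- constraints: one in-edge and one out-edge per vertex
    if outs != vertices or ins != vertices:
        return False
    # the reachability constraint: following successors from 'a'
    # must visit every vertex
    succ = dict(chosen)
    seen, v = set(), "a"
    while v not in seen:
        seen.add(v)
        v = succ[v]
    return seen == set(vertices)

cycles = [c for c in combinations(edges, len(vertices)) if is_cycle(c)]
print(cycles)   # [(('a', 'b'), ('b', 'c'), ('c', 'd'), ('d', 'a'))]
```

For this graph there is exactly one Hamiltonian cycle, a -> b -> c -> d -> a, matching the unique stable model's in/2 atoms.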

22 Markov Logic Network

23 Markov Network
Node: random variable
Edge: variable dependency
Potential function Ο†k: a non-negative function over the variables of clique k
P(X = x) = (1/Z) Ξ k Ο†k(x{k}),  Z = Ξ£xβˆˆπ’³ Ξ k Ο†k(x{k})
Log-linear model: P(X = x) = (1/Z) exp(Ξ£j wj fj(x))
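A numeric sketch of the log-linear form, with two binary variables and hypothetical features and weights (f1, f2, w1, w2 are illustrative, not from the slides):

```python
# Log-linear model over two binary variables: weight each state by
# exp(sum of weighted features), then normalize by the partition function Z.

from itertools import product
from math import exp

w = [1.0, 2.0]                                  # assumed weights
feats = [lambda x: x[0], lambda x: x[0] * x[1]]  # assumed features

def unnorm(x):
    return exp(sum(wj * fj(x) for wj, fj in zip(w, feats)))

states = list(product([0, 1], repeat=2))
Z = sum(unnorm(x) for x in states)              # partition function
P = {x: unnorm(x) / Z for x in states}
print(P)
```

The probabilities sum to one, and the state (1, 1), which activates both features, receives the largest mass.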

24 Markov Logic Network Definition
A Markov logic network L is a set of pairs (Fi, wi), where Fi is a formula in first-order logic and wi is a real number. Together with a finite set of constants C = {c1, c2, ..., cn}, it defines a Markov network M(L, C):
1. M(L, C) contains one binary node for each possible grounding of each predicate appearing in L. The value of the node is 1 if the ground atom is true, and 0 otherwise.
2. M(L, C) contains one feature for each possible grounding of each formula Fi in L. The value of this feature is 1 if the ground formula is true, and 0 otherwise. The weight of the feature is the wi associated with Fi in L.

25 Example

26 Example
Given constants A and B, consider the last two formulas in the table: Sm(x) β‡’ Ca(x) and Fr(x, y) β‡’ (Sm(x) ⇔ Sm(y)).
Ground atoms: Fr(A, A), Fr(A, B), Fr(B, A), Fr(B, B), Sm(A), Sm(B), Ca(A), Ca(B)
Ground formulas (in clause form):
Β¬Sm(A) ∨ Ca(A)
Β¬Sm(B) ∨ Ca(B)
Β¬Fr(A, A) ∨ Sm(A) ∨ Β¬Sm(A)
Β¬Fr(A, A) ∨ Β¬Sm(A) ∨ Sm(A)
Β¬Fr(B, B) ∨ Sm(B) ∨ Β¬Sm(B)
Β¬Fr(B, B) ∨ Β¬Sm(B) ∨ Sm(B)
Β¬Fr(A, B) ∨ Sm(A) ∨ Β¬Sm(B)
Β¬Fr(A, B) ∨ Β¬Sm(A) ∨ Sm(B)
Β¬Fr(B, A) ∨ Sm(B) ∨ Β¬Sm(A)
Β¬Fr(B, A) ∨ Β¬Sm(B) ∨ Sm(A)
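The ground network above can be evaluated exhaustively for two constants. The weights w1 and w2 below are hypothetical, since the slide's weight table was not captured:

```python
# Exhaustive evaluation of the two-constant smoking MLN. A world assigns
# True/False to every ground atom; its weight is exp(sum of the weights
# of the ground formulas it satisfies).

from itertools import product
from math import exp

consts = ["A", "B"]
atoms = ([f"Sm({c})" for c in consts] + [f"Ca({c})" for c in consts]
         + [f"Fr({x},{y})" for x in consts for y in consts])
w1, w2 = 1.5, 1.1   # assumed weights, for illustration only

def weight(world):
    total = 0.0
    for c in consts:
        # Sm(c) -> Ca(c)
        if (not world[f"Sm({c})"]) or world[f"Ca({c})"]:
            total += w1
    for x in consts:
        for y in consts:
            # Fr(x, y) -> (Sm(x) <-> Sm(y))
            if (not world[f"Fr({x},{y})"]) or (world[f"Sm({x})"] == world[f"Sm({y})"]):
                total += w2
    return exp(total)

worlds = [dict(zip(atoms, vals))
          for vals in product([False, True], repeat=len(atoms))]
Z = sum(weight(wd) for wd in worlds)
# e.g. the marginal probability that A has cancer under this network
p_ca = sum(weight(wd) for wd in worlds if wd["Ca(A)"]) / Z
print(round(p_ca, 3))
```

With these weights the marginal lands a bit above one half: satisfying Sm(A) β‡’ Ca(A) is rewarded, which pulls Ca(A) toward true whenever Sm(A) is true.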

27 Example

28 LPMLN: Weighted Rules under the Stable Model Semantics

29 LPMLN Rule and Definition
Rule format: w : R, where R is a rule and w is a real-valued weight.
Hard rule: w = Ξ±.
For an LPMLN program Ξ :
Π̄ = { R : w:R ∈ Π } (the rules with weights dropped)
Ξ I = the set of rules w:R in Ξ  such that I ⊨ R
SM[Ξ ] = { I : I is a stable model of Ξ I }
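The weight and probability of an interpretation (the slide content here was an image) are defined in Lee and Wang's paper roughly as follows; this is a reconstruction from the paper, not from the slide:

```latex
W_\Pi(I) =
\begin{cases}
\exp\Bigl(\sum_{w:R \,\in\, \Pi_I} w\Bigr) & \text{if } I \in \mathrm{SM}[\Pi],\\[4pt]
0 & \text{otherwise,}
\end{cases}
\qquad
P_\Pi(I) = \lim_{\alpha \to \infty}
\frac{W_\Pi(I)}{\sum_{J \in \mathrm{SM}[\Pi]} W_\Pi(J)}.
```

Taking the limit in Ξ± makes hard rules (weight Ξ±) dominate: any stable model violating a hard rule gets probability 0 whenever some stable model satisfies all of them.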

30 LPMLN Definition

31 Example

32 Example

33 Relating LPMLN to ASP & MLN

34 LPMLN Implementation

35 Reformulating LPMLN Based on Penalty

36 Turning LPMLN into ASP

37 Turning LPMLN into MLN

38 References
Ceri, Stefano, Georg Gottlob, and Letizia Tanca. "What you always wanted to know about Datalog (and never dared to ask)." IEEE Transactions on Knowledge and Data Engineering 1.1 (1989).
Brewka, Gerhard, Thomas Eiter, and MirosΕ‚aw TruszczyΕ„ski. "Answer set programming at a glance." Communications of the ACM 54.12 (2011).
Richardson, Matthew, and Pedro Domingos. "Markov logic networks." Machine Learning (2006).
Lee, Joohyung, and Yi Wang. "Weighted Rules under the Stable Model Semantics." KR.
Lee, Joohyung, Samidh Talsania, and Yi Wang. "Computing LPMLN using ASP and MLN solvers." Theory and Practice of Logic Programming (2017).

39 Thank you

