
1 Review: Markov Logic Networks, Matthew Richardson and Pedro Domingos. Presented by Xinran (Sean) Luo, u0866707

2 Overview: Markov Networks, First-order Logic, Markov Logic Networks, Inference, Learning, Experiments

3 Markov Networks Also known as Markov random fields. Composed of ◦ An undirected graph G ◦ A set of potential functions φ_k, one per clique. Joint distribution: P(X = x) = (1/Z) ∏_k φ_k(x_{k}), where x_{k} is the state of the kth clique and Z is the partition function: Z = Σ_x ∏_k φ_k(x_{k}).

4 Markov Networks Log-linear models: each clique potential function is replaced by an exponentiated weighted sum of features of the state: P(X = x) = (1/Z) exp(Σ_j w_j f_j(x)).
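
The log-linear form can be made concrete with a small self-contained sketch (illustrative only, not code from the paper): it computes P(X = x) = (1/Z) exp(Σ_j w_j f_j(x)) over binary variables by brute-force enumeration of the joint states. The feature and weight values are made up for the example.

```python
import itertools
import math

def log_linear_prob(x, features, weights, n_vars):
    """P(X = x) = exp(sum_j w_j * f_j(x)) / Z, with Z summed by brute force."""
    def score(state):
        return math.exp(sum(w * f(state) for f, w in zip(features, weights)))
    Z = sum(score(s) for s in itertools.product([0, 1], repeat=n_vars))
    return score(x) / Z

# Hypothetical example: two binary variables, one feature that fires when they agree.
features = [lambda s: 1.0 if s[0] == s[1] else 0.0]
weights = [1.5]
print(log_linear_prob((1, 1), features, weights, n_vars=2))
```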

5 Overview: Markov Networks, First-order Logic, Markov Logic Networks, Inference, Learning, Experiments

6 First-order Logic A set of sentences or formulas in first-order logic, constructed from symbols: connectives, quantifiers, constants, variables, functions, predicates, etc.

7 Syntax for First-Order Logic Connective → ∨ | ∧ | ⇒ | ⇔ Quantifier → ∃ | ∀ Constant → A | John | Car1 Variable → x | y | z |... Predicate → Brother | Owns |... Function → father-of | plus |...
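
As a toy illustration of this grammar (not part of the original slides), first-order formulas can be encoded as nested tuples; the predicate names Smokes and Friends anticipate the running example used later in the deck.

```python
# Hypothetical nested-tuple encoding of a first-order formula; predicate
# and connective names follow the grammar on this slide.
def Smokes(x):
    return ("Smokes", x)          # predicate applied to a term

def Friends(x, y):
    return ("Friends", x, y)

# forall x forall y: Friends(x, y) => (Smokes(x) <=> Smokes(y))
formula = ("forall", "x",
           ("forall", "y",
            ("implies", Friends("x", "y"),
             ("iff", Smokes("x"), Smokes("y")))))
print(formula)
```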

8 Overview: Markov Networks, First-order Logic, Markov Logic Networks, Inference, Learning, Experiments

9 Markov Logic Networks A Markov Logic Network (MLN) L is a set of pairs (F_i, w_i) where ◦ F_i is a formula in first-order logic ◦ w_i is a real number

10 Features of a Markov Logic Network Together with a set of constants C, it defines a Markov network M_{L,C} with: ◦ For each possible grounding of each predicate in L, there is a binary node in M_{L,C}. If the ground atom is true, the node is 1; otherwise 0. ◦ For each possible grounding of each formula in L, there is a feature in M_{L,C}. If the ground formula is true, the feature is 1; otherwise 0.
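
A minimal sketch of the first bullet, with hypothetical helper names: enumerate every grounding of every predicate over a set of constants, giving one binary node per ground atom.

```python
import itertools

def ground_atoms(predicates, constants):
    """predicates: dict name -> arity; returns every ground atom as a tuple."""
    atoms = []
    for name, arity in predicates.items():
        for args in itertools.product(constants, repeat=arity):
            atoms.append((name,) + args)
    return atoms

# Hypothetical predicate set matching the example slides that follow.
predicates = {"Smokes": 1, "Cancer": 1, "Friends": 2}
constants = ["Anna", "Bob"]
print(ground_atoms(predicates, constants))   # 2 + 2 + 4 = 8 binary nodes
```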

11 Ground Term A ground term is a term containing no variables. Ground Markov network: MLNs have certain regularities in structure and parameters, so an MLN is a template for ground Markov networks.

12 Example of an MLN Suppose we have two constants: Anna (A) and Bob (B). Ground atoms shown: Cancer(A), Smokes(A), Smokes(B), Cancer(B).

13 Example of an MLN Suppose we have two constants: Anna (A) and Bob (B). Ground atoms shown: Friends(A,A), Friends(B,A), Friends(A,B), Friends(B,B).

14 Example of an MLN Suppose we have two constants: Anna (A) and Bob (B). Full ground network nodes: Cancer(A), Smokes(A), Friends(A,A), Friends(B,A), Smokes(B), Friends(A,B), Cancer(B), Friends(B,B).
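
To connect the figures to the construction, here is a hedged sketch of the second grounding step: each grounding of a formula becomes one feature (clique) over the ground atoms it mentions. The two formulas used, Smokes(x) ⇒ Cancer(x) and Friends(x, y) ⇒ (Smokes(x) ⇔ Smokes(y)), are the standard example from the Richardson and Domingos paper and are assumed here, since the slide figures only show the node labels.

```python
import itertools

def ground_formula_cliques(constants):
    """Each grounding of a formula becomes one feature over the atoms it mentions."""
    cliques = []
    # Groundings of Smokes(x) => Cancer(x)
    for x in constants:
        cliques.append((("Smokes", x), ("Cancer", x)))
    # Groundings of Friends(x, y) => (Smokes(x) <=> Smokes(y))
    for x, y in itertools.product(constants, repeat=2):
        cliques.append((("Friends", x, y), ("Smokes", x), ("Smokes", y)))
    return cliques

print(ground_formula_cliques(["Anna", "Bob"]))   # 2 + 4 = 6 ground features
```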

15 MLNs and First-Order Logic First-order KB → assign a weight to each formula → MLN. Satisfiable KB + positive weights on each formula → the MLN represents a uniform distribution over the worlds satisfying the KB. MLNs produce useful results even when the KB contains contradictions.

16 Overview: Markov Networks, First-order Logic, Markov Logic Networks, Inference, Learning, Experiments

17 Inference Given that formula F_1 holds, what is the probability that formula F_2 holds? Two steps (approximate): ◦ Find the minimal subset of the ground network needed to answer the query. ◦ (MCMC: Gibbs sampling) Sample each ground atom given its Markov blanket (the set of ground atoms that appear in some grounding of a formula with it).

18 Inference The probability of a ground atom X_l when its Markov blanket B_l is in state b_l is:
P(X_l = x_l | B_l = b_l) = exp(Σ_{f_i ∈ F_l} w_i f_i(X_l = x_l, B_l = b_l)) / ( exp(Σ_{f_i ∈ F_l} w_i f_i(X_l = 0, B_l = b_l)) + exp(Σ_{f_i ∈ F_l} w_i f_i(X_l = 1, B_l = b_l)) )
where F_l is the set of ground formulas that X_l appears in, and f_i(X_l = x_l, B_l = b_l) is the value (0 or 1) of the feature corresponding to the ith ground formula when X_l = x_l and B_l = b_l.
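
A minimal sketch of one Gibbs sampling step implementing the conditional above. The helpers are assumptions, not from the slides: features_touching(atom) returns the (weight, feature) pairs for the ground formulas containing the atom, and state maps each ground atom to 0 or 1.

```python
import math
import random

def gibbs_step(atom, state, features_touching):
    """Resample one ground atom given its Markov blanket (the rest of `state`)."""
    def score(value):
        state[atom] = value
        return math.exp(sum(w * f(state) for w, f in features_touching(atom)))
    s0, s1 = score(0), score(1)
    p1 = s1 / (s0 + s1)          # P(X_l = 1 | Markov blanket), as in the formula above
    state[atom] = 1 if random.random() < p1 else 0
    return state
```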

19 Overview: Markov Networks, First-order Logic, Markov Logic Networks, Inference, Learning, Experiments

20 Learning Data is from a relational database. Strategy: ◦ Count the number n_i(x) of true groundings of each formula in the DB. ◦ Use the pseudo-likelihood gradient:
∂/∂w_i log P*_w(X = x) = Σ_l [ n_i(x) − P_w(X_l = 0 | MB_x(X_l)) n_i(x_[X_l=0]) − P_w(X_l = 1 | MB_x(X_l)) n_i(x_[X_l=1]) ]
where n_i(x_[X_l=0]) is the number of true groundings of the ith formula when we force X_l = 0 and leave the remaining data unchanged, and similarly for n_i(x_[X_l=1]).
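
A hedged sketch of this gradient for a single weight w_i, under assumed helpers that are not in the slides: n_i(db) counts the true groundings of the ith formula in database db (a dict from ground atoms to 0/1), and cond_prob(db, atom, v) returns P_w(X_l = v | Markov blanket of X_l).

```python
def pl_gradient_i(db, atoms, n_i, cond_prob):
    """Gradient of the log pseudo-likelihood with respect to one weight w_i."""
    grad = 0.0
    for atom in atoms:
        original = db[atom]
        counts = {}
        for v in (0, 1):
            db[atom] = v             # force X_l = v, leave the rest of the data unchanged
            counts[v] = n_i(db)
        db[atom] = original          # restore the database
        grad += (counts[original]
                 - cond_prob(db, atom, 0) * counts[0]
                 - cond_prob(db, atom, 1) * counts[1])
    return grad
```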

21 Overview: Markov Networks, First-order Logic, Markov Logic Networks, Inference, Learning, Experiments

22 Experiments Systems compared: hand-built knowledge base (KB); ILP: CLAUDIEN; Markov logic networks (MLNs) ◦ using the KB ◦ using CLAUDIEN ◦ using KB + CLAUDIEN; Bayesian network learner; Naïve Bayes.

23 Results

24 Summary Markov logic networks combine first-order logic and Markov networks ◦ Syntax: First-order logic + Positive Weights ◦ Semantics: Templates for Markov networks Inference: Minimal subset + Gibbs Learning: Pseudo-likelihood

