1 Semantics of Datalog With Negation Local Stratification Stable Models Well-Founded Models

2 The Story So Far
- When there is no (IDB) negation, there is a unique minimal model (least fixedpoint), which is the accepted meaning of the Datalog program.
- With negation, we often have several minimal models, and we need to decide which one is meant by the program.

3 The Story So Far
- When the program is stratified, one minimal model is the stratified model.
  - This model appears in all cases to be the one we intuitively want.
  - Important technical point: if the program actually has no negation, then the stratified model is the unique minimal model.
  - Thus, stratified semantics extends least-fixedpoint semantics.

4 What About Unstratified Datalog?
- There are some more general conditions under which an “accepted” choice among models exists.
- From least to most general: locally stratified models, modularly stratified models, stable/well-founded models.

5 Why Should We Care?
1. Solidify our understanding of when declarative assertions, like logical rules, lead to a meaningful description of something.
2. SQL recursion really deals with ambiguities of the same kind, especially regarding aggregations, as well as negation.

6 Ground Atoms
- All these approaches start by instantiating the rules: replace variables by constants in all possible ways, and throw away instances of the rules with a known false EDB subgoal.
- An atom with no variables is a ground atom.
  - Like propositions in propositional calculus.

7 Example: Ground Atoms
- Consider the Win program:
  win(X) :- move(X,Y) & NOT win(Y)
  with the moves move(1,2), move(1,3), and move(2,3) (a game graph on positions 1, 2, 3).
- The surviving instantiated rules are:
  win(1) :- move(1,2) & NOT win(2)
  win(1) :- move(1,3) & NOT win(3)
  win(2) :- move(2,3) & NOT win(3)

8 Example --- Continued
- Other instantiations of the rule have a false move subgoal and therefore cannot infer anything.
- win(1), win(2), and win(3) are the only relevant IDB ground atoms for this game.
  win(1) :- move(1,2) & NOT win(2)
  win(1) :- move(1,3) & NOT win(3)
  win(2) :- move(2,3) & NOT win(3)
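To make the instantiation step concrete, here is a minimal Python sketch (an illustration of my own, not from the slides): it enumerates all bindings for X and Y over the constants in the move relation and keeps only instances whose EDB subgoal move(X,Y) is true.

    # Hypothetical sketch of instantiating win(X) :- move(X,Y) & NOT win(Y).
    move = {(1, 2), (1, 3), (2, 3)}                 # EDB facts from the slide
    constants = {c for pair in move for c in pair}

    instantiated = []
    for x in constants:
        for y in constants:
            if (x, y) in move:                      # drop instances with a false EDB subgoal
                instantiated.append((f"win({x})", [f"move({x},{y})", f"NOT win({y})"]))

    for head, body in instantiated:
        print(head, ":-", " & ".join(body))
    # Prints exactly the three surviving rules; the relevant IDB ground atoms
    # are win(1), win(2), win(3).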

9 Locally Stratified Models
1. Build a dependency graph with:
   - Nodes = relevant IDB ground atoms.
   - Arc p -> q iff q appears in an instantiated body with head p.
   - Label - on the arc if q is negated.
2. The stratum of each node is defined as before.
3. Locally stratified = finite strata only.

10 Example
win(1) :- move(1,2) & NOT win(2)
win(1) :- move(1,3) & NOT win(3)
win(2) :- move(2,3) & NOT win(3)
Dependency graph: arcs win(1) -> win(2), win(1) -> win(3), and win(2) -> win(3), each labeled -.
Strata: win(3) is in stratum 0, win(2) in stratum 1, win(1) in stratum 2.
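The stratum assignment can be computed mechanically. Below is a small Python sketch of my own (not part of the slides): it raises the stratum of a rule head above the stratum of each negated subgoal until nothing changes, and rejects programs whose strata grow without bound.

    # Hypothetical sketch: strata for the ground dependency graph above.
    atoms = ["win(1)", "win(2)", "win(3)"]
    # arcs (head, subgoal, negated?) taken from the instantiated rules
    arcs = [("win(1)", "win(2)", True),
            ("win(1)", "win(3)", True),
            ("win(2)", "win(3)", True)]

    stratum = {a: 0 for a in atoms}
    changed = True
    while changed:
        changed = False
        for head, sub, negated in arcs:
            need = stratum[sub] + (1 if negated else 0)
            if stratum[head] < need:
                if need > len(arcs):      # a finite stratum never exceeds the number of arcs
                    raise ValueError("infinite strata: not locally stratified")
                stratum[head], changed = need, True

    print(stratum)   # {'win(1)': 2, 'win(2)': 1, 'win(3)': 0}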

11 Locally Stratified Model
Include all EDB ground atoms;
FOR (stratum i = 0, 1, …) DO
    WHILE (changes occur) DO
        IF (some rule for some p at stratum i has a true body)
        THEN add p to model;

12 Example
win(1) :- move(1,2) & NOT win(2)
win(1) :- move(1,3) & NOT win(3)
win(2) :- move(2,3) & NOT win(3)
1. Stratum 0: there are no rules for win(3); therefore win(3) is false.
2. Stratum 1: the win(2) rule is satisfied (NOT win(3) holds); therefore win(2) is true.
3. Stratum 2: the second rule for win(1) is satisfied (NOT win(3) holds); therefore win(1) is also true.
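Here is a minimal Python sketch of this stratum-by-stratum loop (my own illustration, not from the slides), using the instantiated rules and the strata computed earlier; only the negated IDB subgoals are kept, since the move subgoals are already known true.

    # Hypothetical sketch of locally stratified evaluation for the Win example.
    rules = [                      # (head, negated IDB subgoals); true EDB subgoals dropped
        ("win(1)", ["win(2)"]),
        ("win(1)", ["win(3)"]),
        ("win(2)", ["win(3)"]),
    ]
    stratum = {"win(1)": 2, "win(2)": 1, "win(3)": 0}

    model = set()                  # EDB ground atoms would be included here as well
    for i in sorted(set(stratum.values())):
        changed = True
        while changed:
            changed = False
            for head, neg in rules:
                if stratum[head] != i or head in model:
                    continue
                if all(q not in model for q in neg):   # body true: all negated atoms absent
                    model.add(head)
                    changed = True

    print(sorted(model))   # ['win(1)', 'win(2)']; win(3) is false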

13 Stable Models --- Intuition
- A model M is “stable” if, when you apply the rules to M and add in the EDB, you infer exactly the IDB portion of M.
- But … when applying the rules to M, you can only use non-membership in M.

14 Example
- The Win program again:
  win(X) :- move(X,Y) & NOT win(Y)
  with moves move(1,2), move(1,3), move(2,3).
- M = EDB + {win(1), win(2)} is stable.
  - Y = 3, X = 1 yields win(1).
  - Y = 3, X = 2 yields win(2).
  - Cannot yield win(3).

15 Gelfond-Lifschitz Transform (Formal Notion of Stability)
1. Instantiate the rules in all possible ways.
2. Delete instantiated rules with any false EDB or arithmetic subgoal (including NOT subgoals).
3. Delete instantiated rules with an IDB subgoal NOT p(…), where p(…) is in M.
4. Delete a subgoal NOT p(…) if p(…) is not in M.
5. Delete true EDB and arithmetic subgoals.

16 GL Transform --- Continued
- Use the remaining instantiated rules plus the EDB to infer all possible IDB ground atoms.
- Then add in the EDB.
- The result is GL(M).
- M is stable iff GL(M) = M.
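To make the definition concrete, here is a small Python sketch of my own (not from the slides) of steps (3) and (4) plus the final inference, for rules whose EDB and arithmetic subgoals have already been handled by steps (1), (2), and (5); the rule encoding and helper names are hypothetical.

    # Hypothetical sketch of the GL transform and a stability check.
    # A rule is (head, positive_IDB_subgoals, negated_IDB_subgoals).

    def gl_transform(rules, m):
        """Steps (3) and (4): kill rules with a negated subgoal true in M,
        then drop the remaining negated subgoals (they are false in M)."""
        reduct = []
        for head, pos, neg in rules:
            if any(q in m for q in neg):
                continue                      # step (3)
            reduct.append((head, pos))        # step (4)
        return reduct

    def derive(reduct):
        """Least fixedpoint of the remaining negation-free rules."""
        facts, changed = set(), True
        while changed:
            changed = False
            for head, pos in reduct:
                if head not in facts and all(p in facts for p in pos):
                    facts.add(head)
                    changed = True
        return facts

    def is_stable(rules, m):
        """The IDB part of M is stable iff GL(M) reproduces exactly M."""
        return derive(gl_transform(rules, m)) == m

    # The Win game: win(1) :- NOT win(2); win(1) :- NOT win(3); win(2) :- NOT win(3)
    win = [("win(1)", [], ["win(2)"]), ("win(1)", [], ["win(3)"]), ("win(2)", [], ["win(3)"])]
    print(is_stable(win, {"win(1)", "win(2)"}))   # True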

17 Example
- The Win program yet again:
  win(X) :- move(X,Y) & NOT win(Y)
  with moves move(1,2), move(1,3), move(2,3).
- M = EDB + {win(1), win(2)}.
- After steps (1) and (2):
  win(1) :- move(1,2) & NOT win(2)
  win(1) :- move(1,3) & NOT win(3)
  win(2) :- move(2,3) & NOT win(3)
- Step (3) deletes the first rule: its subgoal NOT win(2) negates a true IDB atom.

18 Example --- Continued
  win(1) :- move(1,3) & NOT win(3)
  win(2) :- move(2,3) & NOT win(3)
- Step (4) drops the negated false IDB subgoals NOT win(3); step (5) drops the true EDB subgoals move(1,3) and move(2,3).
- The remaining rules have true (empty) bodies. Infer win(1), win(2).
- GL(M) = EDB + {win(1), win(2)} = M.

19 Bottom Line on the GL Transform
- You can use (positive or negative) IDB and EDB facts from M to satisfy or falsify a negated subgoal.
- You can use a positive EDB fact from M to satisfy a positive subgoal.
- But you can only use a positive IDB fact to satisfy a positive IDB subgoal if that fact has been derived in the final step.

20 To Make the Point Clear…
- If all I have is the instantiated rule p(1) :- p(1), then M = {p(1)} is not stable.
- We cannot use the membership of the positive IDB subgoal p(1) in M to make the body of the rule true.

21 The Stable Model
- If a program with a given EDB has exactly one stable model M, then M is the stable model for that program and EDB.
- Otherwise, there is no stable model for the program and EDB.

22 Propositional Datalog
- Many useful examples use propositional variables (0-ary predicates).
- All propositional variables are IDB.
- For the GL transform, just:
  1. Eliminate rules with a negated variable that is in M.
  2. Eliminate a subgoal NOT p if p is not in M.
  3. Run the deduction step.

23 Example: p :- NOT q, q :- NOT p
- Check M = {p}:
  p :- NOT q
  q :- NOT p
- Eliminate the rule q :- NOT p, whose negated subgoal is false, and eliminate the true subgoal NOT q (the NOT of an IDB atom not in M) from the first rule.
- Infer only p. Thus, {p} is stable.
- Unfortunately, so is {q}. Thus, this program has no stable model.

24 Example: p :- p & NOT q, q :- NOT p
- Check M = {p}:
  p :- p & NOT q
  q :- NOT p
- Eliminate the rule q :- NOT p, whose negated subgoal is false, and eliminate the true negated subgoal NOT q, leaving p :- p.
- Cannot infer p! GL({p}) = ∅, so {p} is not stable.

25 Example --- Continued
- Check M = {q}:
  p :- p & NOT q
  q :- NOT p
- Eliminate the rule p :- p & NOT q, whose negated subgoal NOT q is false, and eliminate the true negated subgoal NOT p from the other rule.
- Infer only q. Thus, {q} is stable.
- Also GL({p,q}) = ∅ and GL(∅) = {q}. Thus, {q} is the (unique) stable model.
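Using the hypothetical is_stable sketch from the GL-transform section above, both propositional programs can be checked directly:

    # Checking the propositional examples with the is_stable sketch defined earlier.
    prog1 = [("p", [], ["q"]), ("q", [], ["p"])]             # p :- NOT q ; q :- NOT p
    print(is_stable(prog1, {"p"}), is_stable(prog1, {"q"}))  # True True: two stable models, so no unique one

    prog2 = [("p", ["p"], ["q"]), ("q", [], ["p"])]          # p :- p & NOT q ; q :- NOT p
    print(is_stable(prog2, {"p"}))                           # False: GL({p}) leaves only p :- p
    print(is_stable(prog2, {"q"}))                           # True: {q} is the unique stable model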

26 3-Valued Models
- Needed for “well-founded semantics.”
- A model consists of:
  1. A set of true EDB facts (all other EDB facts are assumed false).
  2. A set of true IDB facts.
  3. A set of false IDB facts (the remaining IDB facts have truth value “unknown”).

27 Well-Founded Models
- Start with the instantiated rules.
- Clean the rules: eliminate rules with a known false subgoal, and drop known true subgoals.
- Two modes of inference:
  1. “Ordinary”: if the body is true, infer the head.
  2. “Unfounded sets”: assume all members of an unfounded set are false.

28 Unfounded Sets
- U is an unfounded set (of positive, ground, IDB atoms) if every remaining instantiated rule with a member of U in the head also has a member of U in the body.
- Note we could never infer any member of U to be true.
- But assuming them false is still “metalogic.”

29 Example: p :- q, q :- p
- {p, q} is an unfounded set.
- So is ∅.
- Note the property of being an unfounded set is closed under union, so there is always a unique, maximal unfounded set.

30 Constructing the Well-Founded Model
REPEAT
    “clean” the instantiated rules;
    make all ordinary inferences;
    “clean” the instantiated rules;
    find the largest unfounded set and make its atoms false;
UNTIL no changes;
make all remaining IDB atoms “unknown”;
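Here is a hedged Python sketch of my own (not from the slides) of this loop for purely IDB rules. It interleaves cleaning, ordinary inference, and the greatest-unfounded-set step slightly differently from the pseudocode above, but reaches the same 3-valued model; the rule encoding is hypothetical.

    # Hypothetical sketch of the well-founded construction.
    # A rule is (head, positive_subgoals, negated_subgoals); all atoms are IDB.

    def well_founded(rules, atoms):
        true_facts, false_facts = set(), set()
        while True:
            # "clean": drop rules with a known-false subgoal, drop known-true subgoals
            cleaned = []
            for h, pos, neg in rules:
                if h in true_facts or h in false_facts:
                    continue
                if any(p in false_facts for p in pos) or any(n in true_facts for n in neg):
                    continue
                cleaned.append((h,
                                [p for p in pos if p not in true_facts],
                                [n for n in neg if n not in false_facts]))
            # ordinary inference: a rule whose whole body is gone (all true) fires
            new_true = {h for h, pos, neg in cleaned if not pos and not neg}
            # greatest unfounded set = atoms with no support via positive subgoals
            supported, changed = set(new_true), True
            while changed:
                changed = False
                for h, pos, _ in cleaned:
                    if h not in supported and all(p in supported for p in pos):
                        supported.add(h)
                        changed = True
            undecided = atoms - true_facts - false_facts - new_true
            new_false = undecided - supported
            if not new_true and not new_false:
                break
            true_facts |= new_true
            false_facts |= new_false
        return true_facts, false_facts       # everything else has truth value "unknown"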

31 Example
win(X) :- move(X,Y) & NOT win(Y)
with these moves: move(1,2), move(2,1), move(2,3), move(3,4), move(4,5), move(5,6).

32 Instantiated, Cleaned Rules
win(1) :- NOT win(2)
win(2) :- NOT win(1)
win(2) :- NOT win(3)
win(3) :- NOT win(4)
win(4) :- NOT win(5)
win(5) :- NOT win(6)
- No ordinary inferences.
- {win(6)} is the largest unfounded set. Infer NOT win(6).

33 Second Round
win(1) :- NOT win(2)
win(2) :- NOT win(1)
win(2) :- NOT win(3)
win(3) :- NOT win(4)
win(4) :- NOT win(5)
win(5) :-
- Infer win(5).
- {win(4)} is the largest unfounded set. Infer NOT win(4).

34 Third Round
win(1) :- NOT win(2)
win(2) :- NOT win(1)
win(2) :- NOT win(3)
win(3) :-
win(5) :-
- Infer win(3).
- There is no nonempty unfounded set, so we are done.

35 Example --- Concluded
- The well-founded model is: {win(3), win(5), NOT win(4), NOT win(6)}.
- The remaining IDB ground atoms --- win(1) and win(2) --- have truth value “unknown.”
- Notice that if both sides play best, 3 and 5 are a win for the mover, 4 and 6 are a loss, and 1 and 2 are a draw.

36 Another Example
p :- q
r :- p & q
q :- p
s :- NOT p & NOT q
- First round: no ordinary inferences.
- {p, q} is an unfounded set, but {p, q, r} is the largest unfounded set.
- Second round: infer s from NOT p and NOT q.
- Model: {NOT p, NOT q, NOT r, s}.
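For illustration, the hypothetical well_founded sketch above reproduces this result (the encoding of the rules is mine):

    rules = [("p", ["q"], []), ("r", ["p", "q"], []),
             ("q", ["p"], []), ("s", [], ["p", "q"])]
    t, f = well_founded(rules, {"p", "q", "r", "s"})
    print(sorted(t), sorted(f))   # ['s'] ['p', 'q', 'r']  -> model {NOT p, NOT q, NOT r, s}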

37 Alternating Fixedpoint
1. Instantiate and “clean” the rules.
2. In “round 0,” assume all IDB ground atoms are false.
3. In each round, apply the GL transform to the EDB plus the true IDB ground atoms from the previous round.

38 Alternating Fixedpoint
- The process converges to an alternation of two sets of true IDB facts.
- Even rounds only increase, and odd rounds only decrease, the sets of true facts.
- In the limit, true facts are true in both sets; false facts are false in both sets; and “unknown” facts alternate.
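A minimal Python sketch of this process (my own illustration, reusing the hypothetical gl_transform and derive helpers from the stable-model sketch earlier):

    # Hypothetical sketch of the alternating fixedpoint for purely IDB rules.
    def alternating_fixedpoint(rules, max_rounds=100):
        history, current = [], set()                 # round 0: everything false
        for _ in range(max_rounds):
            history.append(current)
            current = derive(gl_transform(rules, current))
            if len(history) >= 2 and current == history[-2]:
                break                                # alternation (or convergence) reached
        a, b = history[-1], current                  # the two limit sets
        return a & b, a ^ b                          # (true in both, alternating = "unknown")

    win6 = [("win(1)", [], ["win(2)"]), ("win(2)", [], ["win(1)"]),
            ("win(2)", [], ["win(3)"]), ("win(3)", [], ["win(4)"]),
            ("win(4)", [], ["win(5)"]), ("win(5)", [], ["win(6)"])]
    true_facts, unknown = alternating_fixedpoint(win6)
    print(sorted(true_facts), sorted(unknown))   # ['win(3)', 'win(5)'] ['win(1)', 'win(2)']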

39 Previous Win Example
win(1) :- NOT win(2)
win(2) :- NOT win(1)
win(2) :- NOT win(3)
win(3) :- NOT win(4)
win(4) :- NOT win(5)
win(5) :- NOT win(6)

40 Computing the AFP
Round  win(1)  win(2)  win(3)  win(4)  win(5)  win(6)
0      F       F       F       F       F       F
1      T       T       T       T       T       F
2      F       F       F       F       T       F
3      T       T       T       F       T       F
4      F       F       T       F       T       F
5      T       T       T       F       T       F
From round 3 on, the process alternates between the round-3 and round-4 sets: win(3) and win(5) are true in both, win(4) and win(6) are false in both, and win(1) and win(2) are “unknown.”

41 Another Example
p :- q
r :- p & q
q :- p
s :- NOT p & NOT q
Round  p  q  r  s
0      F  F  F  F
1      F  F  F  T
2      F  F  F  T
The process converges immediately: s is true and p, q, r are false, matching the well-founded model.

42 Yet Another Example
p :- q
q :- NOT p
Round  p  q
0      F  F
1      T  T
2      F  F
3      T  T
- In round 1, notice how p is inferred only after we infer q in the same round.
- In round 2, notice that we may not use the positive IDB fact q from the previous round to infer p.
- The rounds alternate forever, so both p and q are “unknown.”

43 Containment of Semantics
- Say method A < method B if:
  1. Whenever a program has a model according to method A, it has the same model under method B.
  2. There is at least one program that has a model under B but not under A.
- Draw B above A in diagrams.

44 Comparison of Semantics
From least to most general:
Least fixedpoint < Stratified < Locally stratified, with Stable and Well-founded both above Locally stratified.

45 LFP < Stratified
- LFP only applies to Datalog without negation.
- What does Stratified do when there is no negation?
- Everything is in stratum 0, and the whole IDB is computed by the LFP algorithm.
- So the Stratified model = the LFP model.

46 Stratified < Locally Stratified
- Consider a Datalog program P with a stratified model S. We need to show:
  1. P is locally stratified.
  2. Every fact q(…) in S is in the locally stratified model L.
  3. Every fact q(…) in L is also in S.

47 P Is Locally Stratified
- Key idea: a path with n negative (-) arcs starting at q(…) in the dependency graph of ground atoms implies a path with n negative arcs starting at q in the dependency graph of predicates.
- For example, a path q(1,2) -> p(3,4) -> s(5,6) -> t(7,8) among ground atoms implies the path q -> p -> s -> t among predicates, with the same arc labels.

48 P Is Locally Stratified
- Thus, the stratum of ground atom q(…) is no greater than the stratum of predicate q.
- If the strata of all predicates in P are finite, then the strata of all ground atoms are finite.
  - I.e., if P is stratified, it is locally stratified for any EDB.

49 q(…) in S iff q(…) in L
- Proof: induction on the stratum of q.
- Basis: stratum 0.
- Then q(…) is inferred for S using naïve evaluation, with no negated IDB subgoals ever used.
- The same sequence of inferences, using instantiated rules, lets us infer q(…) as we compute L.

50 S = L, Continued
- Conversely, if q(…) is in L, then it is inferred using no negated IDB subgoals.
- Thus, the same sequence of inferences will be carried out using naïve evaluation for S.

51 Inductive Step
- Suppose q is at stratum i.
- Suppose q(…) is inferred for S at stratum i, using naïve evaluation with all IDB at strata < i treated as EDB.
- By the inductive hypothesis, L and S agree below stratum i.
- Thus, the same sequence of inferences, starting with what is known about L, infers q(…) for L; i.e., S ⊆ L.

52 Inductive Step: L ⊆ S
- Suppose q(…) is inferred for L, and q is at stratum i.
- Then the inference uses an instantiated rule, with any negated IDB subgoals having predicates at strata < i.
- By the inductive hypothesis, L and S agree on all those instantiated subgoals.

53 Inductive Step --- Concluded
- Thus, naïve evaluation at stratum i puts q(…) in S.
  - This requires another induction about predicates at stratum i.
- That is, L ⊆ S.
- Therefore L = S.

54 More Proofs
- We’re not going to prove that the locally stratified model, if it exists, is both the unique stable model and the well-founded model.

55 Comparison of Stable and Well-Founded
- If there is a 2-valued (no UNKNOWNs) well-founded model, then that is also the stable model.
- Yet computing the well-founded model by alternating fixedpoint is polynomial in the size of the database.
- But it is NP-hard to tell whether a program has a unique stable model.