Slide 1 Logic: TD as search, Datalog (variables) Jim Little UBC CS 322 – CSP October 24, 2014 Textbook §5.2 and 12.


1 Slide 1 Logic: TD as search, Datalog (variables) Jim Little UBC CS 322 – CSP October 24, 2014 Textbook §5.2 and 12

2 Slide 2 Lecture Overview: Recap Top-Down; Top-Down Proofs as search; Datalog.

3 Slide 3 Bottom-up vs. Top-down. Key idea of top-down: search backward from a query G to determine if it can be derived from KB. Bottom-up: derive the set of consequences C from KB; G is proved if G ⊆ C, so BU only looks at the query G at the end. Top-down: TD performs a backward search starting at the query G, working back toward the KB to produce an answer.

4 Slide 4 Top-down Ground Proof Procedure Key Idea: search backward from a query G to determine if it can be derived from KB.

5 Slide 5 Top-down Proof Procedure: Basic elements. Notation: an answer clause is of the form yes ← a1 ∧ a2 ∧ … ∧ am. Rule of inference (called SLD Resolution): given an answer clause of the form yes ← a1 ∧ a2 ∧ … ∧ am and the clause ai ← b1 ∧ b2 ∧ … ∧ bp, you can generate the answer clause yes ← a1 ∧ … ∧ ai-1 ∧ b1 ∧ b2 ∧ … ∧ bp ∧ ai+1 ∧ … ∧ am, i.e., resolving ai with ai ← b1 ∧ b2 ∧ … ∧ bp. Express the query as an answer clause: the query a1 ∧ a2 ∧ … ∧ am becomes yes ← a1 ∧ a2 ∧ … ∧ am.
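A minimal sketch (not from the slides) of one SLD resolution step in Python, assuming a simple representation: an answer clause yes ← a1 ∧ … ∧ am is a list of atom names, and a KB clause h ← b1 ∧ … ∧ bp is a pair (h, [b1, …, bp]), with a fact represented as (h, []).

```python
def sld_resolve(answer_clause, kb_clause, i):
    """One SLD resolution step: resolve the i-th atom of the answer clause
    with a KB clause whose head equals that atom."""
    head, body = kb_clause
    if answer_clause[i] != head:
        raise ValueError("clause head does not match the selected atom")
    # Replace a_i by b_1 ... b_p, leaving the other atoms in place.
    return answer_clause[:i] + body + answer_clause[i + 1:]

# The examples from the next slide:
# yes <- b ^ c  resolved with  b <- k ^ f  gives  yes <- k ^ f ^ c
print(sld_resolve(["b", "c"], ("b", ["k", "f"]), 0))   # ['k', 'f', 'c']
# yes <- e ^ f  resolved with the fact  e  gives  yes <- f
print(sld_resolve(["e", "f"], ("e", []), 0))           # ['f']
```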

6 Slide 6 Rule of inference: Examples. Rule of inference (SLD Resolution): given an answer clause of the form yes ← a1 ∧ a2 ∧ … ∧ am and the KB clause ai ← b1 ∧ b2 ∧ … ∧ bp, you can generate the answer clause yes ← a1 ∧ … ∧ ai-1 ∧ b1 ∧ b2 ∧ … ∧ bp ∧ ai+1 ∧ … ∧ am. Example 1: answer clause yes ← b ∧ c, KB clause b ← k ∧ f, result yes ← k ∧ f ∧ c. Example 2: answer clause yes ← e ∧ f, KB clause (fact) e, result yes ← f.

7 Slide 7 (Successful) Derivations. An answer is an answer clause with m = 0, that is, the answer clause yes ← . A (successful) derivation of query "?q1 ∧ … ∧ qk" from KB is a sequence of answer clauses γ0, γ1, …, γn such that: γ0 is the answer clause yes ← q1 ∧ … ∧ qk; γi is obtained by resolving γi-1 with a clause in KB; and γn is an answer. An unsuccessful derivation ends in an answer clause other than yes ← . (e.g., yes ← a ∧ b) whose atoms cannot be resolved against any clause in KB.
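Continuing the sketch above, a depth-first version of the whole top-down proof procedure for propositional definite clauses. This is an illustrative sketch, not the course's code; it always resolves the leftmost atom and does not terminate on KBs with cyclic rules.

```python
def derivable(query_atoms, kb):
    """Return True iff the answer clause yes <- query_atoms can be reduced,
    by repeated SLD resolution against kb, to the answer yes <- ."""
    if not query_atoms:                      # reached gamma_n = "yes <- ."
        return True
    first, rest = query_atoms[0], query_atoms[1:]
    for head, body in kb:                    # try every clause whose head matches
        if head == first:
            if derivable(body + rest, kb):   # resolve and keep searching
                return True
    return False                             # no clause with `first` as head succeeds
```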

8 Example: successful derivation. KB: a ← b ∧ c. a ← e ∧ f. b ← f ∧ k. c ← e. d ← k. e. f ← j ∧ e. f ← c. j ← c. Query: ?a. Derivation: γ0: yes ← a; γ1: yes ← e ∧ f (using a ← e ∧ f); γ2: yes ← e ∧ c (using f ← c); γ3: yes ← c (using the fact e); γ4: yes ← e (using c ← e); γ5: yes ← . (using the fact e). Done. The question was "Can we derive a?" The answer is "Yes, we can."
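Running the derivable sketch above on this slide's KB, written in the (head, body) representation assumed earlier, reproduces the result of the derivation:

```python
kb = [("a", ["b", "c"]), ("a", ["e", "f"]), ("b", ["f", "k"]),
      ("c", ["e"]), ("d", ["k"]), ("e", []),
      ("f", ["j", "e"]), ("f", ["c"]), ("j", ["c"])]

print(derivable(["a"], kb))   # True -- "Can we derive a?" Yes, we can.
```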

9 Example: failing derivation. KB: a ← b ∧ c. a ← e ∧ f. b ← f ∧ k. c ← e. d ← k. e. f ← j ∧ e. f ← c. j ← c. Query: ?a. Derivation: γ0: yes ← a; γ1: yes ← b ∧ c (using a ← b ∧ c); γ2: yes ← f ∧ k ∧ c (using b ← f ∧ k); γ3: yes ← c ∧ k ∧ c (using f ← c); γ4: yes ← e ∧ k ∧ c (using c ← e); γ5: yes ← k ∧ c (using the fact e); γ6: yes ← k ∧ e (using c ← e); γ7: yes ← k. There is no rule with k as its head, thus this derivation fails.
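The failing derivation on this slide is the branch that starts by resolving a with a ← b ∧ c. In the sketch above, that branch corresponds to the recursive call on the body of that clause, which indeed fails because every path eventually needs k and no clause has k as its head:

```python
print(derivable(["b", "c"], kb))   # False -- the branch through a <- b ^ c fails
```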

10 Slide 10 Example: derivations. KB: k ← e. a ← b ∧ c. b ← k ∧ f. c ← e. d ← k. e. f ← j ∧ e. f ← c. j ← c.

11 Sound and Complete? When you have derived an answer, you can read a bottom-up proof in the opposite direction. Every top-down derivation corresponds to a bottom-up proof, and every bottom-up proof has a corresponding top-down derivation. This equivalence can be used to prove the soundness and completeness of the top-down derivation procedure.

12 Sound: if KB ⊢TD G then KB ⊨ G. Proof sketch: if KB ⊢TD G, there is a top-down derivation for G. Therefore there is a bottom-up derivation of G. Therefore G is in C, and KB ⊨ G (because BU is sound).

13 Complete: if KB ⊨ G then KB ⊢TD G. Proof sketch: if KB ⊨ G, then (since BU is complete) we can find a bottom-up derivation for G. Therefore there is a corresponding top-down derivation as well, which we can find because the search space is finite.

14 Complete: if KB ⊨ G then KB ⊢TD G. Do the proof as an exercise.

15 SLD resolution as search. SLD resolution can be seen as a search through the space of all possible answer clauses, from the query (stated as an answer clause) to an answer (yes ← .).

16 Slide 16 Course Big Picture [diagram: the course map relating environment (deterministic vs. stochastic), problem type (static vs. sequential), representations (constraint satisfaction with variables + constraints, logics, STRIPS, belief nets, decision nets, Markov processes), and reasoning techniques (search, arc consistency, SLS, variable elimination, value iteration)].

17 Inference as Standard Search.
Constraint Satisfaction (Problems): State: assignment of values to a subset of the variables. Successor function: assign values to a "free" variable. Goal test: set of constraints. Solution: possible world that satisfies the constraints. Heuristic function: none (all solutions are at the same distance from the start).
Planning: State: full assignment of values to features. Successor function: states reachable by applying valid actions. Goal test: partial assignment of values to features. Solution: a sequence of actions. Heuristic function: relaxed problem! E.g. "ignore delete lists".
Query (top-down/SLD resolution): State: answer clause of the form yes ← a1 ∧ … ∧ ak. Successor function: the state resulting from substituting the first atom a1 with b1 ∧ … ∧ bm if there is a clause a1 ← b1 ∧ … ∧ bm. Goal test: is the answer clause empty (i.e. yes ← .)? Solution: the proof, i.e. the sequence of SLD resolutions. Heuristic function: ?????
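A sketch of the query formulation in the last group as a standard search problem, using the same propositional (head, body) KB representation as the earlier sketches. The class and function names are illustrative, not from the course.

```python
from collections import deque

class SLDSearchProblem:
    """SLD resolution cast as search: states are answer clauses (tuples of atoms)."""
    def __init__(self, kb, query_atoms):
        self.kb = kb
        self.start = tuple(query_atoms)          # start state: yes <- q1 ^ ... ^ qk

    def is_goal(self, state):
        return len(state) == 0                   # the empty answer clause: yes <- .

    def successors(self, state):
        first, rest = state[0], state[1:]
        for head, body in self.kb:               # substitute the first atom a1 with
            if head == first:                    # the body of a clause a1 <- b1 ^ ... ^ bm
                yield tuple(body) + rest

def bfs(problem):
    """Plain breadth-first search over answer clauses; returns a goal state or None.
    No heuristic is used here (see the question on the search-graph slides)."""
    frontier = deque([problem.start])
    while frontier:
        state = frontier.popleft()
        if problem.is_goal(state):
            return state
        frontier.extend(problem.successors(state))
    return None
```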

18 Search Graph. Prove: ?← a ∧ d. KB: a ← b ∧ c. a ← g. a ← h. b ← j. b ← k. d ← m. d ← p. f ← m. f ← p. g ← m. g ← f. k ← m. h ← m. p. Why are these dead ends in the search space?

19 Search Graph. Prove: ?← a ∧ d. KB: a ← b ∧ c. a ← g. a ← h. b ← j. b ← k. d ← m. d ← p. f ← m. f ← p. g ← m. g ← f. k ← m. h ← m. p. Why are these dead ends in the search space? The successor function resolves the first atom in the body of the answer clause, but j and m cannot be resolved: no clause has j or m as its head. You can trace the example in the Deduction Applet at http://aispace.org/deduction/ using the file kb-for-top-down-search available in the course schedule.
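Running the search sketch above on this slide's KB and query; states whose first atom is j or m generate no successors, which is exactly why they are dead ends.

```python
kb = [("a", ["b", "c"]), ("a", ["g"]), ("a", ["h"]),
      ("b", ["j"]), ("b", ["k"]), ("d", ["m"]), ("d", ["p"]),
      ("f", ["m"]), ("f", ["p"]), ("g", ["m"]), ("g", ["f"]),
      ("k", ["m"]), ("h", ["m"]), ("p", [])]

problem = SLDSearchProblem(kb, ["a", "d"])
print(bfs(problem))        # () -- the empty answer clause: a ^ d is derivable
print(list(SLDSearchProblem(kb, ["j"]).successors(("j",))))   # [] -- j heads no clause: dead end
```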

20 Search Graph. Prove: ?← a ∧ d. KB: a ← b ∧ c. a ← g. a ← h. b ← j. b ← k. d ← m. d ← p. f ← m. f ← p. g ← m. g ← f. k ← m. h ← m. p. Heuristics?
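One candidate answer to the heuristics question, stated as an assumption rather than the course's official answer: the number of atoms left in the answer clause. Each SLD step removes at most one atom (when resolving with a fact), so this never overestimates the number of steps remaining.

```python
def heuristic(state):
    """Admissible heuristic for the SLD search above: a clause with n atoms
    needs at least n more resolution steps to become the empty clause."""
    return len(state)
```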

21 Search Graph. KB: a ← b ∧ c. a ← g. a ← h. b ← j. b ← k. d ← m. d ← p. f ← m. f ← p. g ← m. g ← f. k ← m. h ← m. p.

22 Representation and Reasoning in complex domains. Expressing knowledge with propositions can be quite limiting. Propositional: up_s2, up_s3, ok_cb1, ok_cb2, live_w1, connected_w1_w2. E.g. there is no notion that up_s2 and up_s3 are about the same property, or that w1 is the same individual in live_w1 and in connected_w1_w2. What we need is a more natural way to consider individuals and their properties. Relational: up(s2), up(s3), ok(cb1), ok(cb2), live(w1), connected(w1,w2). Now there is a notion that w1 is the same in live(w1) and in connected(w1,w2), and that up is the same in up(s2) and up(s3).

23 Slide 23 Lecture Overview: Recap Top-Down; Top-Down Proofs as search; Datalog.

24 An extension of propositional definite clause (PDC) logic: we now have variables and relationships between variables. We can express knowledge that holds for a set of individuals, writing more powerful clauses by introducing variables, such as: live(W) ← wire(W) ∧ connected_to(W,W1) ∧ wire(W1) ∧ live(W1). We can also ask generic queries, e.g. "which wires are connected to w1?": ? connected_to(W, w1).

25 Datalog: a relational rule language. Datalog expands the syntax of PDCL. A variable is a symbol starting with an upper-case letter (examples: X, Y). A constant is a symbol starting with a lower-case letter, or a sequence of digits (examples: alan, w1). A predicate symbol is a symbol starting with a lower-case letter (examples: live, connected, part-of, in). A term is either a variable or a constant (examples: X, Y, alan, w1).

26 Datalog Syntax (cont'd). An atom is a symbol of the form p or p(t1, …, tn) where p is a predicate symbol and the ti are terms (examples: sunny, in(alan,X)). A definite clause is either an atom (a fact) or of the form h ← b1 ∧ … ∧ bm, where h and the bi are atoms (read this as "h if b"); example: in(X,Z) ← in(X,Y) ∧ part-of(Y,Z). A knowledge base is a set of definite clauses.

27 Summary of Datalog Syntax [diagram]: a Datalog expression is built from terms (constants or variables) and atoms (p or p(t1, …, tn)); a definite clause is an atom or a ← b1 ∧ … ∧ bn, where a and b1 … bn are atoms; a query has the form ?body.
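A possible Python encoding of this syntax, following the convention that a symbol starting with an upper-case letter is a variable; the class names are illustrative, not part of the course material.

```python
from dataclasses import dataclass
from typing import Tuple

def is_variable(symbol: str) -> bool:
    """Datalog convention: X, Y are variables; alan, w1 are constants."""
    return symbol[0].isupper()

@dataclass(frozen=True)
class Atom:
    predicate: str                   # e.g. "in", "part_of", "sunny"
    args: Tuple[str, ...] = ()       # terms (variables or constants); () for a propositional atom

@dataclass(frozen=True)
class Clause:
    head: Atom
    body: Tuple[Atom, ...] = ()      # an empty body means the clause is a fact

# Example from the previous slide: in(X,Z) <- in(X,Y) ^ part_of(Y,Z)
rule = Clause(Atom("in", ("X", "Z")),
              (Atom("in", ("X", "Y")), Atom("part_of", ("Y", "Z"))))
```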

28 Datalog Semantics. The role of the semantics is still to connect symbols and sentences in the language with the target domain. The main difference: we need to create a correspondence both between terms and individuals, and between predicate symbols and relations. We won't cover the formal definition of Datalog semantics; if you are interested, see Sections 12.3.1 and 12.3.2 in the textbook.

29 Lecture Overview: Recap (Lecture 8): Top-Down Proof Procedure; Datalog: syntax and semantics; SLD resolution.

30 Datalog: Top-Down Proof Procedure. This is an extension of the top-down procedure for PDCL. How do we deal with variables? Idea: find a clause with a head that matches the query, and substitute the variables in the clause with the matching constants. We will not cover the formal details of this process, called unification; see textbook Section 12.4.2, p. 511 for the details. Example KB: in(alan, r123). part_of(r123, cs_building). in(X,Y) ← part_of(Z,Y) ∧ in(X,Z). Query: yes ← in(alan, cs_building). Resolving with the clause in(X,Y) ← part_of(Z,Y) ∧ in(X,Z), with Y = cs_building and X = alan, gives yes ← part_of(Z, cs_building) ∧ in(alan, Z).
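A minimal sketch of this matching step, using the Atom/Clause encoding above. Because Datalog terms are only variables and constants, this is a very restricted form of unification; the full treatment is in Section 12.4.2 of the textbook.

```python
def walk(term, subst):
    """Follow variable bindings until reaching a constant or an unbound variable."""
    while is_variable(term) and term in subst:
        term = subst[term]
    return term

def unify(atom1, atom2, subst=None):
    """Return a substitution (dict: variable -> term) that makes the two atoms
    equal, extending `subst`, or None if they cannot be matched."""
    if subst is None:
        subst = {}
    if atom1.predicate != atom2.predicate or len(atom1.args) != len(atom2.args):
        return None
    subst = dict(subst)                          # don't mutate the caller's bindings
    for t1, t2 in zip(atom1.args, atom2.args):
        t1, t2 = walk(t1, subst), walk(t2, subst)
        if t1 == t2:
            continue
        if is_variable(t1):
            subst[t1] = t2
        elif is_variable(t2):
            subst[t2] = t1
        else:
            return None                          # two different constants never match
    return subst

# Matching the query atom in(alan, cs_building) against the head in(X, Y):
print(unify(Atom("in", ("alan", "cs_building")), Atom("in", ("X", "Y"))))
# {'X': 'alan', 'Y': 'cs_building'}
```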

31 Example proof of a Datalog query. KB: in(alan, r123). part_of(r123, cs_building). in(X,Y) ← part_of(Z,Y) ∧ in(X,Z). Query: yes ← in(alan, cs_building).
Using clause in(X,Y) ← part_of(Z,Y) ∧ in(X,Z) with Y = cs_building, X = alan: yes ← part_of(Z, cs_building) ∧ in(alan, Z).
Using clause part_of(r123, cs_building) with Z = r123: yes ← in(alan, r123).
From here there are two branches. Using the fact in(alan, r123): yes ← . (success). Alternatively, using clause in(X,Y) ← part_of(Z,Y) ∧ in(X,Z) with X = alan, Y = r123: yes ← part_of(Z, r123) ∧ in(alan, Z), which fails: no clause has a head matching part_of(Z, r123).
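Putting the pieces together, a small top-down prover sketch (illustrative, not the AIspace implementation) that reproduces this derivation: it renames clause variables apart, matches the first atom of the answer clause against clause heads, and recurses.

```python
import itertools
_fresh = itertools.count()

def rename(clause):
    """Give a clause fresh variable names so they cannot clash with the query's."""
    n = next(_fresh)
    def ren(term):
        return f"{term}_{n}" if is_variable(term) else term
    def ren_atom(atom):
        return Atom(atom.predicate, tuple(ren(t) for t in atom.args))
    return Clause(ren_atom(clause.head), tuple(ren_atom(b) for b in clause.body))

def prove(goals, kb, subst):
    """Yield one substitution per successful SLD derivation of the goal atoms."""
    if not goals:                                # empty answer clause: success
        yield subst
        return
    first, rest = goals[0], goals[1:]
    for clause in kb:
        clause = rename(clause)
        new_subst = unify(first, clause.head, subst)
        if new_subst is not None:                # head matches: resolve and recurse
            yield from prove(list(clause.body) + rest, kb, new_subst)

kb = [Clause(Atom("in", ("alan", "r123"))),
      Clause(Atom("part_of", ("r123", "cs_building"))),
      Clause(Atom("in", ("X", "Y")), (Atom("part_of", ("Z", "Y")), Atom("in", ("X", "Z"))))]

# Query: ? in(alan, cs_building)
proofs = list(prove([Atom("in", ("alan", "cs_building"))], kb, {}))
print(bool(proofs))     # True -- the failing part_of(Z, r123) branch is simply abandoned
```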

32 Tracing Datalog proofs in AIspace. You can trace the example from the last slide in the AIspace Deduction Applet at http://aispace.org/deduction/ using the file in-part-of available in the course schedule.

33 Datalog: queries with variables. KB: in(alan, r123). part_of(r123, cs_building). in(X,Y) ← part_of(Z,Y) ∧ in(X,Z). Query: in(alan, X1), expressed as the answer clause yes(X1) ← in(alan, X1). What would the answer(s) be?

34 Datalog: queries with variables. KB: in(alan, r123). part_of(r123, cs_building). in(X,Y) ← part_of(Z,Y) ∧ in(X,Z). Query: in(alan, X1), expressed as yes(X1) ← in(alan, X1). The answers are yes(r123) and yes(cs_building). Again, you can trace the SLD derivation for this query in the AIspace Deduction Applet.
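Continuing the prover sketch above: for the open query, we collect the binding of X1 in each successful derivation, which yields the two answers on this slide.

```python
# Query: ? in(alan, X1)  --  i.e. the answer clause yes(X1) <- in(alan, X1)
answers = [walk("X1", subst) for subst in prove([Atom("in", ("alan", "X1"))], kb, {})]
print(answers)   # ['r123', 'cs_building']
```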

35 Learning Goals For Logic.
PDCL syntax & semantics: verify whether a logical statement belongs to the language of propositional definite clauses; verify whether an interpretation is a model of a PDCL KB; verify when a conjunction of atoms is a logical consequence of a KB.
Bottom-up proof procedure: define/read/write/trace/debug the Bottom-Up (BU) proof procedure; prove that the BU proof procedure is sound and complete.
Top-down proof procedure: define/read/write/trace/debug the Top-Down (SLD) proof procedure; define it as a search problem; prove that the TD proof procedure is sound and complete.
Datalog: represent simple domains in Datalog; apply the Top-Down proof procedure in Datalog.

36 Logics: Big picture. We only covered rather simple logics. There are much more powerful representation and reasoning systems based on logics, e.g. full first-order logic (with negation, disjunction and function symbols), second-order logics (predicates over predicates), non-monotonic logics, modal logics, and more. There are many important applications of logic, for example software agents roaming the web on our behalf, based on a more structured representation: the Semantic Web. This is just one example of how logics are used.

37 Logics: Big Picture [diagram]: a map of logic families (propositional logics, propositional definite clause logics, first-order logics, description logics) with their semantics and proof theory, satisfiability testing (SAT), and applications such as hardware verification, product configuration, cognitive architectures, video games, production systems, tutoring systems, ontologies, the Semantic Web, information extraction and summarization.

38 Semantic Web: Extracting data. Examples of typical Web queries: How much is a typical flight to Mexico for a given date? What's the cheapest vacation package to some place in the Caribbean in a given week? Plus, the hotel should have a white sandy beach and scuba diving. If webpages are based on basic HTML, humans need to scout for the information and integrate it, and computers are not reliable enough (yet). Natural language processing (NLP) can be powerful (see Watson and Siri!), but some information may be in pictures (the beach) or implicit in the text, so existing NLP techniques still don't get all the information.

39 More structured representation: the Semantic Web. Beyond HTML pages made only for humans: languages and formalisms based on description logics allow websites to include rich, explicit information on relevant concepts, individuals and their relationships. Goal: software agents that can roam the web and carry out sophisticated tasks on our behalf, based on these richer representations. This is different from searching content for keywords and popularity: infer meaning from content based on metadata and assertions that have already been made, and automatically classify and integrate information. For further material, see the P&M text, Chapter 13, and the Introduction to the Semantic Web tutorial given at the 2011 Semantic Technology Conference: http://www.w3.org/People/Ivan/CorePresentations/SWTutorial/

40 Examples of ontologies for the Semantic Web. "Ontology": a logic-based representation of the world. eClassOwl: an e-business ontology for products and services, with 75,000 classes (types of individuals) and 5,500 properties. National Cancer Institute's ontology: 58,000 classes. Open Biomedical Ontologies Foundry: several ontologies, including the Gene Ontology to describe gene and gene product attributes in any organism or protein sequence. OpenCyc project: a 150,000-concept ontology including a top-level ontology that describes general concepts such as numbers, time, space, etc., and many specific concepts such as "OLED display" and "iPhone".

41 A different example of applications of logic: Cognitive Tutors (http://pact.cs.cmu.edu/), computer tutors for a variety of domains (math, geometry, programming, etc.). They provide individualized support for problem-solving exercises, as good human tutors do, and rely on logic-based, detailed computational models (ACT-R) of the skills and misconceptions underlying a learning domain. Carnegie Learning (http://www.carnegielearning.com/) is a company that commercializes these tutors, which have been sold to thousands of high schools in the USA.

42 ACT-R Models for Cognitive Tutors. One of ACT-R's main assumptions: cognitive skills (procedural knowledge) are represented as production rules: IF this situation is TRUE, THEN do X. An ACT-R model representing expertise in a given domain is a set of production rules mimicking how a human would reason to perform tasks in that domain. An ACT-R model for a tutor encodes all the reasoning steps necessary to solve problems in the target domain. Example: rules describing how to solve 5x+3=30.

43 ACT-R Models for Intelligent Tutoring Systems.
Eq: 5x+3=30; Goals: [Solve for x]. Rule: to solve for x when there is only one occurrence, unwrap (isolate) x.
Eq: 5x+3=30; Goals: [Unwrap x]. Rule: to unwrap ?V, find the outermost wrapper ?W of ?V and remove ?W.
Eq: 5x+3=30; Goals: [Find wrapper ?W of x; Remove ?W]. Rule: to find wrapper ?W of ?V, find the top-level expression ?E on the side of the equation containing ?V, and set ?W to the part of ?E that does not contain ?V.
Eq: 5x+3=30; Goals: [Remove "+3"]. Rule: to remove "+?E", subtract "+?E" from both sides.
Eq: 5x+3=30; Goals: [Subtract "+3" from both sides]. Rule: to subtract "+?E" from both sides, ….
Eq: 5x+3-3=30-3
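A toy sketch of the production-rule idea on this slide: IF-THEN rules tested against a working memory, with a matching rule firing and updating that memory. This is only an illustration, not ACT-R itself; the state representation and rule set are assumptions.

```python
# Working memory for the example 5x + 3 = 30, kept as coefficient, addend and right-hand side.
state = {"coeff": 5, "addend": 3, "rhs": 30, "goals": ["solve_for_x"]}

rules = [
    # IF solving for x and x is wrapped by "+ addend", THEN subtract the addend from both sides.
    (lambda s: s["goals"] == ["solve_for_x"] and s["addend"] != 0,
     lambda s: {**s, "rhs": s["rhs"] - s["addend"], "addend": 0}),
    # IF only the coefficient remains, THEN divide both sides by it and stop.
    (lambda s: s["goals"] == ["solve_for_x"] and s["addend"] == 0 and s["coeff"] != 1,
     lambda s: {**s, "rhs": s["rhs"] / s["coeff"], "coeff": 1, "goals": []}),
]

while state["goals"]:
    for condition, action in rules:
        if condition(state):          # the first rule whose IF part matches the situation fires
            state = action(state)
            break

print(state["rhs"])                   # 5.4 -- i.e. x = (30 - 3) / 5
```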

44 Cognitive Tutors. Computer tutors that use ACT-R models of target domains (e.g. algebra, geometry) in order to trace student performance, by firing rules and doing a stepwise comparison of each rule's outcome with the student's action. Mismatches signal incorrect student knowledge that requires tutoring. These models showed a good fit with student performance, indicating the value of the ACT-R theory. Cognitive Tutors are great examples of AI success, used in thousands of high schools in the USA (http://www.carnegielearning.com/success.cfm).

45 Where are we? [course map diagram] Done with deterministic environments.

46 Where are we? [course map diagram] Second part of the course.

47 Where are we? [course map diagram] We'll focus on Belief Nets.

48 Full First-Order Logics (FOLs). We have constant symbols (=> individuals, entities), predicate symbols (=> relations) and function symbols (=> functions), so interpretations are much more complex (but the same basic idea: one possible configuration of the world). Inference is semidecidable: algorithms exist that say yes for every entailed formula, but no algorithm exists that also says no for every non-entailed sentence. The resolution procedure can be generalized to FOL.

