Announcements Midterm is Wednesday March 20, 7pm-9pm. Details on midterm prep page (follow link on Piazza). Format: directly on the exam sheet (no blue books); closed book, no notes. Location: multiple rooms; go to the right one. Past exams: on prep page. Review sessions in section next week and the week after.

Reminder: Satisfiability A sentence is satisfiable if it is true in some world. Efficient SAT solvers operate on CNF representation (it is easy to convert any PL sentence into CNF). (A ∨ B) ∧ (A ∨ C) is satisfied by {A=true, B=*, C=*}. (A ∨ B) ∧ (¬A ∨ C) ∧ (¬C ∨ ¬A) ∧ (¬C ∨ ¬B) ∧ (A ∨ ¬B ∨ C) is unsatisfiable.
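
As a concrete illustration (not how DPLL-based solvers actually work), here is a minimal brute-force sketch in Python that enumerates all 2^3 models of the example clauses, representing each clause as a list of (variable, polarity) literals:

from itertools import product

def satisfiable(clauses, variables):
    # A clause is a list of (variable, polarity) literals; a model satisfies
    # the clause if at least one literal agrees with the assignment.
    for values in product([True, False], repeat=len(variables)):
        model = dict(zip(variables, values))
        if all(any(model[v] == pol for v, pol in clause) for clause in clauses):
            return model        # a satisfying model
    return None                 # unsatisfiable

# (A ∨ B) ∧ (A ∨ C): satisfiable, e.g. by A=True
print(satisfiable([[("A", True), ("B", True)],
                   [("A", True), ("C", True)]], "ABC"))
# (A ∨ B) ∧ (¬A ∨ C) ∧ (¬C ∨ ¬A) ∧ (¬C ∨ ¬B) ∧ (A ∨ ¬B ∨ C): unsatisfiable
print(satisfiable([[("A", True), ("B", True)],
                   [("A", False), ("C", True)],
                   [("C", False), ("A", False)],
                   [("C", False), ("B", False)],
                   [("A", True), ("B", False), ("C", True)]], "ABC"))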

Planning as satisfiability For the fully observable, deterministic case: planning problem is solvable iff there is some satisfying assignment solution obtained from truth values of action variables For T = 1 to infinity, set up the KB as follows and run SAT solver: Initial state, domain constraints Transition model sentences up to time T Goal is true at time T
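
The loop above can be written as a short sketch. All of the encode_* callbacks, sat_solve, and extract_plan below are illustrative placeholders for problem-specific encoders and a SAT solver, not any particular library's API:

def sat_plan(encode_initial, encode_transition, encode_goal, sat_solve, extract_plan, max_T=50):
    # Iterative deepening over the plan horizon T.
    for T in range(1, max_T + 1):
        kb = list(encode_initial())        # initial state + domain constraints
        for t in range(T):
            kb += encode_transition(t)     # successor-state axioms for step t -> t+1
        kb += encode_goal(T)               # goal must hold at time T
        model = sat_solve(kb)              # any satisfying assignment encodes a plan
        if model is not None:
            return extract_plan(model, T)  # read the plan off the action variables
    return None                            # no plan found within the horizon bound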

Pacman variables Pacman's location: At_1,1_0 (Pacman is at [1,1] at time 0), At_3,3_4, etc. Wall locations (these do not change with time): Wall_0,0, Wall_0,1, etc. Ghosts (predictable movement): BGhost_2,2_0, etc. Food: Food_2,1_0, etc. Life: Alive_0. Actions: W_0 (Pacman moves West at time 0), E_0, etc.

Initial state, domain constraints Pacman may know its initial location: At_1,1_0 Domain constraints: general knowledge ¬(W_0 ∧ E_0) ∧ ¬(W_0 ∧ S_0) ∧ … ¬(W_1 ∧ E_1) ∧ ¬(W_1 ∧ S_1) ∧ … … (W_0 ∨ E_0 ∨ N_0 ∨ S_0) ∧ … Domain constraints: map (or not) Wall_0,0 ∧ ¬Wall_1,1 ∧ … ¬At_1,2_0 ∧ ¬At_1,3_0 …
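
As a concrete sketch of the general-knowledge constraints, the following generates the "at least one action" and "at most one action" clauses for each time step, using the same (variable, polarity) clause representation as the satisfiability sketch above; the naming scheme mirrors W_0, E_0, etc.:

from itertools import combinations

ACTIONS = ["N", "S", "E", "W"]

def action_var(a, t):
    # e.g. action_var("W", 0) -> "W_0"
    return f"{a}_{t}"

def action_constraints(T):
    clauses = []
    for t in range(T):
        # At least one action per time step: (N_t ∨ S_t ∨ E_t ∨ W_t)
        clauses.append([(action_var(a, t), True) for a in ACTIONS])
        # At most one action per time step: ¬(X_t ∧ Y_t), i.e. (¬X_t ∨ ¬Y_t)
        for a, b in combinations(ACTIONS, 2):
            clauses.append([(action_var(a, t), False), (action_var(b, t), False)])
    return clauses

print(action_constraints(1))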

Transition model A state variable gets its value according to a successor-state axiom: X_t ⇔ [X_t-1 ∧ ¬(some action_t-1 made it false)] ∨ [¬X_t-1 ∧ (some action_t-1 made it true)] For food: Food_3,3_17 ⇔ [Food_3,3_16 ∧ ¬At_3,3_17] Alive at t iff alive at t-1 and no ghost on the same square as Pacman: Alive_7 ⇔ [Alive_6 ∧ ¬(At_1,1_6 ∧ (BGhost_1,1_6 ∨ CGhost_1,1_6 ∨ …)) ∧ ¬(At_1,2_6 ∧ (BGhost_1,2_6 ∨ CGhost_1,2_6 ∨ …)) ∧ ¬(At_1,3_6 ∧ (BGhost_1,3_6 ∨ CGhost_1,3_6 ∨ …)) ∧ …]
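
As a sketch, the food axioms can be generated mechanically for every square and time step; the grid bounds and the string format are illustrative, following the variable-naming pattern above:

def food_axioms(width, height, T):
    # Food_x,y_t+1 ⇔ Food_x,y_t ∧ ¬At_x,y_t+1 : food persists unless Pacman eats it
    axioms = []
    for x in range(1, width + 1):
        for y in range(1, height + 1):
            for t in range(T):
                axioms.append(
                    f"Food_{x},{y}_{t+1} <=> (Food_{x},{y}_{t} & ~At_{x},{y}_{t+1})"
                )
    return axioms

print(food_axioms(2, 2, 1))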

Transition model contd. For Pacman location: At_3,3_17 ⇔ [At_3,3_16 ∧ ((Wall_3,4 ∧ N_16) ∨ (Wall_4,3 ∧ E_16) ∨ …)] ∨ [¬At_3,3_16 ∧ ((At_3,2_16 ∧ ¬Wall_3,3 ∧ N_16) ∨ (At_2,3_16 ∧ ¬Wall_3,3 ∧ E_16) ∨ …)]

Goal Still alive and no food anywhere at time T: Alive_T ∧ ¬Food_1,1_T ∧ ¬Food_1,2_T ∧ …

Summary One possible agent architecture: knowledge + inference Logics provide a formal way to encode knowledge A logic is defined by: syntax, set of possible worlds, truth condition Logical inference computes entailment relations among sentences SAT solvers based on DPLL provide incredibly efficient inference Logical agents can construct plans by asking whether there is a future in which the goal is achieved Propositional logic is insufficiently expressive for general knowledge

CS 188: Artificial Intelligence First-Order Logic Please retain proper attribution and the reference to ai.berkeley.edu. Thanks! Instructor: Stuart Russell University of California, Berkeley

Spectrum of representations Atomic: search, game-playing. Factored: CSPs, planning, propositional logic, Bayes nets, neural nets. Structured: first-order logic, databases, probabilistic programs.

Expressive power Rules of chess: 100,000 pages in propositional logic, 1 page in first-order logic. Rules of Pacman: ∀t Alive(t) ⇔ [Alive(t-1) ∧ ¬∃g,x,y [Ghost(g) ∧ At(Pacman,x,y,t-1) ∧ At(g,x,y,t-1)]]

Possible worlds A possible world for FOL consists of: A non-empty set of objects For each k-ary predicate in the language, a set of k-tuples of objects (i.e., the set of tuples of objects that satisfy the predicate in this world) For each k-ary function in the language, a mapping from k-tuples of objects to objects For each constant symbol, a particular object (can think of constants as 0-ary functions) [Figure: the example sentence Knows(A, BFF(B)) shown against several worlds over objects 1, 2, 3] How many possible worlds?
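
One way to answer the question above, assuming just the vocabulary shown (three constant symbols, the unary function BFF, and the binary predicate Knows): even for a fixed domain of three objects there are 3^3 interpretations of the constants, 3^3 mappings for BFF, and 2^(3·3) extensions for Knows, i.e. 27 × 27 × 512 = 373,248 distinct worlds; and since the set of objects can be any non-empty set, the total number of possible worlds is infinite.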

Syntax and semantics: Terms A term refers to an object; it can be: A constant symbol, e.g., A, B, EvilKingJohn (the possible world fixes these referents) A function symbol with terms as arguments, e.g., BFF(EvilKingJohn) (the possible world specifies the value of the function, given the referents of the terms): BFF(EvilKingJohn) -> BFF(2) -> 3 A logical variable, e.g., x (more later) [Figure: example world with objects 1, 2, 3 and constants A, B, EvilKingJohn]

Syntax and semantics: Atomic sentences An atomic sentence is an elementary proposition (cf. symbols in PL): A predicate symbol with terms as arguments, e.g., Knows(A,BFF(B)) True iff the objects referred to by the terms are in the relation referred to by the predicate: Knows(A,BFF(B)) -> Knows(1,BFF(2)) -> Knows(1,3) -> F An equality between terms, e.g., BFF(BFF(BFF(B)))=B True iff the terms refer to the same object: BFF(BFF(BFF(B)))=B -> BFF(BFF(BFF(2)))=2 -> BFF(BFF(3))=2 -> BFF(1)=2 -> 2=2 -> T

Syntax and semantics: Complex sentences Sentences with logical connectives: ¬, ∧, ∨, ⇒, ⇔ Sentences with universal or existential quantifiers, e.g., ∀x Knows(x,BFF(x)) True in world w iff true in all extensions of w where x refers to an object in w: x -> 1: Knows(1,BFF(1)) -> Knows(1,2) -> T x -> 2: Knows(2,BFF(2)) -> Knows(2,3) -> T x -> 3: Knows(3,BFF(3)) -> Knows(3,1) -> F So ∀x Knows(x,BFF(x)) is false in this world.

Syntax and semantics: Complex sentences Sentences with logical connectives: ¬, ∧, ∨, ⇒, ⇔ Sentences with universal or existential quantifiers, e.g., ∃x Knows(x,BFF(x)) True in world w iff true in some extension of w where x refers to an object in w: x -> 1: Knows(1,BFF(1)) -> Knows(1,2) -> T x -> 2: Knows(2,BFF(2)) -> Knows(2,3) -> T x -> 3: Knows(3,BFF(3)) -> Knows(3,1) -> F So ∃x Knows(x,BFF(x)) is true in this world.
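
To make the truth conditions concrete, here is a minimal Python sketch that evaluates the quantified sentences in a finite possible world. The Knows relation below is an assumption chosen to be consistent with the evaluations shown above (Knows(1,2) and Knows(2,3) hold; Knows(3,1) does not):

# The example world: objects 1, 2, 3; BFF(1)=2, BFF(2)=3, BFF(3)=1.
objects = {1, 2, 3}
BFF = {1: 2, 2: 3, 3: 1}
Knows = {(1, 2), (2, 3)}  # assumed extension, consistent with the slide

def knows_own_bff(x):
    # Knows(x, BFF(x)) for a particular object x
    return (x, BFF[x]) in Knows

# ∀x Knows(x, BFF(x)): true iff it holds for every object
print(all(knows_own_bff(x) for x in objects))   # False (fails for x=3)
# ∃x Knows(x, BFF(x)): true iff it holds for some object
print(any(knows_own_bff(x) for x in objects))   # True (holds for x=1 and x=2)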

Fun with sentences Everyone knows President Obama There is someone that everyone knows Everyone knows someone
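For reference, one natural FOL reading of each (the first also appears on a later slide): Everyone knows President Obama: ∀x Knows(x, Obama). There is someone that everyone knows: ∃y ∀x Knows(x, y). Everyone knows someone: ∀x ∃y Knows(x, y).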

More fun with sentences Any two people of the same nationality speak a common language
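One possible translation, with illustrative predicate names Person, Nationality(x,n), and Speaks(x,l): ∀x ∀y [(Person(x) ∧ Person(y) ∧ ∃n (Nationality(x,n) ∧ Nationality(y,n))) ⇒ ∃l (Speaks(x,l) ∧ Speaks(y,l))]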

Inference in FOL Entailment is defined exactly as for PL: α |= β ("α entails β" or "β follows from α") iff in every world where α is true, β is also true E.g., ∀x Knows(x,Obama) entails ∃y ∀x Knows(x,y) If asked "Do you know what time it is?", it's rude to say "Yes" Similarly, given an existentially quantified query, it's polite to provide an answer in the form of a substitution (or binding) for the variable(s): KB = ∀x Knows(x,Obama) Query = ∃y ∀x Knows(x,y) Answer = Yes, {y/Obama} Applying the substitution should produce a sentence that is entailed by KB

Inference in FOL: Propositionalization Convert (KB ∧ ¬α) to PL, use a PL SAT solver to check (un)satisfiability Trick: replace variables with ground terms, convert atomic sentences to symbols ∀x Knows(x,Obama) and Democrat(Feinstein) become Knows(Obama,Obama) ∧ Knows(Feinstein,Obama) ∧ Democrat(Feinstein), i.e., K_O_O ∧ K_F_O ∧ D_F; with ∀x Knows(Mother(x),x) added, the ground instances become Knows(Obama,Obama) ∧ Knows(Mother(Obama),Obama) ∧ Knows(Mother(Mother(Obama)),Obama) ∧ … Real trick: for k = 1 to infinity, use terms of function nesting depth k If entailed, will find a contradiction for some finite k; if not, may continue forever; semidecidable
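
As a sketch of the "real trick", the following generates all ground terms up to function-nesting depth k for the constants and function in the example; at round k these are the terms substituted for the universally quantified variables:

CONSTANTS = ["Obama", "Feinstein"]
FUNCTIONS = ["Mother"]  # all unary in this example

def ground_terms(k):
    # Depth 0: just the constants; depth d: apply each function to depth d-1 terms.
    terms = list(CONSTANTS)
    frontier = list(CONSTANTS)
    for _ in range(k):
        frontier = [f"{f}({t})" for f in FUNCTIONS for t in frontier]
        terms += frontier
    return terms

print(ground_terms(2))
# ['Obama', 'Feinstein', 'Mother(Obama)', 'Mother(Feinstein)',
#  'Mother(Mother(Obama))', 'Mother(Mother(Feinstein))']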

Inference in FOL: Lifted inference Apply inference rules directly to first-order sentences, e.g., KB = Person(Socrates), ∀x Person(x) ⇒ Mortal(x); conclude Mortal(Socrates) The general rule is a version of Modus Ponens: Given α[x] ⇒ β[x] and α′, where α′ = α[x]θ for some substitution θ, conclude β[x]θ Here θ is {x/Socrates} Given Knows(x,Obama) and Knows(y,z) ⇒ Likes(y,z), θ is {y/x, z/Obama}; conclude Likes(x,Obama) Examples: Prolog (backward chaining), Datalog (forward chaining), production rule systems (forward chaining), resolution theorem provers
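
The substitution θ is computed by unification. Below is a minimal sketch (occurs-check omitted) using an illustrative term representation: variables are lowercase strings, constants are capitalized strings, and atoms or compound terms are tuples:

def is_var(t):
    # Convention: variables are lowercase strings, constants are capitalized.
    return isinstance(t, str) and t[:1].islower()

def unify(x, y, theta):
    # Return a substitution (dict) that makes x and y identical, or None on failure.
    if theta is None:
        return None
    if x == y:
        return theta
    if is_var(x):
        return unify_var(x, y, theta)
    if is_var(y):
        return unify_var(y, x, theta)
    if isinstance(x, tuple) and isinstance(y, tuple) and len(x) == len(y):
        for xi, yi in zip(x, y):       # unify arguments left to right
            theta = unify(xi, yi, theta)
            if theta is None:
                return None
        return theta
    return None

def unify_var(var, x, theta):
    if var in theta:
        return unify(theta[var], x, theta)
    if is_var(x) and x in theta:
        return unify(var, theta[x], theta)
    return {**theta, var: x}

# Unify Knows(y,z) with Knows(x,Obama): θ = {y/x, z/Obama}
print(unify(("Knows", "y", "z"), ("Knows", "x", "Obama"), {}))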

Summary, pointers FOL is a very expressive formal language Many domains of common-sense and technical knowledge can be written in FOL (see AIMA Ch. 12) circuits, software, planning, law, network and security protocols, product descriptions, ecommerce transactions, geographical information systems, Google Knowledge Graph, Semantic Web, etc. Inference is semidecidable in general; many problems are efficiently solvable in practice Inference technology for logic programming is especially efficient (see AIMA Ch. 9)