Natural Language Processing

Natural Language Processing Lecture Notes 10 Chapter 14

Big Transition First we did words (morphology) Then we looked at syntax Now we’re moving on to meaning.

Semantics What things mean Approach: meaning representations which bridge the gap from linguistic forms to knowledge of the world Serve the practical purposes of a program doing semantic processing

Semantic Processing Representations that allow a system to Answer questions Determine truth Perform inference …

Types of Meaning Representations First-order predicate calculus Semantic networks Conceptual dependency Frame-based representations See lecture for examples…

Verifiability Does Spice Island serve vegetarian food? Serves(spiceisland,vegetarianfood) Verifiability: the system’s ability to compare representations to facts in memory

Current Focus in Class Conventional meanings of words Ignore context Literal meaning

Ambiguity I want to eat someplace that’s close to Pitt Mary kissed her husband and Joan did too I baked the cake on the table Old men and women go to the park Every student ate a sandwich

Canonical Form Does Spice Island have vegetarian dishes? Do they have vegetarian food at Spice Island? Are vegetarian dishes served at Spice Island? Does Spice Island serve vegetarian fare? Canonical form: inputs that mean the same thing should have the same meaning representations

Canonical Form Simplifies reasoning Makes representations more compact (fewer different representations) BUT: makes semantic analysis harder Need to figure out that “have” and “serve” mean the same thing in the previous examples; same for the various phrases for vegetarian food BUT: can perform word sense disambiguation; use a single representation for all senses in a synset

Representational Schemes We’re going to make use of First Order Predicate Calculus (FOPC) as our representational framework Not because we think it’s perfect All the alternatives turn out to be either too limiting or too complicated, or They turn out to be notational variants

Knowledge Based Agents Central component: knowledge base, or KB. A set of sentences in a knowledge representation language Generic Functions TELL (add a fact to the knowledge base) ASK (get next action based on info in KB) Both often involve inference, which is? Deriving new sentences from old

Fundamental Concepts of Logical Representation and Reasoning Information is represented in sentences, which must have correct syntax: ( 1 + 2 ) * 7 = 21 vs. 2 ) + 7 = * ( 1 21 The semantics of a sentence defines its truth with respect to each possible world. "W is a model of S" means that sentence S is true in world W. What does X |= Y mean? X entails Y; Y logically follows from X.

Entailment A |= B In all worlds in which A is true, B must be true as well: all models of A are models of B. A entails B; B logically follows from A.
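Since "all models of A are models of B" quantifies over a finite set of truth assignments, entailment between propositional sentences can be checked by brute-force model enumeration. A minimal sketch (the `entails` helper and the sentences A and B are invented for illustration; sentences are represented as Python functions from a model to a truth value):

```python
from itertools import product

def entails(premise, conclusion, symbols):
    """A |= B iff every model of A is also a model of B."""
    for values in product([True, False], repeat=len(symbols)):
        model = dict(zip(symbols, values))
        if premise(model) and not conclusion(model):
            return False          # found a model of A that falsifies B
    return True

# A = (P and Q) entails B = P, but not the converse.
A = lambda m: m["P"] and m["Q"]
B = lambda m: m["P"]
print(entails(A, B, ["P", "Q"]))  # True
print(entails(B, A, ["P", "Q"]))  # False
```

Enumerating all 2^n models is exponential, which is why the slides later turn to inference rules and resolution instead.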

Inference KB |-i A Inference algorithm i can derive A from KB; i derives A from KB; A can be inferred from KB by i.

Propositional Logic Syntax Sentence → AtomicSent | ComplexSent AtomicSent → true | false | P, Q, R … ComplexSent → ( Sentence ^ Sentence ) | ( Sentence v Sentence ) | ( Sentence → Sentence ) | ( Sentence ↔ Sentence ) | ~( Sentence ) [no predicate or function symbols]

Propositional Logic Sentences If there is a pit at [1,1], there is a breeze at [1,0]: P11 → B10 There is a breeze at [2,2] if and only if there is a pit in the neighborhood: B22 ↔ ( P21 v P23 v P12 v P32 ) There is no breeze at [2,2]: ~B22

Semantics of Prop Logic In model-theoretic semantics, an interpretation assigns elements of the world to sentences, and defines the truth values of sentences Propositional logic: easy! Assign T or F to each proposition symbol; then assign truth values to complex sentences in the obvious way

Propositional Logic A ^ B is true if both A and B are true A v B is true if one or both of A and B are true P → Q is equivalent to ~P v Q. Thus, P → Q is false if P is true and Q is false; otherwise, P → Q is true. ~A is true if A is false
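These connective semantics translate directly into a recursive evaluator. A sketch (the tuple encoding of sentences and the `holds` function are invented for illustration):

```python
def holds(sentence, model):
    """Evaluate a propositional sentence in a model (dict: symbol -> bool)."""
    if isinstance(sentence, str):          # atomic proposition
        return model[sentence]
    op, *args = sentence
    if op == "not":
        return not holds(args[0], model)
    if op == "and":
        return holds(args[0], model) and holds(args[1], model)
    if op == "or":
        return holds(args[0], model) or holds(args[1], model)
    if op == "implies":                    # P -> Q  is  ~P v Q
        return (not holds(args[0], model)) or holds(args[1], model)
    if op == "iff":
        return holds(args[0], model) == holds(args[1], model)
    raise ValueError(f"unknown connective: {op}")

print(holds(("implies", "P", "Q"), {"P": True, "Q": False}))  # False
print(holds(("iff", "P", "Q"), {"P": False, "Q": False}))     # True
```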

Proofs A derivation: a sequence of applications of (usually sound) rules of inference Reasoning by Search Example KB = A→B, B→C, D→E, E→F, D Forward chaining: Add A, infer B, infer C Backward chaining: F? E? D? Yes… Sound but not complete inference procedures
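Forward chaining on the example KB above (A→B, B→C, D→E, E→F, with fact D) can be sketched as repeated rule firing until no new facts appear. The rule encoding as (body, head) pairs is an illustrative choice, not from the slides:

```python
# KB: A->B, B->C, D->E, E->F, plus the fact D.
rules = [({"A"}, "B"), ({"B"}, "C"), ({"D"}, "E"), ({"E"}, "F")]

def forward_chain(facts, rules):
    """Fire every rule whose body is satisfied until a fixed point."""
    facts = set(facts)
    changed = True
    while changed:
        changed = False
        for body, head in rules:
            if body <= facts and head not in facts:
                facts.add(head)
                changed = True
    return facts

print(sorted(forward_chain({"D"}, rules)))         # ['D', 'E', 'F']
print(sorted(forward_chain({"A", "D"}, rules)))    # adding A also yields B, C
```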

For your information only; resolution won’t be on exam Resolution allows a sound and complete inference mechanism (search-based) using only one rule of inference Resolution rule: Given: P1 v P2 v P3 … v Pn, and ~P1 v Q1 v … v Qm Conclude: P2 v P3 … v Pn v Q1 v … v Qm Complementary literals P1 and ~P1 “cancel out”
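The resolution rule operates on clauses, which can be represented as sets of literal strings, with `~` marking negation. A minimal sketch (the `resolve` helper is invented for illustration):

```python
def resolve(c1, c2):
    """Return all resolvents of two clauses (sets of literals like 'P' / '~P')."""
    resolvents = []
    for lit in c1:
        # complement of 'P' is '~P' and vice versa
        comp = lit[1:] if lit.startswith("~") else "~" + lit
        if comp in c2:
            # complementary literals cancel out; the rest are disjoined
            resolvents.append((c1 - {lit}) | (c2 - {comp}))
    return resolvents

# (P1 v P2 v P3) and (~P1 v Q1) resolve on P1 / ~P1:
print(sorted(resolve({"P1", "P2", "P3"}, {"~P1", "Q1"})[0]))  # ['P2', 'P3', 'Q1']
```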

Again, for your information only; will not be on the exam Resolution Any complete search algorithm, applying only the resolution rule, can derive any conclusion entailed by any KB in propositional logic. Refutation completeness: given A, we cannot use resolution to generate the consequence A v B, but we can answer the question of whether A v B is true. I.e., resolution can be used to confirm or refute a sentence

Unsound (but useful) Inference Where there is smoke, there is fire Example KB: Fire → Smoke, Smoke Abduction: conclude Fire Unsound: Example KB1: Fire → Smoke, DryIce → Smoke, Smoke DryIce rather than Fire could be true

Propositional Logic → FOPC B11 ↔ (P12 v P21) B23 ↔ (P32 v P23 v P34 v P43) … “Internal squares adjacent to pits are breezy”: All X,Y ((B(X,Y) ^ (X > 1) ^ (Y > 1) ^ (Y < 4) ^ (X < 4)) → (P(X-1,Y) v P(X,Y-1) v P(X+1,Y) v P(X,Y+1)))

FOPC Worlds: Ontological Commitment Rather than just T, F, worlds now contain: Objects: the gold, the wumpus, people, ideas, … (“the domain”) Predicates: holding, breezy, red, sisters Functions: fatherOf, colorOf, plus

FOPC Syntax Add variables and quantifiers to propositional logic

Sentence  AtomicSentence | (Sentence Connective Sentence) | Quantifier Variable, … Sentence | ~Sentence AtomicSentence  Predicate(Term,…) | Term = Term Term  Function(Term,…) | Constant | Variable Connective   | ^ | v |  Quantifier  all, exists Constant  john, 1, … Variable  A, B, C, X Predicate  breezy, sunny, red Function  fatherOf, plus Knowledge engineering involves deciding what types of things Should be constants, predicates, and functions for your problem

Examples Everyone likes chocolate: All X (person(X) → likes(X, chocolate)) Someone likes chocolate: Exists X (person(X) ^ likes(X, chocolate)) Everyone likes chocolate unless they are allergic to it: All X ((person(X) ^ ~allergic(X, chocolate)) → likes(X, chocolate))

Quantifiers All X p(X) means that p holds for all elements in the domain Exists X p(X) means that p holds for at least one element of the domain

Nesting of Variables Everyone likes some kind of food Put quantifiers in front of likes(P,F) Assume the domain of discourse of P is the set of people Assume the domain of discourse of F is the set of foods Readings to represent: Everyone likes some kind of food There is a kind of food that everyone likes Someone likes all kinds of food Every food has someone who likes it

Answers (DOD of P is people and of F is food) Everyone likes some kind of food All P Exists F likes(P,F) There is a kind of food that everyone likes Exists F All P likes(P,F) Someone likes all kinds of food Exists P All F likes(P,F) Every food has someone who likes it All F Exists P likes(P,F)
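On a small finite domain, the effect of quantifier order can be checked mechanically: `all` plays the role of All and `any` the role of Exists, nested in the same order as the quantifier prefix. The people, foods, and likes relation below are invented for illustration:

```python
people = {"ann", "bob"}
foods = {"rice", "soup"}
likes = {("ann", "rice"), ("bob", "soup")}   # who likes what

# All P Exists F likes(P,F): everyone likes some kind of food
print(all(any((p, f) in likes for f in foods) for p in people))  # True
# Exists F All P likes(P,F): one food that everyone likes
print(any(all((p, f) in likes for p in people) for f in foods))  # False
# Exists P All F likes(P,F): someone likes all kinds of food
print(any(all((p, f) in likes for f in foods) for p in people))  # False
# All F Exists P likes(P,F): every food has someone who likes it
print(all(any((p, f) in likes for p in people) for f in foods))  # True
```

Note that swapping the quantifiers flips the truth value here, which is exactly the ambiguity the slide is illustrating.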

Answers, without Domain of Discourse Assumptions Everyone likes some kind of food: All P (person(P) → Exists F (food(F) and likes(P,F))) There is a kind of food that everyone likes: Exists F (food(F) and (All P (person(P) → likes(P,F)))) Someone likes all kinds of food: Exists P (person(P) and (All F (food(F) → likes(P,F)))) Every food has someone who likes it: All F (food(F) → Exists P (person(P) and likes(P,F)))

Interpretation Specifies which objects, functions, and predicates are referred to by which constant symbols, function symbols, and predicate symbols.

Example 3 people: John, Sally, Bill John is tall Sally and Bill are short John is Bill’s father Sally is Bill’s sister Interpretation 1 (others are possible): “John”, “Sally”, and “Bill” as you think “person” → {John, Sally, Bill} “short” → {Sally, Bill} “tall” → {John} “sister” → {<Sally,Bill>} (a 2-ary predicate) “father” → {<Bill,John>} (a 1-ary function)

Determining Truth Values of FOPC sentences Connectives and negation are the same as in propositional logic

Example tall(father(bill)) ^ ~sister(sally,bill) Assign meanings to terms: “bill” → Bill; “sally” → Sally; “father(bill)” → John Assign truth values to atomic sentences: tall(father(bill)) is T because John is in the set assigned to “tall” ~sister(sally,bill) is F because <Sally,Bill> is in the set assigned to “sister” So, the sentence is false, because T ^ F is F

Determining Truth Values Exist X tall(X) : true, because the set assigned to “tall” isn’t {} All X short(X) : false, because there are objects that are not in the set assigned to “short”
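The interpretation from the John/Sally/Bill example can be written down as plain Python sets, after which the truth-value computations above become direct membership tests. A sketch (the variable names are illustrative encodings of the interpretation, not from the slides):

```python
# The interpretation: predicates as sets, the 1-ary function as a dict.
domain = {"John", "Sally", "Bill"}
tall = {"John"}
short = {"Sally", "Bill"}
sister = {("Sally", "Bill")}
father = {"Bill": "John"}          # father(bill) = John

# tall(father(bill)) ^ ~sister(sally,bill): T ^ F is F
print(father["Bill"] in tall and ("Sally", "Bill") not in sister)  # False

# Exists X tall(X): true, the set assigned to "tall" isn't empty
print(any(x in tall for x in domain))    # True
# All X short(X): false, some objects are not in the set for "short"
print(all(x in short for x in domain))   # False
```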

Representational Schemes What are the objects, predicates, and functions?

Choices: Functions vs Predicates Rep-Scheme 1: tall(fatherOf(bob)). Rep-Scheme 2: Exists X (fatherOf(bob,X) ^ tall(X) ^ (All Y (fatherOf(bob,Y)  X = Y))) “fatherOf” in both cases is assigned a set of 2-tuples: {<b,bf>,<t,tf>,…} But {<b,bf>,<t,tf>,<b,bff>,…} is possible if it is a predicate

Choices: Predicates versus Constants Rep-Scheme 1: Let’s consider the world: D = {a,b,c,d,e}. red: {a,b,c}. pink: {d,e}. Some sentences that are satisfied by the intended interpretation: red(a). red(b). pink(d). ~(All X red(X)). All X (red(X) v pink(X)). But what if we want to say that pink is pretty?

Choices: Predicates versus Constants Rep-Scheme 2: The world: D = {a,b,c,d,e,red,pink} colorof: {<a,red>,<b,red>,<c,red>,<d,pink>,<e,pink>} pretty: {pink} primary: {red} Some sentences that are satisfied by the intended interpretation: colorOf(a,red). colorOf(b,red). colorOf(d,pink). ~(All X colorOf(X,red)). All X (colorOf(X,red) v colorOf(X,pink)). ***pretty(pink). primary(red).*** We have reified predicates pink and red: made them into objects

Inference with Quantifiers Universal Instantiation: Given All X (person(X) → likes(X, sun)) Infer person(john) → likes(john, sun) Existential Instantiation: Given Exists X likes(X, chocolate) Infer: likes(S1, chocolate) S1 is a “Skolem constant” that is not found anywhere else in the KB and refers to (one of) the individuals who likes chocolate.
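At its simplest, universal instantiation is substitution of a ground term for the quantified variable throughout the sentence. A toy sketch on string representations (the `instantiate` helper and rule string are invented for illustration; a real system would substitute over parsed terms, not raw text):

```python
def instantiate(sentence, var, constant):
    """Replace every occurrence of a variable with a constant (toy version)."""
    return sentence.replace(var, constant)

rule = "person(X) -> likes(X, sun)"
print(instantiate(rule, "X", "john"))  # person(john) -> likes(john, sun)
```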

FOPC Allows for… The analysis of truth conditions Allows us to answer yes/no questions Supports the use of variables Allows us to answer questions through the use of variable binding Supports inference Allows us to answer questions that go beyond what we know explicitly

FOPC This choice isn’t completely arbitrary or driven by the needs of practical applications FOPC reflects the semantics of natural languages because it was designed that way by human beings In particular…

Meaning Structure of Language The semantics of human languages… Display a basic predicate-argument structure Make use of variables Make use of quantifiers Use a partially compositional semantics

Predicate-Argument Structure Events, actions and relationships can be captured with representations that consist of predicates and arguments to those predicates. Languages display a division of labor where some words and constituents function as predicates and some as arguments.

Example Mary gave a list to John Giving(Mary, John, List) More precisely Gave conveys a three-argument predicate The first arg refers to the subject (the giver) The second is the recipient, which is conveyed by the NP in the PP The third argument refers to the thing given, conveyed by the direct object

Is this a good representation? John gave Mary a book for Susan Giving (john,mary,book,susan) John gave Mary a book for Susan on Wednesday Giving (john,mary,book,susan,wednesday) John gave Mary a book for Susan on Wednesday in class Giving (john,mary,book,susan,wednesday,inClass) John gave Mary a book for Susan on Wednesday in class after 2pm Giving (john,mary,book,susan,wednesday,inClass,>2pm)

Reified Representation Exist b,e (ISA(e,giving) ^ agent(e,john) ^ beneficiary(e,sally) ^ patient(e,b) ^ ISA(b,book)) “That happened on Sunday” Add later (assuming S2 is the skolem for e): happenedOnDay(s2,sunday)
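One way to see the payoff of reification is to encode the event as a first-class object carrying role assertions, so that new facts about it can be added later without changing the predicate's arity. A sketch using dicts (the `events`/`objects` encoding is an illustrative assumption, not from the slides):

```python
# The reified giving event: ISA and role facts hang off an event identifier.
events = {
    "e1": {"isa": "giving", "agent": "john",
           "beneficiary": "sally", "patient": "b1"},
}
objects = {"b1": {"isa": "book"}}

# "That happened on Sunday" -- just assert one more fact about e1,
# rather than widening a fixed-arity Giving(...) predicate:
events["e1"]["happenedOnDay"] = "sunday"
print(events["e1"]["isa"], events["e1"]["happenedOnDay"])  # giving sunday
```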

Representing Time in Language I ran to Oakland I am running to Oakland I will run to Oakland Now, all represented the same: Exist w (ISA(w,running) ^ agent(w,speaker) ^ dest(w,oakland))

Representing Time Events are associated with points or intervals in time. Exist w,i (ISA(w,running) ^ agent(w,speaker) ^ dest(w,oakland) ^ interval(w,i) ^ precedes(i,now)) … member(i,now) … precedes(now,i)
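The interval predicates can be sketched concretely by modeling intervals as (start, end) pairs on a numeric timeline; the three tense variants then differ only in which relation holds between the event interval and now. All names and numbers here are invented for illustration:

```python
def precedes(i, j):
    """Interval i ends strictly before interval j begins."""
    return i[1] < j[0]

def member(point, i):
    """A time point falls inside interval i."""
    return i[0] <= point <= i[1]

now = 10
run = (3, 7)                        # the running event's interval

print(precedes(run, (now, now)))    # past tense ("I ran"): True
print(member(now, run))             # present progressive ("I am running"): False
print(precedes((now, now), run))    # future tense ("I will run"): False
```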

Determining Temporal Relations is complex (largely unsolved) Ok, so for Christmas, we fly to Dallas then to El Paso (refers to the future, but the tense is present) Let’s see, flight 1390 will be at the gate an hour now (refers to an interval starting in the past using the future tense) I take the bus in the morning but the incline in the evening (habitual – not a specific morning or evening)

Tense versus Aspect Flight 2020 arrived Flight 2020 had arrived What’s the difference? What do you expect in the second example?

Reference Point Reichenbach (1947) introduced notion of Reference point (R), separated out from Speech time (S) and Event time (E) Example: When Mary's flight departed, I ate lunch When Mary's flight departed, I had eaten lunch Departure event specifies reference point.

Reichenbach Applied to Tenses [diagram of tense configurations omitted] “I will eat” in English is ambiguous between two configurations: The “posterior present”: S = R < E (French: Je vais dormir) The “simple future”: S < R = E (French: Je dormirai)