An Extended Model of Natural Logic
Bill MacCartney and Christopher D. Manning
NLP Group, Stanford University
8 January 2009

2. Natural language inference (NLI)
Also known as recognizing textual entailment (RTE): does premise P justify an inference to hypothesis H?
- An informal, intuitive notion of inference: not strict logic
- Emphasis on variability of linguistic expression
- Necessary to the goal of natural language understanding (NLU)
- Can also enable semantic search, question answering, …
P: Every firm polled saw costs grow more than expected, even after adjusting for inflation.
H: Every big company in the poll reported cost increases.
Answer: yes

3. NLI: a spectrum of approaches
From robust but shallow to deep but brittle:
- Lexical/semantic overlap (Jijkoun & de Rijke 2005)
- Patterned relation extraction (Romano et al.)
- Semantic graph matching (MacCartney et al.; Hickl et al.)
- FOL & theorem proving (Bos & Markert 2006)
Problem with the shallow approaches: imprecise; easily confounded by negation, quantifiers, conditionals, factive & implicative verbs, etc.
Problem with FOL: hard to translate NL to FOL: idioms, anaphora, ellipsis, intensionality, tense, aspect, vagueness, modals, indexicals, reciprocals, propositional attitudes, scope ambiguities, anaphoric adjectives, non-intersective adjectives, temporal & causal relations, unselective quantifiers, adverbs of quantification, donkey sentences, generic determiners, comparatives, phrasal verbs, …
Solution? Natural logic (this work).

4. What is natural logic? (≠ natural deduction)
Characterizes valid patterns of inference via surface forms: precise, yet sidesteps the difficulties of translating to FOL.
A long history:
- Traditional logic: Aristotle's syllogisms, scholastics, Leibniz, …
- Modern natural logic begins with Lakoff (1970)
- van Benthem & Sánchez Valencia: the monotonicity calculus
- Nairn et al. (2006): an account of implicatives & factives
We introduce a new theory of natural logic that:
- extends the monotonicity calculus to account for negation & exclusion
- incorporates elements of Nairn et al.'s model of implicatives

5. 16 elementary set relations
Assign a pair of sets ⟨x, y⟩ to one of 16 relations, depending on the emptiness or non-emptiness of each of the four partitions: x ∩ y, x ∩ ¬y, ¬x ∩ y, and ¬x ∩ ¬y.

6. 16 elementary set relations
The 16 relations include x ≡ y, x ⊏ y, x ⊐ y, x ^ y, x | y, x ⌣ y, and x # y.
But 9 of the 16 are degenerate: either x or y is either empty or universal. That is, they correspond to semantically vacuous expressions, which are rare outside logic textbooks. We therefore focus on the remaining seven relations.

7. The set of 7 basic semantic relations
symbol   name                                     example
x ≡ y    equivalence                              couch ≡ sofa
x ⊏ y    forward entailment (strict)              crow ⊏ bird
x ⊐ y    reverse entailment (strict)              European ⊐ French
x ^ y    negation (exhaustive exclusion)          human ^ nonhuman
x | y    alternation (non-exhaustive exclusion)   cat | dog
x ⌣ y    cover (exhaustive non-exclusion)         animal ⌣ nonhuman
x # y    independence                             hungry # hippo
Relations are defined for all semantic types: tiny ⊏ small, hover ⊏ fly, kick ⊏ strike, this morning ⊏ today, in Beijing ⊏ in China, everyone ⊏ someone, all ⊏ most ⊏ some.
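[Editor's illustration, not part of the original deck: a minimal Python sketch that classifies a pair of finite sets, relative to a finite universe, into one of the seven basic relations by testing the four partitions for emptiness, following slides 5-7. The code is a toy assumption, not the authors' implementation.]

```python
# Toy sketch (not the NatLog code): classify two finite sets into one of the
# seven basic semantic relations by testing the four partitions for emptiness.

EQ, FWD, REV, NEG, ALT, COV, IND = "≡", "⊏", "⊐", "^", "|", "⌣", "#"

def basic_relation(x, y, universe):
    """Return the basic relation between sets x and y, both subsets of `universe`."""
    x, y, u = set(x), set(y), set(universe)
    if not x or not y or x == u or y == u:
        raise ValueError("degenerate (empty or universal) set")
    both    = bool(x & y)        # x ∩ y
    only_x  = bool(x - y)        # x ∩ ¬y
    only_y  = bool(y - x)        # ¬x ∩ y
    neither = bool(u - (x | y))  # ¬x ∩ ¬y
    if not only_x and not only_y:
        return EQ                # x = y
    if not only_x:
        return FWD               # x ⊂ y
    if not only_y:
        return REV               # x ⊃ y
    if not both and not neither:
        return NEG               # disjoint and exhaustive
    if not both:
        return ALT               # disjoint, not exhaustive
    if not neither:
        return COV               # exhaustive, not disjoint
    return IND                   # all four partitions non-empty

# Example: in a tiny universe of creatures, {crow} ⊏ {crow, robin}.
print(basic_relation({"crow"}, {"crow", "robin"}, {"crow", "robin", "bat"}))  # ⊏
```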

8. Joining semantic relations
If x R y and y S z, what relation holds between x and z? Write R ⋈ S for this join.
Example: fish | human and human ^ nonhuman, so fish ⊏ nonhuman (that is, | ⋈ ^ = ⊏).
Some joins are straightforward: ≡ ⋈ ≡ = ≡, ⊏ ⋈ ⊏ = ⊏, ⊐ ⋈ ⊐ = ⊐, ^ ⋈ ^ = ≡, and R ⋈ ≡ = ≡ ⋈ R = R.

9. Some joins yield unions of relations!
x | y            y | z            x ? z
couch | table    table | sofa     couch ≡ sofa
pistol | knife   knife | gun      pistol ⊏ gun
dog | cat        cat | terrier    dog ⊐ terrier
rose | orchid    orchid | daisy   rose | daisy
woman | frog     frog | Eskimo    woman # Eskimo
What is | ⋈ | ?  It is the union {≡, ⊏, ⊐, |, #}.

10. The complete join table
Of the 49 join pairs, 32 yield basic relations; 17 yield unions of relations.
Larger unions convey less information, which limits the power of inference.
In practice, any union which contains # can be approximated by #.
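[Editor's sketch, not the NatLog implementation: the join operation ⋈ with a deliberately partial table. Unlisted pairs fall back to #, and any union containing # is approximated by #, as the slide suggests; the listed entries are the ones worked out on the previous slides.]

```python
# Partial sketch of the join operation ⋈ over the seven basic relations.
EQ, FWD, REV, NEG, ALT, COV, IND = "≡", "⊏", "⊐", "^", "|", "⌣", "#"

# (R, S) -> R ⋈ S, for a handful of pairs; a full system would list all 49.
JOIN = {
    (FWD, FWD): FWD,                                  # crow ⊏ bird ⊏ animal
    (REV, REV): REV,
    (NEG, NEG): EQ,                                   # double negation
    (ALT, NEG): FWD,                                  # fish | human ^ nonhuman
    (FWD, NEG): ALT,
    (ALT, ALT): frozenset({EQ, FWD, REV, ALT, IND}),  # the union from slide 9
}

def join(r, s):
    """R ⋈ S, with ≡ as identity; unknown pairs and unions containing # collapse to #."""
    if r == EQ:
        return s
    if s == EQ:
        return r
    out = JOIN.get((r, s), IND)
    if isinstance(out, frozenset) and IND in out:
        out = IND
    return out

print(join(ALT, NEG))   # ⊏  (fish | human, human ^ nonhuman  =>  fish ⊏ nonhuman)
print(join(ALT, ALT))   # #  (the slide-9 union, approximated by #)
```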

11. Lexical semantic relations
An atomic edit e (DEL, INS, or SUB) maps a compound expression x to e(x). The semantic relation between x and e(x) will depend on:
1. the lexical semantic relation generated by e, written β(e)
2. other properties of the context x in which e is applied
Example: suppose x is "red car".
- If e is SUB(car, convertible), then β(e) is ⊐.
- If e is DEL(red), then β(e) is ⊏.
Crucially, β(e) depends solely on the lexical items in e, independent of the context x.
But how are lexical semantic relations determined?

12. Lexical semantic relations: SUBs
β(SUB(x, y)) = the relation between x and y.
For open-class terms, use a lexical resource (e.g. WordNet):
- ≡ for synonyms: sofa ≡ couch, forbid ≡ prohibit
- ⊏ for hypo-/hypernyms: crow ⊏ bird, frigid ⊏ cold, soar ⊏ rise
- | for antonyms and coordinate terms: hot | cold, cat | dog
- ≡ or | for proper nouns: USA ≡ United States, JFK | FDR
- # for most other pairs: hungry # hippo
Closed-class terms may require special handling:
- Quantifiers: all ⊏ some, some ^ no, no | all, at least 4 ⌣ at most 6
- See the paper for discussion of pronouns, prepositions, …

13. Lexical semantic relations: DELs & INSs
Generic (default) case: β(DEL(·)) = ⊏, β(INS(·)) = ⊐.
- Examples: red car ⊏ car, sing ⊐ sing off-key
- Even quite long phrases: car parked outside since last week ⊏ car
- Applies to intersective modifiers, conjuncts, independent clauses, …
- This heuristic underlies most approaches to RTE! (Does P subsume H? Deletions OK; insertions penalized.)
Special cases:
- Negation: didn't sleep ^ did sleep
- Implicatives & factives (e.g. refuse to, admit that): discussed later
- Non-intersective adjectives: former spy | spy, alleged spy # spy
- Auxiliaries etc.: is sleeping ≡ sleeps, did sleep ≡ slept
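[Editor's sketch of how β might be assigned to atomic edits along the lines of slides 12-13. The wordnet_relation helper and its toy table are hypothetical stand-ins for a real lexical-resource lookup; only the defaults and one special case from the slides are shown.]

```python
# Hedged sketch of lexical relation assignment for atomic edits (not the authors' code).
EQ, FWD, REV, NEG, ALT, IND = "≡", "⊏", "⊐", "^", "|", "#"

def wordnet_relation(old, new):
    """Hypothetical lexical-resource lookup: synonyms ≡, hyponym ⊏, hypernym ⊐, antonyms |, else #."""
    toy = {("sofa", "couch"): EQ, ("crow", "bird"): FWD,
           ("bird", "crow"): REV, ("cat", "dog"): ALT,
           ("Prozac", "medication"): FWD}
    return toy.get((old, new), IND)

def lexical_relation(edit_type, deleted=None, old=None, new=None):
    """β(e), following the defaults and special cases sketched on slides 12-13."""
    if edit_type == "SUB":
        return wordnet_relation(old, new)
    if edit_type == "DEL":
        if deleted in {"not", "n't", "didn't"}:   # negation is a special case
            return NEG
        return FWD                                # generic deletion: red car ⊏ car
    if edit_type == "INS":
        return REV                                # generic insertion: sing ⊐ sing off-key
    raise ValueError(f"unknown edit type: {edit_type}")

print(lexical_relation("DEL", deleted="red"))                    # ⊏
print(lexical_relation("SUB", old="Prozac", new="medication"))   # ⊏
```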

14. The impact of semantic composition
How are semantic relations affected by semantic composition? If x and y stand in some relation, what relation holds between f(x) and f(y)?
The monotonicity calculus provides a partial answer. If f has monotonicity:
- UP:    ⊏ ↦ ⊏,  ⊐ ↦ ⊐,  # ↦ #
- DOWN:  ⊏ ↦ ⊐,  ⊐ ↦ ⊏,  # ↦ #
- NON:   ⊏ ↦ #,  ⊐ ↦ #,  # ↦ #
But how are the other relations (|, ^, ⌣) projected?

15. A typology of projectivity
Projectivity signatures are a generalization of monotonicity classes. Each projectivity signature is a map from relations to relations. In principle there are 7^7 possible signatures, but few are actually realized.
The signature of negation:
  ≡ ↦ ≡    ⊏ ↦ ⊐    ⊐ ↦ ⊏    ^ ↦ ^    | ↦ ⌣    ⌣ ↦ |    # ↦ #
Examples:
- not happy ≡ not glad
- not ill ⊏ not seasick
- didn't kiss ⊐ didn't touch
- not human ^ not nonhuman
- not French ⌣ not German
- not more than 4 | not less than 6
- isn't swimming # isn't hungry

16. A typology of projectivity (continued)
The signature of intersective modification:
  ≡ ↦ ≡    ⊏ ↦ ⊏    ⊐ ↦ ⊐    ^ ↦ |    | ↦ |    ⌣ ↦ #    # ↦ #
Examples:
- live human | live nonhuman
- French wine | Spanish wine
- metallic pipe # nonferrous pipe
See the paper for the projectivity of various quantifiers and verbs.
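[Editor's rendering of the two signatures above as explicit Python maps, with a one-step projection function; a sketch only, not the authors' code.]

```python
# The negation and intersective-modification signatures from slides 15-16,
# written as maps over the seven relations.
EQ, FWD, REV, NEG, ALT, COV, IND = "≡", "⊏", "⊐", "^", "|", "⌣", "#"

NEGATION = {EQ: EQ, FWD: REV, REV: FWD, NEG: NEG, ALT: COV, COV: ALT, IND: IND}
INTERSECTIVE_MOD = {EQ: EQ, FWD: FWD, REV: REV, NEG: ALT, ALT: ALT, COV: IND, IND: IND}

def project(signature, relation):
    """Project a relation through one level of semantic composition."""
    return signature[relation]

print(project(NEGATION, FWD))          # ⊐  (kiss ⊏ touch, so didn't kiss ⊐ didn't touch)
print(project(INTERSECTIVE_MOD, NEG))  # |  (human ^ nonhuman, so live human | live nonhuman)
```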

17. Projecting through multiple levels
Propagate the semantic relation between atoms upward, according to the projectivity class of each node on the path to the root.
Example: nobody can enter without a shirt ⊏ nobody can enter without clothes
(shirt ⊏ clothes; "without" flips this to ⊐; "nobody" flips it back to ⊏.)
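[Editor's sketch of this upward propagation, using only the monotonicity maps from slide 14 and treating "without" and "nobody" as simple downward-monotone contexts; a fuller system would use the richer signatures of slides 15-16.]

```python
# Propagate a lexical relation upward through the projectivity maps of the
# nodes on the path from the edit site to the root.
from functools import reduce

FWD, REV, IND = "⊏", "⊐", "#"
UP   = {FWD: FWD, REV: REV, IND: IND}   # upward-monotone contexts (e.g. "enter", "can")
DOWN = {FWD: REV, REV: FWD, IND: IND}   # downward-monotone contexts (e.g. "without", "nobody")

def project_to_root(lexical_relation, path):
    """Apply each node's map in order, from the edited leaf up to the root."""
    return reduce(lambda rel, signature: signature[rel], path, lexical_relation)

# shirt ⊏ clothes, projected through "without", then "enter", "can", then "nobody":
print(project_to_root(FWD, [DOWN, UP, UP, DOWN]))   # ⊏, matching the slide's conclusion
```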

18. Implicatives & factives [Nairn et al. 06]
Nine signatures, according to the implication (+, –, or o) in positive and negative contexts:
signature   example
Implicatives
  + / –     he managed to escape
  + / o     he was forced to sell
  o / –     he was permitted to live
  – / +     he forgot to pay
  – / o     he refused to fight
  o / +     he hesitated to ask
Factives
  + / +     he admitted that he knew
  – / –     he pretended he was sick
  o / o     he wanted to fly

19. Implicatives & factives (continued)
We can specify the relation generated by DEL or INS of each signature:
signature   example                                     β(DEL)   β(INS)
  + / –     he managed to escape ≡ he escaped             ≡        ≡
  + / o     he was forced to sell ⊏ he sold               ⊏        ⊐
  o / –     he was permitted to live ⊐ he lived           ⊐        ⊏
  – / +     he forgot to pay ^ he paid                    ^        ^
  – / o     he refused to fight | he fought               |        |
  o / +     he hesitated to ask ⌣ he asked                ⌣        ⌣
  + / +     he admitted that he knew ⊏ he knew            ⊏        ⊐
  – / –     he pretended he was sick | he was sick        |        |
  o / o     he wanted to fly # he flew                    #        #
There is room for variation w.r.t. infinitives, complementizers, passivization, etc.
Some are more intuitive when negated: he didn't hesitate to ask | he didn't ask.
Factives are not fully explained: he didn't admit that he knew | he didn't know.
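[For illustration only (an assumption, not the authors' code), the table above maps directly onto a small lookup.]

```python
# Implicative/factive signatures mapped to the relations generated by DEL and INS,
# following the table on slide 19.
EQ, FWD, REV, NEG, ALT, COV, IND = "≡", "⊏", "⊐", "^", "|", "⌣", "#"

# signature -> (β(DEL), β(INS))
SIGNATURES = {
    "+/-": (EQ,  EQ),   # manage to
    "+/o": (FWD, REV),  # force to
    "o/-": (REV, FWD),  # permit to
    "-/+": (NEG, NEG),  # forget to
    "-/o": (ALT, ALT),  # refuse to
    "o/+": (COV, COV),  # hesitate to
    "+/+": (FWD, REV),  # admit that (factive)
    "-/-": (ALT, ALT),  # pretend that
    "o/o": (IND, IND),  # want to
}

def edit_relation(signature, edit_type):
    """β for DEL or INS of a verb carrying the given implicative signature."""
    delete, insert = SIGNATURES[signature]
    return delete if edit_type == "DEL" else insert

print(edit_relation("o/+", "DEL"))   # ⌣  (he hesitated to ask ⌣ he asked)
```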

20. Putting it all together
1. Find a sequence of edits ⟨e_1, …, e_n⟩ which transforms p into h. Define x_0 = p, x_n = h, and x_i = e_i(x_{i-1}) for i ∈ [1, n].
2. For each atomic edit e_i:
   a. Determine the lexical semantic relation β(e_i).
   b. Project β(e_i) upward through the semantic composition tree of expression x_{i-1} to find the atomic semantic relation β(x_{i-1}, x_i).
3. Join the atomic semantic relations across the sequence of edits:
   β(p, h) = β(x_0, x_n) = β(x_0, x_1) ⋈ … ⋈ β(x_{i-1}, x_i) ⋈ … ⋈ β(x_{n-1}, x_n)
Limitations: need to find an appropriate edit sequence connecting p and h; tendency of the ⋈ operation toward less-informative semantic relations; lack of a general mechanism for combining multiple premises.
Less deductive power than FOL: can't handle e.g. de Morgan's laws.
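[Editor's end-to-end sketch of the three-step procedure above, not the NatLog system itself. Each edit is represented by its precomputed lexical relation β(e_i) together with the projectivity maps of the nodes above its edit site; the join table is deliberately partial, defaulting to # for unlisted pairs. The demo mirrors the worked example on the next slide.]

```python
# End-to-end sketch: project each edit's lexical relation, then join across edits.
from functools import reduce

EQ, FWD, REV, NEG, ALT, COV, IND = "≡", "⊏", "⊐", "^", "|", "⌣", "#"

# Partial join table; unlisted pairs fall back to # (a safe but weak default).
JOIN = {(FWD, FWD): FWD, (REV, REV): REV, (NEG, NEG): EQ,
        (ALT, NEG): FWD, (FWD, NEG): ALT, (NEG, FWD): COV}

def join(r, s):
    if r == EQ:
        return s
    if s == EQ:
        return r
    return JOIN.get((r, s), IND)

def project(lexical_relation, path):
    """Step 2b: project β(e_i) upward through the projectivity maps on the path to the root."""
    return reduce(lambda rel, sig: sig.get(rel, IND), path, lexical_relation)

def infer(edits):
    """Steps 1-3: `edits` is a list of (β(e_i), path of projectivity maps) pairs."""
    atomic = [project(lex, path) for lex, path in edits]   # step 2
    return reduce(join, atomic, EQ)                        # step 3

# Demo mirroring the next slide: "The doctor didn't hesitate to recommend Prozac."
NEGATION = {EQ: EQ, FWD: REV, REV: FWD, NEG: NEG, ALT: COV, COV: ALT, IND: IND}
edits = [(COV, [NEGATION]),   # DEL(hesitate to): ⌣, projected through "didn't" to |
         (NEG, []),           # DEL(didn't): ^
         (FWD, [])]           # SUB(Prozac, medication): ⊏
print(infer(edits))           # ⊏  ->  entailment ("yes")
```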

21. An example
P: The doctor didn't hesitate to recommend Prozac.
H: The doctor recommended medication.
Answer: yes
 i   e_i                       x_i                                               lex   atom   join
 0                             The doctor didn't hesitate to recommend Prozac.
 1   DEL(hesitate to)          The doctor didn't recommend Prozac.                ⌣     |      |
 2   DEL(didn't)               The doctor recommended Prozac.                     ^     ^      ⊏
 3   SUB(Prozac, medication)   The doctor recommended medication.                 ⊏     ⊏      ⊏
Final relation: ⊏, so the answer is yes.

22. Different edit orders?
The same three edits can be applied in any order. For each ordering, the lex / atom / join relations at each step are:
1. DEL(hesitate to) ⌣/|/|;  DEL(didn't) ^/^/⊏;  SUB(Prozac, medication) ⊏/⊏/⊏
2. DEL(didn't) ^/^/^;  DEL(hesitate to) ⌣/⌣/⊏;  SUB(Prozac, medication) ⊏/⊏/⊏
3. SUB(Prozac, medication) ⊏/⊏/⊏;  DEL(hesitate to) ⌣/|/|;  DEL(didn't) ^/^/⊏
4. DEL(hesitate to) ⌣/|/|;  SUB(Prozac, medication) ⊏/⊐/|;  DEL(didn't) ^/^/⊏
5. DEL(didn't) ^/^/^;  SUB(Prozac, medication) ⊏/⊐/|;  DEL(hesitate to) ⌣/⌣/⊏
6. SUB(Prozac, medication) ⊏/⊏/⊏;  DEL(didn't) ^/^/|;  DEL(hesitate to) ⌣/⌣/⊏
Intermediate steps may vary; the final result is typically (though not necessarily) the same.

23. Implementation & evaluation
The NatLog system is an implementation of this model; for details, see [MacCartney & Manning 2008].
Evaluation on the FraCaS test suite:
- 183 NLI problems, nine sections, three-way classification
- Accuracy 70% overall; 87% on "relevant" sections (60% coverage)
- Precision 89% overall: rarely predicts entailment wrongly
Evaluation on the RTE3 test suite:
- Longer, more natural premises; greater diversity of inference types
- NatLog alone has mediocre accuracy (59%) but good precision
- Hybridization with a broad-coverage RTE system yields gains of 4%

24. Conclusion
Natural logic is not a universal solution for NLI:
- Many types of inference are not amenable to the natural logic approach
- Our inference method faces many limitations on deductive power
More work to be done in fleshing out our account:
- Establishing projectivity signatures for more quantifiers, verbs, etc.
- Better incorporating presuppositions
But our model of natural logic fills an important niche:
- Precise reasoning on negation, antonymy, quantifiers, implicatives, …
- Sidesteps the myriad difficulties of full semantic interpretation
- Practical value demonstrated on the FraCaS and RTE3 test suites
Thanks! Questions?