Natural Logic and Natural Language Inference
Bill MacCartney
Stanford University / Google, Inc.
8 April 2011
Two disclaimers

- The work I present today isn't exactly fresh: essentially, it's my dissertation work from 2009. I hope it can usefully provide context for more recent work.
- I'm a computer scientist, not a semanticist or a logician. Consequently, I emphasize pragmatism over rigor.

Introduction Entailment Relations Joins Lexical Relations Projectivity Implicatives Inference Evaluation Conclusion
Natural language inference (NLI)

- Aka recognizing textual entailment (RTE): does premise P justify an inference to hypothesis H?
- An informal, intuitive notion of inference: not strict logic; the emphasis is on variability of linguistic expression.
- Necessary to the goal of natural language understanding (NLU); can also enable semantic search, question answering, ...

P: Every firm polled saw costs grow more than expected, even after adjusting for inflation.
H: Every big company in the poll reported cost increases.
Answer: yes
NLI: a spectrum of approaches

From robust but shallow to deep but brittle:
- lexical/semantic overlap (Jijkoun & de Rijke 2005)
- patterned relation extraction (Romano et al. 2006)
- semantic graph matching (MacCartney et al. 2006; Hickl et al. 2006)
- FOL & theorem proving (Bos & Markert 2006)

Problem with shallow approaches: imprecise; easily confounded by negation, quantifiers, conditionals, factive & implicative verbs, etc.
Problem with FOL: hard to translate NL to FOL: idioms, anaphora, ellipsis, intensionality, tense, aspect, vagueness, modals, indexicals, reciprocals, propositional attitudes, scope ambiguities, anaphoric adjectives, non-intersective adjectives, temporal & causal relations, unselective quantifiers, adverbs of quantification, donkey sentences, generic determiners, comparatives, phrasal verbs, ...
Solution? Natural logic (this work).
What is natural logic? (not to be confused with natural deduction)

- Characterizes valid patterns of inference via surface forms: precise, yet sidesteps the difficulties of translating to FOL.
- A long history:
  - traditional logic: Aristotle's syllogisms, scholastics, Leibniz, ...
  - the term "natural logic" was introduced by Lakoff (1970)
  - van Benthem & Sánchez Valencia (1986-91): monotonicity calculus
  - Nairn et al. (2006): an account of implicatives & factives
- We introduce a new theory of natural logic:
  - extends the monotonicity calculus to account for negation & exclusion
  - incorporates elements of Nairn et al.'s model of implicatives
- ...and implement & evaluate a computational model of it.
'Entailment' relations in past work

- 2-way (RTE1,2,3): yes = entailment; no = non-entailment
- 3-way (FraCaS, PARC, RTE4): yes = entailment; no = contradiction; unknown = compatibility
- Containment (Sánchez Valencia): P = Q equivalence; P < Q forward entailment; P > Q reverse entailment; P # Q non-entailment

Example pairs: X is a man / X is a woman; X is a hippo / X is hungry; X is a fish / X is a carp; X is a crow / X is a bird; X is a couch / X is a sofa
16 elementary set relations

Assign a pair of sets x, y to one of 16 relations, depending on the emptiness or non-emptiness of each of the four partitions of the universe induced by x and y (x ∩ y, x − y, y − x, and the remainder).
16 elementary set relations (continued)

But 9 of the 16 are degenerate: either x or y is either empty or universal. That is, they correspond to semantically vacuous expressions, which are rare outside logic textbooks. We therefore focus on the remaining seven relations.
The set of 7 basic entailment relations

- x ≡ y: equivalence (couch ≡ sofa)
- x ⊏ y: forward entailment, strict (crow ⊏ bird)
- x ⊐ y: reverse entailment, strict (European ⊐ French)
- x ^ y: negation, i.e. exhaustive exclusion (human ^ nonhuman)
- x | y: alternation, i.e. non-exhaustive exclusion (cat | dog)
- x ⌣ y: cover, i.e. exhaustive non-exclusion (animal ⌣ nonhuman)
- x # y: independence (hungry # hippo)

Relations are defined for all semantic types: tiny ⊏ small, hover ⊏ fly, kick ⊏ strike, this morning ⊏ today, in Beijing ⊏ in China, everyone ⊏ someone, all ⊏ most ⊏ some.
Joining entailment relations

If x R y and y S z, what relation holds between x and z? The answer is the join, written R ⋈ S.

Example: fish | human and human ^ nonhuman; joining | ⋈ ^ yields fish ⊏ nonhuman.
Some joins yield unions of relations!

What is | ⋈ |? Given x | y and y | z, any of several relations may hold between x and z:

- couch | table and table | sofa: couch ≡ sofa
- pistol | knife and knife | gun: pistol ⊏ gun
- dog | cat and cat | terrier: dog ⊐ terrier
- rose | orchid and orchid | daisy: rose | daisy
- woman | frog and frog | Eskimo: woman # Eskimo

So | ⋈ | = {≡, ⊏, ⊐, |, #}.
The complete join table

Of the 49 join pairs, 32 yield basic entailment relations; the other 17 yield unions of relations. Larger unions convey less information, which limits the power of inference. In practice, any union which contains # can be approximated by #.
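The join table need not be taken on faith: because the seven relations have set-theoretic definitions, each join can be computed by brute force over a small universe. The sketch below is my own illustration (not the NatLog implementation), with ASCII labels 'equiv', 'fwd', 'rev', 'neg', 'alt', 'cov', 'indep' standing in for ≡, ⊏, ⊐, ^, |, ⌣, #:

```python
from itertools import combinations, product

def relation(x, y, U):
    """Classify a pair of sets into one of the 7 basic entailment relations."""
    if x == y:
        return 'equiv'                        # x ≡ y
    if x < y:
        return 'fwd'                          # x ⊏ y (forward entailment)
    if x > y:
        return 'rev'                          # x ⊐ y (reverse entailment)
    disjoint, exhaustive = not (x & y), (x | y) == U
    if disjoint and exhaustive:
        return 'neg'                          # x ^ y
    if disjoint:
        return 'alt'                          # x | y
    if exhaustive:
        return 'cov'                          # x ⌣ y
    return 'indep'                            # x # y

def subsets(U):
    """Proper, non-empty subsets only: the 9 degenerate cases are excluded."""
    return [frozenset(c) for r in range(1, len(U))
            for c in combinations(sorted(U), r)]

def join(R, S, n=4):
    """R ⋈ S: every relation realizable between x and z given x R y and y S z."""
    U = frozenset(range(n))
    out = set()
    for x, y, z in product(subsets(U), repeat=3):
        if relation(x, y, U) == R and relation(y, z, U) == S:
            out.add(relation(x, z, U))
    return out

print(join('neg', 'neg'))  # {'equiv'}: the complement of a complement is the set itself
print(join('alt', 'alt'))  # the union {≡, ⊏, ⊐, |, #} from the previous slide
```

A 4-element universe suffices for the joins discussed on these slides, though in general a larger universe guards against missing a realizable pattern.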
Lexical entailment relations

An atomic edit e (DEL, INS, or SUB) maps a compound expression x to e(x). The entailment relation between x and e(x) will depend on:
1. the lexical entailment relation generated by e: β(e)
2. other properties of the context x in which e is applied

Example: suppose x is "red car".
- If e is SUB(car, convertible), then β(e) is ⊐.
- If e is DEL(red), then β(e) is ⊏.

Crucially, β(e) depends solely on the lexical items in e, independent of the context x. But how are lexical entailment relations determined?
Lexical entailment relations: SUBs

β(SUB(x, y)) is just the relation between x and y. For open-class terms, use a lexical resource (e.g. WordNet):

- ≡ for synonyms: sofa ≡ couch, forbid ≡ prohibit
- ⊏ for hypo-/hypernyms: crow ⊏ bird, frigid ⊏ cold, soar ⊏ rise
- | for antonyms and coordinate terms: hot | cold, cat | dog
- ≡ or | for proper nouns: USA ≡ United States, JFK | FDR
- # for most other pairs: hungry # hippo

Closed-class terms may require special handling:
- Quantifiers: all ⊏ some, some ^ no, no | all, at least 4 ⌣ at most 6
- See the paper for discussion of pronouns, prepositions, ...
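The SUB rules above amount to a lookup against a lexical resource. As a sketch, a toy hand-built lexicon stands in for WordNet below; all entries and names are illustrative, not NatLog's actual interface, and the ASCII labels 'equiv', 'fwd', 'rev', 'alt', 'indep' stand for ≡, ⊏, ⊐, |, #:

```python
# Toy stand-in for WordNet lookups; entries are illustrative only.
SYNONYMS  = {frozenset(p) for p in [('sofa', 'couch'), ('forbid', 'prohibit')]}
HYPONYMS  = {('crow', 'bird'), ('frigid', 'cold'), ('soar', 'rise')}
EXCLUSIVE = {frozenset(p) for p in [('hot', 'cold'), ('cat', 'dog')]}

def beta_sub(x, y):
    """Lexical entailment relation generated by SUB(x, y)."""
    pair = frozenset((x, y))
    if x == y or pair in SYNONYMS:
        return 'equiv'            # sofa ≡ couch
    if (x, y) in HYPONYMS:
        return 'fwd'              # crow ⊏ bird
    if (y, x) in HYPONYMS:
        return 'rev'              # bird ⊐ crow
    if pair in EXCLUSIVE:
        return 'alt'              # antonyms and coordinate terms: cat | dog
    return 'indep'                # default: hungry # hippo

print(beta_sub('crow', 'bird'))     # fwd
print(beta_sub('hungry', 'hippo'))  # indep
```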
Lexical entailment relations: DEL & INS

Generic (default) case: β(DEL(·)) = ⊏, β(INS(·)) = ⊐.
- Examples: red car ⊏ car, sing ⊐ sing off-key
- Even quite long phrases: car parked outside since last week ⊏ car
- Applies to intersective modifiers, conjuncts, independent clauses, ...
- This heuristic underlies most approaches to RTE! Does P subsume H? Deletions OK; insertions penalized.

Special cases:
- Negation: didn't sleep ^ did sleep
- Implicatives & factives (e.g. refuse to, admit that): discussed later
- Non-intersective adjectives: former spy | spy, alleged spy # spy
- Auxiliaries etc.: is sleeping ≡ sleeps, did sleep ≡ slept
The impact of semantic composition

How are entailment relations affected by semantic composition? If a relation holds between x and y, what relation holds between f(x) and f(y)? (In the tree diagrams, @ means function application.)

The monotonicity calculus provides a partial answer, according to the monotonicity of f:
- UP: ⊏ ↦ ⊏, ⊐ ↦ ⊐, # ↦ #
- DOWN: ⊏ ↦ ⊐, ⊐ ↦ ⊏, # ↦ #
- NON: ⊏ ↦ #, ⊐ ↦ #, # ↦ #

But how are the other relations (|, ^, ⌣) projected?
A typology of projectivity

Projectivity signatures: a generalization of monotonicity classes. Each projectivity signature is a map from relations to relations; in principle there are 7^7 possible signatures, but few are actually realized.

Negation: ≡ ↦ ≡, ⊏ ↦ ⊐, ⊐ ↦ ⊏, ^ ↦ ^, | ↦ ⌣, ⌣ ↦ |, # ↦ #
Examples: not happy ≡ not glad; not ill ⊏ not seasick; didn't kiss ⊐ didn't touch; not human ^ not nonhuman; not French ⌣ not German; not more than 4 | not less than 6; isn't swimming # isn't hungry.
A typology of projectivity (continued)

Intersective modification: ≡ ↦ ≡, ⊏ ↦ ⊏, ⊐ ↦ ⊐, ^ ↦ |, | ↦ |, ⌣ ↦ #, # ↦ #
Examples: live human | live nonhuman (from human ^ nonhuman); French wine | Spanish wine; metallic pipe # nonferrous pipe.

See my dissertation for the projectivity of various quantifiers and verbs.
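A projectivity signature is just a finite map over the seven relations, so the two signatures on these slides can be transcribed directly. The relation names below are my ASCII labels ('cov' is the ⌣ cover relation); this is a transcription sketch, not NatLog code:

```python
# Projectivity signatures as maps over the 7 basic relations.
NEGATION = {
    'equiv': 'equiv',  # not happy ≡ not glad
    'fwd': 'rev',      # not ill ⊏ not seasick becomes... (⊏ and ⊐ swap)
    'rev': 'fwd',
    'neg': 'neg',      # not human ^ not nonhuman
    'alt': 'cov',      # not French ⌣ not German
    'cov': 'alt',      # not more than 4 | not less than 6
    'indep': 'indep',  # isn't swimming # isn't hungry
}

INTERSECTIVE = {
    'equiv': 'equiv',
    'fwd': 'fwd',
    'rev': 'rev',
    'neg': 'alt',      # live human | live nonhuman
    'alt': 'alt',      # French wine | Spanish wine
    'cov': 'indep',    # metallic pipe # nonferrous pipe
    'indep': 'indep',
}

print(NEGATION['alt'])       # cov
print(INTERSECTIVE['neg'])   # alt
```

Upward monotonicity corresponds to the identity map on {≡, ⊏, ⊐, #}; a full signature additionally says what happens to ^, |, and ⌣.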
Projecting through multiple levels

Propagate the entailment relation between atoms upward, according to the projectivity class of each node on the path to the root of the semantic composition tree.

Example: SUB(a shirt, clothes) generates a shirt ⊏ clothes. The edit site lies beneath "without" (downward monotone), "can" (upward monotone), and "nobody" (downward monotone); the relation flips to ⊐ under "without", is preserved by "can", and flips back to ⊏ under "nobody". Hence: nobody can enter without a shirt ⊏ nobody can enter without clothes.
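The upward propagation can be sketched as folding the enclosing operators' maps over the lexical relation, innermost operator first. For brevity this uses monotonicity classes only (enough for ⊏/⊐); the labels are my own:

```python
# Monotonicity-only projection: a partial view of projectivity.
PROJECT = {
    'up':   {'equiv': 'equiv', 'fwd': 'fwd',   'rev': 'rev',   'indep': 'indep'},
    'down': {'equiv': 'equiv', 'fwd': 'rev',   'rev': 'fwd',   'indep': 'indep'},
    'non':  {'equiv': 'equiv', 'fwd': 'indep', 'rev': 'indep', 'indep': 'indep'},
}

def project(rel, path_to_root):
    """Project a lexical relation upward through enclosing operators."""
    for operator in path_to_root:
        rel = PROJECT[operator][rel]
    return rel

# SUB(a shirt, clothes) generates fwd (a shirt ⊏ clothes); the edit site
# sits under 'without' (down), 'can' (up), and 'nobody' (down):
print(project('fwd', ['down', 'up', 'down']))  # fwd: the ⊏ survives the double flip
```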
Implicatives & factives [Nairn et al. 06]

Nine signatures, according to the implications (+, –, or o) carried in positive and negative contexts:

- Implicatives: +/– he managed to escape; +/o he was forced to sell; o/– he was permitted to live
- Implicatives: –/+ he forgot to pay; –/o he refused to fight; o/+ he hesitated to ask
- Factives: +/+ he admitted that he knew; –/– he pretended he was sick; o/o he wanted to fly
Implicatives & factives (continued)

We can specify the relation generated by DEL or INS of each signature:

- +/– (he managed to escape ≡ he escaped): β(DEL) = ≡, β(INS) = ≡
- +/o (he was forced to sell ⊏ he sold): β(DEL) = ⊏, β(INS) = ⊐
- o/– (he was permitted to live ⊐ he lived): β(DEL) = ⊐, β(INS) = ⊏
- –/+ (he forgot to pay ^ he paid): β(DEL) = ^, β(INS) = ^
- –/o (he refused to fight | he fought): β(DEL) = |, β(INS) = |
- o/+ (he hesitated to ask ⌣ he asked): β(DEL) = ⌣, β(INS) = ⌣
- +/+ (he admitted that he knew ⊏ he knew): β(DEL) = ⊏, β(INS) = ⊐
- –/– (he pretended he was sick | he was sick): β(DEL) = |, β(INS) = |
- o/o (he wanted to fly # he flew): β(DEL) = #, β(INS) = #

Room for variation w.r.t. infinitives, complementizers, passivization, etc. Some are more intuitive when negated: he didn't hesitate to ask | he didn't ask. Factives are not fully explained: he didn't admit that he knew | he didn't know.
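Since the β(INS) column is always the converse of the β(DEL) column, the whole table can be stored as one map plus a converse operation. A transcription sketch with my ASCII relation labels:

```python
# Beta(DEL) for each implicative/factive signature, transcribed from the table.
DEL_REL = {
    '+/-': 'equiv',  # managed to:      he managed to escape ≡ he escaped
    '+/o': 'fwd',    # was forced to:   ... ⊏ he sold
    'o/-': 'rev',    # was permitted to
    '-/+': 'neg',    # forgot to:       ... ^ he paid
    '-/o': 'alt',    # refused to:      ... | he fought
    'o/+': 'cov',    # hesitated to
    '+/+': 'fwd',    # admitted that (factive)
    '-/-': 'alt',    # pretended that
    'o/o': 'indep',  # wanted to
}

CONVERSE = {'equiv': 'equiv', 'fwd': 'rev', 'rev': 'fwd',
            'neg': 'neg', 'alt': 'alt', 'cov': 'cov', 'indep': 'indep'}

def ins_rel(signature):
    """Beta(INS) is the converse of Beta(DEL)."""
    return CONVERSE[DEL_REL[signature]]

print(ins_rel('+/o'))  # rev: inserting 'was forced to' yields ⊐
```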
Putting it all together

1. Find a sequence of edits e_1, ..., e_n which transforms p into h. Define x_0 = p, x_n = h, and x_i = e_i(x_{i-1}) for i in [1, n].
2. For each atomic edit e_i:
   a. Determine the lexical entailment relation β(e_i).
   b. Project β(e_i) upward through the semantic composition tree of expression x_{i-1} to find the atomic entailment relation between x_{i-1} and x_i.
3. Join the atomic entailment relations across the sequence of edits:
   rel(p, h) = rel(x_0, x_n) = rel(x_0, x_1) ⋈ ... ⋈ rel(x_{i-1}, x_i) ⋈ ... ⋈ rel(x_{n-1}, x_n)

Limitations: need to find an appropriate edit sequence connecting p and h; the tendency of the ⋈ operation toward less-informative entailment relations; lack of a general mechanism for combining multiple premises. Less deductive power than FOL: can't handle e.g. De Morgan's laws.
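Step 3 is a fold of ⋈ over the per-edit atomic relations. A minimal sketch, using only the handful of join-table entries needed for the Prozac example on the next slide (the table fragment and relation labels are mine; in the full model a join may return a union of relations, but each entry needed here is a single relation):

```python
from functools import reduce

# A hand-transcribed fragment of the 7x7 join table.
JOIN_TABLE = {
    ('alt', 'neg'): 'fwd',   # x | y and y ^ z imply x ⊏ z
    ('fwd', 'fwd'): 'fwd',
    ('neg', 'neg'): 'equiv',
}

def join(r, s):
    """R ⋈ S, with ≡ as the identity element."""
    if r == 'equiv':
        return s
    if s == 'equiv':
        return r
    return JOIN_TABLE[(r, s)]

def infer(atomic_relations):
    """Join the atomic entailment relations across the edit sequence."""
    return reduce(join, atomic_relations, 'equiv')

# DEL(hesitate to) -> alt, DEL(didn't) -> neg, SUB(Prozac, medication) -> fwd:
print(infer(['alt', 'neg', 'fwd']))  # fwd: the premise entails the hypothesis
```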
An example

P: The doctor didn't hesitate to recommend Prozac.
H: The doctor recommended medication.
Answer: yes

1. DEL(hesitate to): The doctor didn't recommend Prozac. lex ⌣, atom | (projected through the negation), join |
2. DEL(didn't): The doctor recommended Prozac. lex ^, atom ^, join | ⋈ ^ = ⊏
3. SUB(Prozac, medication): The doctor recommended medication. lex ⊏, atom ⊏, join ⊏ ⋈ ⊏ = ⊏

The final join is ⊏, so the answer is yes.
Different edit orders?

The same three edits can be applied in any of six orders. The lexical and atomic relations at intermediate steps vary with the order (for example, applying SUB(Prozac, medication) while the negation is still present projects ⊏ to ⊐), but all six orders here arrive at the final relation ⊏. Intermediate steps may vary; the final result is typically (though not necessarily) the same.
Implementation & evaluation

The NatLog system: an implementation of this model in code. For implementation details, see [MacCartney & Manning 2008].

Evaluation on the FraCaS test suite:
- 183 NLI problems, nine sections, three-way classification
- Accuracy 70% overall; 87% on "relevant" sections (60% coverage)
- Precision 89% overall: rarely predicts entailment wrongly

Evaluation on the RTE3 test suite:
- Longer, more natural premises; greater diversity of inference types
- NatLog alone has mediocre accuracy (59%) but good precision
- Hybridization with a broad-coverage RTE system yields gains of 4%
Conclusions

Natural logic is not a universal solution for NLI:
- Many types of inference are not amenable to the natural logic approach
- Our inference method faces many limitations on deductive power

More work to be done in fleshing out our account:
- Establishing projectivity signatures for more quantifiers, verbs, etc.
- Better incorporating presuppositions

But our model of natural logic fills an important niche:
- Precise reasoning on negation, antonymy, quantifiers, implicatives, ...
- Sidesteps the myriad difficulties of full semantic interpretation
- Practical value demonstrated on the FraCaS and RTE3 test suites

Thanks! Questions?
Backup slides follow
An example involving exclusion

P: Stimpy is a cat.
H: Stimpy is not a poodle.
Answer: yes

1. SUB(cat, dog): Stimpy is a dog. lex |, atom |, join |
2. INS(not): Stimpy is not a dog. lex ^, atom ^, join | ⋈ ^ = ⊏
3. SUB(dog, poodle): Stimpy is not a poodle. lex ⊐, atom ⊏ (projected through the negation), join ⊏ ⋈ ⊏ = ⊏

The final join is ⊏, so the answer is yes.
An example involving an implicative

P: We were not permitted to smoke.
H: We smoked Cuban cigars.
Answer: no

1. DEL(permitted to): We did not smoke. lex ⊐, atom ⊏ (projected through the negation), join ⊏
2. DEL(not): We smoked. lex ^, atom ^, join ⊏ ⋈ ^ = |
3. INS(Cuban cigars): We smoked Cuban cigars. lex ⊐, atom ⊐, join | ⋈ ⊐ = |

The final join is |, so the answer is no.
De Morgan's laws for quantifiers

P: Not all birds fly.
H: Some birds do not fly.
Answer: yes

1. DEL(not): All birds fly. lex ^, atom ^, join ^
2. SUB(all, some): Some birds fly. lex ⊏, atom ⊏
3. INS(not): Some birds do not fly. lex ^

Joining across this sequence yields only #: the inference is valid, but the method fails to establish it.
De Morgan's laws for quantifiers (2)

The same problem with a different edit order:

1. DEL(not): All birds fly. lex ^, atom ^, join ^
2. INS(not): All birds do not fly. lex ^
3. SUB(all, some): Some birds do not fly. lex ⊏

Again the join yields only #. De Morgan's laws for quantifiers lie beyond the deductive power of this inference method.
A more complex example

P: Jimmy Dean refused to move without blue jeans.
H: James Dean didn't dance without pants.
Answer: yes

1. SUB(Jimmy Dean, James Dean): lex ≡, atom ≡, join ≡
2. DEL(refuse to): lex |, atom |, join |
3. INS(did): lex ≡, atom ≡, join |
4. INS(n't): lex ^, atom ^, join | ⋈ ^ = ⊏
5. SUB(move, dance): lex ⊐, atom ⊏, join ⊏
6. DEL(blue): lex ⊏, atom ⊏, join ⊏
7. SUB(jeans, pants): lex ⊏, atom ⊏, join ⊏
A more complex example (2)

The same problem with a different edit order:

1. INS(did): lex ≡, atom ≡, join ≡
2. INS(n't): lex ^, atom ^, join ^
3. DEL(blue): lex ⊏, atom ⊐, join |
4. SUB(jeans, pants): lex ⊏, atom ⊐, join |
5. SUB(move, dance): lex ⊐, atom ⊐, join |
6. DEL(refuse to): lex |, atom ⌣ (projected through the negation), join ⊏
7. SUB(Jimmy Dean, James Dean): lex ≡, atom ≡, join ⊏

The intermediate relations differ, but the final result is again ⊏.
A more complex example (3)

A third edit order (INS(did), INS(n't), DEL(refuse to), then the remaining deletions and substitutions) passes through still other intermediate relations, some of them less informative. The intermediate bookkeeping varies with the edit order even though the problem and its answer do not.