1 Semantic Interpretation (Allen’s Chapter 9; J&M’s Chapter 15)

2 Rule-by-rule semantic interpretation
Computing logical forms (i.e., semantic interpretation) can be coupled with parsing
A syntactic tree can likewise be generated from a specified logical form (i.e., semantic realization)
To couple syntax and semantics, the meaning of every constituent must be determined
By using features, the relation between the meaning of a constituent and the meanings of its subconstituents can be specified in the grammar
Each syntactic rule is associated with a semantic interpretation rule (this is called rule-by-rule semantic interpretation)
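As a concrete illustration, here is a minimal Python sketch (the tuple encoding of logical forms, the rule table, and all names are assumptions for illustration, not Allen's implementation) of pairing each syntactic rule with a semantic combination function:

```python
# Each syntactic rule is paired with a semantic rule that builds the
# parent's meaning from its children's meanings (rule-by-rule).
RULES = {
    # rule name: (RHS categories, semantic combination function)
    "S -> NP VP": (["NP", "VP"], lambda np, vp: vp(np)),            # apply VP predicate to subject
    "VP -> V NP": (["V", "NP"], lambda v, np: lambda subj: (v, subj, np)),
}

# Toy lexicon: each word paired with its meaning.
LEXICON = {
    "Jill": ("NAME", "j1", "Jill"),
    "loves": "LOVES1",
    "Sue": ("NAME", "s1", "Sue"),
}

def interpret(rule, child_meanings):
    """Combine the children's meanings with the rule's semantic function."""
    _, combine = RULES[rule]
    return combine(*child_meanings)

# Build the meaning of 'Jill loves Sue' bottom-up, rule by rule.
vp = interpret("VP -> V NP", [LEXICON["loves"], LEXICON["Sue"]])
s = interpret("S -> NP VP", [LEXICON["Jill"], vp])
print(s)  # ('LOVES1', ('NAME', 'j1', 'Jill'), ('NAME', 's1', 'Sue'))
```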

3 Semantic interpretation and compositionality
Semantic interpretation is assumed to be a compositional process (similar to parsing)
The meaning of a constituent is derived from the meanings of its subconstituents
Interpretations can be built incrementally from the interpretations of subphrases
Compositional models make grammars easier to extend and maintain

4 Semantic interpretation and compositionality
Integrating parsing and semantic interpretation is harder than it may seem
One classic obstacle is the mismatch between syntactic structures and the corresponding structures of the logical forms
The syntactic structure of Jill loves every dog is: ((Jill) (loves (every dog)))
Its unambiguous logical form is: (EVERY d1 : (DOG1 d1) (LOVES1 l1 (NAME j1 “Jill”) d1))

5 Difficulties of semantic interpretation via compositionality
In the syntactic structure, every dog is a subconstituent of the VP, whereas in the logical form the situation is reversed
How could the meaning of every dog be represented in isolation and then used to construct the meaning of the sentence?
Using quasi-logical forms is one way around this problem: (LOVES1 l1 (NAME j1 “Jill”) <EVERY d1 DOG1>)

6 Problems with idioms
Another obstacle for the compositionality assumption is the presence of idioms: Jack kicked the bucket = Jack died
Solution 1: assign a semantic meaning to the entire phrase, rather than building it incrementally
What about: The bucket was kicked by Jack?
Jack kicked the bucket is ambiguous between:
–(KICK1 k1 (NAME j1 “Jack”) <THE b1 BUCKET1>)
–(DIE1 d1 (NAME j1 “Jack”))
Solution 2: add a new sense of die for the verb kick, subcategorized for an object BUCKET1

7 Semantic interpretation and compositionality
In compositional semantic interpretation, we should be able to assign a semantic structure to any syntactic constituent
For instance, we should be able to assign a uniform kind of meaning to every verb phrase, in any rule involving a VP
In Jack laughed, with logical form (LAUGHS1 l1 (NAME j1 “Jack”)), the VP laughed contributes a unary predicate applied to the subject
In Jack loved Sue, with logical form (LOVES1 l2 (NAME j1 “Jack”) (NAME s1 “Sue”)), the VP loved Sue must likewise contribute a unary predicate
But how can we express such complex unary predicates?
Lambda calculus provides a formalism for representing such predicates

8 Lambda calculus
Using lambda calculus, loved Sue is represented as: (λ x (LOVES1 l1 x (NAME s1 “Sue”)))
We can apply a lambda expression to an argument by a process called lambda reduction:
(λ x (LOVES1 l1 x (NAME s1 “Sue”))) (NAME j1 “Jack”) = (LOVES1 l1 (NAME j1 “Jack”) (NAME s1 “Sue”))
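A minimal sketch of symbolic lambda reduction over tuple-structured logical forms (the encoding and the helper names are assumptions, not Allen's implementation):

```python
# ('LAMBDA', var, body) marks an abstraction; reduction is substitution.
def substitute(body, var, value):
    """Replace every occurrence of var in body with value."""
    if body == var:
        return value
    if isinstance(body, tuple):
        return tuple(substitute(part, var, value) for part in body)
    return body

def reduce_lambda(expr, arg):
    """Apply ('LAMBDA', var, body) to arg by substitution (beta reduction)."""
    tag, var, body = expr
    assert tag == 'LAMBDA'
    return substitute(body, var, arg)

loved_sue = ('LAMBDA', 'x', ('LOVES1', 'l1', 'x', ('NAME', 's1', 'Sue')))
print(reduce_lambda(loved_sue, ('NAME', 'j1', 'Jack')))
# ('LOVES1', 'l1', ('NAME', 'j1', 'Jack'), ('NAME', 's1', 'Sue'))
```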

9 Lambda calculus
Lambda calculus provides a handy tool for coupling syntax and semantics
Now verb phrases with different structures can easily be conjoined: Sue laughs and opens the door
(λ a (LAUGHS1 l1 a)) and (λ a (OPENS1 o1 a)) can be conjoined into: (λ a (& (LAUGHS1 l1 a) (OPENS1 o1 a)))
Applying this to (NAME s1 “Sue”) produces the logical form: (& (LAUGHS1 l1 (NAME s1 “Sue”)) (OPENS1 o1 (NAME s1 “Sue”)))
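The same pointwise conjunction can be sketched with Python callables standing in for the lambda expressions (a toy encoding, assumed for illustration):

```python
def laughs(a):
    return ('LAUGHS1', 'l1', a)

def opens(a):
    return ('OPENS1', 'o1', a)

def conjoin(p, q):
    """(lambda a (& (p a) (q a))): conjoin two unary predicates."""
    return lambda a: ('&', p(a), q(a))

sue = ('NAME', 's1', 'Sue')
print(conjoin(laughs, opens)(sue))
# ('&', ('LAUGHS1', 'l1', ('NAME', 's1', 'Sue')), ('OPENS1', 'o1', ('NAME', 's1', 'Sue')))
```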

10 Lambda calculus (cont.)
Prepositional phrase modifiers in noun phrases can be handled in different ways:
1. Search for the locations of modifiers in noun phrases and incorporate them into the interpretations
Works for the man in the store, but not for the man is in the store or the man was thought to be in the store
2. Give an independent meaning to prepositional phrases
–in the store is represented by a unary predicate: (λ o (IN-LOC1 o <THE s1 STORE1>))
The man in the store: <THE m1 (& (MAN1 m1) (IN-LOC1 m1 <THE s1 STORE1>))>
The man is in the store: (IN-LOC1 <THE m1 MAN1> <THE s1 STORE1>)

11 Compositional approach to semantics
In general, each major syntactic phrase corresponds to a particular semantic construction:
–VPs and PPs map to unary predicates
–sentences map to propositions
–NPs map to terms
–minor categories are used in building major categories

12 A simple grammar and lexicon for semantic interpretation
By using features, logical forms can be computed during parsing
The main extension needed is a SEM feature, which is added to each lexical entry and rule
Example: (S SEM (?semvp ?semnp)) → (NP SEM ?semnp) (VP SEM ?semvp)
NP with SEM (NAME m1 “Mary”)
VP with SEM (λ a (SEES1 e8 a (NAME j1 “Jack”)))
Then the SEM feature of S is: ((λ a (SEES1 e8 a (NAME j1 “Jack”))) (NAME m1 “Mary”)), which is reduced to (SEES1 e8 (NAME m1 “Mary”) (NAME j1 “Jack”))

13 Mary sees Jack

14 Sample lexicon with SEM features

15 A sample grammar with SEM and VAR features
A feature called VAR is also needed, to store the discourse variable that corresponds to the constituent
The VAR feature is automatically generated when a lexical constituent is constructed from a word
Then the VAR feature is passed up the parse tree by being treated as a head feature
This guarantees that the discourse variables are always unique
If ?v is m1, the SEM of the man is: <THE m1 MAN1>

16 Morphological rules with SEM features
The lexical rules for morphological derivations must also be modified to handle SEM features
For instance, the rule that converts singular nouns to plural ones takes the SEM of the singular noun and adds a PLUR operator (L7)
A similar technique is used for tense operators (L1, L2, and L3)

17 Using chart parsing for semantic interpretation
Only two simple modifications are needed for the standard chart parser to handle semantic interpretation:
When a lexical rule is instantiated, the VAR feature is set to a new discourse variable
Whenever a constituent is built, its SEM is simplified by performing any possible lambda reductions
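A minimal sketch of the two hooks (hypothetical helper names, not the chart parser's real API; a Python callable paired with a pending argument stands in for an unreduced lambda application):

```python
import itertools

_vars = itertools.count(1)

def fresh_var(prefix):
    """Modification 1: a new discourse variable, e.g. d1, e2, m3, ..."""
    return f"{prefix}{next(_vars)}"

def simplify(sem):
    """Modification 2: perform any possible lambda reductions.
    A (callable, argument) pair represents a pending application."""
    if isinstance(sem, tuple) and len(sem) == 2 and callable(sem[0]):
        return simplify(sem[0](sem[1]))
    return sem

# A lexical constituent gets a fresh VAR when its rule is instantiated:
dog = {'CAT': 'N', 'VAR': fresh_var('d'), 'SEM': 'DOG1'}

# A newly built constituent gets its SEM simplified by lambda reduction:
vp_sem = lambda a: ('SEES1', 'e8', a, ('NAME', 'j1', 'Jack'))
s = {'CAT': 'S', 'SEM': simplify((vp_sem, ('NAME', 'm1', 'Mary')))}
print(dog['VAR'], s['SEM'])
# d1 ('SEES1', 'e8', ('NAME', 'm1', 'Mary'), ('NAME', 'j1', 'Jack'))
```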

18 Jill saw the dog

19 Handling auxiliary verbs
(VP SEM (λ a1 (?semaux (?semvp a1)))) → (AUX SUBCAT ?v SEM ?semaux) (VP VFORM ?v SEM ?semvp)
If ?semaux is the modal operator CAN1, and ?semvp is (λ x (LAUGHS1 e3 x)),
then the SEM of the VP can laugh will be (λ a1 (CAN1 ((λ x (LAUGHS1 e3 x)) a1))), which reduces to (λ a1 (CAN1 (LAUGHS1 e3 a1)))
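A minimal sketch of this composition, with a Python callable standing in for each SEM value (the encoding is an assumption):

```python
def can(prop):
    return ('CAN1', prop)          # the modal operator wraps a proposition

def laughs(x):
    return ('LAUGHS1', 'e3', x)

def aux_rule(sem_aux, sem_vp):
    # (VP SEM (lambda a1 (?semaux (?semvp a1)))): a new unary predicate.
    return lambda a1: sem_aux(sem_vp(a1))

can_laugh = aux_rule(can, laughs)
print(can_laugh(('NAME', 'j1', 'Jack')))
# ('CAN1', ('LAUGHS1', 'e3', ('NAME', 'j1', 'Jack')))
```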

20 Handling prepositional phrases
Prepositional phrases play two different semantic roles:
1. A PP can be a modifier of a noun phrase or a verb phrase (cry in the corner)
2. A PP may be required by a head word, in which case the preposition acts more as a term than as an independent predicate (Jill decided on a couch)

21 PP as a modifier of a noun phrase
The appropriate rule for interpreting the PP is: (PP SEM (λ y (?semp y ?semnp))) → (P SEM ?semp) (NP SEM ?semnp)
If the SEM of P is IN-LOC1, and the SEM of the NP is <THE c1 CORNER1>,
then the SEM of the PP in the corner will be: (λ y (IN-LOC1 y <THE c1 CORNER1>))

22 PP as a modifier of a noun phrase
If the PP is used in the rule: (CNP SEM (λ n (& (?semcnp n) (?sempp n)))) → (CNP SEM ?semcnp) (PP SEM ?sempp)
Then the SEM of the CNP man in the corner will be: (λ n (& (MAN1 n) ((λ y (IN-LOC1 y <THE c1 CORNER1>)) n)))
which is reduced to: (λ n (& (MAN1 n) (IN-LOC1 n <THE c1 CORNER1>)))
Then, using the rule (NP VAR ?v SEM <?semart ?v (?semcnp ?v)>) → (ART SEM ?semart) (CNP SEM ?semcnp),
the SEM of the man in the corner will be: <THE m1 ((λ n (& (MAN1 n) (IN-LOC1 n <THE c1 CORNER1>))) m1)>
which is reduced to: <THE m1 (& (MAN1 m1) (IN-LOC1 m1 <THE c1 CORNER1>))>

23 PP as a modifier of a verb phrase
The appropriate rule is: (VP VAR ?v SEM (λ x (& (?semvp x) (?sempp ?v)))) → (VP VAR ?v SEM ?semvp) (PP SEM ?sempp)
The SEM of the PP in the corner is: (λ y (IN-LOC1 y <THE c1 CORNER1>))
The SEM of the VP cry is: (λ x (CRIES1 e1 x))
The SEM of the VP cry in the corner will be: (λ a (& (CRIES1 e1 a) (IN-LOC1 e1 <THE c1 CORNER1>)))
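A minimal sketch of this rule; the key point is that the PP predicate is applied to the VP's event (discourse) variable ?v rather than to the subject (the encoding and helper names are assumptions):

```python
def cries(x):
    return ('CRIES1', 'e1', x)

def in_the_corner(y):
    return ('IN-LOC1', y, ('THE', 'c1', 'CORNER1'))

def vp_pp_rule(sem_vp, sem_pp, event_var):
    # (VP VAR ?v SEM (lambda x (& (?semvp x) (?sempp ?v))))
    return lambda x: ('&', sem_vp(x), sem_pp(event_var))

cry_in_corner = vp_pp_rule(cries, in_the_corner, 'e1')
print(cry_in_corner(('NAME', 'j1', 'Jill')))
# ('&', ('CRIES1', 'e1', ('NAME', 'j1', 'Jill')),
#       ('IN-LOC1', 'e1', ('THE', 'c1', 'CORNER1')))
```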

24 Parse tree of Cry in the corner

25 PP as a subconstituent of a head word
Jill decided on a couch is ambiguous:
1. Jill made a decision while she was on a couch
2. Jill made a decision about a couch
The appropriate rule for the second reading is: VP → V[_pp:on] PP[on]
–The desired logical form for this is (λ s (DECIDES-ON1 d1 s <A c1 COUCH1>))
Here the preposition on makes no independent semantic contribution (it serves only within a term)
A new binary feature PRED is needed to determine the role of the preposition (i.e., a predicate (+) or a term (-))

26 Rules for handling PPs

27 Different parse trees for the VP decide on the couch

28 Lexicalized semantic interpretation
So far, all the complexity of the semantic interpretation has been encoded in the grammar rules
A different approach is to encode the complexity in the lexical entries and keep the rules simple
The verb decide has two senses, with two different SEM values:
1. Intransitive sense: (λ y (DECIDES1 e1 y))
2. Transitive sense: (λ o (λ y (DECIDES-ON1 e1 y o)))

29 Lexicalized semantic interpretation
There is a trade-off between the complexity of the grammatical rules and the complexity of the lexical entries
The more complex the logical forms are, the more attractive the second approach becomes
For instance, to handle thematic roles with simple lexical entries, we need extra rules to classify the different types of verbs

30 Lexicalized semantic interpretation
See and eat both have transitive senses in which the subject fills the AGENT role and the object fills the THEME role
Break also has a transitive sense, but there the subject fills the INSTR role and the object fills the THEME role
We need a new feature, ROLES, added to the lexical entries to identify the appropriate form, and a separate rule for each

31 Lexicalized semantic interpretation
(VP VAR ?v SEM (λ a (?semv ?v [AGENT a] [THEME ?semnp]))) → (V ROLES AGENT-THEME SEM ?semv) (NP SEM ?semnp)
(VP VAR ?v SEM (λ a (?semv ?v [INSTR a] [THEME ?semnp]))) → (V ROLES INSTR-THEME SEM ?semv) (NP SEM ?semnp)
Additional rules must be added for all the combinations of roles that can be used by verbs
This approach is cumbersome, because it requires adding thematic role information to the lexicon anyway (using the ROLES feature)

32 Lexicalized semantic interpretation
It is simpler to enter the appropriate form only in the lexicon:
See: (V VAR ?v SEM (λ o (λ a (SEES1 ?v [AGENT a] [THEME o]))))
Break: (V VAR ?v SEM (λ o (λ a (BREAKS1 ?v [INSTR a] [THEME o]))))
In this case, just a single rule is needed: (VP SEM (?semv ?semnp)) → (V SEM ?semv) (NP SEM ?semnp)
The SEM of the VP see the book is: ((λ o (λ a (SEES1 b1 [AGENT a] [THEME o]))) <THE b1 BOOK1>) = (λ a (SEES1 b1 [AGENT a] [THEME <THE b1 BOOK1>]))
The SEM of the VP break the book is: ((λ o (λ a (BREAKS1 b1 [INSTR a] [THEME o]))) <THE b1 BOOK1>) = (λ a (BREAKS1 b1 [INSTR a] [THEME <THE b1 BOOK1>]))
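A minimal sketch of the lexicalized approach, with curried Python callables standing in for the nested lambdas (the encoding is an assumption based on the slides):

```python
# Each verb entry carries a curried lambda (object first, then subject),
# so a single VP rule suffices for both role patterns.
LEXICON = {
    'see':   lambda o: lambda a: ('SEES1',   'b1', ('AGENT', a), ('THEME', o)),
    'break': lambda o: lambda a: ('BREAKS1', 'b1', ('INSTR', a), ('THEME', o)),
}

def vp_rule(sem_v, sem_np):
    # (VP SEM (?semv ?semnp)) <- (V SEM ?semv) (NP SEM ?semnp)
    return sem_v(sem_np)

the_book = ('THE', 'b1', 'BOOK1')
see_the_book = vp_rule(LEXICON['see'], the_book)
print(see_the_book(('NAME', 'j1', 'Jill')))
# ('SEES1', 'b1', ('AGENT', ('NAME', 'j1', 'Jill')),
#                 ('THEME', ('THE', 'b1', 'BOOK1')))
```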

33 Hierarchical lexicons
The problem with making the lexicon more complex is that there are too many words
The verb give allows:
–I gave the money
–I gave John the money
–I gave the money to John
The lexical entries for give include:
–(V SUBCAT _np SEM (λ o (λ a (GIVE * [AGENT a] [THEME o]))))
–(V SUBCAT _np_np SEM (λ r (λ o (λ a (GIVE * [AGENT a] [THEME o] [TO-POSS r])))))
–(V SUBCAT _np_pp:to SEM (λ r (λ o (λ a (GIVE * [AGENT a] [THEME o] [TO-POSS r])))))

34 Hierarchical lexicons
This causes a lot of redundancy in the lexicon
A better idea is to exploit the regularities of English verbs
A very large class of transitive verbs that describe an action (give, take, see, find, paint, etc.) use the same semantic interpretation rule for SUBCAT _np

35 Hierarchical lexicons
The idea of a hierarchical lexicon is to organize verb senses so that their shared properties can be reused (via inheritance):
–INTRANS-ACT defines verbs with SUBCAT _none: (λ s (?PREDN * [AGENT s]))
–TRANS-ACT defines verbs with SUBCAT _np: (λ o (λ a (?PREDN * [AGENT a] [THEME o])))
Give: (V ROOT give PREDN GIVE1 SUP {BITRANS-TO-ACT TRANS-ACT})
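A minimal sketch of such inheritance (the dictionary encoding and helper names are assumptions based on the slides; class and feature names follow them):

```python
# Abstract verb classes supply SUBCAT/SEM templates; PREDN fills the predicate.
CLASSES = {
    'INTRANS-ACT': {
        'SUBCAT': '_none',
        'SEM': lambda pred: lambda s: (pred, '*', ('AGENT', s)),
    },
    'TRANS-ACT': {
        'SUBCAT': '_np',
        'SEM': lambda pred: lambda o: lambda a:
            (pred, '*', ('AGENT', a), ('THEME', o)),
    },
    'BITRANS-TO-ACT': {
        'SUBCAT': '_np_np',
        'SEM': lambda pred: lambda r: lambda o: lambda a:
            (pred, '*', ('AGENT', a), ('THEME', o), ('TO-POSS', r)),
    },
}

# One compact entry for 'give' inherits two fully subcategorized senses:
GIVE = {'ROOT': 'give', 'PREDN': 'GIVE1', 'SUP': ['BITRANS-TO-ACT', 'TRANS-ACT']}

def senses(entry):
    """Expand an entry into the senses inherited from its superclasses."""
    for sup in entry['SUP']:
        cls = CLASSES[sup]
        yield {'SUBCAT': cls['SUBCAT'], 'SEM': cls['SEM'](entry['PREDN'])}

for sense in senses(GIVE):
    print(sense['SUBCAT'])   # _np_np, then _np
```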

36 Lexical hierarchy

37 Handling simple questions
To handle simple questions, we only need to extend the appropriate grammar rules with the SEM feature
The lexical entries for wh-words are extended with the SEM feature:
who: (PRO WH {Q R} SEM WHO1 AGR {3s 3p})
But how do the SEM and GAP features interact?

38 Who did Jill see?

39 Prepositional phrase wh-questions
Questions can begin with prepositional phrases:
–In which box did you put the book?
–Where did Jill go?
The semantic interpretation of these questions depends on the type of the PP
(S INV – SEM (WH-query ?sems)) → (PP WH Q PRED ?p PTYPE ?pt SEM ?sempp) (S INV + SEM ?sems GAP (PP PRED ?p PTYPE ?pt SEM ?sempp))

40 Prepositional phrase wh-questions
To handle wh-terms like where, the following rule is also needed:
(PP PRED ?p PTYPE ?pt SEM ?sempp) → (PP-WRD PRED ?p PTYPE ?pt SEM ?sempp)
There would also be two lexical entries for where:
(PP-WRD PTYPE {LOC MOT} PRED – VAR ?v SEM <WH ?v LOC1>)
(PP PRED + VAR ?v SEM (λ x (AT-LOC x <WH ?v LOC1>)))

41 Where did Jill go?
But wh-questions starting with a +PRED PP cannot be handled yet
The difficulty comes from the restriction that, in the rule VP → VP PP, the GAP is passed only into the VP subconstituent
Thus there is still no way to create a PP gap that modifies a verb phrase

42 Relative clauses
The similarity between relative clauses and wh-questions continues at the semantic level, too
The logical form of a relative clause is a unary predicate, which is produced by:
(CNP SEM (λ ?x (& (?semcnp ?x) (?semrel ?x)))) → (CNP SEM ?semcnp) (REL SEM ?semrel)
When the embedded sentence is completed, a lambda is wrapped around it to form the relative clause

43 Who Jill saw
The wh-term variable is specified in a feature called RVAR in the rule:
(REL SEM (λ ?v ?sems)) → (NP WH R RVAR ?v AGR ?a SEM ?semnp) (S [fin] INV – GAP (NP AGR ?a SEM ?semnp) SEM ?sems)

44 Semantic interpretation by feature unification versus lambda reductions
Semantic interpretation can be performed using just feature values and variables
The basic idea is to introduce new features for the argument positions that would otherwise have been filled by lambda reductions
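A minimal sketch of the idea, with a toy single-slot substitution standing in for full feature unification (all names and the encoding are assumptions):

```python
# Instead of a lambda, the verb's SEM contains an unbound SUBJ slot that
# unification fills in when the subject NP is found.
def verb_sem_saw():
    return ('SEES1', 's1', ('SUBJ', None), ('THE', 'd1', 'DOG1'))

def unify_subj(sem, subj):
    """Bind the open SUBJ slot by substitution, as unification would."""
    if sem == ('SUBJ', None):
        return subj
    if isinstance(sem, tuple):
        return tuple(unify_subj(part, subj) for part in sem)
    return sem

print(unify_subj(verb_sem_saw(), ('NAME', 'j1', 'Jill')))
# ('SEES1', 's1', ('NAME', 'j1', 'Jill'), ('THE', 'd1', 'DOG1'))
```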

45 Jill saw the dog

46 Features versus Lambda expressions

47 Features versus lambda expressions
One advantage of using features is that no special mechanism (e.g., lambda reduction) is needed for semantic interpretation
Another significant advantage is that a grammar in this form is reversible (it can be used to generate sentences)
But not all lambda expressions can be eliminated with this technique
When handling conjoined subject phrases, as in Sue and Sam saw Jack, the SUBJ variable would need to unify with both Sue and Sam, which is not possible

48 Generating sentences from logical forms
Intuitively, it should be easy to reverse a grammar and use it for generation:
Decompose the logical form of each constituent into a series of lexical constituents with the appropriate meanings
But not all grammars are reversible, e.g., a grammar that relies on lambda reduction

49 Generating sentences from logical forms
Consider (SEES1 s1 (NAME j1 “Jill”) <THE d1 DOG1>)
The S rule with SEM (?semvp ?semnp) cannot be unified with the above logical form
The problem is that lambda reduction was used to convert the original logical form, which was ((λ a (SEES1 s1 a <THE d1 DOG1>)) (NAME j1 “Jill”))
Lambda abstraction can be used to find a match, but the problem is that there are three possible lambda abstractions:
–(λ e (SEES1 e (NAME j1 “Jill”) <THE d1 DOG1>))
–(λ a (SEES1 s1 a <THE d1 DOG1>))
–(λ o (SEES1 s1 (NAME j1 “Jill”) o))

50 Realization versus parsing
Parsing and realization can both be viewed as building a syntactic tree
A parser starts with the words and tries to find a tree that accounts for them, and hence determines the logical form of the sentence
A realizer starts with a logical form and tries to find a tree that accounts for it, and hence determines the words that realize it

51 Realization and parsing
Using the standard top-down parsing algorithm for realization is extremely inefficient
Consider again (SEES1 s1 (NAME j1 “Jill”) <THE d1 DOG1>), expanded as: (NP SEM ?semsubj) (VP SUBJ ?semsubj SEM (SEES1 s1 (NAME j1 “Jill”) <THE d1 DOG1>))
The problem is that the SEM of the NP is unconstrained
Solution: expand the constituents in a different order (e.g., choose the subject based on decisions about the verb and the structure of the verb phrase)

52 Head-driven realization algorithm

53 Head-driven realization
Using rule 3 of grammar 9.14:
?semv = SEES1, ?v = s1, ?semsubj = (NAME j1 “Jill”), ?semnp = <THE d1 DOG1>
After rewriting the VP, the following list of constituents is obtained:
(NP SEM (NAME j1 “Jill”)) (V [_np] SEM SEES1) (NP SEM <THE d1 DOG1>)

54 Head-driven realization
Since there is no non-lexical head, the algorithm picks any non-lexical constituent with a bound SEM (e.g., the first NP)
Using rule 5, this yields (NAME SEM “Jill”)
Selecting the remaining NP, and using rule 6 and then rule 7, produces the following list:
(NAME SEM “Jill”), (V [_np] SEM SEES1), (ART SEM THE), (N SEM DOG1)
Now it is easy to produce Jill saw the dog
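As a rough illustration only (a toy sketch, not the algorithm above: the mini-lexicon, the tuple encoding, and the fixed argument order are all assumptions), a realizer can let the head predicate determine the words and the order in which the arguments are realized:

```python
# The head (the predicate) is looked up first; its entry then fixes which
# arguments to realize and in what order, rather than guessing NPs blindly.
LEX = {'SEES1': 'saw', 'DOG1': 'dog'}

def realize(lf):
    """Realize a logical form as a word string, head first."""
    if isinstance(lf, tuple) and lf[0] == 'NAME':
        return lf[2]                       # (NAME j1 "Jill") -> Jill
    if isinstance(lf, tuple) and lf[0] == 'THE':
        return f"the {LEX[lf[2]]}"         # (THE d1 DOG1) -> the dog
    pred, _var, subj, obj = lf             # e.g. (SEES1 s1 SUBJ OBJ)
    return f"{realize(subj)} {LEX[pred]} {realize(obj)}"

print(realize(('SEES1', 's1', ('NAME', 'j1', 'Jill'), ('THE', 'd1', 'DOG1'))))
# Jill saw the dog
```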

55 Ambiguities in realization
With large grammars, a wider range of forms becomes possible
If only the SEM feature is specified, the realization program will pick arbitrarily between active and passive sentences when the verb allows both
It might also pick between different subcategorization structures:
–Jill gave the dog to Jack
–Jill gave Jack the dog
Different word senses may also be picked:
–Jack gave the money to the Humane Society
–Jack donated the money to the Humane Society
To force a particular realization, other features in addition to SEM may be necessary (e.g., VOICE to force the active voice)