1 Computational Semantics Chapter 18 November 2012 We will not do all of this… Lecture #12

2 Semantic Analysis
Semantic analysis is the process of taking in some linguistic input and producing a meaning representation for it.
– There are many ways of doing this, ranging from completely ad hoc, domain-specific methods to more theoretically founded but not quite useful methods.
– Different methods make more or less (or no) use of syntax.
– We're going to start with the idea that syntax does matter: the compositional rule-to-rule approach.

3 Compositional Analysis
Principle of Compositionality
– The meaning of a whole is derived from the meanings of the parts.
What parts?
– The constituents of the syntactic parse of the input.

4 Example AyCaramba serves meat.

5 Compositional Analysis

6 Compositional Semantics
Note in the previous example: part of the meaning derives from the people and activities it's about (predicates and arguments, or nouns and verbs), and part from the way they are ordered and related grammatically: syntax.
Question: can we link up syntactic structures to a corresponding semantic representation to produce the 'meaning' of a sentence in the course of parsing it?

7 Specific vs. General-Purpose Rules
We don't want to have to specify for every possible parse tree what semantic representation it maps to.
We want to identify general mappings from parse trees to semantic representations:
– Again (as with feature structures) we will augment the lexicon and the grammar.
– Rule-to-rule hypothesis: a mapping exists between rules of the grammar and rules of semantic representation.

8 Semantic Attachments
Extend each grammar rule with instructions on how to map the components of the rule to a semantic representation (grammars are getting complex):
S → NP VP {VP.sem(NP.sem)}
Each semantic function is defined in terms of the semantic representation of choice.
Problem: how to define these functions and how to specify their composition so we always get the meaning representation we want from our grammar?
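One way to picture these attachments concretely is as ordinary functions paired with CFG rules. The minimal sketch below is not from the slides; the Rule class and field names are illustrative, and it only shows the S → NP VP and NP → ProperNoun attachments as Python functions.

```python
# Minimal sketch, assuming a grammar rule plus its semantic attachment is
# represented as one object. Not from any particular toolkit.
from dataclasses import dataclass
from typing import Callable, List

@dataclass
class Rule:
    lhs: str
    rhs: List[str]
    sem: Callable  # maps the daughters' meanings to the mother's meaning

# S -> NP VP {VP.sem(NP.sem)}: apply the VP's semantics to the NP's.
s_rule = Rule("S", ["NP", "VP"], sem=lambda np_sem, vp_sem: vp_sem(np_sem))

# NP -> ProperNoun {ProperNoun.sem}: pass the daughter's meaning up unchanged.
np_rule = Rule("NP", ["ProperNoun"], sem=lambda pn_sem: pn_sem)
```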

9 Augmented Rules
Let's look at this a little more abstractly. Consider the general case of a grammar rule:
A → α1 … αn {f(α1.sem, …, αn.sem)}
This should be read as: the semantics we attach to A can be computed from some function f applied to the semantics of A's parts.

10 Augmented Rules As we’ll see the class of actions performed by f in the following rule can be quite restricted.

11 Compositional Analysis

12 A ‘Simple’ Example
AyCaramba serves meat.
Associating constants with constituents:
– ProperNoun → AyCaramba {AyCaramba}
– MassNoun → meat {Meat}
Defining functions to produce these from input:
– NP → ProperNoun {ProperNoun.sem}
– NP → MassNoun {MassNoun.sem}
Assumption: meaning reps of children are passed up to parents for non-branching constituents.

13 Verbs here are where the action is
– V → serves {∃(e,x,y) Isa(e,Serving) ^ Server(e,x) ^ Served(e,y)}
– Will every verb have its own distinct representation? Predicate(Agent,Patient)…
How do we combine these pieces?
– VP → V NP {????}
– Goal: ∃(e,x) Isa(e,Serving) ^ Server(e,x) ^ Served(e,Meat)
– S → NP VP {????}
– Goal: ∃(e) Isa(e,Serving) ^ Server(e,AyCaramba) ^ Served(e,Meat)
– VP and S semantics must tell us: which variables are to be replaced by which arguments, and how this replacement is done.

14 Lambda Notation
Extension to FOPC: λx P(x)
– λ + variable(s) + a FOPC expression in those variables
Lambda binding
– Apply a lambda expression to logical terms to bind the lambda expression's parameters to those terms (lambda reduction).
– Simple process: substitute the terms for the variables in the lambda expression.
λx P(x)(car) ⇒ P(car)
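A quick way to see lambda reduction in action is to let Python's own lambdas do the substitution. This is purely illustrative; the formula is just a string built for display.

```python
# Illustrative sketch: λx.P(x) as a Python lambda; applying it to a term
# performs the substitution (lambda reduction).
P = lambda term: f"P({term})"   # P(x) rendered as a string, for display only

lam_x_P = lambda x: P(x)        # λx.P(x)

print(lam_x_P("car"))           # λx.P(x)(car) => P(car)
```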

15 Lambda notation provides requisite verb semantics
– The formal parameter list makes variables within the body of the logical expression available for binding to external arguments provided by, e.g., NPs.
– Lambda reduction implements the replacement.
Semantic attachments for grammar rules:
– S → NP VP {VP.sem(NP.sem)}
– VP → V NP {V.sem(NP.sem)}
– V → serves {???}
{∃(e,x,y) Isa(e,Serving) ^ Server(e,y) ^ Served(e,x)} becomes {λy λx ∃(e) Isa(e,Serving) ^ Server(e,x) ^ Served(e,y)}
– Now 'y' is available to be bound when V.sem is applied to NP.sem, and 'x' is available to be bound when the S rule is applied.
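Here is a small end-to-end sketch of the AyCaramba example under these attachments, with the serves semantics as curried Python lambdas. Building formulas as strings and leaving the event variable as a bare symbol are illustrative simplifications.

```python
# V -> serves: λy λx ∃e Isa(e,Serving) ^ Server(e,x) ^ Served(e,y)
serves_sem = lambda y: (lambda x:
    f"∃e Isa(e,Serving) ^ Server(e,{x}) ^ Served(e,{y})")

# VP -> V NP {V.sem(NP.sem)}: binds the served thing
vp_sem = serves_sem("Meat")

# S -> NP VP {VP.sem(NP.sem)}: binds the server
s_sem = vp_sem("AyCaramba")

print(s_sem)  # ∃e Isa(e,Serving) ^ Server(e,AyCaramba) ^ Served(e,Meat)
```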

16 Example

17 Example

18 Example

19 Example

20 Key Points
– Each node in a tree corresponds to a rule in the grammar.
– Each grammar rule has a semantic rule associated with it that specifies how the semantics of the LHS of that rule can be computed from the semantics of its daughters.

21 Strong Compositionality The semantics of the whole is derived solely from the semantics of the parts. (i.e. we ignore what’s going on in other parts of the tree).

22 Predicate-Argument Semantics The functions/operations permitted in the semantic rules fall into two classes –Pass the semantics of a daughter up unchanged to the mother –Apply (as a function) the semantics of one of the daughters of a node to the semantics of the other daughters

23 Mismatches There are unfortunately some annoying mismatches between the syntax of FOPC and the syntax provided by our grammars… So we’ll accept that we can’t always directly create valid logical forms in a strictly compositional way –We’ll get as close as we can and patch things up after the fact.

24 Quantified Phrases
Consider: A restaurant serves meat.
Assume that A restaurant looks like ∃x Isa(x,Restaurant).
If we do the normal lambda thing we get that entire quantified expression stuffed into the Server argument position, which is not a well-formed FOPC formula.

25 Complex Terms
Allow the compositional system to pass around representations like the following as objects with parts:
Complex-Term → <Quantifier var body>

26 Example
Our restaurant example winds up looking like ∃e Isa(e,Serving) ^ Server(e, <∃x Isa(x,Restaurant)>) ^ Served(e,Meat). Big improvement…

27 Conversion
So… complex terms wind up being embedded inside predicates. So pull them out and redistribute the parts in the right way…
P(<Quantifier var body>) turns into: Quantifier var body connective P(var)

28 Example

29 Quantifiers and Connectives
– If the quantifier is an existential, then the connective is ^ (and).
– If the quantifier is a universal, then the connective is -> (implies).
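The conversion step from slide 27 plus this choice of connective can be sketched as a small rewrite function. The tuple representation of a complex term and the one-place stand-in for the predicate (its other arguments assumed already filled) are illustrative choices, not from the text.

```python
# Sketch: rewrite P(<Quantifier var body>) as "Quantifier var body connective P(var)",
# choosing ^ for existentials and -> for universals.
def pull_out_complex_term(pred: str, complex_term: tuple) -> str:
    quant, var, body = complex_term          # e.g. ("∃", "x", "Isa(x,Restaurant)")
    connective = "^" if quant == "∃" else "->"
    return f"{quant}{var} {body} {connective} {pred}({var})"

# 'a restaurant' in the Server slot:
print(pull_out_complex_term("Server_e", ("∃", "x", "Isa(x,Restaurant)")))
# ∃x Isa(x,Restaurant) ^ Server_e(x)

# 'every restaurant' in the Server slot:
print(pull_out_complex_term("Server_e", ("∀", "x", "Isa(x,Restaurant)")))
# ∀x Isa(x,Restaurant) -> Server_e(x)
```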

30 Multiple Complex Terms Note that the conversion technique pulls the quantifiers out to the front of the logical form… That leads to ambiguity if there’s more than one complex term in a sentence.

31 Quantifier Ambiguity
Consider:
– Every restaurant has a menu.
– Every restaurant has a beer.
– I took a picture of everyone in the room.
The first could mean that every restaurant has a menu of its own, or that there's some super-menu out there and all restaurants have that same menu.

32 Quantifier Scope Ambiguity

33 Ambiguity
This turns out to be a lot like the prepositional phrase attachment problem.
The number of possible interpretations goes up exponentially with the number of complex terms in the sentence.
The best we can do is to come up with weak methods to prefer one interpretation over another.

34 Doing Compositional Semantics
To incorporate semantics into the grammar we must:
– Figure out the right representation for a single constituent based on the parts of that constituent (e.g. Adj).
– Figure out the right representation for a category of constituents based on other grammar rules making use of that constituent (e.g. Nom → Adj Nom).
This gives us a set of function-like semantic attachments incorporated into our CFG, e.g.:
Nom → Adj Nom {λx Nom.sem(x) ^ Isa(x,Adj.sem)}
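The Nom → Adj Nom attachment can be sketched with nominal meanings as one-place Python functions. The adjective and noun constants below (Cheap, Restaurant) are illustrative, not from the slides.

```python
# Sketch of Nom -> Adj Nom {λx Nom.sem(x) ^ Isa(x,Adj.sem)}.
def adj_nom_attachment(adj_sem, nom_sem):
    # Build the new property λx. Nom.sem(x) ^ Isa(x, Adj.sem)
    return lambda x: f"{nom_sem(x)} ^ Isa({x},{adj_sem})"

restaurant_sem = lambda x: f"Isa({x},Restaurant)"      # Nom -> Noun restaurant
cheap_restaurant_sem = adj_nom_attachment("Cheap", restaurant_sem)

print(cheap_restaurant_sem("x"))   # Isa(x,Restaurant) ^ Isa(x,Cheap)
```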

35 What do we do with them?
As we did with feature structures:
– Alter an Earley-style parser so that when constituents are completed (dot at the end of the rule), the attached semantic function is applied and a meaning representation is created and stored with the state.
– Or, let the parser run to completion and then walk through the resulting tree, running the semantic attachments bottom-up. A sketch of this second option follows.
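A minimal sketch of that second (pipelined) option: given a finished parse tree, walk it bottom-up and apply each rule's attachment to the daughters' meanings. The tree encoding, the attachment table keyed by (LHS, RHS), and the lexicon are all illustrative choices.

```python
# Sketch: bottom-up semantic interpretation of a completed parse tree.
def interpret(node, attachments, lexicon):
    """node is (label, [children]) for phrases or (pos, 'word') for preterminals."""
    label, rest = node
    if isinstance(rest, str):                      # preterminal: look the word up
        return lexicon[rest]
    child_sems = [interpret(c, attachments, lexicon) for c in rest]
    rule = (label, tuple(c[0] for c in rest))      # e.g. ("S", ("NP", "VP"))
    return attachments[rule](*child_sems)          # apply the rule's attachment

tree = ("S", [("NP", [("ProperNoun", "AyCaramba")]),
              ("VP", [("V", "serves"), ("NP", [("MassNoun", "meat")])])])

lexicon = {
    "AyCaramba": "AyCaramba",
    "meat": "Meat",
    "serves": lambda y: (lambda x:
        f"∃e Isa(e,Serving) ^ Server(e,{x}) ^ Served(e,{y})"),
}

attachments = {
    ("NP", ("ProperNoun",)): lambda pn: pn,         # pass up unchanged
    ("NP", ("MassNoun",)):   lambda mn: mn,
    ("VP", ("V", "NP")):     lambda v, np: v(np),   # V.sem(NP.sem)
    ("S",  ("NP", "VP")):    lambda np, vp: vp(np), # VP.sem(NP.sem)
}

print(interpret(tree, attachments, lexicon))
# ∃e Isa(e,Serving) ^ Server(e,AyCaramba) ^ Served(e,Meat)
```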

36 Option 1 (Integrated Semantic Analysis)
S → NP VP {VP.sem(NP.sem)}
– VP.sem has been stored in the state representing VP.
– NP.sem has been stored with the state for NP.
– When the rule is completed, go get the value of VP.sem, go get NP.sem, and apply VP.sem to NP.sem.
– Store the result in S.sem.
As fragments of the input are parsed, semantic fragments are created.
Can be used to block ambiguous representations.

37 Drawback
You also perform semantic analysis on orphaned constituents that play no role in the final parse.
Hence the case for a pipelined approach: do semantics after the syntactic parse.
But… let's look at some other examples.

38 Harder Example What makes this hard? What role does Harry play in all this?

39 Harder Example
∃e,f,x Isa(e,Telling) ^ Isa(f,Going) ^ Teller(e,Speaker) ^ Tellee(e,Harry) ^ ToldThing(e,f) ^ Goer(f,Harry) ^ Destination(f,x)

40 Harder Example
The VP for told is VP → V NP VPto.
So you do what?
– Apply the semantic function attached to VPto to the semantics of the NP; this binds Harry as the Goer of the Going.
– Then apply the semantics of the V to the semantics of the NP; this binds Harry as the Tellee of the Telling.
– And to the result of the first application, to get the right value of the ToldThing.
V.sem(NP.sem, VPto.sem(NP.sem))
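A sketch of that composition, with the NP's meaning used twice, once as the Tellee and once as the Goer inside the embedded event. Formulas are strings, and the way the embedded event variable is threaded through is an illustrative simplification of V.sem(NP.sem, VPto.sem(NP.sem)).

```python
# Sketch of VP -> V NP VPto with attachment V.sem(NP.sem, VPto.sem(NP.sem)).
# VPto.sem returns its event variable plus its body so told_sem can conjoin them.
def go_sem(goer):
    return ("f", f"Isa(f,Going) ^ Goer(f,{goer}) ^ Destination(f,x)")

def told_sem(tellee, vpto):
    event, body = vpto
    return (f"∃e,{event},x Isa(e,Telling) ^ {body} ^ "
            f"Teller(e,Speaker) ^ Tellee(e,{tellee}) ^ ToldThing(e,{event})")

np_sem = "Harry"
print(told_sem(np_sem, go_sem(np_sem)))   # V.sem(NP.sem, VPto.sem(NP.sem))
# ∃e,f,x Isa(e,Telling) ^ Isa(f,Going) ^ Goer(f,Harry) ^ Destination(f,x) ^
#   Teller(e,Speaker) ^ Tellee(e,Harry) ^ ToldThing(e,f)
```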

41 Harder Example
That's a little messy and violates the notion that the grammar ought not to know much about what is going on in the semantics…
Better might be:
– V.sem(NP.sem, VPto.sem) ????
– VPto.sem(V.sem, NP.sem) ???
– i.e., apply the semantics of the head verb to the semantics of its arguments.
– Complicate the semantics of the verb inside VPto to figure out what's going on.

42 Two Philosophies
1. Let the syntax do what syntax does well and don't expect it to know much about meaning.
– In this approach, the lexical entries' semantic attachments do the work.
2. Assume the syntax does know about meaning.
– Here the grammar gets complicated and the lexicon simpler.

43 Example
Consider the attachments for the VPs:
VP → Verb NP NP (gave Mary a book)
VP → Verb NP PP (gave a book to Mary)
Assume the meaning representations should be the same for both. Under the lexicon-heavy scheme the attachments apply the verb's semantics to its arguments:
Verb.sem(NP1.sem, NP2.sem)
Verb.sem(NP.sem, PP.sem)

44 Example
Under the syntax-heavy scheme we might want to do something like:
VP → V NP NP {V.sem ^ Recip(NP1.sem) ^ Object(NP2.sem)}
VP → V NP PP {V.sem ^ Recip(PP.sem) ^ Object(NP1.sem)}
I.e. the verb only contributes the predicate; the grammar "knows" the roles.
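The contrast can be sketched as below, with formulas as strings. The Giving event predicate, the role arity (carrying the event variable), and the fillers Mary and Book are illustrative choices, not fixed by the slides; both schemes are meant to end up with the same representation.

```python
# Lexicon-heavy: the verb's semantics does the work; the grammar just applies
# it to the arguments, Verb.sem(NP1.sem, NP2.sem).
gave_rich = lambda recip, obj: (
    f"∃e Isa(e,Giving) ^ Recip(e,{recip}) ^ Object(e,{obj})")
print(gave_rich("Mary", "Book"))

# Syntax-heavy: the verb only contributes the predicate; the VP -> V NP NP
# rule itself adds the role assignments.
gave_bare = "∃e Isa(e,Giving)"
print(f"{gave_bare} ^ Recip(e,Mary) ^ Object(e,Book)")
```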

45 Integration Two basic approaches –Integrate semantic analysis into the parser (assign meaning representations as constituents are completed) –Pipeline… assign meaning representations to complete trees only after they’re completed

46 Example From BERP –I want to eat someplace near campus –Somebody tell me the two meanings…

47 Pros and Cons
If you integrate semantic analysis into the parser as it's running…
– You can use semantic constraints to cut off parses that make no sense.
– But you assign meaning representations to constituents that don't take part in the correct (most probable) parse.

48 Non-Compositionality Unfortunately, there are lots of examples where the meaning (loosely defined) can’t be derived from the meanings of the parts –Idioms, jokes, irony, sarcasm, metaphor, metonymy, indirect requests, etc

49 English Idioms Kick the bucket, buy the farm, bite the bullet, run the show, bury the hatchet, etc… Lots of these… constructions where the meaning of the whole is either –Totally unrelated to the meanings of the parts (kick the bucket) –Related in some opaque way (run the show)

50 Example
Enron is the tip of the iceberg.
NP → "the tip of the iceberg"
Not so good… attested examples:
– the tip of Mrs. Ford's iceberg
– the tip of a 1000-page iceberg
– the merest tip of the iceberg
How about: That's just the iceberg's tip.

51 Example What we seem to need is something like NP -> An initial NP with tip as its head followed by a subsequent PP with of as its head and that has iceberg as the head of its NP And that allows modifiers like merest, Mrs. Ford, and page to modify the relevant semantic forms

52 The Tip of the Iceberg
Describing this particular construction:
1. A fixed phrase with a particular meaning
2. A syntactically and lexically flexible phrase with a particular meaning
3. A syntactically and lexically flexible phrase with a partially compositional meaning

53 Constructional Approach Syntax and semantics aren’t separable in the way that we’ve been assuming Grammars contain form-meaning pairings that vary in the degree to which the meaning of a constituent (and what constitutes a constituent) can be computed from the meanings of the parts.

54 Constructional Approach
So we'll allow both
VP → V NP {V.sem(NP.sem)}
and
VP → Kick-Verb the bucket {λx Die(x)}
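In code, the two kinds of rules can sit side by side; whichever rule licenses the VP supplies its meaning. The literal kick semantics and the subject Sam below are illustrative inventions for the example.

```python
# Ordinary compositional rule: VP -> V NP {V.sem(NP.sem)}
vp_compositional = lambda v_sem, np_sem: v_sem(np_sem)

# Construction-specific rule: VP -> Kick-Verb 'the bucket' {λx Die(x)}
vp_kick_the_bucket_sem = lambda x: f"Die({x})"

kick_sem = lambda y: (lambda x:
    f"∃e Isa(e,Kicking) ^ Kicker(e,{x}) ^ Kicked(e,{y})")

print(vp_compositional(kick_sem, "Bucket")("Sam"))   # literal reading
print(vp_kick_the_bucket_sem("Sam"))                 # idiomatic: Die(Sam)
```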

55 Computational Realizations Semantic grammars –Simple idea, dumb name Cascaded finite-state transducers –Just like Chapter 3

56 Semantic Grammars
One problem with traditional grammars is that they don't necessarily reflect the semantics in a straightforward way.
You can deal with this by…
– Fighting with the grammar: complex lambdas and complex terms, etc.
– Rewriting the grammar to reflect the semantics, and in the process giving up on some syntactic niceties.

57 BERP Example

58 BERP Example How about a rule like the following… Request → I want to go to eat FoodType Time { some attachment }
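The attachment for such a rule can build a domain frame directly, since the daughters are already domain categories. The sketch below is illustrative: the frame name, slot names, and example fillers are invented for the example, not taken from BERP.

```python
# Sketch of a semantic-grammar attachment: the rule's daughters (FoodType, Time)
# map straight into slots of a request frame.
def request_attachment(food_type_sem, time_sem):
    # Request -> I want to go to eat FoodType Time { ... }
    return {"frame": "EatingRequest",
            "food_type": food_type_sem,
            "time": time_sem}

print(request_attachment("Chinese", "7pm"))
# {'frame': 'EatingRequest', 'food_type': 'Chinese', 'time': '7pm'}
```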

59 Semantic Grammar
The term semantic grammar refers to the motivation for the grammar rules.
The technology (plain CFG rules with a set of terminals) is the same as we've been using.
The good thing about them is that you get exactly the semantic rules you need.
The bad thing is that you need to develop a new grammar for each new domain.

60 Semantic Grammars Typically used in conversational agents in constrained domains –Limited vocabulary –Limited grammatical complexity –Chart parsing (Earley) can often produce all that’s needed for semantic interpretation even in the face of ungrammatical input.