Semantics
CS 224n / Lx 237
Tuesday, May 11, 2004
With slides borrowed from Jason Eisner

Objects
Three kinds:
  Booleans – the semantic values of sentences
  Entities – objects, NPs, maybe space/time specifications
  Functions
    Predicates – functions returning a boolean
    Functions might return other functions
    Functions might take other functions as arguments
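
To make the three kinds concrete, here is a minimal Python sketch added for this transcript (not from the original slides): booleans as sentence meanings, plain atoms as entities, and ordinary functions as predicates and higher-order functions. All the particular names and denotations are made up.

```python
# Entities: plain atoms (here, strings)
george, laura = "George", "Laura"

# A predicate: a function from an entity to a boolean
def person(x):
    return x in {"George", "Laura"}

# A function returning a function: a curried two-place predicate
def admires(y):
    return lambda x: (x, y) in {("George", "Laura")}

# A function taking a function as an argument
def some_person(pred):
    return any(pred(x) for x in ["George", "Laura"])

print(person("George"))               # True  (a boolean: a sentence meaning)
print(admires("Laura")("George"))     # True  ("George admires Laura")
print(some_person(admires("Laura")))  # True  ("someone admires Laura")
```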

Nouns and their modifiers
expert: λg expert(g)
big fat expert: λg big(g) ∧ fat(g) ∧ expert(g)
But: bogus expert
  Wrong: λg bogus(g) ∧ expert(g)
  Right: λg (bogus(expert))(g) … bogus maps expert to a new concept
Baltimore expert (white-collar expert, TV expert …): λg Related(Baltimore, g) ∧ expert(g)
  Or with different intonation: λg (Modified-by(Baltimore, expert))(g)
Can't use Related for that case:
  law expert and dog catcher
  = λg Related(law, g) ∧ expert(g) ∧ Related(dog, g) ∧ catcher(g)
  = dog expert and law catcher

Modifiers continued
Non-intersective adjectives:
  overpriced(in(paloalto)(house)) vs. in(paloalto)(overpriced(house))
An adjective's denotation depends precisely on what it is modifying.
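
A hedged sketch of the contrast in Python (the names and toy denotations are assumptions, not from the slides): an intersective adjective conjoins with the noun's predicate, while a non-intersective one takes the whole predicate as input and returns a new predicate.

```python
# Intersective: "big expert" = λg. big(g) ∧ expert(g)
expert = lambda g: g in {"Ann", "Bo"}
big    = lambda g: g in {"Bo", "Cy"}
big_expert = lambda g: big(g) and expert(g)

# Non-intersective: "bogus expert" = λg. (bogus(expert))(g).
# bogus maps the predicate expert to a new predicate; a bogus
# expert need not satisfy expert at all.
def bogus(pred):
    return lambda g: g == "Dee" and not pred(g)   # toy denotation

print(big_expert("Bo"))        # True: Bo is big and an expert
print(bogus(expert)("Dee"))    # True, even though Dee is not an expert
```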

Compositional Semantics
We've discussed what semantic representations should look like. But how do we get them from sentences?
  First, parse to get a syntax tree.
  Second, look up the semantics for each word.
  Third, build the semantics for each constituent:
    Work from the bottom up.
    The syntax tree is a "recipe" for how to do it.

Compositional Semantics
Add a "sem" feature to each context-free rule:
  S → NP loves NP
  S[sem=loves(x,y)] → NP[sem=x] loves NP[sem=y]
Meaning of S depends on meaning of NPs.
[Two tree diagrams: S → NP VP with VP → V(loves) NP, annotated λx λy loves(x,y); and S → NP VP with VP → V(kicked) NP(the bucket), annotated λx died(x).]

Compositional Semantics
Instead of
  S → NP loves NP
  S[sem=loves(x,y)] → NP[sem=x] loves NP[sem=y]
we might want general rules like S → NP VP:
  V[sem=loves] → loves
  VP[sem=v(obj)] → V[sem=v] NP[sem=obj]
  S[sem=vp(subj)] → NP[sem=subj] VP[sem=vp]
Now George loves Laura has sem=loves(Laura)(George).
In this manner we'll sketch a version where:
  We still compute semantics bottom-up.
  The grammar is in Chomsky Normal Form.
  So each node has 2 children: 1 function & 1 argument.
  To get its semantics, apply the function to the argument!
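
A short illustration of this idea, written for this transcript (not the course's actual code): in Chomsky Normal Form every internal node has two children, so computing a node's semantics is a single function application. Semantic values here are Python functions or term strings.

```python
def sem(node, lexicon):
    """Bottom-up semantics: a node is either a word (str) or a pair
    (function_child, argument_child); apply the one to the other."""
    if isinstance(node, str):                    # leaf: dictionary lookup
        return lexicon[node]
    fn_child, arg_child = node
    return sem(fn_child, lexicon)(sem(arg_child, lexicon))

lexicon = {
    "loves":  lambda y: lambda x: f"loves({x},{y})",  # λy λx loves(x,y)
    "George": "G",
    "Laura":  "L",
}

# S → NP VP and VP → V NP, as (function, argument) pairs:
# the VP's function child is the verb, the S's function child is the VP.
tree = (("loves", "Laura"), "George")
print(sem(tree, lexicon))                        # loves(G,L)
```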

[Parse tree for "Every nation wants George to love Laura.": START → Sfin Punc(.); Sfin → NP VPfin; NP → Det(Every) N(nation); VPfin → T(-s) VPstem; VPstem → Vstem(want) Sinf; Sinf → NP(George) VPinf; VPinf → T(to) VPstem; VPstem → Vstem(love) NP(Laura).]

[Same tree, with the Sinf node annotated loves(G,L): the meaning that we want here. How can we arrange to get it?]

[Same tree; NP(George) is annotated G. What function should apply to G to yield the desired result loves(G,L)? (This is like division!)]

[Same tree; VPinf is annotated λx loves(x,L), which applied to G yields loves(G,L).]

loves(G,L) x loves(x,L) START Sfin NP Punc . VPfin Det Every N nation VPstem loves(G,L) Vstem want Sinf x loves(x,L) G NP George VPinf a a x loves(x,L) T to VPstem Vstem love NP Laura We’ll say that “to” is just a bit of syntax that changes a VPstem to a VPinf with the same meaning.

[Same tree; Vstem(love) is annotated λy λx loves(x,y) and NP(Laura) is annotated L; applying the one to the other gives VPstem = λx loves(x,L).]

x loves(x,L) x loves(x,L) y x loves(x,y) START Sfin NP Punc . VPfin x wants(x, loves(G,L)) Det Every N nation T -s VPstem by analogy loves(G,L) Vstem want Sinf x loves(x,L) G NP George VPinf x loves(x,L) T to VPstem a a Vstem love NP Laura L y x loves(x,y)

x loves(x,L) x loves(x,L) yx loves(x,y) START Sfin NP Punc . VPfin x wants(x, loves(G,L)) Det Every N nation T -s VPstem by analogy loves(G,L) Vstem want Sinf y x wants(x,y) x loves(x,L) G NP George VPinf x loves(x,L) T to VPstem a a Vstem love NP Laura L yx loves(x,y)

[Same tree; T(-s) is annotated λv λx present(v(x)), so VPfin = λx present(wants(x, loves(G,L))).]
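
The tense morpheme is thus a higher-order function over VP meanings; a one-line rendering in the same sketch style (the present(...) term constructor follows the slides' notation):

```python
tense = lambda v: lambda x: f"present({v(x)})"   # -s : λv λx present(v(x))
vp_stem = lambda x: f"wants({x},loves(G,L))"     # λx wants(x, loves(G,L))
vp_fin  = tense(vp_stem)                         # λx present(wants(x, loves(G,L)))
print(vp_fin("G"))                               # present(wants(G,loves(G,L)))
```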

[Same tree; the subject NP is annotated every(nation), and applying VPfin = λx present(wants(x, loves(G,L))) to it gives Sfin = present(wants(every(nation), loves(G,L))).]

[Same tree; Det(Every) = λn every(n) and N(nation) = nation compose to give the subject NP = every(nation).]

[Same tree; Punc(.) is annotated λs assert(s), which will wrap the Sfin meaning.]

In Summary: From the Words
[Full tree with every word's lexical semantics: Every = λn every(n), nation = nation, -s = λv λx present(v(x)), want = λy λx wants(x,y), George = G, to = λa.a, love = λy λx loves(x,y), Laura = L, . = λs assert(s). Composing bottom-up gives START = assert(present(wants(every(nation), loves(G,L)))).]
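
The whole derivation can be replayed in a few lines. This is an illustrative reconstruction added for this transcript (not the original course code), with semantic values as curried Python functions building term strings:

```python
# Lexical entries, as on the slides
love   = lambda y: lambda x: f"loves({x},{y})"    # λy λx loves(x,y)
want   = lambda y: lambda x: f"wants({x},{y})"    # λy λx wants(x,y)
to     = lambda a: a                              # λa.a
tense  = lambda v: lambda x: f"present({v(x)})"   # -s : λv λx present(v(x))
every  = lambda n: f"every({n})"                  # λn every(n)
period = lambda s: f"assert({s})"                 # λs assert(s)
G, L, nation = "G", "L", "nation"

# Bottom-up composition for "Every nation wants George to love Laura."
vp_love = love(L)            # λx loves(x,L)
s_inf   = to(vp_love)(G)     # loves(G,L)
vp_stem = want(s_inf)        # λx wants(x, loves(G,L))
vp_fin  = tense(vp_stem)     # λx present(wants(x, loves(G,L)))
s_fin   = vp_fin(every(nation))
print(period(s_fin))         # assert(present(wants(every(nation),loves(G,L))))
```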

So now what?
Now that we have the semantic meaning, what do we do with it?
There is a huge literature on logical reasoning and knowledge learning.
Reasoning versus inference:
  "John ate a pizza." Q: What was eaten by John? A: A pizza.
  "John ordered a pizza, but it came with anchovies. John then yelled at the waiter and stormed out." Q: What was eaten by John? A: Nothing.

Problem 1a
Write grammar rules, complete with semantic translations, that could be added to the grammar fragment, which will parse the above sentence and generate a semantic representation using the own predicate.