1 Natural Language Processing Lecture Notes 11 Chapter 15 (part 1)

2 Semantic Analysis
– These notes: syntax-driven compositional semantic analysis
– Assign meanings based only on the grammar and lexicon (no inference; context is ignored)

3 Compositional Analysis Principle of Compositionality
– The meaning of a whole is derived from the meanings of the parts
What parts?
– The constituents of the syntactic parse of the input

4 Example AyCaramba serves meat

5 Compositional Analysis
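The annotated parse tree for this slide is not captured in the transcript. As a rough sketch of the target, the event-style meaning representation the chapter builds for "AyCaramba serves meat" is something like the following (the exact predicate names are an assumption):

    ∃e Serving(e) ∧ Server(e, AyCaramba) ∧ Served(e, Meat)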

6 Augmented Rules We’ll accomplish this by attaching semantic formation rules to our syntactic CFG rules. Abstractly:
A → α1 … αn { f(α1.sem, …, αn.sem) }
This should be read as: the semantics we attach to A can be computed from some function applied to the semantics of A’s parts.

7 Example Easy parts…
– NP -> PropNoun   {PropNoun.sem}
– NP -> MassNoun   {MassNoun.sem}
– PropNoun -> AyCaramba   {AyCaramba}
– MassNoun -> meat   {MEAT}

8 Example
– S -> NP VP   {VP.sem(NP.sem)}
– VP -> Verb NP   {Verb.sem(NP.sem)}
– Verb -> serves   ???
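A minimal sketch (not the slides’ or the textbook’s code) of how these attachments can be realized with Python lambdas. The event predicates (Serving, Server, Served) and variable names are illustrative assumptions; logical forms are built as plain strings.

    # Lexical semantics: PropNoun -> AyCaramba, MassNoun -> meat
    aycaramba_sem = "AyCaramba"
    meat_sem = "Meat"

    # Verb -> serves: the "???" slot above -- a curried lambda that first takes
    # the object's semantics, then the subject's, and builds an event-style formula.
    serves_sem = lambda obj: lambda subj: (
        f"exists e. Serving(e) & Server(e,{subj}) & Served(e,{obj})")

    # Rule attachments, mirroring the slide:
    #   NP -> PropNoun  {PropNoun.sem}      NP -> MassNoun  {MassNoun.sem}
    #   VP -> Verb NP   {Verb.sem(NP.sem)}  S  -> NP VP     {VP.sem(NP.sem)}
    np_subj_sem = aycaramba_sem          # NP -> PropNoun
    np_obj_sem  = meat_sem               # NP -> MassNoun
    vp_sem = serves_sem(np_obj_sem)      # VP -> Verb NP
    s_sem  = vp_sem(np_subj_sem)         # S  -> NP VP

    print(s_sem)
    # exists e. Serving(e) & Server(e,AyCaramba) & Served(e,Meat)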

9–12 Example (figures only; not captured in the transcript)
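The figures on slides 9–12 presumably walk through the composition of the example step by step. A hedged reconstruction, assuming the usual curried, event-style lexical entry for serves:

    Verb.sem  =  λx. λy. ∃e Serving(e) ∧ Server(e, y) ∧ Served(e, x)
    VP.sem    =  Verb.sem(NP.sem)  =  λy. ∃e Serving(e) ∧ Server(e, y) ∧ Served(e, Meat)
    S.sem     =  VP.sem(NP.sem)    =  ∃e Serving(e) ∧ Server(e, AyCaramba) ∧ Served(e, Meat)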

13 Key Points
– Each node in a tree corresponds to a rule in the grammar
– Each grammar rule has a semantic rule associated with it that specifies how the semantics of that rule’s LHS can be computed from the semantics of its daughters (the RHS constituents)

14 Quantified Phrases Consider: A restaurant serves meat.
– Assume that “a restaurant” looks like [formula on the slide, not captured in the transcript]
– If we do the normal lambda thing we get [formula on the slide, not captured in the transcript]
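A hedged reconstruction of those two formulas, following the chapter’s usual treatment: on its own, “a restaurant” is just an existential formula, and naively substituting it into the verb’s argument slot yields something ill-formed, because a whole formula ends up where a term should be:

    a restaurant:          ∃x Restaurant(x)
    normal lambda result:  ∃e Serving(e) ∧ Server(e, ∃x Restaurant(x)) ∧ Served(e, Meat)

The complex-term machinery on the next slide is the fix.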

15 Complex Terms Allow the compositional system to pass around representations like the following as objects with parts:
Complex-Term → ⟨ Quantifier variable body ⟩

16 Example Our restaurant example winds up looking like
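The formula for this slide is not captured in the transcript; presumably, using the complex-term notation just introduced, it is:

    ∃e Serving(e) ∧ Server(e, ⟨∃x Restaurant(x)⟩) ∧ Served(e, Meat)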

17 Conversion So… complex terms wind up being embedded inside predicates. So pull them out and redistribute the parts in the right way:
P(⟨ Quantifier var body ⟩)  turns into
Quantifier var body Connective P(var)

18 Example
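The worked figure for this slide is not in the transcript. Applying the conversion rule to the restaurant example (a hedged reconstruction), the embedded complex term in Server(e, ⟨∃x Restaurant(x)⟩) is pulled out to give ∃x Restaurant(x) ∧ Server(e, x), so the whole formula becomes:

    ∃x Restaurant(x) ∧ ∃e Serving(e) ∧ Server(e, x) ∧ Served(e, Meat)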

19 Quantifiers and Connectives
– If the quantifier is an existential, then the connective is ∧ (and)
– If the quantifier is a universal, then the connective is → (implies)
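For illustration, using the same assumed predicate names as above:

    A restaurant serves meat:      ∃x Restaurant(x) ∧ ∃e Serving(e) ∧ Server(e, x) ∧ Served(e, Meat)
    Every restaurant serves meat:  ∀x Restaurant(x) → ∃e Serving(e) ∧ Server(e, x) ∧ Served(e, Meat)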

20 Multiple Complex Terms Note that the conversion technique pulls the quantifiers out to the front of the logical form… That leads to ambiguity if there’s more than one complex term in a sentence.

21 Quantifier Ambiguity Consider: Every restaurant has a menu
– That could mean that every restaurant has a menu of its own (possibly a different one for each)
– Or that there’s some über-menu out there and all restaurants have that same menu

22 Quantifier Scope Ambiguity
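The figure for this slide is not captured. Written with a simplified Has(x, y) predicate rather than the full event notation, the two scopings are:

    ∀ outscopes ∃ (each restaurant has its own menu):  ∀x Restaurant(x) → ∃y Menu(y) ∧ Has(x, y)
    ∃ outscopes ∀ (one shared menu):                   ∃y Menu(y) ∧ ∀x Restaurant(x) → Has(x, y)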

23 Ambiguity
– Much like the prepositional-phrase attachment problem
– The number of possible interpretations goes up exponentially with the number of complex terms in the sentence
– The best we can do: use weak methods to prefer one interpretation over another

24 Integration with a Parser Assume you’re using a dynamic-programming style parser (Earley or CYK). As with feature structures for agreement and subcategorization, we add semantic attachments to states. As constituents are completed and entered into the table, we compute their semantics.
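A minimal sketch of the idea (not the slides’ or the textbook’s code): chart entries carry a .sem field, and when a constituent is completed, the semantic attachment for its rule is applied to the daughters’ .sem values. The toy grammar, attachments, and predicate names below are assumptions carried over from the earlier sketch.

    from dataclasses import dataclass, field

    @dataclass
    class Constituent:
        """Stands in for a completed chart entry; real Earley states carry more fields."""
        label: str
        sem: object = None                      # semantic attachment result
        children: list = field(default_factory=list)

    # Semantic attachments keyed by rule, mirroring the earlier slides.
    ATTACHMENTS = {
        ("S", ("NP", "VP")): lambda np, vp: vp.sem(np.sem),
        ("VP", ("Verb", "NP")): lambda v, np: v.sem(np.sem),
    }

    def complete(label, children):
        """Called when the parser completes a constituent: compute .sem bottom-up."""
        rule = (label, tuple(c.label for c in children))
        f = ATTACHMENTS[rule]
        return Constituent(label, sem=f(*children), children=children)

    # Lexical entries (same assumed semantics as before)
    np_subj = Constituent("NP", sem="AyCaramba")
    np_obj  = Constituent("NP", sem="Meat")
    verb = Constituent("Verb",
        sem=lambda obj: lambda subj:
            f"exists e. Serving(e) & Server(e,{subj}) & Served(e,{obj})")

    vp = complete("VP", [verb, np_obj])
    s  = complete("S", [np_subj, vp])
    print(s.sem)   # exists e. Serving(e) & Server(e,AyCaramba) & Served(e,Meat)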