Natural Language Processing


Natural Language Processing Chapter 15 (part 2)

Harder Example
Example in class: John told Mary to walk Fido.
- What makes this hard?
- What role does Fido play in all this?

Non-Compositionality
Unfortunately, there are lots of cases where the meaning (loosely defined) can't be derived from the meanings of the parts: idioms, jokes, irony, sarcasm, metaphor, metonymy, indirect requests, etc.

English Idioms
Kick the bucket, buy the farm, bite the bullet, run the show, bury the hatchet, etc.
There are lots of these: constructions where the meaning of the whole is either
- totally unrelated to the meanings of the parts (kick the bucket), or
- related in some opaque way (run the show).

The Tip of the Iceberg
Describe this construction:
- A fixed phrase with a particular meaning?
- A syntactically and lexically flexible phrase with a particular meaning?
- A syntactically and lexically flexible phrase with a partially compositional meaning?
- ...

Example
Enron is the tip of the iceberg.
A rule like NP -> "the tip of the iceberg" is not so good, given attested examples like:
- the tip of Mrs. Ford's iceberg
- the tip of a 1000-page iceberg
- the merest tip of the iceberg
And how about: That's just the iceberg's tip.

Example
What we seem to need is something like:
NP -> an initial NP with "tip" as its head, followed by a PP with "of" as its head whose NP has "iceberg" as its head
...and that allows modifiers like "merest", "Mrs. Ford's", and "1000-page" to modify the relevant semantic forms.
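As a rough illustration of this head-constrained rule (this is not the lecture's formalism, just a sketch), a pattern can anchor the heads "tip", "of", and "iceberg" while leaving slots open for modifiers. The regular expression below is an invented approximation:

```python
import re

# Accept an NP headed by "tip", followed by "of" plus an NP headed by
# "iceberg", with optional modifiers (merest, Mrs. Ford's, 1000-page, ...)
# allowed around both heads. A regex stands in for real head constraints.
IDIOM = re.compile(
    r"\bthe\s+(?:\w+\s+)*tip\s+of\s+(?:[\w'.-]+\s+)*iceberg\b",
    re.IGNORECASE,
)

def is_iceberg_idiom(phrase: str) -> bool:
    """True if the phrase instantiates the tip-of-the-iceberg construction."""
    return bool(IDIOM.search(phrase))

for np in [
    "the tip of the iceberg",
    "the tip of Mrs. Ford's iceberg",
    "the tip of a 1000-page iceberg",
    "the merest tip of the iceberg",
    "the iceberg's tip",  # the harder variant: not covered by this pattern
]:
    print(np, "->", is_iceberg_idiom(np))
```

Note that the last attested variant ("the iceberg's tip") falls outside the pattern, which is exactly the slide's point: the construction is more flexible than any fixed-phrase rule.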

Constructional Approach
Syntax and semantics aren't separable in the way we've been assuming. Grammars contain form-meaning pairings that vary in the degree to which the meaning of a constituent can be computed from the meanings of its parts.

Constructional Approach
So we'll allow both
VP → V NP  { V.sem(NP.sem) }
and
VP → Kick-Verb the bucket  { λx. Die(x) }
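The two rules above can be mimicked in a toy interpreter (a sketch with invented toy semantics, not the lecture's code): the idiom rule matches the whole VP and returns λx. Die(x), while the ordinary rule applies the verb's semantics to the NP's semantics.

```python
def vp_sem(tokens):
    """Return a function from subject to logical form for a VP."""
    if tokens == ["kick", "the", "bucket"]:
        # constructional rule: VP -> Kick-Verb the bucket  { lambda x. Die(x) }
        return lambda x: f"Die({x})"
    # compositional rule: VP -> V NP  { V.sem(NP.sem) }, with a toy V.sem
    verb, *np_tokens = tokens
    np = " ".join(np_tokens)
    return lambda x: f"{verb.capitalize()}({x}, {np})"

print(vp_sem(["kick", "the", "bucket"])("John"))  # -> Die(John)
print(vp_sem(["kick", "the", "ball"])("John"))    # -> Kick(John, the ball)
```

The idiom rule is checked first, so the constructional meaning preempts the compositional one exactly when the fixed form is present.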

Computational Realizations
- Semantic grammars: simple idea, misleading name
- Read for your interest: information extraction, cascaded finite-state transducers

Semantic Grammars
Traditional grammars may not reflect the semantics in a straightforward way. You can deal with this by:
- fighting with the grammar (complex lambdas and complex terms, etc.), or
- rewriting the grammar to reflect the semantics, giving up some syntactic niceties in the process.

BERP Example

BERP Example
How about a rule like the following instead?
Request → I want to go to eat FoodType Time  { some attachment }

Semantic Grammar
The technology (plain CFG rules with a set of terminals) is the same as we've been using.
- Good: you get exactly the semantic rules you need.
- Bad: you need to develop a new grammar for each new domain.
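A minimal sketch of how such a rule's "attachment" might fill a frame in a BERP-style restaurant domain (the vocabulary and frame fields here are invented for illustration; a real system would use actual CFG rules with attachments):

```python
# Toy semantic-grammar rule in the spirit of:
#   Request -> "I want to (go to)? eat" FoodType Time  { some attachment }
# FoodType and Time are domain-specific preterminals; the attachment
# just fills slots in a request frame.
FOOD_TYPES = {"italian", "chinese", "thai", "mexican"}
TIMES = {"tonight", "tomorrow", "now"}

def parse_request(utterance):
    """Match the Request rule and return a filled frame (slots may be None)."""
    tokens = utterance.lower().replace(".", "").split()
    frame = {"intent": None, "food_type": None, "time": None}
    if tokens[:4] == ["i", "want", "to", "eat"] or \
       tokens[:6] == ["i", "want", "to", "go", "to", "eat"]:
        frame["intent"] = "Request"
    for t in tokens:
        if t in FOOD_TYPES:
            frame["food_type"] = t
        if t in TIMES:
            frame["time"] = t
    return frame

print(parse_request("I want to go to eat Italian tonight"))
```

This shows both sides of the trade-off above: the rule delivers exactly the semantic slots the domain needs, but every lexical item in FoodType and Time is domain-specific and would have to be rebuilt for a new application.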

Semantic Grammars
Typically used in conversational agents in constrained domains:
- limited vocabulary
- limited grammatical complexity
Chart parsing (e.g., Earley) can often produce all that's needed for semantic interpretation, even in the face of ungrammatical input.