Section 11.3 Feature Structures in the Grammar ─ Jin Wang.


How do we integrate feature structures and unification operations into the specification of a grammar? By augmenting the rules of an ordinary context-free grammar with attachments that specify feature structures for the constituents of the rules, along with appropriate unification operations that express constraints on those constituents.

The rule format (PATR-II system, Shieber 1986):

β0 ─> β1 ... βn
{set of constraints}

The specified constraints take one of the following forms:

<βi feature path> = Atomic value
<βi feature path> = <βj feature path>

How are these constraints used? At the beginning of this chapter we had:

S ─> NP VP
(only if the number of the NP is equal to the number of the VP)

Using the new notation:

S ─> NP VP
<NP NUMBER> = <VP NUMBER>

With this augmentation, the simple generative nature of context-free rules has been fundamentally changed.
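The effect of such an equality constraint can be sketched with a tiny unifier over feature structures encoded as nested Python dicts. This is an illustrative simplification (it does not model the reentrant/shared structure of real unification grammars), not the book's implementation:

```python
def unify(f1, f2):
    """Recursively unify two feature structures (nested dicts or atomic
    values). Returns the unified structure, or None on failure."""
    if f1 is None or f2 is None:
        return None
    if not isinstance(f1, dict) or not isinstance(f2, dict):
        return f1 if f1 == f2 else None   # atomic values must match exactly
    result = dict(f1)
    for feat, val in f2.items():
        if feat in result:
            sub = unify(result[feat], val)
            if sub is None:
                return None               # clash on a shared feature
            result[feat] = sub
        else:
            result[feat] = val            # features merely add information
    return result

np_struct = {"NUMBER": "sg"}
print(unify(np_struct, {"NUMBER": "sg"}))   # {'NUMBER': 'sg'} -> "this flight serves"
print(unify(np_struct, {"NUMBER": "pl"}))   # None -> *"this flight serve" rejected
```

Unification either merges compatible information or fails, which is exactly how the <NP NUMBER> = <VP NUMBER> constraint blocks the ungrammatical string.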

Unification constraints can be applied to four interesting linguistic phenomena: agreement, grammatical heads, subcategorization, and long-distance dependencies.

Agreement

Subject-verb agreement:

S ─> NP VP
<NP AGREEMENT> = <VP AGREEMENT>

- This flight serves breakfast.

For yes-no questions, the auxiliary must agree with the subject NP:

S ─> Aux NP VP
<Aux AGREEMENT> = <NP AGREEMENT>

- Does this flight serve breakfast?
- Do these flights serve breakfast?

Determiner-nominal agreement:

NP ─> Det Nominal
<Det AGREEMENT> = <Nominal AGREEMENT>
<NP AGREEMENT> = <Nominal AGREEMENT>

Non-lexical grammatical constituents can acquire values for at least some of their features from their component constituents.

VP ─> Verb NP
<VP AGREEMENT> = <Verb AGREEMENT>

Verb ─> serves
<Verb AGREEMENT NUMBER> = SG
<Verb AGREEMENT PERSON> = 3

The same technique works for the remaining NP and Nominal categories.

Head Features

VP ─> Verb NP
<VP AGREEMENT> = <Verb AGREEMENT>

NP ─> Det Nominal
<NP AGREEMENT> = <Nominal AGREEMENT>
<Det AGREEMENT> = <Nominal AGREEMENT>

Nominal ─> Noun
<Nominal AGREEMENT> = <Noun AGREEMENT>

Rewritten in terms of a HEAD feature:

VP ─> Verb NP
<VP HEAD> = <Verb HEAD>

NP ─> Det Nominal
<NP HEAD> = <Nominal HEAD>
<Det HEAD AGREEMENT> = <Nominal HEAD AGREEMENT>

Nominal ─> Noun
<Nominal HEAD> = <Noun HEAD>

The features of most grammatical categories are copied from one of the children to the parent. The child that provides the features is the head of the phrase; the copied features are head features.
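The head-feature convention amounts to sharing the HEAD value between a phrase and its head child. In a dict encoding, copying the reference (not the contents) gives the structure sharing that a constraint like <VP HEAD> = <Verb HEAD> expresses. A minimal sketch; the encoding and the GENDER feature below are illustrative assumptions:

```python
def project_head(child):
    """Build the parent's feature structure by sharing the head child's HEAD,
    mirroring a constraint like <VP HEAD> = <Verb HEAD>."""
    return {"HEAD": child["HEAD"]}  # shared reference, not a copy

verb = {"HEAD": {"AGREEMENT": {"NUMBER": "SG", "PERSON": 3}}}
vp = project_head(verb)

# The VP now carries the verb's agreement features...
print(vp["HEAD"]["AGREEMENT"]["NUMBER"])   # SG

# ...and because HEAD is shared, later refinements of the verb's HEAD
# are automatically visible on the VP as well.
verb["HEAD"]["AGREEMENT"]["GENDER"] = "NEUT"   # hypothetical extra feature
print(vp["HEAD"]["AGREEMENT"]["GENDER"])       # NEUT
```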

Subcategorization

Verbs can be picky about the pattern of arguments they will allow themselves to appear with. SUBCAT, an atomic feature, is one way to use feature structures to distinguish among the various members of the verb category:

Verb ─> serves
<Verb HEAD AGREEMENT NUMBER> = SG
<Verb HEAD SUBCAT> = TRANS

This constraint is enforced by adding corresponding constraints to all the verb phrase rules in the grammar:

VP ─> Verb
<VP HEAD> = <Verb HEAD>
<VP HEAD SUBCAT> = INTRANS

VP ─> Verb NP
<VP HEAD> = <Verb HEAD>
<VP HEAD SUBCAT> = TRANS

VP ─> Verb NP NP
<VP HEAD> = <Verb HEAD>
<VP HEAD SUBCAT> = DITRANS

However, this approach is somewhat opaque, since these unanalyzable SUBCAT symbols do not directly encode either the number or the type of the arguments that the verb expects to take.
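With atomic SUBCAT values, licensing a VP rule reduces to an equality check. A minimal sketch; the rule names and dict encoding are illustrative assumptions, not from the text:

```python
# Each VP rule is keyed to the atomic SUBCAT value it requires.
VP_RULES = {
    "VP -> Verb":       "INTRANS",
    "VP -> Verb NP":    "TRANS",
    "VP -> Verb NP NP": "DITRANS",
}

def rule_applies(rule, verb):
    """A VP rule licenses this verb only if <Verb HEAD SUBCAT> matches
    the rule's required value."""
    return verb["HEAD"]["SUBCAT"] == VP_RULES[rule]

serves = {"HEAD": {"SUBCAT": "TRANS"}}   # "serves" takes exactly one NP object

print(rule_applies("VP -> Verb NP", serves))   # True
print(rule_applies("VP -> Verb", serves))      # False
```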

A somewhat more elegant solution lists the expected arguments directly:

One argument (serves dinner):

Verb ─> serves
<Verb HEAD AGREEMENT NUMBER> = SG
<Verb HEAD SUBCAT FIRST CAT> = NP
<Verb HEAD SUBCAT SECOND> = END

Two arguments (leaves Boston in the morning):

Verb ─> leaves
<Verb HEAD AGREEMENT NUMBER> = SG
<Verb HEAD SUBCAT FIRST CAT> = NP
<Verb HEAD SUBCAT SECOND CAT> = PP
<Verb HEAD SUBCAT THIRD> = END
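The list-valued encoding can be read off mechanically: walk the FIRST, SECOND, ... slots until END, collecting each argument's CAT. A sketch under the assumption that the subcategorization list is encoded as a nested dict:

```python
def subcat_frame(verb):
    """Collect argument categories from slots FIRST, SECOND, ... until END."""
    subcat = verb["HEAD"]["SUBCAT"]
    cats = []
    for slot in ["FIRST", "SECOND", "THIRD", "FOURTH"]:
        arg = subcat.get(slot, "END")
        if arg == "END":
            break
        cats.append(arg["CAT"])
    return cats

serves = {"HEAD": {"SUBCAT": {"FIRST": {"CAT": "NP"},
                              "SECOND": "END"}}}
leaves = {"HEAD": {"SUBCAT": {"FIRST": {"CAT": "NP"},
                              "SECOND": {"CAT": "PP"},
                              "THIRD": "END"}}}

print(subcat_frame(serves))   # ['NP']
print(subcat_frame(leaves))   # ['NP', 'PP']
```

Unlike the atomic TRANS/DITRANS symbols, this representation makes both the number and the categories of the expected arguments directly inspectable.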

These examples show rather simple subcategorization structures for verbs. In fact, verbs can subcategorize for quite complex subcategorization frames, and the notion of subcategorization extends to other parts of speech, such as prepositions.

Long-Distance Dependencies

Sometimes a constituent subcategorized for by the verb is not locally instantiated but stands in a long-distance relationship with its predicate:

- What cities does Continental service?
- Which flight do you want me to have the travel agent book?

Many solutions to representing long-distance dependencies in unification grammars involve keeping a gap list, implemented as a GAP feature that is passed up from phrase to phrase in the parse tree.
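The gap-list idea can be sketched as threading a GAP value through the tree: a fronted filler adds an NP to the list, and a verb missing a local object can discharge one element of it instead. A deliberately simplified illustration (real analyses pass GAP through every rule with additional constraints), not the book's mechanism:

```python
def build_vp(verb_subcat, local_objects, gap):
    """Try to satisfy a transitive verb's NP requirement either locally or
    from the inherited GAP list; return the VP's outgoing GAP, or None."""
    if verb_subcat != "TRANS":
        return None
    if local_objects:                 # "Continental serves Denver"
        return list(gap)              # gap passes through unchanged
    if gap:                           # "What cities does Continental service __"
        return list(gap[1:])          # one filler is discharged at the gap site
    return None                       # no object anywhere: ill-formed

print(build_vp("TRANS", ["Denver"], []))        # []
print(build_vp("TRANS", [], [{"CAT": "NP"}]))   # []
print(build_vp("TRANS", [], []))                # None
```

An empty outgoing GAP at the top of the tree means every filler found a gap, which is the well-formedness condition the GAP feature enforces.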