Notes on TAG (LTAG) and Feature Structures. Aravind K. Joshi, April 14, 2008.

LTAG: A Derivation

[Slide diagram: the elementary trees for deriving "who does Bill think Harry likes": an initial tree α anchored on likes, with NP substitution nodes and an empty element e in object position; an auxiliary tree β anchored on think, with foot node S*; an auxiliary tree anchored on does, also with foot node S*; and initial NP trees for who, Harry, and Bill. The NP trees combine by substitution; the think and does trees combine by adjoining.]

Constraints on Substitution and Adjoining

[Slide diagram: the tree α anchored on likes, with an NP substitution node, and the initial NP tree anchored on Harry.] α requires a singular NP tree to be substituted at the NP node in α.

Constraints on Substitution and Adjoining

[Slide diagram: the trees α (likes), β (think), and the does, who, Harry, and Bill trees, as in the derivation above.] β can be adjoined to α at the root node of α because β is anchored on an untensed verb, think.

Constraints on Substitution and Adjoining: Feature Passing

[Slide diagram: the same trees as above.] The tense associated with the root node of α comes from the tense associated with β, and not from likes.

Feature Structures

Feature Structures: Attribute-Value Structures

X1 = [f: a, g: b, h: c]        X2 = [f: a, g: b]

X1 has more information than X2; X1 is more specific than X2.
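To make the "more specific" relation concrete, here is a minimal sketch in Python, assuming feature structures are represented as nested dicts with atomic string values. The representation and the function name subsumes are illustrative choices, not part of the slides; in particular, the sketch ignores co-indexing.

```python
def subsumes(general, specific):
    """True if every attribute-value pair in `general` also holds in `specific`,
    i.e. `specific` carries at least as much information as `general`."""
    for attr, val in general.items():
        if attr not in specific:
            return False
        if isinstance(val, dict):
            # Complex value: recurse into the embedded feature structure.
            if not isinstance(specific[attr], dict) or not subsumes(val, specific[attr]):
                return False
        elif specific[attr] != val:
            return False
    return True

X1 = {"f": "a", "g": "b", "h": "c"}   # more specific
X2 = {"f": "a", "g": "b"}             # more general

assert subsumes(X2, X1)       # X2's information is contained in X1
assert not subsumes(X1, X2)   # X1 has an extra attribute h
```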

Feature Structures

Values can be atomic or complex:

[cat: NP, agreement: [number: sing, gender: masc, person: third]]

Recursion in feature structures: there is no recursion in the feature structures used in LTAG. Thus, in principle, the feature structures can be eliminated, at the expense of multiplying the non-terminals.
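The elimination point can be illustrated with a toy encoding: because the feature structures are finite, every category-plus-feature-bundle can be folded into an atomic non-terminal name. The function below is a hypothetical sketch of that compilation, not anything from the slides.

```python
def flatten_category(cat, features):
    """Fold a finite, flat feature bundle into the non-terminal name,
    e.g. NP with [num: sing] becomes the atomic category "NP[num=sing]"."""
    suffix = ",".join(f"{k}={v}" for k, v in sorted(features.items()))
    return f"{cat}[{suffix}]" if suffix else cat

assert flatten_category("NP", {"num": "sing"}) == "NP[num=sing]"
assert flatten_category("S", {}) == "S"
```

Each distinct feature bundle yields a distinct non-terminal, which is exactly the blow-up the slide alludes to.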

Feature Structures

Feature Structures: Attribute-Value Structures

X1 = [f: a, g: a]        X2 = [f: <1> a, g: <1>]

In X2 the values of f and g are co-indexed (shared), so X2 has more information than X1: X2 is more specific than X1, i.e., X1 subsumes X2. Co-indexing can also hold across feature structures.

Unification of Feature Structures

Given two feature structures X1 and X2, X3 = X1 ∪ X2 is the least feature structure that extends (is subsumed by) both X1 and X2. X3 is obtained by unifying X1 and X2; if no such structure exists, the unification fails.
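Assuming feature structures are represented as nested Python dicts without co-indexing (an illustrative choice, not from the slides), unification can be sketched as a recursive merge that fails on an atomic value clash:

```python
def unify(fs1, fs2):
    """Return the least feature structure extending both inputs,
    or None when an atomic value clash makes unification fail."""
    result = dict(fs1)
    for attr, val in fs2.items():
        if attr not in result:
            result[attr] = val
        elif isinstance(result[attr], dict) and isinstance(val, dict):
            sub = unify(result[attr], val)   # unify embedded structures
            if sub is None:
                return None
            result[attr] = sub
        elif result[attr] != val:
            return None                      # atomic clash
    return result

assert unify({"num": "sing"}, {"person": "third"}) == {"num": "sing", "person": "third"}
assert unify({"num": "sing"}, {"num": "plur"}) is None
```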

Constraints on Substitution: Feature Passing

[Slide diagram: the tree α anchored on likes, with [num: sing] on its NP substitution node, and the NP tree for Harry carrying [num: sing], alongside underspecified [num: ] values.] α requires a singular NP tree to be substituted in α: the [num] features of the substitution node and the substituted root must unify.

Constraints on Adjoining: Feature Passing

[Slide diagram: the trees α (likes), β (think), and does, who, Harry, Bill, now annotated with tense features: [t: ut] (untensed) on the think tree, [t: pres] on the does tree, and underspecified [t: ] values elsewhere.] The tense associated with the root node of α comes from the tense associated with β, and not from likes.

Top and Bottom Feature Structures

We need top (t) and bottom (b) feature structures for each node, especially the internal nodes of a tree. Why? Each node presents a top view and a bottom view, and adjoining can pull these two views apart. After the derivation stops, we unify the top and bottom feature structures at each node. If one of these unifications fails, the derivation crashes.

Feature Structures and Unification: Adjoining as Unification

[Slide diagram: an auxiliary tree β, with top/bottom features t_r/b_r at its root X and t_f/b_f at its foot node X*, adjoins at a node X of a tree γ carrying top/bottom features t/b.] After adjoining, the root of β carries t ∪ t_r on top and b_r on the bottom, while the foot of β carries t_f on top and b ∪ b_f on the bottom.


Feature Structures and Unification: Adjoining as Unification (adjoining at the root node)

[Slide diagram: the same configuration, with β adjoined at the root node of the host tree. The features combine as before: t ∪ t_r and b_r at the new root, t_f and b ∪ b_f at the foot.]
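The top/bottom bookkeeping for adjoining (the host node's top t unifies with the auxiliary root's top t_r, and the host node's bottom b unifies with the foot's bottom b_f) can be sketched as follows, using flat attribute-value dicts. The function names are illustrative, not from the slides.

```python
def unify_flat(fs1, fs2):
    """Unify two flat attribute-value maps; None signals failure."""
    out = dict(fs1)
    for attr, val in fs2.items():
        if attr in out and out[attr] != val:
            return None
        out[attr] = val
    return out

def adjoin_features(node_t, node_b, root_t, root_b, foot_t, foot_b):
    """Feature redistribution when an auxiliary tree adjoins at a node:
    returns (new root top, new root bottom, new foot top, new foot bottom),
    or None if adjoining fails."""
    new_root_top = unify_flat(node_t, root_t)      # t ∪ t_r
    new_foot_bottom = unify_flat(node_b, foot_b)   # b ∪ b_f
    if new_root_top is None or new_foot_bottom is None:
        return None
    return new_root_top, root_b, foot_t, new_foot_bottom
```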

Feature Structures and Unification: Substitution as Unification

[Slide diagram: an initial tree α, with top/bottom features t_r/b_r at its root X, substitutes at a substitution node X of a tree γ carrying top feature t.] After substitution, the node carries t ∪ t_r on top and b_r on the bottom.

Obligatory (Adjoining) Constraints: John tried to swim

[Slide diagram: the auxiliary tree β anchored on tried, whose root S carries [t: +] and whose foot node S* carries [t: -]; and the initial tree α for PRO to swim, whose root S carries top [t: +] and bottom [t: -].] If nothing is adjoined to α, then the derivation will crash, because the top and bottom features at the root node of α will not unify.
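The crash can be traced with a small self-contained check: at the end of a derivation, every node's top and bottom feature structures must unify, and a conflicting [t: +] / [t: -] pair on a node makes that impossible. (Flat dicts and the variable names are illustrative assumptions.)

```python
top = {"t": "+"}     # top view of alpha's root node
bottom = {"t": "-"}  # bottom view of alpha's root node

# Unification fails iff some shared attribute carries clashing atomic values.
clashes = [a for a in top if a in bottom and top[a] != bottom[a]]
assert clashes == ["t"]   # the derivation crashes on the tense feature
```

Adjoining at the node pulls the two views apart, removing the clash; that is why adjoining becomes obligatory here.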


Adjoining β to the Root of α: John tried to swim

[Slide diagram: the derived tree for John tried to swim, with β (tried) adjoined at the root of α (to swim).] Note that the conflicting features at the root node of α are now separated: the top feature travels with the new root, and the bottom feature stays below at the foot of β.