Notes on TAG (LTAG) and Feature Structures September 28 2005 AKJ.


1 Notes on TAG (LTAG) and Feature Structures, September 28 2005, AKJ

2 LTAG: A Derivation
[Tree diagrams: elementary trees anchored by likes (with NP substitution sites and an empty element e), think (an auxiliary tree with foot node S*), does, who, Harry, and Bill. Combined by substitution and adjoining, they derive "who does Bill think Harry likes".]

3 Constraints on Substitution and Adjoining
[Tree diagram: the tree α anchored by likes, with Harry substituting at an NP node.] α requires a singular NP tree to be substituted at the NP node in α.

4 Constraints on Substitution and Adjoining
[Tree diagrams as in the derivation above.] β can be adjoined to α at the root node of α because β is anchored on an untensed verb, think.

5 Constraints on Substitution and Adjoining: Feature Passing
[Tree diagrams as in the derivation above.] The tense associated with the root node of α comes from the tense associated with β, and not from likes.

6 Feature Structures
Feature structures are attribute–value structures:
X1 = [f: a, g: b, h: c]
X2 = [f: a, g: b]
X1 has more information than X2; X1 is more specific than X2.

7 Feature Structures
Values can be atomic or complex:
[cat: NP, agreement: [number: sing, gender: masc, person: third]]
Recursion in feature structures: for LTAG, no recursion; for semantics?
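The complex value on this slide can be prototyped as a nested Python dict. This is a sketch of one possible encoding (the variable name np_fs is mine, not from the slides):

```python
# The slide's [cat: NP, agreement: [...]] example as a nested dict.
# A complex value is simply a dict nested under a feature.
np_fs = {
    "cat": "NP",
    "agreement": {          # complex (non-atomic) value
        "number": "sing",
        "gender": "masc",
        "person": "third",
    },
}
print(np_fs["agreement"]["number"])  # sing
```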

8 Feature Structures
X1 = [f: a, g: a]
X2 = [f: <1> a, g: <1>]   (the values of f and g are co-indexed)
X2 has more information than X1; X2 is more specific than X1; X1 subsumes X2. Co-indexing can also hold across feature structures.
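The subsumption ordering of slides 6 and 8 can be made concrete with a small checker over nested dicts. This is a hedged sketch (the function name subsumes and the encoding are mine); note that plain dicts can only test value equality, so they cannot capture the co-indexing (token identity) that distinguishes X2 from X1 on this slide:

```python
# Sketch: X subsumes Y iff Y contains all the information in X
# (the more general structure subsumes the more specific one).
def subsumes(x, y):
    """True iff every feature/value in x also appears in y."""
    for feat, val in x.items():
        if feat not in y:
            return False
        if isinstance(val, dict):
            if not isinstance(y[feat], dict) or not subsumes(val, y[feat]):
                return False
        elif val != y[feat]:
            return False
    return True

# The X1/X2 pair from slide 6:
X1 = {"f": "a", "g": "b", "h": "c"}
X2 = {"f": "a", "g": "b"}
print(subsumes(X2, X1))  # True: X2 is more general, so it subsumes X1
print(subsumes(X1, X2))  # False: X1 carries extra information (h: c)
```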

9 Unification of Feature Structures
Given two feature structures X1 and X2, X3 = X1 ∪ X2, where X3 is the least feature structure that contains all the information in both X1 and X2. X3 is obtained by unifying X1 and X2.
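A minimal unifier for the same nested-dict encoding can illustrate the definition. This is a sketch under my own assumptions (no co-indexing; None signals failure), not the slides' implementation:

```python
# Sketch: unification of feature structures as nested dicts.
# Returns the least structure containing all information in both,
# or None if an atomic value clash makes unification fail.
def unify(x, y):
    if not isinstance(x, dict) or not isinstance(y, dict):
        return x if x == y else None      # atomic values must match
    result = dict(x)
    for feat, val in y.items():
        if feat in result:
            sub = unify(result[feat], val)
            if sub is None:
                return None               # clash propagates upward
            result[feat] = sub
        else:
            result[feat] = val            # new information is added
    return result

print(unify({"num": "sing"}, {"per": "third"}))  # {'num': 'sing', 'per': 'third'}
print(unify({"num": "sing"}, {"num": "plur"}))   # None (clash)
```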

10 Constraints on Substitution: Feature Passing
[Tree diagram: as on slide 3, now with [num: sing] feature structures on the NP nodes.] α requires a singular NP tree to be substituted in α; this is enforced by the [num: sing] features.

11 Constraints on Adjoining: Feature Passing
[Tree diagrams: as in the derivation, with tense features [t: ut] (untensed) and [t: pres] on the relevant nodes.] The tense associated with the root node of α comes from the tense associated with β, and not from likes.

12 Top and Bottom Feature Structures
We need top (t) and bottom (b) feature structures for each node, especially the internal nodes of a tree. Why? Each node has a top view and a bottom view, and adjoining can pull these two views apart. When the derivation stops, we unify the top and bottom feature structures at each node. If one of these unifications fails, the derivation crashes.
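The final top–bottom check can be sketched as follows. The helper names and the flat-dict encoding are my own, and the failing example mirrors the kind of [t: +]/[t: −] clash used for obligatory adjoining on slide 15:

```python
# Sketch: when the derivation stops, unify top and bottom feature
# structures at every node; any failure crashes the derivation.
def unify_flat(t, b):
    """Unify two flat feature structures; None signals failure."""
    out = dict(t)
    for feat, val in b.items():
        if feat in out and out[feat] != val:
            return None
        out[feat] = val
    return out

def derivation_ok(nodes):
    """nodes: list of (top, bottom) pairs, one per tree node."""
    return all(unify_flat(t, b) is not None for t, b in nodes)

print(derivation_ok([({"t": "+"}, {"t": "+"})]))  # True: top and bottom agree
print(derivation_ok([({"t": "+"}, {"t": "-"})]))  # False: the derivation crashes
```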

13 Feature Structures and Unification
Adjoining as unification. [Diagram: a node X carries a top feature structure t and a bottom feature structure b; the auxiliary tree β has a root X with features (t_r, b_r) and a foot X* with features (t_f, b_f).] After adjoining β at X, the root of the adjoined material carries top t ∪ t_r and bottom b_r, and the foot carries top t_f and bottom b ∪ b_f.

14 Feature Structures and Unification
Substitution as unification. [Diagram: a substitution node X carries only a top feature structure t; the root of the substituted tree α carries (t_r, b_r).] After substitution, the node carries top t ∪ t_r and bottom b_r.
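The feature updates for substitution and adjoining on slides 13 and 14 can be put together in one sketch. Function names and the flat-dict encoding are my own assumptions; co-indexing is ignored and None signals a failed unification:

```python
# Sketch of the feature updates at substitution and adjoining sites.
def unify(x, y):
    out = dict(x)
    for f, v in y.items():
        if f in out and out[f] != v:
            return None
        out[f] = v
    return out

def substitute(node_top, root_top, root_bottom):
    """Substitution at a node with top t: the node gets (t U t_r, b_r)."""
    top = unify(node_top, root_top)
    return None if top is None else (top, root_bottom)

def adjoin(node_top, node_bottom, root_top, root_bottom, foot_top, foot_bottom):
    """Adjoining at a node (t, b): the new root gets (t U t_r, b_r),
    and the new foot gets (t_f, b U b_f)."""
    new_root_top = unify(node_top, root_top)
    new_foot_bottom = unify(node_bottom, foot_bottom)
    if new_root_top is None or new_foot_bottom is None:
        return None
    return (new_root_top, root_bottom), (foot_top, new_foot_bottom)

# Substituting a [num: sing] NP tree at a node requiring [num: sing]:
print(substitute({"num": "sing"}, {"num": "sing"}, {}))  # ({'num': 'sing'}, {})
```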

15 Obligatory (Adjoining) Constraints
John tried to swim
[Tree diagrams: β, anchored by tried, with foot node S*; α, for PRO to swim, with tense features [t: +] and [t: −] on its S nodes.] If nothing is adjoined to α, then α will crash, because the top and bottom features at the root node of α will not unify.