MC-TAG, flexible composition, etc. ARAVIND K. JOSHI, March 15, 2006.

2 Feature Structures and Unification
Adjoining as unification: node X carries a top feature structure t and a bottom feature structure b; the auxiliary tree's root carries t_r / b_r and its foot carries t_f / b_f. After adjoining at X, the root of the inserted tree carries t ∪ t_r as its top and b_r as its bottom, and the foot carries t_f as its top and b ∪ b_f as its bottom.
No directionality is involved in the composition.
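The unification step underlying adjoining can be sketched for flat feature structures. The `unify` function, the dict encoding, and the feature names below are illustrative assumptions of this sketch, not the slides' notation:

```python
def unify(f1, f2):
    """Unify two flat feature structures (dicts). Returns the merged
    structure, or None if the two disagree on some feature."""
    result = dict(f1)
    for feat, val in f2.items():
        if feat in result and result[feat] != val:
            return None  # feature clash: unification fails
        result[feat] = val
    return result

# Adjoining at node X: X's top t unifies with the auxiliary root's top
# t_r, and X's bottom b unifies with the foot's bottom b_f
# (feature names here are made up for illustration).
t, t_r = {"wh": "-"}, {"mode": "ind"}
b, b_f = {"agr": "3sg"}, {"agr": "3sg", "tense": "pres"}

assert unify(t, t_r) == {"wh": "-", "mode": "ind"}
assert unify(b, b_f) == {"agr": "3sg", "tense": "pres"}
assert unify({"wh": "+"}, {"wh": "-"}) is None  # clash blocks adjoining
```

Because unification is symmetric, the sketch also reflects the slide's point that no directionality is involved in the composition.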

3 Feature Structures and Unification
Substitution as unification: the substitution node X carries a top feature structure t; the root of the substituted tree carries t_r / b_r. After substitution, the node carries t ∪ t_r as its top and b_r as its bottom.
No directionality is involved in the composition.

4 Flexible Composition (FC)
All formal grammars (FG) have some notion of FC. However, it is not the case that for all FGs, FC is productive, in the sense that
-- FC gives rise to new and useful derivations, and/or
-- new word order variations, scope ambiguities, for example
In a CFG, for a rule such as A → B C, either B is a function taking C as the argument or vice versa, i.e., A → (A/C) C or A → B (B\A)
-- same derived tree, different derivations, but not useful
-- same word order

5 Flexible Composition (FC)
CFG rules as one-level trees:
a1: S → NP3 VP3
a2: S → NP2 S VP2
a3: S → NP1 S VP1
Same derived tree, two different derivations; same word order: NP1 NP2 NP3 VP3 VP2 VP1.
Structure adjacency (for one-level tree structures) does not buy anything more.
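The "same derived tree, different derivations" point can be checked mechanically. The nested-list tree encoding and the substitution-slot marker below are assumptions of this sketch:

```python
# The one-level trees a1, a2, a3 from the slide, encoded as nested lists:
# tree[0] is the node label; SLOT marks the open S substitution site.
SLOT = "S↓"

a1 = ["S", "NP3", "VP3"]          # a1: S -> NP3 VP3
a2 = ["S", "NP2", SLOT, "VP2"]    # a2: S -> NP2 S VP2
a3 = ["S", "NP1", SLOT, "VP1"]    # a3: S -> NP1 S VP1

def attach(host, filler):
    """Substitute `filler` at the open S slot of `host`."""
    if host == SLOT:
        return filler
    if isinstance(host, list):
        return [attach(child, filler) for child in host]
    return host

def yield_of(tree):
    """Frontier (terminal string) of a tree."""
    if isinstance(tree, list):
        return " ".join(yield_of(child) for child in tree[1:])
    return tree

# Derivation 1 (bottom-up): a1 into a2 first, then the result into a3.
# Derivation 2 (top-down): a2 into a3 first, then a1 into the result.
d1 = attach(a3, attach(a2, a1))
d2 = attach(attach(a3, a2), a1)

assert d1 == d2                                   # same derived tree
assert yield_of(d1) == "NP1 NP2 NP3 VP3 VP2 VP1"  # same word order
```

The two attachment orders produce an identical derived tree and the identical frontier shown on the slide, which is exactly why this flexibility is not productive for one-level trees.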

6 Standard TAG
Derived tree and derivation tree
-- different ways of walking over the derivation tree: top-down, bottom-up, inside-out
-- It does not make any difference in terms of the set of strings that can be derived

7 Multi-Component TAG (MC-TAG)
Different motivations
Tree-local
-- Tree-local MC-TAG is weakly equivalent to Standard TAG
Set-local

8 Flexible Composition
Adjoining as Wrapping
Split γ at node X into two components:
-- γ1: the supertree of γ at X
-- γ2: the subtree of γ at X

9 Flexible Composition
Adjoining as Wrapping
γ is wrapped around β, i.e., the two components γ1 (the supertree of γ at X) and γ2 (the subtree of γ at X) are wrapped around β.

10 Flexible Composition
Wrapping as substitutions and adjoinings
α: the tree for 'likes', with a wh-NP substitution node at the front and an empty NP (e) in object position
β: the auxiliary tree for 'think', with a substitution node NP↓ and foot node S*
β adjoins at the interior S node of α; the NP arguments come in by substitution.
-- We can also view this composition as α wrapped around β
-- Flexible composition
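The adjoining step on this slide can be sketched operationally as splicing at the foot node. The bracketed encoding below simplifies the slide's trees (the NP arguments are pre-substituted and the wh-/trace details are omitted), so it is an illustration under those assumptions rather than the actual elementary trees:

```python
import copy

FOOT = "S*"  # foot-node marker of the auxiliary tree

def splice_foot(aux, subtree):
    """Return a copy of the auxiliary tree with its foot replaced by subtree."""
    if aux == FOOT:
        return copy.deepcopy(subtree)
    if isinstance(aux, list):
        return [splice_foot(child, subtree) for child in aux]
    return aux

def yield_of(tree):
    """Frontier (terminal string) of a tree."""
    if isinstance(tree, list):
        return " ".join(yield_of(child) for child in tree[1:])
    return tree

# Simplified trees: tree[0] is the node label, leaves are words.
likes = ["S", ["NP", "Harry"], ["VP", ["V", "likes"], ["NP", "peanuts"]]]
think = ["S", ["NP", "Bill"], ["VP", ["V", "thinks"], FOOT]]

# Adjoining `think` at the root of `likes`: the supertree of `likes` at
# its root is empty, so the auxiliary tree simply wraps around all of it.
derived = splice_foot(think, likes)
assert yield_of(derived) == "Bill thinks Harry likes peanuts"
```

Adjoining at an interior node works the same way, except that the nonempty supertree stays above the auxiliary tree, which is the wrapping view of the previous slides.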

11 Flexible Composition
Wrapping as substitutions and adjoinings
Split the 'likes' tree α at its interior S node into two components:
-- α1 and α2 are the two components of α
-- α1 is attached (adjoined) to the root node S of β
-- α2 is attached (substituted) at the foot node S of β
Leads to multi-component TAG (MC-TAG)

12 Multi-component LTAG (MC-LTAG)
The two components are used together in one composition step. Both components attach to nodes of a single elementary tree. This preserves locality.

13 Tree-Local Multi-component LTAG (MC-LTAG)
- How can the components of an MC-LTAG set compose while preserving the locality of LTAG?
- Tree-Local MC-LTAG
-- Components of a set compose only with an elementary tree or an elementary component
- Flexible composition
-- The notion of the derivation tree still holds for Tree-Local MC-TAG
-- Different ways of walking over the derivation tree
-- It can make a difference in terms of the structures that can be derived!

14 Tree-Local MC-LTAG and flexible semantics
Three clauses, C1, C2, and C3; each clause can be either a single elementary tree or a multi-component tree set with two components.
The verb in C1 takes the verb in C2 as its argument, and the verb in C2 takes the verb in C3 as its argument.
Flexible composition allows us to compose the three clauses in three ways.

15 Tree-Local MC-LTAG and flexible semantics
Three ways of composing C1, C2, and C3: (1), (2), and (3).
The third mode of composition can give rise to new strings, which are not obtainable from the first two ways alone.

16 Scrambling: N3 N2 N1 V3 V2 V1
For each verb Vi, a two-component VP tree set: one component hosts the scrambled Ni above a VP node, and the other contains the trace e of Ni together with Vi.

17 Scrambling: N3 N2 N1 V3 V2 V1 (flexible composition)
The same tree sets, composed flexibly to derive the scrambled order.

18 Tree-local MC-TAG
Usually two components only. One component can be lexically empty (null). The components are not independent:
-- immediate domination
-- domination
-- co-indexing
Flexible composition has to respect these constraints. Some examples:
-- Scrambled NPs
-- Extraposition from NP

19 Extraposition from NP: An example
(1) The gardener who the woman kept calling all day finally came.
(1') The gardener finally came who the woman kept calling all day.
(2) The gardener who the woman who had lost her keys kept calling all day finally came.
*(2') The gardener who the woman kept calling all day finally came who had lost her keys.

20 Extraposition from NP
Host tree: [S [NP The gardener] [VP finally came]]
Extraposed: the relative clause 'who the woman kept calling all day' is adjoined at the root S, yielding 'The gardener finally came who the woman kept calling all day'.

21 Extraposition from NP
The same host tree, with the relative clause supplied by a two-component set b1: {b11, b12}
-- b11: an NP auxiliary component [NP NP* S(i)] whose co-indexed S(i) dominates the empty element e
-- b12: an S auxiliary component [S S* S(i)] whose S(i) dominates 'who the woman kept calling all day'

22 Tree-local MC-LTAG for NP Extraposition
The two components of b1: {b11, b12} attach tree-locally to the elementary tree for 'The gardener finally came': b11 (with its empty S(i)) at the NP node, and b12 (with the relative clause under S(i)) at the root S.

23 Tree-local MC-LTAG for NP Extraposition
(2) The gardener who the woman who had lost her keys kept calling all day finally came.
*(2') The gardener who the woman kept calling all day finally came who had lost her keys.
Extraposing 'who had lost her keys' out of the embedded relative clause, as in (2'), is not possible even with flexible composition if the constraints between the two components are to be respected.