Chapter 5: Grammars for Natural Language


1 Chapter 5: Grammars for Natural Language
5.1 Auxiliary Verbs and Verb Phrases
5.2 Movement Phenomena in Language
5.3 Handling Questions in Context-Free Grammars
5.4 Relative Clauses
5.5 The Hold Mechanism in ATNs
5.6 Gap Threading

2 Auxiliary Verbs and Verb Phrases (I)
Lexically, the auxiliaries differ in the complement form they require:
- have: followed by a past participle
- be: followed by a present participle (by a past participle in the passive)
- do: stands alone or is followed by a base form
- modals (can, must, ...): followed by a base form
In a sentence, the first auxiliary must agree with the subject, as below:

Auxiliary | COMPFORM | Construction | Example
----------|----------|--------------|----------------------
modal     | base     | modal        | can see the house
have      | pastprt  | perfect      | have seen the house
be        | ing      | progressive  | is lifting the box
be        | pastprt  | passive      | was seen by the crowd
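The table amounts to a small lookup from auxiliary class to required complement form. A minimal Python sketch (the names `COMPFORM` and `complement_ok` are illustrative, not from the text):

```python
# Required complement form (COMPFORM) for each auxiliary class,
# following the table above. Class names are illustrative.
COMPFORM = {
    "modal": "base",       # can see the house
    "have": "pastprt",     # have seen the house
    "be_prog": "ing",      # is lifting the box
    "be_pass": "pastprt",  # was seen by the crowd
}

def complement_ok(aux_class, vform):
    """Check that the verb following the auxiliary has the required form."""
    return COMPFORM[aux_class] == vform

# "can see": a modal requires the base form
assert complement_ok("modal", "base")
# *"can saw": a past form after a modal is out
assert not complement_ok("modal", "past")
```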

3 Auxiliary Verbs and Verb Phrases (II)
How to handle auxiliary verbs: the main idea is to treat them as verbs that take a VP as a complement. The rule is:
VP → (AUX COMPFORM ?s) (VP VFORM ?s)
The auxiliary sequence is: modal + have + be (progressive) + be (passive).
The next problem is how to enforce this sequence. The rule above already rules out most illegal cases, e.g.:
*He has might see the movie already.
A modal after "has" is illegal because modals have no past participle form.
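The effect of the rule can be sketched as a recursive check: each auxiliary's COMPFORM becomes the VFORM required of its VP complement. A minimal sketch with a toy lexicon (all entries and names are illustrative, not Allen's implementation):

```python
# Sketch of VP -> (AUX COMPFORM ?s) (VP VFORM ?s): each auxiliary
# dictates the VFORM of the VP it takes as complement. The toy lexicon
# lists which VFORMs each word actually has.
LEXICON = {
    "can":   {"vforms": {"base"},    "compform": "base"},
    "might": {"vforms": {"base"},    "compform": "base"},  # modals lack pastprt
    "has":   {"vforms": {"pres"},    "compform": "pastprt"},
    "see":   {"vforms": {"base"},    "compform": None},    # main verb
    "seen":  {"vforms": {"pastprt"}, "compform": None},    # main verb
}

def vp_ok(words, required_vform):
    """Does this word sequence form a VP with the required VFORM?"""
    head, rest = words[0], words[1:]
    entry = LEXICON[head]
    if required_vform not in entry["vforms"]:
        return False              # head lacks the form the parent demands
    if entry["compform"] is None:
        return not rest           # a main verb ends the chain
    return bool(rest) and vp_ok(rest, entry["compform"])

assert vp_ok(["can", "see"], "base")                 # can see
assert vp_ok(["has", "seen"], "pres")                # has seen
assert not vp_ok(["has", "might", "seen"], "pres")   # *has might seen
```

The last case fails exactly as the slide describes: "has" demands a pastprt complement, and no modal has a past participle form.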

4 Auxiliary Verbs and Verb Phrases (III)
What about have preceding be? Sequences such as
*I must be having been singing
must still be ruled out. This is solved with two binary features:
+main: the VP contains the main verb
+pass: the VP is passive
VP → AUX[be] VP[ing, +main]
VP → AUX[be] VP[ing, +pass]
Or:
VP[+pass] → AUX[be] VP[pastprt, +main]
Passive sentences: in the passive, the VP is missing its object, so a new binary feature is defined:
VP[-passgap] → V[_np] NP
VP[+passgap] → V[_np]
PASSGAP defaults to -, and gets the value + only when a passive rule is used.
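The two PASSGAP rules split a transitive verb into an object-present and an object-missing variant. A toy sketch of that licensing decision (names and the simplified check are illustrative only):

```python
# Sketch of the two VP rules for a transitive verb (SUBCAT _np):
#   VP[-passgap] -> V[_np] NP   (object present)
#   VP[+passgap] -> V[_np]      (object absorbed by the passive)
def vp_transitive(words, passgap):
    """Illustrative check: a transitive VP omits its NP object
    exactly when the +passgap (passive) rule is used."""
    verb, objects = words[0], words[1:]
    if passgap:
        return objects == []       # passive: no object allowed
    return len(objects) == 1       # active: exactly one NP object

assert vp_transitive(["seen"], passgap=True)               # "was seen"
assert vp_transitive(["seen", "the-dog"], passgap=False)   # "has seen the dog"
assert not vp_transitive(["seen"], passgap=False)          # *"has seen" (no object)
```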

5 Movement Phenomena in Language (I)
The two most common kinds of movement are:
- yes/no questions (local, or bounded, movement)
- wh-questions (unbounded movement)
The movement itself can be handled easily, but after the transformation a constituent is missing from its usual position; the main problem is handling that missing constituent. Two new features are defined:
- GAP: the place where a subconstituent is missing
- FILLER: the constituent that has moved
What will the fat man angrily put __ in the corner?
(from: the fat man angrily put what in the corner)
Different types of movement:
1. wh-movement
2. topicalization
3. adverb preposing
4. extraposition

6 Movement Phenomena in Language (II)
[Figure 1: parse trees for an active sentence (AUX[+modal] "can" + VP[base,-passgap] with V[_np,base] "see" and object NP "Jack") and its passive counterpart (AUX[be] "was" + VP[pastprt,+passgap] with V[_np,pastprt] "seen"), illustrating the active-to-passive transform.]

7 Handling Questions in Context-Free Grammars (I)
Rule extending the CFG to cover yes/no questions:
S[+inv] → (AUX AGR ?a SUBCAT ?v) (NP AGR ?a) (VP VFORM ?v)
It enforces subject-verb agreement and the right VFORM on the VP that follows.
The GAP feature is passed from the mother down to one subconstituent until it reaches the position of the missing constituent; there, the appropriate constituent is accepted using no input. The rule below realizes this empty constituent:
(NP GAP ((CAT NP) (AGR ?a)) AGR ?a) → ε
Rules to pass the GAP feature:
1. If the head constituent is a non-lexical category, the gap goes to the head:
(S GAP ?g) → (NP GAP --) (VP GAP ?g)
2. If the head constituent is a lexical category, the gap goes to exactly one of the other subconstituents. For VP → V[_np_vp:inf] NP PP this yields:
(VP GAP ?g) → V[_np_vp:inf] (NP GAP ?g) (PP GAP --)
(VP GAP ?g) → V[_np_vp:inf] (NP GAP --) (PP GAP ?g)
Wh-words are identified as fillers and introduced by the WH feature, which has two values: Q (questions) and R (relative clauses).
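Case 2 above mechanically expands one base rule into several gapped variants, one per non-head subconstituent. A minimal sketch of that expansion (function name and string format are illustrative):

```python
# Sketch of GAP propagation for a rule with a lexical head, e.g.
#   VP -> V[_np_vp:inf] NP PP
# The GAP feature is threaded into exactly one of the non-lexical
# subconstituents, yielding one rule variant per choice.
def gapped_variants(head, subconstituents):
    """Return rule variants, each passing ?g to one subconstituent."""
    variants = []
    for i in range(len(subconstituents)):
        rhs = [head]
        for j, cat in enumerate(subconstituents):
            gap = "?g" if i == j else "--"
            rhs.append(f"({cat} GAP {gap})")
        variants.append("(VP GAP ?g) -> " + " ".join(rhs))
    return variants

for variant in gapped_variants("V[_np_vp:inf]", ["NP", "PP"]):
    print(variant)
# One variant passes the gap to the NP, the other to the PP,
# matching the two rules shown above.
```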

8 Handling Questions in Context-Free Grammars (II)
If a phrase contains a subphrase with a WH feature, the main phrase carries the same WH feature:
In what store did you buy the picture?
With the above machinery, two rules added to the CFG handle wh-questions:
S → (NP[Q,-gap] AGR ?a) (S[+inv] GAP (NP AGR ?a))
S → (PP[Q,-gap] PFORM ?p) (S[+inv] GAP (PP PFORM ?p))

9 Handling Questions in Context-Free Grammars (III): Parsing with Gaps
[Figure 2: chart for parsing "Which dogs did he see" in a CFG with gaps. The fronted NP "which dogs" (WH Q, AGR 3p) supplies the GAP value (NP AGR 3p) on the inverted S, and the gap is filled by an empty NP constituent (EMPTY-NP1) in the object position of "see".]

10 Relative Clauses
Rules to produce relative clauses:
CNP → CNP REL
REL → (NP WH R AGR ?a) (S[-inv,fin] GAP (NP AGR ?a))
REL → (PP WH R PFORM ?p) (S[-inv,fin] GAP (PP PFORM ?p))
What about that-clauses? If "that" is treated as a relative pronoun with the WH feature R, then the rules above cover that-clauses as well.
These rules do not allow the Q feature to move into relative clauses, because REL is not the head constituent:
(CNP GAP ?g) → (CNP GAP ?g) (REL GAP --)

11 The Hold Mechanism in ATNs
A data structure called the hold list maintains the constituents that are to be moved:
- HOLD action: takes a constituent and places it on the hold list.
- VIR arc: takes a constituent name and, if a matching constituent is on the hold list, removes it from the list.
- Acceptance condition: the network may only accept when the hold list is empty.
[Figure 3: an S network for questions and relative clauses. The fronted wh-phrase (WH ∩ {Q R}) triggers a HOLD action, and VIR NP arcs mark the positions where the held NP may be consumed.]
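The three operations can be sketched as a tiny data structure (a minimal illustration, not Allen's ATN code; class and method names are invented):

```python
# Minimal sketch of the ATN hold mechanism.
class HoldList:
    def __init__(self):
        self.items = []  # list of (category, constituent) pairs

    def hold(self, category, constituent):
        """HOLD action: place a constituent on the hold list."""
        self.items.append((category, constituent))

    def vir(self, category):
        """VIR arc: consume a held constituent of the named category,
        removing it from the hold list; None if nothing matches."""
        for i, (cat, c) in enumerate(self.items):
            if cat == category:
                return self.items.pop(i)[1]
        return None

    def accept_ok(self):
        """Acceptance condition: the hold list must be empty."""
        return not self.items

h = HoldList()
h.hold("NP", "which dogs")          # fronted wh-phrase is held
assert not h.accept_ok()            # cannot accept while it is pending
assert h.vir("NP") == "which dogs"  # a VIR NP arc uses it as the gap filler
assert h.accept_ok()                # now the network may accept
```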

12 Comparing the Methods
Three important properties: (1) coverage, (2) selectivity, (3) conciseness.
1. Coverage: both methods have the necessary coverage.
2. Selectivity: the ATN overaccepts; for example
*Who did the man who saw __ hit the boy?
is accepted by the ATN, because the held wh-filler can wrongly be used inside the embedded relative clause. One remedy is a mechanism of HIDE and UNHIDE actions:
- HIDE: hides the existing constituents on the hold list.
- UNHIDE: makes the hidden constituents visible again.
The HIDE action can be executed before the HOLD action.
3. Conciseness: it is hard to write a grammar that handles the movement constraints concisely.

13 Gap Threading
Gap threading combines the two previously described methods and is often used in logic programming. A constituent is given four arguments:
S(position-in, position-out, filler-in, filler-out)
- position-in: the beginning of the phrase
- position-out: the end of the phrase
- filler-in: the list of fillers available in the current constituent
- filler-out: the resulting list of fillers that were not used
This method is much like a logic-programming implementation of the hold-list method, but because the grammar writer controls whether the filler list is propagated into a constituent, it can avoid the overacceptance problem discussed in the previous section.
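A minimal sketch of filler threading in Python (function names, the toy word list, and the simplified NP test are all illustrative; a real grammar would thread the position and filler arguments through every rule):

```python
# Sketch of gap threading: each constituent takes a list of pending
# fillers (filler-in) and returns those it did not use (filler-out).
def parse_np(words, pos, fillers):
    """Parse an NP starting at pos; an empty NP may consume a pending
    filler instead of consuming input. Returns (new_pos, filler_out),
    or None on failure. The word test here is a toy stand-in."""
    if pos < len(words) and words[pos] in ("he", "the-dogs"):
        return pos + 1, fillers        # ordinary NP: consumes input
    if fillers:
        return pos, fillers[1:]        # gap: consumes a filler, no input
    return None

# "which dogs did he see __": the fronted wh-phrase enters as a filler,
# and the object NP position consumes it without consuming any input.
words = ["did", "he", "see"]
pos, fillers = parse_np(words, 1, ["which-dogs"])  # subject "he"
assert (pos, fillers) == (2, ["which-dogs"])       # filler still pending
pos, fillers = parse_np(words, 3, fillers)         # object gap after "see"
assert (pos, fillers) == (3, [])                   # filler list threaded out empty
```

Because each rule decides whether to pass its incoming fillers on to a subconstituent, an embedded relative clause can simply be given an empty filler-in list, blocking the overacceptance the HIDE/UNHIDE actions were needed for.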