Chart Parsing (Allen's Chapter 3; Jurafsky & Martin's Chapter 10)

2 Chart Parsing

General Principles: a bottom-up parsing method
– Construct a parse starting from the input symbols
– Build constituents from sub-constituents
– When all constituents on the RHS of a rule are matched, create a constituent for the LHS of the rule

The Chart allows storing partial analyses, so that they can be shared.

Data structures used by the algorithm:
– The Key: the current constituent we are attempting to match
– An Active Arc: a grammar rule whose RHS is partially matched
– The Agenda: keeps track of newly found, unprocessed constituents
– The Chart: records processed constituents (non-terminals) that span substrings of the input
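The data structures above can be sketched in Python. This is a minimal illustration: the class name, field names, and the tuple layout for agenda entries are my own, not from the slides.

```python
from dataclasses import dataclass

# An active arc is a dotted grammar rule plus the input span it covers so far.
@dataclass
class Arc:
    lhs: str      # left-hand side, e.g. "NP"
    rhs: tuple    # right-hand side, e.g. ("ART", "ADJ", "N")
    dot: int      # how many RHS symbols have been matched so far
    start: int    # input position where the arc began

    @property
    def complete(self) -> bool:
        # A completed arc means every RHS symbol has been matched.
        return self.dot == len(self.rhs)

# The Agenda holds newly found constituents as (category, start, end);
# the Chart records the constituents already processed.
agenda = [("ART", 0, 1)]
chart = []
arc = Arc("NP", ("ART", "ADJ", "N"), dot=1, start=0)  # NP -> ART . ADJ N
```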

3 Chart Parsing

Steps in the process (input is processed left-to-right, one word at a time):
1. Find all POS of the current word (terminal level)
2. Initialize the Agenda with all POS of the word
3. Pick a Key from the Agenda
4. Add all grammar rules that start with the Key as active arcs
5. Extend any existing active arcs with the Key
6. Add LHS constituents of newly completed rules to the Agenda
7. Add the Key to the Chart
8. If the Agenda is not empty, go to (3); else go to (1)

4 The Chart Parsing Algorithm

Extending active arcs with a Key:
– Each active arc has the form [A → X1 … • C … Xm]
– A Key constituent has the form C
– When processing the Key C, we search the active-arc list for an arc [A → X1 … • C … Xm], and then create a new active arc [A → X1 … C • … Xm]
– If the new active arc is a completed rule [A → X1 … C •], then we add A to the Agenda
– After "using" the Key to extend all relevant arcs, it is entered into the Chart
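The extension step can be sketched as follows. The tuple layout (lhs, rhs, dot, start, end) and the function name are my own; spans, which the slide leaves implicit, are made explicit so that an arc is only extended by a key that starts where the arc ends.

```python
def extend_arcs(active_arcs, key, agenda):
    """Advance the dot over every arc whose next symbol matches the key.

    active_arcs: list of (lhs, rhs, dot, start, end) dotted rules
    key:         (category, start, end) constituent being processed
    agenda:      list that completed LHS constituents are appended to
    Returns the new (still-incomplete) extended arcs.
    """
    category, k_start, k_end = key
    new_arcs = []
    for lhs, rhs, dot, start, end in active_arcs:
        if dot < len(rhs) and rhs[dot] == category and end == k_start:
            if dot + 1 == len(rhs):
                # Completed rule [A -> X1 ... C .]: add A to the agenda.
                agenda.append((lhs, start, k_end))
            else:
                new_arcs.append((lhs, rhs, dot + 1, start, k_end))
    return new_arcs

# Example: NP -> ART . ADJ N covering positions 0-1,
# extended by the key (ADJ, 1, 2):
agenda = []
arcs = [("NP", ("ART", "ADJ", "N"), 1, 0, 1)]
new = extend_arcs(arcs, ("ADJ", 1, 2), agenda)
```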

5 The Chart Parsing Algorithm

The main algorithm, parsing input x = x1 … xn:
1. i = 0
2. If the Agenda is empty and i < n: set i = i + 1, look up the POS categories of word xi, and add each category C to the Agenda
3. Pick a Key constituent C from the Agenda
4. For each grammar rule of the form A → C X1 … Xm, add [A → • C X1 … Xm] to the list of active arcs
5. Use the Key to extend all relevant active arcs
6. Add the LHS of any completed active arcs to the Agenda
7. Insert the Key into the Chart
8. If the Key is S (spanning the entire input), accept; else go to (2)
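The main algorithm above can be written as a runnable Python recognizer over the example grammar used on the following slides. The span bookkeeping and data layout are my own additions (the slides leave positions implicit), and acceptance is checked by requiring an S constituent that spans the whole input.

```python
GRAMMAR = [
    ("S",  ("NP", "VP")),
    ("NP", ("ART", "ADJ", "N")),
    ("NP", ("ART", "N")),
    ("NP", ("ADJ", "N")),
    ("VP", ("AUX", "VP")),
    ("VP", ("V", "NP")),
]
LEXICON = {"the": ["ART"], "large": ["ADJ"],
           "can": ["N", "AUX", "V"], "hold": ["N", "V"],
           "water": ["N", "V"]}

def chart_parse(words):
    n = len(words)
    chart, active, agenda = [], [], []
    i = 0
    while True:
        if not agenda:                              # steps 1-2
            if i == n:
                break
            agenda = [(pos, i, i + 1) for pos in LEXICON[words[i]]]
            i += 1
        cat, start, end = agenda.pop()              # step 3: pick a Key
        # Step 4: add arcs for rules whose RHS starts with the Key's category.
        for lhs, rhs in GRAMMAR:
            if rhs[0] == cat:
                active.append((lhs, rhs, 0, start, start))
        # Steps 5-6: extend arcs whose next symbol matches the Key.
        for lhs, rhs, dot, a_start, a_end in list(active):
            if dot < len(rhs) and rhs[dot] == cat and a_end == start:
                if dot + 1 == len(rhs):
                    agenda.append((lhs, a_start, end))   # completed rule
                else:
                    active.append((lhs, rhs, dot + 1, a_start, end))
        chart.append((cat, start, end))             # step 7
    return ("S", 0, n) in chart                     # accept if S spans input

print(chart_parse("the large can can hold the water".split()))
```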

6 Chart Parsing - Example

The Grammar:
(1) S → NP VP
(2) NP → ART ADJ N
(3) NP → ART N
(4) NP → ADJ N
(5) VP → AUX VP
(6) VP → V NP

7 Chart Parsing - Example

The input: x = "The large can can hold the water"

POS of the input words:
– the: ART
– large: ADJ
– can: N, AUX, V
– hold: N, V
– water: N, V

8-12 Chart diagrams tracing the parse of "The large can can hold the water" word by word; slide 12 shows the final chart.

13 Time Complexity of Chart Parsing

– The algorithm iterates over each word of the input (n iterations)
– How many Keys can be created in a single iteration i? Each Key in the Agenda is a constituent C spanning positions j to i, with 1 ≤ j ≤ i, so there are O(n) Keys
– Each Key C can extend active arcs of the form [A → X1 … • C … Xm] spanning positions k to j, for 1 ≤ k ≤ j, so each Key can extend at most O(n) active arcs
– The time required for each iteration is thus O(n²)
– The time bound on the entire algorithm is therefore O(n³)

14 Improved Chart Parser

Increasing the efficiency of the algorithm:
Observation: the algorithm creates constituents that cannot "fit" into a global parse structure. Example: [NP → ADJ N]. These can be eliminated using predictions.

Using top-down predictions:
– For each non-terminal A, define First(A) = the set of all leftmost non-terminals derivable from A
– Compute First(A) in advance for all non-terminals A
– When processing the Key C, we look for grammar rules that would form an active arc [A → • C X2 … Xm]
– Add the active arc only if A is predicted: there already exists an active arc [B → X1 … • D … Xm] such that A ∈ First(D)
– For arcs starting at the beginning of the input (position k = 1), add the active arc only if A ∈ First(S)
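First(A) as defined above can be precomputed by a standard fixed-point iteration. This sketch uses my own grammar encoding, keeps only non-terminals in the sets (POS categories are treated as terminal level here), and assumes A ∈ First(A), since an arc expecting D directly predicts D itself.

```python
GRAMMAR = [
    ("S",  ("NP", "VP")),
    ("NP", ("ART", "ADJ", "N")),
    ("NP", ("ART", "N")),
    ("NP", ("ADJ", "N")),
    ("VP", ("AUX", "VP")),
    ("VP", ("V", "NP")),
]
NONTERMINALS = {lhs for lhs, _ in GRAMMAR}

def first_sets(grammar):
    # Start from First(A) = {A} and propagate leftmost non-terminals
    # along the rules until nothing changes (fixed point).
    first = {a: {a} for a in NONTERMINALS}
    changed = True
    while changed:
        changed = False
        for lhs, rhs in grammar:
            leftmost = rhs[0]
            if leftmost in NONTERMINALS:
                new = first[leftmost] - first[lhs]
                if new:
                    first[lhs] |= new
                    changed = True
    return first

FIRST = first_sets(GRAMMAR)
# The filter: an arc [A -> . C X2 ... Xm] is only added if A is in
# First(D) for some symbol D expected by an existing active arc.
```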

15-19 Improved Chart Parser - Example: chart diagrams tracing the improved parser on "The large can can hold the water".