Basic Parsing with Context-Free Grammars
Some slides adapted from Julia Hirschberg and Dan Jurafsky

• To view past videos: ◦ p?c=133ae14752e27fde909fdbd64c06b337
• Usually available only for 1 week. Right now, available for all previous lectures.


• Declarative formalisms like CFGs and FSAs define the legal strings of a language, but only tell you "this is a legal string of language X".
• Parsing algorithms specify how to recognize the strings of a language and assign each string one (or more) syntactic analyses.

• Many possible CFGs for English; here is an example fragment:
◦ S → NP VP
◦ VP → V NP
◦ NP → Det N | Adj NP
◦ N → boy | girl
◦ V → sees | likes
◦ Adj → big | small
◦ Det → a | the
• Example strings:
◦ *big the small girl sees a boy
◦ John likes a girl
◦ I like a girl
◦ I sleep
◦ The old dog the footsteps of the young
◦ the small boy likes a girl
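
To make the fragment concrete, here is one way to encode it in Python; the dict layout and the name GRAMMAR are my own, not from the slides. Each non-terminal maps to its list of right-hand sides, and any symbol that is not a key (e.g. "boy") is a terminal word.

# CFG fragment from the slide, encoded as a Python dict (layout is my own).
# Keys are non-terminals; each RHS is a list of symbols. Symbols that do
# not appear as keys are terminal words.
GRAMMAR = {
    "S":   [["NP", "VP"]],
    "VP":  [["V", "NP"]],
    "NP":  [["Det", "N"], ["Adj", "NP"]],
    "N":   [["boy"], ["girl"]],
    "V":   [["sees"], ["likes"]],
    "Adj": [["big"], ["small"]],
    "Det": [["a"], ["the"]],
}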

S  NP VPVP  V S  Aux NP VPVP -> V PP S -> VPPP -> Prep NP NP  Det NomN  old | dog | footsteps | young | flight NP  PropNV  dog | include | prefer | book NP -> Pronoun Nom -> Adj NomAux  does Nom  NPrep  from | to | on | of Nom  N NomPropN  Bush | McCain | Obama Nom  Nom PPDet  that | this | a| the VP  V NPAdj -> old | green | red

Parse tree for "The old dog the footsteps of the young" under the prior CFG:

(S (NP (Det The) (Nom (N old)))
   (VP (V dog)
       (NP (Det the)
           (Nom (Nom (N footsteps))
                (PP (Prep of)
                    (NP (Det the) (Nom (N young))))))))

• Searching FSAs
◦ Finding the right path through the automaton
◦ Search space defined by the structure of the FSA
• Searching CFGs
◦ Finding the right parse tree among all possible parse trees
◦ Search space defined by the grammar
• Constraints provided by the input sentence and the automaton or grammar

• Builds from the root S node down to the leaves
• Expectation-based
• Common search strategy:
◦ Top-down, left-to-right, backtracking
◦ Try the first rule with LHS = S
◦ Next, expand all constituents in these trees/rules
◦ Continue until leaves are POS categories
◦ Backtrack when a candidate POS does not match the input string
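
A minimal top-down, depth-first recognizer in this spirit might look like the sketch below. The function names and interface are my own; it assumes the dict-style grammar shown earlier, and, like any naive top-down parser, it loops forever on left-recursive rules, a problem later slides return to.

def expand(symbol, words, pos, grammar):
    """Top-down expansion: return every input position at which `symbol`
    can finish, starting the match at `pos`. Terminals must match the
    next word; non-terminals are expanded rule by rule (backtracking is
    implicit in trying every alternative)."""
    if symbol not in grammar:                      # terminal word
        return {pos + 1} if pos < len(words) and words[pos] == symbol else set()
    ends = set()
    for rhs in grammar[symbol]:                    # try each rule for symbol
        frontier = {pos}
        for child in rhs:                          # left-to-right through the RHS
            frontier = {e for p in frontier
                          for e in expand(child, words, p, grammar)}
        ends |= frontier
    return ends

def recognize(words, grammar, start="S"):
    return len(words) in expand(start, words, 0, grammar)

# e.g. recognize("the boy sees a girl".split(), GRAMMAR)  ->  True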

• “The old dog the footsteps of the young.”
• Where does backtracking happen?
• What are the computational disadvantages?
• What are the advantages?

• Parser begins with the words of the input and builds up trees, applying grammar rules whose RHS matches:

           The  old  dog  the  footsteps  of    the  young
Reading 1: Det  N    V    Det  N          Prep  Det  N
Reading 2: Det  Adj  N    Det  N          Prep  Det  N

• Parse continues until an S root node is reached or no further node expansion is possible


• When does disambiguation occur?
• What are the computational advantages and disadvantages?

• Top-Down parsers: never explore illegal parses (e.g., trees that can't form an S), but waste time on trees that can never match the input
• Bottom-Up parsers: never explore trees inconsistent with the input, but waste time exploring illegal parses (with no S root)
• For both: find a control strategy. How to explore the search space efficiently?
◦ Pursue all parses in parallel, or backtrack, or …?
◦ Which rule to apply next?
◦ Which node to expand next?

Dynamic programming approaches use a chart to represent partial results.
• CKY Parsing Algorithm
◦ Bottom-up
◦ Grammar must be in Chomsky Normal Form
◦ The parse tree might not be consistent with linguistic theory
• Earley Parsing Algorithm
◦ Top-down
◦ Expectations about constituents are confirmed by input
◦ A POS tag for a word that is not predicted is never added
• Chart Parser
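
Since CKY is only named here, a compact sketch may help. The interfaces are my own, not from the slides: binary rules are inverted into a (B, C) → {A} map, the lexicon maps each word to the preterminals that can yield it, and the grammar is assumed to already be in Chomsky Normal Form.

def cky_recognize(words, binary_rules, lexicon, start="S"):
    """CKY recognizer sketch. table[i][j] holds every non-terminal that
    can derive words[i:j]; spans are filled shortest-first (bottom-up)."""
    n = len(words)
    table = [[set() for _ in range(n + 1)] for _ in range(n)]
    for i, w in enumerate(words):                 # width-1 spans from the lexicon
        table[i][i + 1] = set(lexicon.get(w, ()))
    for width in range(2, n + 1):                 # longer spans from binary rules
        for i in range(n - width + 1):
            j = i + width
            for k in range(i + 1, j):             # every split point
                for b in table[i][k]:
                    for c in table[k][j]:
                        table[i][j] |= binary_rules.get((b, c), set())
    return start in table[0][n]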

• Allows arbitrary CFGs
• Fills a table in a single sweep over the input words
◦ Table is length N+1, where N is the number of words
◦ Table entries represent:
- Completed constituents and their locations
- In-progress constituents
- Predicted constituents

• The table entries are called states and are represented with dotted rules:

S → · VP            a VP is predicted
NP → Det · Nominal  an NP is in progress
VP → V NP ·         a VP has been found

• It would be nice to know where these things are in the input, so:

S → · VP [0,0]            a VP is predicted at the start of the sentence
NP → Det · Nominal [1,2]  an NP is in progress; the Det goes from 1 to 2
VP → V NP · [0,3]         a VP has been found starting at 0 and ending at 3
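
A direct way to represent such states in code is a small immutable record; this is a sketch, and the field names are my own.

from dataclasses import dataclass

@dataclass(frozen=True)
class State:
    lhs: str           # left-hand side of the dotted rule, e.g. "VP"
    rhs: tuple         # right-hand side symbols, e.g. ("V", "NP")
    dot: int           # how many RHS symbols have been recognized so far
    start: int         # input position where this constituent begins
    end: int           # input position of the dot (how far we have gotten)

    def next_symbol(self):
        return self.rhs[self.dot] if self.dot < len(self.rhs) else None

    def is_complete(self):
        return self.dot == len(self.rhs)

# VP -> V NP . [0,3] becomes State("VP", ("V", "NP"), 2, 0, 3)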


• As with most dynamic programming approaches, the answer is found by looking in the table in the right place.
• In this case, there should be a complete S state in the final column that spans from 0 to N.
• If that's the case, you're done:
◦ S → α · [0,N]

• March through the chart left to right.
• At each step, apply one of three operators (a full code sketch follows the control-loop slide below):
◦ Predictor: create new states representing top-down expectations
◦ Scanner: match word predictions (rule with a POS category after the dot) against the input words
◦ Completer: when a state is complete, see what rules were looking for that completed constituent

• Given a state
◦ with a non-terminal to the right of the dot (not a part-of-speech category),
◦ create a new state for each expansion of the non-terminal,
◦ placing these new states into the same chart entry as the generating state, beginning and ending where the generating state ends.
• So the Predictor, looking at
◦ S → · VP [0,0]
• results in
◦ VP → · Verb [0,0]
◦ VP → · Verb NP [0,0]

• Given a state
◦ with a non-terminal to the right of the dot that is a part-of-speech category,
◦ if the next word in the input matches this POS,
◦ create a new state with the dot moved over the non-terminal.
• So the Scanner, looking at VP → · Verb NP [0,0]:
◦ if the next word, "book", can be a verb, add the new state
- VP → Verb · NP [0,1]
◦ Add this state to the chart entry following the current one.
◦ Note: the Earley algorithm uses top-down input to disambiguate POS! Only a POS predicted by some state can get added to the chart!

• Applied to a state when its dot has reached the right end of the rule.
• The parser has discovered a category over some span of input.
• Find and advance all previous states that were looking for this category:
◦ copy state, move dot, insert in current chart entry
• Given:
◦ NP → Det Nominal · [1,3]
◦ VP → Verb · NP [0,1]
• Add:
◦ VP → Verb NP · [0,3]

• Find a complete S state in the final column that spans from 0 to N.
• If that's the case, you're done:
◦ S → α · [0,N]

• More specifically…
1. Predict all the states you can upfront.
2. Read a word:
   1. Extend states based on matches
   2. Add new predictions
   3. Go to step 2
3. Look at the final chart entry (column N) to see if you have a winner.
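
Putting the three operators and this control loop together yields the following recognizer sketch, built on the State record above. The grammar/lexicon interface is my own, not from the slides: `grammar` maps non-terminals to lists of RHS tuples, and `pos_tags` maps each word to the POS categories it can have.

def earley_recognize(words, grammar, pos_tags, start="S"):
    n = len(words)
    chart = [set() for _ in range(n + 1)]
    chart[0].add(State("GAMMA", (start,), 0, 0, 0))     # dummy start state
    for i in range(n + 1):
        agenda = list(chart[i])
        while agenda:
            st = agenda.pop()
            nxt = st.next_symbol()
            if st.is_complete():                        # COMPLETER
                for prev in list(chart[st.start]):
                    if prev.next_symbol() == st.lhs:
                        add(chart, agenda, i,
                            State(prev.lhs, prev.rhs, prev.dot + 1, prev.start, i))
            elif nxt in grammar:                        # PREDICTOR
                for rhs in grammar[nxt]:
                    add(chart, agenda, i, State(nxt, tuple(rhs), 0, i, i))
            elif i < n and nxt in pos_tags.get(words[i], ()):   # SCANNER
                chart[i + 1].add(State(st.lhs, st.rhs, st.dot + 1, st.start, i + 1))
    return any(s.lhs == "GAMMA" and s.is_complete() for s in chart[n])

def add(chart, agenda, i, state):
    """Enqueue a state unless this chart entry already has it."""
    if state not in chart[i]:
        chart[i].add(state)
        agenda.append(state)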

• Book that flight
• We should find… an S from 0 to 3 that is a completed state…
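
Under the recognizer sketch above, this example might be wired up as follows; the mini-grammar here is a hand-picked subset, just enough for the sentence. Note that although "book" is listed as both V and N, only the V reading ever enters the chart, because only a Verb is predicted at position 0 (the top-down filtering noted on the Scanner slide).

grammar = {
    "S":   [("VP",)],
    "VP":  [("V", "NP")],
    "NP":  [("Det", "Nom")],
    "Nom": [("N",)],
}
pos_tags = {"book": {"V", "N"}, "that": {"Det"}, "flight": {"N"}}

print(earley_recognize("book that flight".split(), grammar, pos_tags))  # True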

S  NP VPVP  V S  Aux NP VPPP -> Prep NP NP  Det NomN  old | dog | footsteps | young NP  PropNV  dog | include | prefer Nom -> Adj NomAux  does Nom  NPrep  from | to | on | of Nom  N NomPropN  Bush | McCain | Obama Nom  Nom PPDet  that | this | a| the VP  V NPAdj -> old | green | red


• What kind of algorithm did we just describe?
◦ Not a parser: a recognizer
• The presence of an S state with the right attributes in the right place indicates a successful recognition.
• But no parse tree… no parser.
• That's how we solve (not) an exponential problem in polynomial time.

• With the addition of a few pointers, we have a parser.
• Augment the Completer to point to where we came from.


• All the possible parses for an input are in the table.
• We just need to read off all the backpointers from every complete S in the last column of the table:
◦ Find all the S → X · [0,N]
◦ Follow the structural traces from the Completer
• Of course, this won't be polynomial time, since there could be an exponential number of trees.
• We can at least represent ambiguity efficiently.
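
One hypothetical way to add those pointers to the sketch above: have the Completer record, on each advanced state, which completed child state licensed the move; trees then fall out by recursion. This layout is my own, and a full parser needs extra care to keep multiple traces per state when the input is ambiguous.

from collections import defaultdict

backpointers = defaultdict(list)     # advanced state -> completed child states

def complete_with_pointers(prev, child, i, chart, agenda):
    """Completer step that also threads a structural trace."""
    new = State(prev.lhs, prev.rhs, prev.dot + 1, prev.start, i)
    backpointers[new] = backpointers[prev] + [child]   # copy trace, extend it
    if new not in chart[i]:
        chart[i].add(new)
        agenda.append(new)

def read_tree(state):
    """Unwind backpointers into a nested (label, children) tree; states with
    no recorded children bottom out at their RHS symbols (the words' POS)."""
    kids = backpointers.get(state, [])
    return (state.lhs, [read_tree(k) for k in kids] if kids else list(state.rhs))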

• Depth-first search will never terminate if the grammar is left recursive (e.g. NP → NP PP)

• Solutions:
◦ Rewrite the grammar (automatically?) to a weakly equivalent one which is not left-recursive, e.g. for "the man {on the hill with the telescope…}":

NP → NP PP   (wanted: Nom plus a sequence of PPs)
NP → Nom PP
NP → Nom
Nom → Det N

…becomes…

NP → Nom NP'
Nom → Det N
NP' → PP NP'   (wanted: a sequence of PPs)
NP' → ε

◦ Not so obvious what these rules mean…
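
The immediate case of this transformation is mechanical enough to automate. Below is a textbook-style sketch; the function name and the empty-list encoding of ε are mine.

def eliminate_immediate_left_recursion(grammar):
    """For each A with rules A -> A alpha | beta, produce A -> beta A' and
    A' -> alpha A' | eps (ε encoded as the empty RHS []). Non-immediate
    left recursion, as on the next slide, needs more work."""
    out = {}
    for a, rules in grammar.items():
        recursive = [rhs[1:] for rhs in rules if rhs and rhs[0] == a]
        base      = [rhs     for rhs in rules if not rhs or rhs[0] != a]
        if not recursive:
            out[a] = rules
            continue
        a_prime = a + "'"
        out[a]       = [rhs + [a_prime] for rhs in base]
        out[a_prime] = [alpha + [a_prime] for alpha in recursive] + [[]]
    return out

# e.g. {"NP": [["NP", "PP"], ["Nom"]]}  ->
#      {"NP": [["Nom", "NP'"]], "NP'": [["PP", "NP'"], []]}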

◦ Harder to detect and eliminate non-immediate left recursion:
- NP → Nom PP
- Nom → NP
◦ Fix the depth of search explicitly
◦ Rule ordering: non-recursive rules first
- NP → Det Nom
- NP → NP PP

• Multiple legal structures
◦ Attachment (e.g. I saw a man on a hill with a telescope)
◦ Coordination (e.g. younger cats and dogs)
◦ NP bracketing (e.g. Spanish language teachers)

NP vs. VP Attachment

• Solution?
◦ Return all possible parses and disambiguate using "other methods"

• Parsing is a search problem which may be implemented with many control strategies.
◦ Top-Down and Bottom-Up approaches each have problems; combining the two solves some but not all issues:
- Left recursion
- Syntactic ambiguity
• Next time: making use of statistical information about syntactic constituents.
◦ Read Ch 14