
1 Syntax Sudeshna Sarkar 25 Aug 2008

2 Sentence-Types Declaratives: A plane left S -> NP VP Imperatives: Leave! S -> VP Yes-No Questions: Did the plane leave? S -> Aux NP VP WH Questions: When did the plane leave? S -> WH Aux NP VP
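
To make the rule set concrete, here is a minimal sketch using NLTK's CFG tools; the tiny lexicon (plane, leave, did, when) is invented here just so the example sentences parse, and is not part of the slide's grammar.

```python
import nltk

# The four sentence-type rules from the slide, plus a made-up mini lexicon
# so the example sentences can actually be parsed.
grammar = nltk.CFG.fromstring("""
S -> NP VP | VP | Aux NP VP | WH Aux NP VP
NP -> Det N
VP -> V | V NP
Det -> 'a' | 'the'
N -> 'plane'
V -> 'left' | 'leave'
Aux -> 'did'
WH -> 'when'
""")

parser = nltk.ChartParser(grammar)
for tree in parser.parse("when did the plane leave".split()):
    tree.pretty_print()   # uses the S -> WH Aux NP VP rule
```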

3 Recursion We’ll have to deal with rules such as the following, where the non-terminal on the left also appears somewhere on the right (directly). Nominal -> Nominal PP [[flight] [to Boston]] VP -> VP PP [[departed Miami] [at noon]]

4 Recursion Of course, this is what makes syntax interesting flights from Denver Flights from Denver to Miami Flights from Denver to Miami in February Flights from Denver to Miami in February on a Friday Flights from Denver to Miami in February on a Friday under $300 Flights from Denver to Miami in February on a Friday under $300 with lunch

5 The Point If you have a rule like VP -> V NP It only cares that the thing after the verb is an NP. It doesn’t have to know about the internal affairs of that NP

6 The Point

7 Conjunctive Constructions S -> S and S John went to NY and Mary followed him NP -> NP and NP VP -> VP and VP … In fact the right rule for English is X -> X and X

8 Problems Agreement Subcategorization Movement (for want of a better term)

9 Agreement This dog Those dogs This dog eats Those dogs eat *This dogs *Those dog *This dog eat *Those dogs eats

10 Agreement In English, subjects and verbs have to agree in person and number Determiners and nouns have to agree in number Many languages have agreement systems that are far more complex than this.

11 Subcategorization Sneeze: John sneezed Find: Please find [a flight to NY] NP Give: Give [me] NP [a cheaper fare] NP Help: Can you help [me] NP [with a flight] PP Prefer: I prefer [to leave earlier] TO-VP Told: I was told [United has a flight] S …

12 Subcategorization *John sneezed the book *I prefer United has a flight *Give with a flight Subcat expresses the constraints that a predicate (verb for now) places on the number and syntactic types of arguments it wants to take (occur with).
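
One way to see what subcategorization buys us is to encode the frames as data and check candidate argument lists against them. This dictionary encoding is just an illustration made up here, not a claim about how a real parser stores frames.

```python
# Toy subcategorization table: each verb maps to the argument patterns it
# allows inside the VP (frames taken from the slide's examples).
SUBCAT = {
    "sneeze": {()},                    # intransitive: "John sneezed"
    "find":   {("NP",)},               # "find [a flight to NY]"
    "give":   {("NP", "NP")},          # "give [me] [a cheaper fare]"
    "help":   {("NP", "PP")},          # "help [me] [with a flight]"
    "prefer": {("TO-VP",)},            # "prefer [to leave earlier]"
}

def licensed(verb, args):
    """True if this verb permits this sequence of argument types."""
    return tuple(args) in SUBCAT.get(verb, set())

print(licensed("sneeze", []))          # True
print(licensed("sneeze", ["NP"]))      # False: "*John sneezed the book"
```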

13 So? So the various rules for VPs overgenerate. They permit the presence of strings containing verbs and arguments that don’t go together. For example, VP -> V NP therefore licenses “Sneezed the book” as a VP, since “sneeze” is a verb and “the book” is a valid NP.

14 So What? Now overgeneration is a problem for a generative approach. The grammar is supposed to account for all and only the strings in a language. From a practical point of view... it’s not so clear that there’s a problem. Why?

15 Possible CFG Solution S -> NP VP NP -> Det Nominal VP -> V NP … SgS -> SgNP SgVP PlS -> PlNP PlVP SgNP -> SgDet SgNom PlNP -> PlDet PlNom PlVP -> PlV NP SgVP -> SgV NP …
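
Spelled out as a runnable grammar, the duplicated-category trick looks like this; the lexical entries are invented, and the VP rules are simplified to intransitives so the slide's earlier examples parse.

```python
import nltk

# Agreement enforced purely by duplicating categories, as on the slide.
grammar = nltk.CFG.fromstring("""
S -> SgS | PlS
SgS -> SgNP SgVP
PlS -> PlNP PlVP
SgNP -> SgDet SgNom
PlNP -> PlDet PlNom
SgVP -> SgV
PlVP -> PlV
SgDet -> 'this'
PlDet -> 'those'
SgNom -> 'dog'
PlNom -> 'dogs'
SgV -> 'eats'
PlV -> 'eat'
""")

parser = nltk.ChartParser(grammar)
print(len(list(parser.parse("this dog eats".split()))))   # 1: grammatical
print(len(list(parser.parse("this dogs eat".split()))))   # 0: *this dogs
```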

16 CFG Solution for Agreement It works and stays within the power of CFGs. But it’s ugly. And it doesn’t scale all that well.

17 Forward Pointer It turns out that verb subcategorization facts will provide a key element for semantic analysis (determining who did what to whom in an event).

18 Movement Core (canonical) example My travel agent booked the flight

19 Movement Core example [[My travel agent] NP [booked [the flight] NP ] VP ] S I.e. “book” is a straightforward transitive verb. It expects a single NP argument within the VP, and a single NP as the subject.

20 Movement What about? Which flight do you want me to have the travel agent book? The direct object argument to “book” isn’t appearing in the right place. It is in fact a long way from where it’s supposed to appear. And note that it’s separated from its verb by 2 other verbs.

21 The Point CFGs appear to be just about what we need to account for a lot of basic syntactic structure in English. But there are problems. They can be dealt with adequately, although not elegantly, by staying within the CFG framework. There are simpler, more elegant solutions that take us out of the CFG framework (beyond its formal power).

22 Grammars Before you can parse you need a grammar. So where do grammars come from? Grammar Engineering: lovingly hand-crafted, decades-long efforts by humans to write grammars (typically in some particular grammar formalism of interest to the linguists developing the grammar). TreeBanks: semi-automatically generated sets of parse trees for the sentences in some corpus, typically in a generic lowest-common-denominator formalism (of no particular interest to any modern linguist).

23 TreeBank Grammars Reading off the grammar… The grammar is the set of rules (local subtrees) that occur in the annotated corpus. They tend to avoid recursion (and elegance and parsimony), i.e. they tend to be flat and redundant. Penn TreeBank (III) has about 17,500 grammar rules under this definition.
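
Reading a grammar off a treebank really is this mechanical. Here is a sketch using the small Penn Treebank sample that ships with NLTK (the full treebank behaves the same way, only with far more rules).

```python
from collections import Counter

import nltk
nltk.download("treebank", quiet=True)       # fetch the bundled WSJ sample
from nltk.corpus import treebank

# Every local subtree contributes one rule; counting them gives the grammar.
rules = Counter(
    prod for sent in treebank.parsed_sents() for prod in sent.productions()
)
print(len(rules), "distinct rules in the sample")
for prod, count in rules.most_common(5):
    print(count, prod)
```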

24 TreeBanks

25 TreeBanks

26 Sample Rules

27 Example

28 TreeBanks TreeBanks provide a grammar (of a sort). As we’ll see, they also provide the training data for various ML approaches to parsing. But they can also provide useful data for more purely linguistic pursuits. You might have a theory about whether or not something can happen in a particular language, or a theory about the contexts in which something can happen. TreeBanks can give you the means to explore those theories, if you can formulate the questions in the right way and get the data you need.

29 Tgrep You might, for example, like to grep through a file filled with trees; tgrep is a tool for doing exactly that.
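
Programmatically, the same kind of search is easy over parsed trees; a small sketch with NLTK's Tree class (the tree literal below is made up for illustration):

```python
from nltk.tree import Tree

t = Tree.fromstring(
    "(S (NP (NNP John)) (VP (VBD booked) (NP (DT the) (NN flight))))"
)

# tgrep-style search: pull out every NP subtree and print its words.
for np in t.subtrees(filter=lambda st: st.label() == "NP"):
    print(" ".join(np.leaves()))
```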

30 TreeBanks Finally, you should have noticed a bit of a circular argument here. Treebanks provide a grammar because we can read the rules of the grammar out of the treebank. But how did the trees get in there in the first place? There must have been a grammar theory in there someplace…

31 TreeBanks Typically, not all of the sentences are hand-annotated by humans. They’re automatically parsed and then hand-corrected.

32 Parsing Parsing with CFGs refers to the task of assigning correct trees to input strings. Correct here means a tree that covers all and only the elements of the input and has an S at the top. It doesn’t actually mean that the system can select the correct tree from among all the possible trees.

33 Parsing As with everything of interest, parsing involves a search, which involves the making of choices. We’ll start with some basic (meaning bad) methods before moving on to the one or two that you need to know.

34 For Now Assume… You have all the words already in some buffer. The input isn’t POS tagged. We won’t worry about morphological analysis. All the words are known.

35 Top-Down Parsing Since we’re trying to find trees rooted with an S (sentence), start with the rules that give us an S. Then work your way down from there to the words.
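
NLTK's RecursiveDescentParser is a textbook top-down parser and makes the strategy easy to watch; the toy grammar and sentence here are invented for the sketch (and deliberately avoid left recursion, which would send a top-down parser into a loop).

```python
import nltk

grammar = nltk.CFG.fromstring("""
S -> NP VP
NP -> Det N
VP -> V NP
Det -> 'the'
N -> 'agent' | 'flight'
V -> 'booked'
""")

# Top-down: expand from S toward the words.
parser = nltk.RecursiveDescentParser(grammar)
for tree in parser.parse("the agent booked the flight".split()):
    print(tree)
```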

36 Top Down Space

37 Bottom-Up Parsing Of course, we also want trees that cover the input words. So start with trees that link up with the words in the right way. Then work your way up from there.
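
The mirror-image strategy is available as NLTK's ShiftReduceParser, which starts from the words and reduces upward; the same invented toy grammar is reused here (note that a greedy shift-reduce parser can miss parses that exist, though not on this example).

```python
import nltk

grammar = nltk.CFG.fromstring("""
S -> NP VP
NP -> Det N
VP -> V NP
Det -> 'the'
N -> 'agent' | 'flight'
V -> 'booked'
""")

# Bottom-up: shift words onto a stack, reduce them to constituents.
parser = nltk.ShiftReduceParser(grammar)
print(parser.parse_one("the agent booked the flight".split()))
```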

38 Bottom-Up Space

39 Bottom-Up Space

40 Control Of course, in both cases we left out how to keep track of the search space and how to make choices: which node to try to expand next, and which grammar rule to use to expand a node.

41 Top-Down and Bottom-Up Top-down Only searches for trees that can be answers (i.e. S’s) But also suggests trees that are not consistent with any of the words Bottom-up Only forms trees consistent with the words But suggests trees that make no sense globally

42 Problems Even with the best filtering, backtracking methods are doomed if they don’t address certain problems: ambiguity and shared subproblems.

43 Ambiguity

44 Shared Sub-Problems No matter what kind of search (top-down, bottom-up, or mixed) we choose, we don’t want to unnecessarily redo work we’ve already done.

45 Shared Sub-Problems Consider A flight from Indianapolis to Houston on TWA

46 Shared Sub-Problems Assume a top-down parse making bad initial choices on the Nominal rule. In particular… Nominal -> Nominal Noun Nominal -> Nominal PP

47 Shared Sub-Problems

48 Shared Sub-Problems

49 Shared Sub-Problems

50 Shared Sub-Problems

51 Parsing CKY Earley Both are dynamic programming solutions that run in O(n^3) time. CKY is bottom-up; Earley is top-down.
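
The slides walk through CKY in detail below; for contrast, here is a compact Earley recognizer sketch (a plain-Python toy under an invented grammar encoding, not an optimized implementation). Unlike CKY it needs no binarization and copes with left recursion.

```python
def earley_recognize(words, grammar, start="S"):
    """Minimal Earley recognizer for an epsilon-free grammar (a sketch).

    grammar maps each non-terminal to a list of right-hand sides (tuples);
    any symbol that is not a key of the grammar is treated as a terminal
    and matched directly against the input words.  A chart state is
    (lhs, rhs, dot, origin).  "GAMMA" is a dummy start symbol and is
    assumed not to occur in the grammar.
    """
    chart = [set() for _ in range(len(words) + 1)]
    chart[0].add(("GAMMA", (start,), 0, 0))
    for i in range(len(words) + 1):
        changed = True
        while changed:                      # run PREDICTOR/COMPLETER to a fixed point
            changed = False
            for lhs, rhs, dot, origin in list(chart[i]):
                if dot < len(rhs):
                    sym = rhs[dot]
                    if sym in grammar:      # PREDICTOR: expand the non-terminal
                        for prod in grammar[sym]:
                            if (sym, prod, 0, i) not in chart[i]:
                                chart[i].add((sym, prod, 0, i)); changed = True
                    elif i < len(words) and words[i] == sym:   # SCANNER
                        chart[i + 1].add((lhs, rhs, dot + 1, origin))
                else:                       # COMPLETER: advance waiting states
                    for l2, r2, d2, o2 in list(chart[origin]):
                        if d2 < len(r2) and r2[d2] == lhs:
                            if (l2, r2, d2 + 1, o2) not in chart[i]:
                                chart[i].add((l2, r2, d2 + 1, o2)); changed = True
    return ("GAMMA", (start,), 1, 0) in chart[len(words)]

# Toy grammar: note the left-recursive NP -> NP PP rule is handled directly.
G = {
    "S":   [("NP", "VP")],
    "NP":  [("Det", "N"), ("NP", "PP")],
    "PP":  [("P", "NP")],
    "VP":  [("V", "NP")],
    "Det": [("the",), ("a",)],
    "N":   [("plane",), ("flight",)],
    "V":   [("booked",)],
    "P":   [("to",)],
}
print(earley_recognize("the plane booked a flight".split(), G))   # True
```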

52 Sample Grammar

53 Dynamic Programming DP methods fill tables with partial results and thereby: do not do too much avoidable repeated work; solve exponential problems in polynomial time (sort of); efficiently store ambiguous structures with shared sub-parts.

54 CKY Parsing First we’ll limit our grammar to epsilon-free, binary rules (more later). Consider the rule A -> B C. If there is an A in the input, then there must be a B followed by a C in the input. If the A spans from i to j in the input, then there must be some k s.t. i < k < j, i.e. the B splits from the C someplace.

55 CKY So let’s build a table so that an A spanning from i to j in the input is placed in cell [i,j] in the table. So a non-terminal spanning an entire string will sit in cell [0,n]. If we build the table bottom-up, we’ll know that the parts of the A must go from i to k and from k to j.

56 CKY Meaning that for a rule like A -> B C we should look for a B in [i,k] and a C in [k,j]. In other words, if we think there might be an A spanning i,j in the input AND A -> B C is a rule in the grammar, THEN there must be a B in [i,k] and a C in [k,j] for some i < k < j.

57 CKY So to fill the table, loop over the cell [i,j] values in some systematic way. What constraint should we put on that? For each cell, loop over the appropriate k values to search for things to add.

58 CKY Table

59 CKY Algorithm
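
Putting the last few slides together, here is a minimal CKY recognizer sketch in Python; the dictionary-based grammar encoding and the toy example are invented for illustration. This version only recognizes, which is exactly the question the next slide raises.

```python
from collections import defaultdict

def cky_recognize(words, grammar, lexicon, start="S"):
    """CKY recognition for a grammar in Chomsky Normal Form.

    grammar: maps (B, C) -> set of A, for binary rules A -> B C
    lexicon: maps word -> set of A, for lexical rules A -> word
    table[(i, j)] holds every non-terminal that spans words[i:j].
    """
    n = len(words)
    table = defaultdict(set)
    # Fill columns left to right and each column bottom to top, so cells
    # [i,k] and [k,j] are always ready before we fill [i,j].
    for j in range(1, n + 1):
        table[(j - 1, j)] |= lexicon.get(words[j - 1], set())
        for i in range(j - 2, -1, -1):
            for k in range(i + 1, j):          # the split point
                for B in table[(i, k)]:
                    for C in table[(k, j)]:
                        table[(i, j)] |= grammar.get((B, C), set())
    return start in table[(0, n)]

grammar = {("NP", "VP"): {"S"}, ("Det", "N"): {"NP"}}
lexicon = {"the": {"Det"}, "plane": {"N"}, "left": {"VP", "V"}}
print(cky_recognize("the plane left".split(), grammar, lexicon))   # True
```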

60 CKY Parsing Is that really a parser? As described so far it’s only a recognizer: it tells us whether an S spans the whole input. To turn it into a parser, each table entry also needs back-pointers to the pair of entries that built it, so trees can be read back out.

61 Note We arranged the loops to fill the table a column at a time, from left to right, bottom to top. This assures us that whenever we’re filling a cell, the parts needed to fill it are already in the table (to the left and below)

62 Example

63 Other Ways to Do It? Are there any other sensible ways to fill the table that still guarantee that the cells we need are already filled?

64 Other Ways to Do It?

65 Sample Grammar

66 Problem What if your grammar isn’t binary? As in the case of the TreeBank grammar? Convert it to binary… any arbitrary CFG can be rewritten into Chomsky Normal Form automatically. What does this mean? The resulting grammar accepts (and rejects) the same set of strings as the original grammar. But the resulting derivations (trees) are different.

67 Problem More specifically, rules have to be of the form A -> B C or A -> w. That is, rules can expand either to 2 non-terminals or to a single terminal.

68 Binarization Intuition Eliminate chains of unit productions. Introduce new intermediate non-terminals into the grammar that distribute rules with length > 2 over several rules. So S -> A B C turns into S -> X C and X -> A B, where X is a symbol that doesn’t occur anywhere else in the grammar.
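
The binarization step itself is only a few lines; a sketch (it handles only the rules-longer-than-2 part, not unit-production or epsilon removal, and assumes the fresh X symbols don't clash with existing non-terminals):

```python
def binarize(rules):
    """Rewrite rules so every right-hand side has length <= 2.

    rules: list of (lhs, rhs_tuple) pairs.  Fresh intermediate symbols
    X1, X2, ... are introduced to absorb the extra right-hand-side symbols.
    """
    out, counter = [], 0
    for lhs, rhs in rules:
        while len(rhs) > 2:
            counter += 1
            new_sym = f"X{counter}"
            out.append((new_sym, rhs[:2]))       # X -> A B
            rhs = (new_sym,) + rhs[2:]           # ... leaving S -> X C
        out.append((lhs, rhs))
    return out

print(binarize([("S", ("A", "B", "C"))]))
# [('X1', ('A', 'B')), ('S', ('X1', 'C'))]
```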

69 CNF Conversion

70 CKY Algorithm

71 Example Filling column 5

72 Example

73 Example

74 Example

75 Example

76 END