Context-Free Parsing

Read J & M Chapter 10.

Basic Parsing Facts

                                Regular languages    Context-free languages
Required automaton              FSM                  PDA
Algorithm to get rid of ND?     Yes                  No
Inherent ambiguity?             No                   Yes
Parse time                      Θ(n)                 Θ(n³) (*)
Algorithm to minimize?          Yes                  No

Parsing Formal Languages vs Natural Ones

                                               Formal languages    Natural languages
Do we get to design for efficiency?            Yes                 No
Is existence of a single parse critical?       Yes                 No
Can we get what we need without ambiguity?     Yes                 Apparently not

Example Comparison

Formal language: 17 *
First, we define which interpretation we want. Second, we exploit parsing techniques so that it is not even necessary to try multiple paths to get to that answer.
    If a = 2 then if b = 3 then i := 7 else i := 9
English: I hit the boy with a bat.

Ambiguity

Lexical category ambiguity: Type this report.
Structural ambiguity:
    Attachment ambiguity:
        I hit the boy with the bat.
        I saw the Statue of Liberty flying over New York.
        A Belgian furniture shop is offering special packages for divorced men who hate shopping in a country where half of all marriages end in a divorce after five years.
    Coordination ambiguity: Would you like pancakes or bacon and eggs?
    Noun phrase bracketing: I’d like Canadian bacon and eggs.
Quantifier scope ambiguity: All the boys read a book about politics.

How Bad is the Ambiguity Problem?

Consider: Show me the meal on flight 286 from SF to Denver and NY.
Bracketings: how many are there?

    Constituents    Parses
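The slide's table paired constituent counts with parse counts. For purely binary bracketings that count is given by the Catalan numbers; a quick sketch (the binary-bracketing assumption and the function name are mine, not the slide's):

```python
from math import comb

def num_binary_parses(n):
    """Number of distinct binary bracketings of n constituents:
    the (n - 1)th Catalan number, C(k) = comb(2k, k) / (k + 1)."""
    k = n - 1
    return comb(2 * k, k) // (k + 1)

for n in range(2, 9):
    print(n, "constituents ->", num_binary_parses(n), "parses")
```

The growth is exponential in the sentence length, which is why enumerating every parse individually does not scale.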

What’s the Fix?

There isn’t an easy one. Two approaches:
    Semantic filtering
    Representing families of parses, not individual ones.

Ambiguity and Nondeterminism Imply Search

Two basic approaches:
    Top-down parsing
    Bottom-up parsing
Example: Book that flight.

A Miniature Grammar: S → NP VP | Aux NP VP | VP | S Conj S; NP → Det Nom | PropN | NP S; VP → V | V NP; plus a lexicon of PropN, V, and Det entries. (These are the rules the parser below tries, in this order.)

A Top-Down, Recursive Descent Parser

Def (S):
    S := Build(S, NP(), VP());        If success then return S;
    S := Build(S, Aux(), NP(), VP()); If success then return S;
    S := Build(S, VP());              If success then return S;
    S := Build(S, S(), Conj(), S()); If success then return S;
    Return fail

Def (NP):
    NP := Build(NP, Det(), Nom());    If success then return NP;
    NP := Build(NP, PropN());         If success then return NP;
    NP := Build(NP, NP(), S());       If success then return NP;
    Return fail

Def (VP):
    VP := Build(VP, V());             If success then return VP;
    VP := Build(VP, V(), NP());       If success then return VP;
    Return fail

Def (PropN):
    PropN := Lookup("PropN");         If success then return PropN;
    Return fail

Def (V):
    V := Lookup("V");                 If success then return V;
    Return fail

Def (Det):
    Det := Lookup("Det");             If success then return Det;
    Return fail

Example

Let’s try to parse: Lucy purred.
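A runnable Python sketch of the same top-down, backtracking strategy. The grammar and lexicon are assumptions reconstructed from the pseudocode above; the left-recursive rules (S → S Conj S, NP → NP S) are left out, since naive recursive descent loops on them (a problem a later slide takes up):

```python
# Backtracking recursive-descent parsing of the slides' toy grammar.
# Grammar and lexicon are reconstructions; left-recursive rules omitted.

GRAMMAR = {
    "S":   [["NP", "VP"], ["Aux", "NP", "VP"], ["VP"]],
    "NP":  [["Det", "Nom"], ["PropN"]],
    "Nom": [["N"]],
    "VP":  [["V", "NP"], ["V"]],
}
LEXICON = {
    "lucy": "PropN", "purred": "V",
    "book": "V", "that": "Det", "flight": "N",
}

def parse(cat, words, i):
    """Yield (tree, next_position) for every way cat can start at words[i]."""
    if cat not in GRAMMAR:                      # lexical category: Lookup()
        if i < len(words) and LEXICON.get(words[i]) == cat:
            yield (cat, words[i]), i + 1
        return
    for rhs in GRAMMAR[cat]:                    # try each rule in order
        def expand(children, j, rest):
            if not rest:                        # whole RHS matched: Build()
                yield (cat, children), j
                return
            for child, k in parse(rest[0], words, j):
                yield from expand(children + [child], k, rest[1:])
        yield from expand([], i, rhs)

def parse_sentence(sentence):
    words = sentence.lower().rstrip(".").split()
    return [tree for tree, j in parse("S", words, 0) if j == len(words)]
```

parse_sentence("Lucy purred.") finds the single parse S → NP VP, and parse_sentence("Book that flight.") finds the imperative S → VP.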

Adding Bottom-Up Filtering

Consider the sentence: Listen to me. We know that we cannot use either of the first two S rules, because listen cannot be the left corner of either an NP or an Aux. Facts such as this can be compiled from the grammar and stored in a table that lists, for each constituent, those parts of speech that can form its left corner.

    Example using S:      Left corners:
    S → NP VP             Det, PropN
    S → Aux NP VP         Aux
    S → VP                V
    S → S Conj S
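That compilation step is easy to automate: the table is the transitive closure of "can appear as the left corner of". A sketch over the toy grammar (the category names are the slides'; the code itself is my assumption):

```python
# Build the left-corner table for a CFG: for each category, which symbols
# can sit on its left edge. Computed as a transitive closure over the rules.

RULES = [
    ("S", ["NP", "VP"]), ("S", ["Aux", "NP", "VP"]),
    ("S", ["VP"]), ("S", ["S", "Conj", "S"]),
    ("NP", ["Det", "Nom"]), ("NP", ["PropN"]),
    ("VP", ["V"]), ("VP", ["V", "NP"]),
]
PARTS_OF_SPEECH = {"Det", "PropN", "Aux", "V", "Conj", "N"}

def left_corners(rules):
    lc = {}
    for lhs, rhs in rules:                 # direct left corners
        lc.setdefault(lhs, set()).add(rhs[0])
    changed = True
    while changed:                         # close under left-corner-of
        changed = False
        for a in lc:
            for b in list(lc[a]):
                for c in lc.get(b, ()):
                    if c not in lc[a]:
                        lc[a].add(c)
                        changed = True
    return lc

# Parts of speech that can begin an S: "listen" (a V) passes only S -> VP.
table = {cat: sorted(lcs & PARTS_OF_SPEECH)
         for cat, lcs in left_corners(RULES).items()}
```

For S the table comes out as Det, PropN, Aux, V, matching the slide's hand-built version.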

Problems with Top-Down Parsers

Left-recursive rules: let’s try: Cats and dogs make good pets.
Rewrite A → A α | β as:
    A  → β A′
    A′ → α A′ | ε
Ambiguity: we will only find the first parse.
Repeated parsing of subtrees: imagine we’d solved the conjunction problem. Now consider: The large furry cat with the huge tail purred and the baby smiled.
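The rewrite of an immediately left-recursive rule A → Aα | β into A → βA′, A′ → αA′ | ε can be done mechanically. A sketch (the rule representation and helper name are my own):

```python
def remove_left_recursion(a, alternatives):
    """Rewrite  A -> A a1 | ... | A am | b1 | ... | bn  as
       A  -> b1 A' | ... | bn A'
       A' -> a1 A' | ... | am A' | epsilon   (epsilon shown as [])."""
    recursive = [alt[1:] for alt in alternatives if alt and alt[0] == a]
    others = [alt for alt in alternatives if not alt or alt[0] != a]
    a_new = a + "'"
    return {
        a: [alt + [a_new] for alt in others],
        a_new: [alt + [a_new] for alt in recursive] + [[]],
    }
```

For NP → NP Conj NP | Det Nom this yields NP → Det Nom NP′ and NP′ → Conj NP NP′ | ε, which a top-down parser can handle.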

The Earley Algorithm

Goal: find one or more edges from position 0 to position 3 labeled S (spanning the three-word input Book that flight).
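A compact Earley recognizer sketch in Python. The grammar and lexicon are my assumptions (a cut-down version of the earlier toy grammar), and the chart's completed S states spanning 0 to 3 play the role of the slide's "edges labeled S":

```python
# Earley recognizer: a state is (lhs, rhs, dot, start); chart[i] holds
# states whose dot position corresponds to input position i.

GRAMMAR = {
    "S": [["NP", "VP"], ["VP"]],
    "NP": [["Det", "N"], ["PropN"]],
    "VP": [["V"], ["V", "NP"]],
}
LEXICON = {"book": {"V", "N"}, "that": {"Det"}, "flight": {"N"}}

def earley(words):
    chart = [set() for _ in range(len(words) + 1)]
    for rhs in GRAMMAR["S"]:
        chart[0].add(("S", tuple(rhs), 0, 0))
    for i in range(len(words) + 1):
        agenda = list(chart[i])
        while agenda:
            lhs, rhs, dot, start = agenda.pop()
            if dot == len(rhs):                           # COMPLETER
                for l2, r2, d2, s2 in list(chart[start]):
                    if d2 < len(r2) and r2[d2] == lhs:
                        new = (l2, r2, d2 + 1, s2)
                        if new not in chart[i]:
                            chart[i].add(new)
                            agenda.append(new)
            elif rhs[dot] in GRAMMAR:                     # PREDICTOR
                for r in GRAMMAR[rhs[dot]]:
                    new = (rhs[dot], tuple(r), 0, i)
                    if new not in chart[i]:
                        chart[i].add(new)
                        agenda.append(new)
            elif i < len(words) and rhs[dot] in LEXICON.get(words[i], set()):
                chart[i + 1].add((lhs, rhs, dot + 1, start))   # SCANNER
    n = len(words)
    return any(l == "S" and d == len(r) and s == 0
               for l, r, d, s in chart[n])
```

Because completed constituents are cached in the chart, no subtree is ever rebuilt, which is exactly what the naive top-down parser fails to guarantee.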

Could We Do it with an FSM?

Not the way we’ve been doing it, because of the recursive rules. But the FSM idea may be useful if we do one of the following:
    Just parse fragments of sentences.
    Augment the FSM formalism to allow more powerful operations.

Parsing Fragments

Example: noun phrases. Maybe we don’t need to handle:
    Lawyers whose clients committed fraud
    Lawyers who committed fraud
    Clients whose lawyers committed fraud
But we do care about:
    The huge and friendly cat
Then we could build an FSM like:
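Simple NP shapes like the huge and friendly cat are regular over part-of-speech tags, so a regular expression can stand in for the slide's FSM. The tag pattern below is an assumption, not the slide's diagram:

```python
import re

# Recognize NPs of the shape: Det (Adj (Conj Adj)*)? N over POS tags.
# This regular pattern covers "the huge and friendly cat" but, by design,
# not the recursive relative-clause NPs the slide sets aside.
NP_PATTERN = re.compile(r"Det( Adj( Conj Adj)*)? N")

def is_simple_np(tags):
    return NP_PATTERN.fullmatch(" ".join(tags)) is not None
```

Chunkers built this way run in linear time, which is the payoff for giving up recursion.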

Augmenting the FSM - Recursion

A Recursive Transition Network has the same power as a PDA or a context-free parser. Here’s a fragment of an RTN that recognizes sentences of the form S Conj S.

Augmenting the FSM – Allow Actions

Who did she say she saw ____ coming down the hill?
    OBJ ← QWORD
Augmented Transition Networks (ATNs)

Demos

A parser with a nice graphical interface:
A parser with machine readable output: bin/engcg?snt=Who+did+you+say+you+saw%3F&h=on
A shallow parser:
Whatever you pick, it needs to connect with the modules that come before and after it.