Presentation transcript:

NLP

Introduction to NLP

Motivation
–A lot of the work is repeated
–Caching intermediate results improves the complexity
Dynamic programming
–Building a parse for a substring [i,j] based on all parses [i,k] and [k,j] that are included in it
Complexity
–O(n³) for recognizing an input string of length n
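Written as a recurrence (our notation, not taken from the slides), the dynamic program computes, for each span, the set of non-terminals that can derive it:

\mathrm{chart}[i,j] = \{\, A \mid (A \to B\,C) \in G,\ \exists\, k,\ i < k < j : B \in \mathrm{chart}[i,k] \wedge C \in \mathrm{chart}[k,j] \,\}

There are O(n²) spans and O(n) split points k per span, which is where the O(n³) bound comes from.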

CKY (Cocke-Kasami-Younger)
–bottom-up
–requires a normalized (binarized) grammar
Earley parser
–top-down
–more complicated
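A minimal CKY recognizer sketch in Python (our own illustration; the function name and rule encoding are assumptions, not from the slides). It expects the grammar already in the binary form CKY requires:

from collections import defaultdict

def cky_recognize(words, binary_rules, lexicon):
    # chart[(i, j)] = set of non-terminals deriving words[i:j]
    n = len(words)
    chart = defaultdict(set)
    # width-1 spans: lexical rules A -> w
    for i, w in enumerate(words):
        chart[(i, i + 1)] = {A for A, word in lexicon if word == w}
    # wider spans: combine two adjacent sub-spans via A -> B C
    for width in range(2, n + 1):
        for i in range(n - width + 1):
            j = i + width
            for k in range(i + 1, j):
                for A, B, C in binary_rules:
                    if B in chart[(i, k)] and C in chart[(k, j)]:
                        chart[(i, j)].add(A)
    return 'S' in chart[(0, n)]

With the grammar from the next slide encoded as tuples:

binary_rules = [('S', 'NP', 'VP'), ('NP', 'DT', 'N'), ('NP', 'NP', 'PP'),
                ('PP', 'PRP', 'NP'), ('VP', 'V', 'NP'), ('VP', 'VP', 'PP')]
lexicon = [('DT', 'the'), ('DT', 'a'), ('N', 'child'), ('N', 'cake'),
           ('N', 'fork'), ('PRP', 'with'), ('PRP', 'to'),
           ('V', 'saw'), ('V', 'ate')]
print(cky_recognize("the child ate the cake with the fork".split(),
                    binary_rules, lexicon))   # True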

["the", "child", "ate", "the", "cake", "with", "the", "fork"] S -> NP VP NP -> DT N | NP PP PP -> PRP NP VP -> V NP | VP PP DT -> 'a' | 'the' N -> 'child' | 'cake' | 'fork' PRP -> 'with' | 'to' V -> 'saw' | 'ate'

[The original slides fill the CKY chart step by step over "the child ate the cake with the fork": first the lexical edges (DT, N, V, PRP), then NP, VP, PP constituents over wider spans, and finally two S edges spanning the whole sentence; only the final edge list is recoverable from the transcript.]

The completed edges:

[0] DT [1] N [2] ==> [0] NP [2]
[3] DT [4] N [5] ==> [3] NP [5]
[6] DT [7] N [8] ==> [6] NP [8]
[2] V [3] NP [5] ==> [2] VP [5]
[5] PRP [6] NP [8] ==> [5] PP [8]
[0] NP [2] VP [5] ==> [0] S [5]
[3] NP [5] PP [8] ==> [3] NP [8]
[2] V [3] NP [8] ==> [2] VP [8]
[2] VP [5] PP [8] ==> [2] VP [8]
[0] NP [2] VP [8] ==> [0] S [8]
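Each line above is a completed chart edge. To produce trees rather than just accept the string, each cell can also store backpointers; a hedged extension of the recognizer sketch above (names are ours):

def cky_with_backpointers(words, binary_rules, lexicon):
    # chart[(i, j)] maps a non-terminal to the ways it was built:
    # a word for lexical edges, or a (split, B, C) triple for binary edges
    n = len(words)
    chart = {(i, j): {} for i in range(n) for j in range(i + 1, n + 1)}
    for i, w in enumerate(words):
        for A, word in lexicon:
            if word == w:
                chart[(i, i + 1)].setdefault(A, []).append(w)
    for width in range(2, n + 1):
        for i in range(n - width + 1):
            j = i + width
            for k in range(i + 1, j):
                for A, B, C in binary_rules:
                    if B in chart[(i, k)] and C in chart[(k, j)]:
                        chart[(i, j)].setdefault(A, []).append((k, B, C))
    return chart

def trees(chart, i, j, A):
    # enumerate every parse of non-terminal A over span [i, j]
    for bp in chart[(i, j)].get(A, []):
        if isinstance(bp, str):          # lexical edge
            yield (A, bp)
        else:                            # binary edge
            k, B, C = bp
            for left in trees(chart, i, k, B):
                for right in trees(chart, k, j, C):
                    yield (A, left, right)

On this example, trees(chart, 0, 8, 'S') yields exactly the two parses shown next.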

What is the meaning of each of these parses?

(S (NP (DT the) (N child))
   (VP (VP (V ate) (NP (DT the) (N cake)))
       (PP (PRP with) (NP (DT the) (N fork)))))

VP attachment (the fork is the instrument of eating):

(S (NP (DT the) (N child))
   (VP (VP (V ate) (NP (DT the) (N cake)))
       (PP (PRP with) (NP (DT the) (N fork)))))

NP attachment (the cake comes with a fork):

(S (NP (DT the) (N child))
   (VP (V ate)
       (NP (NP (DT the) (N cake))
           (PP (PRP with) (NP (DT the) (N fork))))))
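For readability, both bracketings can be drawn as trees with NLTK (a sketch; the bracketed strings are copied verbatim from above):

from nltk import Tree

vp_attach = Tree.fromstring(
    "(S (NP (DT the) (N child)) (VP (VP (V ate) (NP (DT the) (N cake)))"
    " (PP (PRP with) (NP (DT the) (N fork)))))")
np_attach = Tree.fromstring(
    "(S (NP (DT the) (N child)) (VP (V ate) (NP (NP (DT the) (N cake))"
    " (PP (PRP with) (NP (DT the) (N fork))))))")
vp_attach.pretty_print()   # eating is done with the fork
np_attach.pretty_print()   # the cake has a fork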

There are O(n²) cells in the table
Single parse
–Each cell requires a linear lookup over the possible split points
–Total time complexity is O(n³)
All parses
–Total time complexity is exponential
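The exponential case arises because the number of binary bracketings of a string grows with the Catalan numbers; a quick illustration of our own (not from the slides):

from math import comb

def catalan(n):
    # number of distinct binary trees with n internal nodes
    return comb(2 * n, n) // (n + 1)

for n in (5, 10, 20):
    print(n, catalan(n))   # 42, 16796, 6564120420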

["take", "this", "book"] S -> NP VP | Aux NP VP | VP NP -> PRON | Det Nom Nom -> N | Nom N | Nom PP PP -> PRP NP VP -> V | V NP | VP PP Det -> 'the' | 'a' | 'this' PRON -> 'he' | 'she' N -> 'book' | 'boys' | 'girl' PRP -> 'with' | 'in' V -> 'takes' | 'take'

["take", "this", "book"] S -> NP VP | Aux NP VP | VP NP -> PRON | Det Nom Nom -> N | Nom N | Nom PP PP -> PRP NP VP -> V | V NP | VP PP Det -> 'the' | 'a' | 'this' PRON -> 'he' | 'she' N -> 'book' | 'boys' | 'girl' PRP -> 'with' | 'in' V -> 'takes' | 'take'

All rules have to be in binary form:
–X → Y Z or X → w
This introduces new non-terminals for
–hybrid rules
–n-ary rules
–unary rules

Original version (from Jurafsky and Martin):

S → NP VP
S → Aux NP VP
S → VP
NP → Pronoun
NP → Proper-Noun
NP → Det Nominal
Nominal → Noun
Nominal → Nominal Noun
Nominal → Nominal PP
VP → Verb
VP → Verb NP
VP → VP PP
PP → Prep NP

CNF version:

S → NP VP
S → X1 VP
X1 → Aux NP
S → book | include | prefer
S → Verb NP
S → VP PP
NP → I | he | she | me
NP → Houston | NWA
NP → Det Nominal
Nominal → book | flight | meal | money
Nominal → Nominal Noun
Nominal → Nominal PP
VP → book | include | prefer
VP → Verb NP
VP → VP PP
PP → Prep NP

All rules have to be in binary form:
–X → Y Z or X → w
New non-terminals for hybrid rules, n-ary and unary rules:
–INF-VP → to VP becomes INF-VP → TO VP and TO → to
–S → Aux NP VP becomes S → R1 VP and R1 → Aux NP
–S → VP with VP → Verb, VP → Verb NP, VP → Verb PP becomes S → book, S → buy, S → R2 PP, S → Verb PP
–etc.
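A sketch of the n-ary step in Python (our own illustration; the R-numbered fresh symbols mirror the slide, and hybrid and unary rules are not handled here):

def binarize(rules):
    # split each rule A -> X1 X2 ... Xn (n > 2) into binary rules
    # by introducing fresh non-terminals, e.g.
    # S -> Aux NP VP  becomes  R1 -> Aux NP and S -> R1 VP
    out, counter = [], 0
    for lhs, rhs in rules:
        while len(rhs) > 2:
            counter += 1
            fresh = "R%d" % counter
            out.append((fresh, rhs[:2]))
            rhs = [fresh] + rhs[2:]
        out.append((lhs, rhs))
    return out

print(binarize([("S", ["Aux", "NP", "VP"])]))
# [('R1', ['Aux', 'NP']), ('S', ['R1', 'VP'])]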

Weak equivalence only
–Same language, different structure
–If the grammar had to be converted to CNF, the final parse tree doesn't match the original grammar
–However, it can be converted back using a specific procedure
Syntactic ambiguity
–(Deterministic) CKY has no way to perform syntactic disambiguation
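The "converted back" step can be seen concretely on trees with NLTK's transforms (a sketch assuming NLTK's default settings, under which the round trip is lossless; note these methods transform trees, not grammars):

from nltk import Tree

t = Tree.fromstring("(VP (V ate) (NP (N cake)) (PP (PRP with) (NP (N fork))))")
cnf = t.copy(deep=True)
cnf.chomsky_normal_form()      # binarizes, introducing nodes such as VP|<NP-PP>
print(cnf)
cnf.un_chomsky_normal_form()   # removes the introduced nodes
assert cnf == t                # original n-ary structure recovered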

NLP