NLP

Introduction to NLP

Background
– Developed by Jay Earley in 1970
– No need to convert the grammar to CNF
– Left to right
Complexity
– Faster than O(n³) in many cases

Looks for both full and partial constituents
Example:
– S → Aux . NP VP
When reading word k, it has already identified all hypotheses that are consistent with words 1 to k-1
Example:
– If the parser matches NP in the example above
– S → Aux NP . VP

It uses a dynamic programming table, just like CKY
Example entry in column 1
– [0:1] VP -> VP . PP
– Created when processing word 1
– Corresponds to words 0 to 1 (these words correspond to the VP part of the RHS of the rule)
– The dot separates the completed (known) part from the incomplete (and possibly unattainable) part

Three types of entries
– ‘scan’ – for matching words of the input
– ‘predict’ – for expanding non-terminals
– ‘complete’ – for constituents whose right-hand side has been fully matched
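The three operations can be sketched in a minimal Earley recognizer. This is an illustrative sketch in plain Python, not NLTK's implementation; the grammar is a cut-down version of the one on the next slide (Nom collapsed into N), and the state layout (lhs, rhs, dot, origin) is one common choice, not the only one.

```python
# Minimal Earley recognizer illustrating predict / scan / complete.
# A state is (lhs, rhs, dot, origin); chart[i] holds all states that
# are consistent with the first i words of the input.
GRAMMAR = {
    "S":   [["VP"]],
    "VP":  [["V", "NP"], ["V"]],
    "NP":  [["Det", "N"]],
    "Det": [["this"]],
    "N":   [["book"]],
    "V":   [["take"]],
}

def earley(words, grammar, start="S"):
    n = len(words)
    chart = [[] for _ in range(n + 1)]

    def add(i, state):
        if state not in chart[i]:
            chart[i].append(state)

    for rhs in grammar[start]:
        add(0, (start, tuple(rhs), 0, 0))

    for i in range(n + 1):
        j = 0
        while j < len(chart[i]):              # chart[i] grows as we process it
            lhs, rhs, dot, origin = chart[i][j]
            j += 1
            if dot < len(rhs):
                sym = rhs[dot]
                if sym in grammar:            # PREDICT: dot before a non-terminal
                    for prod in grammar[sym]:
                        add(i, (sym, tuple(prod), 0, i))
                elif i < n and words[i] == sym:   # SCAN: dot before the next word
                    add(i + 1, (lhs, rhs, dot + 1, origin))
            else:                             # COMPLETE: constituent finished;
                for l2, r2, d2, o2 in chart[origin]:  # advance states waiting on it
                    if d2 < len(r2) and r2[d2] == lhs:
                        add(i, (l2, r2, d2 + 1, o2))

    return any((start, tuple(rhs), len(rhs), 0) in chart[n]
               for rhs in grammar[start])

print(earley("take this book".split(), GRAMMAR))   # True
print(earley("this take book".split(), GRAMMAR))   # False
```

Note that the grammar is used as-is, with left-recursive rules allowed and no CNF conversion, which is exactly the property claimed for the algorithm above.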

S -> NP VP
S -> Aux NP VP
S -> VP
NP -> PRON
NP -> Det Nom
Nom -> N
Nom -> Nom N
Nom -> Nom PP
PP -> PRP NP
VP -> V
VP -> V NP
VP -> VP PP
Det -> 'the'
Det -> 'a'
Det -> 'this'
PRON -> 'he'
PRON -> 'she'
N -> 'book'
N -> 'boys'
N -> 'girl'
PRP -> 'with'
PRP -> 'in'
V -> 'takes'
V -> 'take'
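The grammar above can be fed to NLTK directly. A sketch of how the chart traces on the following slides were presumably produced (assuming NLTK is installed; `trace=2` prints the chart rows as the parser builds them):

```python
# Parsing "take this book" with NLTK's Earley chart parser,
# using the rule set listed on this slide.
import nltk

grammar = nltk.CFG.fromstring("""
    S -> NP VP | Aux NP VP | VP
    NP -> PRON | Det Nom
    Nom -> N | Nom N | Nom PP
    PP -> PRP NP
    VP -> V | V NP | VP PP
    Det -> 'the' | 'a' | 'this'
    PRON -> 'he' | 'she'
    N -> 'book' | 'boys' | 'girl'
    PRP -> 'with' | 'in'
    V -> 'takes' | 'take'
""")

parser = nltk.EarleyChartParser(grammar, trace=2)   # trace=2 prints the chart
trees = list(parser.parse("take this book".split()))
for tree in trees:
    print(tree)
# (S (VP (V take) (NP (Det this) (Nom (N book)))))
```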

|. take . this . book .|
|[ ]..| [0:1] 'take'
|. [ ].| [1:2] 'this'
|.. [ ]| [2:3] 'book'
Example created using NLTK

|. take . this . book .|
|[ ]..| [0:1] 'take'
|. [ ].| [1:2] 'this'
|.. [ ]| [2:3] 'book'
|>...| [0:0] S -> * NP VP
|>...| [0:0] S -> * Aux NP VP
|>...| [0:0] S -> * VP
|>...| [0:0] VP -> * V
|>...| [0:0] VP -> * V NP
|>...| [0:0] VP -> * VP PP
|>...| [0:0] V -> * 'take'
|>...| [0:0] NP -> * PRON
|>...| [0:0] NP -> * Det Nom

|. take . this . book .|
|[ ]..| [0:1] 'take'
|. [ ].| [1:2] 'this'
|.. [ ]| [2:3] 'book'
|>...| [0:0] S -> * NP VP
|>...| [0:0] S -> * Aux NP VP
|>...| [0:0] S -> * VP
|>...| [0:0] VP -> * V
|>...| [0:0] VP -> * V NP
|>...| [0:0] VP -> * VP PP
|>...| [0:0] V -> * 'take'
|>...| [0:0] NP -> * PRON
|>...| [0:0] NP -> * Det Nom
|[ ]..| [0:1] V -> 'take' *
|[ ]..| [0:1] VP -> V *
|[ >..| [0:1] VP -> V * NP
|. >..| [1:1] NP -> * PRON
|. >..| [1:1] NP -> * Det Nom

|. take . this . book .|
|[ ]..| [0:1] 'take'
|. [ ].| [1:2] 'this'
|.. [ ]| [2:3] 'book'
|>...| [0:0] S -> * NP VP
|>...| [0:0] S -> * Aux NP VP
|>...| [0:0] S -> * VP
|>...| [0:0] VP -> * V
|>...| [0:0] VP -> * V NP
|>...| [0:0] VP -> * VP PP
|>...| [0:0] V -> * 'take'
|>...| [0:0] NP -> * PRON
|>...| [0:0] NP -> * Det Nom
|[ ]..| [0:1] V -> 'take' *
|[ ]..| [0:1] VP -> V *
|[ >..| [0:1] VP -> V * NP
|. >..| [1:1] NP -> * PRON
|. >..| [1:1] NP -> * Det Nom
|. >..| [1:1] Det -> * 'this'
|[ ]..| [0:1] S -> VP *
|[ >..| [0:1] VP -> VP * PP
|. >..| [1:1] PP -> * PRP NP
|. [ ].| [1:2] Det -> 'this' *
|. [ >.| [1:2] NP -> Det * Nom
|.. >.| [2:2] Nom -> * N
|.. >.| [2:2] Nom -> * Nom N
|.. >.| [2:2] Nom -> * Nom PP
|.. >.| [2:2] N -> * 'book'
|.. [ ]| [2:3] N -> 'book' *
|.. [ ]| [2:3] Nom -> N *
|. [ ]| [1:3] NP -> Det Nom *
|.. [ >| [2:3] Nom -> Nom * N
|.. [ >| [2:3] Nom -> Nom * PP
|... >| [3:3] PP -> * PRP NP
|[===================================]| [0:3] VP -> V NP *
|[===================================]| [0:3] S -> VP *
|[ >| [0:3] VP -> VP * PP
(S (VP (V take) (NP (Det this) (Nom (N book)))))

NLP

Introduction to NLP

Number
– Chen is / people are
Person
– I am / Chen is
Tense
– Chen was reading / Chen is reading / Chen will be reading
Case
– not in English but in many other languages such as German, Russian, Greek
Gender
– not in English but in many other languages such as German, French, Spanish

Many combinations of rules are needed to express agreement
– S → NP VP
– S → 1sgNP 1sgVP
– S → 2sgNP 2sgVP
– S → 3sgNP 3sgVP
– …
– 1sgNP → 1sgN
– ...
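The blow-up is easy to quantify: crossing just person (three values) and number (two values) into the NP and VP categories turns the single S rule into six copies. A toy illustration, with hypothetical feature names, also sketching the feature-based alternative that keeps one rule plus an agreement check:

```python
# Enumerating agreement-specialized copies of S -> NP VP.
persons = ["1", "2", "3"]
numbers = ["sg", "pl"]

expanded = [f"S -> {p}{n}NP {p}{n}VP" for p in persons for n in numbers]
print(len(expanded))          # 6 specialized rules from one base rule

# A feature-based grammar instead keeps a single S -> NP VP rule
# and checks that the feature values of NP and VP unify:
def s_rule_applies(np_feats, vp_feats):
    return (np_feats["person"] == vp_feats["person"]
            and np_feats["number"] == vp_feats["number"])

print(s_rule_applies({"person": "3", "number": "sg"},
                     {"person": "3", "number": "sg"}))   # True
```

Adding more agreement dimensions (gender, case, tense) multiplies the expanded rule count further, which is why feature structures are preferred in practice.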

Direct object
– The dog ate a sausage
Prepositional phrase
– Mary left the car in the garage
Predicative adjective
– The receptionist looked worried
Bare infinitive
– She helped me buy this place
To-infinitive
– The girl wanted to be alone
Participial phrase
– He stayed crying after the movie ended
That-clause
– Ravi doesn’t believe that it will rain tomorrow
Question-form clauses
– She wondered where to go

Non-independence
– All NPs: 11% NP PP, 9% DT NN, 6% PRP
– NPs under S: 9% NP PP, 9% DT NN, 21% PRP
– NPs under VP: 23% NP PP, 7% DT NN, 4% PRP
– (example from Dan Klein)
Lexicalized grammars
– later

Syntax helps understand the meaning of a sentence
– Bob gave Alice a flower
– Who gave a flower to Alice?
– What did Bob give to Alice?
Context-free grammars are an appropriate representation for syntactic information
Dynamic programming is needed for efficient parsing
– Cubic time to find one parse
– Still exponential time to find all parses
– Why?

Why does it still take exponential time to find all parses?
– Very simple – because the number of parses can be exponential
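The growth rate follows the Catalan numbers: under a maximally ambiguous grammar (e.g. X -> X X plus lexical rules), each binary bracketing of the sentence is a distinct parse, and a sentence of n words has C(n-1) binary bracketings. A quick check of how fast that grows:

```python
# Catalan numbers: C(k) = comb(2k, k) // (k + 1).
# A sentence of n words has C(n - 1) binary bracketings,
# which grows exponentially in n.
from math import comb

def catalan(k):
    return comb(2 * k, k) // (k + 1)

for n_words in (2, 5, 10, 20):
    print(n_words, "words:", catalan(n_words - 1), "binary bracketings")
```

Already at 20 words there are over 1.7 billion bracketings, so enumerating every parse is hopeless even though the chart itself is built in cubic time; the chart represents all parses compactly as a packed forest.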

NLP