NLP. Introduction to NLP


NLP

Introduction to NLP

#include <stdio.h>

int main() {
    int n, reverse = 0;
    printf("Enter a number to reverse\n");
    scanf("%d", &n);
    while (n != 0) {
        reverse = reverse * 10;
        reverse = reverse + n % 10;
        n = n / 10;
    }
    printf("Reverse of entered number is = %d\n", reverse);
    return 0;
}

Rather different from computer languages
–Can you think of ways in which they differ?

Rather different from computer languages
–No types for words
–No brackets around phrases
–Ambiguity
  –Words
  –Parses
–Implied information

Parsing means associating tree structures with a sentence, given a grammar (often a CFG)
–There may be exactly one such tree structure
–There may be many such structures
–There may be none
Grammars (e.g., CFGs) are declarative
–They don’t specify how the parse tree will be constructed

PP attachment
–I saw the man with the telescope
Gaps
–Mary likes Physics but hates Chemistry
Coordination scope
–Small boys and girls are playing
Particles vs. prepositions
–She ran up a large bill
Gerund vs. adjective
–Frightening kids can cause trouble

Grammar checking
–I want to return this shoes.
Question answering
–How many people in sales make $40K or more per year?
Machine translation
–E.g., word order: SVO vs. SOV
Information extraction
–Breaking Bad takes place in New Mexico.
Speech generation
Speech understanding
Interpretation

NLP

Introduction to NLP

A context-free grammar is a 4-tuple (N, Σ, R, S)
–N: non-terminal symbols
–Σ: terminal symbols (disjoint from N)
–R: rules (A → β), where β is a string from (Σ ∪ N)*
–S: start symbol from N
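The 4-tuple above can be written down directly as data; a minimal sketch in Python (the variable names N, Sigma, R, S mirror the tuple, and the rules are the toy grammar used on the following slides):

```python
# A CFG as a 4-tuple (N, Sigma, R, S), following the definition above.
N = {"S", "NP", "VP", "PP", "DT", "N", "PRP", "V"}
Sigma = {"a", "the", "child", "cake", "fork", "with", "to", "saw", "ate"}
R = [
    ("S", ("NP", "VP")),
    ("NP", ("DT", "N")), ("NP", ("NP", "PP")),
    ("PP", ("PRP", "NP")),
    ("VP", ("V", "NP")), ("VP", ("VP", "PP")),
    ("DT", ("a",)), ("DT", ("the",)),
    ("N", ("child",)), ("N", ("cake",)), ("N", ("fork",)),
    ("PRP", ("with",)), ("PRP", ("to",)),
    ("V", ("saw",)), ("V", ("ate",)),
]
S = "S"

# Sanity checks matching the definition: N and Sigma are disjoint,
# every left-hand side is a non-terminal, and S is in N.
assert N.isdisjoint(Sigma)
assert all(lhs in N for lhs, _ in R)
assert S in N
```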

["the", "child", "ate", "the", "cake", "with", "the", "fork"]

S -> NP VP
NP -> DT N | NP PP
PP -> PRP NP
VP -> V NP | VP PP
DT -> 'a' | 'the'
N -> 'child' | 'cake' | 'fork'
PRP -> 'with' | 'to'
V -> 'saw' | 'ate'

["the", "child", "ate", "the", "cake", "with", "the", "fork"]

S -> NP *VP*
NP -> DT *N* | *NP* PP
PP -> *PRP* NP
VP -> *V* NP | *VP* PP
DT -> 'a' | 'the'
N -> 'child' | 'cake' | 'fork'
PRP -> 'with' | 'to'
V -> 'saw' | 'ate'

Heads marked with asterisks

Sentences are not just bags of words
–Alice bought Bob flowers
–Bob bought Alice flowers
Context-free view of language
–A prepositional phrase looks the same whether it is part of the subject NP or part of the VP
Constituent order
–SVO (subject verb object)
–SOV (subject object verb)

Auxiliary verbs
–The dog may have eaten my homework
Imperative sentences
–Leave the book on the table
Interrogative sentences
–Did the customer have a complaint?
Negative sentences
–The customer didn’t have a complaint

S -> NP VP | Aux NP VP | VP
NP -> PRON | Det Nom
Nom -> N | Nom N | Nom PP
PP -> PRP NP
VP -> V | V NP | VP PP
Det -> 'the' | 'a' | 'this'
PRON -> 'he' | 'she'
N -> 'book' | 'boys' | 'girl'
PRP -> 'with' | 'in'
V -> 'takes' | 'take'

What changes were made to the grammar?


( (S
    (NP-SBJ
      (NP (NNP Pierre) (NNP Vinken))
      (, ,)
      (ADJP (NP (CD 61) (NNS years)) (JJ old))
      (, ,))
    (VP (MD will)
      (VP (VB join)
        (NP (DT the) (NN board))
        (PP-CLR (IN as) (NP (DT a) (JJ nonexecutive) (NN director)))
        (NP-TMP (NNP Nov.) (CD 29))))
    (. .)))

( (S
    (NP-SBJ (NNP Mr.) (NNP Vinken))
    (VP (VBZ is)
      (NP-PRD (NP (NN chairman))
        (PP (IN of)
          (NP (NP (NNP Elsevier) (NNP N.V.))
              (, ,)
              (NP (DT the) (NNP Dutch) (VBG publishing) (NN group))))))
    (. .)))
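Bracketed Treebank strings like these can be loaded with a small s-expression reader; a sketch in Python, not the official Treebank tooling (read_tree and leaves are illustrative helper names, not a standard API):

```python
import re

def read_tree(s):
    """Parse a Penn-Treebank-style bracketed string into nested lists:
    '(S (NP (DT the)) ...)' -> ['S', ['NP', ['DT', 'the']], ...]."""
    tokens = re.findall(r"\(|\)|[^\s()]+", s)
    pos = 0
    def parse():
        nonlocal pos
        if tokens[pos] == "(":
            pos += 1                      # consume '('
            node = []
            while tokens[pos] != ")":
                node.append(parse())
            pos += 1                      # consume ')'
            return node
        tok = tokens[pos]                 # a label or a word
        pos += 1
        return tok
    return parse()

def leaves(t):
    """Collect the terminal words at the fringe of a tree."""
    if isinstance(t, str):
        return [t]
    return [w for child in t[1:] for w in leaves(child)]

tree = read_tree("(S (NP (DT the) (N child)) (VP (V ate) (NP (DT the) (N cake))))")
print(tree[0])        # 'S'
print(leaves(tree))   # ['the', 'child', 'ate', 'the', 'cake']
```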

A leftmost derivation is a sequence of strings s1, s2, ..., sn
–s1 = S, the start symbol
–sn includes only terminal symbols
–at each step, the leftmost non-terminal is rewritten
Example:
–[S]
–[NP VP]
–[DT N VP]
–...
–[the child ate the cake with the fork]
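The derivation can be replayed mechanically: at every step, rewrite the leftmost non-terminal with the next rule. A sketch in Python, with the rule sequence hand-picked to reproduce the example derivation:

```python
# Replay a leftmost derivation step by step using the toy grammar from
# these slides. Each step rewrites the leftmost non-terminal.
NONTERMINALS = {"S", "NP", "VP", "PP", "DT", "N", "PRP", "V"}

def derive_leftmost(start, rules):
    form = [start]
    steps = [form[:]]
    for lhs, rhs in rules:
        i = next(i for i, sym in enumerate(form) if sym in NONTERMINALS)
        assert form[i] == lhs, f"rule {lhs} does not match leftmost symbol {form[i]}"
        form = form[:i] + list(rhs) + form[i + 1:]
        steps.append(form[:])
    return steps

# One hand-picked leftmost derivation of the example sentence.
rules = [
    ("S", ["NP", "VP"]),
    ("NP", ["DT", "N"]), ("DT", ["the"]), ("N", ["child"]),
    ("VP", ["VP", "PP"]),
    ("VP", ["V", "NP"]), ("V", ["ate"]),
    ("NP", ["DT", "N"]), ("DT", ["the"]), ("N", ["cake"]),
    ("PP", ["PRP", "NP"]), ("PRP", ["with"]),
    ("NP", ["DT", "N"]), ("DT", ["the"]), ("N", ["fork"]),
]
for step in derive_leftmost("S", rules):
    print(" ".join(step))
# final sentential form: the child ate the cake with the fork
```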

(S (NP (DT the) (N child))
   (VP (VP (V ate) (NP (DT the) (N cake)))
       (PP (PRP with) (NP (DT the) (N fork)))))

NLP

Introduction to NLP

S -> NP VP
NP -> DT N | NP PP
PP -> PRP NP
VP -> V NP | VP PP
DT -> 'a' | 'the'
N -> 'child' | 'cake' | 'fork'
PRP -> 'with' | 'to'
V -> 'saw' | 'ate'

There are two types of constraints on the parses
–From the input sentence
–From the grammar
Therefore, two general approaches to parsing
–Top-down
–Bottom-up

Top-down parsing

S -> NP VP
NP -> DT N | NP PP
PP -> PRP NP
VP -> V NP | VP PP
DT -> 'a' | 'the'
N -> 'child' | 'cake' | 'fork'
PRP -> 'with' | 'to'
V -> 'saw' | 'ate'

–Start with S
–Expand S -> NP VP
–Expand NP, trying NP -> NP PP, backtracking to NP -> DT N when the expansion fails
–Match DT against 'the' and N against 'child'
–Continue expanding and matching left to right until the whole tree is built:

(S (NP (DT the) (N child))
   (VP (VP (V ate) (NP (DT the) (N cake)))
       (PP (PRP with) (NP (DT the) (N fork)))))
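The top-down strategy can be sketched as a recursive-descent parser with backtracking. One assumption to flag: the left-recursive rules NP -> NP PP and VP -> VP PP would make naive recursive descent loop forever, so this sketch flattens PP attachment into a VP -> V NP PP alternative, which changes the trees slightly:

```python
# Naive top-down (recursive-descent) parsing with backtracking.
# NOTE: a sketch only. Left recursion (NP -> NP PP, VP -> VP PP) is
# replaced by a flattened VP -> V NP PP rule to avoid infinite descent.
GRAMMAR = {
    "S":   [["NP", "VP"]],
    "NP":  [["DT", "N"]],
    "VP":  [["V", "NP", "PP"], ["V", "NP"]],
    "PP":  [["PRP", "NP"]],
    "DT":  [["a"], ["the"]],
    "N":   [["child"], ["cake"], ["fork"]],
    "PRP": [["with"], ["to"]],
    "V":   [["saw"], ["ate"]],
}

def parse(symbol, words, i):
    """Yield (tree, next_index) for every way symbol derives a prefix of words[i:]."""
    for rhs in GRAMMAR[symbol]:
        if len(rhs) == 1 and rhs[0] not in GRAMMAR:   # lexical rule
            if i < len(words) and words[i] == rhs[0]:
                yield (symbol, rhs[0]), i + 1
            continue
        def expand(syms, j, kids):
            # Expand the remaining RHS symbols left to right, depth-first.
            if not syms:
                yield (symbol, *kids), j
                return
            for kid, k in parse(syms[0], words, j):
                yield from expand(syms[1:], k, kids + [kid])
        yield from expand(rhs, i, [])

words = "the child ate the cake with the fork".split()
for tree, end in parse("S", words, 0):
    if end == len(words):           # a parse must cover the whole sentence
        print(tree)
```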

Bottom-up parsing

–Start from the words of the sentence
–Reduce 'the' to DT, 'child' to N, then DT N to NP, and so on
–Build larger and larger constituents until an S covering the whole sentence is found:

(S (NP (DT the) (N child))
   (VP (VP (V ate) (NP (DT the) (N cake)))
       (PP (PRP with) (NP (DT the) (N fork)))))

Bottom-up
–explores options that won’t lead to a full parse
Top-down
–explores options that don’t match the full sentence
Dynamic programming
–caching intermediate results (memoization)
Cocke-Kasami-Younger (CKY) parser
–based on dynamic programming
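The dynamic-programming idea behind CKY can be sketched as a recognizer over spans; the toy grammar from these slides is already in the binary + lexical form CKY needs. This version only checks grammaticality, while a full parser would also store backpointers to recover trees:

```python
from collections import defaultdict

# The toy grammar from the slides, split into binary and lexical rules.
BINARY = [("S", "NP", "VP"), ("NP", "DT", "N"), ("NP", "NP", "PP"),
          ("PP", "PRP", "NP"), ("VP", "V", "NP"), ("VP", "VP", "PP")]
LEXICAL = {"a": {"DT"}, "the": {"DT"}, "child": {"N"}, "cake": {"N"},
           "fork": {"N"}, "with": {"PRP"}, "to": {"PRP"},
           "saw": {"V"}, "ate": {"V"}}

def cky(words):
    n = len(words)
    # chart[i, j]: the set of non-terminals that derive words[i:j]
    chart = defaultdict(set)
    for i, w in enumerate(words):
        chart[i, i + 1] |= LEXICAL.get(w, set())
    for span in range(2, n + 1):              # shorter spans first
        for i in range(n - span + 1):
            j = i + span
            for k in range(i + 1, j):         # split point
                for lhs, b, c in BINARY:
                    if b in chart[i, k] and c in chart[k, j]:
                        chart[i, j].add(lhs)
    return chart

words = "the child ate the cake with the fork".split()
chart = cky(words)
print("S" in chart[0, len(words)])  # True: the sentence is grammatical
```

Because every span is filled once and reused, the repeated work of plain top-down or bottom-up search disappears; the cost is O(n^3) in the sentence length times the grammar size.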

Introduction to NLP

A bottom-up parser
–Tries to match the RHS of a production until it can build an S
Shift operation
–Each word in the input sentence is pushed onto a stack
Reduce operation
–If the top n symbols on the stack match the RHS of a production, they are popped and replaced by the LHS of the production
Stopping condition
–The process stops when the input sentence has been processed and S has been popped from the stack
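The shift and reduce operations can be sketched directly. This greedy version (always reduce when some rule's right-hand side matches the top of the stack, otherwise shift) happens to work for the example traced on the next slide; in general, shift-reduce conflicts need lookahead or backtracking, and the rule set here is trimmed to what the example sentence needs:

```python
# Greedy shift-reduce parsing: reduce whenever a rule's RHS matches the
# top of the stack, otherwise shift the next word. A sketch for the
# example sentence; real parsers resolve shift-reduce conflicts.
RULES = [
    ("S", ("NP", "VP")),
    ("NP", ("DT", "N")),
    ("PP", ("PRP", "NP")),
    ("VP", ("V", "NP")),
    ("DT", ("the",)), ("DT", ("a",)),
    ("N", ("child",)), ("N", ("cake",)),
    ("PRP", ("with",)),
    ("V", ("ate",)), ("V", ("saw",)),
]

def shift_reduce(words):
    stack, buffer = [], list(words)
    while True:
        for lhs, rhs in RULES:               # try to reduce first
            if tuple(stack[-len(rhs):]) == rhs:
                stack[-len(rhs):] = [lhs]
                print("R", stack, "*", buffer)
                break
        else:
            if not buffer:                   # nothing left to shift or reduce
                return stack
            stack.append(buffer.pop(0))      # shift
            print("S", stack, "*", buffer)

print(shift_reduce("the child ate the cake".split()))  # ['S']
```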

  [ * the child ate the cake]
S [ 'the' * child ate the cake]
R [ DT * child ate the cake]
S [ DT 'child' * ate the cake]
R [ DT N * ate the cake]
R [ NP * ate the cake]
S [ NP 'ate' * the cake]
R [ NP V * the cake]
S [ NP V 'the' * cake]
R [ NP V DT * cake]
S [ NP V DT 'cake' * ]
R [ NP V DT N * ]
R [ NP V NP * ]
R [ NP VP * ]
R [ S * ]

(S (NP (DT the) (N child)) (VP (V ate) (NP (DT the) (N cake))))

NLP