CS626-460: Language Technology For The Web/Natural Language Processing


CS626-460: Language Technology For The Web/Natural Language Processing
PCFG and Syntactic Disambiguation

Ambiguity
Types of ambiguity: lexical, syntactic, and discourse.
Syntactic ambiguity includes preposition attachment ambiguity and clause attachment ambiguity.

Prepositional Attachment Ambiguity
"saw the boy with the telescope"
Structure: saw/V [the boy]/NP1 with/P [the telescope]/NP2, with the PP (PP1 = P NP2) hanging under the verb phrase VP1.
Where does PP1 attach: with NP1 or with VP1?
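The structural ambiguity can be made concrete by counting parses. The sketch below is an illustration (not from the slides): it runs CKY-style chart counting over a tiny hand-written CNF grammar and finds exactly two parses for the phrase, one per attachment site.

```python
from collections import defaultdict

# A tiny CNF grammar for the PP-attachment example (invented for illustration).
unary = {  # terminal -> set of preterminals
    "saw": {"V"}, "the": {"D"}, "boy": {"N"}, "telescope": {"N"}, "with": {"P"},
}
binary = [  # (lhs, left child, right child)
    ("VP", "V", "NP"), ("VP", "VP", "PP"),   # PP can attach to the VP...
    ("NP", "D", "N"), ("NP", "NP", "PP"),    # ...or to the NP
    ("PP", "P", "NP"),
]

def count_parses(words, goal="VP"):
    n = len(words)
    # chart[i][j][A] = number of ways nonterminal A derives words[i:j]
    chart = [[defaultdict(int) for _ in range(n + 1)] for _ in range(n + 1)]
    for i, w in enumerate(words):
        for a in unary[w]:
            chart[i][i + 1][a] += 1
    for span in range(2, n + 1):
        for i in range(n - span + 1):
            j = i + span
            for k in range(i + 1, j):
                for lhs, left, right in binary:
                    chart[i][j][lhs] += chart[i][k][left] * chart[k][j][right]
    return chart[0][n][goal]

print(count_parses("saw the boy with the telescope".split()))  # 2
```

The two derivations counted are precisely VP-attachment (VP -> VP PP) and NP-attachment (NP -> NP PP).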

Clause Attachment Ambiguity
Sentence: "Ram told the child that he loved that Shyam came to the ground."
Parse 1: here the clause "that he loved" attaches to the clause "that Shyam came to the ground".
[Parse tree omitted. POS tags: Ram/NNP told/VB the/DT child/NN that/C he/PRP loved/VBZ that/C Shyam/NNP came/VB to/P the/DT ground/NN]

Clause Attachment Ambiguity
Sentence: "Ram told the child that he loved that Shyam came to the ground."
Parse 2: here the clause "that he loved" attaches to "told the child".
[Parse tree omitted. POS tags: Ram/NNP told/VB the/DT child/NN that/C he/PRP loved/VBZ that/C Shyam/NNP came/VB to/P the/DT ground/NN]

PP Attachment Disambiguation
Methods: rule-based methods and statistical methods, e.g., Hindle & Rooth (1993) and Ratnaparkhi (1996).

Probabilistic Formulation
Given V NP1 P NP2, where does the PP (P NP2) attach?
It attaches to V if:
P(V-attach | V NP1 P NP2) > P(NP1-attach | V NP1 P NP2)
and attaches to NP1 if:
P(NP1-attach | V NP1 P NP2) > P(V-attach | V NP1 P NP2)
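Under a PCFG, the probability of a parse tree is the product of the probabilities of the rules used in its derivation, so the two attachments can be ranked directly. A minimal sketch with invented rule probabilities (the numbers and rule inventory are assumptions for illustration):

```python
import math

# Hypothetical PCFG rule probabilities (invented for illustration).
rule_prob = {
    "VP -> V NP PP": 0.2,  # PP attached at the verb phrase (V-attach)
    "VP -> V NP":    0.5,
    "NP -> NP PP":   0.2,  # PP attached at the noun phrase (NP1-attach)
    "NP -> D N":     0.6,
    "PP -> P NP":    1.0,
}

def parse_probability(rules_used):
    """P(tree) = product of the probabilities of the rules in the derivation."""
    return math.prod(rule_prob[r] for r in rules_used)

# Two derivations of "saw the boy with the telescope" (lexical rules omitted):
v_attach  = ["VP -> V NP PP", "NP -> D N", "PP -> P NP", "NP -> D N"]
np_attach = ["VP -> V NP", "NP -> NP PP", "NP -> D N", "PP -> P NP", "NP -> D N"]

p_v, p_np = parse_probability(v_attach), parse_probability(np_attach)
print(f"V-attach: {p_v:.4f}  NP1-attach: {p_np:.4f}")
```

With these (made-up) numbers the V-attach derivation wins; a different grammar estimate could of course reverse the preference.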

Assumptions
Assumption 1 (independence assumption):
P(X-attach | V NP1 P NP2 <other words>) = P(X-attach | V NP1 P NP2)
Even then, the problem can be complex if NP1 and NP2 are complex.
Assumption 2 (head words):
P(X-attach | V NP1 P NP2) = P(X-attach | V N1 P N2), where N1 is head(NP1) and N2 is head(NP2).
Example: "Ram saw the boy with the telescope."
P(saw-attach | saw the boy with the telescope) = P(saw-attach | saw boy with telescope)
Assumption 3 (used sometimes):
P(X-attach | V N1 P N2) = P(X-attach | V N1 P)
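Assumption 3 makes the probabilities directly estimable by relative frequency from a tuple corpus. A toy sketch with invented data; the back-off scheme (falling back to the preposition alone when the <V, N1, P> triple is unseen) is an assumption added for illustration, not a method stated on the slides:

```python
# Toy labeled corpus of <V, N1, P, a> tuples; "V" marks V-attach (a = 1)
# and "N1" marks N1-attach (a = 2). All examples are invented.
corpus = [
    ("saw", "boy", "with", "V"),
    ("saw", "man", "with", "V"),
    ("ate", "pizza", "with", "N1"),
    ("saw", "girl", "with", "V"),
    ("bought", "book", "with", "N1"),
]

def p_attach(v, n1, p, a):
    """Relative-frequency estimate of P(a-attach | V N1 P), per Assumption 3."""
    matches = [row for row in corpus if row[:3] == (v, n1, p)]
    if not matches:  # unseen triple: back off to the preposition alone
        matches = [row for row in corpus if row[2] == p]
    if not matches:
        return 0.5  # no evidence at all
    return sum(1 for row in matches if row[3] == a) / len(matches)

def decide(v, n1, p):
    return "V" if p_attach(v, n1, p, "V") > p_attach(v, n1, p, "N1") else "N1"

print(decide("saw", "boy", "with"))    # seen triple
print(decide("drank", "tea", "with"))  # unseen triple, backs off to "with"
```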

Resources Needed
Observation: this is a two-class classification problem, so good candidates are an SVM, a backpropagation neural network (BP-NN), or a perceptron.
Training corpora: tuples <V, N1, P, a>, where a = 1 for V-attach and a = 2 for N-attach, e.g.:
V N1 P a
see boy with 1
drink tea find box 2
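The two-class formulation can be sketched with the simplest of the listed candidates, a perceptron over one-hot indicator features of the <V, N1, P> tuple. The training data, feature templates, and the +1/-1 label encoding (standing in for a = 1 / a = 2) are all invented for illustration:

```python
# Toy training set: ((V, N1, P), y) with y = +1 for V-attach (a = 1)
# and y = -1 for N1-attach (a = 2). Examples are invented.
train = [
    (("see", "boy", "with"), 1),
    (("eat", "pizza", "with"), -1),
    (("see", "man", "with"), 1),
    (("buy", "book", "about"), -1),
]

def features(x):
    """One-hot indicator features, including word-pair conjunctions."""
    v, n1, p = x
    return {f"V={v}", f"N1={n1}", f"P={p}", f"V,P={v},{p}", f"N1,P={n1},{p}"}

weights = {}

def score(x):
    return sum(weights.get(f, 0.0) for f in features(x))

# Classic perceptron updates: on a mistake, move weights toward the label.
for _ in range(10):  # a few epochs over the toy data
    for x, y in train:
        if y * score(x) <= 0:
            for f in features(x):
                weights[f] = weights.get(f, 0.0) + y

def predict(x):
    return 1 if score(x) > 0 else -1

print([predict(x) for x, _ in train])  # [1, -1, 1, -1]
```

On this tiny separable set the perceptron converges in two epochs; real systems would add regularized training and richer lexical features.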

EXTRA SLIDES

Clause Attachment Ambiguity (step-by-step derivation)
[Parse trees omitted. These slides build the parse incrementally through intermediate trees S1, S2, and S3 for the sentence "Ram told the child that he loved that Shyam came to the ground", with POS tags Ram/NNP told/VB the/DT child/NN that/C he/PRP loved/VBZ that/C Shyam/NNP came/VB to/P the/DT ground/NN.]