11-711 Recitation: Akshay Srivatsan

Presentation transcript:

11-711 Recitation
Akshay Srivatsan
2017-10-06
- Pineapples are not a single fruit but a colony of small fruits
- The number of rows of fruit is a Fibonacci number in any direction, at any angle
- Syntactic ambiguity: surely you must be kidding me?

THE FOLLOWING PREVIEW HAS BEEN APPROVED FOR 11-711 AUDIENCES BY THE MOTION PICTURE ASSOCIATION OF AMERICA, INC. www.filmratings.com www.mpaa.org

Reminder on HMMs
Everything you didn't pay attention to last week actually matters.

Reminder on HMMs
HMMs are just right-branching trees.
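To make this concrete, here is a small sketch (the notation is mine, not the recitation's) of a bigram HMM written as a CFG: each nonterminal is a state, and each rule both emits a word and transitions to the next state, so every derivation is a right-branching tree.

S_Det -> the S_N     (emit "the", transition Det -> N)
S_N   -> dog S_V     (emit "dog", transition N -> V)
S_V   -> barks       (emit "barks", then stop)

The derivation of "the dog barks" is the right-branching tree (S_Det the (S_N dog (S_V barks))).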

Context-Free Grammars
- Previously we had FSAs, which are hard to visualize for trees
- Rules describe transitions, emissions, and now splits (three types of rules)
- Think of grammars as puzzle pieces: if the nonterminals match, you can attach
- Like HMM transitions, each rule carries a probability conditioned on the nonterminal being expanded
[Figure: tree fragments for "I ate the cake" assembled from rule pieces such as ROOT -> S, S -> NP VP, NP -> Det N, V -> ate, N -> cake]
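As an illustrative sketch (the data layout and toy probabilities are my own, not from the slides), a PCFG can be stored as a map from each nonterminal to its possible expansions with probabilities; binary rules act like transitions/splits, and preterminal rules act like emissions:

# Toy PCFG for "I ate the cake": {lhs: [(rhs, probability), ...]}.
# Each probability is conditional on the nonterminal being expanded.
grammar = {
    "ROOT": [(("S",), 1.0)],
    "S":    [(("NP", "VP"), 1.0)],
    "NP":   [(("N",), 0.5), (("Det", "N"), 0.5)],
    "VP":   [(("V", "NP"), 1.0)],
    "N":    [(("I",), 0.5), (("cake",), 0.5)],
    "V":    [(("ate",), 1.0)],
    "Det":  [(("the",), 1.0)],
}

# Sanity check: each nonterminal's expansion probabilities sum to 1.
for lhs, expansions in grammar.items():
    assert abs(sum(p for _, p in expansions) - 1.0) < 1e-9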

The Generative Story
[Figure: top-down derivation of "I ate the cake": ROOT -> S -> NP VP -> N V NP -> N V Det N -> I ate the cake]
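The generative story can be run forward: a minimal sketch (assuming the toy grammar above) that samples a sentence by recursively expanding nonterminals, just as the slide's derivation does top-down.

import random

def generate(symbol="ROOT"):
    """Sample a yield from the toy PCFG by recursive expansion."""
    if symbol not in grammar:  # terminal symbol: emit the word
        return [symbol]
    expansions, probs = zip(*grammar[symbol])
    rhs = random.choices(expansions, weights=probs)[0]
    return [word for child in rhs for word in generate(child)]

print(" ".join(generate()))  # e.g. "I ate the cake"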

Parsing
- Recover the parse from the words (why is this hard?)
[Figure: the same tree for "I ate the cake", now with only the words observed and the structure to be recovered]

Second sentence assumes an unlikely parse of the first

Viterbi: Decoding an HMM
- Recover the most likely path with dynamic programming
[Figure: trellis with one column of candidate tags (Det, JJ, ...) per word position]
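A hedged sketch of Viterbi decoding (the parameter dictionaries are placeholders to be filled in; none of the names come from the recitation):

def viterbi(words, tags, log_init, log_trans, log_emit):
    """Most likely tag path under a bigram HMM, by DP over the trellis.

    log_init[t], log_trans[(t_prev, t)], and log_emit[(t, w)] are log probabilities.
    """
    best = {t: log_init[t] + log_emit[(t, words[0])] for t in tags}
    backpointers = []
    for w in words[1:]:
        ptr, new_best = {}, {}
        for t in tags:
            prev = max(tags, key=lambda s: best[s] + log_trans[(s, t)])
            ptr[t] = prev
            new_best[t] = best[prev] + log_trans[(prev, t)] + log_emit[(t, w)]
        backpointers.append(ptr)
        best = new_best
    path = [max(tags, key=lambda t: best[t])]
    for ptr in reversed(backpointers):  # follow backpointers right to left
        path.append(ptr[path[-1]])
    return list(reversed(path))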

CKY: Decoding a Tree
- The tree-shaped trellis is hard to visualize, so we use a chart
- Note that this is more analogous to the backward algorithm than the forward
[Figure: tree of trellis cells, each holding the candidate nonterminals NP, VP, PP, S]

The Chart
- Cells denote possible locations for nonterminals
- Each cell can contain as many elements as there are nonterminals
- Index the chart with spans
[Figure: empty triangular chart over word positions 1 2 3 4]
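One hedged way to realize the chart in code (the layout is my own choice): a dictionary keyed by spans, where chart[(i, j)] holds the nonterminals that can derive words[i:j]. Width-1 spans are seeded from the preterminal (emission-like) rules of the toy grammar above.

from collections import defaultdict

words = ["I", "ate", "the", "cake"]
chart = defaultdict(set)  # chart[(i, j)] = nonterminals covering words[i:j]

for i, w in enumerate(words):
    for lhs, expansions in grammar.items():
        if any(rhs == (w,) for rhs, _ in expansions):
            chart[(i, i + 1)].add(lhs)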

The Chart
[Figure: filled chart for "I ate the cake", with N, V, Det, N along the diagonal and NP, NP, VP, S in the wider spans]

The Chart
- Spans combine to form larger spans (chalkboard); see the sketch below
- What about unary rules?
[Figure: the chart for "I ate the cake" again, showing adjacent spans over positions 1 2 3 4 combining into larger spans]
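Continuing the sketch, larger spans are filled bottom-up by trying every split point and every binary rule; unary rules are handled by a closure pass over each cell (again, the helper names are mine):

binary = [(lhs, rhs) for lhs, exps in grammar.items()
          for rhs, _ in exps if len(rhs) == 2]
unary = [(lhs, rhs) for lhs, exps in grammar.items()
         for rhs, _ in exps if len(rhs) == 1 and rhs[0] in grammar]

def unary_closure(cell):
    """Apply unary rules (e.g. NP -> N, ROOT -> S) until nothing changes."""
    changed = True
    while changed:
        changed = False
        for lhs, (b,) in unary:
            if b in cell and lhs not in cell:
                cell.add(lhs)
                changed = True

n = len(words)
for i in range(n):
    unary_closure(chart[(i, i + 1)])

for width in range(2, n + 1):           # smaller spans before larger ones
    for i in range(n - width + 1):
        j = i + width
        for k in range(i + 1, j):       # every way to split [i, j)
            for lhs, (b, c) in binary:
                if b in chart[(i, k)] and c in chart[(k, j)]:
                    chart[(i, j)].add(lhs)
        unary_closure(chart[(i, j)])

print("S" in chart[(0, n)])  # True: the toy grammar accepts the sentence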

“Time flies like an arrow”

“Fruit flies like a banana”
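The two sentences look parallel but take different structures; in bracket notation (my rendering of the standard analyses, using rules like those on the next slide):

(S (NP time) (VP (VP flies) (PP (P like) (NP (Det an) (N arrow)))))
(S (NP (NP fruit) (NP flies)) (VP (V like) (NP (Det a) (N banana))))

In the first, "flies" is the verb and "like an arrow" a modifier; in the second, "fruit flies" is a compound noun and "like" is the verb.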

Parsing with CKY
Sentence: time flies like an arrow
PCFG rules (weights are negative log probabilities):
1 : NP -> Det N
3 : VP -> V NP
2 : PP -> P NP
2 : VP -> VP PP
1 : S -> NP VP
3 : NP -> NP NP
1 : NP -> time
3 : NP -> flies
2 : VP -> flies
3 : V -> like
2 : P -> like
1 : Det -> an
1 : N -> arrow
- Cells are roughly equivalent to time steps in an HMM
- Use negative log probabilities, so weights add along a derivation
- The most likely parse is given by the most likely local decisions
- You get a redundancy at the top of the tree: keep only the most likely entry
[Figure: completed chart over "time flies like an arrow". Diagonal: 1 NP (time); 3 NP, 2 VP (flies); 3 V, 2 P (like); 1 Det (an); 1 N (arrow). Wider spans: 7 NP, 4 S (time flies); 3 NP (an arrow); 7 PP, 9 VP (like an arrow); 11 VP, 13 S (flies like an arrow); at the root both 13 S and 17 S are derived, and only the cheaper 13 S is kept]
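A hedged sketch of the weighted CKY pass (the rule encoding and function name are mine): each chart entry keeps the minimum total weight, i.e. the derivation minimizing the summed negative log probabilities, together with a backpointer for recovering the tree.

RULES = [  # (weight, lhs, rhs), weights as on the slide
    (1, "NP", ("Det", "N")), (3, "VP", ("V", "NP")),
    (2, "PP", ("P", "NP")),  (2, "VP", ("VP", "PP")),
    (1, "S",  ("NP", "VP")), (3, "NP", ("NP", "NP")),
    (1, "NP", ("time",)),  (3, "NP", ("flies",)), (2, "VP", ("flies",)),
    (3, "V",  ("like",)),  (2, "P",  ("like",)),
    (1, "Det", ("an",)),   (1, "N",  ("arrow",)),
]
BINARY = [(w, l, r) for w, l, r in RULES if len(r) == 2]

def weighted_cky(words):
    n = len(words)
    best = {}  # (i, j, nonterminal) -> (total weight, backpointer)
    for i, w in enumerate(words):
        for wt, lhs, rhs in RULES:
            if rhs == (w,) and wt < best.get((i, i + 1, lhs), (float("inf"),))[0]:
                best[(i, i + 1, lhs)] = (wt, w)
    for width in range(2, n + 1):
        for i in range(n - width + 1):
            j = i + width
            for k in range(i + 1, j):
                for wt, lhs, (b, c) in BINARY:
                    if (i, k, b) in best and (k, j, c) in best:
                        total = wt + best[(i, k, b)][0] + best[(k, j, c)][0]
                        if total < best.get((i, j, lhs), (float("inf"),))[0]:
                            best[(i, j, lhs)] = (total, (k, b, c))
    return best

best = weighted_cky("time flies like an arrow".split())
print(best[(0, 5, "S")][0])  # 13, the cheaper of the two root S entries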