Grammars
David Kauchak
CS159 – Spring 2019
some slides adapted from Ray Mooney

Admin
Assignment 3 out: due next Monday
Quiz #1

Context free grammar
S → NP VP
left-hand side (single symbol), right-hand side (one or more symbols)

Formally…
G = (NT, T, P, S)
NT: finite set of nonterminal symbols
T: finite set of terminal symbols; NT and T are disjoint
P: finite set of productions of the form A → α, where A ∈ NT and α ∈ (T ∪ NT)*
S ∈ NT: start symbol
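To make the 4-tuple concrete, here is a minimal sketch in Python (the variable names and tuple-based rule encoding are illustrative choices, not anything prescribed by the formalism); the rules are the English fragment from the next slide:

    # G = (NT, T, P, S): nonterminals, terminals, productions, start symbol.
    NT = {"S", "NP", "VP", "DetP", "AdjP", "N", "V", "Adj", "Adv"}
    T = {"a", "the", "boy", "girl", "sees", "likes", "big", "small", "very"}
    P = [
        ("S", ("NP", "VP")),
        ("VP", ("V", "NP")),
        ("NP", ("DetP", "N")), ("NP", ("DetP", "AdjP", "N")),
        ("AdjP", ("Adj",)), ("AdjP", ("Adv", "AdjP")),
        ("N", ("boy",)), ("N", ("girl",)),
        ("V", ("sees",)), ("V", ("likes",)),
        ("Adj", ("big",)), ("Adj", ("small",)),
        ("Adv", ("very",)),
        ("DetP", ("a",)), ("DetP", ("the",)),
    ]
    S = "S"

    # Every production rewrites a single nonterminal into a string over (T ∪ NT)*.
    assert all(lhs in NT and all(sym in NT | T for sym in rhs) for lhs, rhs in P)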

CFG: Example
Many possible CFGs for English; here is an example (fragment):
S → NP VP
VP → V NP
NP → DetP N | DetP AdjP N
AdjP → Adj | Adv AdjP
N → boy | girl
V → sees | likes
Adj → big | small
Adv → very
DetP → a | the
(“|” is shorthand for multiple rules with the same left-hand side)

Derivations in a CFG
Starting from S, we repeatedly rewrite a nonterminal using one of the rules of the grammar above. At each step: what can we do?

S
⇒ NP VP (S → NP VP)
⇒ DetP N VP (NP → DetP N)
⇒ the boy VP (DetP → the, N → boy)
⇒ the boy likes NP (VP → V NP, V → likes)
⇒ the boy likes a girl (NP → DetP N, DetP → a, N → girl)

Derivations in a CFG; Order of Derivation Irrelevant
We could equally have rewritten the VP first (S ⇒ NP VP ⇒ NP V NP ⇒ DetP N V NP ⇒ …) and still derived "the boy likes a girl": the order of derivation does not affect the resulting string or tree.

Derivations of CFGs
String rewriting system: we derive a string ("the boy likes a girl"). The derivation history shows the constituent tree:
(S (NP (DetP the) (N boy)) (VP (V likes) (NP (DetP a) (N girl))))
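A small, hypothetical illustration of this derivation/tree relationship in Python: the tree as nested tuples, with a function that reads off the derived string from its leaves.

    # A parse tree as nested tuples (label, child, ...); a leaf is a bare word.
    tree = ("S",
            ("NP", ("DetP", "the"), ("N", "boy")),
            ("VP", ("V", "likes"),
                   ("NP", ("DetP", "a"), ("N", "girl"))))

    def tree_yield(t):
        """Return the leaves left to right: the string this tree derives."""
        if isinstance(t, str):
            return [t]
        _label, *children = t
        return [word for child in children for word in tree_yield(child)]

    print(" ".join(tree_yield(tree)))  # the boy likes a girl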

Parsing
Parsing is the field of NLP interested in automatically determining the syntactic structure of a sentence. Parsing can be thought of as determining what sentences are “valid” English sentences; as a byproduct, we often can get the structure.

Parsing
Given a CFG and a sentence, determine the possible parse tree(s).

I eat sushi with tuna

S → NP VP
NP → N
NP → PRP
NP → N PP
VP → V NP
VP → V NP PP
PP → IN N
PRP → I
V → eat
N → sushi
N → tuna
IN → with

What parse trees are possible for this sentence? How did you do it? What if the grammar is much larger?

Parsing
"I eat sushi with tuna" has two parses under this grammar:
(S (NP (PRP I)) (VP (V eat) (NP (N sushi) (PP (IN with) (N tuna)))))
(S (NP (PRP I)) (VP (V eat) (NP (N sushi)) (PP (IN with) (N tuna))))
What is the difference between these parses?

Parsing ambiguity
Both parses of "I eat sushi with tuna" are licensed by the grammar: in the first, the PP "with tuna" attaches to the NP (sushi that contains tuna); in the second, it attaches to the VP (the eating is done with tuna). How can we decide between these?

A Simple PCFG
Probabilities!
S → NP VP 1.0
VP → V NP 0.7
VP → VP PP 0.3
PP → P NP 1.0
P → with 1.0
V → saw 1.0
NP → NP PP 0.4
NP → astronomers 0.1
NP → ears 0.18
NP → saw 0.04
NP → stars 0.18
NP → telescope 0.1
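One hypothetical way to encode this PCFG in Python is a dictionary from rules to probabilities; the assertion checks that the probabilities for each left-hand side form a distribution (sum to 1), which is what makes the grammar probabilistic:

    from collections import defaultdict

    pcfg = {
        ("S",  ("NP", "VP")): 1.0,
        ("VP", ("V", "NP")): 0.7,   ("VP", ("VP", "PP")): 0.3,
        ("PP", ("P", "NP")): 1.0,
        ("P",  ("with",)): 1.0,     ("V",  ("saw",)): 1.0,
        ("NP", ("NP", "PP")): 0.4,  ("NP", ("astronomers",)): 0.1,
        ("NP", ("ears",)): 0.18,    ("NP", ("saw",)): 0.04,
        ("NP", ("stars",)): 0.18,   ("NP", ("telescope",)): 0.1,
    }

    totals = defaultdict(float)
    for (lhs, _rhs), p in pcfg.items():
        totals[lhs] += p
    assert all(abs(total - 1.0) < 1e-9 for total in totals.values())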

Just like n-gram language modeling, PCFGs break the sentence generation process into smaller steps/probabilities. The probability of a parse is the product of the probabilities of the PCFG rules used in it.

What are the different interpretations of "astronomers saw stars with ears" under this PCFG? Which do you think is more likely?

P(parse 1, PP attached to the NP) = 1.0 × 0.1 × 0.7 × 1.0 × 0.4 × 0.18 × 1.0 × 1.0 × 0.18 = 0.0009072
P(parse 2, PP attached to the VP) = 1.0 × 0.1 × 0.3 × 0.7 × 1.0 × 0.18 × 1.0 × 1.0 × 0.18 = 0.0006804
The NP-attachment parse is more probable under this PCFG.
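The two products can be checked directly, reusing the pcfg dictionary from the sketch above (the rule lists below are hand-written for the two attachments of "astronomers saw stars with ears"):

    import math

    # PP attaches to the NP: ... (NP (NP stars) (PP with ears))
    parse1 = [("S", ("NP", "VP")), ("NP", ("astronomers",)),
              ("VP", ("V", "NP")), ("V", ("saw",)),
              ("NP", ("NP", "PP")), ("NP", ("stars",)),
              ("PP", ("P", "NP")), ("P", ("with",)), ("NP", ("ears",))]

    # PP attaches to the VP: (VP (VP saw stars) (PP with ears))
    parse2 = [("S", ("NP", "VP")), ("NP", ("astronomers",)),
              ("VP", ("VP", "PP")), ("VP", ("V", "NP")), ("V", ("saw",)),
              ("NP", ("stars",)),
              ("PP", ("P", "NP")), ("P", ("with",)), ("NP", ("ears",))]

    print(math.prod(pcfg[rule] for rule in parse1))  # ≈ 0.0009072
    print(math.prod(pcfg[rule] for rule in parse2))  # ≈ 0.0006804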

Parsing problems
Pick a model: e.g. CFG, PCFG, …
Train (or learn) a model: what CFG/PCFG rules should I use? Parameters (e.g. PCFG probabilities)? What kind of data do we have?
Parsing: determine the parse tree(s) given a sentence

PCFG: Training
If we have example parsed sentences, how can we learn a set of PCFG rules?
Supervised PCFG training: from a treebank of parsed English sentences (e.g. "John put the dog in the pen" with its tree), estimate a grammar with probabilities such as:
S → NP VP 0.9
S → VP 0.1
NP → Det A N 0.5
NP → NP PP 0.3
NP → PropN 0.2
A → ε 0.6
A → Adj A 0.4
PP → Prep NP 1.0
VP → V NP 0.7
VP → VP PP 0.3

Extracting the rules
I eat sushi with tuna
What CFG rules occur in this tree (the NP-attachment parse from before)?
S → NP VP
NP → PRP
PRP → I
VP → V NP
V → eat
NP → N PP
N → sushi
PP → IN N
IN → with
N → tuna
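A hypothetical sketch of the extraction step, reusing the nested-tuple tree encoding from earlier: each internal node contributes one rule, whose right-hand side is the sequence of its children's labels.

    def extract_rules(t):
        """Return the (lhs, rhs) rules used at each internal node of the tree."""
        if isinstance(t, str):       # a word contributes no rule
            return []
        lhs, *children = t
        rhs = tuple(c if isinstance(c, str) else c[0] for c in children)
        rules = [(lhs, rhs)]
        for child in children:
            rules.extend(extract_rules(child))
        return rules

    tree = ("S", ("NP", ("PRP", "I")),
                 ("VP", ("V", "eat"),
                        ("NP", ("N", "sushi"),
                               ("PP", ("IN", "with"), ("N", "tuna")))))
    for lhs, rhs in extract_rules(tree):
        print(lhs, "→", " ".join(rhs))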

Estimating PCFG Probabilities
We can extract the rules from the trees:
S → NP VP
NP → PRP
PRP → I
VP → V NP
V → eat
NP → N PP
N → sushi
…
How do we go from the extracted CFG rules to PCFG rules like these?
S → NP VP 1.0
VP → V NP 0.7
VP → VP PP 0.3
PP → P NP 1.0
P → with 1.0
V → saw 1.0

Estimating PCFG Probabilities
Extract the rules from the trees, then calculate the probabilities using MLE (maximum likelihood estimation).

Estimating PCFG Probabilities
Occurrences:
S → NP VP 10
S → V NP 3
S → VP PP 2
NP → N 7
NP → N PP 3
NP → DT N 6

P(S → V NP) = ?
P(S → V NP) = P(S → V NP | S) = count(S → V NP) / count(S) = 3 / 15
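The MLE computation itself is a few lines; here is a minimal sketch under the same rule encoding as before, using the counts from this slide:

    from collections import Counter

    counts = Counter({("S", ("NP", "VP")): 10, ("S", ("V", "NP")): 3,
                      ("S", ("VP", "PP")): 2,  ("NP", ("N",)): 7,
                      ("NP", ("N", "PP")): 3,  ("NP", ("DT", "N")): 6})

    lhs_totals = Counter()
    for (lhs, _rhs), c in counts.items():
        lhs_totals[lhs] += c

    # MLE: P(A → rhs) = count(A → rhs) / count(A)
    probs = {rule: c / lhs_totals[rule[0]] for rule, c in counts.items()}
    print(probs[("S", ("V", "NP"))])   # 3 / 15 = 0.2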

Grammar Equivalence What does it mean for two grammars to be equal?

Grammar Equivalence
Weak equivalence: the grammars generate the same set of strings.
Grammar 1: NP → DetP N and DetP → a | the
Grammar 2: NP → a N | the N
Strong equivalence: the grammars generate the same set of strings and the same set of derivation trees. With CFGs, two different grammars can be strongly equivalent only by way of useless rules:
Grammar 3: NP → a N | the N, DetP → many
(the DetP rule can never be used in a derivation)

Normal Forms
There are weakly equivalent normal forms (Chomsky Normal Form, Greibach Normal Form). A CFG is in Chomsky Normal Form (CNF) if all productions are of one of two forms:
A → B C, with A, B, C nonterminals
A → a, with A a nonterminal and a a terminal
Every CFG has a weakly equivalent CFG in CNF.
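As a quick hypothetical check of the definition, a grammar in the (lhs, rhs) encoding used above is in CNF exactly when every right-hand side is either two nonterminals or a single terminal:

    def is_cnf(rules, nonterminals):
        """True iff every rule is A → B C (B, C nonterminal) or A → a (a terminal)."""
        for _lhs, rhs in rules:
            two_nonterminals = len(rhs) == 2 and all(s in nonterminals for s in rhs)
            one_terminal = len(rhs) == 1 and rhs[0] not in nonterminals
            if not (two_nonterminals or one_terminal):
                return False
        return True

The example English fragment from earlier fails this check: NP → DetP AdjP N has three symbols on its right-hand side, so it would need to be binarized, as on the next slide.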

CNF Grammar
Binarized grammar (the ternary rule VP → VB NP PP is replaced using a new nonterminal VP2; unary rules such as S → VP are left in place, since many CKY implementations handle them directly):
S → VP
VP → VB NP
VP → VP2 PP
VP2 → VB NP
NP → DT NN
NP → NN
NP → NP PP
PP → IN NP
DT → the
IN → with
VB → film
VB → trust
NN → man
NN → film
NN → trust

Original grammar (not binary: VP → VB NP PP has three symbols on the right):
S → VP
VP → VB NP
VP → VB NP PP
NP → DT NN
NP → NN
NP → NP PP
PP → IN NP
DT → the
IN → with
VB → film
VB → trust
NN → man
NN → film
NN → trust

Probabilistic Grammar Conversion

Original Grammar:
S → NP VP 0.8
S → Aux NP VP 0.1
S → VP 0.1
NP → Pronoun 0.2
NP → Proper-Noun 0.2
NP → Det Nominal 0.6
Nominal → Noun 0.3
Nominal → Nominal Noun 0.2
Nominal → Nominal PP 0.5
VP → Verb 0.2
VP → Verb NP 0.5
VP → VP PP 0.3
PP → Prep NP 1.0

Chomsky Normal Form:
S → NP VP 0.8
S → X1 VP 0.1
X1 → Aux NP 1.0
S → book | include | prefer 0.01 | 0.004 | 0.006
S → Verb NP 0.05
S → VP PP 0.03
NP → I | he | she | me 0.1 | 0.02 | 0.02 | 0.06
NP → Houston | NWA 0.16 | 0.04
NP → Det Nominal 0.6
Nominal → book | flight | meal | money 0.03 | 0.15 | 0.06 | 0.06
Nominal → Nominal Noun 0.2
Nominal → Nominal PP 0.5
VP → book | include | prefer 0.1 | 0.04 | 0.06
VP → Verb NP 0.5
VP → VP PP 0.3
PP → Prep NP 1.0

States
What is the capital of this state? Helena (Montana)

Grammar questions
Can we determine if a sentence is grammatical?
Given a sentence, can we determine the syntactic structure?
Can we determine how likely a sentence is to be grammatical? To be an English sentence?
Can we generate candidate, grammatical sentences?
Next time: parsing