LING 388: Language and Computers


LING 388: Language and Computers Sandiway Fong 10/15 Lecture 15

Administrivia Reminder: Homework 4 due tonight… Extra credit homework out today, due Thursday: an easy way to pick up extra points…

Last Time Applied the transformation (left-recursion removal) to the NP and VP rules:
np(np(DT,NN)) --> dt(DT), nn(NN).
np(np(NP,PP)) --> np(NP), pp(PP).
vp(vp(VBD,NP)) --> vbd(VBD), np(NP).
vp(vp(VP,PP)) --> vp(VP), pp(PP).
Schematically, for the left-recursive grammar:
x(x(X,y)) --> x(X), [y].
x(x(z)) --> [z].
the transformed (non-left-recursive) version is:
x(X) --> [z], w(X,x(z)).
x(x(z)) --> [z].
w(W,X) --> [y], w(W,x(X,y)).
w(x(X,y),X) --> [y].
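To see why the transformation matters, here is a sketch in Python (not part of the lecture; the helper names are invented). A naive top-down parser cannot use the left-recursive rule NP --> NP PP directly, since parsing an NP would begin by trying to parse an NP, looping forever. The transformed grammar consumes the base case first and then iterates over trailing modifiers, building the same left-branching tree:

```python
def parse_np(tokens):
    """Parse 'a boy (with ...)*' into a left-branching tree, mirroring
    the transformed np(np(NP,PP)) --> np(NP), pp(PP) rule."""
    tree, rest = parse_base_np(tokens)     # base case: NP -> DT NN
    while rest and rest[0] == "with":      # then iterate over trailing PPs
        pp, rest = parse_pp(rest)
        tree = ("np", tree, pp)            # left-branching attachment
    return tree, rest

def parse_base_np(tokens):
    # Hypothetical simplification: every base NP is exactly DT NN.
    return ("np", tokens[0], tokens[1]), tokens[2:]

def parse_pp(tokens):
    # PP -> with DT NN
    obj, rest = parse_base_np(tokens[1:])
    return ("pp", "with", obj), rest

tree, rest = parse_np("a boy with a telescope with a limp".split())
print(tree)
```

The point of the w-rules in the Prolog version is the same as the while-loop here: recursion on the left becomes iteration on the right, while the tree built is still left-branching.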

Last Time Stanford Parser output for: I saw a boy with a telescope with a limp with no money

Last Time Parses for: I saw a boy with a telescope with a limp with no money. Three PPs (with a telescope, with a limp, with no money) give 14 parses…

(14 slides of parse trees, one per parse of "I saw a boy with a telescope with a limp with no money"; only the leaf words survive in this transcript.)

Why 14 parses? Parses for: I saw a boy with a telescope with a limp with no money
There are two attachment points, VP (saw a boy) and NP (a boy), for the three PPs (1. with a telescope, 2. with a limp, 3. with no money). Four cases:
1. NP (1,2,3) VP
2. NP (1,2) VP (3)
3. NP (1) VP (2,3)
4. NP VP (1,2,3)
Two items to attach at one point: two possibilities. Three items to attach at one point: five possibilities.
Total number of parses: 5 + 2 + 2 + 5 = 14
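One way to double-check the 5 + 2 + 2 + 5 = 14 breakdown (a sketch, not from the slides; the function name is invented): count the attachment shapes available at a single site with a recurrence, then sum over the four ways of splitting the three PPs, in order, between the NP and VP sites.

```python
from functools import lru_cache

@lru_cache(maxsize=None)
def ways(n):
    """Attachment shapes for n PPs at one site.

    Recurrence (assumption mirroring the slides' counts 1, 2, 5):
    the first PP heads a group containing the next k-1 PPs, and the
    remaining n-k PPs attach back at the original site.
    """
    if n == 0:
        return 1  # nothing to attach: one (empty) configuration
    return sum(ways(k - 1) * ways(n - k) for k in range(1, n + 1))

# Four cases: NP takes the first k PPs, VP takes the rest.
cases = [ways(k) * ways(3 - k) for k in (3, 2, 1, 0)]
print(cases, sum(cases))  # [5, 2, 2, 5] 14
```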

General recursive formula for # parses
Let dk = the configuration at depth k (k is the depth of the last PP).
Formula: dk → d1 + … + dk + dk+1 (k = 1, 2, 3, …)
Start: d1 (case: one PP, only one place to attach it). Total for one PP: 1
d1 + d2. Total for two PPs: 2
(d1 + d2) + (d1 + d2 + d3) = 2d1 + 2d2 + 1d3. Total for three PPs: 5
2(d1 + d2) + 2(d1 + d2 + d3) + (d1 + d2 + d3 + d4) = 5d1 + 5d2 + 3d3 + 1d4. Total for four PPs: 14

General recursive formula for # parses In principle, the formula allows us to extrapolate the number of syntactically valid parses to an arbitrary number of PPs in a row… though this is perhaps of academic interest only…

General recursive formula for # parses
Use the formula in conjunction with the possible attachments at the top level:
Case: just one PP phrase, e.g. [PP1 with a telescope]
NP (1) VP: 1d1
NP VP (1): 1d1
= 2 (total)
Case: two PP phrases, e.g. [PP1 with a telescope] [PP2 with a limp]
NP (1,2) VP: d1 + d2
NP (1) VP (2): 1d1 × 1d1
NP VP (1,2): d1 + d2
= 5 (total)

General recursive formula for # parses
Case: three PP phrases, e.g. [PP1 with a telescope] [PP2 with a limp] [PP3 with no money]
NP (1,2,3) VP: 2d1 + 2d2 + d3
NP (1,2) VP (3): (d1 + d2) × 1d1
NP (1) VP (2,3): 1d1 × (d1 + d2)
NP VP (1,2,3): 2d1 + 2d2 + d3
= 14 (total)

General recursive formula for # parses
Let's see if our formula correctly predicts the number of parses for four PPs:
Case: four PP phrases, e.g. [PP1 with a telescope] [PP2 with a limp] [PP3 with no money] [PP4 with a smile]
NP (1,2,3,4) VP: 5d1 + 5d2 + 3d3 + d4
NP (1,2,3) VP (4): (2d1 + 2d2 + d3) × 1d1
NP (1,2) VP (3,4): (d1 + d2) × (d1 + d2)
NP (1) VP (2,3,4): 1d1 × (2d1 + 2d2 + d3)
NP VP (1,2,3,4): 5d1 + 5d2 + 3d3 + d4
= 42 (total)
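Pushing the extrapolation further (a sketch, not from the slides; it assumes the per-site counts continue the pattern 1, 2, 5, 14, …), the totals for 1 through 5 PPs come out as 2, 5, 14, 42, 132:

```python
from functools import lru_cache

@lru_cache(maxsize=None)
def ways(n):
    """Attachment shapes for n PPs at a single site (1, 2, 5, 14, ...)."""
    if n == 0:
        return 1
    return sum(ways(k - 1) * ways(n - k) for k in range(1, n + 1))

def total_parses(n):
    """Parses for 'I saw a boy' + n PPs: the PPs split, in order,
    between the NP site (first k) and the VP site (the rest)."""
    return sum(ways(k) * ways(n - k) for k in range(n + 1))

print([total_parses(n) for n in range(1, 6)])  # [2, 5, 14, 42, 132]
```

This agrees with the observation later in the lecture that the number of parses grows exponentially: each additional PP roughly quadruples the count.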

Extended grammar

Counting Parses To count the number of parses, run the query:
findall(Parse, s(Parse,[i,saw,a,boy,with,a,telescope,with,a,limp,with,no,money,with,a,smile],[]), List), length(List, Number).
Note: findall/3 finds all solutions to the s/3 query; each time it finds a solution, it puts it into a list (called List here). We then evaluate the length of that List, giving Number.
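As a rough analogy (a Python sketch, not part of the lecture; splits is an invented toy goal), findall/3 plays the role of collecting every solution of a backtracking goal into a list, and length/2 plays the role of len:

```python
def splits(pps):
    """Toy nondeterministic goal: yield every way to divide a PP
    sequence, in order, between the NP site and the VP site (one of
    the two sources of ambiguity in the lecture; attachment shape
    within each site is ignored here)."""
    for k in range(len(pps) + 1):
        yield (pps[:k], pps[k:])  # (NP-attached, VP-attached)

pps = ["with a telescope", "with a limp", "with no money"]
solutions = list(splits(pps))  # ~ findall(S, split(PPs, S), List)
number = len(solutions)        # ~ length(List, Number)
print(number)  # 4
```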


Counting Parses The number of parses increases exponentially…

Homework 5 Extra Credit (optional)
We know there are 5 attachment possibilities for 2 PPs following V NP.
Show that all 5 are possible in natural language, i.e. construct sentences where each of the 5 possibilities would be the most likely parse (you may change the words for each example).
e.g. I saw a boy with a telescope on a heavy tripod