CS60057 Speech & Natural Language Processing, Autumn 2007
Lecture 12, 22 August 2007

LING 180 / SYMBSYS 138: Intro to Computer Speech and Language Processing
Lecture 13: Grammar and Parsing (I), October 24, 2006
Dan Jurafsky (thanks to Jim Martin for many of these slides!)

Outline for Grammar/Parsing Week
- Context-Free Grammars and Constituency
- Some common CFG phenomena for English:
  - Sentence-level constructions
  - NP, PP, VP
  - Coordination
  - Subcategorization
- Top-down and bottom-up parsing
- Earley parsing
- Quick sketch of probabilistic parsing

Review
- Parts of speech: basic syntactic/morphological categories that words belong to
- Part-of-speech tagging: assigning parts of speech to all the words in a sentence

Syntax
- Syntax: from Greek syntaxis, "setting out together, arrangement"
- Refers to the way words are arranged together, and the relationships between them
- Distinction:
  - Prescriptive grammar: how people ought to talk
  - Descriptive grammar: how they do talk
- The goal of syntax is to model the knowledge that people unconsciously have about the grammar of their native language

Syntax
- Why should you care?
  - Grammar checkers
  - Question answering
  - Information extraction
  - Machine translation

Four key ideas of syntax
- Constituency (we'll spend most of our time on this)
- Grammatical relations
- Subcategorization
- Lexical dependencies
Plus one part we won't have time for:
- Movement / long-distance dependency

Context-Free Grammars
- Capture constituency and ordering
  - Ordering: what are the rules that govern the ordering of words and bigger units in the language?
  - Constituency: how words group into units, and how the various kinds of units behave

Constituency
- Noun phrases (NPs):
  - Three parties from Brooklyn
  - A high-class spot such as Mindy's
  - The Broadway coppers
  - They
  - Harry the Horse
  - The reason he comes into the Hot Box
- How do we know these form a constituent? They can all appear before a verb:
  - Three parties from Brooklyn arrive…
  - A high-class spot such as Mindy's attracts…
  - The Broadway coppers love…
  - They sit…

Constituency (II)
- They can all appear before a verb:
  - Three parties from Brooklyn arrive…
  - A high-class spot such as Mindy's attracts…
  - The Broadway coppers love…
  - They sit…
- But individual words can't always appear before verbs:
  - *from arrive…
  - *as attracts…
  - *the is…
  - *spot is…
- So we must be able to state generalizations like: noun phrases occur before verbs

Constituency (III)
- Preposing and postposing:
  - On September 17th, I'd like to fly from Atlanta to Denver
  - I'd like to fly on September 17th from Atlanta to Denver
  - I'd like to fly from Atlanta to Denver on September 17th
- But not:
  - *On September, I'd like to fly 17th from Atlanta to Denver
  - *On I'd like to fly September 17th from Atlanta to Denver

CFG Examples
- S -> NP VP
- NP -> Det NOMINAL
- NOMINAL -> Noun
- VP -> Verb
- Det -> a
- Noun -> flight
- Verb -> left
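To make the toy grammar concrete, here is a minimal sketch in Python using NLTK (the choice of toolkit is an assumption; the slides don't name one). nltk.CFG.fromstring builds the grammar, and a chart parser recovers the tree for "a flight left":

    import nltk

    grammar = nltk.CFG.fromstring("""
        S -> NP VP
        NP -> Det NOMINAL
        NOMINAL -> Noun
        VP -> Verb
        Det -> 'a'
        Noun -> 'flight'
        Verb -> 'left'
    """)

    parser = nltk.ChartParser(grammar)
    for tree in parser.parse(['a', 'flight', 'left']):
        print(tree)
    # (S (NP (Det a) (NOMINAL (Noun flight))) (VP (Verb left)))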

CFGs
- S -> NP VP
  - This says that there are units called S, NP, and VP in this language
  - And that an S consists of an NP followed immediately by a VP
  - It doesn't say that that's the only kind of S
  - Nor does it say that this is the only place that NPs and VPs occur

Generativity
- As with FSAs and FSTs, you can view these rules as either analysis or synthesis machines:
  - Generate strings in the language
  - Reject strings not in the language
  - Impose structures (trees) on strings in the language

Derivations
- A derivation is a sequence of rules applied to a string that accounts for that string:
  - Covers all the elements in the string
  - Covers only the elements in the string

Derivations as Trees (figure omitted)

CFGs more formally
- A context-free grammar has 4 parameters ("is a 4-tuple"):
  1) A set of non-terminal symbols ("variables") N
  2) A set of terminal symbols Σ (disjoint from N)
  3) A set of productions P, each of the form A -> α, where A is a non-terminal and α is a string of symbols from the infinite set of strings (Σ ∪ N)*
  4) A designated start symbol S

Defining a CF language via derivation
- A string A derives a string B if A can be rewritten as B via some series of rule applications
- More formally: if A -> γ is a production of P, and α and β are any strings in the set (Σ ∪ N)*, then we say that αAβ directly derives αγβ, written αAβ ⇒ αγβ
- Derivation is a generalization of direct derivation:
  - Let α1, α2, …, αm be strings in (Σ ∪ N)*, m ≥ 1, such that α1 ⇒ α2, α2 ⇒ α3, …, αm-1 ⇒ αm
  - Then we say that α1 derives αm, written α1 ⇒* αm
- We then formally define the language LG generated by a grammar G as the set of strings of terminal symbols derivable from S:
  - LG = {w | w is in Σ* and S ⇒* w}
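As a sketch of these definitions in code (the dict representation is my own assumption, not from the slides), the function below performs a leftmost derivation with the toy grammar from the CFG Examples slide, printing each string α1 ⇒ α2 ⇒ … ⇒ αm until only terminals remain:

    # Toy grammar: non-terminal -> list of right-hand sides.
    GRAMMAR = {
        'S':       [['NP', 'VP']],
        'NP':      [['Det', 'NOMINAL']],
        'NOMINAL': [['Noun']],
        'VP':      [['Verb']],
        'Det':     [['a']], 'Noun': [['flight']], 'Verb': [['left']],
    }

    def leftmost_derivation(symbols):
        """Rewrite the leftmost non-terminal until only terminals remain."""
        print(' '.join(symbols))
        for i, sym in enumerate(symbols):
            if sym in GRAMMAR:                   # found the leftmost non-terminal
                rhs = GRAMMAR[sym][0]            # this toy grammar has one rule each
                return leftmost_derivation(symbols[:i] + rhs + symbols[i+1:])
        return symbols                           # all terminals: a string in L(G)

    leftmost_derivation(['S'])
    # S => NP VP => Det NOMINAL VP => a NOMINAL VP => a Noun VP => ... => a flight left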

Parsing
- Parsing is the process of taking a string and a grammar and returning a parse tree (or many parse trees) for that string

Context?
- The notion of "context" in CFGs has nothing to do with the ordinary meaning of the word context in language
- All it really means is that the non-terminal on the left-hand side of a rule is out there all by itself (free of context):
  - A -> B C means that I can rewrite an A as a B followed by a C regardless of the context in which A is found

Key Constituents (English)
- Sentences
- Noun phrases
- Verb phrases
- Prepositional phrases

Sentence-Types
- Declaratives: A plane left (S -> NP VP)
- Imperatives: Leave! (S -> VP)
- Yes-no questions: Did the plane leave? (S -> Aux NP VP)
- Wh-questions: When did the plane leave? (S -> WH Aux NP VP)

NPs
- NP -> Pronoun: I came, you saw it, they conquered
- NP -> Proper-Noun: Los Angeles is west of Texas; John Hennessy is the president of Stanford
- NP -> Det Noun: the president
- NP -> Nominal; Nominal -> Noun Noun: a morning flight to Denver

PPs
- PP -> Preposition NP: from LA, to Boston, on Tuesday, with lunch

Recursion
- We'll have to deal with rules where the non-terminal on the left also appears somewhere on the right (directly):
  - NP -> NP PP    [[The flight] [to Boston]]
  - VP -> VP PP    [[departed Miami] [at noon]]

Recursion
- Of course, this is what makes syntax interesting:
  - flights from Denver
  - flights from Denver to Miami
  - flights from Denver to Miami in February
  - flights from Denver to Miami in February on a Friday
  - flights from Denver to Miami in February on a Friday under $300
  - flights from Denver to Miami in February on a Friday under $300 with lunch

Recursion
- The same phrases, bracketed:
  - [[flights] [from Denver]]
  - [[[flights] [from Denver]] [to Miami]]
  - [[[[flights] [from Denver]] [to Miami]] [in February]]
  - [[[[[flights] [from Denver]] [to Miami]] [in February]] [on a Friday]]
  - Etc.

Implications of recursion and context-freeness
- If you have a rule like VP -> V NP, it only cares that the thing after the verb is an NP; it doesn't have to know about the internal affairs of that NP

The Point
- VP -> V NP
- "I hate…"
  - flights from Denver
  - flights from Denver to Miami
  - flights from Denver to Miami in February
  - flights from Denver to Miami in February on a Friday
  - flights from Denver to Miami in February on a Friday under $300
  - flights from Denver to Miami in February on a Friday under $300 with lunch

Bracketed Notation
- [S [NP [Pro I]] [VP [V prefer] [NP [Det a] [Nom [N morning] [N flight]]]]]

Coordination Constructions
- S -> S and S: John went to NY and Mary followed him
- NP -> NP and NP
- VP -> VP and VP
- …
- In fact, the right rule for English is X -> X and X

Problems
- Agreement
- Subcategorization
- Movement (for want of a better term)

Agreement
- This dog / Those dogs
- This dog eats / Those dogs eat
- *This dogs / *Those dog
- *This dog eat / *Those dogs eats

Possible CFG Solution
- S -> NP VP; NP -> Det Nominal; VP -> V NP; …
- SgS -> SgNP SgVP
- PlS -> PlNP PlVP
- SgNP -> SgDet SgNom
- PlNP -> PlDet PlNom
- PlVP -> PlV NP
- SgVP -> SgV NP
- …

CFG Solution for Agreement
- It works, and it stays within the power of CFGs
- But it's ugly
- And it doesn't scale all that well
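To make "doesn't scale" concrete: every feature encoded this way multiplies the number of rule copies. A back-of-the-envelope sketch (the feature inventory here is illustrative, not from the slides):

    from itertools import product

    base_rules = ['S -> NP VP', 'NP -> Det Nominal', 'VP -> V NP']
    features = {'number': ['Sg', 'Pl'], 'person': ['1', '2', '3']}

    combos = list(product(*features.values()))   # 2 x 3 = 6 feature combinations
    print(len(base_rules) * len(combos))         # 18 specialized rules from 3

Add gender, case, or tense and the grammar explodes; feature structures and unification (J&M Chapter 15) handle the same constraints far more compactly.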

Subcategorization
- Sneeze: John sneezed
- Find: Please find [a flight to NY]_NP
- Give: Give [me]_NP [a cheaper fare]_NP
- Help: Can you help [me]_NP [with a flight]_PP
- Prefer: I prefer [to leave earlier]_TO-VP
- Said: You said [United has a flight]_S
- …

Subcategorization
- *John sneezed the book
- *I prefer United has a flight
- *Give with a flight
- Subcategorization expresses the constraints that a predicate (a verb, for now) places on the number and syntactic types of the arguments it wants to occur with
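One simple way to picture subcategorization is a lexicon mapping each verb to its allowed argument frames. The representation below is a sketch of my own, not the course's, using the slide's verbs:

    # Verb -> list of allowed argument frames (by syntactic type).
    SUBCAT = {
        'sneeze': [[]],                  # intransitive
        'find':   [['NP']],
        'give':   [['NP', 'NP']],
        'help':   [['NP', 'PP']],
        'prefer': [['TO-VP']],
        'say':    [['S']],
    }

    def frame_ok(verb, args):
        """True iff the verb subcategorizes for this sequence of argument types."""
        return args in SUBCAT.get(verb, [])

    print(frame_ok('sneeze', []))        # True:  "John sneezed"
    print(frame_ok('sneeze', ['NP']))    # False: "*John sneezed the book"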

So?
- So the various rules for VPs overgenerate: they permit strings containing verbs and arguments that don't go together
- For example, VP -> V NP licenses "sneezed the book" as a VP, since "sneeze" is a verb and "the book" is a valid NP

Subcategorization
- Sneeze: John sneezed
- Find: Please find [a flight to NY]_NP
- Give: Give [me]_NP [a cheaper fare]_NP
- Help: Can you help [me]_NP [with a flight]_PP
- Prefer: I prefer [to leave earlier]_TO-VP
- Told: I was told [United has a flight]_S
- …

Forward Pointer
- It turns out that verb subcategorization facts will provide a key element for semantic analysis (determining who did what to whom in an event)

Possible CFG Solution
- VP -> V
- VP -> V NP
- VP -> V NP PP
- …
Replace these with subcategorized variants:
- VP -> IntransV
- VP -> TransV NP
- VP -> TransVwPP NP PP
- …

Movement
- Core example: My travel agent booked the flight

Movement
- Core example: [[My travel agent]_NP [booked [the flight]_NP]_VP]_S
- I.e., "book" is a straightforward transitive verb: it expects a single NP argument within the VP, and a single NP argument as the subject

Movement
- What about: Which flight do you want me to have the travel agent book?
- The direct-object argument to "book" isn't appearing in the right place; it is in fact a long way from where it's supposed to appear
- And note that it's separated from its verb by two other verbs

CFGs: a summary
- CFGs appear to be just about what we need to account for a lot of basic syntactic structure in English
- But there are problems
  - They can be dealt with adequately, although not elegantly, by staying within the CFG framework
- There are simpler, more elegant solutions that take us out of the CFG framework (beyond its formal power)
  - Syntactic theories: HPSG, LFG, CCG, Minimalism, etc.

Other Syntactic Stuff
- Grammatical relations
  - Subject: I booked a flight to New York; The flight was booked by my agent
  - Object: I booked a flight to New York
  - Complement: I said that I wanted to leave

Dependency Parsing
- Word-to-word links instead of constituency
- Based on the European rather than the American tradition, but dates back to the Greeks
- The original notions of subject and object, and the progenitor of subcategorization (called "valence"), came out of dependency theory
- Dependency parsing is quite popular as a computational model, since relationships between words are quite useful

Parsing
- Parsing: assigning correct trees to input strings
- Correct tree: a tree that covers all and only the elements of the input and has an S at the top
- For now: enumerate all possible trees
- A further task, disambiguation: choosing the correct tree from among all the possible trees

Parsing: examples using Rion Snow's visualizer
- Stanford parser
- Minipar parser
- Link grammar parser

Treebanks
- Parsed corpora in the form of trees
- Examples on the next slide

Parsed Corpora: Treebanks
- The Penn Treebank
  - The Brown corpus
  - The WSJ corpus
- Tgrep (a tool for searching treebanks)
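NLTK ships a small sample of the WSJ portion of the Penn Treebank, which is an easy way to poke at real treebank trees (using NLTK here is an assumption; the slide only names the corpora and Tgrep):

    import nltk
    nltk.download('treebank')            # fetch the sample once
    from nltk.corpus import treebank

    tree = treebank.parsed_sents()[0]    # first WSJ tree in the sample
    print(tree)                          # bracketed Penn Treebank notation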

Parsing
- As with everything of interest, parsing involves a search, which involves making choices
- We'll start with some basic (meaning bad) methods before moving on to the one or two that you need to know

For Now
- Assume…
  - You have all the words already in some buffer
  - The input isn't POS-tagged
  - We won't worry about morphological analysis
  - All the words are known

Top-Down Parsing
- Since we're trying to find trees rooted with an S (sentences), start with the rules that give us an S
- Then work your way down from there to the words
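A minimal top-down recognizer, as a sketch (the grammar and lexicon are illustrative assumptions): expand non-terminals depth-first and left-to-right, backtracking when a word fails to match. Note that it would loop forever on a left-recursive rule like NP -> NP PP; see the problem slides below:

    GRAMMAR = {
        'S':  [['NP', 'VP']],
        'NP': [['Det', 'Noun']],
        'VP': [['Verb'], ['Verb', 'NP']],
    }
    LEXICON = {'a': 'Det', 'the': 'Det', 'flight': 'Noun', 'plane': 'Noun',
               'left': 'Verb', 'booked': 'Verb'}

    def parse(symbols, words):
        """Can this sequence of symbols derive exactly these words?"""
        if not symbols:
            return not words                         # both exhausted: success
        first, rest = symbols[0], symbols[1:]
        if first in GRAMMAR:                         # expand, trying each rule
            return any(parse(rhs + rest, words) for rhs in GRAMMAR[first])
        return bool(words) and LEXICON.get(words[0]) == first \
            and parse(rest, words[1:])               # match a word, advance

    print(parse(['S'], ['the', 'flight', 'left']))   # True
    print(parse(['S'], ['flight', 'the', 'left']))   # False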

Top-Down Space (figure omitted)

Bottom-Up Parsing
- Of course, we also want trees that cover the input words, so start with trees that link up with the words in the right way
- Then work your way up from there
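The bottom-up counterpart, sketched as a shift-reduce recognizer (again with an illustrative grammar): shift each word's category onto a stack and reduce whenever the top of the stack matches a rule's right-hand side. Greedy reduction happens to work on this tiny grammar; a real shift-reduce parser must search over shift/reduce choices:

    RULES = [('S', ['NP', 'VP']), ('NP', ['Det', 'Noun']),
             ('VP', ['Verb', 'NP']), ('VP', ['Verb'])]
    LEXICON = {'a': 'Det', 'the': 'Det', 'flight': 'Noun', 'left': 'Verb'}

    def shift_reduce(words):
        stack = []
        for w in words:
            stack.append(LEXICON[w])                 # shift (the word's category)
            reduced = True
            while reduced:                           # reduce while anything matches
                reduced = False
                for lhs, rhs in RULES:
                    if stack[-len(rhs):] == rhs:
                        stack[-len(rhs):] = [lhs]    # replace the RHS with the LHS
                        reduced = True
                        break
        return stack == ['S']

    print(shift_reduce(['the', 'flight', 'left']))   # True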

Bottom-Up Space (figure omitted)

Control
- Of course, in both cases we left out how to keep track of the search space and how to make choices:
  - Which node to try to expand next
  - Which grammar rule to use to expand a node

Top-Down, Depth-First, Left-to-Right Search (figure omitted)

Example (three figure slides stepping through the search, omitted)

Control
- Does this sequence make any sense?

Top-Down and Bottom-Up
- Top-down:
  - Only searches for trees that can be answers (i.e., S's)
  - But also suggests trees that are not consistent with the words
- Bottom-up:
  - Only forms trees consistent with the words
  - But suggests trees that make no sense globally

So Combine Them
- There are a million ways to combine top-down expectations with bottom-up data to get more efficient searches
- Most use one kind as the control and the other as a filter, as in top-down parsing with bottom-up filtering

Bottom-Up Filtering (figure omitted)

Top-Down, Depth-First, Left-to-Right Search (figure omitted)

Top-Down, Depth-First, Left-to-Right Search (II) (figure omitted)

Top-Down, Depth-First, Left-to-Right Search (III) (figure omitted)

Top-Down, Depth-First, Left-to-Right Search (IV) (figure omitted)

Adding Bottom-Up Filtering (figure omitted)

Three problems with the top-down, depth-first, left-to-right parser
- Left-recursion
- Ambiguity
- Inefficient reparsing of subtrees

Left-Recursion
- What happens in the following situation:
  - S -> NP VP
  - S -> Aux NP VP
  - NP -> NP PP
  - NP -> Det Nominal
  - …
- with a sentence starting with "Did the flight…"?
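A top-down parser expanding NP keeps choosing NP -> NP PP and never consumes a word. The standard grammar-level fix, a textbook transformation sketched here on the slide's rules, moves the recursion to the right, with NP' a new helper non-terminal and ε the empty string:

    NP  -> NP PP                      NP  -> Det Nominal NP'
    NP  -> Det Nominal     becomes    NP' -> PP NP'
                                      NP' -> ε

The transformed grammar accepts the same strings, though the trees it assigns no longer match the original constituency.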

Ambiguity
- "One morning I shot an elephant in my pajamas. How he got into my pajamas I don't know." (Groucho Marx)

Lots of ambiguity
- VP -> VP PP
- NP -> NP PP
- "Show me the meal on flight 286 from SF to Denver"
- 14 parses!

Lots of ambiguity
- Church and Patil (1982): the number of parses for such sentences grows at the rate of the number of parenthesizations of arithmetic expressions, which grows with the Catalan numbers
- (table of PPs vs. number of parses omitted)
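The Catalan numbers themselves are easy to compute: C(n) = C(2n, n) / (n + 1). A quick sketch; note C(4) = 14, matching the 14 parses claimed for the meal-on-flight-286 sentence:

    from math import comb

    def catalan(n):
        return comb(2 * n, n) // (n + 1)

    print([catalan(n) for n in range(1, 9)])
    # [1, 2, 5, 14, 42, 132, 429, 1430]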

Avoiding Repeated Work
- Parsing is hard, and slow; it's wasteful to redo stuff over and over and over
- Consider an attempt to top-down parse the following as an NP: A flight from Indianapolis to Houston on TWA

(four figure slides omitted: the parser re-derives the same "flight" subtrees for this NP over and over)

Dynamic Programming
- We need a method that fills a table with partial results and that:
  - Does not do (avoidable) repeated work
  - Does not fall prey to left-recursion
  - Can find all the pieces of an exponential number of trees in polynomial time
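One standard answer (the readings cover it alongside Earley) is the CKY algorithm: an O(n^3) table-filling method for grammars in Chomsky Normal Form. A minimal recognizer sketch with an illustrative grammar:

    # CNF grammar: binary rules A -> B C, plus a word -> categories lexicon.
    # 'left' maps straight to VP (CNF allows A -> word; a unary VP -> Verb does not fit CNF).
    BINARY = [('S', 'NP', 'VP'), ('NP', 'Det', 'Noun'), ('VP', 'Verb', 'NP')]
    LEXICON = {'the': {'Det'}, 'a': {'Det'}, 'flight': {'Noun'},
               'plane': {'Noun'}, 'booked': {'Verb'}, 'left': {'VP'}}

    def cky(words):
        n = len(words)
        # table[i][j] = set of non-terminals that derive words[i:j]
        table = [[set() for _ in range(n + 1)] for _ in range(n + 1)]
        for i, w in enumerate(words):
            table[i][i + 1] = set(LEXICON.get(w, ()))
        for span in range(2, n + 1):
            for i in range(n - span + 1):
                j = i + span
                for k in range(i + 1, j):            # every split point
                    for lhs, b, c in BINARY:
                        if b in table[i][k] and c in table[k][j]:
                            table[i][j].add(lhs)
        return 'S' in table[0][n]

    print(cky(['the', 'flight', 'left']))                    # True
    print(cky(['the', 'plane', 'booked', 'a', 'flight']))    # True

Each cell is filled once from smaller cells, so no subtree is re-derived; left-recursion is irrelevant because the table is indexed by spans rather than by rule expansion; and the packed table represents exponentially many trees in polynomial space.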

Possible improvements
- In bigram POS tagging, we condition a tag only on the preceding tag
- Why not use more context (e.g., a trigram model)? It is more precise:
  - "is clearly marked" -> verb, past participle
  - "he clearly marked" -> verb, past tense
- Combine trigram, bigram, and unigram models; condition on words too
- But with an n-gram approach this is too costly (too many parameters to model)
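The usual way to "combine trigram, bigram, unigram models" is linear interpolation; a sketch follows (the lambda weights are illustrative assumptions; in practice they are tuned on held-out data, e.g. by deleted interpolation):

    # Interpolated tag-transition probability:
    # P(t3 | t1, t2) = l3*P_ML(t3 | t1, t2) + l2*P_ML(t3 | t2) + l1*P_ML(t3)
    L3, L2, L1 = 0.6, 0.3, 0.1                   # weights must sum to 1

    def p_tag(t3, t2, t1, tri, bi, uni):
        """tri/bi/uni map tag contexts to maximum-likelihood estimates."""
        return (L3 * tri.get((t1, t2, t3), 0.0)
                + L2 * bi.get((t2, t3), 0.0)
                + L1 * uni.get(t3, 0.0))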