1
CPSC 503 Computational Linguistics, Lecture 7. Giuseppe Carenini
2
Today 29/9: finish POS tagging; start syntax / parsing (Chp. 12!)
3
PoS Tagging. Dictionary: word_i -> set of tags from the tagset.
Input text: Brainpower, not physical plant, is now a firm's chief asset. …
Tagger output: Brainpower_NN ,_, not_RB physical_JJ plant_NN ,_, is_VBZ now_RB a_DT firm_NN 's_POS chief_JJ asset_NN ._. …
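For concreteness, here is a minimal sketch of the same input/output behaviour using NLTK's off-the-shelf tagger (an illustration, assuming nltk and its tagger model are installed; this is not the specific tagger under discussion):

```python
import nltk
# nltk.download("averaged_perceptron_tagger")  # one-time model download

tokens = "Brainpower , not physical plant , is now a firm 's chief asset .".split()
print(nltk.pos_tag(tokens))  # [(word, Penn Treebank tag), ...]
# e.g. [('Brainpower', 'NN'), (',', ','), ('not', 'RB'), ('physical', 'JJ'), ...]
```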
4
Tagger Types
- Rule-based ('95)
- Stochastic:
  - HMM tagger ('92 and later)
  - Transformation-based tagger (Brill) ('95 and later)
  - Maximum Entropy models ('97 and later)
5
Rule-Based Tagging (ENGTWOL, '95-'99)
1. A lexicon transducer returns, for each word, all possible morphological parses.
2. A set of ~3,000 constraints is applied to rule out inappropriate PoS tags.
Step 1, sample I/O for "Pavlov had shown that salivation…":
- Pavlov: N SG PROPER
- had: HAVE V PAST SVO; HAVE PCP2 SVO
- shown: SHOW PCP2 SVOO …
- that: ADV; PRON DEM SG; CS; …
Sample constraint (the adverbial "that" rule). Given input "that": if (+1 A/ADV/QUANT) and (+2 SENT-LIM) and (NOT -1 SVOC/A), then eliminate non-ADV tags; else eliminate ADV.
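A rough sketch of how such a constraint could be encoded (the tag-set representation and function signature below are hypothetical illustrations, not ENGTWOL's actual formalism):

```python
def adverbial_that_rule(candidates, left1, right1, right2):
    """Constrain candidate tags for an occurrence of "that".

    candidates:     set of candidate tags for "that"
    left1:          label of the word one position to the left
    right1, right2: labels one and two positions to the right
    """
    if right1 in {"A", "ADV", "QUANT"} and right2 == "SENT-LIM" and left1 != "SVOC/A":
        return {t for t in candidates if t == "ADV"}   # eliminate non-ADV tags
    return {t for t in candidates if t != "ADV"}       # eliminate ADV

# "it isn't that odd": "that" precedes an adjective at the sentence limit,
# so only the ADV reading survives.
print(adverbial_that_rule({"ADV", "PRON", "CS"}, "VBZ", "A", "SENT-LIM"))  # {'ADV'}
```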
6
HMM Stochastic Tagging
- Tags correspond to HMM states; words correspond to the HMM alphabet (output) symbols.
- Tagging: given a sequence of words (observations), find the most likely sequence of tags (states). But this is… decoding!
- We need state transition and symbol emission probabilities, obtained either 1) from a hand-tagged corpus, or 2) with no tagged corpus, by parameter estimation (Baum-Welch).
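To make the decoding step concrete, here is a toy Viterbi decoder (a minimal sketch: tags are the HMM states, words the output symbols; the probability tables would come from a hand-tagged corpus or from Baum-Welch, and the 1e-8 floor for unseen words is an added assumption):

```python
def viterbi(words, tags, start_p, trans_p, emit_p):
    # best[t]: probability of the best tag sequence for the words so far, ending in t
    best = {t: start_p[t] * emit_p[t].get(words[0], 1e-8) for t in tags}
    backptrs = []
    for w in words[1:]:
        new_best, ptr = {}, {}
        for t in tags:
            prev = max(tags, key=lambda s: best[s] * trans_p[s][t])
            new_best[t] = best[prev] * trans_p[prev][t] * emit_p[t].get(w, 1e-8)
            ptr[t] = prev
        best, backptrs = new_best, backptrs + [ptr]
    tag = max(tags, key=best.get)      # best final state
    path = [tag]
    for ptr in reversed(backptrs):     # follow the back-pointers
        tag = ptr[tag]
        path.append(tag)
    return path[::-1]
```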
7
Evaluating Taggers
- Accuracy: percent correct (most current taggers: 96-97%). Test on unseen data!
- Human ceiling: agreement rate of humans on the classification task (96-97%).
- Unigram baseline: assign each token the class it occurred in most frequently in the training set (e.g., race -> NN): 91%.
What is causing the errors? Build a confusion matrix…
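The unigram baseline is easy to state precisely; a sketch (assuming a pre-tokenized, hand-tagged training corpus; unknown-word handling is omitted):

```python
from collections import Counter, defaultdict

def train_unigram_baseline(tagged_tokens):      # [(word, tag), ...]
    counts = defaultdict(Counter)
    for word, tag in tagged_tokens:
        counts[word][tag] += 1
    # each word maps to the tag it occurred with most often in training
    return {w: c.most_common(1)[0][0] for w, c in counts.items()}

def accuracy(predicted, gold):
    return sum(p == g for p, g in zip(predicted, gold)) / len(gold)
```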
8
Confusion Matrix (from Speech and Language Processing, Jurafsky and Martin). Look at a confusion matrix. Precision? Recall?
9
Error Analysis (textbook)
- Look at a confusion matrix.
- See which errors are causing problems:
  - Noun (NN) vs. Proper Noun (NNP) vs. Adjective (JJ)
  - Preterite (VBD) vs. Participle (VBN) vs. Adjective (JJ)
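A confusion matrix is just a table of (gold tag, predicted tag) counts; the large off-diagonal cells are the error classes listed above. A minimal sketch with made-up tags:

```python
from collections import Counter

def confusion_matrix(gold, predicted):
    return Counter(zip(gold, predicted))   # off-diagonal cells are errors

cm = confusion_matrix(["NN", "VBD", "JJ", "NN"], ["NN", "VBN", "JJ", "JJ"])
for (g, p), n in cm.items():
    if g != p:
        print(f"gold {g} tagged as {p}: {n} time(s)")  # VBD->VBN, NN->JJ
```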
10
Today 29/9
- Finish POS tagging
- English syntax
- Context-free grammar for English: rules, trees, recursion, problems
- Start parsing
11
Knowledge-Formalisms Map (next three lectures)
- State machines (and probabilistic versions): finite state automata, finite state transducers, Markov models
- Rule systems (and probabilistic versions), e.g., (probabilistic) context-free grammars
- Logical formalisms (first-order logic); AI planners
Linguistic levels covered: morphology, syntax, semantics, pragmatics, discourse and dialogue.
12
Syntax. Def.: the study of how sentences are formed by grouping and ordering words.
Example: "Ming and Sue prefer morning flights" vs. *"Ming Sue flights morning and prefer".
Groups behave as single units with respect to substitution, movement, and coordination.
13
Syntactic notions so far…
- N-grams: the probability distribution for the next word can be effectively approximated knowing the previous n words.
- POS categories are based on:
  - distributional properties (what other words can occur nearby)
  - morphological properties (the affixes they take)
14
Syntax: useful tasks. Why should you care?
- Grammar checkers
- Basis for semantic interpretation: question answering, information extraction, summarization
- Machine translation
- …
15
Grammars and Constituency (from Speech and Language Processing, Jurafsky and Martin). Of course, there's nothing easy or obvious about how we come up with the right set of constituents and the rules that govern how they combine. That's why there are so many different theories of grammar and competing analyses of the same data. The approach to grammar, and the analyses, adopted here are very generic (and don't correspond to any modern linguistic theory of grammar).
16
Key Constituents, with heads (English)
- Noun phrase: (Det) N (PP)
- Verb phrase: (Qual) V (NP)
- Prepositional phrase: (Deg) P (NP)
- Adjective phrase: (Deg) A (PP)
- Sentence: (NP) (I) (VP)
Some simple specifiers:
- Determiner, specifier of N: the, a, this, no, …
- Qualifier, specifier of V: never, often, …
- Degree word, specifier of A or P: very, almost, …
General template: (Specifier) X (Complement). Complements?
17
Key Constituents: examples
- (Det) N (PP): the cat on the table
- (Qual) V (NP): never eat a cat
- (Deg) P (NP): almost in the net
- (Deg) A (PP): very happy about it
- (NP) (I) (VP): a mouse -- ate it
18
Context-Free Grammar (example)
S -> NP VP
NP -> Det NOMINAL
NOMINAL -> Noun
VP -> Verb
Det -> a
Noun -> flight
Verb -> left
Here a, flight, left are terminals; S, NP, VP, NOMINAL, Det, Noun, Verb are non-terminals; S is the start symbol.
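The same toy grammar can be written down and tested directly with NLTK's CFG class (a sketch, assuming nltk is installed):

```python
import nltk

grammar = nltk.CFG.fromstring("""
    S -> NP VP
    NP -> Det NOMINAL
    NOMINAL -> Noun
    VP -> Verb
    Det -> 'a'
    Noun -> 'flight'
    Verb -> 'left'
""")
for tree in nltk.ChartParser(grammar).parse("a flight left".split()):
    print(tree)
# (S (NP (Det a) (NOMINAL (Noun flight))) (VP (Verb left)))
```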
19
A more complex CFG example: a lexicon and a grammar with example phrases (shown as a figure on the slide).
20
CFGs define a formal language (grammatical vs. ungrammatical sentences). As a generative formalism they can:
- generate strings in the language
- reject strings not in the language
- impose structures (trees) on strings in the language
21
CFG: Formal Definitions
A CFG is a 4-tuple (non-terminals, terminals, productions, start symbol): G = (N, Σ, P, S).
P is a set of rules A -> β, where A ∈ N and β ∈ (Σ ∪ N)*.
A derivation is the process of rewriting α₁ into α_m (both strings in (Σ ∪ N)*) by applying a sequence of rules: α₁ ⇒* α_m.
The language defined by G: L(G) = { w | w ∈ Σ* and S ⇒* w }.
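As a worked example, a leftmost derivation of "a flight left" in the toy grammar from the earlier slide, showing S ⇒* w and hence w ∈ L(G):

```latex
S \Rightarrow NP\ VP
  \Rightarrow Det\ NOMINAL\ VP
  \Rightarrow a\ NOMINAL\ VP
  \Rightarrow a\ Noun\ VP
  \Rightarrow a\ flight\ VP
  \Rightarrow a\ flight\ Verb
  \Rightarrow a\ flight\ left
```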
22
Derivations as Trees. (Parse-tree figure: a derivation drawn as a tree, with a Nominal node dominating "flight".) Why "context free"?
23
CFG Parsing (Chpt. 13). Parsing with a CFG is completely analogous to running a finite-state transducer over a tape; it's just more powerful. (Diagram: a parser maps the input "I prefer a morning flight" to its parse tree.)
24
Other Options (the Chomsky hierarchy)
- Regular languages (FSA): rules of the form A -> x B or A -> x. Too weak (e.g., cannot deal with recursion in a general way; no center-embedding).
- CFGs: rules of the form A -> β (also produce more understandable and "useful" structure).
- Context-sensitive: rules of the form αAβ -> αγβ, with γ ≠ ε. Can be computationally intractable.
- Turing-equivalent: rules of the form α -> β, with α ≠ ε. Too powerful / computationally intractable.
25
Common Sentence Types
- Declaratives: A plane left. (S -> NP VP)
- Imperatives: Leave! (S -> VP)
- Yes-no questions: Did the plane leave? (S -> Aux NP VP)
- Wh questions: Which flights serve breakfast? (S -> WH NP VP); When did the plane leave? (S -> WH Aux NP VP)
26
NP: more details
NP -> Specifiers N Complements
NP -> (Predet) (Det) (Card) (Ord) (Quant) (AP) Nom, e.g., "all the other cheap cars"
Nom -> Nom PP (PP) (PP), e.g., "reservation on BA456 from NY to YVR"
Nom -> Nom GerundVP, e.g., "flight arriving on Monday"
Nom -> Nom RelClause; RelClause -> (who | that) VP, e.g., "flight that arrives in the evening"
27
Conjunctive Constructions
- S -> S and S: John went to NY and Mary followed him.
- NP -> NP and NP: John went to NY and Boston.
- VP -> VP and VP: John went to NY and visited MOMA.
- …
In fact, the right rule for English is X -> X and X.
28
Problems with CFGs: agreement and subcategorization.
29
Agreement. In English:
- determiners and nouns have to agree in number;
- subjects and verbs have to agree in person and number.
Many languages have agreement systems that are far more complex than this (e.g., gender).
30
Agreement examples.
Grammatical: this dog; those dogs; this dog eats; you have it; those dogs eat.
Ungrammatical: *this dogs; *those dog; *this dog eat; *you has it; *those dogs eats.
31
Possible CFG Solution (Sg = singular, Pl = plural)
OLD grammar:
S -> NP VP
NP -> Det Nom
VP -> V NP
…
NEW grammar:
SgS -> SgNP SgVP
PlS -> PlNP PlVP
SgNP -> SgDet SgNom
PlNP -> PlDet PlNom
SgVP -> SgV NP
PlVP -> PlV NP
…
32
CFG Solution for Agreement: it works and stays within the power of CFGs, but it doesn't scale all that well (explosion in the number of rules), as the sketch below illustrates.
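A hypothetical illustration of the blow-up (the slide splits only on number; crossing in person as well is an added assumption, to show that the growth is multiplicative):

```python
from itertools import product

numbers, persons = ["Sg", "Pl"], ["1", "2", "3"]
# every original rule mentioning S, NP, VP must be copied per feature combination
s_rules = [f"{n}{p}S -> {n}{p}NP {n}{p}VP" for n, p in product(numbers, persons)]
print(len(s_rules))   # 6 copies of the single rule S -> NP VP
print(s_rules[0])     # Sg1S -> Sg1NP Sg1VP
```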
33
Subcategorization. Def.: the constraints that a predicate (here, a verb) places on the number and type of its arguments.
Ungrammatical examples: *John sneezed the book; *I prefer United has a flight; *Give with a flight.
34
Subcategorization frames:
- sneeze: John sneezed
- find: Please find [a flight to NY]_NP
- give: Give [me]_NP [a cheaper fare]_NP
- help: Can you help [me]_NP [with a flight]_PP
- prefer: I prefer [to leave earlier]_TO-VP
- told: I was told [United has a flight]_S
- …
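One way to picture these frames is as a lexicon mapping each verb to its allowed argument lists; the representation below is a hypothetical sketch, not a standard resource:

```python
# each verb lists the argument frames it subcategorizes for
SUBCAT = {
    "sneeze": [[]],                            # intransitive
    "find":   [["NP"]],
    "give":   [["NP", "NP"], ["NP", "PP_to"]],
    "help":   [["NP"], ["NP", "PP_with"]],
    "prefer": [["NP"], ["TO-VP"]],
    "tell":   [["NP", "S"]],
}

def frame_ok(verb, args):
    return args in SUBCAT.get(verb, [])

print(frame_ok("sneeze", []))      # True:  "John sneezed"
print(frame_ok("sneeze", ["NP"]))  # False: *"John sneezed the book"
```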
35
So? The various rules for VPs overgenerate:
- they allow strings containing verbs and arguments that don't go together;
- for example, VP -> V NP licenses *"sneezed the book", and VP -> V S licenses strings like *"go she will go there".
36
Possible CFG Solution
OLD grammar:
VP -> V
VP -> V NP
VP -> V NP PP
…
NEW grammar:
VP -> IntransV
VP -> TransV NP
VP -> TransV_PPto NP PP_to
…
TransV_PPto -> hand, give, …
This solution has the same problem as the one for agreement.
37
CFG for NLP: summary. CFGs cover most syntactic structure in English, but there are problems (overgeneration) that can be dealt with adequately, although not elegantly, by staying within the CFG framework. There are simpler, more elegant solutions that take us out of the CFG framework: LFG, XTAG, … (Chpt. 15, "Features and Unification").
38
Dependency Grammars
- Syntactic structure: binary relations between words.
- Links: grammatical functions or very general semantic relations.
- Abstract away from word-order variations (simpler grammars).
- Useful features in many NLP applications (classification, summarization, NLG).
39
Today 29/9
- English syntax
- Context-free grammar for English: rules, trees, recursion, problems
- Start parsing
40
Parsing with CFGs: assign valid trees, i.e., trees that cover all and only the elements of the input and have an S at the top. (Diagram: CFG + sequence of words "I prefer a morning flight" -> parser -> valid parse trees.)
41
Parsing as Search. The CFG defines the search space of possible parse trees:
S -> NP VP
S -> Aux NP VP
NP -> Det Noun
VP -> Verb
Det -> a
Noun -> flight
Verb -> left, arrive
Aux -> do, does
Parsing: find all trees that cover all and only the words in the input.
42
Constraints on Search. (Same diagram: the CFG defines the search space; the parser maps a sequence of words to valid parse trees.) Search strategies: top-down (goal-directed) or bottom-up (data-directed).
43
Top-Down Parsing. Since we're trying to find trees rooted with an S (Sentence), start with the rules that give us an S; then work your way down from there to the words. (Figure: the first ply of the top-down search space for the input.)
44
Next step in the top-down space: when POS categories are reached, reject trees whose leaves fail to match all the words in the input.
45
Bottom-Up Parsing. Of course, we also want trees that cover the input words, so start with trees that link up with the words in the right way; then work your way up from there.
46
Two more steps in the bottom-up space. (Figure: successive plies of bottom-up structures built over the input words.)
47
Top-Down vs. Bottom-Up
- Top-down: only searches for trees that can be answers, but suggests trees that are not consistent with the words.
- Bottom-up: only forms trees consistent with the words, but suggests trees that make no sense globally.
(A sketch of both strategies follows below.)
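NLTK ships a parser for each strategy, which makes the contrast easy to play with (a sketch, assuming nltk is installed; RecursiveDescentParser is top-down, ShiftReduceParser is bottom-up):

```python
import nltk

grammar = nltk.CFG.fromstring("""
    S -> NP VP
    NP -> Det Noun
    VP -> Verb
    Det -> 'a'
    Noun -> 'flight'
    Verb -> 'left'
""")
tokens = "a flight left".split()
print(list(nltk.RecursiveDescentParser(grammar).parse(tokens)))  # goal-directed
print(list(nltk.ShiftReduceParser(grammar).parse(tokens)))       # data-directed
```

Note that a plain recursive-descent parser loops on left-recursive rules, which is one of the problems motivating the Earley algorithm later.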
48
So Combine Them
- Top-down: control strategy used to generate trees.
- Bottom-up: used to filter out inappropriate parses.
Top-down control strategy choices:
- depth-first vs. breadth-first search;
- which node to try to expand next (left-most);
- which grammar rule to use to expand a node (textual order).
49
Top-Down, Depth-First, Left-to-Right Search. Sample sentence: "Does this flight include a meal?"
50
Example: "Does this flight include a meal?" (search-tree figure)
51
Example, continued: "Does this flight include a meal?" (search-tree figure, further expansion)
52
Example, concluded: "Does this flight include a meal?" (search-tree figure, final expansion)
53
Adding Bottom-Up Filtering. The preceding expansion sequence was a waste of time, because an NP cannot generate a parse tree that starts with an Aux.
54
Bottom-Up Filtering: left-corner table
- S: Det, Proper-Noun, Aux, Verb
- NP: Det, Proper-Noun
- Nominal: Noun
- VP: Verb
(A sketch of how such a table can be computed follows below.)
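Such a table can be computed from the grammar as a transitive closure: X is a left corner of C if some rule C -> X … exists, or X is a left corner of a left corner of C. A sketch over a toy grammar (which includes S -> VP for imperatives, so Verb shows up among S's left corners; nonterminal left corners are kept alongside the preterminals):

```python
def left_corners(rules):            # rules: {lhs: [list of right-hand sides]}
    lc = {lhs: {rhs[0] for rhs in rhss} for lhs, rhss in rules.items()}
    changed = True
    while changed:                  # close under "left corner of a left corner"
        changed = False
        for lhs in lc:
            for x in list(lc[lhs]):
                new = lc.get(x, set()) - lc[lhs]
                if new:
                    lc[lhs] |= new
                    changed = True
    return lc

rules = {
    "S":       [["NP", "VP"], ["Aux", "NP", "VP"], ["VP"]],
    "NP":      [["Det", "Nominal"], ["Proper-Noun"]],
    "Nominal": [["Noun"]],
    "VP":      [["Verb"]],
}
print(left_corners(rules)["S"])
# {'NP', 'VP', 'Aux', 'Det', 'Proper-Noun', 'Verb'}
```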
55
Problems with top-down parsing plus bottom-up filtering: left recursion, ambiguity, and repeated parsing of subtrees. SOLUTION: the Earley algorithm (once again, dynamic programming!), sketched below.
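A sketch of Earley parsing via NLTK's chart parsers (assuming a recent NLTK that exports EarleyChartParser from nltk.parse); note the left-recursive rule NP -> NP PP, which would send a plain top-down parser into a loop but is handled by the chart:

```python
import nltk
from nltk.parse import EarleyChartParser

grammar = nltk.CFG.fromstring("""
    S -> NP VP
    NP -> NP PP | Det Noun
    VP -> Verb NP
    PP -> P NP
    Det -> 'a' | 'the'
    Noun -> 'flight' | 'morning' | 'meal'
    Verb -> 'includes'
    P -> 'in'
""")
tokens = "the flight in the morning includes a meal".split()
for tree in EarleyChartParser(grammar).parse(tokens):
    print(tree)
```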
56
For Next Time: read Chapter 13 (Parsing). Optional: read Chapter 16 (Features and Unification); skip the algorithms and implementation.