POS TAGGING AND HMM Text Mining Team (adapted from Heng Ji)

Outline POS Tagging and HMM

3/39 What is Part-of-Speech (POS)?
Generally speaking, word classes (= POS): Verb, Noun, Adjective, Adverb, Article, …
We can also include inflection:
- Verbs: tense, number, …
- Nouns: number, proper/common, …
- Adjectives: comparative, superlative, …

4/39 Parts of Speech
Eight (ish) traditional parts of speech: noun, verb, adjective, preposition, adverb, article, interjection, pronoun, conjunction, etc.
Also called parts of speech, lexical categories, word classes, morphological classes, lexical tags, …
There is lots of debate within linguistics about the number, nature, and universality of these; we'll completely ignore this debate.

5/39 7 Traditional POS Categories
N    noun         chair, bandwidth, pacing
V    verb         study, debate, munch
ADJ  adjective    purple, tall, ridiculous
ADV  adverb       unfortunately, slowly
P    preposition  of, by, to
PRO  pronoun      I, me, mine
DET  determiner   the, a, that, those

6/39 POS Tagging
The process of assigning a part-of-speech or lexical class marker to each word in a collection.
WORD/TAG: the/DET koala/N put/V the/DET keys/N on/P the/DET table/N

7/39 Penn TreeBank POS Tag Set
Penn Treebank: a hand-annotated corpus of Wall Street Journal text, 1M words, 45 tags.
Some particularities:
- to/TO is not disambiguated (infinitive marker vs. preposition)
- Auxiliaries and main verbs are not distinguished

8/39 Penn Treebank Tagset

9/39 Why is POS tagging useful?
- Speech synthesis: how do you pronounce "lead"? Compare the stress pairs INsult/inSULT, OBject/obJECT, OVERflow/overFLOW, DIScount/disCOUNT, CONtent/conTENT.
- Stemming for information retrieval: a search for "aardvarks" can also match "aardvark".
- Parsing, speech recognition, etc.: possessive pronouns (my, your, her) are followed by nouns, while personal pronouns (I, you, he) are likely to be followed by verbs; you need to know whether a word is an N or a V before you can parse.
- Information extraction: finding names, relations, etc.
- Machine translation.

10/39 Open and Closed Classes
Closed class: a small, fixed membership
- Prepositions: of, in, by, …
- Auxiliaries: may, can, will, had, been, …
- Pronouns: I, you, she, mine, his, them, …
Usually function words (short common words which play a role in grammar).
Open class: new ones can be created all the time.
English has four: nouns, verbs, adjectives, adverbs. Many languages have these four, but not all!

11/39 Open Class Words
Nouns
- Proper nouns (Boulder, Granby, Eli Manning); English capitalizes these.
- Common nouns (the rest).
- Count nouns vs. mass nouns. Count nouns have plurals and get counted: goat/goats, one goat, two goats. Mass nouns don't get counted (snow, salt, communism; *two snows).
Adverbs: tend to modify things
- Unfortunately, John walked home extremely slowly yesterday.
- Directional/locative adverbs (here, home, downhill)
- Degree adverbs (extremely, very, somewhat)
- Manner adverbs (slowly, slinkily, delicately)
Verbs
- In English, have morphological affixes (eat/eats/eaten)

12/39 Closed Class Words
Examples:
- prepositions: on, under, over, …
- particles: up, down, on, off, …
- determiners: a, an, the, …
- pronouns: she, who, I, …
- conjunctions: and, but, or, …
- auxiliary verbs: can, may, should, …
- numerals: one, two, three, third, …

13/39 Prepositions from CELEX

14/39 English Particles

15/39 Conjunctions

16/39 POS Tagging: Choosing a Tagset
There are many parts of speech and potential distinctions we can draw. To do POS tagging, we need to choose a standard set of tags to work with.
- We could pick a very coarse tagset: N, V, Adj, Adv.
- More commonly used is a finer-grained set, the 45-tag Penn TreeBank tagset: PRP$, WRB, WP$, VBG, …
- Even more fine-grained tagsets exist.

17/39 Using the Penn Tagset
The/DT grand/JJ jury/NN commented/VBD on/IN a/DT number/NN of/IN other/JJ topics/NNS ./.
- Prepositions and subordinating conjunctions are marked IN ("although/IN I/PRP …")
- Except the preposition/complementizer "to", which is just marked TO.

18/39 POS Tagging
Words often have more than one POS; consider "back":
- The back door = JJ
- On my back = NN
- Win the voters back = RB
- Promised to back the bill = VB
The POS tagging problem is to determine the POS tag for a particular instance of a word. (These examples are from Dekang Lin.)

19/39 How Hard is POS Tagging? Measuring Ambiguity

20/39 Current Performance
How many tags are correct? About 97% currently, but the baseline is already 90%.
Baseline algorithm: tag every word with its most frequent tag, and tag unknown words as nouns.
How well do people do?
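A minimal sketch of this baseline, assuming a toy hand-tagged corpus (the data and function names here are hypothetical; a real system would count tags over something like the Penn Treebank):

```python
from collections import Counter, defaultdict

def train_baseline(tagged_sentences):
    """Learn each word's most frequent tag from a hand-tagged corpus."""
    counts = defaultdict(Counter)
    for sentence in tagged_sentences:
        for word, tag in sentence:
            counts[word][tag] += 1
    return {word: tags.most_common(1)[0][0] for word, tags in counts.items()}

def baseline_tag(words, most_frequent_tag):
    """Tag each word with its most frequent tag; unknown words get NN."""
    return [(w, most_frequent_tag.get(w, "NN")) for w in words]

# Toy training data (hypothetical):
corpus = [
    [("the", "DT"), ("back", "NN"), ("door", "NN")],
    [("promised", "VBD"), ("to", "TO"), ("back", "VB")],
    [("on", "IN"), ("my", "PRP$"), ("back", "NN")],
]
model = train_baseline(corpus)
print(baseline_tag(["the", "back", "rule"], model))
# -> [('the', 'DT'), ('back', 'NN'), ('rule', 'NN')]   ('rule' is unknown)
```

Note that this per-word baseline ignores context entirely, which is exactly why "back" always gets NN here regardless of its actual role in the sentence.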

21/39 Quick Test: Agreement?
- the students went to class
- plays well with others
- fruit flies like a banana
Tagset: DT (the, this, that), NN (noun), VB (verb), P (preposition), ADV (adverb)

22/39 Quick Test
the students went to class: DT NN VB P NN
plays well with others: VB ADV P NN, or NN NN P DT
fruit flies like a banana: NN NN VB DT NN, or NN VB P DT NN, or NN NN P DT NN, or NN VB VB DT NN

23/39 How to do it? History
- Brown Corpus created (EN-US), 1 million words
- Brown Corpus tagged
- HMM tagging (CLAWS), 93%-95%
- Greene and Rubin, rule-based, ~70%
- LOB Corpus created (EN-UK), 1 million words
- DeRose/Church, efficient HMM, sparse data, 95%+
- British National Corpus (tagged by CLAWS)
- POS tagging separated from other NLP
- Transformation-based tagging (Eric Brill), rule-based, 95%+
- Tree-based statistics (Helmut Schmid), rule-based, 96%+
- Neural network taggers, 96%+
- Trigram tagger (Kempe), 96%+
- Combined methods, 98%+
- Penn Treebank Corpus (WSJ, 4.5M words)
- LOB Corpus tagged

24/39 Two Methods for POS Tagging
1. Rule-based tagging (e.g., ENGTWOL)
2. Stochastic: probabilistic sequence models
- HMM (Hidden Markov Model) tagging
- MEMMs (Maximum Entropy Markov Models)

25/39 Rule-Based Tagging
- Start with a dictionary.
- Assign all possible tags to words from the dictionary.
- Write rules by hand to selectively remove tags, leaving the correct tag for each word.

26/39 Rule-Based Taggers
Early POS taggers were all hand-coded. Most of these (Harris, 1962; Greene and Rubin, 1971), and the best of the recent ones, ENGTWOL (Voutilainen, 1995), are based on a two-stage architecture:
- Stage 1: look the word up in a lexicon to get a list of potential POS tags.
- Stage 2: apply rules which certify or disallow tag sequences.
Rules were originally handwritten; more recently, machine learning methods can be used.

27/39 Start With a Dictionary
she: PRP
promised: VBN, VBD
to: TO
back: VB, JJ, RB, NN
the: DT
bill: NN, VB
… and so on for the ~100,000 words of English with more than one tag.

28/39 Assign Every Possible Tag
She/PRP promised/{VBN, VBD} to/TO back/{VB, JJ, RB, NN} the/DT bill/{NN, VB}

29/39 Write Rules to Eliminate Tags
Example rule: eliminate VBN if VBD is an option when VBN|VBD follows "PRP":
She/PRP promised/VBD to/TO back/{VB, JJ, RB, NN} the/DT bill/{NN, VB}   (VBN eliminated)
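A minimal sketch of the two stages, using the dictionary and the single rule from the slides (the data representation and function names are assumptions):

```python
# Dictionary from the slides: every possible tag per word.
lexicon = {
    "she": ["PRP"],
    "promised": ["VBN", "VBD"],
    "to": ["TO"],
    "back": ["VB", "JJ", "RB", "NN"],
    "the": ["DT"],
    "bill": ["NN", "VB"],
}

def assign_all_tags(words):
    """Stage 1: look each word up and keep every possible tag."""
    return [list(lexicon[w.lower()]) for w in words]

def eliminate_vbn_after_prp(words, candidates):
    """Stage 2 rule from the slide: drop VBN when VBD is also an
    option and the previous word is unambiguously PRP."""
    for i in range(1, len(words)):
        tags = candidates[i]
        if "VBN" in tags and "VBD" in tags and candidates[i - 1] == ["PRP"]:
            tags.remove("VBN")
    return candidates

words = ["She", "promised", "to", "back", "the", "bill"]
cands = eliminate_vbn_after_prp(words, assign_all_tags(words))
print(list(zip(words, cands)))
# She:[PRP] promised:[VBD] to:[TO] back:[VB,JJ,RB,NN] the:[DT] bill:[NN,VB]
```

A real rule-based tagger such as ENGTWOL applies thousands of such constraints until each word keeps, ideally, one tag.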

30/39 POS Tagging
Training data (hand-tagged text):
The/DT involvement/NN of/IN ion/NN channels/NNS in/IN B/NN and/CC T/NN lymphocyte/NN activation/NN is/VBZ supported/VBN by/IN many/JJ reports/NNS of/IN changes/NNS in/IN ion/NN fluxes/NNS and/CC membrane/NN …
A machine learning algorithm is trained on such data and then applied to unseen text:
"We demonstrate that …" → We/PRP demonstrate/VBP that/IN …

31/39 Goal of POS Tagging
We want the best set of tags for a sequence of words (a sentence).
W = a sequence of words; T = a sequence of tags.
Our goal: maximize P(T | W), e.g. P((NN NN P DET ADJ NN) | (heat oil in a large pot)).

32/39 But, the Sparse Data Problem …
Rich models often require vast amounts of data. The naive approach: count up instances of the string "heat oil in a large pot" in the training corpus and pick the most common tag assignment to the string. But there are far too many possible combinations; most word sequences never appear in any training corpus.

33/39 POS Tagging as Sequence Classification
We are given a sentence (an "observation" or "sequence of observations"): Secretariat is expected to race tomorrow.
What is the best sequence of tags that corresponds to this sequence of observations?
Probabilistic view: consider all possible sequences of tags, and out of this universe of sequences, choose the tag sequence which is most probable given the observation sequence of n words w_1 … w_n.

34/39 Getting to HMMs
We want, out of all sequences of n tags t_1 … t_n, the single tag sequence such that P(t_1 … t_n | w_1 … w_n) is highest:

t̂_1 … t̂_n = argmax_{t_1 … t_n} P(t_1 … t_n | w_1 … w_n)

The hat (^) means "our estimate of the best one"; argmax_x f(x) means "the x such that f(x) is maximized".

35/39 Getting to HMMs
This equation is guaranteed to give us the best tag sequence, but how do we make it operational? How do we compute this value?
Intuition of Bayesian classification: use Bayes' rule to transform the equation into a set of other probabilities that are easier to compute.

36/39 Reminder: Apply Bayes' Theorem (1763)

P(T | W) = P(W | T) · P(T) / P(W)

Here P(T | W) is the posterior, P(W | T) the likelihood, P(T) the prior, and P(W) the marginal likelihood. Our goal: to maximize it! Since P(W) is the same for every candidate tag sequence, it is enough to maximize P(W | T) · P(T).
(Reverend Thomas Bayes, Presbyterian minister, 1702-1761.)

37/39 How to Count
P(W | T) and P(T) can be counted from a large hand-tagged corpus, then smoothed to get rid of the zeroes.
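A minimal counting sketch (the toy corpus and function names are hypothetical; add-k smoothing here stands in for "smooth them to get rid of the zeroes", though real taggers use more careful smoothing schemes):

```python
from collections import Counter

def train_hmm(tagged_sentences, k=1.0):
    """Count emission and tag-bigram frequencies from a hand-tagged
    corpus; add-k smoothing keeps unseen events from getting zero."""
    emit = Counter()        # (tag, word) counts, for P(word | tag)
    trans = Counter()       # (prev_tag, tag) counts, for P(tag | prev_tag)
    tag_count = Counter()   # occurrences of each tag
    vocab, tagset = set(), set()
    for sent in tagged_sentences:
        prev = "<s>"                        # start-of-sentence pseudo-tag
        for word, tag in sent:
            emit[(tag, word)] += 1
            trans[(prev, tag)] += 1
            tag_count[tag] += 1
            vocab.add(word)
            tagset.add(tag)
            prev = tag
        trans[(prev, "</s>")] += 1          # end-of-sentence pseudo-tag
    tag_count["<s>"] = len(tagged_sentences)

    def p_word_given_tag(word, tag):
        return (emit[(tag, word)] + k) / (tag_count[tag] + k * len(vocab))

    def p_tag_given_prev(tag, prev):
        return (trans[(prev, tag)] + k) / (tag_count[prev] + k * (len(tagset) + 1))

    return p_word_given_tag, p_tag_given_prev

# Toy hand-tagged corpus (hypothetical):
p_emit, p_trans = train_hmm([[("fish", "noun"), ("sleep", "verb")]])
print(p_emit("fish", "noun"), p_trans("noun", "<s>"))
```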

38/39 Count P(W | T)
Assume each word in the sequence depends only on its corresponding tag:

P(W | T) ≈ ∏_{i=1..n} P(w_i | t_i)

39/39 Count P(T)
Make a Markov assumption and use N-grams over tags: P(T) is a product of the probabilities of the N-grams that make it up. With bigrams, each tag depends only on the previous tag (its "history"):

P(T) ≈ ∏_{i=1..n} P(t_i | t_{i-1}), where t_0 is a start-of-sentence pseudo-tag.
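Combining the last few slides gives the standard HMM tagging objective (bigram case, following the slides' Bayes-rule setup):

```latex
\hat{T} = \operatorname*{argmax}_{T} P(T \mid W)
        = \operatorname*{argmax}_{T} \frac{P(W \mid T)\, P(T)}{P(W)}
        \approx \operatorname*{argmax}_{t_1 \ldots t_n} \prod_{i=1}^{n} P(w_i \mid t_i)\, P(t_i \mid t_{i-1})
```

The two factors inside the product are exactly the emission and transition probabilities of the HMM on the next slide.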

40/39 Part-of-Speech Tagging with Hidden Markov Models
(The slide shows an HMM diagram: the hidden states are tags, linked by transition probabilities P(t_i | t_{i-1}); each tag emits words with output probabilities P(w_i | t_i).)

41/39 Analyzing "Fish sleep."

42/39 A Simple POS HMM
States: start, noun, verb, end (shown as a transition diagram; the transition probabilities are not in the transcript).

43/39 Word Emission Probabilities P(word | state)
A two-word language: "fish" and "sleep".
Suppose in our training corpus, "fish" appears 8 times as a noun and 5 times as a verb, and "sleep" appears twice as a noun and 5 times as a verb.
Emission probabilities (relative frequencies):
Noun: P(fish | noun) = 8/10 = 0.8, P(sleep | noun) = 2/10 = 0.2
Verb: P(fish | verb) = 5/10 = 0.5, P(sleep | verb) = 5/10 = 0.5

44/39-56/39 Viterbi Walk-Through
(These slides step through the Viterbi trellis for "fish sleep" graphically; the numeric scores are not in the transcript.)
- Token 1 (fish): score the noun and verb states from start.
- Token 2 (sleep): score each state via both predecessors ("fish" as verb and "fish" as noun); take the maximum and set back pointers.
- Token 3 (end): take the maximum and set back pointers.
- Decode by following the back pointers: fish = noun, sleep = verb.
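A minimal sketch of the walk-through. The emission probabilities are from the slides; the transition probabilities appeared only in the lost diagrams, so the values below are assumptions chosen to be plausible, and they reproduce the decode fish = noun, sleep = verb:

```python
# Emission probabilities from the slides.
EMIT = {
    "noun": {"fish": 0.8, "sleep": 0.2},
    "verb": {"fish": 0.5, "sleep": 0.5},
}
# ASSUMED transition probabilities: the originals were only in the
# slide diagrams and did not survive the transcript.
TRANS = {
    "start": {"noun": 0.8, "verb": 0.2},
    "noun":  {"noun": 0.1, "verb": 0.8, "end": 0.1},
    "verb":  {"noun": 0.2, "verb": 0.1, "end": 0.7},
}

def viterbi(tokens):
    """Return the most probable tag sequence for `tokens`."""
    states = ["noun", "verb"]
    # score[s]: probability of the best path ending in state s.
    score = {s: TRANS["start"][s] * EMIT[s][tokens[0]] for s in states}
    backpointers = []
    for tok in tokens[1:]:
        new_score, pointers = {}, {}
        for s in states:
            # Best predecessor for state s at this token.
            prev = max(states, key=lambda p: score[p] * TRANS[p][s])
            new_score[s] = score[prev] * TRANS[prev][s] * EMIT[s][tok]
            pointers[s] = prev
        score = new_score
        backpointers.append(pointers)
    # Transition into the end state, then follow the back pointers.
    last = max(states, key=lambda s: score[s] * TRANS[s]["end"])
    tags = [last]
    for pointers in reversed(backpointers):
        tags.append(pointers[tags[-1]])
    return list(reversed(tags))

print(viterbi(["fish", "sleep"]))  # -> ['noun', 'verb']
```

With these numbers, the best path scores 0.8·0.8 · 0.8·0.5 · 0.7 ≈ 0.179 for noun-verb, beating every alternative, which matches the decode on the final slide.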

57/39 Markov Chain for a Simple Name Tagger
(The slide shows a Markov chain with states START, PER, LOC, X, and END, annotated with transition and emission probabilities; the surviving fragments include George: 0.3, W.: 0.3, discussed: 0.7, $: 1.0, Bush: 0.3, Iraq: 0.1, George: 0.2, Iraq: 0.8, but which state each emission belongs to is not recoverable from the transcript.)

58/39 Exercise
Tag the names in the following sentence: "George W. Bush discussed Iraq."

59/39 POS Taggers
- Brill's tagger
- TnT tagger
- Stanford tagger
- SVMTool
- GENIA tagger
More complete list at:
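For a quick usage illustration, a minimal sketch with NLTK's default tagger (NLTK is not one of the taggers listed above, and the download resource names may differ across NLTK versions):

```python
import nltk

# One-time downloads of the tokenizer and tagger models.
nltk.download("punkt")
nltk.download("averaged_perceptron_tagger")

tokens = nltk.word_tokenize("Secretariat is expected to race tomorrow")
print(nltk.pos_tag(tokens))
# e.g. [('Secretariat', 'NNP'), ('is', 'VBZ'), ('expected', 'VBN'),
#       ('to', 'TO'), ('race', 'VB'), ('tomorrow', 'NN')]
```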