Learning Bit by Bit: Hidden Markov Models
Weighted FSA: [diagram of a weighted finite-state automaton over the words "The", "weather", "is", "outside", with transition weights 1.0, .7, .3]
Markov Chain: computing the probability of an observed sequence of events
Markov Chain: [diagram over the words "The", "weather", "is", "outside", "wind", with transition probabilities .7, .3, .5, .1, .9]. Observation = "The weather outside"
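A minimal sketch of this computation in Python: multiply the start probability by each transition probability along the observed path. The exact transition table is an assumption reconstructed from the diagram's surviving numbers.

# Sketch: probability of an observed word sequence under a Markov chain.
# The transition probabilities below are assumed from the slide's diagram
# and are illustrative only.
transitions = {
    ("The", "weather"): 0.7,
    ("The", "wind"): 0.3,
    ("weather", "is"): 0.5,
    ("is", "outside"): 0.9,
}
start = {"The": 1.0}

def sequence_probability(words):
    """P(w1..wn) = P(w1) * product over i of P(w_i | w_{i-1})."""
    p = start.get(words[0], 0.0)
    for prev, cur in zip(words, words[1:]):
        p *= transitions.get((prev, cur), 0.0)
    return p

print(sequence_probability(["The", "weather", "is", "outside"]))  # 1.0 * 0.7 * 0.5 * 0.9 = 0.315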
Parts of Speech: grammatical categories such as noun and verb
POS examples:
– N (noun): chair, bandwidth, pacing
– V (verb): study, debate, munch
– ADJ (adjective): purple, tall, ridiculous
– ADV (adverb): unfortunately, slowly
– P (preposition): of, by, to
– PRO (pronoun): I, me, mine
– DET (determiner): the, a, that, those
Parts of Speech, uses:
– Speech recognition
– Speech synthesis
– Data mining
– Translation
POS Tagging: words often have more than one POS, e.g. "back":
– The back door = JJ
– On my back = NN
– Win the voters back = RB
– Promised to back the bill = VB
The POS tagging problem is to determine the POS tag for a particular instance of a word, as in the sketch below.
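To make the ambiguity concrete, here is a minimal sketch; the tag dictionary is hypothetical and covers only the slide's "back" example.

# Sketch: one surface form, several possible POS tags depending on context.
# The tag set below is taken from the slide's examples and is illustrative only.
possible_tags = {
    "back": ["JJ", "NN", "RB", "VB"],  # adjective, noun, adverb/particle, verb
}

# The tagging problem: pick ONE tag for this particular instance.
sentence = ["Promised", "to", "back", "the", "bill"]
word = sentence[2]
print(word, "could be any of", possible_tags[word])  # a tagger must choose VB here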
POS Tagging: a sentence is a sequence of observations, e.g. "Secretariat is expected to race tomorrow"
Disambiguating “race”
Hidden Markov Model: [diagram contrasting the observed words with the hidden tags]
Hidden Markov Model: 2 kinds of probabilities:
– Tag transitions
– Word likelihoods
Hidden Markov Model: tag transition probability = P(tag | previous tag), e.g. P(VB | TO)
Hidden Markov Model: word likelihood probability = P(word | tag), e.g. P("race" | VB)
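One standard way to obtain both kinds of probabilities is relative-frequency counting over a tagged corpus. A minimal sketch, with a two-sentence corpus invented purely for illustration:

from collections import Counter

# Tiny hypothetical tagged corpus: (word, tag) pairs per sentence.
corpus = [
    [("to", "TO"), ("race", "VB"), ("tomorrow", "NR")],
    [("the", "DT"), ("race", "NN"), ("is", "VBZ"), ("tomorrow", "NR")],
]

tag_bigrams, tag_counts, word_tag = Counter(), Counter(), Counter()
for sent in corpus:
    tags = [t for _, t in sent]
    tag_bigrams.update(zip(tags, tags[1:]))  # count (previous tag, tag) pairs
    tag_counts.update(tags)                  # count individual tags
    word_tag.update(sent)                    # count (word, tag) pairs

def p_transition(tag, prev):
    """P(tag | previous tag) = C(prev, tag) / C(prev)."""
    return tag_bigrams[(prev, tag)] / tag_counts[prev]

def p_likelihood(word, tag):
    """P(word | tag) = C(word tagged as tag) / C(tag)."""
    return word_tag[(word, tag)] / tag_counts[tag]

print(p_transition("VB", "TO"))    # 1.0 in this toy corpus
print(p_likelihood("race", "VB"))  # 1.0 in this toy corpus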
Actual probabilities:
– P(NN | TO) = .00047
– P(VB | TO) = .83
Actual probabilities:
– P(NR | VB) = .0027
– P(NR | NN) = .0012
Actual probabilities:
– P(race | NN) = .00057
– P(race | VB) = .00012
Hidden Markov Model: probability of "to race tomorrow" as "TO VB NR":
P(VB|TO) * P(NR|VB) * P(race|VB) = .83 * .0027 * .00012 = 0.00000026892
Hidden Markov Model: probability of "to race tomorrow" as "TO NN NR":
P(NN|TO) * P(NR|NN) * P(race|NN) = .00047 * .0012 * .00057 = 0.00000000032148
Hidden Markov Model: comparing the two taggings:
Probability "to race tomorrow" = "TO NN NR" = 0.00000000032148
Probability "to race tomorrow" = "TO VB NR" = 0.00000026892
The VB sequence is roughly 800 times more likely, so "race" is tagged as a verb.
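Putting the slides' numbers together, a minimal sketch of the comparison; only the two candidate tag sequences scored on the slides are considered:

# Transition and word-likelihood probabilities as given on the slides.
p_trans = {("TO", "VB"): 0.83, ("TO", "NN"): 0.00047,
           ("VB", "NR"): 0.0027, ("NN", "NR"): 0.0012}
p_word = {("race", "VB"): 0.00012, ("race", "NN"): 0.00057}

def score(middle_tag):
    """Probability of 'to race tomorrow' tagged as TO <middle_tag> NR."""
    return (p_trans[("TO", middle_tag)]
            * p_trans[(middle_tag, "NR")]
            * p_word[("race", middle_tag)])

for tag in ("VB", "NN"):
    print(tag, score(tag))
# VB 2.6892e-07  vs  NN 3.2148e-10
print(max(("VB", "NN"), key=score))  # VB -> "race" is tagged as a verb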
Bayesian Inference: correct answer = argmax over hypotheses of P(hypothesis | observed)
Bayesian Inference: prior probability = probability of the hypothesis before seeing the evidence
Bayesian Inference: likelihood = probability of the observed evidence given the hypothesis
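A minimal numeric sketch of the inference rule: posterior is proportional to prior times likelihood. The hypotheses, priors, and likelihoods below are invented for illustration.

# Sketch: Bayesian inference over two hypotheses given one piece of evidence.
priors = {"rain": 0.3, "sun": 0.7}       # P(hypothesis), before evidence
likelihoods = {"rain": 0.9, "sun": 0.2}  # P(clouds | hypothesis)

# Unnormalized posteriors: P(h | e) is proportional to P(h) * P(e | h).
unnorm = {h: priors[h] * likelihoods[h] for h in priors}
total = sum(unnorm.values())
posterior = {h: p / total for h, p in unnorm.items()}

print(posterior)                          # {'rain': ~0.659, 'sun': ~0.341}
print(max(posterior, key=posterior.get))  # the "correct answer" = argmax -> rain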
Bayesian Inference: Bayesian vs. frequentist interpretations; Bayesian priors introduce subjectivity
Examples