Markov Models
Markov Models
Markov models can be seen as special cases of finite state automata; specifically, weighted finite state automata. Very simply, one can imagine a Markov model as an FSA decorated with probabilities. (There's more to it than that, but this is sufficient for now.)
Markov Models
[Diagram: a Markov model over part-of-speech tags (TO, DT, NN, VB, NNP, IN) drawn as a weighted finite state automaton, with start and end states and transition probabilities labeling the arcs.]
Markov Models A Markov Model consists of the following:
A finite set of states S = {1, 2, …, n}. A set of transition probabilities between states P = {p11, p12, …, pnn}. For each transition probability pij ∈ P, pij ≥ 0. And the following property holds for every state i: Σj∈S pij = 1.
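The definition above can be sketched directly in code. This is a minimal illustration, not from the slides: the states and probabilities are invented, and the assertions check the two constraints (every pij is non-negative, and each row sums to 1).

```python
# Hypothetical states and transition probabilities for illustration.
states = ["DT", "NN", "VB"]

# P arranged by row: P[i][j] is the probability of moving from
# state i to state j.
P = [
    [0.0, 0.9, 0.1],   # from DT
    [0.2, 0.3, 0.5],   # from NN
    [0.6, 0.3, 0.1],   # from VB
]

for row in P:
    assert all(p >= 0 for p in row)       # each p_ij >= 0
    assert abs(sum(row) - 1.0) < 1e-9     # row sums to 1
```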
Markov Models
The set of transition probabilities can be arranged in a matrix:

    A = | p11 p12 … p1n |
        | p21 p22 … p2n |
        | …             |

Sang also describes a matrix Π, which is the set of probabilities for each starting state: Π = [π1 π2 … πn].
Markov Model Formally, then, a Markov Model is a 3-tuple (S, Π, A).
For NLP applications, how are the probabilities derived? They can be hand-coded, but they are usually derived from data, often a corpus.
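Deriving the probabilities from data usually means counting: each pij is estimated as the number of times state i was followed by state j, divided by the number of times state i was followed by anything. A small sketch, with an invented toy corpus of tag sequences standing in for real annotated data:

```python
from collections import Counter, defaultdict

# Invented tag sequences for illustration; a real corpus would
# come from annotated text.
tag_sequences = [
    ["DT", "NN", "VB"],
    ["DT", "NN", "VB", "DT", "NN"],
]

# Count how often each state is followed by each other state.
counts = defaultdict(Counter)
for seq in tag_sequences:
    for prev, curr in zip(seq, seq[1:]):
        counts[prev][curr] += 1

# p_ij = count(i -> j) / count(i -> anything)
probs = {
    prev: {curr: c / sum(nxt.values()) for curr, c in nxt.items()}
    for prev, nxt in counts.items()
}
print(probs["DT"]["NN"])  # -> 1.0 (every DT here is followed by NN)
```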
Markov Models and NLP Exercise
Build the transition probability matrix and the starting probability matrix over this set of data:
The duck died.
The car killed the duck.
The duck died under her car.
We duck under the car.
We retrieve the poor duck.
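One way to set up the exercise is to treat words as states, count word-to-word transitions and sentence-initial words over the five sentences, and normalize. A sketch, assuming lowercasing and treating the final period as an end token:

```python
from collections import Counter, defaultdict

# The five sentences from the slide, lowercased, with "." as a token.
sentences = [
    "the duck died .",
    "the car killed the duck .",
    "the duck died under her car .",
    "we duck under the car .",
    "we retrieve the poor duck .",
]

starts = Counter()
trans = defaultdict(Counter)
for s in sentences:
    words = s.split()
    starts[words[0]] += 1
    for prev, curr in zip(words, words[1:]):
        trans[prev][curr] += 1

# Starting probabilities (the matrix Pi).
pi = {w: c / len(sentences) for w, c in starts.items()}
print(pi)  # -> {'the': 0.6, 'we': 0.4}
```

The transition matrix follows by normalizing each row of `trans` by its total count, exactly as for `pi`.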
Markov Models and NLP Exercise
What’s the probability for some given output?
The duck died under her car.
We duck under the car.
The duck under the car.
We retrieve killed the duck.
We the poor duck died.
We retrieve the poor duck under the car.
For a given start state (The, We), what’s the most likely string to be output?
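A sketch of the first question: the probability of an output string is the starting probability of its first word times the product of the transition probabilities along it. The counts below come from the five training sentences on the earlier slide (lowercased, with "." as a token):

```python
from collections import Counter, defaultdict

sentences = [
    "the duck died .",
    "the car killed the duck .",
    "the duck died under her car .",
    "we duck under the car .",
    "we retrieve the poor duck .",
]

starts = Counter()
trans = defaultdict(Counter)
for s in sentences:
    words = s.split()
    starts[words[0]] += 1
    for prev, curr in zip(words, words[1:]):
        trans[prev][curr] += 1

def sentence_prob(words):
    # P(output) = Pi(w1) * product of P(w_{k+1} | w_k)
    p = starts[words[0]] / len(sentences)
    for prev, curr in zip(words, words[1:]):
        total = sum(trans[prev].values())
        p *= trans[prev][curr] / total if total else 0.0
    return p

print(round(sentence_prob("the duck died under her car .".split()), 6))  # -> 0.02
```

Strings that contain a transition never seen in the training data (e.g. "retrieve killed") come out with probability 0 under this maximum-likelihood estimate.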
Markov Models The homework