1
HIDDEN MARKOV CHAINS
Prof. Alagar Rangan, Dept. of Industrial Engineering, Eastern Mediterranean University, North Cyprus
Source: Probability Models, Sheldon M. Ross
2
MARKOV CHAINS
Toss a coin repeatedly. Denote Head = 1, Tail = 0.
Let $Y_n$ = outcome of the $n$th toss, with $P(Y_n = 1) = p$ and $P(Y_n = 0) = q = 1 - p$.
$Y_1, Y_2, \ldots$ are i.i.d. random variables.
3
$S_n = Y_1 + Y_2 + \cdots + Y_n$
$S_n$ is the accumulated number of Heads in the first $n$ trials.
$\{S_n\}$ is a Markov chain, with time $n = 0, 1, 2, \ldots$ and states $j = 0, 1, 2, \ldots$
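As a quick illustration of the chain $S_n$, the sketch below simulates repeated tosses; the value of $p$ and the number of tosses are arbitrary choices, not taken from the slides.

    import random

    def simulate_heads_chain(p, n_tosses, seed=0):
        """Simulate S_n, the accumulated number of Heads after each toss."""
        rng = random.Random(seed)
        s, path = 0, [0]                       # S_0 = 0
        for _ in range(n_tosses):
            s += 1 if rng.random() < p else 0  # S_{n+1} = S_n + Y_{n+1}
            path.append(s)
        return path

    print(simulate_heads_chain(p=0.5, n_tosses=10))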
4
$X_n = S_n$ is a Markov chain with one-step transition probability matrix over states $0, 1, 2, \ldots$: from state $j$ the chain moves to $j + 1$ with probability $p$ (a Head) and stays at $j$ with probability $q$ (a Tail), so $P_{j,j+1} = p$, $P_{j,j} = q$, and all other entries are 0.
5
$n$-step transition probabilities $P^{(n)}_{ij} = P(X_{m+n} = j \mid X_m = i)$, with corresponding matrix $P^{(n)}$.
Simple results:
(a) Chapman–Kolmogorov equations: $P^{(n+m)} = P^{(n)} P^{(m)}$, so that $P^{(n)} = P^n$, the $n$th power of the one-step matrix.
(b) Expected sojourn time in a state: the number of periods spent in state $i$ before leaving is geometric, with mean $1/(1 - P_{ii})$.
(c) Steady-state probabilities: $\pi_j = \lim_{n \to \infty} P^{(n)}_{ij}$, obtained from $\pi = \pi P$, $\sum_j \pi_j = 1$.
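A short numerical sketch of how these quantities can be computed; the two-state transition matrix used here is an arbitrary illustration, not from the slides.

    import numpy as np

    P = np.array([[0.7, 0.3],
                  [0.4, 0.6]])          # example one-step transition matrix (assumed)

    # (a) n-step transition probabilities: P^(n) = P^n
    P5 = np.linalg.matrix_power(P, 5)

    # (b) expected sojourn time in state i: 1 / (1 - P_ii)
    sojourn = 1.0 / (1.0 - np.diag(P))

    # (c) steady-state probabilities: solve pi = pi P with sum(pi) = 1
    A = np.vstack([P.T - np.eye(2), np.ones(2)])
    b = np.array([0.0, 0.0, 1.0])
    pi, *_ = np.linalg.lstsq(A, b, rcond=None)

    print(P5, sojourn, pi)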
6
In general, $X_n \sim$ Markov chain on states $0, 1, 2, \ldots$ with one-step transition probability matrix $P = (P_{ij})$, where $P_{ij} = P(X_{n+1} = j \mid X_n = i)$.
7
Examples:
Weather forecasting. States: Dry day, Wet day; $X_n$ = state of the $n$th day.
Communication system. States: signals 0, 1; $X_n$ = signal leaving the $n$th stage of the system.
Moods of a Professor. States: cheerful (C), ok (O), unhappy (U); $X_n$ = mood of the Professor on the $n$th day.
[Each example has its own one-step transition probability matrix, with rows and columns labeled by its states.]
8
Hidden Markov Chain Models
Let $X_n$ be a Markov chain with one-step transition probability matrix $P = (P_{ij})$. Let $\mathcal{S}$ be a set of signals. A signal from $\mathcal{S}$ is emitted each time the Markov chain enters a state. If the Markov chain enters state $j$, then signal $s$ is emitted with probability $p(s \mid j)$, with $\sum_{s \in \mathcal{S}} p(s \mid j) = 1$ for every state $j$.
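A minimal sketch of a container for the quantities just defined; the class name and field names are illustrative choices, not from the slides.

    from dataclasses import dataclass
    import numpy as np

    @dataclass
    class HiddenMarkovChain:
        """Parameters of a hidden Markov chain (names are illustrative)."""
        P: np.ndarray      # one-step transition matrix, P[i, j] = P_ij
        emit: np.ndarray   # emission probabilities, emit[j, s] = p(s | j)
        init: np.ndarray   # initial state probabilities, init[i] = P(X_1 = i)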
9
The above model, in which the sequence of signals $S_1, S_2, \ldots$ is observed while the sequence of underlying Markov chain states $X_1, X_2, \ldots$ is unobserved, is called a hidden Markov chain model.
[Diagram: signals emitted over time, with the underlying state of the chain hidden from the observer.]
10
Examples: Production Process
States: Good state (1), Poor state (2). Signals: acceptable (a) or unacceptable (u) quality of the item produced.
Good state (1): item acceptable with probability .99, unacceptable with probability .01.
Poor state (2): item acceptable with probability .04, unacceptable with probability .96.
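The emission probabilities below are taken from the table above as I read it; the transition matrix and initial distribution were not legible in the extracted slide, so the values used here are assumed placeholders for illustration only.

    import numpy as np

    states = ["good", "poor"]
    signals = ["a", "u"]                    # a = acceptable, u = unacceptable

    # Emission probabilities p(s | j) from the slide's table
    emit = np.array([[0.99, 0.01],          # good state
                     [0.04, 0.96]])         # poor state

    # Transition matrix and initial distribution: assumed values, not from the slide
    P = np.array([[0.9, 0.1],
                  [0.2, 0.8]])
    init = np.array([0.8, 0.2])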
11
Moods of the Professor. Hidden states: C, O, U (mood of the Professor); signals: grades high, grades average.
Condition of a patient subject to therapy. Hidden states: patient improving, patient deteriorating; signals: red cell count high, red cell count low.
Signal processing. Hidden states: signal sent, 0 or 1; observed signals: received as 0, received as 1.
12
Let $S^n = (S_1, \ldots, S_n)$ be the random vector of the first $n$ signals. For a fixed sequence of signals $s^n = (s_1, \ldots, s_n)$, let
$F_n(j) = P(S^n = s^n, \, X_n = j).$
It can be shown that
$F_n(j) = p(s_n \mid j) \sum_i F_{n-1}(i) \, P_{ij},$
starting with
$F_1(i) = P(X_1 = i) \, p(s_1 \mid i).$
13
We can recursively determine $F_n(j)$ using $F_{n-1}(\cdot)$, which will determine
$P(S^n = s^n) = \sum_j F_n(j).$
Note: We can also compute the above using a backward recursion. Letting
$B_k(i) = P(S_{k+1} = s_{k+1}, \ldots, S_n = s_n \mid X_k = i),$
we have
$B_k(i) = \sum_j P_{ij} \, p(s_{k+1} \mid j) \, B_{k+1}(j),$
with the convention $B_n(i) = 1$.
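A sketch of both recursions as just described. It assumes parameters laid out as in the earlier sketches (NumPy arrays P, emit, init) and signals encoded as integer indices; these conventions are mine, not from the slides.

    import numpy as np

    def forward(P, emit, init, signals):
        """F[k, j] = P(S_1..S_{k+1} = observed, X_{k+1} = j), forward recursion."""
        n, m = len(signals), P.shape[0]
        F = np.zeros((n, m))
        F[0] = init * emit[:, signals[0]]                  # F_1(i) = P(X_1=i) p(s_1|i)
        for k in range(1, n):
            F[k] = emit[:, signals[k]] * (F[k - 1] @ P)    # F_k(j) = p(s_k|j) sum_i F_{k-1}(i) P_ij
        return F

    def backward(P, emit, signals):
        """B[k, i] = P(S_{k+2}..S_n = observed | X_{k+1} = i), backward recursion."""
        n, m = len(signals), P.shape[0]
        B = np.ones((n, m))                                # B_n(i) = 1
        for k in range(n - 2, -1, -1):
            B[k] = P @ (emit[:, signals[k + 1]] * B[k + 1])
        return B

    # P(S^n = s^n) is then forward(P, emit, init, signals)[-1].sum()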
14
Example: Consider the production process, with its transition probabilities and signal probabilities as given. Let the first 3 items produced be a, u, a.
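The numeric parameter values on this slide are not recoverable from the extraction, so the sketch below simply runs the forward recursion from the earlier sketch on the observed sequence a, u, a, using the assumed P, emit, and init defined above.

    obs = [0, 1, 0]                          # a, u, a encoded as signal indices

    F = forward(P, emit, init, obs)          # assumed parameters from the earlier sketch
    print("F_3(j) for each state j:", F[-1])
    print("P(first three items are a, u, a) =", F[-1].sum())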
15
Similarly, the same probability can be calculated using the backward quantities, since $P(S^n = s^n) = \sum_i F_k(i) \, B_k(i)$ for any $k$.
Predicting the states: Suppose the first $n$ observed signals are $s_1, \ldots, s_n$. We wish to predict the first $n$ states of the Markov chain using this data.
16
Case 1: We wish to maximize the expected number of states that are correctly predicted. For each $k = 1, 2, \ldots, n$, we calculate
$P(X_k = j \mid S^n = s^n) = \dfrac{F_k(j) \, B_k(j)}{\sum_i F_k(i) \, B_k(i)}$
and choose the $j$ which maximizes the above as the predictor of $X_k$.
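A sketch of this prediction rule, building on the forward and backward sketches above; the function name is mine.

    def predict_states_marginal(P, emit, init, signals):
        """For each k, pick the state j maximizing P(X_k = j | S^n = s^n)."""
        F = forward(P, emit, init, signals)
        B = backward(P, emit, signals)
        posterior = F * B                                  # F_k(j) B_k(j) = P(S^n = s^n, X_k = j)
        posterior /= posterior.sum(axis=1, keepdims=True)  # divide by P(S^n = s^n)
        return posterior.argmax(axis=1), posterior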
17
Case 2: A different problem arises if we regard the sequence of states as a single entity. For instance, in signal processing, while $X_1, \ldots, X_n$ may be the actual message sent, $s_1, \ldots, s_n$ would be what is received. Thus the objective is to predict the actual message in its entirety.
18
Given the observed signals $s_1, \ldots, s_n$, our problem is to find the sequence of states $j_1, \ldots, j_n$ that maximizes
$P(X_1 = j_1, \ldots, X_n = j_n \mid S^n = s^n),$
or equivalently the joint probability $P(X_1 = j_1, \ldots, X_n = j_n, \, S^n = s^n)$.
To solve the above we let
$V_k(j) = \max_{j_1, \ldots, j_{k-1}} P(X_1 = j_1, \ldots, X_{k-1} = j_{k-1}, \, X_k = j, \, S^k = s^k).$
19
We can show, using probabilistic arguments, that
$V_k(j) = p(s_k \mid j) \max_i P_{ij} \, V_{k-1}(i).$
Starting with
$V_1(j) = P(X_1 = j) \, p(s_1 \mid j),$
we can recursively determine $V_k(j)$ for each $k$ and $j$; recording the maximizing states and backtracking from $\arg\max_j V_n(j)$ yields the most likely sequence of states. This procedure is known as the Viterbi algorithm.
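A sketch of the recursion just described, with backtracking, under the same parameter conventions as the earlier sketches; the function name is mine.

    import numpy as np

    def viterbi(P, emit, init, signals):
        """Most likely state sequence given the observed signals."""
        n, m = len(signals), P.shape[0]
        V = np.zeros((n, m))
        back = np.zeros((n, m), dtype=int)
        V[0] = init * emit[:, signals[0]]                    # V_1(j) = P(X_1=j) p(s_1|j)
        for k in range(1, n):
            scores = V[k - 1][:, None] * P                   # scores[i, j] = V_{k-1}(i) P_ij
            back[k] = scores.argmax(axis=0)                  # best previous state for each j
            V[k] = emit[:, signals[k]] * scores.max(axis=0)  # V_k(j) = p(s_k|j) max_i ...
        # Backtrack from the state maximizing V_n(j)
        path = [int(V[-1].argmax())]
        for k in range(n - 1, 0, -1):
            path.append(int(back[k][path[-1]]))
        return path[::-1]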