1
Hidden Markov Models
Usman Roshan, BNFO 601
2
Hidden Markov Models
An HMM consists of:
–An alphabet of symbols
–A set of states that emit symbols from the alphabet
–A set of probabilities:
–State transition probabilities: a_kl = P(state at position i is l | state at position i-1 is k)
–Emission probabilities: e_k(b) = P(symbol b is emitted | state is k)
3
Loaded die problem
4
Loaded die automaton
[Figure: two-state diagram with states F (fair) and L (loaded), transition probabilities a_FF, a_FL, a_LF, a_LL, and emission probabilities e_F(i), e_L(i).]
5
Loaded die problem
Consider the following rolls:
Observed sequence: 21665261
Underlying die:    FFLLFLLF
What is the probability that the underlying path generated the observed sequence?
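The probability of a (path, sequence) pair is the product of the start probability, the emissions along the path, and the transitions between consecutive states. A minimal sketch for the question above; the slides do not give numeric parameters, so the fair/loaded emission values, the transition probabilities, and the uniform start distribution below are all assumptions for illustration:

```python
# Assumed parameters (not from the slides): the loaded die favors 6.
trans = {('F', 'F'): 0.95, ('F', 'L'): 0.05, ('L', 'L'): 0.90, ('L', 'F'): 0.10}
emit = {'F': {r: 1/6 for r in '123456'},                    # fair die: uniform
        'L': {**{r: 1/10 for r in '12345'}, '6': 1/2}}      # loaded die
start = {'F': 0.5, 'L': 0.5}                                # assumed uniform start

def path_probability(rolls, path):
    """P(rolls, path) = start * product of emissions and transitions."""
    p = start[path[0]] * emit[path[0]][rolls[0]]
    for i in range(1, len(rolls)):
        p *= trans[(path[i-1], path[i])] * emit[path[i]][rolls[i]]
    return p

print(path_probability("21665261", "FFLLFLLF"))
```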
6
Optimization
Problem: Given an HMM and a sequence of rolls, find the most probable underlying generating path.
Let x = x_1 x_2 ... x_n be the sequence of rolls. Let V_F(i) denote the probability of the most probable path of x_1...x_i that ends in state F. (Define V_L(i) similarly.)
7
Optimization
Initialization: V_F(1) = (1/2) e_F(x_1), V_L(1) = (1/2) e_L(x_1)
Recurrence, for i = 2..n:
V_F(i) = e_F(x_i) max( V_F(i-1) a_FF, V_L(i-1) a_LF )
V_L(i) = e_L(x_i) max( V_F(i-1) a_FL, V_L(i-1) a_LL )
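This recurrence is the Viterbi algorithm. A minimal sketch in log space (to avoid underflow on long sequences), with a traceback to recover the path; the parameter values are assumed, as the slides do not specify them:

```python
import math

# Assumed parameters (not from the slides).
states = ('F', 'L')
trans = {('F', 'F'): 0.95, ('F', 'L'): 0.05, ('L', 'L'): 0.90, ('L', 'F'): 0.10}
emit = {'F': {r: 1/6 for r in '123456'},
        'L': {**{r: 1/10 for r in '12345'}, '6': 1/2}}
start = {'F': 0.5, 'L': 0.5}

def viterbi(rolls):
    """Return the most probable state path for the observed rolls."""
    # V[k] = log-probability of the best path for the prefix ending in state k
    V = {k: math.log(start[k] * emit[k][rolls[0]]) for k in states}
    back = []  # back[i][k] = best predecessor of state k at position i+1
    for x in rolls[1:]:
        ptr, newV = {}, {}
        for k in states:
            prev = max(states, key=lambda j: V[j] + math.log(trans[(j, k)]))
            ptr[k] = prev
            newV[k] = V[prev] + math.log(trans[(prev, k)]) + math.log(emit[k][x])
        V, _ = newV, back.append(ptr)
    # Traceback from the best final state
    last = max(states, key=lambda k: V[k])
    path = [last]
    for ptr in reversed(back):
        path.append(ptr[path[-1]])
    return ''.join(reversed(path))
```

With the parameters above, a long run of sixes is decoded as loaded and a long run of ones as fair.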
8
Parameter learning (when the generating state path is known)
A_kl: number of transitions from state k to state l
E_k(b): number of times state k emits symbol b
The maximum-likelihood estimates are a_kl = A_kl / Σ_l' A_kl' and e_k(b) = E_k(b) / Σ_b' E_k(b').
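When the path is known, learning reduces to counting and normalizing. A minimal sketch (it assumes every state has at least one outgoing transition and one emission in the training data):

```python
from collections import Counter

def estimate(rolls, path):
    """Maximum-likelihood transition and emission probabilities from counts."""
    A = Counter(zip(path, path[1:]))   # transition counts A_kl
    E = Counter(zip(path, rolls))      # emission counts E_k(b)
    states, symbols = set(path), set(rolls)
    # a_kl = A_kl / sum over l' of A_kl'
    trans = {(k, l): A[(k, l)] / sum(A[(k, m)] for m in states)
             for k in states for l in states}
    # e_k(b) = E_k(b) / sum over b' of E_k(b')
    emit = {(k, b): E[(k, b)] / sum(E[(k, c)] for c in symbols)
            for k in states for b in symbols}
    return trans, emit
```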
9
Parameter learning (when the generating state path is unknown)
Use the Expectation-Maximization algorithm (also known as the EM algorithm)
–Very popular, with many applications
–For HMMs it is called the Baum-Welch algorithm
Outline:
1.Start with random transition and emission probabilities
2.Compute expected transition and emission counts
3.Re-estimate the transition and emission probabilities from the expected counts of the previous step
4.Go to step 2 if not converged
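The expected counts in step 2 are computed with the forward and backward algorithms. A minimal sketch of the forward pass, which sums the probability of the observed rolls over all state paths (parameter values again assumed for illustration):

```python
# Assumed parameters (not from the slides).
states = ('F', 'L')
trans = {('F', 'F'): 0.95, ('F', 'L'): 0.05, ('L', 'L'): 0.90, ('L', 'F'): 0.10}
emit = {'F': {r: 1/6 for r in '123456'},
        'L': {**{r: 1/10 for r in '12345'}, '6': 1/2}}
start = {'F': 0.5, 'L': 0.5}

def forward(rolls):
    """Return P(rolls), summed over all possible state paths."""
    # f[k] = P(rolls so far, current state = k)
    f = {k: start[k] * emit[k][rolls[0]] for k in states}
    for x in rolls[1:]:
        f = {k: emit[k][x] * sum(f[j] * trans[(j, k)] for j in states)
             for k in states}
    return sum(f.values())
```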
10
Sequence alignment
11
Alignment recurrence
Initialization: F(0,0) = 0; F(i,0) = F(i-1,0) + gap; F(0,j) = F(0,j-1) + gap
Recursion: F(i,j) = max( F(i-1,j-1) + s(x_i, y_j), F(i-1,j) + gap, F(i,j-1) + gap )
12
Alignment recurrence
Termination: the optimal global alignment score is F(n,m); the alignment itself is recovered by tracing back through the dynamic programming matrix.
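The initialization/recursion/termination steps above can be sketched as a global alignment score computation. The scoring scheme below (match = 1, mismatch = -1, gap = -1) is an assumption for illustration; the slides do not specify one:

```python
def align_score(x, y, match=1, mismatch=-1, gap=-1):
    """Global alignment score of strings x and y by dynamic programming."""
    n, m = len(x), len(y)
    F = [[0] * (m + 1) for _ in range(n + 1)]
    # Initialization: aligning a prefix against the empty string costs gaps
    for i in range(1, n + 1):
        F[i][0] = i * gap
    for j in range(1, m + 1):
        F[0][j] = j * gap
    # Recursion: best of the diagonal (match/mismatch) and the two gap moves
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            s = match if x[i-1] == y[j-1] else mismatch
            F[i][j] = max(F[i-1][j-1] + s, F[i-1][j] + gap, F[i][j-1] + gap)
    # Termination: the full alignment score is F(n, m)
    return F[n][m]
```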