Part 1: Markov Models for Pattern Recognition – Introduction
CSE717, SPRING 2008, CUBS, Univ at Buffalo
Textbook: Markov Models for Pattern Recognition: From Theory to Applications, by Gernot A. Fink, 1st Edition, Springer, Nov 2007
Course Outline
- Foundation of Math Statistics
- Vector Quantization and Mixture Density Models
- Markov Models
  - Hidden Markov Model (HMM): model formulation, classic algorithms in the HMM, application domains of the HMM
  - n-Gram
- Systems
  - Character and handwriting recognition
  - Speech recognition
  - Analysis of biological sequences
Preliminary Requirements
- Familiarity with probability theory and statistics
- Basic concepts in stochastic processes
Part 2a: Foundation of Probability Theory, Statistics & Stochastic Processes
Coin Toss Problem
- Coin toss result X: random variable
- head, tail: states
- S_X: set of states, S_X = {head, tail}
- Probabilities: Pr_X(head) + Pr_X(tail) = 1; for a fair coin, Pr_X(head) = Pr_X(tail) = 1/2
Discrete Random Variable
- A discrete random variable's states are discrete: natural numbers, integers, etc.
- Described by the probabilities of its states: Pr_X(x = s_1), Pr_X(x = s_2), ...
- s_1, s_2, ...: discrete states (possible values of x)
- Probabilities over all the states sum to 1: Σ_i Pr_X(x = s_i) = 1
Continuous Random Variable
- A continuous random variable's states are continuous: real numbers, etc.
- Described by its probability density function (p.d.f.): p_X(s)
- The probability of a < X < b is obtained by the integral Pr(a < X < b) = ∫_a^b p_X(s) ds
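As an illustrative sketch (not part of the slides), the integral above can be evaluated for a normal random variable using its c.d.f., which Python's standard library exposes through math.erf; the function names here are our own:

```python
import math

def normal_cdf(x, mu=0.0, sigma=1.0):
    """C.d.f. of N(mu, sigma^2), expressed via the error function."""
    return 0.5 * (1.0 + math.erf((x - mu) / (sigma * math.sqrt(2.0))))

def prob_between(a, b, mu=0.0, sigma=1.0):
    """Pr(a < X < b) = integral of the p.d.f. from a to b = CDF(b) - CDF(a)."""
    return normal_cdf(b, mu, sigma) - normal_cdf(a, mu, sigma)

# Roughly 68.3% of the probability mass of N(0,1) lies in (-1, 1)
print(round(prob_between(-1.0, 1.0), 3))  # 0.683
```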
Joint Probability and Joint p.d.f.
- Joint probability of discrete random variables x, y: Pr(x, y)
- Joint p.d.f. of continuous random variables x, y: p(x, y)
- Independence condition: Pr(x, y) = Pr(x) Pr(y) (discrete); p(x, y) = p_X(x) p_Y(y) (continuous)
Conditional Probability and p.d.f.
- Conditional probability of discrete random variables: Pr(x | y) = Pr(x, y) / Pr(y)
- Conditional p.d.f. of continuous random variables: p(x | y) = p(x, y) / p_Y(y)
Statistics: Expected Value and Variance
- For a discrete random variable: E[X] = Σ_i s_i Pr_X(x = s_i)
- For a continuous random variable: E[X] = ∫ s p_X(s) ds
- Variance: Var[X] = E[(X − E[X])²]
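A small worked sketch of the discrete formulas (our own example, a fair six-sided die, not from the slides):

```python
# Fair six-sided die: states 1..6, each with probability 1/6
pmf = {s: 1 / 6 for s in range(1, 7)}

# E[X] = sum over states of s * Pr(s)
expected = sum(s * p for s, p in pmf.items())

# Var[X] = E[(X - E[X])^2]
variance = sum((s - expected) ** 2 * p for s, p in pmf.items())

print(round(expected, 4), round(variance, 4))  # 3.5 2.9167
```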
Normal Distribution of a Single Random Variable
- Notation: X ~ N(μ, σ²)
- p.d.f.: p(x) = 1 / (√(2π) σ) · exp(−(x − μ)² / (2σ²))
- Expected value: E[X] = μ
- Variance: Var[X] = σ²
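A quick numerical sketch (ours, not from the slides): evaluate the p.d.f. directly and check that samples drawn with random.gauss have empirical mean and variance close to μ and σ²; the parameter values below are arbitrary:

```python
import math
import random

def normal_pdf(x, mu, sigma):
    """p(x) = 1/(sqrt(2*pi)*sigma) * exp(-(x - mu)^2 / (2*sigma^2))"""
    return math.exp(-(x - mu) ** 2 / (2 * sigma ** 2)) / (math.sqrt(2 * math.pi) * sigma)

random.seed(0)
mu, sigma = 2.0, 0.5
samples = [random.gauss(mu, sigma) for _ in range(100_000)]

mean = sum(samples) / len(samples)
var = sum((x - mean) ** 2 for x in samples) / len(samples)
print(round(mean, 2), round(var, 2))  # close to mu = 2.0 and sigma^2 = 0.25
```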
Stochastic Process
- A stochastic process is a time series of random variables {x_t}
- x_t: random variable; t: time stamp
- Examples: audio signals, stock market prices
Causal Process
- A stochastic process is causal if it has a finite history
- A causal process can be represented by Pr(x_t | x_1, ..., x_{t−1})
Stationary Process
- A stochastic process is stationary if its joint distribution is invariant to time shifts, i.e., for any n, any times t_1, ..., t_n, and any shift τ: Pr(x_{t_1}, ..., x_{t_n}) = Pr(x_{t_1+τ}, ..., x_{t_n+τ})
- A stationary process in this sense is sometimes referred to as strictly stationary, in contrast with weak or wide-sense stationarity
Gaussian White Noise
- White noise: {x_t} obeys an independent, identical distribution (i.i.d.)
- Gaussian white noise: in addition, each x_t ~ N(μ, σ²)
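A simulation sketch (ours, not from the slides): generate an i.i.d. Gaussian sequence and check that, as independence implies, the empirical lag-1 autocorrelation is near zero; sample size and seed are arbitrary:

```python
import random

random.seed(1)
# Gaussian white noise: an i.i.d. sequence x_t ~ N(0, 1)
noise = [random.gauss(0.0, 1.0) for _ in range(50_000)]

mean = sum(noise) / len(noise)
var = sum((x - mean) ** 2 for x in noise) / len(noise)

# Independence across time steps implies near-zero correlation at lag 1
lag1 = sum((noise[t] - mean) * (noise[t + 1] - mean)
           for t in range(len(noise) - 1)) / (len(noise) - 1) / var
print(round(lag1, 3))  # close to 0
```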
Gaussian White Noise is a Stationary Process
- Proof: since the x_t are i.i.d., for any n, any times t_1, ..., t_n, and any shift τ:
  Pr(x_{t_1}, ..., x_{t_n}) = Π_i Pr(x_{t_i}) = Π_i Pr(x_{t_i+τ}) = Pr(x_{t_1+τ}, ..., x_{t_n+τ})
  (independence factorizes the joint distribution; identical distribution makes each factor time-invariant)
Temperature
- Q1: Is the temperature within a day stationary?
Markov Chains
- A causal process {x_t} is a Markov chain of order k if, for any x_1, ..., x_t: Pr(x_t | x_1, ..., x_{t−1}) = Pr(x_t | x_{t−k}, ..., x_{t−1})
- k is the order of the Markov chain
- First-order Markov chain: Pr(x_t | x_1, ..., x_{t−1}) = Pr(x_t | x_{t−1})
- Second-order Markov chain: Pr(x_t | x_1, ..., x_{t−1}) = Pr(x_t | x_{t−2}, x_{t−1})
Homogeneous Markov Chains
- A k-th order Markov chain is homogeneous if the state transition probability is the same over time, i.e., Pr(x_t | x_{t−k}, ..., x_{t−1}) does not depend on t
- Q2: Does a homogeneous Markov chain imply a stationary process?
State Transition in Homogeneous Markov Chains
- Suppose {x_t} is a k-th order homogeneous Markov chain and S is the set of all possible states (values) of x_t. Then for any k+1 states x_0, x_1, ..., x_k in S, the state transition probability can be abbreviated to Pr(x_k | x_0, ..., x_{k−1}), with no time index needed
Example of Markov Chain
- Two states: 'Rain' and 'Dry'
- Transition probabilities: Pr('Rain'|'Rain') = 0.4, Pr('Dry'|'Rain') = 0.6, Pr('Rain'|'Dry') = 0.2, Pr('Dry'|'Dry') = 0.8
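A minimal simulation sketch of this two-state chain (Python, not part of the slides; seed and sample length are arbitrary):

```python
import random

# First-order homogeneous Markov chain from the slide:
# Pr('Rain'|'Rain')=0.4, Pr('Dry'|'Rain')=0.6, Pr('Rain'|'Dry')=0.2, Pr('Dry'|'Dry')=0.8
transition = {
    'Rain': {'Rain': 0.4, 'Dry': 0.6},
    'Dry':  {'Rain': 0.2, 'Dry': 0.8},
}

def step(state, rng):
    """Sample the next state given the current state."""
    return 'Rain' if rng.random() < transition[state]['Rain'] else 'Dry'

rng = random.Random(42)
state, rain_days, n = 'Dry', 0, 200_000
for _ in range(n):
    state = step(state, rng)
    rain_days += (state == 'Rain')

print(round(rain_days / n, 2))  # long-run fraction of rainy days, close to 0.25
```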
Short Term Forecast
- Initial (say, Wednesday) probabilities: Pr_Wed('Rain') = 0.3, Pr_Wed('Dry') = 0.7
- What's the probability of rain on Thursday?
  Pr_Thu('Rain') = Pr_Wed('Rain') × Pr('Rain'|'Rain') + Pr_Wed('Dry') × Pr('Rain'|'Dry') = 0.3 × 0.4 + 0.7 × 0.2 = 0.26
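The one-step forecast above is a single line of arithmetic; as a sketch:

```python
# Pr_Thu(Rain) = Pr_Wed(Rain)*Pr(Rain|Rain) + Pr_Wed(Dry)*Pr(Rain|Dry)
p_rain_wed = 0.3
p_rain_thu = p_rain_wed * 0.4 + (1 - p_rain_wed) * 0.2
print(round(p_rain_thu, 2))  # 0.26
```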
Condition of Stationarity: Steady-State Distribution
- Pr_t('Rain') = Pr_{t−1}('Rain') × Pr('Rain'|'Rain') + Pr_{t−1}('Dry') × Pr('Rain'|'Dry') = Pr_{t−1}('Rain') × 0.4 + (1 − Pr_{t−1}('Rain')) × 0.2 = 0.2 + 0.2 × Pr_{t−1}('Rain')
- Setting Pr_t('Rain') = Pr_{t−1}('Rain') gives Pr_{t−1}('Rain') = 0.25, Pr_{t−1}('Dry') = 1 − 0.25 = 0.75
Steady-State Analysis
- Pr_t('Rain') = 0.2 + 0.2 × Pr_{t−1}('Rain')
- Pr_t('Rain') − 0.25 = 0.2 × (Pr_{t−1}('Rain') − 0.25)
- Pr_t('Rain') = 0.2^{t−1} × (Pr_1('Rain') − 0.25) + 0.25
- Pr_t('Rain') → 0.25 as t → ∞ (converges to the steady-state distribution)
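Iterating the recurrence numerically shows the convergence from any starting probability (our own sketch; the initial value 0.9 is arbitrary):

```python
# Iterate Pr_t(Rain) = 0.2 * Pr_{t-1}(Rain) + 0.2 from an arbitrary start
p = 0.9  # arbitrary initial probability of rain
history = [p]
for _ in range(20):
    p = 0.2 * p + 0.2
    history.append(p)

# The geometric factor 0.2 shrinks the distance to 0.25 each step
print(round(history[1], 3), round(history[2], 3), round(p, 6))  # 0.38 0.276 0.25
```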
Periodic Markov Chain
- A periodic Markov chain never converges to a steady state
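A minimal illustration of this (ours, not from the slides): a two-state chain that always switches state has period 2, and the state distribution oscillates forever instead of converging:

```python
# Deterministic two-state chain: Pr(A|B) = 1, Pr(B|A) = 1 (period 2)
p_a = 1.0  # start in state A with certainty
dist = []
for _ in range(6):
    p_a = 1.0 - p_a  # one step: the chain always switches state
    dist.append(p_a)

print(dist)  # [0.0, 1.0, 0.0, 1.0, 0.0, 1.0] -- oscillates, never converges
```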