Slide 1: Entropy of Hidden Markov Processes
Or Zuk (1), Ido Kanter (2), Eytan Domany (1)
(1) Weizmann Inst.  (2) Bar-Ilan Univ.

Slide 2: Overview
- Introduction
- Problem Definition
- Statistical Mechanics approach
- Cover & Thomas Upper Bounds
- Radius of Convergence
- Related subjects
- Future Directions

Slide 3: HMP - Definitions
Markov process:
- X - Markov process
- M - transition matrix: M_{ij} = Pr(X_{n+1} = j | X_n = i)
Hidden Markov process:
- Y - noisy observation of X
- N - noise/emission matrix: N_{ij} = Pr(Y_n = j | X_n = i)
[Diagram: the chain X_n -> X_{n+1} evolves via M; each X_n emits Y_n via N.]
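To make the definitions concrete, here is a minimal simulation sketch (mine, not from the slides; the function name sample_hmp and the use of numpy are assumptions):

```python
import numpy as np

def sample_hmp(M, N, n, rng=None):
    """Draw (X_1..X_n, Y_1..Y_n): X is Markov(M), Y_t emitted from X_t via N."""
    rng = rng or np.random.default_rng(0)
    k, a = M.shape[0], N.shape[1]
    x = np.empty(n, dtype=int)
    y = np.empty(n, dtype=int)
    x[0] = rng.integers(k)                    # arbitrary uniform start
    y[0] = rng.choice(a, p=N[x[0]])
    for t in range(1, n):
        x[t] = rng.choice(k, p=M[x[t - 1]])   # M_ij = Pr(X_{n+1}=j | X_n=i)
        y[t] = rng.choice(a, p=N[x[t]])       # N_ij = Pr(Y_n=j | X_n=i)
    return x, y

# The symmetric binary HMP of slide 5, with flip prob. p and noise eps:
p, eps = 0.2, 0.05
M = np.array([[1 - p, p], [p, 1 - p]])
N = np.array([[1 - eps, eps], [eps, 1 - eps]])
x, y = sample_hmp(M, N, 20)
```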

Slide 4: Example - Binary HMP
[Figure: two two-state diagrams. Transition: states 0 and 1 with probabilities p(0|0), p(1|0), p(0|1), p(1|1). Emission: probabilities q(0|0), q(1|0), q(0|1), q(1|1).]

Slide 5: Example - Binary HMP (cont.)
- For simplicity, we concentrate on the symmetric binary HMP:
  M = [1-p, p; p, 1-p],   N = [1-ε, ε; ε, 1-ε]
- So all properties of the process depend on two parameters, p and ε. Assume (w.l.o.g.) p, ε < ½.

Slide 6: HMP Entropy Rate
- Definition: H = lim_{n→∞} (1/n) H(Y_1, ..., Y_n)
- H is difficult to compute; it can be expressed as a Lyapunov exponent, which is hard to compute in general [Jacquet et al. 04].
- What to do? Calculate H in different regimes.
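By the Shannon-McMillan-Breiman theorem, -(1/n) log2 Pr(y_1..y_n) converges to H along a typical realization, and the per-symbol conditional probabilities are exactly what the normalized forward filter computes. A Monte Carlo sketch (mine, not from the slides; function name and parameters are assumptions):

```python
import numpy as np

def entropy_rate_mc(M, N, n=100_000, seed=1):
    """Monte Carlo estimate of the HMP entropy rate in bits:
    H ~= -(1/n) * log2 Pr(y_1..y_n), via the normalized forward filter."""
    rng = np.random.default_rng(seed)
    k, a = M.shape[0], N.shape[1]
    x = rng.integers(k)                  # hidden state, uniform start
    alpha = np.full(k, 1.0 / k)          # filtered dist. of the hidden state
    logprob = 0.0
    for t in range(n):
        if t > 0:
            x = rng.choice(k, p=M[x])    # advance the hidden chain
            alpha = alpha @ M            # predict: dist. of X_t given past y's
        y = rng.choice(a, p=N[x])        # emit observation
        joint = alpha * N[:, y]          # joint with y_t
        py = joint.sum()                 # Pr(y_t | y_1..y_{t-1})
        logprob += np.log2(py)
        alpha = joint / py               # condition on y_t
    return -logprob / n

p, eps = 0.2, 0.05
M = np.array([[1 - p, p], [p, 1 - p]])
N = np.array([[1 - eps, eps], [eps, 1 - eps]])
print(entropy_rate_mc(M, N))             # converges to H as n grows
```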

Slide 7: Different Regimes
- p → 0, p → ½ (ε fixed)
- ε → 0, ε → ½ (p fixed)
- [Ordentlich & Weissman 04] study several regimes. We concentrate on the 'small noise' regime ε → 0, where the solution can be given as a power series in ε:
  H(ε) = Σ_{k≥0} H_k ε^k, with coefficients H_k depending on p.

Slide 8: Statistical Mechanics
- First, observe the Markovian property: the joint probability of (X, Y) factorizes along the chain.
- Then perform a change of variables to ±1 spins (reconstruction sketched below).
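The slide's formulas did not survive transcription; the following is a standard reconstruction for the symmetric binary case (my notation, with couplings J and K chosen to match the diagram on the next slide):

```latex
% Joint probability of the observations, summing over hidden paths:
\[
  \Pr(Y) \;=\; \sum_{X} \Pr(X_1)
      \prod_{n=1}^{L-1} M_{X_n X_{n+1}}
      \prod_{n=1}^{L}   N_{X_n Y_n} .
\]
% Change of variables: x_n, y_n \in \{0,1\} mapped to spins
% \sigma_n = 1 - 2x_n, \tau_n = 1 - 2y_n \in \{-1,+1\}.
% Each symmetric factor becomes a Boltzmann weight:
\[
  M_{X_n X_{n+1}} \propto e^{J \sigma_n \sigma_{n+1}}, \qquad
  N_{X_n Y_n}     \propto e^{K \sigma_n \tau_n},
  \qquad
  J = \frac{1}{2}\ln\frac{1-p}{p}, \quad
  K = \frac{1}{2}\ln\frac{1-\varepsilon}{\varepsilon}.
\]
```

Up to normalization, Pr(Y) is then the partition function of a one-dimensional Ising chain of spins σ_n with nearest-neighbor coupling J, each spin in a local field K·τ_n fixed by the observations: the spin-glass picture of the next slide.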

Slide 9: Statistical Mechanics (cont.)
Ising Model: σ, τ ∈ {-1, 1}; Spin Glasses.
[Figure: a chain of spin pairs (σ_1, τ_1), (σ_2, τ_2), ..., (σ_n, τ_n), with coupling J between neighboring σ-spins and coupling K between each σ_n and its τ_n; the ± signs show a sample configuration.]

Slide 10: Statistical Mechanics (cont.)
Summing over the spins, we get: [formula image not recoverable from transcript]

Slide 11: Statistical Mechanics (cont.)
Computing the entropy via a low-temperature/high-field expansion: [formula image not recoverable from transcript]

12 12 Cover&Thomas Bounds It is known (Cover & Thomas 1991) : u We will use the upper-bounds C (n), and derive their orders : u Qu : Do the orders ‘saturate’ ?
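For small n, both conditional entropies can be computed exactly by enumerating observation sequences. A brute-force sketch (mine; assumes the symmetric binary HMP, whose stationary distribution is uniform):

```python
import itertools
import numpy as np

def ct_bounds(M, N, n):
    """Exact Cover & Thomas bounds (in bits) for small n:
    lower = H(Y_n | Y_1..Y_{n-1}, X_1)  <=  H  <=  H(Y_n | Y_1..Y_{n-1}) = upper."""
    k, a = M.shape[0], N.shape[1]
    pi = np.full(k, 1.0 / k)   # stationary distribution (uniform: M symmetric)

    def p_y(y, x1=None):
        """Pr(Y_1..Y_len = y), optionally jointly with X_1 = x1."""
        total = 0.0
        for x in itertools.product(range(k), repeat=len(y)):
            if x1 is not None and x[0] != x1:
                continue
            pr = pi[x[0]] * N[x[0], y[0]]
            for t in range(1, len(y)):
                pr *= M[x[t - 1], x[t]] * N[x[t], y[t]]
            total += pr
        return total

    def H(probs):
        q = np.array([v for v in probs if v > 0])
        return float(-(q * np.log2(q)).sum())

    ys = list(itertools.product(range(a), repeat=n))
    ys1 = list(itertools.product(range(a), repeat=n - 1))
    upper = H([p_y(y) for y in ys]) - H([p_y(y) for y in ys1])
    # H(Y_n | Y^{n-1}, X_1) = H(Y^n, X_1) - H(Y^{n-1}, X_1)
    lower = (H([p_y(y, s) for y in ys for s in range(k)])
             - H([p_y(y, s) for y in ys1 for s in range(k)]))
    return lower, upper

p, eps = 0.2, 0.05
M = np.array([[1 - p, p], [p, 1 - p]])
N = np.array([[1 - eps, eps], [eps, 1 - eps]])
print(ct_bounds(M, N, 4))   # the n = 4 case shown on the next slide
```

Both bounds converge to H as n grows and should bracket the Monte Carlo estimate from the slide-6 sketch.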

13 13 Cover&Thomas Bounds (cont.) n=4

14 14 Cover&Thomas Bounds (cont.) u Ans : Yes. In fact they ‘saturate’ sooner than would have been expected ! For n  (K+3)/2 they become constant. We therefore have : u Conjecture 1 : (proven for k=1) u How do the orders look ? Their expression is simpler when expressed using = 1-2p, which is the 2 nd eigenvalue of P. u Conjecture 2 :

Slide 15: First Few Orders
[Formula images for the orders not recoverable from transcript. At zeroth order (ε = 0) the process is the Markov chain itself, so H_0 = -p log2 p - (1-p) log2 (1-p), the binary entropy of p.]
- Note: H_0 - H_2 are proven. The rest are conjectures from the upper bounds.

Slide 16: First Few Orders (cont.)
[Formula images not recoverable from transcript.]

Slide 17: First Few Orders (cont.)
[Formula images not recoverable from transcript.]

Slide 18: Radius of Convergence
- When is our approximation good?
- Instructive: compare to the i.i.d. model. [formula image not recoverable]
- For the HMP the limit is unknown. We used the fit: [formula image not recoverable]

Slide 19: Radius of Convergence (cont.)
[Plot not recoverable from transcript.]

Slide 20: Radius of Convergence (cont.)
[Plot not recoverable from transcript.]

Slide 21: Relative Entropy Rate
- Relative entropy rate: D = lim_{n→∞} (1/n) D( P(Y_1..Y_n) || Q(Y_1..Y_n) )
- We get: [formula image not recoverable]
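A finite-n approximation to this limit can be computed by enumeration for small n (a sketch under the same assumptions as above; model P has matrices Mp, Np and model Q has Mq, Nq, all names mine):

```python
import itertools
import numpy as np

def relative_entropy_rate(Mp, Np, Mq, Nq, n=8):
    """(1/n) * D( P(Y_1..Y_n) || Q(Y_1..Y_n) ) in bits: a finite-n
    approximation to the relative entropy rate of two HMPs."""
    def seq_prob(M, N, y):
        # Forward algorithm: Pr(y) under the HMP (uniform initial dist.).
        alpha = np.full(M.shape[0], 1.0 / M.shape[0]) * N[:, y[0]]
        for t in range(1, len(y)):
            alpha = (alpha @ M) * N[:, y[t]]
        return float(alpha.sum())

    d = 0.0
    for y in itertools.product(range(Np.shape[1]), repeat=n):
        p = seq_prob(Mp, Np, y)
        q = seq_prob(Mq, Nq, y)
        if p > 0:
            d += p * np.log2(p / q)
    return d / n

def symmetric(p, e):
    return (np.array([[1 - p, p], [p, 1 - p]]),
            np.array([[1 - e, e], [e, 1 - e]]))

Mp, Np = symmetric(0.2, 0.05)
Mq, Nq = symmetric(0.3, 0.10)
print(relative_entropy_rate(Mp, Np, Mq, Nq))
```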

Slide 22: Index of Coincidence
- Take two independent realizations Y, Y' (each of length n) of the same HMP. What is the probability that they are equal? It decays exponentially with n.
- We get: [formula image not recoverable]
- Similarly, we can solve for three and four (but not five) realizations. These quantities can give bounds on the entropy rate.
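For m independent realizations, Pr(all m coincide) = Σ_y Pr(y)^m, which is computable by direct enumeration for small n (a sketch; names are mine):

```python
import itertools
import numpy as np

def index_of_coincidence(M, N, n, m=2):
    """Pr that m independent length-n realizations of the HMP coincide:
    sum over all y of Pr(y)^m. Decays exponentially in n."""
    def seq_prob(y):
        # Forward algorithm: Pr(y) with uniform initial distribution.
        alpha = np.full(M.shape[0], 1.0 / M.shape[0]) * N[:, y[0]]
        for t in range(1, len(y)):
            alpha = (alpha @ M) * N[:, y[t]]
        return float(alpha.sum())

    return sum(seq_prob(y) ** m
               for y in itertools.product(range(N.shape[1]), repeat=n))

p, eps = 0.2, 0.05
M = np.array([[1 - p, p], [p, 1 - p]])
N = np.array([[1 - eps, eps], [eps, 1 - eps]])
print(index_of_coincidence(M, N, n=10))
```

For m = 2, the decay exponent -(1/n) log2 of this quantity is the order-2 Rényi entropy rate of Y_1..Y_n, which lower-bounds the Shannon entropy rate; this is one way such coincidence probabilities bound H.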

Slide 23: Future Directions
- Proving the conjectures
- Generalizations (e.g. larger alphabets, continuous case)
- Other regimes
- Relative entropy of two HMPs

Thank You

