Convolutional Codes
p2. OUTLINE
[1] Shift registers and polynomials
[2] Encoding convolutional codes
[3] Decoding convolutional codes
[4] Truncated Viterbi decoding
p3. Convolutional Code
[1] Shift registers and polynomials
A register holds one bit and shifts its contents at each clock tick; outputs are formed by XOR (addition mod 2) of register contents, e.g. output = X0 + X1 + X2:

X0 X1 X2 | output
 0  0  0 |   0
 0  0  1 |   1
 0  1  0 |   1
 0  1  1 |   0
 1  0  0 |   1
 1  0  1 |   0
 1  1  0 |   0
 1  1  1 |   1
p4. Convolutional Code
Shift register (SR): the input enters X0 and output = X0 + X1 + X3 (Figure 8.1: a 4-stage shift register).
Example 8.1.1: at time t-1 the register holds X0X1X2X3 = 1101, so the output is 1+1+1 = 1; after the next clock tick it holds 0110 and the output is 0+1+0 = 1.
p5. Convolutional Code
Example 8.1.2: input 1010000, initial value 0000, output = X0 + X1 + X3.

time  input  X0 X1 X2 X3  output
0     1      1  0  0  0   1
1     0      0  1  0  0   1
2     1      1  0  1  0   1
3     0      0  1  0  1   0
4     0      0  0  1  0   0
5     0      0  0  0  1   1
6     0      0  0  0  0   0

Output sequence: 1110010.
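The table above is easy to reproduce in code. A minimal sketch (the function name `sr_encode` is mine, not from the text): simulate an s-stage shift register with tap coefficients g_0..g_{s-1} and check Example 8.1.2.

```python
def sr_encode(bits, taps):
    """Simulate an s-stage shift register over GF(2).

    taps[i] = g_i; each tick the input bit shifts into X0 and the
    output is sum(g_i * X_i) mod 2.
    """
    reg = [0] * len(taps)          # initial register value 00...0
    out = []
    for b in bits:
        reg = [b] + reg[:-1]       # shift the new bit into X0
        out.append(sum(g & x for g, x in zip(taps, reg)) % 2)
    return out

# Example 8.1.2: g = 1101 (output = X0 + X1 + X3), input 1010000
print(sr_encode([1, 0, 1, 0, 0, 0, 0], [1, 1, 0, 1]))  # [1, 1, 1, 0, 0, 1, 0]
```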
p6. Convolutional Code
s-stage shift register: a shift register with s registers.
X_i(t): the value of the contents of register X_i at time t.
c_t: the output at time t,
c_t = g_0 X_0(t) + g_1 X_1(t) + ... + g_{s-1} X_{s-1}(t), where each g_i is 0 or 1.
Example: 3-stage shift register with g_0 = 1, g_1 = 0, g_2 = 1. If at time t the input is 1 and X_0(t) = 1, X_1(t) = 0, X_2(t) = 0, then c_t = 1*1 + 0*0 + 1*0 = 1.

p7. Convolutional Code
Generator of a shift register
g(x): the generator of an s-stage shift register,
g(x) = g_0 + g_1 x + ... + g_{s-1} x^{s-1},
where the g_i are the tap coefficients of the shift register (register X_i corresponds to the polynomial term x^i).
Example: 4-stage shift register with g_0 = 1, g_1 = 1, g_2 = 0, g_3 = 1:
g = g_0 g_1 ... g_{s-1} = 1101, g(x) = 1 + x + 0*x^2 + x^3 = 1 + x + x^3.
p8. Convolutional Code
Example: generator g = 1101, g(x) = 1 + x + x^3. Input a = 10000, so a(x) = 1.
Output c_1 = 11010, i.e. c_1(x) = a(x)*g(x) = 1*g(x) = 1 + x + x^3.

time  input  X0 X1 X2 X3  output
0     1      1  0  0  0   1
1     0      0  1  0  0   1
2     0      0  0  1  0   0
3     0      0  0  0  1   1
4     0      0  0  0  0   0

p9. Convolutional Code
Example: generator g = 1101, g(x) = 1 + x + x^3. Input a = 01000, so a(x) = x.
Output c_2 = 01101, i.e. c_2(x) = a(x)*g(x) = x*g(x) = x + x^2 + x^4.

time  input  X0 X1 X2 X3  output
0     0      0  0  0  0   0
1     1      1  0  0  0   1
2     0      0  1  0  0   1
3     0      0  0  1  0   0
4     0      0  0  0  1   1

p10. Convolutional Code
Example: generator g = 1101, g(x) = 1 + x + x^3. Input a = 11000, so a(x) = 1 + x.
Output c = 10111, i.e. c(x) = 1 + x^2 + x^3 + x^4. By linearity,
c(x) = c_1(x) + c_2(x) = 1*g(x) + x*g(x) = (1 + x)*g(x) = a(x)*g(x).
For instance at t = 1 the register holds 1100, so the output is 1+1+0 = 0.
p11. Convolutional Code
Theorem 8.1.8. Let a(x) be the input sequence of an s-stage shift register, c(x) its output sequence, and g(x) = g_0 + g_1 x + ... + g_{s-1} x^{s-1} its generator. Then
c(x) = a(x)g(x).
Example 8.1.3: input sequence 1010000, a(x) = 1 + x^2; generator g(x) = 1 + x + x^3. The output sequence is 1110010:
c(x) = a(x)g(x) = (1 + x^2)(1 + x + x^3) = 1 + x + x^2 + x^5.
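Theorem 8.1.8 says shift-register encoding is polynomial multiplication over GF(2). A minimal sketch (function name is mine) that checks Example 8.1.3:

```python
def gf2_mul(a, b):
    """Multiply two polynomials over GF(2).

    Polynomials are coefficient lists, lowest degree first.
    """
    out = [0] * (len(a) + len(b) - 1)
    for i, ai in enumerate(a):
        if ai:
            for j, bj in enumerate(b):
                out[i + j] ^= bj   # addition mod 2 is XOR
    return out

# Example 8.1.3: a(x) = 1 + x^2 times g(x) = 1 + x + x^3
# gives c(x) = 1 + x + x^2 + x^5
print(gf2_mul([1, 0, 1], [1, 1, 0, 1]))  # [1, 1, 1, 0, 0, 1]
```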
p12. Convolutional Code
Example 8.1.4: input sequence a_0, a_1, a_2, a_3, 0, 0, 0; generator g(x) = 1 + x + x^3. The output sequence is
c(x) = a(x)g(x) = (a_0 + a_1 x + a_2 x^2 + a_3 x^3)(1 + x + x^3)
     = a_0 + (a_1 + a_0)x + (a_2 + a_1)x^2 + (a_3 + a_2 + a_0)x^3 + (a_3 + a_1)x^4 + a_2 x^5 + a_3 x^6.
Tick by tick the outputs X_0 + X_1 + X_3 are a_0, a_1 + a_0, a_2 + a_1, a_3 + a_2 + a_0, a_3 + a_1, a_2, a_3.
p13. Convolutional Code
s-stage feedback shift register (FSR)
X_i(t): the value of the contents of register X_i at time t; c_t: the output at time t. In an FSR the output bit is fed back into the register through the taps: c_t = X_{s-1}(t), and at the next tick
X_0(t+1) = input + c_t g_0,  X_i(t+1) = X_{i-1}(t) + c_t g_i  (1 <= i <= s-1).
Example: 3-stage FSR with g_0 = 1, g_1 = 1, g_2 = 0. From state X0X1X2 = 011 at time t with input 0, the feedback bit is c_t = 1, so at time t+1 the state is (0+1)(0+1)(1) = 111.

p14. Convolutional Code
Example: FSR with g_0 = 1, g_1 = 1, g_2 = 0; input 11000, initial value 000.

time  input  X0 X1 X2  output c_t
0     1      1  0  0   0
1     1      1  1  0   0
2     0      0  1  1   0
3     0      1  1  1   1
4     0      1  0  1   1

(At t = 3 the new state is (0+1)(0+1)(1) = 111 and at t = 4 it is (0+1)(1+1)(1) = 101.)
p15. Convolutional Code
Generator of a feedback shift register
g(x): the generator of an s-stage FSR,
g(x) = g_0 + g_1 x + ... + g_{s-1} x^{s-1} + x^s,
where the g_i are the feedback tap coefficients; g(x) is always monic of degree s.
Example: 3-stage FSR with g_0 = 1, g_1 = 1, g_2 = 0:
g = g_0 g_1 ... g_{s-1} 1 = 1101, g(x) = 1 + x + x^3.

p16. Convolutional Code
Polynomial division
An s-stage FSR with generator g(x) = g_0 + g_1 x + ... + g_{s-1} x^{s-1} + x^s divides its input by g(x). With input a = a_0 a_1 a_2 ... a_n and output c = c_0 c_1 c_2 ... c_n, write at time t (0 <= t <= n):
a_t(x) = a_0 x^t + a_1 x^{t-1} + ... + a_{t-1} x + a_t,
c_t(x) = c_0 x^t + c_1 x^{t-1} + ... + c_{t-1} x + c_t,
r_t(x) = X_0(t) + X_1(t) x + ... + X_{s-1}(t) x^{s-1}.
Then c_t(x) = a_t(x) div g(x) and r_t(x) = a_t(x) mod g(x), i.e.
a_t(x) = c_t(x)g(x) + r_t(x).
p17. Convolutional Code
Example: 3-stage FSR with g(x) = 1 + x + x^3; input a = 100000, so a_t(x) = x^t.

time  input  X0 X1 X2  output c_t
0     1      1  0  0   0
1     0      0  1  0   0
2     0      0  0  1   0
3     0      1  1  0   1
4     0      0  1  1   0
5     0      1  1  1   1

a_0(x) = 1:   c_0(x) = 0,       r_0(x) = 1
a_1(x) = x:   c_1(x) = 0,       r_1(x) = x
a_2(x) = x^2: c_2(x) = 0,       r_2(x) = x^2
a_3(x) = x^3: c_3(x) = 1,       r_3(x) = 1 + x
a_4(x) = x^4: c_4(x) = x,       r_4(x) = x + x^2
a_5(x) = x^5: c_5(x) = 1 + x^2, r_5(x) = 1 + x + x^2
Since a_t(x) = x*a_{t-1}(x) = x*c_{t-1}(x)g(x) + x*r_{t-1}(x), each tick multiplies by x and reduces mod g(x).
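The FSR division circuit can be sketched directly from the update rule on p13 (function name is mine; the generator is passed as the full monic coefficient list g_0..g_s):

```python
def fsr_divide(bits, g):
    """Feed bits through an s-stage feedback shift register with monic
    generator g = [g_0, ..., g_s], g[-1] == 1.

    Returns (quotient bits c_0..c_n, final register contents X_0..X_{s-1}).
    """
    s = len(g) - 1
    reg = [0] * s
    quotient = []
    for a in bits:
        f = reg[-1]                  # bit leaving X_{s-1}: feedback = output
        quotient.append(f)
        reg = [a ^ (f & g[0])] + [reg[i - 1] ^ (f & g[i]) for i in range(1, s)]
    return quotient, reg

# p17: g(x) = 1 + x + x^3, input 100000 (a_5(x) = x^5)
c, r = fsr_divide([1, 0, 0, 0, 0, 0], [1, 1, 0, 1])
print(c, r)  # [0, 0, 0, 1, 0, 1] [1, 1, 1]: c_5(x) = 1 + x^2, r_5(x) = 1 + x + x^2
```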
p18. Convolutional Code
Example 8.1.9: an (n=5, k=2, d=3) cyclic linear code with generator polynomial g(x) = 1 + x + x^3. Feeding the input 10000 (length n = 5) into the 3-stage FSR with the same generator computes the remainders r_t(x) = a_t(x) mod g(x) for 0 <= t <= 4 = n-1; these remainders give the columns of the parity check matrix H.
p19. Convolutional Code
Example 8.1.10: generator polynomial g(x) = 1 + x + x^3; input a = 10110, so a(x) = x + x^2 + x^4.

time  input  X0 X1 X2  output
0     1      1  0  0   0
1     0      0  1  0   0
2     1      1  0  1   0
3     1      0  0  0   1
4     0      0  0  0   0

Output 00010, i.e. c(x) = x; final register r = 000, i.e. r(x) = 0. Indeed a(x) = x*g(x) + 0.
p20. Convolutional Code
[2] Encoding convolutional codes
(n,k,m) convolutional code ((n,k,m)CV):
m: each encoder register is an (m+1)-stage shift register;
n: the number of (m+1)-stage shift registers ((m+1)-SR_i), each with its own generator g_i(x);
k: the number of message bits shifted into the registers per tick.
The input of (m+1)-SR_i is the message m(x); its output is c_i(x) = m(x)g_i(x), and
(n,k,m)CV = {c(x) = (c_1(x), c_2(x), ..., c_n(x))}.

p21. Convolutional Code
Register i has generator g_i(x) = g_{i,0} + g_{i,1}x + ... + g_{i,m}x^m and produces the output stream c_i = c_{i,0} c_{i,1} c_{i,2} c_{i,3} ..., 1 <= i <= n; together the streams form the codeword (c_1(x), c_2(x), ..., c_n(x)).
p22. Convolutional Code
Example: (n=2, k=1, m=3)CV with g_1(x) = 1 + x + x^3 and g_2(x) = 1 + x^2 + x^3:
(2,1,3)CV = {c(x) = (m(x)*(1 + x + x^3), m(x)*(1 + x^2 + x^3))}.

p23. Convolutional Code
Example 8.2.1: (2,1,3)CV with g_1(x) = 1 + x + x^3, g_2(x) = 1 + x^2 + x^3.
(a) The message m(x) = 1 + x^2 is encoded to
c(x) = ((1 + x^2)g_1(x), (1 + x^2)g_2(x)) = (1 + x + x^2 + x^5, 1 + x^3 + x^4 + x^5).
(b) The message m(x) = 1 + x^2 + x^3 + ... is encoded to
c(x) = (1 + x^3 + x^4 + x^5 + ..., 1 + x + x^3 + x^4 + x^5 + ...).
p24. Convolutional Code
Convolutional codes are linear codes. In an (n,k,m) convolutional code, for two codewords c(x), c'(x):
c(x) + c'(x) = (c_1(x), ..., c_n(x)) + (c_1'(x), ..., c_n'(x))
= (m(x)g_1(x), ..., m(x)g_n(x)) + (m'(x)g_1(x), ..., m'(x)g_n(x))
= ((m(x)+m'(x))g_1(x), ..., (m(x)+m'(x))g_n(x))
= (m''(x)g_1(x), ..., m''(x)g_n(x)) = (c_1''(x), ..., c_n''(x)) = c''(x),
so the sum of two codewords is again a codeword.
p25. Convolutional Code
Interleaved form of an (n,k,m)CV
(n,k,m)CV = {c(x) = (c_1(x), c_2(x), ..., c_n(x))}, with c_i(x) = c_{i,0} + c_{i,1}x + c_{i,2}x^2 + c_{i,3}x^3 + ..., 1 <= i <= n.
The interleaved form transmits one digit of each stream in turn:
c = c_{1,0} c_{2,0} ... c_{n,0}, c_{1,1} c_{2,1} ... c_{n,1}, c_{1,2} c_{2,2} ... c_{n,2}, ...
As a single polynomial,
c(x) = c_1(x^n) + c_2(x^n)*x + c_3(x^n)*x^2 + ... + c_n(x^n)*x^{n-1}.
p26. Convolutional Code
Example 8.2.5: (2,1,3)CV with g_1(x) = 1 + x + x^3, g_2(x) = 1 + x^2 + x^3.
The message m(x) = 1 + x^2 is encoded to
c(x) = (c_1(x), c_2(x)) = ((1+x^2)g_1(x), (1+x^2)g_2(x)) = (1 + x + x^2 + x^5, 1 + x^3 + x^4 + x^5),
i.e. (11100100..., 10011100...).
The interleaved representations of c and c(x) are
c = 11 10 10 01 01 11 ...
c(x) = c_1(x^2) + c_2(x^2)*x = 1 + x^2 + x^4 + x^10 + (1 + x^6 + x^8 + x^10)x
     = 1 + x + x^2 + x^4 + x^7 + x^9 + x^10 + x^11.
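The full (n,1,m) encoder with interleaving can be sketched as follows (function names are mine); it reproduces Example 8.2.5:

```python
def cv_encode(msg, gens):
    """Encode with an (n,1,m) convolutional code: one (m+1)-stage shift
    register per generator. gens[i] lists g_{i,0}..g_{i,m}.
    Returns one n-bit output block per tick."""
    reg = [0] * len(gens[0])
    blocks = []
    for b in msg:
        reg = [b] + reg[:-1]
        blocks.append(tuple(sum(g & x for g, x in zip(gen, reg)) % 2
                            for gen in gens))
    return blocks

def interleave(blocks):
    """Interleaved form: c_{1,0} c_{2,0} c_{1,1} c_{2,1} ..."""
    return [bit for block in blocks for bit in block]

# Example 8.2.5: m(x) = 1 + x^2; g1 = 1 + x + x^3, g2 = 1 + x^2 + x^3
blocks = cv_encode([1, 0, 1, 0, 0, 0], [[1, 1, 0, 1], [1, 0, 1, 1]])
print(interleave(blocks))  # [1, 1, 1, 0, 1, 0, 0, 1, 0, 1, 1, 1] = 11 10 10 01 01 11
```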
p27. Convolutional Code
The rate of an (n,k,m) convolutional code is defined to be k/n.
Convolutional codes with k > 1. Example 8.2.7: the (3,2,3)CV with g_1(x) = 1 + x^3, g_2(x) = 1 + x + x^3, g_3(x) = x + x^2 + x^3. Two message bits are shifted into the register per tick; the input m = 10 01 01 11 00 00 ... produces the three output streams c_1, c_2, c_3.
p28. Convolutional Code
Property: shifting k = 2 bits per tick. Example: a (1,2,3)CV with g(x) = 1 + x + x^3 and input a_0 a_1 a_2 a_3 (two bits shifted in per tick). The output stream is
c(x) = (a_1 + a_0) + (a_3 + a_2 + a_0)x + a_2 x^2.

p29. Convolutional Code
The same output can be produced by two subregisters acting on the odd- and even-indexed message bits, with g_1(x) = 1 and g_2(x) = 1 + x:
c_1(x) = (a_1 + a_3 x)*1,
c_2(x) = (a_0 + a_2 x)(1 + x) = a_0 + (a_2 + a_0)x + a_2 x^2,
c(x) = c_1(x) + c_2(x) = (a_1 + a_0) + (a_3 + a_2 + a_0)x + a_2 x^2.
p30. Convolutional Code
Generator matrix of a (2,1,m)CV
Generator matrix G; input sequence m = m_0 m_1 m_2 ...; output sequence c = m*G (interleaved form):
c = c_{1,0} c_{2,0} c_{1,1} c_{2,1} ...,
with g_1(x) = g_{1,0} + g_{1,1}x + ... + g_{1,m}x^m and g_2(x) = g_{2,0} + g_{2,1}x + ... + g_{2,m}x^m:
(2,1,m)CV = {c | c = m*G}.

p31. Convolutional Code
(1) Input m = m_0: output c = c_{1,0} c_{2,0} with
c_{1,0} = m_0*g_{1,0}, c_{2,0} = m_0*g_{2,0}, so c = [c_{1,0} c_{2,0}] = m_0*[g_{1,0} g_{2,0}].
p32. Convolutional Code
(2) Input m = m_0 m_1: output c = [c_{1,0} c_{2,0} c_{1,1} c_{2,1}] with
c_{1,0} = m_0*g_{1,0}, c_{2,0} = m_0*g_{2,0},
c_{1,1} = m_0*g_{1,1} + m_1*g_{1,0}, c_{2,1} = m_0*g_{2,1} + m_1*g_{2,0}.
p33. Convolutional Code
(3) Input m = m_0 m_1 m_2 ... m_e: output c = c_{1,0} c_{2,0} c_{1,1} c_{2,1} c_{1,2} c_{2,2} ....
G is the e x 2e band matrix whose row i (starting at column 2i) carries
g_{1,0} g_{2,0} g_{1,1} g_{2,1} ... g_{1,m} g_{2,m},
i.e. each row is the previous row shifted right by two positions, with zeros elsewhere.
p34. Convolutional Code
Example: (2,1,3)CV with generators g_1(x) = 1 + x + x^3, g_2(x) = 1 + x^2 + x^3. For input m = 101000, G is the 6 x 12 band matrix whose rows carry 11 10 01 11 shifted right by two positions per row, and
c = m*G = [1 0 1 0 0 0]*G = [1 1 1 0 1 0 0 1 0 1 1 1].
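The band matrix and the product c = m*G can be sketched as follows (function names are mine); it checks the example above:

```python
def generator_matrix(g1, g2, e):
    """Build the e x 2e interleaved generator matrix of a (2,1,m)CV:
    row i carries the pairs g_{1,j} g_{2,j}, starting at column 2i."""
    pairs = [bit for pair in zip(g1, g2) for bit in pair]
    G = []
    for i in range(e):
        row = [0] * (2 * e)
        for j, bit in enumerate(pairs):
            if 2 * i + j < 2 * e:          # clip the band at the right edge
                row[2 * i + j] = bit
        G.append(row)
    return G

def mat_encode(m, G):
    """c = m * G over GF(2)."""
    return [sum(mi & gij for mi, gij in zip(m, col)) % 2 for col in zip(*G)]

G = generator_matrix([1, 1, 0, 1], [1, 0, 1, 1], 6)
print(mat_encode([1, 0, 1, 0, 0, 0], G))  # [1, 1, 1, 0, 1, 0, 0, 1, 0, 1, 1, 1]
```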
p35. Convolutional Code
State diagram of an (n,1,m) convolutional code
State: the contents of the first m registers of the shift register, s = s_0 s_1 ... s_{m-1}.
Zero state: each of the first m registers contains 0.
On input 0 or 1, state s_0 s_1 ... s_{m-1} at time t moves to 0 s_0 s_1 ... s_{m-2} or 1 s_0 s_1 ... s_{m-2} at time t+1; conversely, at time t-1 the state was s_1 s_2 ... s_{m-1} 0 or s_1 s_2 ... s_{m-1} 1, with input s_0 at time t.
36
100 001 000 010 00 11 10 01 00 110 011 101 111 11 10 01 00 10 01 10 11 g 1 (x)=1+x+x 3 c1c1 c2c2 g 2 (x)=1+x 2 +x 3 m Example 8.2.9 : (2,1,3)CV : State diagram : 000 ttime state t+1 c1c2c1c2 000 100 input 0 input 1 0 0 1 0 0 0 X0X0 X1X1 X2X2 X3X3 c 1 c 2 =output 0 input 0 11
37
100 001 000 010 00 11 10 01 00 110 011 101 111 11 10 01 00 10 01 10 11 g 1 (x)=1+x+x 3 c1c1 c2c2 g 2 (x)=1+x 2 +x 3 m time state000 0 input 1 100 11 Example 8.2.9 : encoding convolutional code by state diagram Input : m=101 (2,1,3)CV : State diagram : 1 0 010 10 2 1 010 10
38
Tabular form A state diagram of a (n,1,m)CV can also be represented in tabular form. 100 001 000 010 00 11 10 01 00 110 011 101 111 11 10 01 00 10 01 10 11 000 001 010 011 100 101 110 111 00 01 10 11 10 01 00 11 10 01 00 01 10 11 X0X1X2X3X0X1X2X3 0 1 1 0 1 1 0 0 1 1 0 1 1 1 1 1 1 0 1 1 X1X2X3X1X2X3 X0X1X2X0X1X2 Example : (2,1,3)CV
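The tabular form can be generated from the generators (function name is mine; states are stored as bit tuples s_0..s_{m-1}):

```python
from itertools import product

def state_table(gens):
    """Tabular form of the state diagram of an (n,1,m)CV:
    map (state, input bit) -> (next state, output block)."""
    m = len(gens[0]) - 1
    table = {}
    for s in product((0, 1), repeat=m):
        for b in (0, 1):
            reg = (b,) + s                 # register contents after shifting b in
            out = tuple(sum(g & x for g, x in zip(gen, reg)) % 2 for gen in gens)
            table[(s, b)] = ((b,) + s[:-1], out)
    return table

T = state_table([[1, 1, 0, 1], [1, 0, 1, 1]])
print(T[((0, 0, 0), 1)])  # ((1, 0, 0), (1, 1)): 000 --1--> 100, output 11
```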
p39. Convolutional Code
[3] Decoding convolutional codes
Idea: the encoder output passes through a noisy channel before reaching the decoder. Consider C_1 in Example 8.2.1, and suppose the received word is w. If there is no directed walk in the state diagram whose output is w, we must find a codeword that "most likely" fits w, i.e. one at minimum Hamming distance from w.

p40. Convolutional Code
Window size τ: the amount of the received word w we "see" when making each decoding decision.
Hamming distance: H_d(w_1, w_2). Hamming weight: H_w(w_1).
p41. Convolutional Code
Exhaustive decoding algorithm (window size τ = 1)
Input: received word w = 11 10 10 01. Output: the correctly decoded message 1 0 1 0.
The walk (1) 000 -> 100, (2) -> 010, (3) -> 101, (4) -> 010 reproduces w exactly; its length is 4, and each step decodes one message digit.
p42. Convolutional Code
Input: received word w = 11 00 00 00. Output: the most likely message 1 1 1 0.
(1) Walk 000 -> 100 (output 11 matches); decode message digit 1.
(2) From 100 neither edge output matches 00; randomly choose one to decode the message digit (say, choose 110).
(3) From 110 the edge with output 00 leads to 111; decode message digit 1.
(4) From 111 the edge with output 00 leads to 011; decode message digit 0.
p43. Convolutional Code
Exhaustive decoding algorithm (window size τ = 2)
Input: received word w = 11 00 00 00 ... and window size τ = 2. Output: the "most closely" matching message.
(tick 0) Start at state 000 in the state diagram of Example 8.2.9.
(tick 1) We see w = 11 00. The walks of length 2 from 000:

walk           output  distance from 11 00
000, 000, 000  00 00   2
000, 000, 100  00 11   4
000, 100, 010  11 10   1
000, 100, 110  11 01   1

We make the decoding decision to move to state 100 and decode the first message digit as 1.
p44. Convolutional Code
(tick 2) We see w = 00 00. The walks of length 2 from 100:

walk           output  distance from 00 00
100, 010, 001  10 01   2
100, 010, 101  10 10   2
100, 110, 011  01 11   3
100, 110, 111  01 00   1

We make the decoding decision to move to state 110 and decode the next message digit as 1.
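The exhaustive decoder can be sketched as a brute-force search over all 2^τ walks from the current state at each tick (function name is mine; ties are broken by taking the first minimizing walk rather than at random, so tied cases may differ from the slides):

```python
from itertools import product

def exhaustive_window_decode(received, gens, tau):
    """Exhaustive decoding with window size tau: at each tick, try all
    2**tau walks from the current state, pick one closest to the next
    tau received blocks, and commit its first message digit."""
    m = len(gens[0]) - 1

    def step(state, b):
        reg = (b,) + state
        out = tuple(sum(g & x for g, x in zip(gen, reg)) % 2 for gen in gens)
        return (b,) + state[:-1], out

    state, decoded = (0,) * m, []
    for t in range(len(received) - tau + 1):
        window = received[t:t + tau]
        best = None
        for walk in product((0, 1), repeat=tau):
            s, dist = state, 0
            for b, w in zip(walk, window):
                s, out = step(s, b)
                dist += sum(o ^ wi for o, wi in zip(out, w))
            if best is None or dist < best[0]:
                best = (dist, walk[0])
        state, _ = step(state, best[1])
        decoded.append(best[1])
    return decoded

# Error-free receipt of 11 10 10 01 (p41) with tau = 1 decodes 1 0 1 0
w = [(1, 1), (1, 0), (1, 0), (0, 1)]
print(exhaustive_window_decode(w, [[1, 1, 0, 1], [1, 0, 1, 1]], 1))  # [1, 0, 1, 0]
```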
p45. Convolutional Code
Catastrophic (n,1,m)CV: a code whose state diagram contains a zero-weight cycle different from the loop on the zero state.
For n = 2: gcd(g_1(x), g_2(x)) != 1 if and only if the (2,1,m)CV is catastrophic.
Example: g_1(x) = 1 + x^3 = (1+x)(1+x+x^2) and g_2(x) = x + x^2 + x^3 = x(1+x+x^2), so
gcd(g_1(x), g_2(x)) = 1 + x + x^2 != 1 and the code is catastrophic.
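The gcd test can be sketched with Euclid's algorithm over GF(2), encoding polynomials as integers with bit i as the coefficient of x^i (function name is mine):

```python
def gf2_gcd(a, b):
    """gcd of polynomials over GF(2); bit i of the int = coefficient of x^i."""
    while b:
        # reduce a modulo b: repeatedly cancel a's leading term
        while a and a.bit_length() >= b.bit_length():
            a ^= b << (a.bit_length() - b.bit_length())
        a, b = b, a
    return a

# g1 = 1 + x^3 = 0b1001, g2 = x + x^2 + x^3 = 0b1110
print(bin(gf2_gcd(0b1001, 0b1110)))  # 0b111 = 1 + x + x^2 != 1 -> catastrophic
```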
p46. Convolutional Code
The minimum distance of a convolutional code
(We consider only non-catastrophic convolutional codes.)
Example: for the (2,1,3)CV C_1, the minimum is attained by the walk 000 -> 100 -> 010 -> 001 -> 000:
d(C_1) = H_w(11 10 01 11 00 00 ...) = 6.
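This minimum (the free distance) is the cheapest walk that leaves the zero state and returns to it, where an edge costs the Hamming weight of its output. A sketch using Dijkstra's algorithm on the state diagram (function name is mine):

```python
import heapq

def free_distance(gens):
    """Minimum output weight over walks that leave the zero state and
    return to it: Dijkstra with edge weight = output Hamming weight."""
    m = len(gens[0]) - 1
    zero = (0,) * m

    def edges(s):
        for b in (0, 1):
            reg = (b,) + s
            w = sum(sum(g & x for g, x in zip(gen, reg)) % 2 for gen in gens)
            yield (b,) + s[:-1], w

    # The first step must leave the zero state; then find the cheapest way back.
    start, w0 = next(e for e in edges(zero) if e[0] != zero)
    dist = {start: w0}
    heap = [(w0, start)]
    while heap:
        d, s = heapq.heappop(heap)
        if s == zero:
            return d
        if d > dist.get(s, float('inf')):
            continue
        for ns, w in edges(s):
            if d + w < dist.get(ns, float('inf')):
                dist[ns] = d + w
                heapq.heappush(heap, (d + w, ns))
    return None

print(free_distance([[1, 1, 0, 1], [1, 0, 1, 1]]))  # 6
```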
p47. Convolutional Code
τ(e): given a non-catastrophic convolutional code C and e >= 0, define τ(e) to be the least integer x such that all walks of length x in the state diagram that immediately leave the zero state have weight greater than 2e.
p48. Convolutional Code
Theorem 8.3.4. Let C be a non-catastrophic convolutional code. For any e, if any error pattern containing at most e errors in any τ(e) consecutive steps occurs during transmission, then the exhaustive decoding algorithm using window size τ(e) will decode the received word correctly.
Example ((2,1,3)CV C_1, e = 1): the two walks of length 2 leaving the zero state have
H_w(11 10) = 3 and H_w(11 01) = 3, both > 2e = 2, so τ(1) = 2.
p49. Convolutional Code
e = 2: there is a walk of length 6 leaving the zero state with H_w(11 10 00 00 10 00) = 4 (the Hamming weight of its 6 edges), which is not greater than 2e = 4, so τ(2) > 6. Choose τ(2) = 7, since the Hamming weight of every walk of length 7 leaving the zero state is greater than 2e = 4.
p50. Convolutional Code
How many errors can be corrected?
Theorem 8.3.4 says that if we use the exhaustive decoding algorithm with window size τ(1) = 2, then all error patterns with at most e = 1 error in any τ(1) = 2 consecutive ticks will be corrected. So, for example, the error pattern e1 = 10 00 01 00 01 00 10 ... will be corrected.
Likewise, with window size τ(2) = 7, all error patterns with at most e = 2 errors in any τ(2) = 7 consecutive ticks will be corrected. So, for example, the error pattern e2 = 11 00 00 00 00 00 00 ... will be corrected.
p51. Convolutional Code
Exhaustive decoding algorithm vs truncated Viterbi decoding algorithm
Notice that the exhaustive decoding algorithm with window size τ(e) requires that we consider all walks of length τ(e) from the current state for each message digit to be decoded. Constructing all 2^τ(e) such walks at each tick is very time consuming, so we will present a faster truncated Viterbi decoding algorithm (a dynamic programming approach) in the next section.
p52. Convolutional Code
[4] Truncated Viterbi decoding
This algorithm makes only 2^m calculations and stores 2^m walks of length τ at each tick. The window size τ is chosen between 4m and 6m (a number greater than τ(e)).
For the first m ticks the decoder is still storing all walks from the zero state, each ending in a different state; t = m is the first time at which exactly one walk ends in each state. For t > m, each state s saves an optimal walk W(s;t) and its corresponding distance d(s;t). Once t >= τ, a message digit is decoded at each tick.
W(s;t) = x_0 x_1 ... x_{τ-1}: an optimal walk from the current decoded state to state s at tick t (stored as a sequence of message digits rather than a sequence of states).
d(s;t): the distance between the outputs of W(s;t) and the corresponding received words.
p53. Convolutional Code
Algorithm 8.4.1: truncated Viterbi decoding of (n,1,m) convolutional codes with window size τ.
Input: received word w = w_0 w_1 ..., each w_i consisting of n digits. Output: the "most closely" matching message.
States are written s = s_0 s_1 ... s_{m-1}.
(1) Initialization, t = 0:
(a) W(s;0) = s_0 s_1 ... s_{m-1} ** ... * (of length τ);
(b) d(s;0) = 0 if s is the zero state, ∞ otherwise.
p54. Convolutional Code
Example: (2,1,3)CV code, τ = 5 (Example 8.2.1).
W(000;0) = 000**, d(000;0) = 0
W(100;0) = 100**, d(100;0) = ∞
W(010;0) = 010**, d(010;0) = ∞
W(110;0) = 110**, d(110;0) = ∞
W(001;0) = 001**, d(001;0) = ∞
W(101;0) = 101**, d(101;0) = ∞
W(011;0) = 011**, d(011;0) = ∞
W(111;0) = 111**, d(111;0) = ∞
p55. Convolutional Code
(2) Distance calculation, t > 0. For each state s = s_0 s_1 ... s_{m-1} and i = 0, 1, let
(a) d_i = the distance between the received block w_{t-1} and the output on the directed edge from state (s_1, ..., s_{m-1}, i) to s;
(b) d(s;t) = min over i = 0, 1 of [ d((s_1, ..., s_{m-1}, i); t-1) + d_i ].
p56. Convolutional Code
Example: w = w_0 w_1 w_2 w_3 ... = 00 11 01 11 ..., (2,1,3)CV code, τ = 5 (Example 8.2.1). For s = s_0s_1s_2 = 011 at t = 1, the two candidate predecessors (s_1, s_2, i) are 110 (i = 0) and 111 (i = 1); each candidate adds to d(110;0) or d(111;0) the distance between w_0 = 00 and the output on its edge into 011.
p57. Convolutional Code
(3) Walk calculation: let s = s_0 s_1 ... s_{m-1}.
(a) If the minimum in step (2) is attained by a unique i, form W(s;t) from W((s_1, ..., s_{m-1}, i); t-1) by adding the leftmost digit of s (that is, s_0) to the left and then deleting the rightmost digit.

p58. Convolutional Code
(b) If the minimum is attained by both i and j, {i,j} = {0,1}, form W(s;t) from W((s_1, ..., s_{m-1}, i); t-1) by adding the leftmost digit of s to the left, replacing each digit that disagrees with the corresponding digit of W((s_1, ..., s_{m-1}, j); t-1) with *, and then deleting the rightmost digit.
p59. Convolutional Code
(4) Decoding: for t >= τ, let S(t) = {s : d(s;t) is minimum}. If the rightmost digit in W(s;t) is the same, say i, for all s in S(t), then decode the message digit i; otherwise decode the message digit *.
Artificial examples (τ = 7), with d(000;t) = d(100;t) = 2 minimal so S(t) = {000, 100}:
(1) W(000;t) = 0000000 and W(100;t) = 1001000 both end in 0: decode 0.
(2) W(000;t) = 0000000 ends in 0 but W(100;t) = 1001001 ends in 1: the rightmost digits disagree, so decode *.
p60. Convolutional Code
Example 8.4.2: consider the convolutional code C_1 in Example 8.2.1, with its (2,1,3) state diagram.
Received word: w = w_0 w_1 w_2 ... = 11 00 00 ..., τ = 7.

p61. Convolutional Code
t = 0: W(s;0) = s**** for every state s; d(000;0) = 0 and d(s';0) = ∞ for all states s' other than the zero state.

state  d, W(s;0)
000    0, 000****
100    ∞, 100****
010    ∞, 010****
110    ∞, 110****
001    ∞, 001****
101    ∞, 101****
011    ∞, 011****
111    ∞, 111****
p62. Convolutional Code
t = 1: w_{t-1} = w_0 = 11.
s = 000 (input 0 = s_0): predecessor 000, edge output 00, distance 2; W(000;1) is formed from W(000;0) = 000**** by prefixing 0 and deleting the rightmost digit: W(000;1) = 0000***, d(000;1) = 2.
s = 100 (input 1): predecessor 000, edge output 11, distance 0; W(100;1) = 1000***, d(100;1) = 0.
s = 010: both predecessors 100 and 101 have d = ∞ at t = 0, a tie; merging W(100;0) = 100**** and W(101;0) = 101**** gives 10*****, and prefixing 0 yields W(010;1) = 010****, d(010;1) = ∞.
p63. Convolutional Code

state  t=0           t=1
000    0, 000****    2, 0000***
100    ∞, 100****    0, 1000***
010    ∞, 010****    ∞, 010****
110    ∞, 110****    ∞, 110****
001    ∞, 001****    ∞, 001****
101    ∞, 101****    ∞, 101****
011    ∞, 011****    ∞, 011****
111    ∞, 111****    ∞, 111****
p64. Convolutional Code
t = 2, 3: w_1 = 00, w_2 = 00.

state  t=2           t=3
000    2, 00000**    2, 000000*
100    4, 10000**    4, 100000*
010    1, 01000**    5, 010000*
110    1, 11000**    5, 110000*
001    ∞, 001****    2, 001000*
101    ∞, 101****    2, 101000*
011    ∞, 011****    3, 011000*
111    ∞, 111****    1, 111000*
p65. Convolutional Code
t = 4: w_{t-1} = w_3 = 00.
s = 100 (input 1): predecessor 001; W(100;4) = 1001000 is formed from W(001;3) = 001000* by prefixing 1.
s = 010 (input 0): predecessor 101; W(010;4) = 0101000 is formed from W(101;3) = 101000* by prefixing 0.
p66. Convolutional Code

state  t=3           t=4
000    2, 000000*    2, 0000000
100    4, 100000*    2, 1001000
010    5, 010000*    3, 0101000
110    5, 110000*    3, 1101000
001    2, 001000*    4, 0011000
101    2, 101000*    4, 1011000
011    3, 011000*    1, 0111000
111    1, 111000*    3, 1111000
p67. Convolutional Code
t = 5: w_{t-1} = w_4 = 00.
s = 000 (input 0): predecessor 000 attains the minimum, so W(000;5) = 0000000 and d(000;5) = 2.
s = 100 (input 1): both predecessors 000 (W(000;4) = 0000000) and 001 (W(001;4) = 0011000) attain the minimum d = 4; the prefixed walks disagree in two positions, which are replaced by *: W(100;5) = 100**00.
p68. Convolutional Code

state  t=5           t=6           t=7
000    2, 0000000    2, 0000000    2, 0000000
100    4, 100**00    2, 1001110    4, 100****
010    3, 0100100    3, 0101110    3, 0100111
110    3, 1100100    3, 1101110    3, 1100111
001    2, 0011100    4, 001**10    4, 001*1*1
101    2, 1011100    4, 101**10    4, 101*1*1
011    3, 0111100    3, 0111010    3, 0111011
111    3, 1110100    3, 1110010    3, 1110111

t = 7: w_{t-1} = w_6 = 00. We have reached t = τ. Since d(000;7) = 2 < d(s;7) for every other state s (so S(7) = {000}), we decode the rightmost digit in W(000;7) = 0000000, namely 0.
p69. Convolutional Code

state  t=8           t=9           t=10
000    2, 0000000    2, 0000000    2, 0000000
100    4, 100****    4, 100***0    4, 100****
010    5, 010****    5, 010****    5, 010****
110    5, 110****    5, 110****    5, 110****
001    4, 001****    4, 0011101    4, 0011100
101    4, 101****    4, 1011101    4, 1011100
011    3, 0111011    3, 0111001    5, 0111***
111    3, 1110011    5, 111****    5, 1110***

Decode to: 000.