
Convolutional Codes

p2. OUTLINE
[1] Shift registers and polynomials
[2] Encoding convolutional codes
[3] Decoding convolutional codes
[4] Truncated Viterbi decoding

p3. Convolutional Code. [1] Shift registers and polynomials
Three building blocks: the XOR gate, the register, and the clock. An XOR gate outputs the modulo-2 sum of its inputs, e.g. Output = X0 + X1 + X2. A register holds a single bit Xi; at each clock tick t the stored bit is replaced by the bit on the register's input.

p4. Shift register (SR)
Figure 8.1: a 4-stage shift register with registers X0, X1, X2, X3 and output = X0 + X1 + X3. At each clock tick the input bit enters X0 and each register passes its content to the next. Example: with contents X0X1X2X3 = 1000 the output is 1+0+0 = 1; after the next tick (input 0) the contents are 0100 and the output is 0+1+0 = 1.

p5. Example: input 10110 (followed by zeros), initial contents 0000, output = X0 + X1 + X3.

time | input | X0 X1 X2 X3 | output
0 | 1 | 1 0 0 0 | 1
1 | 0 | 0 1 0 0 | 1
2 | 1 | 1 0 1 0 | 1
3 | 1 | 1 1 0 1 | 1
4 | 0 | 0 1 1 0 | 1
5 | 0 | 0 0 1 1 | 1
6 | 0 | 0 0 0 1 | 1

The output sequence is 1111111.
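The trace above can be reproduced with a short simulation; this is a sketch, and the function name is my own, not from the slides.

```python
# Simulate the 4-stage shift register of Figure 8.1 (output = X0 + X1 + X3).

def shift_register_output(bits, taps, stages):
    """Feed `bits` (followed by zeros) into a `stages`-stage shift register
    and collect the modulo-2 tap sum at each tick."""
    regs = [0] * stages
    out = []
    for b in bits + [0] * (stages - 1):
        regs = [b] + regs[:-1]          # input enters X0, contents shift right
        out.append(sum(r for r, t in zip(regs, taps) if t) % 2)
    return out

# taps 1,1,0,1 correspond to output = X0 + X1 + X3, i.e. g(x) = 1 + x + x^3
print(shift_register_output([1, 0, 1, 1, 0], [1, 1, 0, 1], 4))
# → [1, 1, 1, 1, 1, 1, 1, 0]
```

The trailing zeros flush the register so the full product a(x)g(x) appears at the output.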

p6. s-stage shift register: a shift register with s registers. X_i(t) denotes the content of register X_i at time t, and c_t the output at time t:
c_t = g_0 X_0(t) + g_1 X_1(t) + ... + g_{s-1} X_{s-1}(t), where each g_i ∈ {0,1}.
Example: 3-stage shift register with g_0 = 1, g_1 = 0, g_2 = 1. At time t the input is 1 and X_0(t) = 1, X_1(t) = 0, X_2(t) = 0, so c_t = 1·1 + 0·0 + 1·0 = 1.

p7. Generator of a shift register. g(x) is the generator of an s-stage shift register:
g(x) = g_0 + g_1 x + ... + g_{s-1} x^{s-1}, where the g_i ∈ {0,1} are the tap coefficients of the shift register.
Example: 4-stage shift register with g_0 = 1, g_1 = 1, g_2 = 0, g_3 = 1, i.e. g = g_0 g_1 ... g_{s-1} = 1101 and g(x) = 1 + x + 0·x^2 + x^3.

p8. Example: g = 1101, g(x) = 1 + x + x^3.
Input a = 10000, so a(x) = 1. Output: c_1(x) = a(x)·g(x) = 1·g(x) = 1 + x + x^3, i.e. c = 1101000...

p9. Example: g = 1101, g(x) = 1 + x + x^3.
Input a = 01000, so a(x) = x. Output: c_2(x) = a(x)·g(x) = x·g(x) = x + x^2 + x^4, i.e. c = 0110100...

p10. Example: g = 1101, g(x) = 1 + x + x^3.
Input a = 11000, so a(x) = 1 + x. Output: c(x) = c_1(x) + c_2(x) = 1·g(x) + x·g(x) = (1 + x)·g(x) = a(x)·g(x) = 1 + x^2 + x^3 + x^4, i.e. c = 1011100...

p11. Theorem. Let a(x) be the input sequence of an s-stage shift register, c(x) its output sequence, and g(x) = g_0 + g_1 x + ... + g_{s-1} x^{s-1} its generator. Then c(x) = a(x) g(x).
Example: input sequence 10100, so a(x) = 1 + x^2; generator g(x) = 1 + x + x^3. Output sequence: c(x) = a(x) g(x) = (1 + x^2)(1 + x + x^3) = 1 + x + x^2 + x^5.
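The theorem reduces the register to polynomial multiplication over GF(2); a minimal sketch checking the example (helper name is my own):

```python
# GF(2) polynomial multiplication, verifying c(x) = a(x) g(x) for
# a(x) = 1 + x^2 (input 10100) and g(x) = 1 + x + x^3.

def poly_mul_gf2(a, g):
    """Multiply two GF(2) polynomials given as coefficient lists (lowest degree first)."""
    c = [0] * (len(a) + len(g) - 1)
    for i, ai in enumerate(a):
        for j, gj in enumerate(g):
            c[i + j] ^= ai & gj
    return c

a = [1, 0, 1]        # 1 + x^2
g = [1, 1, 0, 1]     # 1 + x + x^3
print(poly_mul_gf2(a, g))   # → [1, 1, 1, 0, 0, 1], i.e. 1 + x + x^2 + x^5
```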

p12. Example: input sequence a_0, a_1, a_2, a_3, 0, 0, 0; generator 1 + x + x^3; output = X_0 + X_1 + X_3.

time | X0 X1 X2 X3 | output
0 | a0 0 0 0 | a0
1 | a1 a0 0 0 | a1 + a0
2 | a2 a1 a0 0 | a2 + a1
3 | a3 a2 a1 a0 | a3 + a2 + a0
4 | 0 a3 a2 a1 | a3 + a1
5 | 0 0 a3 a2 | a2
6 | 0 0 0 a3 | a3

Output sequence: c(x) = a(x) g(x) = (a_0 + a_1 x + a_2 x^2 + a_3 x^3)(1 + x + x^3) = a_0 + (a_1 + a_0)x + (a_2 + a_1)x^2 + (a_3 + a_2 + a_0)x^3 + (a_3 + a_1)x^4 + a_2 x^5 + a_3 x^6.

p13. s-stage feedback shift register (FSR). X_i(t) is the content of register X_i at time t, and c_t is the output at time t. The output is fed back through the taps: c_t is the content of the last register, register X_0 receives input + g_0 c_t, and register X_i (i ≥ 1) receives X_{i-1}(t) + g_i c_t.
Example: 3-stage FSR with g_0 = 1, g_1 = 1, g_2 = 0.

p14. Example: 3-stage FSR with g(x) = 1 + x + x^3, input 100000, initial contents 000.

time | input | X0 X1 X2 | output c_t
0 | 1 | 1 0 0 | 0
1 | 0 | 0 1 0 | 0
2 | 0 | 0 0 1 | 0
3 | 0 | 1 1 0 | 1
4 | 0 | 0 1 1 | 0
5 | 0 | 1 1 1 | 1

At each tick the feedback bit c_t (the content of X_2 before the shift) is added into X_0 and X_1 through the taps g_0 = g_1 = 1.

p15. Generator of a feedback shift register. g(x) is the generator of an s-stage FSR:
g(x) = g_0 + g_1 x + ... + g_{s-1} x^{s-1} + x^s, where the g_i ∈ {0,1} are the feedback tap coefficients.
Example: 3-stage feedback shift register with g_0 = 1, g_1 = 1, g_2 = 0, i.e. g = g_0 g_1 ... g_{s-1} 1 = 1101 and g(x) = 1 + x + x^3.

p16. Polynomial division. Let the input be a = a_0 a_1 a_2 ... a_n and the output c = c_0 c_1 c_2 ... c_n, and at time t define
a_t(x) = a_0 x^t + a_1 x^{t-1} + ... + a_{t-1} x + a_t,
c_t(x) = c_0 x^t + c_1 x^{t-1} + ... + c_{t-1} x + c_t,
r_t(x) = X_0(t) + X_1(t) x + ... + X_{s-1}(t) x^{s-1}, for 0 ≤ t ≤ n-1.
Then a_t(x) = c_t(x) g(x) + r_t(x); that is, c_t(x) = a_t(x) / g(x) and r_t(x) = a_t(x) mod g(x): the FSR divides by its generator, the quotient appearing at the output and the remainder held in the registers.

p17. Example: 3-stage FSR with g(x) = 1 + x + x^3, input a = 100000.
Since a_t(x) = x·a_{t-1}(x) = x·c_{t-1}(x) g(x) + x·r_{t-1}(x) for t ≥ 1:
a_0(x) = 1: c_0(x) = 0, r_0(x) = 1
a_1(x) = x: c_1(x) = 0, r_1(x) = x
a_2(x) = x^2: c_2(x) = 0, r_2(x) = x^2
a_3(x) = x^3: c_3(x) = 1, r_3(x) = 1 + x
a_4(x) = x^4: c_4(x) = x, r_4(x) = x + x^2
a_5(x) = x^5: c_5(x) = 1 + x^2, r_5(x) = 1 + x + x^2
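A division-circuit sketch reproducing this example, under the update rule described on p13 (function name and list conventions are my own):

```python
# A 3-stage feedback shift register with g(x) = 1 + x + x^3 divides the input
# polynomial by g(x): registers hold the running remainder, and the bit fed
# back at each tick is the next quotient digit.

def fsr_divide(bits, g):
    """g = [g0, g1, ..., g_{s-1}, 1] (monic). Returns (quotient bits, final registers)."""
    s = len(g) - 1
    regs = [0] * s
    quotient = []
    for b in bits:
        fb = regs[-1]                 # output / feedback = content of last register
        quotient.append(fb)
        # shift, adding the input at X0 and the feedback through the taps
        regs = [b ^ (g[0] & fb)] + [regs[i - 1] ^ (g[i] & fb) for i in range(1, s)]
    return quotient, regs

# Input 100000 represents a(x) = x^5 in the slide's convention (a_t(x) = a_0 x^t + ...).
q, r = fsr_divide([1, 0, 0, 0, 0, 0], [1, 1, 0, 1])
print(q, r)   # quotient 000101 (c_5(x) = 1 + x^2), registers [1, 1, 1] (r_5 = 1 + x + x^2)
```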

p18. (n=5, k=2, d=3) cyclic linear code with generator polynomial g(x) = 1 + x + x^3.
A 3-stage feedback shift register with generator g(x) = 1 + x + x^3 computes the syndromes r_t(x) = a_t(x) mod g(x), 0 ≤ t ≤ n-1 = 4, of an input of length 5. The columns of the parity check matrix H are x^t mod g(x) for t = 0, 1, ..., 4:

H =
1 0 0 1 0
0 1 0 1 1
0 0 1 0 1

p19. Example: g(x) = 1 + x + x^3. Input a = 10110, so a(x) = x + x^2 + x^4 (in the convention a_t(x) = a_0 x^t + ... + a_t). Feeding a through the FSR gives quotient c(x) = x and remainder r = 000, r(x) = 0: since a(x) = x·g(x) is divisible by g(x), the word 10110 is a codeword.

p20. [2] Encoding convolutional codes
An (n, k, m) convolutional code ((n,k,m)CV) is built from:
m : each shift register has m+1 stages;
n : the number of (m+1)-stage shift registers ((m+1)-SR_i), with generators g_1(x), ..., g_n(x) of degree at most m;
k : the number of message bits shifted into the registers at each tick.
The input of each (m+1)-SR_i is the message m(x); its output is c_i(x) = m(x) g_i(x), and
(n,k,m)CV = { c(x) = (c_1(x), c_2(x), ..., c_n(x)) }.

p21. The message is fed in parallel into n shift registers:
(m+1)-SR_1 with g_1(x) = g_{1,0} + g_{1,1} x + ... + g_{1,m} x^m produces c_1 = c_{1,0} c_{1,1} c_{1,2} c_{1,3} ...
(m+1)-SR_2 with g_2(x) = g_{2,0} + g_{2,1} x + ... + g_{2,m} x^m produces c_2 = c_{2,0} c_{2,1} c_{2,2} c_{2,3} ...
...
(m+1)-SR_n with g_n(x) = g_{n,0} + g_{n,1} x + ... + g_{n,m} x^m produces c_n = c_{n,0} c_{n,1} c_{n,2} c_{n,3} ...
(n,k,m)CV = { c(x) = (c_1(x), c_2(x), ..., c_n(x)) }.

p22. Example: (n=2, k=1, m=3) CV with g_1(x) = 1 + x + x^3 and g_2(x) = 1 + x^2 + x^3:
(2,1,3)CV = { c(x) = ( m(x)·(1 + x + x^3), m(x)·(1 + x^2 + x^3) ) }.

p23. Example: (2,1,3) CV with g_1(x) = 1 + x + x^3, g_2(x) = 1 + x^2 + x^3.
(a) The message m(x) = 1 + x^2 is encoded to
c(x) = ((1 + x^2) g_1(x), (1 + x^2) g_2(x)) = (1 + x + x^2 + x^5, 1 + x^3 + x^4 + x^5).
(b) The (infinite) message m(x) = 1 + x + x^2 + x^3 + ... = 1/(1+x) is encoded to
c(x) = (1 + x^3 + x^4 + x^5 + ..., 1 + x + x^3 + x^4 + x^5 + ...).
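Part (a) can be checked with the polynomial-multiplication view c_i(x) = m(x) g_i(x); a sketch with my own helper name:

```python
# (2,1,3) convolutional encoder as two GF(2) polynomial products.

def poly_mul_gf2(a, b):
    c = [0] * (len(a) + len(b) - 1)
    for i, ai in enumerate(a):
        for j, bj in enumerate(b):
            c[i + j] ^= ai & bj
    return c

g1 = [1, 1, 0, 1]    # 1 + x + x^3
g2 = [1, 0, 1, 1]    # 1 + x^2 + x^3
m  = [1, 0, 1]       # 1 + x^2

c1 = poly_mul_gf2(m, g1)
c2 = poly_mul_gf2(m, g2)
print(c1)   # → [1, 1, 1, 0, 0, 1]  = 1 + x + x^2 + x^5
print(c2)   # → [1, 0, 0, 1, 1, 1]  = 1 + x^3 + x^4 + x^5
```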

p24. Convolutional codes are linear codes. For two codewords c(x), c'(x) of an (n,k,m) convolutional code:
c(x) + c'(x) = (c_1(x), ..., c_n(x)) + (c_1'(x), ..., c_n'(x))
= (m(x) g_1(x), ..., m(x) g_n(x)) + (m'(x) g_1(x), ..., m'(x) g_n(x))
= ((m(x) + m'(x)) g_1(x), ..., (m(x) + m'(x)) g_n(x))
= (m''(x) g_1(x), ..., m''(x) g_n(x)) = (c_1''(x), ..., c_n''(x)) = c''(x),
which is again a codeword.

p25. Interleaved form of an (n,k,m)CV. With c_i(x) = c_{i,0} + c_{i,1} x + c_{i,2} x^2 + c_{i,3} x^3 + ..., 1 ≤ i ≤ n, the interleaved form of c(x) = (c_1(x), ..., c_n(x)) is
c = c_{1,0} c_{2,0} ... c_{n,0}, c_{1,1} c_{2,1} ... c_{n,1}, c_{1,2} c_{2,2} ... c_{n,2}, ...
or, as a single polynomial,
c(x) = c_1(x^n) + c_2(x^n)·x + c_3(x^n)·x^2 + ... + c_n(x^n)·x^{n-1}.

p26. Example: (2,1,3) CV. The message m(x) = 1 + x^2 is encoded to
c(x) = (c_1(x), c_2(x)) = ((1 + x^2) g_1(x), (1 + x^2) g_2(x)) = (1 + x + x^2 + x^5, 1 + x^3 + x^4 + x^5), i.e. (111001..., 100111...).
The interleaved representations of c and c(x) are
c = 111010010111...
c(x) = c_1(x^2) + c_2(x^2)·x = 1 + x^2 + x^4 + x^10 + (1 + x^6 + x^8 + x^10)x = 1 + x + x^2 + x^4 + x^7 + x^9 + x^10 + x^11.
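Interleaving is just an alternating merge of the two output streams; a minimal sketch continuing this example (variable names are my own):

```python
# Interleaved form: c = c1,0 c2,0 c1,1 c2,1 ...

c1 = [1, 1, 1, 0, 0, 1]    # 1 + x + x^2 + x^5
c2 = [1, 0, 0, 1, 1, 1]    # 1 + x^3 + x^4 + x^5

interleaved = [bit for pair in zip(c1, c2) for bit in pair]
print(''.join(map(str, interleaved)))   # → 111010010111
```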

p27. The rate of an (n,k,m) convolutional code is defined to be k/n.
Convolutional codes with k > 1. Example: (3,2,3) CV with g_1(x) = 1 + x^3, g_2(x) = 1 + x + x^3, g_3(x) = x + x^2 + x^3. Two message bits are shifted into the 4-stage register at each tick, and the three outputs c_1, c_2, c_3 are read off through the three tap sets.

p28. Property. Example: a (1,2,3) CV. The message bits a_0 a_1 a_2 a_3 are shifted two at a time into a 4-stage register with g(x) = 1 + x + x^3; the successive outputs are a_1 + a_0, a_3 + a_2 + a_0, and a_2.

p29. The same (1,2,3) CV with g(x) = 1 + x + x^3 can be decomposed into two registers: the odd-position bits a_1, a_3 feed a register with g_1(x) = 1, and the even-position bits a_0, a_2 feed a register with g_2(x) = 1 + x:
c_1(x) = (a_1 + a_3 x)·1
c_2(x) = (a_0 + a_2 x)(1 + x) = a_0 + (a_2 + a_0) x + a_2 x^2
c(x) = c_1(x) + c_2(x) = (a_1 + a_0) + (a_3 + a_2 + a_0) x + a_2 x^2.

p30. Generator matrix of a (2,1,m)CV. With generators g_1(x) = g_{1,0} + g_{1,1} x + ... + g_{1,m} x^m and g_2(x) = g_{2,0} + g_{2,1} x + ... + g_{2,m} x^m, input sequence m = m_0 m_1 m_2 ... and output c = c_{1,0} c_{2,0} c_{1,1} c_{2,1} ... (interleaved form), there is a generator matrix G with
c = m·G, i.e. (2,1,m)CV = { c | c = m·G }.

p31. (1) Input m = m_0. Output c = c_{1,0} c_{2,0} with
c_{1,0} = m_0·g_{1,0} and c_{2,0} = m_0·g_{2,0}, so c = [c_{1,0} c_{2,0}] = m_0 [g_{1,0} g_{2,0}].

p32. (2) Input m = m_0 m_1. Output c = c_{1,0} c_{2,0} c_{1,1} c_{2,1} with
c_{1,0} = m_0·g_{1,0}, c_{2,0} = m_0·g_{2,0},
c_{1,1} = m_0·g_{1,1} + m_1·g_{1,0}, c_{2,1} = m_0·g_{2,1} + m_1·g_{2,0}.

p33. (3) Input m = m_0 m_1 m_2 ... m_e. Output c = c_{1,0} c_{2,0} c_{1,1} c_{2,1} c_{1,2} c_{2,2} ..., where G is the matrix whose i-th row (i = 0, 1, ..., e) is the interleaved tap sequence
g_{1,0} g_{2,0} g_{1,1} g_{2,1} ... g_{1,m} g_{2,m}
shifted right by 2i positions, with zeros elsewhere.

p34. Example: (2,1,3) CV with generators g_1(x) = 1 + x + x^3, g_2(x) = 1 + x^2 + x^3. The interleaved tap sequence is 11 10 01 11, so for a message m of length 6 the generator matrix is the 6×12 matrix G whose i-th row is 11100111 shifted right by 2i (truncated to 12 columns), and c = m·G.
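The matrix construction can be checked against the polynomial encoding; a sketch for a length-3 message (no truncation needed, so G is 3×12; function names are my own):

```python
# Generator-matrix encoding for the (2,1,3) CV: row i of G is the interleaved
# tap pattern g1,0 g2,0 ... g1,3 g2,3 shifted right by 2i.

def gen_matrix(g1, g2, k):
    row = [b for pair in zip(g1, g2) for b in pair]   # 11 10 01 11 for this code
    return [[0] * (2 * i) + row + [0] * (2 * (k - 1 - i)) for i in range(k)]

G = gen_matrix([1, 1, 0, 1], [1, 0, 1, 1], k=3)
msg = [1, 0, 1]                       # m(x) = 1 + x^2
c = [sum(mi & gij for mi, gij in zip(msg, col)) % 2 for col in zip(*G)]
print(''.join(map(str, c)))           # → 111010010111
```

The result agrees with the interleaved codeword of m(x) = 1 + x^2 computed on p26.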

p35. State diagram of an (n,1,m) convolutional code. The state is the contents of the first m registers of the shift register: s = s_0 s_1 ... s_{m-1}. The zero state is the state in which each of these m registers contains 0.
At time t the state is s_0 s_1 ... s_{m-1}; input 0 or 1 moves the encoder to state 0 s_0 ... s_{m-2} or 1 s_0 ... s_{m-2} at time t+1. Conversely, the state at time t-1 was s_1 s_2 ... s_{m-1} 0 or s_1 s_2 ... s_{m-1} 1, the input at time t-1 being s_0.

p36. Example: (2,1,3)CV with g_1(x) = 1 + x + x^3, g_2(x) = 1 + x^2 + x^3. The state diagram has 8 states s_0 s_1 s_2. From state s_0 s_1 s_2, input i moves to state i s_0 s_1 along an edge labelled with the output c_1 c_2, where c_1 = i + s_0 + s_2 and c_2 = i + s_1 + s_2. For example, from the zero state 000, input 1 leads to state 100 with output 11, and input 0 loops on 000 with output 00.

p37. Example: encoding a convolutional code by the state diagram. Input m = 101 to the (2,1,3)CV: starting at state 000, the walk is
000 →(input 1) 100 →(input 0) 010 →(input 1) 101,
with edge outputs 11, 10, 10, so the first six output digits are 111010.

p38. Tabular form. A state diagram of an (n,1,m)CV can also be represented as a table: one row per state s_0 s_1 ... s_{m-1}, giving for input 0 and for input 1 the next state and the output c_1 ... c_n. For the (2,1,3)CV, the row for state s_0 s_1 s_2 contains next state 0 s_0 s_1 with output (s_0 + s_2)(s_1 + s_2) under input 0, and next state 1 s_0 s_1 with output (1 + s_0 + s_2)(1 + s_1 + s_2) under input 1.
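The transition rule can be coded directly and used to redo the walk of p37; a sketch with my own helper names:

```python
# State transitions of the (2,1,3) CV: state s0s1s2 is the content of the first
# three registers; input i moves to state (i, s0, s1) with output c1c2.

G1, G2 = (1, 1, 0, 1), (1, 0, 1, 1)   # taps of g1(x), g2(x) on X0..X3

def step(state, i):
    x = (i,) + state                                   # X0..X3 at this tick
    c1 = sum(a & b for a, b in zip(x, G1)) % 2
    c2 = sum(a & b for a, b in zip(x, G2)) % 2
    return (i,) + state[:2], (c1, c2)                  # next state, output

# Encode m = 101 starting from the zero state, as on the slide above.
state, out = (0, 0, 0), []
for bit in [1, 0, 1]:
    state, (c1, c2) = step(state, bit)
    out += [c1, c2]
print(''.join(map(str, out)))   # → 111010
```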

p39. [3] Decoding convolutional codes
Idea of decoding convolutional codes. Consider the code C_1 from the example above and suppose the received word is w. There may be no directed walk in the state diagram whose output is w. We are then faced with finding a codeword that "most likely" fits w, i.e. one at minimum Hamming distance from w.
encoder of convolutional code → noisy channel → decoder of convolutional code

p40. Window size τ: the amount of the received word w we "see" when making each decoding decision.
Hamming distance: H_d(w1, w2). Hamming weight: H_w(w1).

p41. Exhaustive decoding algorithm (window size τ = 1).
Input: the received word w; output: the decoded message. At each tick, consider the walks of length 1 (the two outgoing edges) from the current state, and follow the edge whose output is closest in Hamming distance to the next n received digits; the input label of that edge is the decoded message digit. The example decodes a received word along a walk of length 4, with decisions (1), (2), (3), (4), one per tick.

p42. Example (τ = 1). Input: the received word w; output: the "most likely" decoded message.
(1): walk the closest edge and decode message digit 1.
(2): both outgoing edges are equally close to the received digits; the tie is broken by choosing one at random (say, the edge into state 110).
(3): walk the closest edge and decode message digit 1.
(4): walk the closest edge and decode message digit 0.
With τ = 1 such ties force random choices, so a larger window is needed for reliable decoding.

p43. Exhaustive decoding algorithm (window size τ = 2).
Input: received word w = 11 00 00 00 ... and window size τ = 2; output: the "most closely" fitting message.
(tick 0) Start at state 000 of the state diagram of the example.
(tick 1) We see w = 11 00. The four walks of length 2 from 000:

walk | output | distance from 11 00
000, 000, 000 | 00 00 | 2
000, 000, 100 | 00 11 | 4
000, 100, 010 | 11 10 | 1
000, 100, 110 | 11 01 | 1

We make the decoding decision to move to state 100 and decode the first message digit as 1.

p44. (tick 2) We see w = 00 00. The four walks of length 2 from 100:

walk | output | distance from 00 00
100, 010, 001 | 10 01 | 2
100, 010, 101 | 10 10 | 2
100, 110, 011 | 01 11 | 3
100, 110, 111 | 01 00 | 1

We make the decoding decision to move to state 110 and decode the next message digit as 1.
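One such decision step can be sketched as an enumeration of all 2^τ walks; the function name is my own, and the transition rule is the one used throughout for this (2,1,3) CV:

```python
# One exhaustive-decoding decision with window size τ = 2: enumerate all walks
# of length 2 from the current state and compare outputs with 4 received digits.

from itertools import product

G1, G2 = (1, 1, 0, 1), (1, 0, 1, 1)

def step(state, i):
    x = (i,) + state
    out = tuple(sum(a & b for a, b in zip(x, g)) % 2 for g in (G1, G2))
    return (i,) + state[:2], out

def best_first_digit(state, received):
    walks = []
    for bits in product((0, 1), repeat=2):
        s, out = state, []
        for b in bits:
            s, o = step(s, b)
            out += o
        dist = sum(a != b for a, b in zip(out, received))
        walks.append((dist, bits[0]))
    return min(walks)[1]          # first message digit of a closest walk

print(best_first_digit((0, 0, 0), (1, 1, 0, 0)))   # → 1, as decided at tick 1
```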

p45. Catastrophic (n,1,m) CV: a code whose state diagram contains a zero-weight cycle different from the loop on the zero state. For n = 2: gcd(g_1(x), g_2(x)) ≠ 1 if and only if the (2,1,m) CV is catastrophic.
Example: g_1(x) = 1 + x^3 = (1 + x)(1 + x + x^2) and g_2(x) = x + x^2 + x^3 = x(1 + x + x^2), so gcd(g_1(x), g_2(x)) = 1 + x + x^2 ≠ 1 and the code is catastrophic.

p46. The minimum distance of a convolutional code. We consider only non-catastrophic convolutional codes.
Example: for the (2,1,3) CV = C_1, the minimum-weight nonzero codeword is the encoding of m(x) = 1, so
d(C_1) = H_w(11 10 01 11 00 ...) = 6.

p47. τ(e). Given a non-catastrophic convolutional code C, for e ≥ 0 define τ(e) to be the least integer x such that all walks of length x in the state diagram that immediately leave the zero state have weight greater than 2e.

p48. Theorem. Let C be a non-catastrophic convolutional code. For any e, if any error pattern containing at most e errors in any τ(e) consecutive steps occurs during transmission, then the exhaustive decoding algorithm using window size τ(e) will decode the received word correctly.
Example: (2,1,3) CV = C_1, e = 1: the walks of length 2 leaving the zero state have outputs of weight H_w(11 10) = 3 and H_w(11 01) = 3, both > 2e = 2, so τ(1) = 2.

p49. e = 2: the walk 000→100→110→111→011→001→100 of length 6 has output 11 01 00 00 10 00 of weight H_w = 4 (the Hamming weight on the 6 red edges), which is not greater than 2e = 4. Choose τ(2) = 7, since the Hamming weight of any walk of length 7 that immediately leaves the zero state is > 2e = 4.
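τ(e) can be found by dynamic programming over the state diagram: track, for each length, the minimum output weight of walks whose first edge leaves the zero state. A sketch for this (2,1,3) CV (names are my own):

```python
# Minimum output weight of length-L walks whose first edge is 000 -> 100.

G1, G2 = (1, 1, 0, 1), (1, 0, 1, 1)

def step(state, i):
    x = (i,) + state
    out = sum(sum(a & b for a, b in zip(x, g)) % 2 for g in (G1, G2))
    return (i,) + state[:2], out      # next state, weight of the edge output

def min_weight(length):
    start, w0 = step((0, 0, 0), 1)    # the edge that immediately leaves the zero state
    best = {start: w0}
    for _ in range(length - 1):
        nxt = {}
        for s, w in best.items():
            for i in (0, 1):
                t, dw = step(s, i)
                nxt[t] = min(nxt.get(t, 99), w + dw)
        best = nxt
    return min(best.values())

print([min_weight(L) for L in range(1, 8)])   # → [2, 3, 3, 3, 4, 4, 5]
```

The first length with minimum weight > 2 is 2, so τ(1) = 2; the first with minimum weight > 4 is 7, so τ(2) = 7, matching the slides.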

p50. How many errors can be corrected? The Theorem says that if we use the exhaustive decoding algorithm with window size τ(1) = 2, then all error patterns with at most e = 1 error in any τ(1) = 2 consecutive ticks will be corrected; so, for example, the error pattern e1 = … will be corrected. Likewise, with window size τ(2) = 7, all error patterns with at most e = 2 errors in any τ(2) = 7 consecutive ticks will be corrected; so, for example, the error pattern e2 = … will be corrected.

p51. Exhaustive decoding algorithm vs truncated Viterbi decoding algorithm. Notice that the exhaustive decoding algorithm with window size τ(e) requires that we consider all walks of length τ(e) from the current state for each message digit to be decoded. Constructing all 2^τ(e) such walks at each tick is very time-consuming, so we will present a faster truncated Viterbi decoding algorithm (a dynamic programming approach) in the next section.

p52. [4] Truncated Viterbi decoding
This algorithm makes only 2^m calculations and stores 2^m walks of length τ at each tick. The window size τ is chosen between 4m and 6m (a number greater than τ(e)). For the first m ticks the decoder is still storing all walks from the zero state, each ending in a different state; t = m is the first time at which exactly one walk ends in each state. For t > m, each state s stores an optimal walk W(s;t) and its corresponding distance d(s;t). Once t ≥ τ, a message digit is decoded at each tick.
W(s;t) = x_0 x_1 ... x_{τ-1}: an optimal walk ending at state s at tick t, stored as a sequence of message digits rather than a sequence of states.
d(s;t): the distance between the outputs along W(s;t) and the corresponding received words.

p53. Algorithm: truncated Viterbi decoding of (n,1,m) convolutional codes with window size τ.
Input: received word w = w_0 w_1 ..., each w_i consisting of n digits. Output: the "most closely" fitting message. States: s = s_0 s_1 ... s_{m-1}.
(1) Initialization, t = 0:
(a) W(s;0) = s_0 s_1 ... s_{m-1} ﹡﹡...﹡ (of length τ);
(b) d(s;0) = 0 if s is the zero state, d(s;0) = ∞ otherwise.

p54. Example: (2,1,3) CV code, τ = 5 (as in Example 8.2.1):
W(000;0) = 000**, d(000;0) = 0
W(100;0) = 100**, d(100;0) = ∞
W(010;0) = 010**, d(010;0) = ∞
W(110;0) = 110**, d(110;0) = ∞
W(001;0) = 001**, d(001;0) = ∞
W(101;0) = 101**, d(101;0) = ∞
W(011;0) = 011**, d(011;0) = ∞
W(111;0) = 111**, d(111;0) = ∞

p55. (2) Distance calculation, t > 0. Each state s = s_0 s_1 ... s_{m-1} has two predecessors, (s_1, ..., s_{m-1}, i) for i = 0, 1. For each i define
(a) d_i = d((s_1, ..., s_{m-1}, i); t-1) + the distance between the received digits w_{t-1} and the output on the directed edge from state (s_1, ..., s_{m-1}, i) to s;
(b) d(s;t) = min(d_0, d_1).

p56. Example: (2,1,3) CV code, τ = 5, received word w = w_0 w_1 w_2 w_3 ... = 11 00 00 00 .... For s = s_0 s_1 s_2 = 011 at t = 1, the two predecessors are (s_1, s_2, i) = 110 (i = 0) and 111 (i = 1); for each, d_i adds the distance from w_0 = 11 to the output on the edge into 011.

p57. (3) Walk calculation. S = s_0 s_1 ... s_{m-1}.
(a) If d(s;t) is attained by exactly one predecessor (s_1, ..., s_{m-1}, i), form W(s;t) from W((s_1, ..., s_{m-1}, i); t-1) by adding the leftmost digit of s to the left of it and then deleting the rightmost digit.

p58. (b) If d(s;t) = d_0 = d_1 (a tie, {i, j} = {0, 1}), form W(s;t) from both predecessor walks: add the leftmost digit of s to the left of each, delete the rightmost digit, and replace each digit in which the two walks disagree with ﹡.

p59. (4) Decoding. For t ≥ τ, let S(t) = { s : d(s;t) is minimal }. If the rightmost digit in W(s;t) is the same, say i, for all s ∈ S(t), then decode the message digit i; otherwise decode the message digit ﹡.
Artificial examples (τ = 7):
(1) S(t) = {000, 100} and the stored walks W(000;t), W(100;t) end in the same digit: decode that digit.
(2) S(t) = {000, 100} but the walks end in different digits: decode ﹡.

p60. Example: consider the convolutional code C_1 with the state diagram of the (2,1,3) convolutional code above. Received word: w = w_0 w_1 w_2 ... = 11 00 00 00 00 ..., window size τ = 7.

p61. t = 0: W(s;0) = s﹡﹡﹡﹡ for every state s; d(000;0) = 0 and d(s';0) = ∞ for all states s' other than the zero state. The entries d(s;0), W(s;0), for s = 000, 100, 010, 110, 001, 101, 011, 111:
0, 000****
∞, 100****
∞, 010****
∞, 110****
∞, 001****
∞, 101****
∞, 011****
∞, 111****

p62. t = 1: w_{t-1} = w_0 = 11.
s = 000: predecessors 000 and 001; only d(000;0) is finite, and the edge 000→000 has output 00 at distance 2 from 11, so d(000;1) = 2 and W(000;1) = 0000*** (prepend the leftmost digit 0 of s to W(000;0) = 000**** and drop the rightmost digit).
s = 100: predecessor 000 via input 1; the edge 000→100 has output 11 at distance 0, so d(100;1) = 0 and W(100;1) = 1000***.
All other states still have distance ∞.

p63. (States in the order 000, 100, 010, 110, 001, 101, 011, 111.)
t = 0: 0, 000**** / ∞, 100**** / ∞, 010**** / ∞, 110**** / ∞, 001**** / ∞, 101**** / ∞, 011**** / ∞, 111****
t = 1: 2, 0000*** / 0, 1000*** / ∞, 010**** / ∞, 110**** / ∞, 001**** / ∞, 101**** / ∞, 011**** / ∞, 111****

p64. t = 2, 3: w_1 = 00, w_2 = 00.
t = 2: 2, 00000** / 4, 10000** / 1, 01000** / 1, 11000** / ∞, 001**** / ∞, 101**** / ∞, 011**** / ∞, 111****
t = 3: 2, 000000* / 4, 100000* / 5, 010000* / 5, 110000* / 2, 001000* / 2, 101000* / 3, 011000* / 1, 111000*

p65. t = 4: w_{t-1} = w_3 = 00.
s = 100: the better predecessor is 001 (input digit s_0 = 1): d(001;3) = 2 and the edge 001→100 has output 00 at distance 0, so d(100;4) = 2 and W(100;4) = 1001000.
s = 010: the better predecessor is 101 (input digit s_0 = 0): d(101;3) = 2 and the edge 101→010 has output 01 at distance 1, so d(010;4) = 3 and W(010;4) = 0101000.

p66.
t = 3: 2, 000000* / 4, 100000* / 5, 010000* / 5, 110000* / 2, 001000* / 2, 101000* / 3, 011000* / 1, 111000*
t = 4: 2, 0000000 / 2, 1001000 / 3, 0101000 / 3, 1101000 / 4, 0011000 / 4, 1011000 / 1, 0111000 / 3, 1111000

p67. t = 5: w_{t-1} = w_4 = 00.
s = 000: predecessor 000: d(000;5) = 2 + 0 = 2, W(000;5) = 0000000.
s = 100: a tie: predecessor 000 gives 2 + 2 = 4 (edge output 11) and predecessor 001 gives 4 + 0 = 4 (edge output 00). Prepending 1 to W(000;4) = 0000000 and to W(001;4) = 0011000 and dropping the rightmost digit gives 1000000 and 1001100; replacing the digits in which they disagree with ﹡ yields W(100;5) = 100**00, d(100;5) = 4.

p68.
t = 5: 2, 0000000 / 4, 100**00 / 3, 0100100 / 3, 1100100 / 2, 0011100 / 2, 1011100 / 3, 0111100 / 3, 1111100
t = 6: 2, 0000000 / 2, 1001110 / 3, 0101110 / 3, 1101110 / 4, 001**10 / 4, 101**10 / 3, 0111110 / 3, 1111110
t = 7: 2, 0000000 / 4, 100**** / 3, 0100111 / 3, 1100111 / 4, 001*111 / 4, 101*111 / 3, 0111111 / 3, 1111111
t = 7: w_{t-1} = w_6 = 00. We have reached t = τ. Since d(000;7) = 2 < d(s;7) for all other states s (so S(7) = {000}), we decode the rightmost digit in W(000;7) = 0000000, namely 0.

p69. Continuing:
t = 8: 2, 0000000 / 4, 100**** / 5, 010**** / 5, 110**** / 4, 001**11 / 4, 101**11 / 3, 0111111 / 3, 1111111
t = 9: 2, 0000000 / 4, 100**** / 5, 010**** / 5, 110**** / 4, 0011111 / 4, 1011111 / 3, 0111111 / 5, 111**11
t = 10: 2, 0000000 / 4, 100**** / 5, 010**** / 5, 110**** / 4, 0011111 / 4, 1011111 / 5, 0111**1 / 5, 1110***
At each of these ticks S(t) = {000}, so the rightmost digit of W(000;t) = 0000000 is decoded. Decode to: 000.