
1 IV. Convolutional Codes

2 Introduction
Block codes produce code words on a block-by-block basis: the encoder must buffer an entire message block before generating the associated codeword. Many applications, however, deliver bits serially rather than in large blocks. Convolutional codes therefore operate on the incoming message sequence continuously, in a serial manner.

3 Convolutional Codes Specification
A convolutional code is specified by three parameters (n, k, K): k/n is the code rate, which gives the number of data bits carried per coded bit, and K is the constraint length of the encoder, which has K-1 memory elements.

4-7 Convolutional Encoder: Example
Rate ½ Convolutional Encoder
[Figure, animated across slides 4-7: a rate ½ convolutional encoder drawn as a shift register with two memory elements feeding two modulo-2 adders, whose outputs are c2 (upper) and c1 (lower). The input sequence 1 0 1 is shifted in one bit per slide, and the corresponding output bits appear at the right.]
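The animation above can be reproduced in a few lines. A minimal sketch, assuming the two memory elements hold the state (b0, b1) and the adders compute c1 = u ⊕ b1 and c2 = u ⊕ b0 ⊕ b1; these tap assignments are not fully recoverable from the transcript, but they are the ones consistent with the metric values in the decoding example later in the deck:

```python
def conv_encode(bits):
    """Rate-1/2, K=3 convolutional encoder (assumed taps).

    State is the register contents (b0, b1); for each input bit u the
    two modulo-2 adders emit c1 = u ^ b1 and c2 = u ^ b0 ^ b1.
    """
    b0, b1 = 0, 0
    out = []
    for u in bits:
        out += [u ^ b1, u ^ b0 ^ b1]  # c1, c2 for this input bit
        b0, b1 = u, b0                # shift the register
    return out

# The deck's example input 1 0 1:
print(conv_encode([1, 0, 1]))  # [1, 1, 0, 1, 0, 0], i.e. pairs 11 01 00
```

Read serially this is 110100; it matches the deck's stated output 001011 if the slide writes the stream with the first transmitted bit on the right, which is one plausible reading of the figure.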

8 State Diagram Representation
[Figure: state diagram of the encoder. The four states are labeled by the register contents (b0 b1): s0 = 00, s1 = 10, s2 = 01, s3 = 11. Each state has two outgoing branches labeled input/output (the eight labels 0/00, 1/11, 0/01, 1/10, 0/11, 1/00, 0/10, 1/01 appear); one line style marks input 0 and another input 1.]
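The diagram can be generated mechanically from the encoder. A sketch, under the same assumed taps (c1 = u ⊕ b1, c2 = u ⊕ b0 ⊕ b1), that enumerates every input/output branch; the eight labels it produces are the ones visible on the slide:

```python
def state_table():
    """Enumerate all transitions of the assumed rate-1/2, K=3 encoder.

    Returns {(state, input): (next_state, output)} with states as
    (b0, b1) tuples and outputs as 'c1c2' strings.
    """
    table = {}
    for b0 in (0, 1):
        for b1 in (0, 1):
            for u in (0, 1):
                c1, c2 = u ^ b1, u ^ b0 ^ b1   # assumed adder taps
                table[((b0, b1), u)] = ((u, b0), f"{c1}{c2}")
    return table

for (state, u), (nxt, out) in sorted(state_table().items()):
    print(f"{state} --{u}/{out}--> {nxt}")
```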

9 Trellis Representation
[Figure: trellis diagram obtained by unrolling the state diagram over time. The four rows are the states s0 (0 0), s1 (1 0), s2 (0 1), s3 (1 1); each stage repeats the branches of the state diagram with their output labels.]

10 Trellis Representation
Input: 101. Output: 001011.
[Figure: the same trellis with the path taken for input 101 highlighted; reading off the branch output labels along the path gives the output sequence.]
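Walking the highlighted path is just repeated table lookup. A sketch using the state names from the slides (s0 = 00, s1 = 10, s2 = 01, s3 = 11); the branch labels below are a reconstruction consistent with the metric values in the decoding example, since the diagram itself does not survive in this transcript:

```python
# Transition table of the assumed encoder: (state, input) -> (next, output).
TRELLIS = {
    ("s0", 0): ("s0", "00"), ("s0", 1): ("s1", "11"),
    ("s1", 0): ("s2", "01"), ("s1", 1): ("s3", "10"),
    ("s2", 0): ("s0", "11"), ("s2", 1): ("s1", "00"),
    ("s3", 0): ("s2", "10"), ("s3", 1): ("s3", "01"),
}

def trace(bits, start="s0"):
    """Follow a path through the trellis; return visited states and outputs."""
    state, states, outs = start, [start], []
    for u in bits:
        state, out = TRELLIS[(state, u)]
        states.append(state)
        outs.append(out)
    return states, outs

print(trace([1, 0, 1]))
```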

11 Maximum Likelihood Decoding
Which transmitted sequence is most likely to have produced the sequence observed at the decoder? Viterbi decoding of convolutional codes is a maximum likelihood decoding algorithm: it finds the codeword closest to a given received sequence.
Hard decision: closest means minimum Hamming distance.
Soft decision: closest also accounts for the reliability of each bit decision.

12 Viterbi Decoding: Hard Decision Example
Hard decision: the receiver makes a firm decision on whether a one or a zero was received. It provides the decoder with no information characterizing the reliability of that decision; the input to the decoder is only zeros and ones.
[Figure: the state diagram from slide 8, repeated for reference.]
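The distinction can be made concrete. A sketch with hypothetical noisy BPSK samples (0 mapped to +1, 1 mapped to -1): a hard-decision front end keeps only the sign of each sample, while a soft-decision front end would pass the amplitudes on to the decoder as reliability information:

```python
samples = [+0.9, -1.1, -0.2, +0.4, -0.8, +1.0]  # hypothetical channel outputs

# Hard decision: slice each sample to a single bit; all reliability
# information (how far the sample sat from the threshold) is discarded.
hard_bits = [0 if s > 0 else 1 for s in samples]
print(hard_bits)  # [0, 1, 1, 0, 1, 0]

# A soft-decision decoder would instead be handed `samples` (or quantized
# versions of them), so a marginal sample like -0.2 counts for less.
```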

13 Viterbi Decoder Hard Decision
Assume the received (hard-decision) vector is 10 11 01 00 11 10.
[Trellis figure, stage 1: from the initial state s0, the two branch metrics against the first received pair 10 are computed: distance 1 along the 00 branch into s0 and distance 1 along the 11 branch into s1.]

14 Viterbi Decoder Hard Decision
[Trellis figure, stage 2: accumulated path metrics after the second received pair 11 are 3 at s0, 1 at s1, 2 at s2, 2 at s3.]

15 Viterbi Decoder Hard Decision
[Trellis figure, stage 3: two paths now merge into every state; both candidate metrics are shown before the larger of each pair is eliminated.]

16 Viterbi Decoder Hard Decision
[Trellis figure, stage 3 survivors: only the smaller metric into each state is kept: 3 at s0, 3 at s1, 1 at s2, 2 at s3.]

17 Viterbi Decoder Hard Decision
[Trellis figure, stage 4: both candidate metrics into each state are computed against the fourth received pair, 00.]

18 Viterbi Decoder Hard Decision
Two equivalent paths: where two merging paths carry the same metric, eliminate one of them at random.
[Trellis figure, stage 4: two paths with equal metric merge into one state; one of them is discarded arbitrarily.]

19 Viterbi Decoder Hard Decision
[Trellis figure, stage 4 survivors: 3 at s0, 1 at s1, 3 at s2, 3 at s3.]

20 Viterbi Decoder Hard Decision
[Trellis figure, stage 5: both candidate metrics into each state are computed against the fifth received pair, 11.]

21 Viterbi Decoder Hard Decision
[Trellis figure, stage 5 survivors: 3 at s0, 3 at s1, 2 at s2, 2 at s3.]

22 Viterbi Decoder Hard Decision
[Trellis figure, stage 6: both candidate metrics into each state are computed against the final received pair, 10.]

23 Viterbi Decoder Hard Decision
[Trellis figure, stage 6 survivors: 3 at s0, 3 at s1, 2 at s2, 3 at s3.]

24 Viterbi Decoder Hard Decision
Message ( ) Decoded vector is
[Trellis figure: the survivor path with the smallest final metric is traced back through the trellis; reading off its output labels gives the decoded vector, and its input labels give the message.]
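The whole add-compare-select procedure walked through in slides 13-24 fits in a few lines. A sketch of a hard-decision Viterbi decoder for the same assumed code (taps c1 = u ⊕ b1, c2 = u ⊕ b0 ⊕ b1), run on the deck's received vector 10 11 01 00 11 10:

```python
def viterbi_decode(rx_pairs):
    """Hard-decision Viterbi decoding of the assumed rate-1/2, K=3 code.

    rx_pairs: list of 2-character bit strings.
    Returns (decoded message bits, best final path metric).
    """
    states = [(b0, b1) for b0 in (0, 1) for b1 in (0, 1)]
    # Start in the all-zero state; others are unreachable (infinite metric).
    metric = {s: 0 if s == (0, 0) else float("inf") for s in states}
    history = []  # per stage: next_state -> (previous state, input bit)

    for r in rx_pairs:
        new_metric, back = {}, {}
        for (b0, b1) in states:
            for u in (0, 1):
                nxt = (u, b0)
                out = f"{u ^ b1}{u ^ b0 ^ b1}"            # branch output c1 c2
                d = sum(a != b for a, b in zip(out, r))    # Hamming distance
                cand = metric[(b0, b1)] + d
                # Add-compare-select: keep the smaller metric into nxt
                # (ties broken by keeping the first path found).
                if nxt not in new_metric or cand < new_metric[nxt]:
                    new_metric[nxt], back[nxt] = cand, ((b0, b1), u)
        history.append(back)
        metric = new_metric

    # Trace back from the state with the smallest final metric.
    state = min(metric, key=metric.get)
    best = metric[state]
    bits = []
    for back in reversed(history):
        state, u = back[state]
        bits.append(u)
    return bits[::-1], best

rx = "10 11 01 00 11 10".split()
print(viterbi_decode(rx))  # ([0, 1, 0, 1, 1, 0], 2)
```

Re-encoding the decoded message reproduces a codeword at Hamming distance 2 from the received vector, matching the surviving stage metrics on the slides.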

