1 Convolutional Codes
An (n,k,m) convolutional encoder encodes a k-bit input block into an n-bit output block, which depends on the current input block and on the m preceding input blocks.
History:
- Elias (1955): introduced the codes
- Wozencraft (1961): sequential decoding
- Massey (1963): majority-logic decoding
- Viterbi (1967): ML decoding
- Berrou et al. (1993): turbo codes
2 Descriptions and representations
- Shift register representation
- Scalar encoder matrix representation
- "Polynomial" matrix representation
- State diagram representation
- Trellis representation
- Tree representation
- Parity check matrix / syndrome former representation
3 Example
Controller canonical form: the k input sequences enter from the left; the n output sequences are produced by external mod-2 adders.
Input sequence: u = (u_0, u_1, u_2, …)
Output sequence: v = (v_0^(0), v_0^(1), v_1^(0), v_1^(1), v_2^(0), v_2^(1), …)
Impulse response sequences: u = (1, 0, 0, 0, …) produces the outputs
g^(0) = (1, 0, 1, 1)
g^(1) = (1, 1, 1, 1)
4 Example (cont.)
Input sequence: u = (u_0, u_1, u_2, …)
Output sequence: v = (v_0^(0), v_0^(1), v_1^(0), v_1^(1), v_2^(0), v_2^(1), …)
Impulse response sequences: u = (1, 0, 0, 0, …) produces
g^(0) = (1, 0, 1, 1), g^(1) = (1, 1, 1, 1)
Encoding equations:
v_l^(0) = u_l + u_{l-2} + u_{l-3}
v_l^(1) = u_l + u_{l-1} + u_{l-2} + u_{l-3}
Or, simplified: v^(0) = u * g^(0), v^(1) = u * g^(1), where * is a discrete convolution:
v_l^(j) = (u * g^(j))_l = Σ_{i=0}^{m} u_{l-i} g_i^(j),  with u_{l-i} = 0 for l < i.
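The encoding equations above can be checked with a short Python sketch; the function name and the zero-tail termination are my own choices, not from the slides. It encodes via the discrete convolution v^(j) = u * g^(j) (mod 2):

```python
def conv_encode(u, g0=(1, 0, 1, 1), g1=(1, 1, 1, 1)):
    """(2,1,3) convolutional encoder: v_l^(j) = sum_i u_{l-i} g_i^(j) mod 2."""
    m = len(g0) - 1                       # encoder memory order
    u = list(u) + [0] * m                 # zero-tail to return to the all-zero state
    v = []
    for l in range(len(u)):
        v0 = sum(g0[i] * u[l - i] for i in range(m + 1) if l >= i) % 2
        v1 = sum(g1[i] * u[l - i] for i in range(m + 1) if l >= i) % 2
        v += [v0, v1]                     # interleave v^(0) and v^(1)
    return v
```

The impulse response u = (1, 0, 0, …) reproduces g^(0) and g^(1) interleaved, as on the slide.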
5 Scalar encoder matrix method
Encoding of u: v = uG
Example (continued): for u = (1 0 1 1 1 0 0 0 0 0 0 0 …)
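A sketch of the scalar encoder matrix for the running (2,1,3) example (the helper name and the NumPy truncation to L input bits are my own). Row l of G holds the interleaved pair (g^(0), g^(1)) shifted l block positions to the right, so v = uG mod 2 reproduces the convolution:

```python
import numpy as np

def scalar_G(g0, g1, L):
    """Truncated scalar generator matrix for L input bits of a (2,1,m) encoder."""
    m = len(g0) - 1
    G = np.zeros((L, 2 * (L + m)), dtype=int)
    for l in range(L):
        for i in range(m + 1):
            G[l, 2 * (l + i)] = g0[i]      # column for v^(0) at time l+i
            G[l, 2 * (l + i) + 1] = g1[i]  # column for v^(1) at time l+i
    return G

u = np.array([1, 0, 1, 1, 1])              # the slide's input u = (10111000...)
v = u @ scalar_G((1, 0, 1, 1), (1, 1, 1, 1), len(u)) % 2
```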
6 Example: (3,2,1) convolutional code
Impulse response sequences: g_i^(j) corresponds to input i, output j.
g_1^(0) = (1, 1), g_1^(1) = (0, 1), g_1^(2) = (1, 1)
g_2^(0) = (0, 1), g_2^(1) = (1, 0), g_2^(2) = (1, 0)
7 Example: (3,2,1) CC
g_1^(0) = (1, 1), g_1^(1) = (0, 1), g_1^(2) = (1, 1)
g_2^(0) = (0, 1), g_2^(1) = (1, 0), g_2^(2) = (1, 0)
Encoding equations:
v_l^(0) = u_l^(1) + u_{l-1}^(1) + u_{l-1}^(2)
v_l^(1) = u_l^(2) + u_{l-1}^(1)
v_l^(2) = u_l^(1) + u_l^(2) + u_{l-1}^(1)
Or, simplified:
v^(0) = u^(1) * g_1^(0) + u^(2) * g_2^(0)
v^(1) = u^(1) * g_1^(1) + u^(2) * g_2^(1)
v^(2) = u^(1) * g_1^(2) + u^(2) * g_2^(2)
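The (3,2,1) encoding equations above can be sketched directly in Python (the function name and the one-block zero-tail are my own choices):

```python
def encode_321(u1, u2):
    """(3,2,1) encoder: three output bits per step from the current and
    previous pair of input bits (memory order m = 1)."""
    u1, u2 = list(u1) + [0], list(u2) + [0]   # one zero-tail block (m = 1)
    p1 = p2 = 0                               # u^(1)_{l-1}, u^(2)_{l-1}
    v = []
    for a, b in zip(u1, u2):
        v.append((a + p1 + p2) % 2)           # v^(0)_l = u^(1)_l + u^(1)_{l-1} + u^(2)_{l-1}
        v.append((b + p1) % 2)                # v^(1)_l = u^(2)_l + u^(1)_{l-1}
        v.append((a + b + p1) % 2)            # v^(2)_l = u^(1)_l + u^(2)_l + u^(1)_{l-1}
        p1, p2 = a, b
    return v
```

An impulse on input 1 alone reproduces (g_1^(0), g_1^(1), g_1^(2)) interleaved; an impulse on input 2 reproduces the g_2 sequences.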
8 (3,2,1) CC, generator matrix
In general, for an (n,k,m) CC the generator matrix is the semi-infinite matrix

G = | G_0  G_1  ...  G_m                |
    |      G_0  G_1  ...  G_m           |
    |           G_0  G_1  ...  G_m      |
    |                ...                |

(blank entries are zero), where each G_i is a k x n submatrix whose entry in row p, column j is g_{p,i}^(j), the ith coefficient of the impulse response from input p to output j.
9 Important definitions
- Nominal code rate: R = k/n
- Length of the ith shift register: ν_i
- Encoder memory order: m = max_{0≤i≤k-1} ν_i
- Overall constraint length: ν = Σ_{0≤i≤k-1} ν_i
- An (n,k,ν) convolutional code is the set of all output sequences produced by an (n,k,ν) convolutional encoder
- Effective code rate R_eff: for input length kL the output length is n(L+m), so
  R_eff = kL / (n(L+m)) = RL/(L+m)
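The effective-rate formula can be made concrete with a small helper (the function name is my own). For the (2,1,3) example, L = 5 input bits yield n(L+m) = 16 output bits:

```python
def effective_rate(k, n, m, L):
    """R_eff = kL / (n(L+m)): the rate loss due to the m termination blocks."""
    return (k * L) / (n * (L + m))

# For the (2,1,3) code with L = 5: R_eff = 5/16, below the nominal R = 1/2.
# As L grows, R_eff approaches the nominal rate R = k/n.
```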
10 Polynomial matrix representation
A (2,1,m) CC can be described by a polynomial (transform-domain) representation of the input, output, and generator sequences:
u(D) = u_0 + u_1 D + u_2 D^2 + …
v^(i)(D) = v_0^(i) + v_1^(i) D + v_2^(i) D^2 + …,  for i = 0, 1
g^(i)(D) = g_0^(i) + g_1^(i) D + … + g_m^(i) D^m,  for i = 0, 1
Then the encoding can be written as
v^(i)(D) = u(D) g^(i)(D),  for i = 0, 1
Combining the output streams v = (v_0^(0), v_0^(1), v_1^(0), v_1^(1), v_2^(0), v_2^(1), …):
v(D) = v^(0)(D^2) + D v^(1)(D^2)
11 Example
(2,1,3) code: g^(0) = (1, 0, 1, 1), g^(1) = (1, 1, 1, 1)
g^(0)(D) = 1 + D^2 + D^3,  g^(1)(D) = 1 + D + D^2 + D^3
Encoding equations: v^(i)(D) = u(D) g^(i)(D), for i = 0, 1
Polynomial matrix form (transform-domain form):
( v^(0)(D), v^(1)(D) ) = u(D) ( g^(0)(D), g^(1)(D) )
In general: ( v^(0)(D), …, v^(n-1)(D) ) = ( u^(1)(D), …, u^(k)(D) ) G(D), where G(D) is the k x n matrix of generator polynomials g_i^(j)(D).
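In the transform domain, encoding is just polynomial multiplication over GF(2). A minimal sketch (function name mine), with coefficients listed lowest degree first:

```python
def poly_mul_gf2(a, b):
    """Product of two GF(2) polynomials given as coefficient lists,
    lowest degree first."""
    out = [0] * (len(a) + len(b) - 1)
    for i, ai in enumerate(a):
        for j, bj in enumerate(b):
            out[i + j] ^= ai & bj          # addition mod 2 is XOR
    return out

# u(D) = 1 + D^2 times g^(0)(D) = 1 + D^2 + D^3:
v0 = poly_mul_gf2([1, 0, 1], [1, 0, 1, 1])   # v^(0)(D) = 1 + D^3 + D^4 + D^5
```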
12 Transform domain: relation to constraint length
Length of the ith shift register: ν_i = max_{0≤j≤n-1} deg g_i^(j)(D)
13 Systematic encoders
An (n,k,m) convolutional encoder is systematic if the first k output sequences are a copy of the k information sequences.
All convolutional codes have systematic encoders, but some codes have no feedforward systematic encoder.
14 Parity check matrices
Starting from a systematic feedforward generator matrix, it is easy to find a parity check matrix for the code. Thus every codeword v satisfies vH^T = 0.
15 Transform domain
Starting from a systematic feedforward generator matrix, it is likewise easy to find a parity check matrix in the transform domain. Thus every codeword satisfies v(D) H^T(D) = 0.
Parity check matrices exist for all convolutional codes, but in general they are less straightforward to find.
16 Example: systematic encoder
G(D) = [1, 1 + D + D^3]
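For this systematic example, vH^T = 0 can be verified in the transform domain. For G(D) = [1, g(D)] a parity check matrix is H(D) = [g(D), 1], a standard construction (the variable names and the example input below are mine), so the syndrome v^(0)(D)g(D) + v^(1)(D) vanishes:

```python
def pmul(a, b):
    """GF(2) polynomial product, coefficient lists, lowest degree first."""
    out = [0] * (len(a) + len(b) - 1)
    for i, ai in enumerate(a):
        for j, bj in enumerate(b):
            out[i + j] ^= ai & bj
    return out

g = [1, 1, 0, 1]                 # g(D) = 1 + D + D^3
u = [1, 0, 1]                    # example input u(D) = 1 + D^2
v0 = u                           # systematic output: v^(0)(D) = u(D)
v1 = pmul(u, g)                  # parity output:     v^(1)(D) = u(D) g(D)
# Syndrome v(D) H^T(D) = v^(0)(D) g(D) + v^(1)(D) = u g + u g = 0:
s = [x ^ y for x, y in zip(pmul(v0, g), v1)]
```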
17 Another example, and another encoder form
- Controller canonical form
- Observer canonical form: one shift register per output sequence, with the modulo-2 adders internal to the shift registers; the highest-degree term is at the left.
Note: the two forms can have different memory requirements.
18 Systematic feedback (recursive) encoders
G(D) = [1 + D + D^2, 1 + D^2, 1 + D]
G'(D) = [1, (1 + D^2)/(1 + D + D^2), (1 + D)/(1 + D + D^2)]
- Infinite impulse response (rational, not polynomial, entries)
- Easier to represent in the transform domain
- G'(D) generates the same set of code sequences as G(D), but with a different encoder mapping
- Again, a parity check matrix is easy to obtain
19 Suggested exercises
Exercises 11.1–11.6