
1 III. Turbo Codes

2 Applications of Turbo Codes
Worldwide Applications & Standards:
- Data Storage Systems
- DSL Modem / Optical Communications
- 3G (Third Generation) Mobile Communications
- Digital Video Broadcast (DVB)
- Satellite Communications
- IEEE WiMAX

3 Basic Concept of Turbo Codes
Invented in 1993 by Alain Glavieux, Claude Berrou and Punya Thitimajshima.
Basic structure:
a) Recursive Systematic Convolutional (RSC) codes
b) Parallel concatenation and interleaving
c) Iterative decoding

4 Recursive Systematic Convolutional Codes
[Encoder block diagrams: a non-systematic convolutional (NSC) encoder and a recursive systematic convolutional (RSC) encoder, both of memory 2 with generators g1 = [1 1 1] and g2 = [1 0 1]. The NSC encoder uses G = [g1 g2] and produces outputs c1 and c2; the RSC encoder uses G = [1 g2/g1], so c1 is the systematic (input) bit and c2 is the parity bit.]
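The two generator conventions can be made concrete in code. Below is a minimal Python sketch (not taken from the slides; function and variable names are mine) of both rate-1/2 encoders built from g1 = [1 1 1] and g2 = [1 0 1]:

```python
# Minimal sketch of the two encoders on slide 4 (illustration, not slide content).

def nsc_encode(bits):
    """Non-systematic convolutional encoder, G = [g1 g2]."""
    s1 = s2 = 0                      # shift-register contents u[k-1], u[k-2]
    out = []
    for u in bits:
        c1 = u ^ s1 ^ s2             # taps g1 = [1 1 1]
        c2 = u ^ s2                  # taps g2 = [1 0 1]
        out.append((c1, c2))
        s1, s2 = u, s1
    return out

def rsc_encode(bits):
    """Recursive systematic convolutional encoder, G = [1  g2/g1]."""
    s1 = s2 = 0                      # shift-register contents a[k-1], a[k-2]
    out = []
    for u in bits:
        a = u ^ s1 ^ s2              # feedback defined by g1 = [1 1 1]
        c1 = u                       # systematic bit
        c2 = a ^ s2                  # parity from g2 = [1 0 1]
        out.append((c1, c2))
        s1, s2 = a, s1
    return out

print(nsc_encode([1, 0, 1, 1]))
print(rsc_encode([1, 0, 1, 1]))
```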

5 Non-Recursive Encoder Trellis Diagram
[Trellis diagram of the non-recursive encoder: four states s0 = (0 0), s1 = (1 0), s2 = (0 1), s3 = (1 1), with each branch labelled by its output bit pair (00, 11, 10 or 01).]

6 Systematic Recursive Encoder Trellis Diagram
[Trellis diagram of the recursive systematic encoder: the same four states s0 = (0 0), s1 = (1 0), s2 = (0 1), s3 = (1 1) and the same branch structure, again with branches labelled by their output bit pairs.]

7 Non-Recursive & Recursive Encoders
Non-recursive and recursive encoders have the same trellis structure.
The difference between them is in the mapping of information bits to codewords.
Recursive encoders generally provide a better weight distribution for the code.
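To see this claim concretely, the sketch below (an illustration, not slide content) enumerates every branch of the four-state trellis for both encoders: the set of state transitions is identical, but which input bit drives each transition, and hence the mapping of information bits to codewords, differs.

```python
# Enumerate all branches of the 4-state trellis; state = (s1, s2).

def nsc_step(state, u):
    s1, s2 = state
    c1, c2 = u ^ s1 ^ s2, u ^ s2          # outputs from g1 = [1 1 1], g2 = [1 0 1]
    return (u, s1), (c1, c2)              # (next state, output pair)

def rsc_step(state, u):
    s1, s2 = state
    a = u ^ s1 ^ s2                       # feedback bit defined by g1
    return (a, s1), (u, a ^ s2)           # (next state, (systematic, parity))

for state in [(0, 0), (1, 0), (0, 1), (1, 1)]:
    for u in (0, 1):
        print(state, u, "NSC ->", nsc_step(state, u), "RSC ->", rsc_step(state, u))
```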

8 Parallel Concatenation with Interleaving
[Block diagram of the rate-1/3 turbo encoder: the input x_k is sent out directly and also fed to RSC Encoder 1 and, through the interleaver π, to RSC Encoder 2; the outputs are x_k, y_k^(1) and y_k^(2).]
Two component RSC encoders in parallel, separated by an interleaver.
The job of the interleaver is to de-correlate the encoding processes of the two encoders.
Usually the two component encoders are identical.
x_k : systematic information bit
y_k^(1) : parity bit from the first RSC encoder
y_k^(2) : parity bit from the second RSC encoder
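A minimal sketch of this parallel concatenation follows. The pseudo-random interleaver and the helper rsc_parity() are my own illustration (using the g1/g2 taps from slide 4); trellis termination is ignored in this sketch.

```python
import random

def rsc_parity(bits):
    """Parity stream of the RSC encoder from slide 4 (g1 = [1 1 1], g2 = [1 0 1])."""
    s1 = s2 = 0
    parity = []
    for u in bits:
        a = u ^ s1 ^ s2                  # feedback
        parity.append(a ^ s2)            # parity output
        s1, s2 = a, s1
    return parity

def turbo_encode(x, interleaver):
    """Rate-1/3 output: (x_k, y_k^(1), y_k^(2)) for every information bit x_k."""
    y1 = rsc_parity(x)                                # first encoder, natural order
    y2 = rsc_parity([x[i] for i in interleaver])      # second encoder, interleaved order
    return list(zip(x, y1, y2))

N = 8
pi = list(range(N))
random.Random(0).shuffle(pi)                          # hypothetical pseudo-random N-bit interleaver
print(turbo_encode([1, 0, 1, 1, 0, 0, 1, 0], pi))
```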

9 Interleaving
Permutation of the data to de-correlate the encoding processes of the two encoders.
If encoder 1 generates a low-weight codeword, the interleaved input will hopefully make encoder 2 generate a higher-weight codeword.
By avoiding low-weight codewords, the BER performance of the turbo code can improve significantly.
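A small illustration of the permutation idea (the fixed pseudo-random permutation is an arbitrary choice for this example): a clustered, low-weight input pattern is spread out before it reaches the second encoder.

```python
import random

N = 16
pi = list(range(N))
random.Random(1).shuffle(pi)             # fixed pseudo-random permutation (illustrative only)

x = [0] * N
x[4:7] = [1, 1, 1]                       # clustered low-weight input pattern
x_interleaved = [x[i] for i in pi]       # what the second RSC encoder would see

print(x)
print(x_interleaved)
```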

10 Turbo Decoder
[Block diagram of the turbo decoder: the received sequence r_k feeds RSC Decoder 1 and, through the interleaver π, RSC Decoder 2; the two decoders are connected through π and π^-1 and produce the decisions x_k*.]
The two decoders iteratively exchange information before a final decision is made on the transmitted bits.

11 Turbo Codes Performance
[Performance curves of turbo codes; figure from "Principles of Turbo Codes", Keattisak Sripiman.]

12 Turbo Decoding
Turbo decoders rely on probabilistic (soft) decoding in the component RSC decoders.
Soft-output information is iteratively exchanged between decoder 1 and decoder 2 before a final decision is made on the transmitted bits.
As the number of iterations grows, the decoding performance improves.
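A structural sketch of this iterative loop is shown below. The soft-in/soft-out component decoder siso_decode() is a hypothetical stand-in (for example a MAP decoder for one RSC code) and not code from the presentation; only the flow of extrinsic information through π and π^-1 is illustrated.

```python
def turbo_decode(ys, y1p, y2p, pi, siso_decode, iterations=8):
    """Iterative exchange of extrinsic LLRs between the two component decoders.

    siso_decode(systematic, parity, apriori) -> (llr, extrinsic) is assumed.
    """
    N = len(ys)
    inv = [0] * N
    for k, p in enumerate(pi):
        inv[p] = k                                   # inverse permutation pi^-1
    apriori = [0.0] * N                              # extrinsic information starts at zero
    for _ in range(iterations):
        # Decoder 1: natural-order systematic and first-parity observations.
        llr1, ext1 = siso_decode(ys, y1p, apriori)
        # Decoder 2: interleaved order, with decoder 1's extrinsic output as a priori.
        llr2, ext2 = siso_decode([ys[i] for i in pi], y2p, [ext1[i] for i in pi])
        apriori = [ext2[inv[k]] for k in range(N)]   # de-interleave for the next pass
    llr = [llr2[inv[k]] for k in range(N)]           # final LLRs in natural order
    return [1 if l > 0 else 0 for l in llr]          # hard decisions u*
```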

13 Turbo Coded System Model
[Block diagram of the turbo-coded system: the information vector u drives RSC Encoder 1 directly and RSC Encoder 2 through an N-bit interleaver π; the streams x^s, x^1p and x^2p are multiplexed into x, transmitted over the channel, demultiplexed into y^s, y^1p and y^2p, and passed to the turbo decoder, which outputs u*.]
u = [u_1, …, u_k, …, u_N] : vector of N information bits
x^s = [x_1^s, …, x_k^s, …, x_N^s] : vector of N systematic bits after turbo-coding of u
x^1p = [x_1^1p, …, x_k^1p, …, x_N^1p] : vector of N parity bits from the first encoder after turbo-coding of u
x^2p = [x_1^2p, …, x_k^2p, …, x_N^2p] : vector of N parity bits from the second encoder after turbo-coding of u
x = [x_1^s, x_1^1p, x_1^2p, …, x_N^s, x_N^1p, x_N^2p] : vector of length 3N of turbo-coded bits for u
y = [y_1^s, y_1^1p, y_1^2p, …, y_N^s, y_N^1p, y_N^2p] : vector of 3N received symbols corresponding to the turbo-coded bits of u
y^s = [y_1^s, …, y_k^s, …, y_N^s] : vector of N received symbols corresponding to the systematic bits in x^s
y^1p = [y_1^1p, …, y_k^1p, …, y_N^1p] : vector of N received symbols corresponding to the first encoder's parity bits in x^1p
y^2p = [y_1^2p, …, y_k^2p, …, y_N^2p] : vector of N received symbols corresponding to the second encoder's parity bits in x^2p
u* = [u_1*, …, u_k*, …, u_N*] : vector of N turbo decoder decision bits corresponding to u
Pr[u_k | y] is known as the a posteriori probability of the k-th information bit.
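A hedged sketch of the multiplexing and channel stages of this model follows. The BPSK mapping and the AWGN channel are assumptions for illustration; the slides do not fix a particular channel.

```python
import math
import random

def mux(xs, x1p, x2p):
    """Interleave the three coded streams into x = [x_1^s, x_1^1p, x_1^2p, ...] of length 3N."""
    x = []
    for ks, k1, k2 in zip(xs, x1p, x2p):
        x.extend([ks, k1, k2])
    return x

def awgn_channel(x, snr_db, rng=None):
    """BPSK-map (0 -> -1, 1 -> +1) and add Gaussian noise; snr_db is Es/N0 in dB."""
    rng = rng or random.Random(0)
    sigma = math.sqrt(1.0 / (2.0 * 10 ** (snr_db / 10.0)))
    return [(2 * b - 1) + rng.gauss(0.0, sigma) for b in x]

def demux(y):
    """Split the 3N received symbols back into y^s, y^1p, y^2p."""
    return y[0::3], y[1::3], y[2::3]
```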

14 Log-Likelihood Ratio (LLR)
The LLR of an information bit u_k is given by
L(u_k) = ln( Pr[u_k = +1] / Pr[u_k = -1] ),
given that Pr[u_k = +1] + Pr[u_k = -1] = 1. Conditioning on the received vector y gives the a posteriori LLR used by the decoder, L(u_k | y) = ln( Pr[u_k = +1 | y] / Pr[u_k = -1 | y] ).
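As a small worked example (the probability 0.9 is assumed purely for illustration):

```latex
L(u_k \mid \mathbf{y})
  = \ln\frac{\Pr[u_k = +1 \mid \mathbf{y}]}{\Pr[u_k = -1 \mid \mathbf{y}]}
  = \ln\frac{0.9}{0.1} \approx 2.20
```

The sign of the LLR gives the hard decision on u_k, and its magnitude indicates the reliability of that decision.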

15 Maximum A Posteriori Algorithm
[Turbo-coded system block diagram, as on slide 13.]

16 Some Definitions & Conventions
[Turbo-coded system block diagram as on slide 13, together with one trellis section from state S_(k-1) to state S_k: states s0, s1, s2, s3, each branch driven by an input u_k = -1 or u_k = +1 and producing a parity output y_k^p.]

17 Derivation of LLR
Define S(i) as the set of pairs of states (s', s) such that the transition from S_(k-1) = s' to S_k = s is caused by the input u_k = i, i = 0, 1, i.e.
S(0) = {(s0, s0), (s1, s3), (s2, s1), (s3, s2)}
S(1) = {(s0, s1), (s1, s2), (s2, s0), (s3, s3)}
[Trellis section from S_(k-1) to S_k with the branches for u_k = -1 and u_k = +1.]
Remember Bayes' rule: Pr[a | b] = Pr[a, b] / Pr[b]. Since Pr[y] cancels in the ratio,
L(u_k) = ln( Σ_{(s',s)∈S(1)} Pr[S_(k-1) = s', S_k = s, y] / Σ_{(s',s)∈S(0)} Pr[S_(k-1) = s', S_k = s, y] )

18 Derivation of LLR
Define, for the received vector y = (y_1, …, y_N):
α_k(s) = Pr[S_k = s, y_1, …, y_k]   (forward metric)
β_k(s) = Pr[y_(k+1), …, y_N | S_k = s]   (backward metric)
γ_k(s', s) = Pr[S_k = s, y_k | S_(k-1) = s']   (branch metric)

19 Derivation of LLR
The future symbols y_(k+1), …, y_N depend only on S_k and are independent of S_(k-1) and y_k; the pair (S_k, y_k) depends only on S_(k-1) and is independent of the earlier symbols y_1, …, y_(k-1). Therefore
Pr[S_(k-1) = s', S_k = s, y] = α_(k-1)(s') γ_k(s', s) β_k(s)

20 Derivation of LLR
Substituting this factorization into the ratio gives
L(u_k) = ln( Σ_{(s',s)∈S(1)} α_(k-1)(s') γ_k(s', s) β_k(s) / Σ_{(s',s)∈S(0)} α_(k-1)(s') γ_k(s', s) β_k(s) )

21 Derivation of LLR
[Full-block trellis: states s0, s1, s2, s3 at stages S_0, …, S_(k-1), S_k, S_(k+1), …, S_N, with the received sequence y = y_1, …, y_(k-1), y_k, y_(k+1), …, y_N aligned beneath the corresponding stages.]

22 Computation of αk(s): Example
[Worked example (figure).]

23 Forward Recursive Equation
Computation of αk(s): Forward Recursive Equation
Given the values of γ_k(s', s) for all indices k, the probabilities α_k(s) can be computed by the forward recursion
α_k(s) = Σ_(s') α_(k-1)(s') γ_k(s', s)
The initial condition α_0(s) depends on the initial state of the convolutional encoder; the encoder usually starts in state s0, so α_0(s0) = 1 and α_0(s) = 0 for s ≠ s0.
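A minimal Python sketch of this forward recursion (the data layout is my own assumption, not taken from the slides): gamma is a list of length N + 1 whose entry gamma[k], for k = 1..N, is a dict mapping a transition (s_prev, s) to the branch metric γ_k(s', s).

```python
def forward_recursion(gamma, num_states, N):
    """Forward pass: alpha[k][s] accumulated from alpha[k-1][s'] and gamma_k(s', s)."""
    alpha = [[0.0] * num_states for _ in range(N + 1)]
    alpha[0][0] = 1.0                          # encoder assumed to start in state s0
    for k in range(1, N + 1):
        for (s_prev, s), g in gamma[k].items():
            alpha[k][s] += alpha[k - 1][s_prev] * g
        total = sum(alpha[k]) or 1.0           # normalise to prevent numerical underflow
        alpha[k] = [a / total for a in alpha[k]]
    return alpha
```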

24 Backward Recursive Equation
Computation of βk(s): Backward Recursive Equation
Given the values of γ_k(s', s) for all indices k, the probabilities β_k(s) can be computed by the backward recursion
β_(k-1)(s') = Σ_(s) β_k(s) γ_k(s', s)
The initial condition β_N(s) depends on the final state of the trellis. The first encoder is usually terminated in state s0 (β_N(s0) = 1, β_N(s) = 0 otherwise), while the second encoder usually has an open trellis (β_N(s) set equal for all states).
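A companion sketch for the backward recursion and for combining α, γ and β into the LLR of slide 20 (again, the data layout and the terminated/open-trellis handling are illustrative assumptions):

```python
import math

def backward_recursion(gamma, num_states, N, terminated=True):
    """Backward pass; terminated=True ends in s0 (first encoder), False is the open trellis."""
    beta = [[0.0] * num_states for _ in range(N + 1)]
    if terminated:
        beta[N][0] = 1.0
    else:
        beta[N] = [1.0 / num_states] * num_states
    for k in range(N, 0, -1):
        for (s_prev, s), g in gamma[k].items():
            beta[k - 1][s_prev] += beta[k][s] * g
        total = sum(beta[k - 1]) or 1.0        # normalise to prevent numerical underflow
        beta[k - 1] = [b / total for b in beta[k - 1]]
    return beta

def llr(alpha, beta, gamma, S1, S0, k):
    """L(u_k): ratio of sums of alpha_(k-1)(s') * gamma_k(s', s) * beta_k(s) over S(1) and S(0)."""
    num = sum(alpha[k - 1][sp] * g * beta[k][s]
              for (sp, s), g in gamma[k].items() if (sp, s) in S1)
    den = sum(alpha[k - 1][sp] * g * beta[k][s]
              for (sp, s), g in gamma[k].items() if (sp, s) in S0)
    return math.log(num / den)
```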

25 Computation of γk(s’,s)
Note that γ_k(s', s) = Pr[S_k = s, y_k | S_(k-1) = s'] = Pr[u_k] · p(y_k | x_k): the branch metric is the product of the a priori probability of the input bit u_k that drives the transition s' → s and the channel transition probability of the corresponding transmitted symbols x_k = (x_k^s, x_k^p).

26 Computation of γk(s’,s)
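One common way to evaluate the branch metric, assuming BPSK transmission over an AWGN channel (this specific channel model is an assumption on my part, not stated on the slide), is sketched below; constant factors that cancel in the LLR ratio are dropped.

```python
import math

def gamma_branch(y_ks, y_kp, x_ks, x_kp, p_apriori, sigma):
    """gamma_k(s', s) = Pr[u_k] * p(y_k | x_k) for the branch labelled (x_ks, x_kp) in {-1, +1}."""
    def gauss(y, x):
        # Gaussian likelihood up to a constant factor (the constant cancels in the LLR)
        return math.exp(-((y - x) ** 2) / (2.0 * sigma ** 2))
    return p_apriori * gauss(y_ks, x_ks) * gauss(y_kp, x_kp)

# Example: branch labelled (+1, -1), equiprobable a priori Pr[u_k = +1] = 0.5, noise sigma = 0.8
print(gamma_branch(0.7, -1.2, +1, -1, 0.5, 0.8))
```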

