Channel Coding: Part III (Turbo Codes)
Presented by: Nguyen Van Han (20127738)
Wireless and Mobile Communication System Lab
Outline
Introduction
8.4.1 Turbo Code Concepts
8.4.2 Log-likelihood Algebra
8.4.3 Product Code Example
8.4.4 Encoding with Recursive Systematic Codes
8.4.5 A Feedback Decoder
8.4.6 The MAP Decoding Algorithm
8.4.7 MAP Decoding Example
Introduction
A turbo code is a refinement of the concatenated encoding structure, plus an iterative algorithm for decoding the associated code sequence.
The code achieves a bit-error probability of 10^-5 at rate 1/2 over an AWGN channel with BPSK modulation, at an Eb/N0 of 0.7 dB.
Turbo codes are constructed from two or more component codes applied to different interleaved versions of the same information sequence.
The concept behind turbo decoding is to pass soft decisions from the output of one decoder to the input of the other decoder, and to iterate this process several times so as to produce more reliable decisions.
(Context within channel coding: waveform techniques — M-ary signalling, antipodal, orthogonal, trellis-coded modulation — and structured sequences — block, convolutional, and turbo codes.)
8.4.1 Turbo Code Concepts
Likelihood functions: the a posteriori probability (APP) of a decision follows from Bayes' theorem:
P(d = i | x) = p(x | d = i) P(d = i) / p(x),  i = 1, ..., M   (8.61)
p(x) = sum over i = 1, ..., M of p(x | d = i) P(d = i)   (8.62)
8.4.1.1 Likelihood Functions
Where:
x: the random variable or test statistic, obtained at the output of the demodulator or other signal processing
P(d = i | x): the APP
d = i: the data belonging to the i-th signal class, out of M classes
p(x | d = i): the probability density function (pdf) of x conditioned on d = i
P(d = i): the a priori probability
The APP of a received signal, from (8.61), can be thought of as the result of an experiment: before the experiment there exists an a priori probability P(d = i); the APP is a "refinement" of that prior knowledge about the data, brought about by examining the received signal x.
8.4.1.2 The Two-Signal Class Case
Let the binary logical elements 1 and 0 be represented electronically by voltages +1 and -1, respectively.
The variable d is used to represent the transmitted data bit, whether it appears as a voltage or as a logical element.
The signal is transmitted over an AWGN channel.
8.4.1.2 (continued)
Maximum likelihood decision rule: choose d_k = +1 or d_k = -1 according to whichever of the likelihoods l1 = p(x_k | d_k = +1) or l2 = p(x_k | d_k = -1) is larger.
Equivalently, for each data bit at time k, decide d = +1 if x_k falls on the right side of the decision line gamma_0, and d = -1 otherwise.
Maximum a posteriori (MAP) decision rule (8.63): choose hypothesis H1 (d = +1) if P(d = +1 | x) > P(d = -1 | x), and H2 (d = -1) otherwise; that is, the likelihoods are weighted by the a priori probabilities before comparison.
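The two decision rules can be illustrated numerically. The Python sketch below is not from the slides; the noise variance and prior values are illustrative. It shows that ML and MAP coincide for equal priors but can disagree near the threshold once the priors are unequal.

```python
import math

def gaussian_pdf(x, mean, sigma2):
    """Likelihood p(x | d = mean) for an AWGN channel."""
    return math.exp(-(x - mean) ** 2 / (2 * sigma2)) / math.sqrt(2 * math.pi * sigma2)

def ml_decide(x, sigma2=1.0):
    """Maximum likelihood: pick the hypothesis with the larger likelihood."""
    return +1 if gaussian_pdf(x, +1, sigma2) > gaussian_pdf(x, -1, sigma2) else -1

def map_decide(x, p_plus, sigma2=1.0):
    """MAP rule (8.63): weight each likelihood by its a priori probability."""
    ap_plus = gaussian_pdf(x, +1, sigma2) * p_plus
    ap_minus = gaussian_pdf(x, -1, sigma2) * (1.0 - p_plus)
    return +1 if ap_plus > ap_minus else -1
```

For example, with P(d = +1) = 0.1, a weakly positive sample such as x_k = 0.2 is decided as +1 by ML but as -1 by MAP.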
8.4.1.3 Log-Likelihood Ratio (LLR)
Taking the logarithm of the APP ratio in (8.63) gives (8.66 through 8.68):
L(d | x) = ln [ P(d = +1 | x) / P(d = -1 | x) ] = ln [ p(x | d = +1) / p(x | d = -1) ] + ln [ P(d = +1) / P(d = -1) ]
so that L(d | x) = L(x | d) + L(d), where:
L(d | x): a real number representing a soft decision out of the detector
L(x | d): the LLR of the test statistic x, obtained by measurements of the channel output under the alternate conditions d = +1 and d = -1
L(d): the a priori LLR of the data bit d
8.4.1.3 (continued)
To simplify the notation, (8.68) is rewritten as
L'(d^) = Lc(x) + L(d)   (8.69)
For a systematic code, the LLR (soft output) of the decoder is
L(d^) = L'(d^) + Le(d^)   (8.70)
L(d^) = Lc(x) + L(d) + Le(d^)   (8.71)
Where:
L'(d^): the LLR of a data bit out of the demodulator (the input to the decoder)
Le(d^): the extrinsic LLR, representing extra knowledge gleaned from the decoding process
The soft decoder output L(d^) is a real number that provides both a hard decision and the reliability of that decision: the sign of L(d^) denotes the hard decision, and the magnitude of L(d^) denotes its reliability.
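The decomposition in (8.71) takes only a few lines of Python (a sketch; the numeric LLR values in the usage note are illustrative only):

```python
def soft_output(Lc_x, L_d, Le):
    """Soft decoder output per eq. (8.71): L(d^) = Lc(x) + L(d) + Le(d^).
    The sign gives the hard decision (+1/-1); the magnitude gives its
    reliability."""
    L = Lc_x + L_d + Le
    hard = +1 if L > 0 else -1
    return L, hard, abs(L)
```

For example, a weak channel value Lc(x) = 1.5 with no prior knowledge (L(d) = 0) and extrinsic term Le(d^) = 2.0 yields a decision of +1 with reliability 3.5.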
8.4.1.4 Principles of Iterative (Turbo) Decoding
In a typical communication receiver, the demodulator often produces soft decisions, which are then transferred to a decoder.
Such a decoder could be called a soft-input/hard-output decoder, because the final products of the decoding process are bits.
In a turbo code, two or more component codes are used, so the output of one decoder becomes the input of the other.
Hence, the component decoders of a turbo code must be soft-input/soft-output decoders.
8.4.1.4 (continued)
Figure 8.21: Soft-input/soft-output decoder (for a systematic code).
8.4.2 Log-likelihood Algebra
For statistically independent data bits d1 and d2, the sum of two log-likelihood ratios (LLRs) is defined as
L(d1) [+] L(d2) = ln [ (e^L(d1) + e^L(d2)) / (1 + e^(L(d1) + L(d2))) ]   (8.72)
which is well approximated by
L(d1) [+] L(d2) ≈ (-1) × sgn[L(d1)] × sgn[L(d2)] × min(|L(d1)|, |L(d2)|)   (8.73)
Where:
sgn(.): the polarity (sign) function
[+]: log-likelihood addition (the LLR of the modulo-2 sum of the underlying bits)
+: ordinary addition
Two limiting cases follow directly: L(d) [+] ∞ = -L(d) and L(d) [+] 0 = 0.
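Both forms of log-likelihood addition are simple to implement. The sketch below follows the sign convention of the slides (logic 1 ↔ +1 volt), under which L [+] ∞ = -L and L [+] 0 = 0:

```python
import math

def boxplus_exact(l1, l2):
    """Exact log-likelihood addition, eq. (8.72)."""
    return math.log((math.exp(l1) + math.exp(l2)) / (1.0 + math.exp(l1 + l2)))

def boxplus_approx(l1, l2):
    """Approximation (8.73): (-1) * sgn(L1) * sgn(L2) * min(|L1|, |L2|)."""
    sgn = lambda v: (v > 0) - (v < 0)
    return -sgn(l1) * sgn(l2) * min(abs(l1), abs(l2))
```

Both functions satisfy the limiting cases: adding an LLR of 0 yields 0, and adding a very large LLR returns the negative of the other argument.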
8.4.3 Product Code Example
This example uses a simple concatenated code: a two-dimensional product code, as depicted in Figure 8.22.
We compute the soft-output LLR of (8.71) by evaluating each of its three components, Lc(x), L(d), and Le(d^); the final decision is based on the soft-output LLR L(d^).
k1: row data bits; n1 - k1: row parity bits
k2: column data bits; n2 - k2: column parity bits
Figure 8.22: Product code.
Iterative Decoding Algorithm
1. Set the a priori LLR L(d) = 0 (the data bits are initially assumed equally likely).
2. Decode horizontally and obtain the horizontal extrinsic LLR Leh(d^).
3. Set L(d) = Leh(d^) for the vertical decoding step.
4. Decode vertically and obtain the vertical extrinsic LLR Lev(d^).
5. Set L(d) = Lev(d^) and repeat steps 2 through 5.
6. After the final iteration, form the soft output L(d^) = Lc(x) + Leh(d^) + Lev(d^).
8.4.3.1 Two-Dimensional Single-Parity Code Example
Figure 8.23 describes a two-dimensional single-parity code. The relationship between the data and parity bits within any row (or column) is
p_ij = d_i ⊕ d_j   (8.75)
d_i = d_j ⊕ p_ij   (8.76)
where ⊕ denotes modulo-2 addition.
Transmitted sequence: d1 d2 d3 d4 p12 p34 p13 p24 = 1 0 0 1 1 1 1 1
Figure 8.23(a): Encoder output binary bits.
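The encoding rule (8.75) for this 2x2 example can be written out directly (Python sketch):

```python
def encode_product(d):
    """Encode data bits d1..d4 (logic 0/1) into the two-dimensional
    single-parity product code of Figure 8.23.
    Transmitted order: d1 d2 d3 d4 p12 p34 p13 p24."""
    d1, d2, d3, d4 = d
    p12, p34 = d1 ^ d2, d3 ^ d4   # row parities, p_ij = d_i XOR d_j
    p13, p24 = d1 ^ d3, d2 ^ d4   # column parities
    return [d1, d2, d3, d4, p12, p34, p13, p24]
```

For the data 1 0 0 1 this reproduces the transmitted sequence 1 0 0 1 1 1 1 1 shown above.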
8.4.3.1 (continued): Decoder-Input LLR
Assuming an AWGN model with noise variance sigma^2, the LLR for the channel measurement of a signal x_k at time k is
Lc(x_k) = ln [ p(x_k | d_k = +1) / p(x_k | d_k = -1) ] = 2 x_k / sigma^2
Figure 8.23(b): Decoder-input LLRs Lc(x).
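The channel LLR follows from the ratio of the two Gaussian likelihoods; in code it is a one-liner (sigma2 is the assumed noise variance):

```python
def channel_llr(x, sigma2):
    """Channel LLR for bipolar (+1/-1) signalling over AWGN:
    Lc(x) = ln p(x | d = +1) - ln p(x | d = -1) = 2 x / sigma^2.
    (The quadratic terms of the two Gaussian exponents cancel,
    leaving only the cross term 2x/sigma^2.)"""
    return 2.0 * x / sigma2
```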
8.4.3.2 Extrinsic Likelihood
For the two-dimensional product code in this example, the soft output for the received signal corresponding to data bit d1 is
L(d^1) = Lc(x1) + L(d1) + { [Lc(x2) + L(d2)] [+] Lc(x12) }   (8.82)
In general, for a data bit d_i whose row (or column) partner is d_j with parity p_ij, the extrinsic-LLR contribution of the code is
Le(d^i) = [Lc(x_j) + L(d_j)] [+] Lc(x_ij)   (8.83)
8.4.3.3 Computing the Extrinsic Likelihoods
The horizontal extrinsic LLR Leh(d^) of each data bit is computed from its row partner and the row parity via (8.83); the vertical extrinsic LLR Lev(d^) is computed in the same way from its column partner and the column parity.
8.4.3.3 (continued)
Alternating between the horizontal and vertical decoders, four iterations yield the final soft outputs L(d^) = Lc(x) + Leh(d^) + Lev(d^); the signs of these values give the decoded bits.
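The whole iterative procedure for this 2x2 example fits in a short Python sketch, using the min-sum approximation (8.73). The channel LLR values in the usage note below are illustrative, not the slide's figures:

```python
def boxplus(l1, l2):
    """Approximate log-likelihood addition, eq. (8.73)."""
    sgn = lambda v: (v > 0) - (v < 0)
    return -sgn(l1) * sgn(l2) * min(abs(l1), abs(l2))

def decode_product(Lc_d, Lc_p, iterations=4):
    """Iterative decoding of the 2x2 single-parity product code.
    Lc_d: channel LLRs for d1..d4; Lc_p: channel LLRs for p12, p34,
    p13, p24. Positive LLR corresponds to logic 1 (+1 volt)."""
    L12, L34, L13, L24 = Lc_p
    row_partner, row_par = [1, 0, 3, 2], [L12, L12, L34, L34]
    col_partner, col_par = [2, 3, 0, 1], [L13, L24, L13, L24]
    Leh, Lev = [0.0] * 4, [0.0] * 4
    for _ in range(iterations):
        # Horizontal pass: the a priori LLR is the latest vertical extrinsic.
        Leh = [boxplus(Lc_d[row_partner[i]] + Lev[row_partner[i]], row_par[i])
               for i in range(4)]
        # Vertical pass: the a priori LLR is the fresh horizontal extrinsic.
        Lev = [boxplus(Lc_d[col_partner[i]] + Leh[col_partner[i]], col_par[i])
               for i in range(4)]
    L_soft = [Lc_d[i] + Leh[i] + Lev[i] for i in range(4)]
    return L_soft, [1 if L > 0 else 0 for L in L_soft]
```

For example, corrupting the channel LLR of d1 from +4 to -0.5 (a weak, wrong measurement) still decodes to 1 0 0 1 once the extrinsic terms accumulate.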
8.4.4 Encoding with Recursive Systematic Codes
Nonsystematic convolutional (NSC) code: each of the two codeword bits is a modulo-2 sum of the current and stored input bits, so neither output reproduces the data sequence directly.
8.4.4 (continued): Recursive Systematic Convolutional (RSC) Code
An RSC code is obtained from an NSC code by feeding one of the tapped outputs back to the input, and by transmitting the data bit itself as one of the two outputs (systematic), with the other output serving as the parity.
Figure 8.25: (a) Recursive systematic convolutional (RSC) code; (b) trellis structure for the RSC code in (a).
Recursive Systematic Convolutional Code
Verifying the trellis structure of the RSC code.
Recursive Systematic Convolutional Code
Finding the output codewords corresponding to the input {d_k} = 1110: 00 10 01 11 10 11 00
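An RSC encoding pass is easy to express in code. The sketch below assumes a K = 3 encoder with generators (1, g2/g1), g1 = 111 (feedback) and g2 = 101 (parity) — common textbook taps, not necessarily those of the slide's figure:

```python
def rsc_encode(bits):
    """Rate-1/2 RSC encoder sketch, K = 3, generators (1, g2/g1) with
    g1 = 111 and g2 = 101 (assumed taps). Emits one pair
    (systematic bit d_k, parity bit v_k) per input bit."""
    s1 = s2 = 0                  # memory bits a_{k-1}, a_{k-2}
    out = []
    for d in bits:
        a = d ^ s1 ^ s2          # feedback sum: a_k = d_k + a_{k-1} + a_{k-2}
        v = a ^ s2               # parity taps g2 = 1 + D^2
        out.append((d, v))
        s1, s2 = a, s1           # shift the register
    return out
```

Because the first output of every pair is d_k itself, the code is systematic by construction.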
8.4.4.1 Concatenation of RSC Codes
Figure 8.26: Parallel concatenation of two RSC encoders.
Question
The figure below illustrates a rate-1/2 RSC code with constraint length K = 4. Form a table that describes all possible transitions, and use the table to draw the trellis section.
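One way to attack this question is to generate the transition table programmatically. The helper below is a sketch for a generic rate-1/2 RSC encoder; the K = 4 tap values at the bottom are hypothetical, since the slide's figure is not reproduced here:

```python
from itertools import product

def rsc_transitions(mem, fb, par):
    """Transition table of a rate-1/2 RSC encoder.
    mem: number of memory bits (K - 1)
    fb:  feedback taps on the memory bits (most recent first)
    par: parity taps on (a_k, memory bits)
    Returns rows of (state, input d, output (d, v), next state)."""
    rows = []
    for state in product((0, 1), repeat=mem):
        for d in (0, 1):
            a = d
            for s, t in zip(state, fb):
                a ^= s & t               # recursive feedback sum
            v = a & par[0]
            for s, t in zip(state, par[1:]):
                v ^= s & t               # parity output bit
            rows.append((state, d, (d, v), (a,) + state[:-1]))
    return rows

# Hypothetical K = 4 taps: feedback 1 + D + D^2 + D^3, parity 1 + D + D^3.
table = rsc_transitions(3, (1, 1, 1), (1, 1, 0, 1))
```

Each of the 8 states has two outgoing branches (16 rows in total), which is exactly the information needed to draw one trellis section.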