1 Channel Coding: Part III (Turbo Codes) Presented by: Nguyen Van Han (20127738) Wireless and Mobile Communication System Lab.

2 Outline
Introduction
Turbo Code Concepts
Log-likelihood Algebra
Product Code Example
Encoding with Recursive Systematic Codes
A Feedback Decoder
The MAP Decoding Algorithm
MAP Decoding Example

3 Introduction
The code achieves a bit-error probability of 10^-5 at rate 1/2 over an AWGN channel with BPSK modulation at an Eb/N0 of 0.7 dB.
The codes are constructed from two or more component codes applied to different interleaved versions of the same information sequence.
The concept behind the code is to pass soft decisions from the output of one decoder to the input of the other decoder, and to iterate this process several times to produce more reliable decisions.
(Taxonomy of channel coding: Waveform techniques — M-ary signalling, antipodal, orthogonal, trellis-coded modulation; Structured sequences — block, convolutional, turbo.)
A turbo code is a refinement of the concatenated encoding structure plus an iterative algorithm for decoding the associated code sequence.

4 Turbo Code Concepts
Likelihood Functions: the a posteriori probability (APP) of a data decision follows from Bayes' theorem,
P(d = i | x) = p(x | d = i) P(d = i) / p(x),  i = 1, ..., M   (8.61)
p(x) = sum over i of p(x | d = i) P(d = i)   (8.62)

Likelihood Function
where:
x: the random variable or test statistic obtained from the output of the demodulator or other signal processing
P(d = i | x): the APP
d = i: the event that the datum belongs to the ith signal class
p(x | d = i): the probability density function (pdf) of x conditioned on d = i
P(d = i): the a priori probability
The APP of a received signal, from (8.61), can be thought of as the result of an experiment: before the experiment there exists an a priori probability P(d = i), and the APP is a "refinement" of that prior knowledge about the data, brought about by examining the received signal x.

The Two-Signal Class Case
Let the binary logical elements 1 and 0 be represented electronically by the voltages +1 and -1, respectively. The variable d is used to represent the transmitted data bit, whether as a voltage or as a logical element. The signal is transmitted over an AWGN channel.

The Two-Signal Class Case
Maximum likelihood decision rule: choose d_k = +1 or d_k = -1 according to the larger of the likelihoods l1 or l2, respectively. For each data bit at time k, decide d = +1 if x_k falls to the right of the decision line gamma_0; otherwise decide d = -1.
Maximum a posteriori (MAP) decision rule: choose the hypothesis with the larger APP,
H1 (d = +1) if P(d = +1 | x) > P(d = -1 | x), else H2 (d = -1)   (8.63)
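As a sketch of this rule (assuming unit-amplitude antipodal signaling in AWGN; the function and variable names here are illustrative, not from the text), the MAP decision reduces to a threshold test on the received sample x_k:

```python
import math

def map_decide(x, sigma2, p_plus=0.5):
    """MAP decision for antipodal signals d = +/-1 in AWGN.

    Compares p(x|d=+1)P(d=+1) against p(x|d=-1)P(d=-1); for Gaussian
    likelihoods with means +/-1 and variance sigma2 this reduces to a
    threshold test against gamma_0 = (sigma2/2) * ln(P(d=-1)/P(d=+1)).
    """
    gamma0 = (sigma2 / 2.0) * math.log((1.0 - p_plus) / p_plus)
    return +1 if x > gamma0 else -1
```

With equal priors the threshold gamma_0 is 0, recovering the maximum-likelihood rule; a prior favoring d = +1 shifts gamma_0 to the left, so even weakly positive (or mildly negative) observations are still decided as +1.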

Log-likelihood Ratio (LLR)
Taking the logarithm of the ratio of the APPs in (8.63) yields (8.66)-(8.68):
L(d | x) = ln [ P(d = +1 | x) / P(d = -1 | x) ]
         = ln [ p(x | d = +1) / p(x | d = -1) ] + ln [ P(d = +1) / P(d = -1) ]
         = L(x | d) + L(d)
where:
L(d | x): a real number representing the soft decision output of the detector
L(x | d): the LLR of the test statistic x, obtained by measurement of the channel output under the alternate conditions d = +1 or d = -1
L(d): the a priori LLR of the data bit d

Log-likelihood Ratio (LLR)
To simplify the notation, Equation (8.68) is rewritten as
L'(d^) = L_c(x) + L(d)   (8.69)
For a systematic code, the LLR (soft output) L(d^) out of the decoder is
L(d^) = L'(d^) + L_e(d^)   (8.70)
      = L_c(x) + L(d) + L_e(d^)   (8.71)
where:
L'(d^): the LLR of a data bit out of the demodulator (input to the decoder)
L_e(d^): the extrinsic LLR, representing extra knowledge gleaned from the decoding process
The soft decoder output L(d^) is a real number that provides both a hard decision and the reliability of that decision: the sign of L(d^) gives the hard decision, and the magnitude of L(d^) gives its reliability.
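A minimal numeric illustration of this decomposition (the numbers and function names are hypothetical, not from the text): the soft output is the sum of the channel LLR, the a priori LLR, and the extrinsic LLR, and its sign and magnitude give the decision and its reliability:

```python
def soft_output(L_c, L_prior, L_e):
    """Soft decoder output L(d^) = L_c(x) + L(d) + L_e(d^)."""
    return L_c + L_prior + L_e

def hard_decision(L):
    """Sign of L gives the bit decision, magnitude gives its reliability."""
    bit = 1 if L >= 0 else 0    # map d = +1 -> logical 1, d = -1 -> logical 0
    return bit, abs(L)
```

For example, a channel LLR of +1.5 combined with a strongly negative extrinsic term of -2.2 yields a soft output of -0.7: the hard decision flips to logical 0, but with low reliability 0.7.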

Principles of Iterative (Turbo) Decoding
In a typical communication receiver, a demodulator produces soft decisions which are then transferred to a decoder. Such a decoder could be called a soft-input/hard-output decoder, because the final decoding process yields bits. In turbo codes, two or more component codes are used, so the output of one decoder becomes the input of another. Hence, each decoder in a turbo code must be a soft-input/soft-output decoder.

Principles of Iterative (Turbo) Decoding
Figure 8.21 Soft-input/soft-output decoder (for a systematic code)

Log-likelihood Algebra
For statistically independent data d, the sum of two log-likelihood ratios (LLRs) is defined as
L(d1) [+] L(d2) = L(d1 ⊕ d2) = ln [ (e^L(d1) + e^L(d2)) / (1 + e^(L(d1)+L(d2))) ]   (8.72)
                ≈ (-1) · sgn[L(d1)] · sgn[L(d2)] · min(|L(d1)|, |L(d2)|)   (8.73)
where:
sgn(.): the polarity function
⊕: modulo-2 sum
+: ordinary sum
[+]: log-likelihood addition
Two limiting cases: L(d) [+] ∞ = -L(d) and L(d) [+] 0 = 0
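The [+] operation and its min-sum approximation can be sketched directly from (8.72) and (8.73) (function names are mine):

```python
import math

def llr_add(L1, L2):
    """Exact log-likelihood addition, Eq. (8.72):
    L1 [+] L2 = ln[(e^L1 + e^L2) / (1 + e^(L1+L2))]."""
    return math.log((math.exp(L1) + math.exp(L2)) / (1.0 + math.exp(L1 + L2)))

def llr_add_approx(L1, L2):
    """Min-sum approximation, Eq. (8.73):
    (-1) * sgn(L1) * sgn(L2) * min(|L1|, |L2|)."""
    sgn = lambda v: 1.0 if v >= 0 else -1.0
    return -sgn(L1) * sgn(L2) * min(abs(L1), abs(L2))
```

Both forms reproduce the limiting cases quoted above: a perfectly reliable second bit flips the sign of the first (L [+] ∞ = -L), while a completely unreliable one erases all information (L [+] 0 = 0).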

Product Code Example
This example uses a simple concatenated code: a two-dimensional product code, as depicted in Figure 8.22. We compute the soft output LLR (8.71) by computing each of its three components, L_c(x), L(d), and L_e(d^); the final decision is based on the soft output LLR L(d^).
k1: row data bits; n1 - k1: row parity bits
k2: column data bits; n2 - k2: column parity bits
Figure 8.22 Product codes

Iterative Decoding Algorithm

Two-Dimensional Single-Parity Code Example
Figure 8.23 describes a two-dimensional single-parity code. The relationships between the data and parity bits within each row (or column) are
p_ij = d_i ⊕ d_j   (8.75)
d_i = d_j ⊕ p_ij   (8.76)
Transmitted sequence: d1 d2 d3 d4 p12 p34 p13 p24
Figure 8.23 (a) Encoder output binary bits
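Assuming each parity bit is the modulo-2 sum of the two data bits its subscripts name (which matches the transmitted-sequence layout above; the function name is mine), the encoder can be sketched as:

```python
def encode_product(d):
    """Encode 4 data bits d = [d1, d2, d3, d4] (0/1) into the sequence
    d1 d2 d3 d4 p12 p34 p13 p24, with p_ij = d_i XOR d_j."""
    d1, d2, d3, d4 = d
    p12 = d1 ^ d2   # row 1 parity
    p34 = d3 ^ d4   # row 2 parity
    p13 = d1 ^ d3   # column 1 parity
    p24 = d2 ^ d4   # column 2 parity
    return [d1, d2, d3, d4, p12, p34, p13, p24]
```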

16 Decoder-input LLR
Assuming an AWGN channel, the LLR for the channel measurement of a signal x_k at time k is
L_c(x_k) = ln [ p(x_k | d = +1) / p(x_k | d = -1) ] = (2/σ²) x_k
Figure 8.23(b) Decoder-input LLR L_c(x)
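For unit-amplitude antipodal signaling in AWGN with noise variance sigma^2, the channel LLR reduces to a simple scaling of the received sample, since the quadratic terms of the two Gaussian exponents cancel (a standard result; sketch below with an illustrative function name):

```python
def channel_llr(x_k, sigma2):
    """L_c(x_k) = ln[p(x_k | d=+1) / p(x_k | d=-1)] for Gaussian
    likelihoods with means +/-1 and variance sigma2; expanding the
    exponents, the x_k^2 terms cancel, leaving 2 * x_k / sigma2."""
    return 2.0 * x_k / sigma2
```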

Extrinsic Likelihood
For the two-dimensional product code in this example, the soft output for the received signal corresponding to data bit d1 is
L(d^1) = L_c(x1) + L(d1) + { [L_c(x2) + L(d2)] [+] L_c(x12) }   (8.82)
and in general the extrinsic LLR contribution of the code is
L_e(d^i) = [L_c(xj) + L(dj)] [+] L_c(xij)   (8.83)
formed from the other data bit d_j sharing a parity constraint with d_i and the corresponding parity bit p_ij.

Computing the Extrinsic Likelihood
The horizontal extrinsic LLR L_eh(d^i) for each data bit is formed from the other data bit in its row and the row parity, and the vertical extrinsic LLR L_ev(d^i) from the other data bit in its column and the column parity; for example,
L_eh(d^1) = [L_c(x2) + L(d2)] [+] L_c(x12)
L_ev(d^1) = [L_c(x3) + L(d3)] [+] L_c(x13)

Computing the Extrinsic Likelihood
Four iterations of decoding yield the final values of L(d^).
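Putting the pieces together, an iterative decoder for this 2x2 example might look as follows. This is my reconstruction of the procedure under the layout d1 d2 d3 d4 p12 p34 p13 p24, using the exact [+] operation of (8.72): horizontal and vertical passes alternate, each feeding its extrinsic LLRs to the other as the a priori term, and the final soft output is L_c + L_eh + L_ev. Names and index tables are illustrative.

```python
import math

def llr_add(L1, L2):
    """Exact log-likelihood addition L1 [+] L2, Eq. (8.72)."""
    return math.log((math.exp(L1) + math.exp(L2)) / (1.0 + math.exp(L1 + L2)))

# For each data bit index 0..3 (d1..d4), the index of the other data bit
# in its row/column and of the shared parity bit, in the layout
# d1 d2 d3 d4 p12 p34 p13 p24 (0-based positions 0..7).
ROWS = {0: (1, 4), 1: (0, 4), 2: (3, 5), 3: (2, 5)}
COLS = {0: (2, 6), 1: (3, 7), 2: (0, 6), 3: (1, 7)}

def decode_product(Lc, iterations=4):
    """Iterative decoding of the 2D single-parity product code.

    Lc: list of 8 channel LLRs for d1..d4, p12, p34, p13, p24.
    Returns the final soft outputs L(d^) = Lc + Leh + Lev for d1..d4.
    """
    Leh = [0.0] * 4
    Lev = [0.0] * 4
    for _ in range(iterations):
        # Horizontal pass: a priori term on the "other" bit comes from Lev.
        Leh = [llr_add(Lc[j] + Lev[j], Lc[p]) for i, (j, p) in sorted(ROWS.items())]
        # Vertical pass: a priori term comes from the fresh Leh.
        Lev = [llr_add(Lc[j] + Leh[j], Lc[p]) for i, (j, p) in sorted(COLS.items())]
    return [Lc[i] + Leh[i] + Lev[i] for i in range(4)]
```

For a clean channel (e.g. the all-zeros codeword sent as -1 volts, so every channel LLR is negative), each iteration makes the soft outputs more negative, i.e. the decisions become more reliable; a single moderately corrupted data sample is pulled back to the correct sign by the extrinsic terms from its row and column.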

Encoding with Recursive Systematic Codes
Nonsystematic convolutional (NSC) code: the codewords are obtained by convolving the input sequence with each of the generator sequences.
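As a sketch (assuming the common rate-1/2, K = 3 NSC with generators g1 = 111 and g2 = 101, a frequent textbook choice; the figure's actual generators may differ), each input bit produces two output bits by modulo-2 convolution with the generator taps:

```python
def nsc_encode(bits, g1=(1, 1, 1), g2=(1, 0, 1)):
    """Nonsystematic convolutional encoder, rate 1/2.

    The shift register holds the current bit plus K-1 past bits;
    each output is the XOR of the register positions selected by g1, g2.
    """
    K = len(g1)
    reg = [0] * K
    out = []
    for b in bits:
        reg = [b] + reg[:-1]                           # shift in the new bit
        u = sum(t * r for t, r in zip(g1, reg)) % 2    # first code bit
        v = sum(t * r for t, r in zip(g2, reg)) % 2    # second code bit
        out.extend([u, v])
    return out
```

The impulse response of this encoder is just the interleaved generator sequences, which is a quick sanity check on any implementation.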

Encoding with Recursive Systematic Codes
Recursive Systematic Convolutional (RSC) Code
Figure 8.25 (a) Recursive Systematic Convolutional (RSC) code (b) Trellis structure for the RSC code in (a)

22 Recursive Systematic Convolutional Code
Verifying the trellis structure of the RSC code

23 Recursive Systematic Convolutional Code
Finding the output codewords corresponding to the input {d_k} =
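Assuming the RSC of Figure 8.25 is the systematic recoding of a K = 3 NSC with generators (7, 5) in octal, i.e. feedback taps 1 + D + D^2 and feed-forward taps 1 + D^2 (my assumption, since the figure is not reproduced in this transcript), the codewords for any input sequence can be generated as:

```python
def rsc_encode(bits):
    """Rate-1/2 RSC encoder with generators (1, g2/g1), where
    g1 = 1 + D + D^2 (feedback) and g2 = 1 + D^2 (feed-forward).

    a_k = d_k XOR a_{k-1} XOR a_{k-2}; outputs per input bit are the
    systematic bit d_k and the parity bit v_k = a_k XOR a_{k-2}."""
    a1 = a2 = 0                 # register contents a_{k-1}, a_{k-2}
    out = []
    for d in bits:
        a = d ^ a1 ^ a2         # feedback sum
        v = a ^ a2              # feed-forward parity
        out.extend([d, v])
        a1, a2 = a, a1          # shift the register
    return out
```

Because the code is systematic, every even-position output bit equals the corresponding input bit, which makes the encoder easy to verify against a trellis diagram.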

Concatenation of RSC Codes Figure 8.26 Parallel concatenation of two RSC encoders

25 Question
The figure below illustrates a rate-1/2 RSC code with constraint length K = 4. Form a table that describes all possible transitions, and use the table to draw the trellis section.
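The generator polynomials of the figure are not reproduced in this transcript, so as an illustration assume feedback g1 = 1 + D + D^3 (15 octal) and feed-forward g2 = 1 + D^2 + D^3 (13 octal); the same enumeration works for any K = 4 choice. The transition table lists, for each of the 2^(K-1) = 8 states and each input bit, the next state and the output pair:

```python
def rsc_transitions():
    """Enumerate all state transitions of a rate-1/2, K = 4 RSC code with
    assumed feedback 1 + D + D^3 and feed-forward 1 + D^2 + D^3.

    The state is the register contents (a_{k-1}, a_{k-2}, a_{k-3}).
    Returns rows of the form (state, input d, next state, output (d, v))."""
    table = []
    for s in range(8):
        a1, a2, a3 = (s >> 2) & 1, (s >> 1) & 1, s & 1
        for d in (0, 1):
            a = d ^ a1 ^ a3      # feedback sum: taps 1, D, D^3
            v = a ^ a2 ^ a3      # parity: taps 1, D^2, D^3
            table.append(((a1, a2, a3), d, (a, a1, a2), (d, v)))
    return table
```

Each state has exactly two outgoing branches (d = 0 and d = 1), and thanks to the recursive feedback every state also has exactly two incoming branches, so the trellis section can be drawn directly from the 16 rows of the table.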
