III. Turbo Codes

Applications of Turbo Codes
Worldwide applications & standards: data storage systems, DSL modems / optical communications, 3G (third generation) mobile communications, Digital Video Broadcast (DVB), satellite communications, IEEE 802.16 WiMAX, …

Basic Concept of Turbo Codes
Invented in 1993 by Claude Berrou, Alain Glavieux and Punya Thitimajshima. Basic structure: a) Recursive Systematic Convolutional (RSC) codes; b) parallel concatenation with interleaving; c) iterative decoding.

Recursive Systematic Convolutional Codes
A non-systematic convolutional code with generators g1 = [1 1 1] and g2 = [1 0 1] has generator matrix G = [g1 g2]. The corresponding recursive systematic convolutional (RSC) code uses G = [1 g2/g1]: output c1 is the systematic (information) bit and output c2 is the parity bit produced by the feedback shift register. [Encoder circuit diagrams omitted.]
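As a concreteness check, the RSC encoder above can be sketched in a few lines of Python (function and variable names are illustrative, not from the slides), with g1 = [1 1 1] as the feedback polynomial and g2 = [1 0 1] as the feedforward polynomial:

```python
def rsc_encode(bits, state=(0, 0)):
    """Rate-1/2 RSC encoder: G = [1, g2/g1], g1 = 1+D+D^2, g2 = 1+D^2."""
    systematic, parity = [], []
    s1, s2 = state                 # two-bit shift register (s1 is the newest bit)
    for u in bits:
        a = u ^ s1 ^ s2            # feedback bit: g1 taps on input and register
        systematic.append(u)       # c1: systematic output equals the input bit
        parity.append(a ^ s2)      # c2: g2 taps (a_k + a_{k-2})
        s1, s2 = a, s1             # shift the register
    return systematic, parity, (s1, s2)
```

For example, rsc_encode([1, 0, 0, 0]) returns systematic [1, 0, 0, 0] and parity [1, 1, 1, 0], ending in state (1, 0).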

Non-Recursive Encoder Trellis Diagram
[Trellis diagram over states s0 (0 0), s1 (1 0), s2 (0 1), s3 (1 1); branches are labeled with the encoder output bits 00, 01, 10, 11.]

Systematic Recursive Encoder Trellis Diagram
[Trellis diagram over the same states s0 (0 0) through s3 (1 1); branch labels give the output bits.]

Non-Recursive & Recursive Encoders
Non-recursive and recursive encoders have the same trellis diagram structure; the difference between them is in the mapping of information bits to codewords. Recursive encoders generally provide a better weight distribution for the code.

Parallel Concatenation with Interleaving
A rate-1/3 turbo encoder consists of two component RSC encoders in parallel, separated by an interleaver π. The job of the interleaver is to de-correlate the encoding processes of the two encoders. Usually the two component encoders are identical. xk: systematic information bit; yk(1): parity bit from the first RSC encoder; yk(2): parity bit from the second RSC encoder. [Encoder block diagram omitted.]
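The parallel concatenation above can be sketched as follows; the component encoder and the pseudo-random interleaver are illustrative stand-ins (a minimal sketch, not a standards-compliant encoder):

```python
import random

def rsc_parity(bits, state=(0, 0)):
    """Parity stream of the rate-1/2 RSC component encoder (g1 = 1+D+D^2, g2 = 1+D^2)."""
    parity = []
    s1, s2 = state
    for u in bits:
        a = u ^ s1 ^ s2            # feedback bit
        parity.append(a ^ s2)      # parity output
        s1, s2 = a, s1             # shift the register
    return parity

def turbo_encode(u, perm):
    """Rate-1/3 turbo encoder: emits one (xk, yk1, yk2) triple per information bit."""
    y1 = rsc_parity(u)                         # encoder 1 sees the natural order
    y2 = rsc_parity([u[i] for i in perm])      # encoder 2 sees the interleaved order
    return list(zip(u, y1, y2))

u = [1, 0, 1, 1, 0, 0, 1, 0]
perm = list(range(len(u)))
random.seed(0)
random.shuffle(perm)                           # toy pseudo-random interleaver
codeword = turbo_encode(u, perm)               # 8 triples -> 24 coded bits
```

Note the systematic bit is sent only once; both decoders later share it, which is what makes the parallel concatenation rate 1/3 rather than 1/4.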

Interleaving
Permutation of the data to de-correlate the encoding processes of the two encoders. If encoder 1 generates a low-weight codeword, then hopefully the interleaved input generates a higher-weight codeword from encoder 2. By avoiding low-weight codewords, the BER performance of turbo codes can improve significantly.

Turbo Decoder
Two component RSC decoders iteratively exchange information through the interleaver π and de-interleaver π-1 before making a final decision on the transmitted bits. [Decoder block diagram omitted.]

Turbo Codes Performance
[BER performance curves omitted.] Source: Principles of Turbo Codes, Keattisak Sripiman, http://www.kmitl.ac.th/dslabs/TurboCodes

Turbo Decoding
Turbo decoders rely on probabilistic (soft) decoding by the component RSC decoders, iteratively exchanging soft-output information between decoder 1 and decoder 2 before making a final decision on the transmitted bits. As the number of iterations grows, the decoding performance improves.

Turbo Coded System Model
[Block diagram: u feeds RSC Encoder 1 directly and RSC Encoder 2 via an N-bit interleaver π; outputs are multiplexed, sent over the channel, demultiplexed, and turbo decoded into u*.]
u = [u1, …, uk, …, uN]: vector of N information bits
xs = [x1s, …, xks, …, xNs]: vector of N systematic bits after turbo-coding of u
x1p = [x11p, …, xk1p, …, xN1p]: vector of N parity bits from the first encoder after turbo-coding of u
x2p = [x12p, …, xk2p, …, xN2p]: vector of N parity bits from the second encoder after turbo-coding of u
x = [x1s, x11p, x12p, …, xNs, xN1p, xN2p]: vector of length 3N of turbo-coded bits for u
y = [y1s, y11p, y12p, …, yNs, yN1p, yN2p]: vector of 3N received symbols corresponding to the turbo-coded bits of u
ys = [y1s, …, yks, …, yNs]: vector of N received symbols corresponding to the systematic bits in xs
y1p = [y11p, …, yk1p, …, yN1p]: vector of N received symbols corresponding to the first encoder's parity bits in x1p
y2p = [y12p, …, yk2p, …, yN2p]: vector of N received symbols corresponding to the second encoder's parity bits in x2p
u* = [u1*, …, uk*, …, uN*]: vector of N turbo decoder decision bits corresponding to u
Pr[uk | y] is known as the a posteriori probability of the kth information bit.

Log-Likelihood Ratio (LLR)
The LLR of an information bit uk is given by:
L(uk) = ln( Pr[uk = +1 | y] / Pr[uk = -1 | y] )
given that Pr[uk = +1] + Pr[uk = -1] = 1.
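Since the two probabilities sum to one, the LLR and the bit probability determine each other. A small sketch (helper names are mine, not from the slides):

```python
import math

def llr(p_plus):
    """LLR from Pr[u = +1], with Pr[u = -1] = 1 - Pr[u = +1]."""
    return math.log(p_plus / (1.0 - p_plus))

def prob_plus(L):
    """Invert the LLR: Pr[u = +1] = 1 / (1 + e^{-L})."""
    return 1.0 / (1.0 + math.exp(-L))

def hard_decision(L):
    """The sign of the LLR gives the final bit decision."""
    return +1 if L >= 0 else -1
```

A positive LLR means uk = +1 is more likely, a negative one means uk = -1, and the magnitude measures the decoder's confidence.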

Maximum A Posteriori (MAP) Algorithm
[System model block diagram repeated from the previous slide.]

Some Definitions & Conventions
[System model diagram, plus one trellis stage showing the transition from state Sk-1 to state Sk driven by input uk with parity output ykp; states s0, s1, s2, s3; branches labeled uk = -1 and uk = +1.]

Derivation of LLR
Define S(i) as the set of pairs of states (s', s) such that the transition from Sk-1 = s' to Sk = s is caused by the input uk = i, i = 0, 1, i.e.:
S(0) = {(s0, s0), (s1, s3), (s2, s1), (s3, s2)}
S(1) = {(s0, s1), (s1, s2), (s2, s0), (s3, s3)}
Remember Bayes' rule: Pr[a, b] = Pr[a | b] Pr[b].
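The sets S(0) and S(1) can be checked mechanically by enumerating the RSC trellis (a state is the register contents (s1, s2), newest bit first; the enumeration below is my own sketch):

```python
# States as register contents (s1, s2): s0=(0,0), s1=(1,0), s2=(0,1), s3=(1,1).
STATES = [(0, 0), (1, 0), (0, 1), (1, 1)]

def next_state(idx, u):
    """Successor state index for input bit u in the RSC trellis (g1 = 1+D+D^2)."""
    s1, s2 = STATES[idx]
    a = u ^ s1 ^ s2                 # feedback bit enters the register
    return STATES.index((a, s1))    # register shifts: (a, old s1)

# Group transitions (s', s) by the input bit that causes them.
S = {u: [(i, next_state(i, u)) for i in range(4)] for u in (0, 1)}
print(S[0])   # [(0, 0), (1, 3), (2, 1), (3, 2)]
print(S[1])   # [(0, 1), (1, 2), (2, 0), (3, 3)]
```

The printed pairs match S(0) and S(1) on the slide, with state indices in place of the labels s0 through s3.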

Derivation of LLR
Define:
αk(s) = Pr[Sk = s, y1, …, yk] (forward metric)
βk(s) = Pr[yk+1, …, yN | Sk = s] (backward metric)
γk(s', s) = Pr[Sk = s, yk | Sk-1 = s'] (branch metric)

Derivation of LLR depends only on Sk and is independent on Sk-1, yk and Sk , yk depends only on Sk-1 and is independent on

Derivation of LLR
Combining the above, Pr[Sk-1 = s', Sk = s, y] = αk-1(s') γk(s', s) βk(s), so that
L(uk) = ln [ Σ(s',s)∈S(1) αk-1(s') γk(s', s) βk(s) / Σ(s',s)∈S(0) αk-1(s') γk(s', s) βk(s) ]

Derivation of LLR
[Trellis diagram spanning the whole block: stages S0, …, Sk-1, Sk, Sk+1, …, SN over states s0 through s3, aligned with the received sequence y = y1, …, yk-1, yk, yk+1, …, yN.]

Computation of αk(s): Example
[Worked numerical example on the trellis omitted.]

Computation of αk(s): Forward Recursive Equation
αk(s) = Σs' αk-1(s') γk(s', s)
Given the values of γk(s', s) for all indices k, the probabilities αk(s) can be computed by forward recursion. The initial condition α0(s) depends on the initial state of the convolutional encoder; the encoder usually starts at state s0, so α0(s0) = 1 and α0(s) = 0 for s ≠ s0.
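The forward recursion can be sketched as below; the branch metrics γk are supplied as 4×4 matrices indexed [s', s] (zero where no branch exists), and the per-step normalization is a common implementation choice to avoid numerical underflow, not part of the formula itself:

```python
import numpy as np

def forward(gammas):
    """alpha_k(s) = sum over s' of alpha_{k-1}(s') * gamma_k(s', s)."""
    n = gammas[0].shape[0]
    alpha = np.zeros((len(gammas) + 1, n))
    alpha[0, 0] = 1.0                        # encoder starts in state s0
    for k, g in enumerate(gammas):
        alpha[k + 1] = alpha[k] @ g          # one forward step over all states
        alpha[k + 1] /= alpha[k + 1].sum()   # normalize; the scale cancels in the LLR
    return alpha
```

Each row alpha[k] holds the (normalized) forward metrics for all four states at trellis stage k.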

Computation of βk(s): Backward Recursive Equation
βk-1(s') = Σs γk(s', s) βk(s)
Given the values of γk(s', s) for all indices k, the probabilities βk(s) can be computed by backward recursion. The initial condition βN(s) depends on the final state of the trellis: the first encoder is usually terminated at state s0 (βN(s0) = 1, βN(s) = 0 otherwise), while the second encoder usually has an open trellis (βN(s) equal for all s).
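The matching backward recursion, with the two initializations the slide mentions (terminated first encoder vs. open trellis for the second), can be sketched as:

```python
import numpy as np

def backward(gammas, terminated=True):
    """beta_{k-1}(s') = sum over s of gamma_k(s', s) * beta_k(s)."""
    n = gammas[0].shape[0]
    beta = np.zeros((len(gammas) + 1, n))
    if terminated:
        beta[-1, 0] = 1.0        # first encoder is driven back to state s0
    else:
        beta[-1, :] = 1.0 / n    # open trellis: all final states equally likely
    for k in range(len(gammas) - 1, -1, -1):
        beta[k] = gammas[k] @ beta[k + 1]   # one backward step over all states
        beta[k] /= beta[k].sum()            # same normalization trick as for alpha
    return beta
```

Together, forward(gammas) and backward(gammas) supply the αk-1(s') and βk(s) terms that the LLR sums over.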

Computation of γk(s', s)
Note that γk(s', s) = Pr[Sk = s, yk | Sk-1 = s'] = Pr[Sk = s | Sk-1 = s'] Pr[yk | Sk-1 = s', Sk = s] = Pr[uk] Pr[yk | xk], where uk is the input bit that drives the transition (s', s) and xk the corresponding transmitted symbols.

Computation of γk(s', s)
For a memoryless channel, Pr[yk | xk] factors over the systematic and parity symbols on the branch, so γk(s', s) is computed per trellis branch from the a priori probability of uk and the channel transition probabilities.
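For BPSK over an AWGN channel, a standard form of the branch metric drops the constant factors common to all branches (they cancel in the LLR ratio) and uses the channel reliability Lc = 4Es/N0. A minimal sketch, under those assumptions:

```python
import math

def gamma_branch(y_s, y_p, x_s, x_p, Lc, p_u=0.5):
    """gamma_k(s', s) ∝ Pr[u_k] * exp((Lc/2) * (y_s*x_s + y_p*x_p)).

    x_s, x_p in {+1, -1}: systematic/parity symbols on the branch (s', s);
    y_s, y_p: the matched received values; p_u: a priori probability of u_k;
    Lc = 4*Es/N0: channel reliability. Common constant factors are dropped.
    """
    return p_u * math.exp(0.5 * Lc * (y_s * x_s + y_p * x_p))
```

A branch whose symbols agree in sign with the received values gets a larger metric than a mismatched one, which is exactly what steers the forward and backward recursions toward the likely path.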