Turbo Codes COE 543 Mohammed Al-Shammeri

Agenda PProject objectives and motivations EError Correction Codes TTurbo Codes Technology TTurbo decoding TTurbo Codes Performance TTurbo Coding Application CConclusion Remarks

Introduction  Motivation: Can we have error-free communication, as far as possible? Can we reach the Shannon limit?  Objectives: Study channel coding Understand channel capacity Find ways to increase the data rate Provide a reliable communication link

Turbo Codes History  IEEE International Communications Conference, 1993, in Geneva  Berrou and Glavieux: ‘Near Shannon Limit Error-Correcting Coding and Decoding: Turbo-Codes’  Provided virtually error-free communication at data-rate/power efficiencies beyond what most experts thought possible

Turbo Codes History…  Double the data throughput at a given power  Or work with half the power  The two authors were not well known; most thought they had made a calculation error  It turned out they were right  Many companies adopted the codes, and new companies started: TurboConcept and iCoding  Within 0.5 dB of the Shannon limit at P_e = 10^-6

Communication System  Structured, modular approach  Various components with defined functions: formatting/digitization, source coding, channel coding, multiplexing, modulation and access techniques, on both the transmit and receive sides

Channel Coding  Accounts for the channel  Can be categorized into waveform signal design (better-detectable signals) and structured sequences (added redundancy)  Objective: provide coded signals with better distance properties

Binary Symmetric Channel  Special case of the DMC: discrete input and discrete output, both drawn from {0, 1}  Memoryless: each symbol is affected independently  Hard-decision decoding  The crossover probability p is related to the bit energy
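A minimal sketch of the channel model described above, assuming Python with NumPy; the function name bsc and the value p = 0.1 are illustrative, not part of the slides.

```python
import numpy as np

def bsc(bits, p, rng=None):
    """Pass a binary array through a binary symmetric channel:
    each bit is flipped independently with crossover probability p."""
    rng = rng or np.random.default_rng()
    flips = rng.random(bits.shape) < p               # Bernoulli(p) error pattern
    return bits ^ flips.astype(bits.dtype)           # modulo-2 addition of the errors

# Example: send 10 random bits over a BSC with p = 0.1
rng = np.random.default_rng(0)
tx = rng.integers(0, 2, size=10)
rx = bsc(tx, p=0.1, rng=rng)
print(tx, rx, "bit errors:", int((tx != rx).sum()))
```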

Gaussian Channel  Discrete inputs with a continuous-valued output  Noise is added to the signals passing through the channel  The noise is a Gaussian random variable with zero mean and variance σ²  The resulting conditional pdf, the likelihood of u_k, is p(y_k | u_k) = (1 / (σ√(2π))) · exp(−(y_k − u_k)² / (2σ²))
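A small sketch of that likelihood for BPSK signalling over the Gaussian channel; the 0 → −1, 1 → +1 mapping and the example values are assumptions made for illustration.

```python
import numpy as np

def awgn_likelihood(y, u, sigma):
    """Likelihood p(y | u) of a received sample y given transmitted bit u,
    assuming BPSK mapping 0 -> -1, 1 -> +1 and Gaussian noise N(0, sigma^2)."""
    x = 2 * u - 1                                    # BPSK symbol
    return np.exp(-(y - x) ** 2 / (2 * sigma ** 2)) / (sigma * np.sqrt(2 * np.pi))

# A sample received at y = 0.8 is far more likely under u = 1 than under u = 0
print(awgn_likelihood(0.8, 1, 1.0), awgn_likelihood(0.8, 0, 1.0))
```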

Why Use ECC?  Consider the following trade-offs: Error performance vs. bandwidth  high redundancy consumes bandwidth Power vs. bandwidth  coding allows a reduction in E_b/N_0 Data rate vs. bandwidth  the gain can be spent on a higher data rate

Shannon Theory  Founded information theory  Stated the maximum data rate of a channel for a given error rate and power  Did not say how to achieve it!  Clue: long code words (in number of bits) with good distance properties
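As a concrete reference point, the capacity limit referred to above can be evaluated directly; a brief Python sketch (the bandwidth and SNR figures are only examples).

```python
import math

def shannon_capacity(bandwidth_hz, snr_linear):
    """Shannon capacity C = B * log2(1 + S/N), in bits per second."""
    return bandwidth_hz * math.log2(1 + snr_linear)

# Example: 1 MHz of bandwidth at an SNR of 10 dB (linear value 10)
print(shannon_capacity(1e6, 10 ** (10 / 10)))        # about 3.46 Mbit/s
```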

Error Correction Mechanisms  Backward error correction: error detection capability, communication cost of retransmissions, problematic for real-time traffic  Forward error correction: detection and correction of errors, more complex receivers, DSP cost

Forward Error Correction  Block codes: data split into blocks, checks are contained within the block  Convolutional codes: bit-streamed data, involve memory  Turbo codes: use convolutional codes with special properties

Structured Redundancy  A k-bit input word enters the channel encoder and an n-bit codeword comes out  Redundancy = (n − k)  Code rate = k/n  The stream of codewords forms the code sequence

Coding Advantages  [Plot: error probability versus E_b/N_0 in dB for an uncoded and a coded system; at a given error probability, the horizontal gap between the two curves is the coding gain]

Coding Disadvantages  More bandwidth due to the redundancy  Processing delay  Design complexity

Error Correction  Codewords: points in a hyperspace  Noise can alter some bits: a displacement of the point  If two codewords are close to each other and an error moves one onto the other, a decoding error occurs  Keep large differences between codewords  But beware of decoder complexity!

Hyperspace and Codewords  [Figure: codewords as points in hyperspace separated by a Hamming distance; received points that stay within a codeword’s decision region still decode to the same word]
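The distance notion used in the figure is easy to make concrete; a short Python sketch with made-up 7-bit codewords.

```python
def hamming_distance(a, b):
    """Number of bit positions in which two equal-length words differ."""
    assert len(a) == len(b)
    return sum(x != y for x, y in zip(a, b))

# Two codewords at distance 3: a single bit error on the first codeword is
# still closer to it than to the second, so it is decoded correctly.
c1 = [0, 0, 0, 0, 0, 0, 0]
c2 = [1, 1, 1, 0, 0, 0, 0]
received = [1, 0, 0, 0, 0, 0, 0]
print(hamming_distance(c1, c2), hamming_distance(received, c1), hamming_distance(received, c2))
```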

Good Codes  Random (random-like) codes  If we use 1000 bits per word, there are 2^1000 possible words, an astronomical number  No way to handle this with conventional coding schemes

Turbo codes  30 years ago. Forney Nonsystematic Nonrecursive combination of conv. Encoders  Berrou et al at 1993 Recursive Systematic  Based on pseudo random  Works better for high rates or high level of noise  Return to zero sequences

Turbo Encoder  [Diagram: the input goes straight to the output as the systematic part X and into one RSC encoder producing parity Y1; a pseudo-random interleaver reorders the input for a second RSC encoder producing parity Y2]

Turbo codes  Parallel concatenated The k-bit block is encoded N times with different versions (order) Pro the sequence remains RTZ is 1/2 Nv Randomness with 2 encoders; error pro of Permutations are to fix d min

Recursive Systematic Coders  [Diagram: the data stream is copied in natural order to the output (systematic bits), while a recursive shift register (S1, S2, S3) with feedback computes the parity bits]
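A minimal sketch of such an encoder and of its parallel concatenation through a pseudo-random interleaver, in Python. The generator pair (feedback 1+D+D², feedforward 1+D²) and the helper names rsc_encode / turbo_encode are illustrative choices, not taken from the slides.

```python
import random

def rsc_encode(bits, state=(0, 0)):
    """Rate-1/2 recursive systematic convolutional encoder with memory 2
    (illustrative (1, 5/7)-octal generators). Returns (systematic, parity)."""
    s1, s2 = state
    parity = []
    for u in bits:
        a = u ^ s1 ^ s2          # recursive feedback bit (1 + D + D^2)
        parity.append(a ^ s2)    # feedforward parity (1 + D^2)
        s1, s2 = a, s1           # shift the register
    return list(bits), parity

def turbo_encode(bits, seed=1):
    """Parallel concatenation: the same block is encoded twice, the second
    time in pseudo-randomly interleaved order. Output streams: X, Y1, Y2."""
    x, y1 = rsc_encode(bits)
    perm = list(range(len(bits)))
    random.Random(seed).shuffle(perm)            # pseudo-random interleaver
    _, y2 = rsc_encode([bits[i] for i in perm])
    return x, y1, y2

# Example: an 8-bit block, giving a rate-1/3 codeword before any puncturing
print(turbo_encode([1, 0, 1, 1, 0, 0, 1, 0]))
```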

Return-to-Zero Sequences  A non-recursive encoder’s state returns to zero after ν consecutive ‘0’ inputs  An RSC encoder returns to zero with probability P = 1/2^ν  If one wants to transform the convolutional code into a block code, this behaviour is automatically built in  The initial state i will repeat after encoding k bits

Convolutional Encoders  [Diagram: the input stream feeds a 4-stage shift register; modulo-2 adders combine selected taps to produce the serialized output stream]
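For contrast with the recursive encoder above, a short sketch of a plain feedforward convolutional encoder; the (7, 5)-octal taps are a common textbook choice used here only as an example.

```python
def conv_encode(bits, generators=((1, 1, 1), (1, 0, 1))):
    """Non-recursive convolutional encoder. Each generator lists the taps on
    [current bit, D, D^2, ...]; for every input bit, one output bit per
    generator is appended to the serialized stream."""
    memory = [0] * (len(generators[0]) - 1)      # shift-register contents
    out = []
    for u in bits:
        window = [u] + memory
        for g in generators:
            out.append(sum(t & b for t, b in zip(g, window)) % 2)  # modulo-2 adder
        memory = [u] + memory[:-1]               # shift
    return out

print(conv_encode([1, 0, 1, 1]))                 # -> [1, 1, 1, 0, 0, 0, 0, 1]
```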

Turbo Decoding

 Criterion: for n probabilistic processors working together to estimate common symbols, all of them should eventually agree on the symbols, with the same probabilities a single (optimal) decoder would produce

Turbo Decoder

 The inputs to the decoders are the log-likelihood ratios (LLR) of the individual symbols d.  The LLR of a symbol d is defined (Berrou) as L(d) = log [ P(d = 1 | observation) / P(d = 0 | observation) ]

Turbo Decoder  The SISO decoder reevaluates the LLR utilizing the local Y1 and Y2 redundancies to improve the confidence The value z is the extrinsic value determined by the same decoder and it is negative if d is 0 and it is positive if d is 1 The updated LLR is fed into the other decoder and which calculates the z and updates the LLR for several iterations After several iterations, both decoders converge to a value for that symbol.

Turbo Decoding  Assume U i : modulating bit {0,1} Y i : received bit, output of a correlator. Can take any value (soft). Turbo Decoder input is the log likelihood ratio  R(u i ) = log [ P(Y i |U i =1)/(P(Y i |U i =0)]  For BPSK, R(u i ) =2 Yi/ (var) 2 For each data bit, calculate the LLR given that a sequence of bit were sent

Turbo Decoding  Compare the LLR output, to see if the estimate is towards 0 or 1 then take HD

Soft-In/Soft-Out Processor  At the heart of the decoder  Represents all possible states of the encoder (a trellis)  The number of states at a particular clock tick is 2^n, where n = number of flip-flops in the shift register  The trellis shows the current state and the possible paths leading to this state

SISO  Label all branches with a branch metric, a function of the processor inputs  Obtain the LLR for each data bit by traversing the trellis  Two algorithms: the Soft-Output Viterbi Algorithm (SOVA) and Maximum A Posteriori (MAP), usually in its log-MAP form
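A sketch of the branch-metric (log-domain gamma) computation commonly used in log-MAP / max-log-MAP component decoders; the ±1 branch labels, the variable names and the example values are illustrative assumptions.

```python
def branch_metric(u, p, y_sys, y_par, la, lc):
    """Log-domain branch metric for a trellis branch labelled with systematic
    bit u and parity bit p (both mapped to +/-1). `la` is the a-priori LLR of
    the data bit; `lc = 2/sigma^2` is the channel reliability for BPSK/AWGN."""
    return 0.5 * u * la + 0.5 * lc * (u * y_sys + p * y_par)

# A branch carrying u = +1, p = -1, with mildly positive observations and no
# a-priori information (la = 0).
print(branch_metric(+1, -1, y_sys=0.6, y_par=0.1, la=0.0, lc=2.0))
```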

How Do They Work? (© IEEE Spectrum)

Turbo Codes Performance

Turbo Codes Applications  Deep-space exploration: the European SMART-1 probe; JPL equipped Pathfinder in 1997  Mobile 3G systems: in use in Japan, UMTS, NTT DoCoMo  Turbo codes carry pictures/video/mail; convolutional codes carry voice

Conclusion: End of the Search  Turbo codes approach the theoretical limits to within a small gap  They gave rise to renewed interest in related codes: Low-Density Parity-Check (LDPC) codes  Decoding delay still needs improvement