Iterative decoding If the output of the outer decoder were reapplied to the inner decoder it would detect that some errors remained, since the columns.

Iterative decoding If the output of the outer decoder were reapplied to the inner decoder it would detect that some errors remained, since the columns would not be codewords of the inner code Iterative decoder: to reapply the decoded word not just to the inner code, but also to the outer, and repeat as many times as necessary. However, it is clear that this would be in danger of simply generating further errors. One further ingredient is required for the iterative decoder.

Soft-In, Soft-Out (SISO) decoding
The performance of a decoder is significantly enhanced if, in addition to the ‘hard decision’ made by the demodulator on the current symbol, some additional ‘soft information’ on the reliability of that decision is passed to the decoder. For example, if the received signal is close to a decision threshold (say, between 0 and 1) in the demodulator, then that decision has low reliability, and the decoder should be able to change it when searching for the most probable codeword. Making use of this information in a conventional decoder, called soft-decision decoding, leads to a performance improvement of around 2 dB in most cases.
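
As a rough illustration (a minimal Python/NumPy sketch, not part of the original slides), the demodulator can pass on not only the sign of each received sample but also a reliability value; for BPSK over AWGN the channel LLR is $L_c(x) = 2x/\sigma^2$. The noise level and bit pattern below are arbitrary assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# BPSK over AWGN: bit 1 -> +1, bit 0 -> -1 (illustrative mapping)
bits = rng.integers(0, 2, size=8)
symbols = 2 * bits - 1
sigma = 0.8                                   # assumed noise standard deviation
x = symbols + sigma * rng.normal(size=bits.size)

# Hard decision: keep only the sign of each received sample
hard = (x > 0).astype(int)

# Soft information: channel LLR for BPSK in AWGN, L_c(x) = 2x / sigma^2.
# Samples near the decision threshold (x close to 0) get a small |LLR|,
# i.e. low reliability, which a soft-decision decoder can exploit.
llr = 2 * x / sigma**2

for b, xi, h, L in zip(bits, x, hard, llr):
    print(f"sent {b}  received {xi:+.2f}  hard {h}  LLR {L:+.2f}")
```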

SISO decoder
A component decoder that generates ‘soft information’ as well as making use of it. The soft information usually takes the form of a log-likelihood ratio (LLR) for each data bit. The likelihood ratio is the ratio of the probability that a given bit is ‘1’ to the probability that it is ‘0’. If we take the logarithm of this ratio, its sign corresponds to the most probable hard decision on the bit (if it is positive, ‘1’ is most likely; if negative, then ‘0’), and its absolute magnitude is a measure of our certainty about this decision.
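
A minimal sketch of how such an LLR is interpreted (the function name and test values are purely illustrative):

```python
import numpy as np

def llr_to_decision(llr):
    """Interpret an LLR L = log( P(d = 1) / P(d = 0) ).

    The sign gives the hard decision and the magnitude the certainty,
    since P(d = 1) = 1 / (1 + exp(-L)).
    """
    hard = int(llr > 0)                  # positive -> '1', negative -> '0'
    p_one = 1.0 / (1.0 + np.exp(-llr))   # probability that the bit is '1'
    certainty = max(p_one, 1.0 - p_one)  # confidence in the hard decision
    return hard, certainty

for L in (-4.0, -0.3, 0.1, 2.5):
    hard, certainty = llr_to_decision(L)
    print(f"LLR {L:+.1f} -> bit {hard}, certainty {certainty:.2f}")
```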

Likelihood Functions
Bayes’ theorem:
$$P(d = i \mid x) = \frac{p(x \mid d = i)\,P(d = i)}{p(x)}, \qquad i = 1, \dots, M$$
where $P(d = i \mid x)$ is the a posteriori probability (APP), $P(d = i)$ is the a priori probability, $p(x \mid d = i)$ is the conditional pdf of the received signal $x$ given $d = i$, and $p(x)$ is the pdf of the received signal $x$.
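
A small numerical illustration of Bayes’ theorem, with arbitrary assumed priors and likelihood values:

```python
import numpy as np

# Numbers below are assumptions chosen only to illustrate Bayes' theorem.
prior = np.array([0.7, 0.3])        # P(d = i) for i = 1, 2 (a priori)
lik = np.array([0.2, 0.6])          # p(x | d = i) evaluated at the observed x

evidence = np.sum(lik * prior)      # p(x) = sum_i p(x | d = i) P(d = i)
app = lik * prior / evidence        # a posteriori probabilities P(d = i | x)

print("p(x) =", evidence)
print("APP  =", app, "sum =", app.sum())   # the APPs sum to 1
```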

Maximum Likelihood
Let $d_k \in \{+1, -1\}$ be transmitted over an AWGN channel, with received statistic $x_k$. The likelihood functions are
$$\ell_1 = p(x_k \mid d_k = +1), \qquad \ell_2 = p(x_k \mid d_k = -1)$$
The maximum-likelihood hard-decision rule is: choose $d_k = +1$ if $\ell_1 > \ell_2$, and choose $d_k = -1$ if $\ell_2 > \ell_1$.
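
A sketch of the ML rule for the AWGN case, where the likelihoods are Gaussian densities centred at $\pm 1$; the noise level and received value below are assumed for illustration only.

```python
import numpy as np

def gauss_pdf(x, mean, sigma):
    """Gaussian pdf, i.e. the AWGN likelihood p(x | d)."""
    return np.exp(-(x - mean) ** 2 / (2 * sigma ** 2)) / (sigma * np.sqrt(2 * np.pi))

sigma = 1.0        # assumed noise standard deviation
x_k = 0.35         # illustrative received statistic

l1 = gauss_pdf(x_k, +1.0, sigma)   # p(x_k | d_k = +1)
l2 = gauss_pdf(x_k, -1.0, sigma)   # p(x_k | d_k = -1)

# ML hard decision: choose the hypothesis with the larger likelihood.
# For equal-variance Gaussians centred at +1 and -1 this is just sign(x_k).
d_hat = +1 if l1 > l2 else -1
print(f"l1 = {l1:.4f}, l2 = {l2:.4f}, decision = {d_hat:+d}")
```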

Maximum A Posteriori (MAP)
Let $d_k \in \{+1, -1\}$ be transmitted over an AWGN channel, with received statistic $x_k$, and hypotheses $H_1: d_k = +1$ and $H_2: d_k = -1$. The MAP rule is
$$P(d_k = +1 \mid x_k) \;\underset{H_2}{\overset{H_1}{\gtrless}}\; P(d_k = -1 \mid x_k)$$

MAP Likelihood Ratio Test
Applying Bayes’ theorem to the MAP rule (the pdf $p(x_k)$ cancels) gives
$$p(x_k \mid d_k = +1)\,P(d_k = +1) \;\underset{H_2}{\overset{H_1}{\gtrless}}\; p(x_k \mid d_k = -1)\,P(d_k = -1)$$
or, equivalently,
$$\frac{p(x_k \mid d_k = +1)}{p(x_k \mid d_k = -1)} \;\underset{H_2}{\overset{H_1}{\gtrless}}\; \frac{P(d_k = -1)}{P(d_k = +1)}
\qquad\text{and}\qquad
\frac{p(x_k \mid d_k = +1)\,P(d_k = +1)}{p(x_k \mid d_k = -1)\,P(d_k = -1)} \;\underset{H_2}{\overset{H_1}{\gtrless}}\; 1$$
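
The following sketch contrasts the ML and MAP decisions for a sample near the threshold, assuming illustrative unequal priors; weighting the likelihoods by the priors can flip the decision.

```python
import numpy as np

def gauss_pdf(x, mean, sigma):
    return np.exp(-(x - mean) ** 2 / (2 * sigma ** 2)) / (sigma * np.sqrt(2 * np.pi))

sigma = 1.0
x_k = 0.1                      # received sample only slightly above threshold
p_plus, p_minus = 0.3, 0.7     # assumed unequal priors P(d_k = +1), P(d_k = -1)

l1 = gauss_pdf(x_k, +1.0, sigma)   # p(x_k | d_k = +1)
l2 = gauss_pdf(x_k, -1.0, sigma)   # p(x_k | d_k = -1)

# ML ignores the priors; MAP weights each likelihood by its prior.
ml_decision = +1 if l1 > l2 else -1
map_decision = +1 if l1 * p_plus > l2 * p_minus else -1

print("ML :", ml_decision)     # +1, since l1 > l2 at x_k = 0.1
print("MAP:", map_decision)    # -1 here, because P(d_k = -1) is much larger
```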

Log-Likelihood Ratio (LLR)
$$L(d \mid x) = \log\frac{P(d = +1 \mid x)}{P(d = -1 \mid x)}
= \log\frac{p(x \mid d = +1)\,P(d = +1)}{p(x \mid d = -1)\,P(d = -1)}
= \log\frac{p(x \mid d = +1)}{p(x \mid d = -1)} + \log\frac{P(d = +1)}{P(d = -1)}
= L(x \mid d) + L(d)$$
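
A quick numerical check of this decomposition, under assumed noise level, received value, and priors:

```python
import numpy as np

def gauss_pdf(x, mean, sigma):
    return np.exp(-(x - mean) ** 2 / (2 * sigma ** 2)) / (sigma * np.sqrt(2 * np.pi))

sigma, x = 1.0, 0.4            # assumed noise level and received value
p_plus, p_minus = 0.6, 0.4     # assumed a priori probabilities

# Channel term L(x|d) and a priori term L(d)
L_ch = np.log(gauss_pdf(x, +1, sigma) / gauss_pdf(x, -1, sigma))
L_ap = np.log(p_plus / p_minus)

# A posteriori LLR computed directly from the (unnormalised) APPs;
# the common factor p(x) cancels in the ratio.
L_post = np.log((gauss_pdf(x, +1, sigma) * p_plus) /
                (gauss_pdf(x, -1, sigma) * p_minus))

print(L_post, L_ch + L_ap)     # identical: L(d|x) = L(x|d) + L(d)
```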

Log-Likelihood Ratio (LLR), continued
$$L(d \mid x) = L(x \mid d) + L(d), \qquad L'(\hat{d}) = L_c(x) + L(d)$$
where $L_c(x)$ is the LLR of the data at the demodulator output. The soft LLR output for a systematic code is
$$L(\hat{d}) = L'(\hat{d}) + L_e(\hat{d}) = L_c(x) + L(d) + L_e(\hat{d})$$
where $L_e(\hat{d})$ is the extrinsic LLR: the knowledge gained from the decoding process.

SISO Decoder (block diagram)
Inputs: the a priori value $L(d)$ and the channel value $L_c(x)$; together they give the detector a posteriori LLR $L'(\hat{d}) = L_c(x) + L(d)$.
Outputs: the extrinsic value $L_e(\hat{d})$ and the a posteriori output LLR
$$L(\hat{d}) = L'(\hat{d}) + L_e(\hat{d}) = L_c(x) + L(d) + L_e(\hat{d})$$

Iterative decoding algorithm for the product code
1. Set the a priori LLR $L(d) = 0$.
2. Decode horizontally and obtain $L_{eh}(\hat{d}) = L(\hat{d}) - L_c(x) - L(d)$.
3. Set $L(d) = L_{eh}(\hat{d})$ for the vertical decoding.
4. Decode vertically and obtain $L_{ev}(\hat{d}) = L(\hat{d}) - L_c(x) - L(d)$.
5. Set $L(d) = L_{ev}(\hat{d})$ for the horizontal decoding.
6. Repeat steps 2 to 5 to refine the estimates; the final soft output is $L(\hat{d}) = L_c(x) + L_{eh}(\hat{d}) + L_{ev}(\hat{d})$.
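
A minimal sketch of this loop's LLR bookkeeping. Here `siso_decode` is a hypothetical placeholder for a real row/column MAP decoder, so the extrinsic terms it produces are zero; the point is only to show how the channel, a priori, and extrinsic values flow through the steps above.

```python
import numpy as np

def siso_decode(l_channel, l_apriori):
    """Hypothetical stand-in for a row/column SISO decoder.

    A real implementation would run MAP (BCJR) decoding of the component
    code and return the full soft output L(d_hat) for every bit; here the
    inputs are simply passed through so the bookkeeping below can run,
    which makes the extrinsic terms come out as zero.
    """
    return l_channel + l_apriori

def iterative_product_decode(l_c, n_iter=4):
    """LLR bookkeeping of the iterative product-code algorithm above."""
    l_eh = np.zeros_like(l_c)   # horizontal (row) extrinsic LLRs
    l_ev = np.zeros_like(l_c)   # vertical (column) extrinsic LLRs, 0 at start (step 1)

    for _ in range(n_iter):
        # Steps 2-3: decode rows with the vertical extrinsic as a priori;
        # extrinsic = soft output - channel value - a priori value.
        l_ap = l_ev
        l_eh = siso_decode(l_c, l_ap) - l_c - l_ap

        # Steps 4-5: decode columns with the horizontal extrinsic as a priori.
        l_ap = l_eh
        l_ev = siso_decode(l_c, l_ap) - l_c - l_ap

    # Step 6: final soft output combines the channel and both extrinsic terms.
    return l_c + l_eh + l_ev

l_c = np.array([[+1.2, -0.4],      # example 2x2 block of channel LLRs
                [+0.1, -2.0]])
soft = iterative_product_decode(l_c)
print(soft, (soft > 0).astype(int), sep="\n")
```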

Iterative Decoder

Decoder Architectures
Decoders must operate much faster than the rate at which incoming data arrives, so that several iterations can be accommodated in the time between the arrivals of received data blocks. Alternatively, this architecture may be replaced by a pipeline structure, in which the data and extrinsic information are passed on to a new set of decoders while the first set processes the next data block. At some point the decoder may be deemed to have converged to the optimum decoded word, at which point the combination of extrinsic and intrinsic information can be used to find the decoded data. Usually a fixed number of iterations is used (between 4 and 10, depending on the type of code and its length), but it is also possible to detect convergence and terminate the iterations at that point.
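
A sketch of such a stopping strategy, assuming a simple convergence test (stable hard decisions) and a hypothetical `run_iteration` stand-in for one full turbo iteration:

```python
import numpy as np

def decode_with_early_stop(l_c, run_iteration, max_iter=10):
    """Run at most max_iter iterations, but stop once the hard decisions
    stop changing between iterations (a simple, assumed convergence test;
    practical designs may instead check a CRC or the code's parity checks)."""
    l_e = np.zeros_like(l_c)                 # extrinsic information
    prev_hard = None

    for it in range(1, max_iter + 1):
        l_e = run_iteration(l_c, l_e)        # one full decoding iteration
        hard = (l_c + l_e > 0).astype(int)   # tentative hard decisions
        if prev_hard is not None and np.array_equal(hard, prev_hard):
            break                            # converged: decisions are stable
        prev_hard = hard

    return hard, it

# Trivial stand-in for one iteration so the sketch runs; a real system would
# call the component SISO decoders as in the product-code loop above.
dummy_iteration = lambda l_c, l_e: l_e + 0.5 * np.sign(l_c)

hard, used = decode_with_early_stop(np.array([0.8, -0.2, 1.5, -1.1]),
                                    dummy_iteration)
print(hard, "after", used, "iterations")
```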