1 Mandatory exercise for Inf 244
–Deadline: October 29th
–The assignment is to implement an encoder/decoder system in Matlab using the Communications Blockset. The system must simulate communication over an AWGN channel using one of these codes:
–Block code
–Convolutional code
–PCCC
–LDPC
–You are free to implement any of these coding techniques, as long as the following requirements are fulfilled:
–Information length k = 1024
–Block length n = 3072
–Eb/N0 = 1.25
–We will test your answers on our own computer and evaluate them on bit error rate versus CPU time usage according to the formula p = T · BER
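A rough sketch of how the evaluation formula might be computed is given below. The uncoded BPSK link is only a stand-in for your encoder and decoder, and timing with cputime is an assumption; the slide does not specify how T is measured.

% Sketch: measure CPU time T and bit error rate BER, then score p = T * BER.
% The "code" here is uncoded BPSK over AWGN; replace the two marked lines
% with your own encoder (output length n = 3072) and decoder.
k       = 1024;                 % information length from the exercise
rate    = 1024/3072;            % code rate k/n from the exercise
EbNodB  = 1.25;                 % Eb/N0 from the exercise
nBlocks = 100;                  % number of simulated blocks (free choice)

sigma = sqrt(1/(2*rate*10^(EbNodB/10)));   % noise std. dev. for unit-energy BPSK
nErr  = 0;
t0    = cputime;
for b = 1:nBlocks
    u    = randi([0 1], 1, k);             % information bits
    c    = u;                              % <-- replace with your encoder
    x    = 1 - 2*c;                        % BPSK: 0 -> +1, 1 -> -1
    y    = x + sigma*randn(size(x));       % AWGN channel
    uhat = (y < 0);                        % <-- replace with your decoder
    nErr = nErr + sum(uhat ~= u);
end
T   = cputime - t0;
BER = nErr/(k*nBlocks);
p   = T*BER;
fprintf('BER = %.3e, T = %.2f s, p = %.3e\n', BER, T, p);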

2 How to create and run simulations in MATLAB from scratch
–Run matlab from a command window.
–Type simulink in MATLAB's command window.
–Choose File -> New -> Model in Simulink's main window.
–Create the model by dragging and dropping elements into it.
How to finish this exercise starting from a demo
–Run matlab from a command window.
–Type sccc_sim in MATLAB's command window. A ready-made demo of an SCCC opens.
–Study the demo closely and modify it to your needs.
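For reference, the two commands above as typed in the MATLAB command window (a minimal sketch; the sccc_sim demo requires the Communications Blockset to be installed):

simulink     % opens the Simulink library browser / main window
sccc_sim     % opens the ready-made SCCC demo model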

3 Design of turbo codes
Turbo codes can be designed for performance at
–Low SNR
–High SNR
Design choices: constituent codes, interleaver
Two error mechanisms:
–Errors that an ML decoder would also make (dominant at high SNR)
–Errors due to imperfection of the decoding algorithm (dominant at low SNR)

4 Design of turbo codes for high SNR
Goal:
–Maximize the minimum distance d
–Minimize the number A_d of codewords of weight d
–In general, design the code for a thin weight spectrum
Use recursive component encoders!
Simple (but flawed!) approach: concentrate on the weight-two inputs
–This applies if the interleavers are chosen at random, but it is possible (and even easy) to avoid the problem

5 Weight-two inputs
Assume a primitive feedback polynomial of degree ν (period 2^ν − 1)
Weight-two input vectors that take the encoder out of and back to the initial state (example: ν = 2, period 3):
–(1, 0, 0, 1), which corresponds to parity weight z_min
–(1, 0, 0, 0, 0, 0, 1)
–In general, a 1 followed by 3m − 1 zeros followed by a 1
–Even more generally, a 1 followed by (2^ν − 1)m − 1 zeros followed by a 1
–d_eff = 2 + 2·z_min for a PCCC with two identical constituent encoders (the parity weights are checked numerically in the sketch below)
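The parity weights of these inputs can be checked directly by simulating a small recursive encoder. The sketch below uses the 4-state encoder G(D) = [1, (1 + D^2)/(1 + D + D^2)] as an example (my choice, not from the slide; it has a primitive degree-2 feedback polynomial, matching the period-3 inputs above) and prints the parity weight of the input 1 + D^(3m) for a few values of m.

% Sketch: parity weight of the weight-two inputs 1 + D^(3m) for the 4-state
% RSC encoder G(D) = [1, (1+D^2)/(1+D+D^2)] (example encoder, not from the slide).
% The minimum over m is attained at m = 1 and equals z_min = 2^(nu-1) + 2 = 4.
for m = 1:4
    L = 3*m + 1;
    u = zeros(1, L);  u(1) = 1;  u(L) = 1;      % weight-two input 1 + D^(3m)
    s = [0 0];                                   % encoder state [s1 s2]
    parity = zeros(1, L);
    for t = 1:L
        a         = mod(u(t) + s(1) + s(2), 2);  % feedback d(D) = 1 + D + D^2
        parity(t) = mod(a + s(2), 2);            % numerator e(D) = 1 + D^2
        s         = [a s(1)];
    end
    fprintf('m = %d: parity weight = %d, final state = [%d %d]\n', ...
            m, sum(parity), s(1), s(2));
end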

6 Theorem on z_min
Theorem: Let G(D) = [1, e(D)/d(D)], where the denominator polynomial d(D) is primitive of degree ν. Then z_min = 2^(ν−1) + 2s, where s = 1 if e(D) has constant coefficient 1 and degree ν, and s = 0 otherwise.
Proof:
–d(D) is the generator polynomial of a cyclic (2^ν − 1, 2^ν − 1 − ν) Hamming code
–q(D) = (D^(2^ν − 1) + 1)/d(D) is the generator polynomial of a cyclic (2^ν − 1, ν) maximum-length code of minimum distance 2^(ν−1)
–deg e(D) < ν: e(D)q(D) is a codeword in the maximum-length code and so has weight 2^(ν−1)
–deg e(D) = ν: write e(D) = D^ν + e^(2)(D). Then c_1(D) = D^(ν−1)q(D) and c_2(D) = e^(2)(D)q(D) are both codewords in the maximum-length code and so have weight 2^(ν−1)
–D·c_1(D) = [cyclic shift of c_1(D)] + 1 + D^(2^ν − 1), so e(D)q(D) is [a codeword with constant coefficient 0] + 1 + D^(2^ν − 1), which has weight 2^(ν−1) + 2

7 Convolutional recursive encoders for PCCC codes (table of encoders; content not preserved in this transcript)

8 Convolutional recursive encoders for PCCC codes, continued (table of encoders; content not preserved in this transcript)

9 Choice of component codes
–The listed codes may not have the best free distance, but they have a better mapping of input weights to output weights than "optimum" convolutional codes
–The overall turbo code performance also depends on the actual interleaver used

10 Choice of interleaver
Pseudorandom interleavers with enhanced requirements:
–Interleavers that avoid the problem with weight-2 inputs: if |i − j| = (2^ν − 1)m, then |π(i) − π(j)| ≠ (2^ν − 1)n (for m + n small)
–S-random interleaver: if |i − j| ≤ S, then |π(i) − π(j)| ≥ S (see the sketch below)
–Interleavers specialized to accommodate the actual encoders: maintain a list of "critical" sets of positions (the information symbols of low-weight words) and do not map one critical set into another
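A common way to construct an S-random interleaver is by trial and rejection, as sketched below; the restart-on-failure strategy and the choice S ≈ sqrt(N/2) are conventional assumptions, not taken from the slide.

% Sketch: S-random interleaver by trial and rejection.  Each new value must
% differ by at least S from the values placed in the previous S positions,
% which enforces: |i-j| <= S  =>  |pi(i)-pi(j)| >= S.
N = 1024;                       % interleaver length (the exercise's k)
S = floor(sqrt(N/2));           % typical spreading factor (assumption)
done = false;
while ~done
    pool = randperm(N);         % candidate values in random order
    perm = zeros(1, N);         % the permutation being built
    done = true;
    for i = 1:N
        placed = false;
        for t = 1:numel(pool)
            cand = pool(t);
            if all(abs(cand - perm(max(1, i-S):i-1)) >= S)
                perm(i) = cand;  pool(t) = [];  placed = true;  break;
            end
        end
        if ~placed, done = false; break; end   % dead end: restart from scratch
    end
end
fprintf('Built an S-random interleaver of length %d with S = %d.\n', N, S);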

11 Design of turbo codes for low SNR
–The foregoing discussion assumes that the decoding is close to maximum likelihood; this is not the case at very low SNRs
–Goal for low SNR: optimize the interchange of information between the constituent decoders
–Analyze this interchange by using density evolution or EXIT charts

12 EXtrinsic Information Transfer (EXIT) charts
–Approach: a SISO block produces more accurate information about the transmitted information at its output than what is available at its input
–The amount of information can be precisely quantified using information theory
–The entropy H(X) of a stochastic variable X is H(X) = −Σ_x P(X=x)·log(P(X=x)); it is a measure of uncertainty
–The mutual information is I(X;Y) = H(X) − H(X|Y)
–For a specified SNR (and thus a known amount of information about the u_l due to the channel values), consider the a-priori information I_a(u_l; L_a(u_l)) and the extrinsic information I_e(u_l; L_e(u_l))
–EXIT chart: I_e(u_l; L_e(u_l)) as a function of I_a(u_l; L_a(u_l)), measured for the log-MAP decoder
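As a tiny numeric illustration of these definitions, the sketch below computes the entropy of a binary source and the mutual information across a binary symmetric channel (the BSC example and its numbers are mine, not from the slide).

% Sketch: entropy and mutual information for a binary example.
h    = @(q) -q.*log2(q) - (1-q).*log2(1-q);  % binary entropy function
p    = 0.11;                                 % P(X = 1)
eps_ = 0.10;                                 % BSC crossover probability
HX   = h(p);                                 % H(X): uncertainty about X
PY1  = p*(1-eps_) + (1-p)*eps_;              % P(Y = 1) after the BSC
I    = h(PY1) - h(eps_);                     % I(X;Y) = H(Y)-H(Y|X) = H(X)-H(X|Y)
fprintf('H(X) = %.3f bits, I(X;Y) over a BSC(%.2f) = %.3f bits\n', HX, eps_, I);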

13 EXIT curves
–Obtained by simulations (but much simpler than full turbo code simulations)

14 EXIT charts
–Next, plot the EXIT curve for one SNR together with its mirror image; these two curves represent the EXIT curves of the two constituent decoders
–Open tunnel: decoding will proceed to convergence

15 EXIT charts
–EXIT chart for another SNR
–Closed tunnel: decoding will get stuck

16 SNR threshold
–SNR threshold: the smallest SNR with an open EXIT chart tunnel
–Defines the start of the waterfall region
–Once the tunnel opens, non-convergence becomes a small problem

17 EXIT charts
–The EXIT curve is a property of the constituent encoder
–Can be used to find good constituent encoders for low SNRs; in general, simple is good (flatter EXIT curve)
–Can also be used for codes with different constituent encoders; the constituent encoders can then be fitted to each other's EXIT curve, providing a lower SNR threshold
–It is assumed that the interleavers are very long, so that a Gaussian approximation applies: errors in the extrinsic values occur according to a Gaussian distribution (see the sketch below)
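In practice the Gaussian approximation is used to generate a-priori LLRs of a prescribed quality when measuring EXIT curves: L_a = (sigma_A^2/2)·x + sigma_A·n with n ~ N(0,1) ("consistent" Gaussian LLRs), and I_a is then estimated from samples. Both the model and the time-average estimator below are standard in the EXIT-chart literature but are not spelled out on the slide.

% Sketch: consistent Gaussian a-priori LLRs and a sample estimate of I_a.
Nsamp = 1e5;
x = 1 - 2*randi([0 1], Nsamp, 1);                  % BPSK-mapped bits, +/-1
for sigmaA = [0.5 1 2 4]
    La = (sigmaA^2/2)*x + sigmaA*randn(Nsamp, 1);  % consistent Gaussian LLRs
    Ia = 1 - mean(log2(1 + exp(-x.*La)));          % I(X; L_a) estimate
    fprintf('sigma_A = %.1f  ->  I_a = %.3f\n', sigmaA, Ia);
end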

18 Iterative decoding
–Decoding examples
–Some observations

19 Decoding example

20 Decoding example: K=4

21 The effect of many iterations

22 Iterative decoding: stopping criteria
–Fixed number of iterations
–Hard decisions: if the hard decisions of the two extrinsic value vectors coincide, assume that convergence has been achieved (see the sketch below)
–Cross-entropy
–Outer error-detecting codes
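A sketch of the hard-decision rule is given below; the two LLR vectors are random stand-ins for the real extrinsic outputs of the two constituent decoders.

% Sketch: stop when the hard decisions of the two extrinsic vectors coincide.
Le1 = randn(1, 1024);                 % extrinsic LLRs from decoder 1 (stand-in)
Le2 = Le1 + 0.3*randn(1, 1024);       % extrinsic LLRs from decoder 2 (stand-in)
if isequal(Le1 < 0, Le2 < 0)          % compare hard decisions bit by bit
    fprintf('Hard decisions agree: assume convergence and stop.\n');
else
    fprintf('Hard decisions differ: run another iteration.\n');
end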

23 Iterative decoding: some observations
–Parallel implementations: the constituent decoders can work in parallel
–The final decision can be taken from the a posteriori values of either constituent decoder, from their average, or from the extrinsic values
–The decoder may sometimes, depending on the SNR and on the occurrence of structural faults in the interleaver, oscillate between correct and incorrect decisions
–Max-log-MAP can be shown to be equivalent to SOVA
–Max-log-MAP is (a little!) simpler to implement than log-MAP, but suffers a penalty of about 0.5 dB (see the sketch below)
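The difference between log-MAP and max-log-MAP comes down to the max* (Jacobian logarithm) operation used inside the BCJR recursions; the sketch below compares the exact operation with the max-only approximation on a few example values.

% Sketch: log-MAP uses max*(a,b) = max(a,b) + log(1+exp(-|a-b|));
% max-log-MAP keeps only the max(a,b) term and so loses the correction.
maxstar = @(a,b) max(a,b) + log(1 + exp(-abs(a-b)));
maxlog  = @(a,b) max(a,b);
pairs = [0.2 0.3; 1.0 3.0; -2.0 5.0];            % example value pairs
for i = 1:size(pairs,1)
    a = pairs(i,1);  b = pairs(i,2);
    fprintf('a=%5.1f b=%5.1f  max*=%6.3f  max-log=%6.3f  correction=%6.3f\n', ...
            a, b, maxstar(a,b), maxlog(a,b), maxstar(a,b) - maxlog(a,b));
end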

24 Suggested exercises