1 Finite-Length Scaling and Error Floors. Abdelaziz Amraoui, Andrea Montanari, Ruediger Urbanke, Tom Richardson.

2 Approach to the Asymptotic Limit

3 Finite Length Scaling

7 Analysis (BEC): Covariance evolution. Track the fractions of check nodes of residual degree one and of degree greater than one, together with their covariance terms, as functions of the fractional size of the residual graph.

8 Covariance evolution

9 Finite Length Curves

10 Analysis (BEC). Following Luby et al., remove a single variable at a time; the trajectory of the state concentrates around the solution of a differential equation. Since the increments are Markov and suitably regular, the covariance of the state-space variables also follows a differential equation.
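To make the tracked state concrete, here is a minimal Python sketch of the peeling process on the BEC, recording the fraction of degree-one checks against the residual size; the adjacency-list representation and all names are illustrative assumptions, not the authors' code.

```python
def peel_bec(var_checks, n_checks, erased):
    """Peeling decoder on the BEC (the process the analysis tracks):
    repeatedly resolve one erased variable attached to a residual
    degree-one check. Records (residual size, fraction of degree-one
    checks) along the trajectory. var_checks[v] lists the checks of
    variable v; erased is the set of erased variable indices."""
    deg = [0] * n_checks            # residual degree = number of erased neighbors
    for v in erased:
        for c in var_checks[v]:
            deg[c] += 1
    residual = set(erased)
    trajectory = []
    while residual:
        r1 = sum(1 for d in deg if d == 1)
        trajectory.append((len(residual), r1 / n_checks))
        # Any erased variable attached to a degree-one check is determined.
        target = next((v for v in residual
                       if any(deg[c] == 1 for c in var_checks[v])), None)
        if target is None:          # no degree-one checks left:
            break                   # the residual set is a stopping set
        residual.discard(target)
        for c in var_checks[target]:
            deg[c] -= 1
    return trajectory
```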

11 Results

12 Finite Threshold Shift

13 Generalizing from the BEC? There is no obvious incremental form (differential equation), no state-space characterization of failure, and no clear finite-dimensional state space; it is not even clear what the right coordinates are in the general case (capacity?). Nevertheless, it is useful in practice to have this interpretation of iterative failure and to have the basic form of the scaling law.

14 Empirical Evidence

15 Error Floors

18 Error Floors: Review of the BEC

19 Error floors on the erasure channel: Stopping sets.

20 Error floors on the erasure channel: average performance.

21 Error floors on the erasure channel: decomposition

22 Error floors on the erasure channel: average and typical performance.

23 Error floors for general channels: expurgated ensemble experiments. AWGN channel, rate 51/64, block length 4k. Ensembles compared: random, girth(8)-optimized, and neighborhood-optimized.

24 Error floors for general channels: trapping set distribution. AWGN channel, rate 51/64, block length 4k. Trapping set classes: (3,1), (5,1), (7,1).

25 Observations. The error floor region is dominated by small-weight errors. The subset on which errors occur usually induces a subgraph with only degree-2 and degree-1 check nodes, where the number of degree-1 check nodes is relatively small. Optimized graphs exhibit a concentration of error types.

26 Intuition. In an error floor event, the nodes in the trapping set receive 1s with some reliability, while all other nodes receive typical inputs. After a few iterations the 'exterior' nodes and messages converge to highly reliable 0s, while internally the messages are 1s. If the internally received values are 1s, the internal messaging reaches highly reliable 1s and the message state gets trapped. [Figure: a (9,3) trapping set; labels 'Reliable 1' and 'Definite 0' mark the received values.]

27 Defining Failure: Trapping Sets. A decoder on an input Y is a sequence of maps D_l : 𝒴 → {0,1}^n. (Assume the all-0 codeword is the desired decoding; for the BEC, let 1 denote an erasure.) We say that bit j is eventually correct if there exists L so that l > L implies D_l(Y)_j = 0. Assuming failure, the trapping set T is the set of all bits that are not eventually correct.

28 Defining Failure for BP: Practice. Decode for 200 iterations. If decoding is not successful, decode for an additional 20 iterations and take as the trapping set the union of all bits that do not decode to 0 during these extra iterations.
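A direct transcription of this practical rule into Python, assuming a hypothetical decode_iter(state) step that returns the updated decoder state and the current hard decisions:

```python
def extract_trapping_set(decode_iter, y, n, max_iters=200, extra=20):
    """Practical trapping-set rule from the slide: run BP for max_iters
    iterations; if decoding has not succeeded, run `extra` more iterations
    and return the union of all bit positions that fail to decode to 0
    (all-zero codeword assumed transmitted).
    decode_iter(state) -> (state, hard_decisions) is an assumed interface."""
    state = y
    for _ in range(max_iters):
        state, bits = decode_iter(state)
        if all(b == 0 for b in bits):
            return set()            # success: no trapping set
    trapped = set()
    for _ in range(extra):
        state, bits = decode_iter(state)
        trapped |= {j for j in range(n) if bits[j] != 0}
    return trapped
```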

29 Trapping Sets: Examples.
1. Let the decoder be the maximum-likelihood decoder applied in one step. Then the trapping sets are the non-zero codewords.
2. Let the decoder be belief propagation over the BEC. Then the trapping sets are the stopping sets.
3. Let the decoder be serial (strict) flipping over the BSC. Then T is a trapping set if and only if, in the subgraph induced by T, each node has more even- than odd-degree check neighbors, and the same holds for the complement of T.
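The condition in example 3 is easy to test mechanically. A sketch, assuming the parity-check adjacency is given as lists check_vars[c] of variable neighbors:

```python
def is_flipping_trapping_set(check_vars, n_vars, T):
    """Test the serial strict-flipping condition from example 3: every
    variable node must have more even- than odd-degree induced check
    neighbors, where a check's induced degree is |N(check) ∩ T| (the
    condition is checked both on T and on its complement)."""
    T = set(T)
    # Parity of each check with respect to T: odd => unsatisfied.
    odd = [len(set(vs) & T) % 2 == 1 for vs in check_vars]
    even_cnt = [0] * n_vars
    odd_cnt = [0] * n_vars
    for c, vs in enumerate(check_vars):
        for v in vs:
            if odd[c]:
                odd_cnt[v] += 1
            else:
                even_cnt[v] += 1
    # Following the slide's "more even than odd" wording: strict majority.
    return all(even_cnt[v] > odd_cnt[v] for v in range(n_vars))
```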

30 Analysis with Trapping Sets: Decomposition of failure. FER(ε) = Σ_T P(ℇ_T, ε), where ℇ_T is the set of all inputs giving rise to failure on trapping set T. Error floors are dominated by "small" events.

31 Predicting Error Floors: A two-pronged attack.
1. Find (cover) all trapping sets likely to make a significant contribution in the error floor region: T_1, T_2, T_3, ..., T_m.
2. Evaluate the contribution of each set to the error floor: P(ℇ_{T_1}, ε), P(ℇ_{T_2}, ε), ...
Strictly speaking, this yields a (tight) lower bound: FER(ε) ≥ Σ_i P(ℇ_{T_i}, ε).
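Assembling the two prongs then amounts to summing the per-set contributions. A one-function sketch, where trapping_sets is the cover from step 1 and failure_prob(T, eps) the estimate from step 2 (both assumed interfaces):

```python
def error_floor_lower_bound(trapping_sets, failure_prob, eps):
    """FER(eps) >= sum_i P(E_{T_i}, eps): add up the estimated failure
    probabilities of the covered trapping sets T_1, ..., T_m."""
    return sum(failure_prob(T, eps) for T in trapping_sets)
```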

32 Finding Trapping Sets. Simulation of decoding can be viewed as a stochastic process for finding trapping sets, but it is very inefficient. (Aided) flipping gives some speed-up, yet is still too inefficient.

33 Finding Trapping Sets (Flipping). Trapping sets can be viewed as "local" extrema of certain functions, e.g., the number of induced odd-degree checks, where "local" means, e.g., under single-element removal, addition, or swap. Therefore, we can search for subsets that are "local" extrema.

34 Finding Trapping Sets (Flipping). Basic idea: build up a connected subset with a bias towards minimizing the number of induced odd-degree checks. Occasionally check for containment of an in-flipping-stable set by applying flipping decoding; eventually such a set is contained. Then check for stability under other types of variation: out-flipping stability, single aided-flip stability (chains), and so on.
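A minimal sketch of this biased growth, assuming adjacency lists var_checks/check_vars; the occasional stability tests (in-flipping, out-flipping, aided flips) would be applied to the returned candidate set:

```python
import random

def grow_candidate(var_checks, check_vars, n_checks, seed_var, size):
    """Grow a connected subset of variable nodes from seed_var, biased
    toward keeping the number of induced odd-degree checks small (the
    'local extremum' criterion from the preceding slide). All
    graph-representation names are assumptions."""
    T = {seed_var}
    while len(T) < size:
        # Frontier: variables sharing a check with the current set.
        frontier = {v for u in T for c in var_checks[u]
                    for v in check_vars[c] if v not in T}
        if not frontier:
            break
        def odd_checks(S):
            return sum(1 for c in range(n_checks)
                       if len(set(check_vars[c]) & S) % 2 == 1)
        # Prefer the addition minimizing induced odd checks,
        # with a random tie-break to keep the search stochastic.
        best = min(frontier,
                   key=lambda v: (odd_checks(T | {v}), random.random()))
        T.add(best)
    return T
```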

35 Differences: BP and Flipping

36 Differences: BP and Flipping. [Figure: messages r_1, r_2, r_3 combining into r_1 + r_2 + r_3.]

37 Differences: BP and Flipping

38 Differences: BP and Flipping

39 Evaluating Trapping Sets. Basic idea: find a random variable X, on which to condition the decoder input Y, that "mostly" determines membership in ℇ_T; i.e., Pr{ℇ_T | X} is nearly a step function in X. Perform in situ simulation of the trapping set while varying X to measure Pr{ℇ_T | X}, then combine with the density of X to get Pr{ℇ_T}.
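A sketch of the measurement loop, under the assumption of a hypothetical simulate_failure(x) routine that clamps the conditioning statistic x on the trapping set, randomizes the rest of the channel, runs the decoder, and reports failure on T:

```python
def conditional_failure_curve(simulate_failure, x_grid, trials=1000):
    """Estimate Pr{E_T | x} on a grid of the conditioning variable x by
    in situ simulation: at each grid point, run `trials` decodings with
    x held fixed and count the fraction that fail on T."""
    return [sum(simulate_failure(x) for _ in range(trials)) / trials
            for x in x_grid]
```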

40 Evaluating Trapping Sets. [Figure: condition the input to the trapping set; elsewhere, simulate the channel.]

41 Evaluating Trapping Sets: BEC X is the number of erasures in T (=S).

42 Evaluating Trapping Sets: AWGN X is the mean noise input in T.
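For the AWGN case the density of X is available in closed form: if T contains t variables and the noise samples are i.i.d. N(0, σ²), their mean is N(0, σ²/t). A sketch of the combining step by trapezoidal integration (the grid and conditional curve come from the previous measurement; all names are assumptions):

```python
import math

def failure_probability_awgn(p_fail_given_x, x_grid, sigma, t_size):
    """Combine the measured conditional curve with the density of X.
    X = mean of the t_size noise samples in T, so X ~ N(0, sigma^2/t_size);
    integrate Pr{E_T | x} * density(x) over the grid (same length as
    p_fail_given_x) with the trapezoidal rule."""
    s = sigma / math.sqrt(t_size)
    dens = [math.exp(-x * x / (2 * s * s)) / (s * math.sqrt(2 * math.pi))
            for x in x_grid]
    total = 0.0
    for i in range(len(x_grid) - 1):
        dx = x_grid[i + 1] - x_grid[i]
        total += 0.5 * (p_fail_given_x[i] * dens[i]
                        + p_fail_given_x[i + 1] * dens[i + 1]) * dx
    return total
```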

43 Evaluating Trapping Sets: AWGN

44 Evaluating Trapping Sets: Margulis code, (12,4)

45 A tougher test case: G

46 Evaluating Trapping Sets: G

47 A Curve from a single point

48 Extrapolating a curve

49 Variation in Trapping Sets: (10,4), (10,2), (10,0)

50 Variation in Trapping Sets: (12,4), (12,2), (12,0)

51 Variation in Trapping Sets

52 Conclusions. Error floor performance is predictable, though with considerable computational effort. (It would be nice to have a scaling law for the "best" codes.) The trade-off between the error floor and the threshold (waterfall) region can be optimized even for very deep error floors.

53 Conclusions. "It is interesting to observe that the search for theoretical understanding of turbo codes has transformed coding theorists into experimental scientists." On the slide, "scientists" is struck out and replaced by "physicists."