Error correction on a tree: an instanton approach
Misha Chertkov (LANL)
In collaboration with: V. Chernyak (Corning), M. Stepanov (UA, Tucson), B. Vasic (UA, Tucson)
Thanks: I. Gabitov (Tucson/LANL)
Boulder, 04/15/04. Submitted to Phys. Rev. Lett.
Introduction
- Forward Error Correction (FEC); channel noise
- Coding: Low-Density Parity-Check (LDPC) codes; Tanner graph
- Decoding: Maximum-A-Posteriori (MAP); stat-mech interpretation
- Belief Propagation (BP); Message Passing (MP)
- Post-error-correction Bit Error Rate (BER); optimization
- Shannon transition/limit; error floor
Evaluation
- Tree as an approximation: BP is exact
- From LDPC codes to a tree
- BER in the center of the tree
- High signal-to-noise-ratio (SNR) phase; Hamming distance
- Symmetry; broken symmetry
- Instantons/phases on the tree
What is next? Our objectives
Forward Error Correction. [Scheme:] L information bits are encoded into N > L channel bits (code rate R = L/N), transmitted through a noisy channel (example: white Gaussian symmetric noise), and decoded back to L bits.
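The scheme can be made concrete with a toy example. This is a hedged sketch: the rate-1/3 repetition code and the sigma = 1/sqrt(2*SNR) BPSK noise convention are our illustrative assumptions, not the codes or conventions of the talk.

```python
import math
import random

random.seed(0)

L, N = 1, 3      # one information bit encoded into three channel bits
R = L / N        # code rate

def encode(bit):
    """Rate-1/3 repetition encoding in the spin (+1/-1) convention."""
    return [bit] * N

def channel(word, snr):
    """White Gaussian symmetric channel; sigma = 1/sqrt(2*snr) is one common
    BPSK convention (our assumption, not taken from the slides)."""
    sigma = 1.0 / math.sqrt(2.0 * snr)
    return [x + random.gauss(0.0, sigma) for x in word]

def decode(received):
    """Majority (sign-of-the-sum) decoding."""
    return 1 if sum(received) >= 0 else -1

print(R, decode(channel(encode(+1), snr=4.0)))
```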
Low-Density Parity-Check (LDPC) codes. Example: N = 10 variable nodes, M = N - L = 5 checking nodes. The parity-check matrix (mod 2) defines a set of linear constraints on the "spin" variables (linear coding); its bipartite representation is the Tanner graph.
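A minimal sketch of the parity-check idea, assuming a made-up 2x4 matrix H rather than the slide's N=10 example: a binary word is a codeword iff every check sums to zero mod 2.

```python
import itertools

# Illustrative 2x4 parity-check matrix (our toy, not the slide's code):
# rows are checking nodes, columns are variable nodes.
H = [
    [1, 1, 1, 0],
    [0, 1, 1, 1],
]

def satisfies_checks(bits, H):
    """A binary word is a codeword iff every parity check sums to 0 mod 2."""
    return all(sum(v * b for v, b in zip(row, bits)) % 2 == 0 for row in H)

codewords = [c for c in itertools.product([0, 1], repeat=4)
             if satisfies_checks(c, H)]
print(codewords)    # 2^(N-M) codewords when H has full rank
```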
Decoding (optimal): symbol-to-symbol Maximum-A-Posteriori (MAP) decoding. In the stat-mech language, the noise acts as an external "magnetic" field, the parity checks are constraints, and decoding amounts to computing the "magnetization" from the "statistical sum" and its "free energy". MAP is optimal but expensive: it requires of order 2^N operations. The stat-mech interpretation was suggested by N. Sourlas (Nature '89). Note also the spin-glass (replica) approach for random codes: e.g. Rujan '93; Kanter, Saad '99; Montanari, Sourlas '00; Montanari '01; Franz, Leone, Montanari, Ricci-Tersenghi '02.
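The cost of exact MAP decoding can be seen in a brute-force sketch: the "magnetization" is a sum over all 2^N spin configurations, restricted to those satisfying every parity check. The tiny parity-check matrix and field values below are our illustrative assumptions.

```python
import itertools
import math

H = [[1, 1, 1, 0], [0, 1, 1, 1]]   # toy parity checks (illustrative)
h = [0.9, -0.2, 0.4, 0.7]          # "magnetic field" from the channel (made up)

def magnetization(H, h):
    """Exact symbol-MAP marginals by brute force: a sum over all 2^N spin
    configurations that satisfy every parity check. Exponential cost is
    exactly why MAP is expensive."""
    n = len(h)
    num = [0.0] * n
    Z = 0.0
    for spins in itertools.product([+1, -1], repeat=n):
        # hard constraint: the spins in each check must multiply to +1
        if any(math.prod(s for s, v in zip(spins, row) if v) == -1 for row in H):
            continue
        w = math.exp(sum(hi * si for hi, si in zip(h, spins)))
        Z += w
        for i, s in enumerate(spins):
            num[i] += s * w
    return [x / Z for x in num]

m = magnetization(H, h)
print([1 if mi >= 0 else -1 for mi in m])   # MAP decision = sign of magnetization
```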
Sub-optimal but efficient decoding: Belief Propagation (BP) (Gallager '63; Pearl '88; MacKay '99) = solving equations on the graph. The iterative solution of the BP equations is Message Passing (MP): Q*m*N steps instead of order 2^N, where Q is the number of MP iterations and m is the number of checking nodes connected to a variable node. Why is BP a good replacement for MAP? Because BP is exact when the graph has no loops!
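A minimal sum-product (BP) sketch, assuming a loop-free Tanner graph with a single check node over three variables; with no loops, one pass of the tanh-rule message is already exact. The field values are made up for illustration.

```python
import math

# Channel log-likelihood fields for three variables sharing one parity check
# (illustrative values; variable 1 has weak evidence toward -1).
h = [0.8, -0.3, 1.1]

def check_to_var(h, i):
    """Check-node update (tanh rule): message from the check to variable i,
    built from the other variables' messages."""
    prod = math.prod(math.tanh(hj) for j, hj in enumerate(h) if j != i)
    return math.atanh(prod)

beliefs = [h[i] + check_to_var(h, i) for i in range(len(h))]
decisions = [1 if b >= 0 else -1 for b in beliefs]
print(decisions)
```

Note how the check overrules variable 1's weakly negative field: the parity constraint with two strongly positive neighbors pushes its belief positive.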
Post-error-correction Bit Error Rate (BER): a measure of unsuccessful decoding, i.e., the probability of making an error in bit "i" (the all-{+1} code word is chosen as the initial one), averaged over the probability density of the magnetic-field/noise realization. Forward-error-correction scheme/optimization:
1. describe the channel/noise (external);
2. suggest a coding scheme;
3. suggest a decoding scheme;
4. measure BER/FER;
5. if BER/FER is not satisfactory (small enough), go to 2.
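Step 4 can be sketched as a Monte-Carlo estimate, here for an assumed toy 3-repetition code over the Gaussian channel. On the order of 1/BER samples are needed, which is exactly why brute-force numerics fail in the very-low-BER (error-floor) regime.

```python
import math
import random

random.seed(1)

def estimate_ber(snr, trials=20000):
    """Monte-Carlo BER for an assumed toy 3-repetition code over an AWGN
    channel: send the all-"+1" word, decode by majority, count failures."""
    sigma = 1.0 / math.sqrt(2.0 * snr)
    errors = 0
    for _ in range(trials):
        y = [1.0 + random.gauss(0.0, sigma) for _ in range(3)]
        if sum(y) < 0:          # decoded bit flips: an error
            errors += 1
    return errors / trials

print(estimate_ber(snr=1.0))
```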
Shannon transition/limit. [Figure from R. Urbanke, "Iterative coding systems": BER, B, versus SNR, s.]
Error floor. [Figure: error-floor prediction for some regular (3,6) LDPC codes using a 5-bit decoder; from T. Richardson, "Error floors of LDPC codes", 2003 Allerton Conference Proceedings.] Very low BER is a no-go zone for brute-force Monte-Carlo numerics; estimating it is the major bottleneck of coding theory.
Our objective: for a given (a) channel, (b) coder, and (c) decoder, estimate BER/FER by analytical and/or semi-analytical methods. Hint: BER is small and is formed mainly at some very special "bad" configurations of the noise/"magnetic field". The instanton/saddle-point approach is the right way to identify these "bad" configurations and thus estimate BER!
Tree (no loops) approximation: MAP = BP. Belief Propagation is optimal (i.e., equivalent to Maximum-A-Posteriori decoding) on a tree, where there are no loops. Analogy: the Bethe lattice (1937). Gallager '63; Pearl '88; MacKay '99; Vicente, Saad, Kabashima '00; Yedidia, Freeman, Weiss '01.
From a finite-size LDPC code to a tree:
1) Fix the variable node where BER needs to be calculated.
2) Choose the shortest loop on the graph passing through this "0"th node; the length of the loop is (n+1).
3) Count n generations from the tree center and cut the rest.
A regular graph/tree is characterized by: m, the number of checking nodes connected to a variable node; k, the number of variable nodes connected to a checking node; n, the number of generations on the tree. Example: m=2, k=3, n=4.
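The computation tree grows quickly with n. A sketch under our own counting convention (the slides do not spell one out): the center variable neighbors m checks, each bringing k-1 fresh variables, and every deeper variable neighbors m-1 new checks.

```python
def tree_size(m, k, n):
    """Variable nodes in the n-generation tree centered on one bit of a
    regular (m, k) code. Counting convention is our assumption: the center
    sees m checks with k-1 fresh variables each; deeper variables see
    m-1 new checks, each again with k-1 fresh variables."""
    total, frontier = 1, m * (k - 1)
    for _ in range(n):
        total += frontier
        frontier *= (m - 1) * (k - 1)
    return total

print(tree_size(2, 3, 4))   # the slide's example: m=2, k=3, n=4
```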
BER in the center of the tree. The tree is directed, so integrating over the "magnetic fields" one gets a path integral, with an effective action, over new fields defined on the variable nodes; its saddle points give the instanton equations. Remarks: 1) the optimal configuration/instanton depends on the SNR, s; 2) there may be many competing instantons; 3) when looking for instantons, pay attention to the symmetry.
High signal-to-noise-ratio (SNR) phase. The original code word is "+1" on the entire tree; the next "closest" code word is "-1" on the colored branches and "+1" on the remaining variable nodes. The Hamming distance between the two code words equals the number of colored variable nodes. At s >> 1 this is also given by an instanton: each node is either colored or not. Analogy with a low-temperature phase in stat mech: the high-SNR value of the effective action ~ self-energy.
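The Hamming-distance picture can be sketched on a toy code (an assumed 2x4 parity-check matrix, not the talk's regular code): enumerate the spin codewords and find the one nearest to the transmitted all-"+1" word.

```python
import itertools

# Toy parity-check matrix (illustrative): rows are checks over 4 spins.
H = [[1, 1, 1, 0], [0, 1, 1, 1]]

def codewords(H, n=4):
    """All spin (+1/-1) words satisfying every parity check, i.e., with an
    even number of -1 spins inside each check."""
    def ok(s):
        return all(sum(s[j] == -1 for j, v in enumerate(row) if v) % 2 == 0
                   for row in H)
    return [s for s in itertools.product([+1, -1], repeat=n) if ok(s)]

def hamming(u, v):
    """Number of positions where the two words differ."""
    return sum(a != b for a, b in zip(u, v))

all_plus = (+1,) * 4
d_min = min(hamming(all_plus, c) for c in codewords(H) if c != all_plus)
print(d_min)    # distance to the "next closest" code word
```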
Low-SNR (symmetric) phase. In the symmetric phase the field configuration at any node of the tree depends primarily on the generation (counted from the center), j = 0, ..., n-2, and the instanton equations simplify accordingly. This "zero-momentum" configuration/approximation guarantees an estimate from above for the effective action. At the Shannon transition the effective action switches between finite and infinite.
Broken symmetry. In general there are many (!!!) broken-symmetry instanton solutions (labeled "0", "1", "2", "3", "4" in the figure), lying between the high-SNR (low-temperature) and low-SNR (high-temperature) phases. Remark: broken-symmetry instantons may be related to the "near codewords" suggested by Richardson '03 in the context of explaining the error-floor phenomenon.
Instanton phases on the tree: m=4, k=5, n=3. Curves of different colors correspond to instantons/phases of different symmetries; transitions occur where different instantons exchange dominance.
Full numerical optimization (no symmetry breaking was assumed!!!), for m=2, k=3, n=3. The area of the circle surrounding each variable node is proportional to the value of the noise on that node.
What is next? We plan to develop and extend this instanton approach to:
- Regular codes with loops. This will require a perturbation theory in the inverse length of a closed loop and/or in the small density of closed loops.
- Other types of codes, e.g. convolutional, turbo, etc.
- Calculation of the Frame Error Rate (FER), i.e., the probability of making an error in a code word.
- A finite number of iterations in the message-passing version of the BP algorithm. The particular interest here is in testing how BER in general, and the error-floor phenomenon in particular, depend on the number of iterations.
- Other fast but, probably, less efficient decoding schemes.
- Other uncorrelated channels (noise), e.g. the binary erasure channel.
- Correlated channels, with both positive and negative correlations between neighboring slots; this is particularly relevant for linear and nonlinear (soliton) transmission in fiber-optics communications.
- Accounting for Gaussian fluctuations (i.e., second-order effects) around the instantons.