Slide 1
Improving analysis and performance of modern error-correction schemes: a physics approach
Misha Chertkov (Theory Division, LANL); Vladimir Chernyak (Department of Chemistry, Wayne State); Misha Stepanov (Theory Division, LANL); Bane Vasic (Department of ECE, University of Arizona)
UoC, 04/10/06
Part I, analyzing the error floor for LDPC codes: Phys. Rev. Lett. 93, 198702 (2004); Phys. Rev. Lett. 95, 228701 (2005); arxiv.org/abs/cond-mat/0506037; arxiv.org/abs/cs.IT/0507031; arxiv.org/abs/cs.IT/0601070; arxiv.org/abs/cs.IT/0601113; IT workshop, San Antonio 10/2004; CNLS workshop, Santa Fe 01/2005; IT workshop, Allerton 09/2005.
Part II, understanding Belief Propagation: loop calculus, towards improving Belief Propagation (MC, VC): arxiv.org/abs/cond-mat/0601487; arxiv.org/abs/cond-mat/0603189.
Slide 2
Menu (first part): Analog vs digital; Analog error-correction; Digital error-correction; LDPC codes, Tanner graph, parity check; Inference, Maximum-Likelihood, MAP; MAP vs Belief Propagation (sum-product); BP is exact on a tree; Error-correction optimization; Shannon transition; Error floor; Introduction to the instanton method (the idea); Instanton-amoeba (an efficient numerical method); Test code: the (155,64,20) LDPC code; Instantons for the Gaussian channel (results); BER: Monte Carlo vs instanton; Conclusions; Path forward; Instanton: proof-of-principle test.
Slide 3
Analog vs digital.
Analog: continuous; hard to copy; examples: a camera picture, music on tape; a real number; quality is a matter of better or worse.
Digital: discrete; easy to copy; examples: the bit string 0111100101, typed text, a computer file; an integer number; a yes-or-no question.
Slide 4
Error-correction for analog signals. (Figure: the clean original image and the result after 1, 4, and 16 copy iterations.)
Slide 7
Digital error-correction. Coding maps L information bits into N > L transmitted bits; R = L/N is the code rate. The channel adds noise (example: the white Gaussian symmetric channel). Decoding maps the N received values back to an estimate of the L information bits.
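In symbols (the notation here is mine, chosen to be consistent with the later slides): with the bits mapped to "spin" values, the white Gaussian symmetric channel acts independently on each of the N transmitted symbols,

$$y_i=\sigma_i+n_i,\qquad \sigma_i=\pm 1,\quad i=1,\dots,N,\qquad P(n)\propto\exp\Bigl(-\frac{1}{2\sigma_n^2}\sum_{i=1}^{N} n_i^2\Bigr),$$

where $\sigma_n^2$ sets the noise strength (and hence the SNR) and the code rate is $R=L/N$.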
Slide 8
Low-Density Parity-Check (LDPC) codes. Example: N = 10 variable nodes and M = N - L = 5 checking nodes. The code is a set of linear constraints: a binary word is a codeword iff every parity check is satisfied mod 2, i.e. the sparse parity-check matrix annihilates it mod 2. In "spin" variables, each checking node requires the product of its neighboring spins to equal +1. The bipartite graph connecting variable nodes to checking nodes is the Tanner graph.
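A minimal sketch of the parity-check constraints in code (the matrix H below is a toy example of my own, not the (155,64,20) code from the slides):

```python
import numpy as np

# Toy sparse parity-check matrix: M = 3 checking nodes (rows), N = 6 variable nodes (columns).
H = np.array([[1, 1, 0, 1, 0, 0],
              [0, 1, 1, 0, 1, 0],
              [1, 0, 1, 0, 0, 1]])

def is_codeword(x):
    """x (0/1 vector) is a codeword iff all parity checks are satisfied mod 2: H x = 0 (mod 2)."""
    return not np.any(H.dot(x) % 2)

def checks_in_spin_variables(sigma):
    """Equivalent 'spin' form: every check demands the product of its +/-1 neighbors to equal +1."""
    return [int(np.prod(sigma[row == 1])) for row in H]

print(is_codeword(np.array([0, 0, 0, 0, 0, 0])))   # True: the all-zero word always passes
print(is_codeword(np.array([1, 1, 0, 0, 0, 0])))   # False: checks 2 and 3 are violated
print(checks_in_spin_variables(np.array([1, 1, 1, 1, 1, 1])))  # [1, 1, 1] for the all-(+1) word
```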
Slide 9
(Figure) The parity-check matrix and the Tanner graph of the (155,64,20) code.
Slide 10
Inference. Given the detected (real-valued) signal, find the most probable (integer-valued) pre-image, i.e. the most likely transmitted codeword: Maximum-Likelihood (ML) decoding.
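Written out (my notation, continuing from the channel model above), ML decoding selects the codeword maximizing the channel likelihood of the received signal:

$$\hat{\sigma}^{\mathrm{ML}}=\arg\max_{\sigma\,\in\,\text{codewords}} P(y\mid\sigma)\;=\;\arg\min_{\sigma\,\in\,\text{codewords}}\sum_{i=1}^{N}(y_i-\sigma_i)^2\quad\text{(white Gaussian channel).}$$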
Slide 11
Optimal decoding and symbol-to-symbol Maximum-A-Posteriori (MAP) decoding (close to optimal). Statistical-mechanics dictionary: the channel log-likelihoods act as a "magnetic" field, the parity checks are hard constraints, the decoding problem defines a "partition function" and a "free energy", and the "magnetization" of a bit gives its a-posteriori log-likelihood. MAP is effective but expensive: the brute-force sum requires a number of operations exponential in the code size. The stat-mech interpretation was suggested by N. Sourlas (Nature '89). Note also the spin-glass (replica) approach for random codes: e.g. Rujan '93; Kanter, Saad '99; Montanari, Sourlas '00; Montanari '01; Franz, Leone, Montanari, Ricci-Tersenghi '02.
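The standard stat-mech form referred to on the slide, written in my notation (conventions may differ from the original slides):

$$P(\sigma\mid y)=\frac{1}{Z}\prod_{\alpha}\delta\Bigl(\prod_{i\in\alpha}\sigma_i,\,+1\Bigr)\,e^{\sum_i h_i\sigma_i},\qquad Z=\sum_{\{\sigma_i=\pm1\}}\prod_{\alpha}\delta\Bigl(\prod_{i\in\alpha}\sigma_i,\,+1\Bigr)\,e^{\sum_i h_i\sigma_i},$$

where $h_i$ is the channel log-likelihood ("magnetic field") for bit $i$ and the product over $\alpha$ runs over the parity checks. The "free energy" is $F=-\ln Z$, the "magnetization" $m_i=\sum_{\sigma}\sigma_i\,P(\sigma\mid y)$ carries the a-posteriori log-likelihood of bit $i$, and symbol MAP decodes bit $i$ as $\hat{\sigma}_i=\mathrm{sign}(m_i)$. The brute-force sum contains $2^N$ terms, hence the exponential cost.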
Slide 12
Sub-optimal but efficient decoding: Belief Propagation (BP, the sum-product algorithm); Gallager '63, Pearl '88, MacKay '99. BP amounts to solving a closed set of equations on the Tanner graph; its iterative solution is Message Passing (MP). The cost is about Q*m*N operations instead of the exponential cost of MAP, where Q is the number of MP iterations and m is the number of checking nodes connected to a variable node. Why is BP a good replacement for MAP? Because BP is exact when the graph has no loops (next slide).
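A minimal sum-product (BP) sketch in the log-likelihood-ratio domain, to make the message-passing structure concrete. The update rules are the textbook sum-product equations; the function names, the dense-matrix message storage, and the fixed iteration count are my own illustrative choices, not the authors' implementation.

```python
import numpy as np

def bp_decode(H, llr, n_iter=16):
    """Sum-product decoding on the Tanner graph of parity-check matrix H.
    H: (M, N) 0/1 matrix; llr: length-N channel log-likelihood ratios h_i.
    Returns hard decisions in {+1, -1} and the final beliefs ("magnetizations")."""
    M, N = H.shape
    vars_of_check = [np.nonzero(H[a])[0] for a in range(M)]     # variables in check a
    checks_of_var = [np.nonzero(H[:, i])[0] for i in range(N)]  # checks containing variable i
    u = np.zeros((M, N))                                        # check-to-variable messages
    v = H * llr[None, :]                                        # variable-to-check messages, init = channel field
    for _ in range(n_iter):
        for a in range(M):                                      # check-to-variable update
            for i in vars_of_check[a]:
                others = [j for j in vars_of_check[a] if j != i]
                prod = np.prod(np.tanh(v[a, others] / 2.0))
                u[a, i] = 2.0 * np.arctanh(np.clip(prod, -0.999999, 0.999999))
        for i in range(N):                                      # variable-to-check update
            for a in checks_of_var[i]:
                others = [b for b in checks_of_var[i] if b != a]
                v[a, i] = llr[i] + np.sum(u[others, i])
    belief = llr + u.sum(axis=0)                                # a-posteriori log-likelihood per bit
    return np.where(belief >= 0, 1, -1), belief
```

For the white Gaussian channel with unit-amplitude "spin" signaling and noise variance sigma_n**2, the channel input would be llr = 2*y/sigma_n**2 for the received vector y.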
Slide 13
Tree (no loops) approximation. Belief Propagation is optimal, i.e. equivalent to Maximum-A-Posteriori decoding, on a tree (a graph with no loops). Analogy: the Bethe lattice (1937). Gallager '63; Pearl '88; MacKay '99; Yedidia, Freeman, Weiss '01.
Slide 14
Bit Error Rate (BER), the measure of unsuccessful decoding. The BER of bit i is the probability of making an error in that bit, averaged over the probability density of the noise ("magnetic field") realization of the channel; by symmetry the all-(+1) codeword is chosen as the transmitted word. Optimization of a digital error-correction scheme:
1. describe the channel/noise (an external input);
2. suggest a coding scheme;
3. suggest a decoding scheme;
4. measure BER/FER;
5. if BER/FER is not satisfactory (small enough), go to 2.
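Spelled out as a formula (my notation; the slide states this in words):

$$\mathrm{BER}_i=\int dn\,P(n)\,\chi_i(n),\qquad \chi_i(n)=\begin{cases}1,&\text{decoding of bit } i \text{ fails for the noise realization } n,\\ 0,&\text{otherwise,}\end{cases}$$

with the all-(+1) codeword transmitted and $P(n)$ the channel noise density.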
Slide 15
(Figure, from R. Urbanke, "Iterative coding systems") BER B versus SNR s, showing the Shannon transition/limit.
Slide 16
Error floor (an effect of finite code size and of the approximate, BP-based decoding). (Figure) Error-floor prediction for some regular (3,6) LDPC codes using a 5-bit decoder, from T. Richardson, "Error floors of LDPC codes", 2003 Allerton Conference Proceedings. The error-floor region is a no-go zone for brute-force Monte Carlo numerics: estimating very low BER is the major bottleneck in coding theory and practice.
Slide 17
Our (current) objective: for a given (a) channel, (b) coder, (c) decoder, estimate the BER by analytical and/or semi-analytical methods. Hint: the BER is small and it is mainly formed at some very special "bad" configurations of the noise ("magnetic field"). The instanton approach is the right way to identify these "bad" configurations and thus to estimate the BER.
Slide 18
Instanton method (the Laplace method, saddle-point method, steepest descent). The noise space is divided by the error surface (ES) into an "errors" region and a "no errors" region. BER = integral of d(noise) Weight(noise) over the error region; in the saddle-point evaluation this integral is dominated by the instanton configuration of the noise, the point on the ES of maximal weight, which for the Gaussian channel is the point on the ES closest to zero.
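In formulas (my paraphrase of the slide, with the notation used above):

$$\mathrm{BER}=\int_{\text{error region}} dn\,P(n)\;\simeq\;P(n_{\mathrm{inst}}),\qquad n_{\mathrm{inst}}=\arg\max_{n\,\in\,\mathrm{ES}} P(n);$$

for white Gaussian noise $P(n)\propto\exp\bigl(-\sum_i n_i^2/2\sigma_n^2\bigr)$, so the instanton $n_{\mathrm{inst}}$ is the point of the error surface closest to the origin.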
Slide 19
(Figure, repeating Slide 9) The parity-check matrix and the Tanner graph of the (155,64,20) code.
Slide 20
(Figure) Instanton configurations found with the numerical instanton-amoeba scheme.
Slide 21
(Figure) Instantons for the (155,64,20) code, Gaussian channel. Phys. Rev. Lett. 95, 228701 (November 25, 2005).
Slide 23
Conclusions (for the first part, error-floor analysis). We suggested the instanton-amoeba method for efficient numerical evaluation of the BER in the high-SNR regime (the error floor). The main idea: the error floor is controlled by only a few most damaging configurations of the noise (instantons). The instanton-amoeba results are successfully validated against brute-force Monte Carlo in the regime of moderate SNR.
Slide 24
Path forward.
- Extend the instanton-amoeba study of the error floor into a universal computational toolbox for error-floor analysis: other codes; other decoding schemes (e.g. different numbers of iterations); other channels (e.g. specific to magnetic recording and fiber optics).
- Major challenge: improve BP qualitatively. New decoding?! New coding?!
- Efficient (channel-specific) LDPC optimization.
- Inter-symbol interference plus noise (2d and 3d, plus error-correction).
- Distributed coding, network coding.
- Combinatorial optimization.
Slide 25
Understanding Belief Propagation. Questions: Why does BP work so well, even when it should not? Can one construct the full (MAP) solution from BP? Answers: BP is a gauge-fixing condition; and yes, one can, via the loop series (arxiv.org/abs/cond-mat/0601487, arxiv.org/abs/cond-mat/0603189). Making use of the loop calculus/series: improving BP with approximate algorithms, LDPC decoding, SATisfiability resolution, data reconstruction, clustering, etc.
Slide 26
Vertex model. Ising (+/-1) variables live on the edges of the graph; each vertex carries a factor that couples its adjacent edge variables. The partition function is the sum over all edge-variable configurations of the product of vertex factors, and the probability of a configuration is that product normalized by the partition function. The error-correction (decoding) problem reduces to a vertex model on the bipartite Tanner graph.
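A compact transcription of the vertex-model formulas (following the general setup of the cited preprints; the notation is mine and conventions may differ):

$$Z=\sum_{\{\sigma_{ab}=\pm1\}}\prod_a f_a(\boldsymbol{\sigma}_a),\qquad P(\{\sigma_{ab}\})=\frac{1}{Z}\prod_a f_a(\boldsymbol{\sigma}_a),$$

where $\boldsymbol{\sigma}_a=\{\sigma_{ab}:\,b\ \text{adjacent to}\ a\}$ is the set of Ising variables on the edges attached to vertex $a$ and $f_a$ is the non-negative factor at that vertex.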
Slide 27
Bethe free energy, the variational approach (a generalization of Yedidia, Freeman, Weiss '01). The Bethe free energy is a functional of vertex and edge beliefs, combining a self-energy term, an entropy term, and an edge-entropy correction. Minimizing it under the consistency constraints (introduced in the minimization through Lagrange multipliers) yields the Belief Propagation (Bethe-Peierls) equations.
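A schematic write-up of the Bethe free energy for the vertex model above (the standard Yedidia-Freeman-Weiss form; signs and conventions may differ from the preprints):

$$F_{\mathrm{Bethe}}\{b\}=\sum_a\sum_{\boldsymbol{\sigma}_a} b_a(\boldsymbol{\sigma}_a)\,\ln\frac{b_a(\boldsymbol{\sigma}_a)}{f_a(\boldsymbol{\sigma}_a)}\;-\;\sum_{(a,b)}\sum_{\sigma_{ab}} b_{ab}(\sigma_{ab})\,\ln b_{ab}(\sigma_{ab}),$$

minimized subject to normalization, $\sum_{\boldsymbol{\sigma}_a} b_a(\boldsymbol{\sigma}_a)=1$, and edge consistency, $\sum_{\boldsymbol{\sigma}_a\setminus\sigma_{ab}} b_a(\boldsymbol{\sigma}_a)=b_{ab}(\sigma_{ab})$. The $-b_a\ln f_a$ part is the self-energy, the $b_a\ln b_a$ part is the vertex entropy, and the edge term is the entropy correction; stationarity with respect to the beliefs, with the constraints enforced by Lagrange multipliers, gives the Belief Propagation (Bethe-Peierls) equations.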
Slide 28
Loop series. The exact partition function is expressed as the BP result multiplied by a finite series whose terms are associated with the loops of the graph, and every term is calculated from the beliefs (probabilities) obtained within BP. So BP is special not only in the absence of loops! The construction rests on a gauge-invariant representation of the partition function. Three alternative derivations: an integral representation, an algebraic representation, and a gauge representation.
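Schematically (my paraphrase; the precise definition of the loop contributions is given in arxiv.org/abs/cond-mat/0601487):

$$Z=Z_{\mathrm{BP}}\Bigl(1+\sum_{C} r_C\Bigr),$$

where the sum runs over the generalized loops $C$ of the graph (subgraphs with no vertices of degree one) and each term $r_C$ is expressed solely through the beliefs calculated at the BP fixed point.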
Slide 29
Loop series, derivation #1. The partition function is rewritten in terms of "vertex" and "propagator" factors; the decomposition contains gauge degrees of freedom, which are at our disposal.
Slide 30
Loop series, derivation #2. Expand the "vertex" (edge) term and calculate the resulting terms one by one; each node enters the product only once, and a node is colored if it contains at least one colored edge. The gauge degrees of freedom (at our disposal) are fixed by the condition that forbids a "loose end" contribution at any node.
Slide 31
Loop series, derivation #3. The gauge is fixed so as to kill the unwanted terms; this gauge-fixing condition is nothing but the Belief Propagation equations, and with it the loop series has just been derived.
Slide 32
Conclusions (for the second part, understanding/improving BP): loopy BP works well because BP is nothing but a gauge-fixing condition; a simple finite series, the loop series, for MAP is constructed in terms of the BP solution. Future work: approximate algorithms (leading loop, next after leading, ...), application to LDPC decoding, different graphs and lattices; generalizations from Ising to Potts (longer alphabets) and to continuous alphabets (XY, Heisenberg, quantum models).
Slide 34
Instantons on the tree (semi-analytical). PRL 93, 198702 (2004); ITW 2004, San Antonio. (Figures for the cases m=2, l=3, n=3 and m=3, l=5, n=2.)
Slide 35
Instanton-amoeba (an efficient numerical scheme). The BER estimate is optimized with respect to a unit vector in the noise space pointing towards the error surface; the minimization method of our choice is simplex minimization (amoeba). (Figure: instanton-amoeba applied to the Tanner code.)
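A minimal sketch of the simplex ("amoeba") optimization for the Gaussian channel, under stated assumptions: decoder_fails(noise) is a hypothetical hook into a BP decoder that reports whether decoding of the all-(+1) codeword fails for a given noise vector, and the bisection-plus-Nelder-Mead strategy below is my own illustration of the scheme named on the slide, not the authors' code.

```python
import numpy as np
from scipy.optimize import minimize

def boundary_scale(direction, decoder_fails, lo=0.0, hi=20.0, iters=40):
    """Smallest amplitude r (found by bisection) such that the decoder fails for noise = r * direction."""
    d = direction / np.linalg.norm(direction)
    for _ in range(iters):
        mid = 0.5 * (lo + hi)
        if decoder_fails(mid * d):
            hi = mid
        else:
            lo = mid
    return hi

def effective_action(direction, decoder_fails):
    """Gaussian weight exponent (half the squared distance to the error surface) along `direction`."""
    r = boundary_scale(direction, decoder_fails)
    return 0.5 * r ** 2          # white Gaussian channel, unit noise variance

def find_instanton(N, decoder_fails, seed=0):
    """Minimize the effective action over noise directions with the simplex (amoeba) method."""
    rng = np.random.default_rng(seed)
    x0 = rng.normal(size=N)      # random initial direction
    res = minimize(effective_action, x0, args=(decoder_fails,), method='Nelder-Mead',
                   options={'maxiter': 20000, 'xatol': 1e-6, 'fatol': 1e-8})
    d = res.x / np.linalg.norm(res.x)
    return boundary_scale(d, decoder_fails) * d   # the instanton noise configuration
```

The saddle-point BER estimate then follows from the effective action of the returned configuration.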
Slide 36
Different noise models for different channels: white Gaussian and Laplacian, with the simplifying assumptions of a linear, symmetric channel.
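For reference, the two noise weights in a standard normalization (my notation):

$$P_{\mathrm{Gauss}}(n)\propto\exp\Bigl(-\frac{1}{2\sigma_n^2}\sum_i n_i^2\Bigr),\qquad P_{\mathrm{Lapl}}(n)\propto\exp\Bigl(-\frac{1}{\lambda}\sum_i |n_i|\Bigr),$$

so the instanton is the point of the error surface minimizing $\sum_i n_i^2$ for the Gaussian channel and $\sum_i |n_i|$ for the Laplacian channel.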
Slide 37
Rational structure of the instanton, explained through a computational-tree analysis (based on Wiberg '96): min-sum decoding, 4 iterations. Phys. Rev. Lett. 95, 228701 (2005). The instanton minimizes the effective action subject to the decoding-failure condition on the computational tree.
Slide 38
(Figure) Bit error rate, Gaussian channel.
Slide 39
(Figure) Instantons for the (155,64,20) code, Laplacian channel. IT workshop, Allerton 09/2005.
Slide 40
(Figure) Instantons as medians of pseudo-codewords. Phys. Rev. Lett. 95, 228701 (November 25, 2005).
Slide 41
(Figure) Bit error rate, Laplacian channel.