Iterative Soft-Decision Decoding of Algebraic-Geometric Codes. Li Chen, Associate Professor, School of Information Science and Technology, Sun Yat-sen University


1 Iterative Soft-Decision Decoding of Algebraic-Geometric Codes. Li Chen, Associate Professor, School of Information Science and Technology, Sun Yat-sen University, Guangzhou, China. Email: chenli55@mail.sysu.edu.cn; website: sist.sysu.edu.cn/~chenli. Presented at the Institute of Network Coding and Department of Information Engineering, the Chinese University of Hong Kong, 1st of August, 2012.

2 Outline
Introduction (how to construct an algebraic-geometric code)
Review of Koetter-Vardy list decoding (challenges in the decoding)
Iterative soft-decision decoding (an iterative solution)
Geometric interpretation of the iterative decoding (an insight into the solution)
Complexity reduction approaches (some implementation advice)
Performance analysis (advantages and costs)
Conclusions (an end and a beginning)

3 I. Introduction
The construction of an algebraic-geometric (AG) code, based on an algebraic curve χ(x, y, z):
- Identify its point at infinity p∞ and define the pole basis Φ;
- Pick one of the affine components, e.g., χ(x, y, 1), and find its affine points p_j.
The Reed-Solomon (RS) code is the simplest AG code:
- Constructed from the curve y = 0;
- Its pole basis is Φ = {1, x, x², x³, x⁴, ...};
- Its affine points are {x_1, x_2, x_3, ..., x_n} = F_q \ {0};
- Note: the length of the code cannot exceed the size of the finite field.
From the pole basis and the affine points, the generator matrix G and the parity-check matrix H are obtained.
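Viewing the RS code this way, the generator matrix is just the pole-basis monomials evaluated at the affine points. A minimal sketch, using a prime field GF(p) for simplicity (extension fields are analogous; the function name is illustrative):

```python
def rs_generator_matrix(p, k):
    """(n, k) RS generator matrix over the prime field GF(p), viewed as an
    AG code: row i is the pole-basis monomial x^i evaluated at the n = p - 1
    nonzero field elements (the affine points, 0 excluded)."""
    points = list(range(1, p))
    return [[pow(x, i, p) for x in points] for i in range(k)]

G = rs_generator_matrix(5, 2)
print(G)  # [[1, 1, 1, 1], [1, 2, 3, 4]]
```

Each codeword is then u·G, i.e., the message polynomial evaluated at the affine points.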

4 I. Introduction
The Hermitian curve H_w(x, y, z) = x^(w+1) + y^w z + y z^w, defined over F_q with q = w²:
- Its point at infinity is p∞ = (0, 1, 0);
- Its pole basis Φ consists of the bivariate monomials x^a y^b (with b < w), ordered by their pole orders aw + b(w + 1);
- Based on one of its affine components H_w(x, y, 1), determine the affine points p_j = (x_j, y_j, 1) where x_j^(w+1) + y_j^w + y_j = 0 and j = 1, 2, ..., n.
Encoding of an (n, k) Hermitian code:
- Given the message vector u = (u_0, u_1, ..., u_(k-1)), form the message polynomial f = Σ_i u_i φ_i from the first k pole-basis functions;
- The codeword is generated by c = (f(p_1), f(p_2), ..., f(p_n));
- Note n = w³ > q: the length of the code can exceed the size of the finite field!

5 I. Introduction
Example: construction of an (8, 4) Hermitian code:
- Defined over F_4 = {0, 1, α, α²};
- The Hermitian curve is H_2(x, y, z) = x³ + y²z + yz², with point at infinity p∞ = (0, 1, 0); one of its affine components is H_2(x, y, 1) = x³ + y² + y;
- Its pole basis is Φ = {1, x, y, x², xy, ...} (monomials ordered by increasing pole order);
- Its affine points are: p_1 = (0, 0), p_2 = (0, 1), p_3 = (1, α), p_4 = (1, α²), p_5 = (α, α), p_6 = (α, α²), p_7 = (α², α), p_8 = (α², α²).
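The eight affine points above can be recovered by brute force. A small sketch, assuming the integer encoding 0, 1, 2 (= α), 3 (= α²) of GF(4) in a polynomial basis, so that field addition is XOR:

```python
# Enumerate the affine points of the Hermitian curve H_2(x, y, 1) = x^3 + y^2 + y
# over GF(4).  Elements are encoded as integers in a polynomial basis:
# 0 -> 0, 1 -> 1, 2 -> alpha, 3 -> alpha^2 (= alpha + 1), so addition is XOR.

EXP = [1, 2, 3]            # alpha^0, alpha^1, alpha^2
LOG = {1: 0, 2: 1, 3: 2}   # discrete logs of the nonzero elements

def gf4_mul(a, b):
    """Multiply two GF(4) elements via log/antilog tables."""
    if a == 0 or b == 0:
        return 0
    return EXP[(LOG[a] + LOG[b]) % 3]

def gf4_pow(a, n):
    r = 1
    for _ in range(n):
        r = gf4_mul(r, a)
    return r

# Collect every (x, y) with x^3 + y^2 + y = 0 (GF(4) addition is XOR).
points = [(x, y) for x in range(4) for y in range(4)
          if gf4_pow(x, 3) ^ gf4_pow(y, 2) ^ y == 0]

print(points)  # the 8 points p_1, ..., p_8 listed on the slide
```

With α encoded as 2 and α² as 3, the output matches p_1 = (0, 0) through p_8 = (α², α²).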

6 I. Introduction
Advantage: over the same finite field, Hermitian codes are longer than RS codes.
Code length vs. field size:

Field    | F_4 | F_16 | F_64 | F_256
RS code  | 3   | 15   | 63   | 255
H. code  | 8   | 64   | 512  | 4096

Disadvantage: it is not a Maximum Distance Separable (MDS) code, and the error-correction capability of a very high-rate Hermitian code almost vanishes.

7 II. Review of KV list decoding
Decoding philosophy evolution: from unique decoding (the Sakata algorithm with majority voting) to list decoding (the Guruswami-Sudan (GS) algorithm and its soft-decision extension, the Koetter-Vardy (KV) algorithm).

8 II. Review of KV list decoding
Key processes: reliability transform (Π → M), interpolation (construct Q(x, y, z)), and factorisation (find the z-roots of Q).
Reliability transform and the role of M (example: the (8, 4) Hermitian code): the channel observations R_1, ..., R_8 for the transmitted symbols C_1, ..., C_8 yield a 4 × 8 reliability matrix Π, with rows indexed by the field elements 0, 1, α, α² and columns by the affine points p_1, ..., p_8; the reliability transform converts Π into a multiplicity matrix M. [Figure: encoding and channel chain together with the resulting Π and M matrices.]
E.g., interpolation will be performed w.r.t. (p_5, 1) with a multiplicity of 2. The number of interpolation constraints is Σ_j Σ_i m_ij(m_ij + 1)/2.

9 II. Review of KV list decoding
Reliability-based codeword score and multiplicity-based codeword score: given a codeword c̄ = (c_1, ..., c_n), its reliability score is S_Π(c̄) = Σ_j π_(c_j, j) and its multiplicity score is S_M(c̄) = Σ_j m_(c_j, j). In the example matrices shown on the slide, S_Π(c̄) = 2.14 and S_M(c̄) = 5.
Theorem 1: If S_Π(c̄) exceeds a threshold determined by the interpolation cost, c̄ can be found by determining the z-roots of Q.
Theorem 2: If S_M(c̄) exceeds a threshold determined by the interpolation cost, c̄ can be found by determining the z-roots of Q.
The optimal decoding performance of the KV algorithm is dictated by Π.
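The two scores are just column look-ups summed along the codeword. A sketch with an illustrative 4 × 8 multiplicity matrix (the values are made up, not the slide's):

```python
def codeword_score(matrix, codeword):
    """S(c) = sum over positions j of matrix[c_j][j], where row r of the
    matrix corresponds to field element r and column j to affine point p_j.
    Works for both the reliability matrix Pi and the multiplicity matrix M."""
    return sum(matrix[c][j] for j, c in enumerate(codeword))

# Illustrative multiplicity matrix M (4 rows for 0, 1, alpha, alpha^2).
M = [[2, 0, 0, 1, 0, 0, 0, 0],
     [0, 2, 0, 0, 0, 1, 0, 0],
     [0, 0, 2, 0, 2, 0, 1, 0],
     [0, 0, 0, 1, 0, 1, 0, 2]]
c = [0, 1, 2, 0, 2, 3, 2, 3]   # candidate codeword, symbol-encoded
print(codeword_score(M, c))    # S_M(c) = 2+2+2+1+2+1+1+2 = 13
```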

10 II. Review of KV list decoding
KV decoding performance of the (64, 39) Hermitian code. [Figure: performance curves.]
Challenge: since the performance is dictated by Π, can we further improve the KV decoding performance for Hermitian codes?

11 III. Iterative Soft-Decision Decoding
Decoding stages:
- ABP: adaptive belief propagation, to improve the reliability of Π;
- KV: Koetter-Vardy list decoding, to find the message vector.
Decoding block diagram: Π → ABP → Π' → KV.

12 III. Iterative Soft-Decision Decoding
Binary image of the parity-check matrix H:
Let σ(x) = σ_0 + σ_1 x + ∙∙∙ + σ_β x^β be the primitive polynomial of F_(2^β). Each entry of H is replaced by the corresponding power of the companion matrix of σ(x), giving the binary matrix H_b.
Example: in F_4, σ(x) = 1 + x + x² and the companion matrix is C = [0 1; 1 1].
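The binary image can be built mechanically. A sketch, assuming the {1, α} basis of F_4 so that α maps to the companion matrix C and α² to C² (helper names are illustrative):

```python
# Build the binary image of a parity-check matrix over GF(4) by substituting
# each symbol with a power of the companion matrix of sigma(x) = 1 + x + x^2.
# Symbols are encoded as 0, 1, 2 (= alpha), 3 (= alpha^2).

def mat_mul_gf2(A, B):
    """Multiply binary matrices modulo 2."""
    m, p = len(B), len(B[0])
    return [[sum(A[i][k] & B[k][j] for k in range(m)) % 2
             for j in range(p)] for i in range(len(A))]

# Companion matrix of sigma(x) = 1 + x + x^2: multiplication by alpha
# in the basis {1, alpha} of GF(4) over GF(2).
C = [[0, 1],
     [1, 1]]
I2 = [[1, 0], [0, 1]]
ZERO = [[0, 0], [0, 0]]

# Block for each GF(4) element: 0 -> zero block, alpha^i -> C^i.
BLOCK = {0: ZERO, 1: I2, 2: C, 3: mat_mul_gf2(C, C)}

def binary_image(H):
    """Expand an r x n matrix over GF(4) into a 2r x 2n binary matrix."""
    Hb = []
    for row in H:
        blocks = [BLOCK[sym] for sym in row]
        for bit_row in range(2):
            Hb.append([b[bit_row][j] for b in blocks for j in range(2)])
    return Hb

print(BLOCK[3])  # C^2, the block for alpha^2: [[1, 1], [1, 0]]
```

Note that C² equals I + C, consistent with α² = 1 + α.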

13 III. Iterative Soft-Decision Decoding
Is H_b suitable for BP decoding?
- Density of the matrix: 53.125%;
- The number of short cycles (length-4 cycles): 279.
We will have to reduce the density and eliminate some of the short cycles!
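Both figures can be checked mechanically: density is the fraction of ones, and each pair of rows sharing s columns contributes s(s-1)/2 length-4 cycles. A sketch (function names are illustrative):

```python
from itertools import combinations

def density(H):
    """Fraction of nonzero entries in the binary matrix H."""
    ones = sum(sum(row) for row in H)
    return ones / (len(H) * len(H[0]))

def count_4cycles(H):
    """Count length-4 cycles in the Tanner graph of H: every pair of rows
    sharing s columns with ones contributes s*(s-1)/2 cycles."""
    total = 0
    for r1, r2 in combinations(H, 2):
        shared = sum(a & b for a, b in zip(r1, r2))
        total += shared * (shared - 1) // 2
    return total
```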

14 III. Iterative Soft-Decision Decoding
Bit-reliability-oriented Gaussian elimination on H_b:
- Assume each coded bit c_j is BPSK modulated to symbol s_j (j = 1, 2, ..., N);
- Let y = (y_1, y_2, ..., y_N) be the received vector;
- The bit log-likelihood ratio (LLR) value is L(c_j) = ln(Pr[c_j = 0 | y_j] / Pr[c_j = 1 | y_j]), and the LLR vector is L = (L(c_1), ..., L(c_N));
- The reliability of bit c_j is determined by |L(c_j)|.
E.g., Pr[c_1 = 0 | y_1] = 0.49 and Pr[c_1 = 1 | y_1] = 0.51 give |L(c_1)| = 0.04, while Pr[c_2 = 0 | y_2] = 0.93 and Pr[c_2 = 1 | y_2] = 0.07 give |L(c_2)| = 2.59. Bit c_2 is more reliable!
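Under BPSK over AWGN the LLR has a simple closed form. A minimal sketch (the 2y/σ² expression assumes the 0 → +1, 1 → −1 mapping):

```python
import math

def bit_llrs(received, noise_var):
    """LLR of each bit under BPSK (0 -> +1, 1 -> -1) over an AWGN channel:
    L(c_j) = ln(Pr[c_j = 0 | y_j] / Pr[c_j = 1 | y_j]) = 2 * y_j / sigma^2."""
    return [2.0 * y / noise_var for y in received]

def bit_probs(llr):
    """Convert an LLR back to (Pr[c = 0], Pr[c = 1])."""
    p0 = 1.0 / (1.0 + math.exp(-llr))
    return p0, 1.0 - p0
```

With L = −0.04 and L = 2.59 this reproduces the slide's probability pairs (0.49, 0.51) and (0.93, 0.07).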

15 III. Iterative Soft-Decision Decoding
Bit reliability sorting: sort the bits in terms of their reliabilities, yielding refreshed bit indices j_1, j_2, ..., j_N that indicate |L(c_(j_1))| < |L(c_(j_2))| < ∙∙∙ < |L(c_(j_N))|.
Let B be a set of bit indices; e.g., based on the above sorting outcome, let B = {j_1, j_2, ..., j_(N-K)}, which collects the N - K least reliable bit indices.
The sorted LLR vector is then (L(c_(j_1)), L(c_(j_2)), ..., L(c_(j_N))).

16 III. Iterative Soft-Decision Decoding
Perform Gaussian elimination w.r.t. the N - K columns indicated by B, i.e., reduce column j_1 to [1 0 0 ∙∙∙ 0]^T, column j_2 to [0 1 0 ∙∙∙ 0]^T, ..., and column j_(N-K) to [0 0 0 ∙∙∙ 1]^T.
Gaussian elimination: H_b → H_b' (both the density and the number of short cycles are reduced).
Matrix H_b' is more suitable for BP decoding.
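The column-targeted elimination can be sketched as follows (a straightforward GF(2) routine, not the paper's exact implementation):

```python
def adapt_matrix(Hb, cols):
    """Gaussian-eliminate the binary matrix Hb (list of 0/1 row lists, mod-2
    arithmetic) so that the columns listed in `cols` -- here the N - K least
    reliable bit positions -- form an identity submatrix where possible."""
    Hb = [row[:] for row in Hb]   # work on a copy
    rows = len(Hb)
    pivot_row = 0
    for j in cols:
        # Find a row at or below pivot_row with a 1 in column j.
        pivot = next((i for i in range(pivot_row, rows) if Hb[i][j]), None)
        if pivot is None:
            continue              # column is linearly dependent; skip it
        Hb[pivot_row], Hb[pivot] = Hb[pivot], Hb[pivot_row]
        # Cancel column j in every other row (row XOR = GF(2) addition).
        for i in range(rows):
            if i != pivot_row and Hb[i][j]:
                Hb[i] = [a ^ b for a, b in zip(Hb[i], Hb[pivot_row])]
        pivot_row += 1
    return Hb

print(adapt_matrix([[1, 1, 0], [1, 0, 1]], [0, 1]))
# columns 0 and 1 now hold the 2x2 identity: [[1, 0, 1], [0, 1, 1]]
```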

17 III. Iterative Soft-Decision Decoding
The conventional BP decoding based on H_b':
Let h_ij denote the entries of matrix H_b'. Define the variable-to-check messages v_ij and the check-to-variable messages u_ij.
Initialization: v_ij = L(c_j) for all (i, j) with h_ij = 1, and u_ij = 0.
For each BP iteration:
- Horizontal step (V → U): each check node i updates u_ij from the incoming messages v_ij' of the other connected bits;
- Vertical step (U → V): each bit node j updates v_ij as L(c_j) plus the incoming messages u_i'j from the other connected checks.
After a number of BP iterations, update the bit LLR values as L'(c_j) = L(c_j) + η ∙ L_ext(c_j), where the extrinsic LLR is L_ext(c_j) = Σ_(i: h_ij = 1) u_ij and η is the damping factor.
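The exact update equations are not legible in this transcript; the standard sum-product (tanh-rule) updates are assumed in this sketch, together with the damped output update from the slide:

```python
import math

def bp_update(H, llr, iters=3, eta=0.5):
    """Sum-product BP on the adapted binary matrix H (list of 0/1 rows),
    returning damped updated LLRs: L'(c_j) = L(c_j) + eta * L_ext(c_j)."""
    rows, cols = len(H), len(H[0])
    # v[i][j]: variable-to-check messages, initialised with channel LLRs.
    v = [[llr[j] if H[i][j] else 0.0 for j in range(cols)] for i in range(rows)]
    u = [[0.0] * cols for _ in range(rows)]
    for _ in range(iters):
        # Horizontal step (check nodes): tanh rule over the other edges.
        for i in range(rows):
            idx = [j for j in range(cols) if H[i][j]]
            for j in idx:
                prod = 1.0
                for j2 in idx:
                    if j2 != j:
                        prod *= math.tanh(v[i][j2] / 2.0)
                prod = max(min(prod, 0.999999), -0.999999)  # numeric guard
                u[i][j] = 2.0 * math.atanh(prod)
        # Vertical step (variable nodes): channel LLR plus the other checks.
        for j in range(cols):
            idx = [i for i in range(rows) if H[i][j]]
            for i in idx:
                v[i][j] = llr[j] + sum(u[i2][j] for i2 in idx if i2 != i)
    # Damped output LLRs using the extrinsic sum.
    return [llr[j] + eta * sum(u[i][j] for i in range(rows) if H[i][j])
            for j in range(cols)]
```

For a single parity check over two equally reliable bits, each extrinsic LLR equals the other bit's LLR, so `bp_update([[1, 1]], [2.0, 2.0], iters=1, eta=0.5)` returns LLRs close to 3.0.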

18 III. Iterative Soft-Decision Decoding
The updated LLR vector can be formed as L' = (L'(c_1), L'(c_2), ..., L'(c_N)).
The updated bit LLR values are converted back into a posteriori probability (APP) values by Pr[c_j = 0 | y_j] = 1 / (1 + e^(-L'(c_j))) and Pr[c_j = 1 | y_j] = 1 - Pr[c_j = 0 | y_j].
They can then be used to generate the improved reliability matrix Π', which feeds the KV stages: reliability transform → M → interpolation → factorization.
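Converting the updated bit LLRs into one column of Π' multiplies the per-bit APPs of the symbol's bits; the bit-to-symbol mapping used here (bit 0 least significant) is an assumption of this sketch:

```python
import math

def symbol_reliability_column(bit_llrs):
    """Turn the beta updated bit LLRs of one symbol into a column of Pi':
    Pr[symbol = s] is the product of its bits' APP values, where
    Pr[c = 0] = 1 / (1 + e^(-L)) and Pr[c = 1] = 1 - Pr[c = 0]."""
    beta = len(bit_llrs)
    p0 = [1.0 / (1.0 + math.exp(-L)) for L in bit_llrs]
    col = []
    for s in range(1 << beta):
        p = 1.0
        for b in range(beta):
            bit = (s >> b) & 1
            p *= (1.0 - p0[b]) if bit else p0[b]
        col.append(p)
    return col
```

Each column sums to 1, and strongly positive LLRs concentrate the probability mass on the all-zero symbol.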

19 III. Iterative Soft-Decision Decoding
A worked example: iterative decoding of the (8, 4) Hermitian code.
Given the transmitted codeword (symbol-wise and bit-wise) and the received LLR vector, some LLR values give a wrong estimation of their bits (recall L(c_j) ≥ 0 → c_j = 0 and L(c_j) < 0 → c_j = 1).
From the original reliability matrix Π, the intended codeword's score is S_Π(c̄) = 3.969, below the Theorem 1 threshold of 3.993: KV decoding will fail!

20 III. Iterative Soft-Decision Decoding
Sort the bits in ascending order in terms of |L(c_j)|, yielding the sorted index sequence 7, 10, 0, 12, 8, 3, 11, 13, 2, 14, 15, 4, 1, 6, 9, 5; let B collect the least reliable indices and B^c the remainder.
Perform Gaussian elimination on the columns indicated by B:
H_b (density: 53.125%; short cycles: 279) → H_b' (density: 37.5%; short cycles: 112).

21 III. Iterative Soft-Decision Decoding
Based on H_b', perform 3 BP iterations; we then have the updated LLR vector and the updated reliability matrix Π'.
For the 'wrong' LLR values, we would like to change their signs, or at least reduce their magnitudes; for the 'right' LLR values, we would like to leave the signs unchanged and increase their magnitudes.
After the update, S_Π(c̄) = 4.478, above the threshold of 4.037: based on Theorem 1, KV decoding will succeed!

22 III. Iterative Soft-Decision Decoding
Why should the Gaussian elimination be bit-reliability oriented? After the adaptation, the columns of the unreliable bits form an identity submatrix, so in the Tanner graph of H_b' each unreliable bit (e.g., c_5 and c_7 in the example) connects to only one check node: its LLR is corrected by extrinsic information from the reliable bits, while its own unreliability is not propagated. [Figure: Tanner graph showing the reliable and unreliable bit nodes.]

23 III. Iterative Soft-Decision Decoding
How can the iterative decoding performance be improved further? It is possible that reliable bits are wrongly estimated by their LLR values. We can therefore create different sets of bit indices B, letting the columns of more bits also fall into the identity submatrix of H_b'.
Example with the sorted bit indices being {7, 10, 0, 12, 8, 3, 11, 13, 2, 14, 15, 4, 1, 6, 9, 5}:
- B^(1): Gaussian elimination based on {7, 10, 0, 12, 8, 3, 11, 13, 2, 14, 15, 4, 1, 6, 9, 5};
- B^(2): Gaussian elimination based on {3, 11, 13, 2, 14, 7, 10, 0, 12, 8, 15, 4, 1, 6, 9, 5};
- B^(3): Gaussian elimination based on {15, 4, 1, 6, 9, 7, 10, 0, 12, 8, 3, 11, 13, 2, 14, 5}.

24 III. Iterative Soft-Decision Decoding
A revisit of the decoding block diagram:
- Note that if there are multiple matrix adaptations, the next bit reliability sorting is performed on the updated LLR vector L';
- Multiple attempts of KV decoding result in an output list that contains all the message candidates; the maximum likelihood (ML) criterion is used to select one from the list.

25 IV. Geometric Interpretation
Insight into why we need matrix adaptations before the BP decoding:
Normalize the LLR vector L to a vector T = (T_1, ..., T_N), i.e., normalize each L(c_j) to T_j by a mapping function, and take a graphical look at the two vectors.

26 IV. Geometric Interpretation
[Figures: the T vector when the codeword is not found vs. when it is found.]
When a codeword is found, T_j = 1 for j = 1, 2, ..., N.

27 IV. Geometric Interpretation
Objective of the BP decoding: find the vector T that minimizes the potential function. The LLR update in the BP decoding can be seen as an update of the T values, and finding the estimated codeword using the BP algorithm can be seen as identifying the vertex at which the potential function is minimized.

28 IV. Geometric Interpretation
The convergence behavior of the potential function for the (64, 39) Hermitian code. [Figure: the potential function converging towards its minimum, around -100 in the plot.]

29 V. Complexity Reduction
Decoding parameters:
- the number of groups of unreliable bit indices;
- the number of matrix adaptations (Gaussian eliminations);
- the number of BP iterations.
There are three types of computations required by the decoding:
- binary operations (Gaussian eliminations);
- floating-point operations (BP iterations);
- finite field arithmetic operations (KV decodings).
[Table: counts of the three operation types for a particular choice of the iterative decoding parameters.]

30 V. Complexity Reduction
Reduce the deployment of the KV decoding steps. In the ABP-KV decoding block diagram (ABP → Π' → reliability transform → M → interpolation → factorization), we can assess the quality of the matrices Π' and M; if they are not good enough to result in a possibly successful decoding, the following KV decoding process will NOT be carried out.

31 V. Complexity Reduction
Reliability-based received word score and multiplicity-based received word score, illustrated with the (8, 4) Hermitian code: in the example, the reliability-based score is 6.7 and the multiplicity-based score is 15. [Figure: the Π' and M matrices, with rows indexed by 0, 1, α, α².]

32 V. Complexity Reduction
Recall the two theorems for successful KV decoding, and pair them with their converses:
- Theorem 1: if the codeword's reliability score satisfies the threshold condition, KV can succeed;
- Theorem 2: if the codeword's multiplicity score satisfies the threshold condition, KV can succeed;
- Lemma 3: if the reliability-based received word score fails the corresponding condition, KV cannot succeed;
- Lemma 4: if the multiplicity-based received word score fails the corresponding condition, KV cannot succeed.
These criteria are applied to Π' and M in the ABP-KV block diagram to decide whether interpolation and factorization should be carried out.

33 V. Complexity Reduction
Complexity reduction for ABP-KV decoding of the (64, 52) Hermitian code, with decoding parameters (10, 5, 2): there are up to 50 KV decoding processes for each codeword frame. [Figure: the number of KV decodings actually performed vs. SNR.]

34 V. Complexity Reduction
Other facilitated decoding approaches:
- Parallel decoding: the decoding attempts based on the different index sets B^(i) can run in parallel;
- Output validation: once the intended message is found, the iterative decoding is terminated.

35 VI. Performance Analysis
Decoding parameters: the KV decoding output list size (l) and the iterative decoding parameters.
The (8, 4) Hermitian code over the AWGN channel. [Figure: performance curves.]

36 VI. Performance Analysis The (64, 39) Hermitian code over the AWGN channel

37 VI. Performance Analysis The (64, 47) Hermitian code over the AWGN channel

38 VI. Performance Analysis The (64, 47) Hermitian code over the fast Rayleigh fading channel Coherent detection with the knowledge of CSI

39 VI. Performance Analysis Herm. (64, 47) vs. RS (15, 11), over the AWGN channel

40 VII. Conclusions
- Revisited the construction of AG codes: pole basis + affine points;
- Reviewed the KV soft-decision list decoding algorithm: its performance is dictated by Π;
- Introduced an iterative soft-decision decoding algorithm for Hermitian codes: adaptive belief propagation + KV list decoding;
- The ABP algorithm is bit-reliability oriented → BP is also good for AG (and RS) codes;
- Geometric interpretation → the necessity of performing parity-check matrix adaptation;
- Complexity reduction: successive criteria to assess Π' and M; parallel decoding; output validation;
- Performance analysis shows a significant gain can be achieved relative to conventional algorithms and to RS codes.

41 Acknowledgement
Project: Advanced coding technology for future storage devices; ID: 61001094; from 2011.1 to 2013.12; funded by the National Natural Science Foundation of China.
Thank you!

