CUHK, May 2010
David Haccoun, Eng., Ph.D.
Professor of Electrical Engineering
Department of Electrical Engineering, École Polytechnique de Montréal
Life Fellow of IEEE; Fellow, Engineering Institute of Canada
Engineering training in Canada
36 schools/faculties
Undergraduate students: Canada 55,000; Québec 14,600
[Map: number of engineering schools per region, with Montréal, Vancouver and Toronto marked]
École Polytechnique, cradle of engineering in Québec
- The oldest engineering school in Canada.
- The third-largest in Canada for teaching and research.
- The first in Québec for student body size.
- Operating budget: $85 million Canadian dollars (C$).
- Annual research budget: $60.5 million C$.
- Annual grants and research contracts: $38 million C$.
- 15 Industrial Research Chairs; 24 Canada Research Chairs.
- 7,863 scientific publications over the last decade.
- 220 professors and 1,100 employees.
- 1,000 graduates per year; 30,000 since 1873.
11 engineering programs: Biomedical, Civil, Chemical, Electrical, Geological, Industrial, Computer, Software, Mechanical, Mining, Engineering Physics.
Our campus
[Photo of the Polytechnique campus]
Novel Iterative Decoding Using Convolutional Doubly Orthogonal Codes
A simple approach to capacity
David Haccoun, Éric Roy, Christian Cardinal
Modern Error Control Coding Techniques Based on Difference Families
- A new class of threshold-decodable codes leading to simple and efficient error control schemes.
- No interleaver at either encoding or decoding.
- Far less complex to implement than turbo coding schemes; an attractive alternative to turbo coding at moderate Eb/N0 values.
- High-rate codes readily obtained by the puncturing technique.
- Low-complexity, high-speed FPGA-based prototypes at bit rates above 100 Mbps.
- Extensions: recursive codes (capacity, rate-adaptive schemes); punctured codes (reduced latency); simplified codes (reduced complexity).
One-Dimensional NCDO Codes
Nonrecursive systematic convolutional (NSC) encoder (R = 1/2).
[Diagram: shift register of length m with stages 0, 1, 2, ..., m-1, m; taps at the connection positions α1 = 0, ..., αJ = m; the information sequence enters the register, the parity sequence is formed from the taps, and both are transmitted over an AWGN channel.]
Notation: A - set of connection positions; J - number of connection positions; m - memory length (coding span).
Simple orthogonality properties: CSOC - the differences are distinct.
Example of a Convolutional Self-Orthogonal Code (CSOC): R = 1/2, J = 4, m = 15.
Example of CSOC, J = 4, distinct simple differences, A = {0, 3, 13, 15}:

αj \ αk |  0 |  3 |  13 |  15
   0    |  - | -3 | -13 | -15
   3    |  3 |  - | -10 | -12
  13    | 13 | 10 |  -  |  -2
  15    | 15 | 12 |   2 |  -

All the simple differences are distinct, so CSOC codes are suitable for threshold decoding.
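The self-orthogonality condition in the table above is easy to check mechanically. A minimal sketch in Python (not from the original slides; the helper names are illustrative) that verifies all simple differences of a connection set are distinct:

```python
# Check the simple-orthogonality (CSOC) condition for a connection set A:
# all differences a_j - a_k with j != k must be distinct.
from itertools import permutations

def simple_differences(A):
    """Return all simple differences a_j - a_k, j != k."""
    return [aj - ak for aj, ak in permutations(A, 2)]

def is_csoc(A):
    """True when every simple difference is distinct."""
    diffs = simple_differences(A)
    return len(diffs) == len(set(diffs))

A = [0, 3, 13, 15]                 # J = 4, span m = 15 (the deck's example)
print(sorted(simple_differences(A)))
print(is_csoc(A))                  # -> True: the set is self-orthogonal
```

For J = 4 the set yields J(J-1) = 12 simple differences, matching the off-diagonal entries of the table.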
Threshold (TH) Decoding of CSOC
- A well-known symbol decoding technique that exploits the simply-orthogonal properties of CSOCs.
- Either hard decoding or soft-input soft-output (SISO) decoding.
- Very simple implementation of the majority-logic procedure.
- CSOCs are non-iterative, systematic and non-recursive.
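As an illustration of the majority-logic procedure, here is a hedged sketch of a rate-1/2 CSOC encoder and a one-step hard-decision threshold decoder. The helper names (`encode`, `majority_decode`) and the single-error test are illustrative, not from the slides; since the differences of A are distinct, a single error touches each other bit's check set at most once, so only the erroneous bit collects a majority of failing checks:

```python
import random

A = [0, 3, 13, 15]          # CSOC connection positions, J = 4

def encode(u, A):
    """Rate-1/2 CSOC: parity p_t = XOR of u_{t-a} over a in A (zero initial state)."""
    return [sum(u[t - a] for a in A if t - a >= 0) % 2 for t in range(len(u))]

def majority_decode(u_hat, p_hat, A):
    """One-step hard-decision threshold decoding (majority logic)."""
    n, m, J = len(u_hat), max(A), len(A)
    # syndromes recomputed from the received hard decisions
    s = [(p_hat[t] + sum(u_hat[t - a] for a in A if t - a >= 0)) % 2
         for t in range(n)]
    decoded = list(u_hat)
    for t in range(n - m):               # bits whose J orthogonal checks all exist
        if sum(s[t + a] for a in A) > J // 2:
            decoded[t] ^= 1              # majority of checks fail: flip the bit
    return decoded

random.seed(1)
u = [random.randint(0, 1) for _ in range(60)]
p = encode(u, A)
u_err = list(u)
u_err[25] ^= 1                           # one channel error on an information bit
print(majority_decode(u_err, p, A) == u) # -> True
```

With d_min = J + 1 = 5, the majority rule also corrects two information-bit errors in this toy setting.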
Example of a One-Step Threshold Decoder
J = 3, A = {0, 1, 3}, dmin = 4. The connection positions are α1 = 0, α2 = 1, α3 = 3, with distinct differences (α2 - α1) = 1, (α3 - α1) = 3 and (α3 - α2) = 2. The decoder inputs are LLR values representing the received symbols. Soft outputs are computed in the LLR domain with the tanh/tanh⁻¹ (sum-product) operator or with the add-min (min-sum) operator, and the decoded bits ûi are obtained by comparing the resulting LLR against a threshold of 0.
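The two soft combining operators mentioned above can be sketched as follows. `addmin` and `tanh_rule` are illustrative names (not from the slides) for the min-sum and sum-product check-node rules:

```python
import math

def addmin(llrs):
    """Add-min (min-sum) approximation of the check-node LLR combination:
    sign = product of the signs, magnitude = minimum of the magnitudes."""
    sign, mag = 1.0, math.inf
    for x in llrs:
        sign = -sign if x < 0 else sign
        mag = min(mag, abs(x))
    return sign * mag

def tanh_rule(llrs):
    """Exact sum-product combination: 2*atanh(prod(tanh(x/2)))."""
    prod = 1.0
    for x in llrs:
        prod *= math.tanh(x / 2.0)
    return 2.0 * math.atanh(prod)

print(addmin([2.0, -0.5, 3.0]))            # -> -0.5
print(tanh_rule([2.0, -0.5, 3.0]))         # same sign, slightly smaller magnitude
```

The min-sum value always matches the sum-product value in sign and upper-bounds it in magnitude, which is why add-min is the cheap hardware-friendly substitute.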
Extension to Iterative Threshold Decoding
Convolutional Self-Doubly-Orthogonal Codes: CSO2C.
- The decoder exploits the doubly-orthogonal properties of CSO2Cs.
- Asymptotic error performance (dmin = J + 1) at moderate Eb/N0.
Doubly-orthogonal conditions on the connection positions:
1. All the differences (αj - αk) are distinct;
2. The differences of differences (αj - αk) - (αl - αn), with j ≠ k, k ≠ n, n ≠ l and l ≠ j, must be distinct from all the differences (αr - αs), r ≠ s;
3. The above differences of differences are distinct, except for the unavoidable repetitions.
These conditions yield novel iterative error control coding schemes.
Issue: the search for and determination of new CSO2Cs, an extension of the (unsolved) Golomb rulers problem.
Example of CSO2C, J = 4: Differences of Differences
For A = {0, 3, 13, 15}, each entry (j, k, l, n) denotes the difference of differences (αj - αk) - (αn - αl):

(3,0,2,0)=((15)-(-13))= 28    (0,1,0,1)=((-3)-(3))= -6      (2,0,1,0)=((13)-(-3))= 16
(3,0,3,0)=((15)-(-15))= 30    (0,2,0,1)=((-13)-(3))= -16    (2,0,2,0)=((13)-(-13))= 26
(3,1,0,1)=((12)-(3))= 9       (0,2,0,2)=((-13)-(13))= -26   (2,1,0,1)=((10)-(3))= 7
(3,1,2,0)=((12)-(-13))= 25    (0,3,0,1)=((-15)-(3))= -18    (2,1,2,0)=((10)-(-13))= 23
(3,1,2,1)=((12)-(-10))= 22    (0,3,0,2)=((-15)-(13))= -28   (2,1,2,1)=((10)-(-10))= 20
(3,1,3,0)=((12)-(-15))= 27    (0,3,0,3)=((-15)-(15))= -30   (2,3,0,1)=((-2)-(3))= -5
(3,1,3,1)=((12)-(-12))= 24    (1,0,1,0)=((3)-(-3))= 6       (2,3,0,3)=((-2)-(15))= -17
(3,2,0,1)=((2)-(3))= -1       (1,2,0,2)=((-10)-(13))= -23   (2,3,1,0)=((-2)-(-3))= 1
(3,2,0,2)=((2)-(13))= -11     (1,2,1,0)=((-10)-(-3))= -7    (2,3,1,3)=((-2)-(12))= -14
(3,2,1,0)=((2)-(-3))= 5       (1,2,1,2)=((-10)-(10))= -20   (2,3,2,0)=((-2)-(-13))= 11
(3,2,1,2)=((2)-(10))= -8      (1,3,0,2)=((-12)-(13))= -25   (2,3,2,1)=((-2)-(-10))= 8
(3,2,3,0)=((2)-(-15))= 17     (1,3,0,3)=((-12)-(15))= -27   (2,3,2,3)=((-2)-(2))= -4
(3,2,3,1)=((2)-(-12))= 14     (1,3,1,0)=((-12)-(-3))= -9    (3,0,1,0)=((15)-(-3))= 18
(3,2,3,2)=((2)-(-2))= 4       (1,3,1,2)=((-12)-(10))= -22
                              (1,3,1,3)=((-12)-(12))= -24

All the differences of differences are distinct; these codes are suitable for iterative threshold or belief-propagation decoding.
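A small script can confirm the double-orthogonality claims for this example. The index constraints follow the CSO2C definition (j ≠ k, k ≠ n, n ≠ l, l ≠ j), and the counts below (84 quadruples, 42 distinct values) are specific to A = {0, 3, 13, 15}; the 2-to-1 collapse from quadruples to values is exactly the unavoidable repetitions:

```python
from itertools import permutations, product

A = [0, 3, 13, 15]
idx = range(len(A))

# Simple differences a_j - a_k, j != k
D = {A[j] - A[k] for j, k in permutations(idx, 2)}

# Differences of differences (a_j - a_k) - (a_l - a_n), under the
# index constraints j != k, k != n, n != l, l != j of the definition
DD = [(A[j] - A[k]) - (A[l] - A[n])
      for j, k, l, n in product(idx, repeat=4)
      if j != k and k != n and n != l and l != j]

print(len(DD))            # -> 84 quadruples
print(len(set(DD)))       # -> 42 distinct values (condition 3)
print(D.isdisjoint(DD))   # -> True: condition 2 holds
```

The 42 distinct values reproduce the table above, and none of them collides with the 12 simple differences.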
Issue: minimization of the memory length (span) m of the encoders. A lower bound on the span exists. Spans of some of the best known CSO2C encoders:

J  | m (span)    J  | m (span)    J  | m (span)
5  | 41          14 | 13774       23 | 402923
6  | 100         15 | 16503       24 | 502505
7  | 222         16 | 34908       25 | 643676
8  | 459         17 | 50071       26 | 965950
9  | 912         18 | 71858       27 | 1117924
10 | 1698        19 | 107528      28 | 1517378
11 | 3467        20 | 148787      29 | 1894067
12 | 5173        21 | 209013      30 | 2437586
13 | 9252        22 | 299126
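The deck notes that the minimal-span search extends the unsolved Golomb rulers problem. For the simple-orthogonality (CSOC) case, the search is exactly a Golomb-ruler search, which brute force handles for very small J. The sketch below is illustrative only: `min_span_csoc` is a hypothetical helper, and the doubly-orthogonal case adds conditions 2 and 3, which this code does not implement:

```python
from itertools import combinations

def has_distinct_diffs(A):
    """Simple-orthogonality test: all differences a - b (a != b) distinct."""
    diffs = [a - b for a in A for b in A if a != b]
    return len(diffs) == len(set(diffs))

def min_span_csoc(J):
    """Smallest span m admitting J connection positions with distinct simple
    differences, i.e. a Golomb ruler of order J (brute force, small J only)."""
    m = J - 1
    while True:
        for mids in combinations(range(1, m), J - 2):
            A = (0,) + mids + (m,)
            if has_distinct_diffs(A):
                return A
        m += 1

print(min_span_csoc(4))   # -> (0, 1, 4, 6): minimal span 6 for J = 4
```

Note the gap this illustrates: the minimal simply-orthogonal span for J = 4 is 6, while the deck's doubly-orthogonal example for J = 4 already needs span 15, and the table above shows how rapidly CSO2C spans grow with J.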
Non-Iterative Threshold Decoding for CSOCs
The approximate MAP value of information symbol i is Λi = (received information symbol LLR) + (extrinsic information), where the extrinsic information is computed with the add-min operator over the J orthogonal parity checks. Decision rule: ûi = 1 if and only if Λi ≥ 0, otherwise ûi = 0. For a CSOC, Λi is an equation of independent variables.
Iterative Threshold Decoding for CSO2Cs
General expression: feedback from past symbols, feedforward from future symbols.
Iterative expressions:
- At iteration 1, the estimation depends on the simple differences being distinct.
- At iteration 2, it depends on the differences of differences being distinct, and distinct from the simple differences.
The estimation of each symbol at a given iteration therefore depends on the simple differences and on the differences of differences.
Iterative Threshold Decoder Structure for CSO2Cs
Features of the forward-only iterative decoder:
- No interleaver.
- One (identical) decoder per iteration.
- Forward-only operation.
[Diagram: the information and parity-check symbols from the channel feed a cascade of one-step threshold decoders, one per iteration (1, 2, ..., M), each followed by a delay of m; the soft output of each stage feeds the next, and a hard decision at the last iteration yields the decoded information symbols.]
Block Diagram of the Iterative Threshold Decoder (CSO2Cs)
One-step TH decoding is performed per iteration: an iterative TH decoder with M iterations is a cascade of M one-step decoders, each one-step decoder operating on a distinct bit. Each one-step decoder has a latency of m bits, so the total latency is M × m bits.
Iterative Belief Propagation (BP) Decoder for CSO2Cs
[Diagram: cascade of decoding stages DEC 1, DEC 2, ..., DEC M, each with a latency of m bits, producing the decoded symbol û after M iterations.]
- M (BP) ≈ ½ M (TH): BP requires about half as many iterations as TH.
- BP latency ≈ ½ TH latency.
- One-step BP complexity ≈ J × one-step TH complexity.
Error Performance Behaviors of CSO2Cs
Both BP and TH decoding approach the asymptotic error performance in the error-floor region.
[Figure: BER curves for J = 9, A = {0, 9, 21, 395, 584, 767, 871, 899, 912}: BP with 4 and 8 iterations and TH with 8 iterations; the BP and TH waterfall and error-floor regions are marked.]
Effects of Code Structure on Error Performance
Analysis results for CSO2Cs: with iterative decoding, the error performance depends essentially on the number of connections J, rather than on the memory length (span).
Shortcomings of CSO2Cs: the encoding spans of the best known codes increase rapidly with J, and optimal (minimum-span) codes are unknown.
Improvement, span reduction: relax the conditions on the double orthogonality, reducing the span at the cost of a small degradation of the error performance. This yields the simplified codes S-CSO2C; the issue is then the search for and determination of new S-CSO2Cs with minimal spans.
Definition of S-CSO2Cs
The set of connection positions A satisfies:
1. All the differences (αj - αk) are distinct;
2. The differences of differences (αj - αk) - (αl - αn), with j ≠ k, k ≠ n, n ≠ l and l ≠ j, are distinct from all the differences (αr - αs), r ≠ s;
3. The differences of differences are distinct, except for the unavoidable repetitions and a number of avoidable repetitions.
The normalized simplification factor λ is the ratio of the number of repeated differences of differences (excluding the unavoidable repetitions) to the maximal number of distinct differences of differences (excluding the unavoidable repetitions). The goal is the search for and determination of new short-span S-CSO2Cs for a given λ value.
Comparison of Spans of CSO2Cs and S-CSO2Cs

J  | CSO2C m (span) | λ      | S-CSO2C m (span)
5  | 41             | 0.3818 | 23
6  | 100            | 0.4333 | 45
7  | 222            | 0.4416 | 82
8  | 459            | 0.4828 | 129
9  | 912            | 0.4895 | 208
10 | 1698           | 0.4917 | 340
11 | 3467           | 0.4539 | 588
12 | 5173           | 0.4632 | 894
13 | 9252           | 0.4193 | 1217
14 | 13774          | 0.4269 | 1967
15 | 16503          | 0.4253 | 2653
16 | 34908          | 0.4313 | 3532
17 | 50071          | 0.4246 | 4978
18 | 71858          | 0.4002 | 6905
19 | 107528         | 0.4053 | 8748
20 | 148787         | 0.3923 | 9749
Performance Comparison for the J = 10 S-CSO2C
[Figure: BER vs Eb/N0 curves comparing uncoded BPSK with the J = 10 S-CSO2C, showing the coding gain and the asymptotic coding gain.]
Performance Comparison for J = 8 Codes (BP Decoding)
CSO2C: A = {0, 43, 139, 322, 422, 430, 441, 459}
S-CSO2C: A = {0, 9, 22, 55, 95, 124, 127, 129}
Performance Comparison of CSO2Cs / S-CSO2Cs (TH Decoding)
[Figure: BER vs latency (×10^4 bits) at Eb/N0 = 3.5 dB, 8th iteration, comparing a CSO2C with an S-CSO2C; the S-CSO2C reaches its error floor at a much smaller latency.]
Analysis of Orthogonality Properties (span)
- Convolutional Self-Orthogonal Codes (CSOC): simple orthogonality of the set A.
- Extension to double orthogonality: Convolutional Self-Doubly-Orthogonal Codes (CSO2C), large span.
- Relaxed double-orthogonality conditions: Simplified CSO2C (S-CSO2C), small span, a substantial span reduction.
Analysis of Orthogonality Properties (computational tree)
The computational tree represents the symbols used by the decoder to estimate each information symbol in the iterative decoding process; its root is the decoded symbol, whose LLR is used for the final hard decision.
- Simple orthogonality: independence of the inputs over ONE iteration.
- Double orthogonality: independence of the inputs over TWO iterations.
The error performance is a function of the independence of the inputs versus the short cycles. Analysis shows that the parity symbols limit the performance of the iterative decoder because they have degree 1 in the computational tree (no descendant nodes). Impact: the decoder does not update these values over the iterative decoding process, limiting the error performance.
Analysis of Orthogonality Properties (cycles)
Conditions on the associated sets, and the corresponding cycles on the graphs:
- CSOC, distinct differences: no 4-cycles.
- CSO2C, distinct differences of differences: minimization of the number of 6-cycles.
- CSO2C, differences distinct from differences of differences: minimization of the number of 8-cycles; the cycles are uniformly distributed.
- S-CSO2C, a number of repetitions of differences of differences: a number of additional 8-cycles, approximately uniformly distributed.
Summary of Single-Register CSO2Cs
Error performance: the asymptotic coding gain corresponds to the minimum Hamming distance at moderate Eb/N0 values.
Structure of the Tanner graphs for iterative decoding:
- No 4-cycles.
- A minimal number of 6-cycles, which are due to the unavoidable repetitions.
- A minimal number of 8-cycles.
- Uniform distribution of the 6- and 8-cycles.
S-CSO2C: relaxing the doubly-orthogonal conditions of the CSO2C adds some 8-cycles, leading to codes with substantially reduced coding spans.
Extension: Recursive Convolutional Doubly-Orthogonal Codes (RCDO)
To improve the error performance of the iterative decoding algorithm, the degree of the parity symbols must be increased. Solution: use recursive convolutional encoders (RCDO).
RCDO codes
- RCDO codes are systematic recursive convolutional codes.
- RCDO codes can be represented by their sparse parity-check matrix H^T(D), with forward connections and feedback connections.
- RCDO encoder example: R = 3/6, with 3 inputs, 6 outputs, and three shift registers (1st, 2nd, 3rd).
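The recursive parity computation implied by a polynomial parity-check description can be sketched for a simpler rate-1/2 systematic case. The polynomials below are hypothetical illustrations, not the deck's R = 3/6 example; the point is only that a recursive encoder satisfies u(D)h1(D) + p(D)h2(D) = 0 over GF(2):

```python
# Hypothetical rate-1/2 systematic recursive encoder defined by the
# parity-check pair h1(D) = 1 + D + D^3, h2(D) = 1 + D^2 + D^3.
H1 = [0, 1, 3]   # exponents of the nonzero coefficients of h1(D)
H2 = [0, 2, 3]   # exponents of h2(D); the constant term enables recursion

def encode(u):
    """Recursive parity: p_t = sum h1_i u_{t-i} + sum_{i>0} h2_i p_{t-i} (mod 2)."""
    p = []
    for t in range(len(u)):
        bit = sum(u[t - i] for i in H1 if t - i >= 0) % 2
        bit ^= sum(p[t - i] for i in H2 if i > 0 and t - i >= 0) % 2
        p.append(bit)
    return p

def syndrome(u, p):
    """Check u(D)h1(D) + p(D)h2(D) = 0 term by term (truncated sequence)."""
    return [(sum(u[t - i] for i in H1 if t - i >= 0)
             + sum(p[t - i] for i in H2 if t - i >= 0)) % 2
            for t in range(len(u))]

u = [1, 0, 1, 1, 0, 0, 1, 0]
p = encode(u)
print(all(s == 0 for s in syndrome(u, p)))   # -> True: valid codeword
```

Because the parity bits feed back into the recursion, they appear in later constraint equations, which is the mechanism by which recursion raises the degree of the parity symbols.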
RCDO protograph structure
- The parity-check matrix H^T(D) completely defines an RCDO code.
- The memory m of the RCDO encoder is defined by the largest shift register of the encoder.
- Each row of H^T(D) represents one output symbol of the encoder; each column represents one constraint equation.
- The protograph representation of an RCDO code is defined by H^T(D). The degree distributions of the nodes in the protograph are important for the convergence behavior of the decoding algorithm.
- A regular RCDO (dv, dc) has dv = degree of the variable nodes (rows) and dc = degree of the constraint nodes (columns), i.e. the same number of nonzero elements in every row and every column of H^T(D); otherwise the RCDO protograph is irregular.
RCDO doubly-orthogonal conditions
The analysis of the computational tree of RCDO codes shows that, as for the CSO2C, three conditions based on the differences must be respected by the connection positions of the encoder. For RCDO codes the decoding equations are then completely independent over 2 decoding iterations. The estimates of the parity symbols are now improved from iteration to iteration, improving the error performance.
RCDO codes: error performance
[Figure: error performance of RCDO (3,6) codes, R = 1/2, at the 25th iteration, compared with an LDPC code of block length n = 1008 at its 50th iteration; the BP decoder limit is marked, performance improves with an increasing number of shift registers, and a gap of about 1.10 dB is indicated.]
Characteristics: small shift registers; error performance improves with the number of shift registers; a low number of iterations compared to LDPC. The complexity per decoded symbol of all the RCDO decoders in this figure is smaller than that of the LDPC decoder of block length 1008, which makes them attractive for VLSI implementation.
RCDO codes: error performance
The asymptotic error performance of RCDO is close to the BP decoder limit: within about 0.4 dB after 40 iterations.
Characteristics: coding rate 15/30; 15 registers; m = 149; regular (3,6) H^T(D); 40th iteration. The iterative decoder exhibits close-to-optimal convergence behavior and a low error floor.
Comparisons
Error performance comparisons with other existing techniques at Pb = 10^-5: RCDO codes give good error performance at low SNR; CSO2Cs give good error performance at moderate SNR.
[Figure from: C. Schlegel and L. Perez, Trellis and Turbo Coding, Wiley, 2004.]
Comparison of the techniques (block length N, M iterations)

                               CSO2C        RCDO         LDPC
Implementation complexity      low encoding, per-iteration decoding processing
Size of operating window       N/M          N/M          N
Number of decoded bits         1            1            N
Eb/N0 (waterfall region)       Moderate     Low          Very small
BER (error floor) tendency     Decreasing   Decreasing   Low, flat
Conclusion
- A new iterative decoding technique based on systematic doubly-orthogonal convolutional codes: CSO2C, RCDO.
- CSO2C: good error performance at moderate Eb/N0; single-shift-register encoder; J is the dominant parameter.
- RCDO (recursive doubly-orthogonal convolutional codes): improved error performance at low Eb/N0; multiple-shift-register encoder; m is the dominant parameter; error performance comparable to that of LDPC block codes.
- Simpler encoding and decoding processes than LDPC; attractive for high-speed VLSI implementations.
- Searching for optimal CSO2C and RCDO codes remains an open problem.