Selection procedure of turbo code parameters by combinatorial optimization
Yannick Saouter 24/06/2019
Positioning of the problem
Turbo code encoding: the data frame X=(x1,…,xn) is permuted by Π to give X'=(xΠ(1),…,xΠ(n)). The two frames are encoded by an RSC encoder, giving redundancies Y1 and Y2. Turbo code decoding: iterative MAP or SUB-MAP. Permutation design is the key to good performance.
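As a rough illustration of this structure, here is a minimal Python sketch of the parallel encoding path; the RSC polynomials used (feedback 1+D^2+D^3, forward 1+D+D^3) are an illustrative choice, not parameters taken from this talk.

```python
# Minimal sketch of the parallel turbo encoding structure described above.
# The RSC polynomials (feedback 1 + D^2 + D^3, forward 1 + D + D^3) are an
# illustrative choice, not taken from the talk.

def rsc_parity(bits):
    """Recursive systematic convolutional encoder: return the parity stream."""
    s = [0, 0, 0]                          # shift register contents D^1, D^2, D^3
    parity = []
    for b in bits:
        fb = b ^ s[1] ^ s[2]               # feedback taps D^2, D^3
        parity.append(fb ^ s[0] ^ s[2])    # forward taps 1, D, D^3
        s = [fb, s[0], s[1]]               # shift
    return parity

def turbo_encode(x, pi):
    """x: information frame (0-based), pi: permutation with pi[j] = Pi(j)."""
    x_perm = [x[pi[j]] for j in range(len(x))]   # X' = permuted frame
    y1 = rsc_parity(x)                           # redundancy Y1 of X
    y2 = rsc_parity(x_perm)                      # redundancy Y2 of X'
    return x, y1, y2                             # systematic rate-1/3 output
```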
Design of turbo code permutation: state of the art (I)
Pseudo-random generation of permutations is not satisfactory: error floors! Criteria have therefore been defined for permutations, and a good permutation is a priori supposed to maximize them. Examples: spread (or Lee distance), weight-2 codeword girth, correlation, minimal distance… Selection procedure: generate permutations, select those with large criteria values, then verify performance by actual simulation. This is try and guess: extremal permutations are unlikely to be found if they are rare!
Design of turbo code permutation: state of the art (II)
Unstructured models: the permutation mapping is stored in an array. Examples: S-Random, 3GPP, Quasi-Cyclic model. Structured models: the permutation is defined by algebraic equations. Examples: DVB-RCS, ARP, QPP, DRP. Unstructured models are expensive in terms of memory area. Structured models are cheaper but less general.
A structured model: the Almost Regular Permutation (ARP) model
The ARP model was defined by Berrou et al. [2004] as a modification of the former DVB-RCS model. Frame length equal to N. If i=Π(j), then i = P*j + Q(j) [mod N], where Q(j) takes one of the values 0, Q1, Q2, Q3 according to j [mod 4]. The values P, Q1, Q2, Q3 are the parameters: P is relatively prime with N, and Q1, Q2, Q3 are divisible by 4 (to ensure bijectivity).
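A minimal sketch of this mapping, assuming Q(j) is picked from {0, Q1, Q2, Q3} according to j mod 4 as described above; the numeric parameter values below are placeholders, not values recommended in the talk.

```python
# Sketch of the ARP mapping Pi(j) = P*j + Q(j) mod N, with Q(j) chosen among
# {0, Q1, Q2, Q3} according to j mod 4. Parameter values are arbitrary
# placeholders, not a set recommended in the talk.
from math import gcd

def arp_permutation(N, P, Q):
    """Q = (0, Q1, Q2, Q3); returns the list pi with pi[j] = Pi(j)."""
    assert gcd(P, N) == 1, "P must be relatively prime with N"
    assert all(q % 4 == 0 for q in Q), "Q1, Q2, Q3 divisible by 4 for bijectivity"
    return [(P * j + Q[j % 4]) % N for j in range(N)]

pi = arp_permutation(N=64, P=13, Q=(0, 24, 8, 48))
assert sorted(pi) == list(range(64))   # the mapping is indeed a bijection
```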
Effects of correlation
Iterative decoding implies that a bad decision can reinforce itself through a correlation loop. The autocorrelation effect is more important than the minimal distance! Hokfelt et al. [2001], 512-bit turbo codes: C1 with Dmin=32; C2 with Dmin=27 but lower autocorrelation.
Correlation girth
The correlation girth is defined as a weighted girth of the correlation graph of the permutation. Neighbor-to-neighbor connections have a weight equal to 1, while permutation connections have weight 0. The correlation graph can be converted to a 4-regular graph.
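One way to realize the 4-regular version is to connect each position j to its two neighbors in natural order and to the two positions whose images under Π sit next to Π(j) in interleaved order; this construction is my reading of the slide, not a definition quoted from it.

```python
# A possible reading of the 4-regular correlation graph: position j is linked
# to j-1 and j+1 in natural order and, through the permutation, to the two
# positions mapped next to Pi(j) in interleaved order. This construction is an
# assumption of the sketch, not a definition quoted from the talk.

def correlation_graph(pi):
    N = len(pi)
    inv = [0] * N
    for j, i in enumerate(pi):
        inv[i] = j                                   # inverse permutation
    adj = [set() for _ in range(N)]
    for j in range(N):
        for k in ((j - 1) % N, (j + 1) % N,          # natural-order neighbors
                  inv[(pi[j] - 1) % N], inv[(pi[j] + 1) % N]):  # interleaved order
            adj[j].add(k)
            adj[k].add(j)
    return adj
```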
Maximizing girth of graphs: the Progressive Edge Growth algorithm
Problem: building a k-regular graph with n vertices. Applied by Hu et al. [2001] to bipartite graphs for the design of LDPC codes. Result (Chandran [2003]): if n tends to infinity while k is fixed, then the girth of the final graph also tends to infinity. The PEG loop (see the code sketch after the steps):
1. Start with an empty set of edges. Set k'=0.
2. While k' < k do
3. Select one vertex v with k' neighbors. If not possible, go to step 7.
4. Compute the graph distance between v and every vertex.
5. Add an edge between v and the most distant vertex of valency k'.
6. Go to step 3.
7. Set k'=k'+1. End while (step 2).
8. End.
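A hedged Python sketch of this loop; the candidate selection and tie-breaking details are arbitrary choices of the sketch.

```python
# Sketch of the Progressive Edge Growth loop above: grow a k-regular graph on
# n vertices by repeatedly connecting a vertex of current degree k' to a most
# distant vertex that also still has degree k' (distances by plain BFS).
from collections import deque
import random

def bfs_distances(adj, src, n):
    dist = [float("inf")] * n
    dist[src] = 0
    q = deque([src])
    while q:
        x = q.popleft()
        for y in adj[x]:
            if dist[y] == float("inf"):
                dist[y] = dist[x] + 1
                q.append(y)
    return dist

def peg_regular_graph(n, k):
    adj = [set() for _ in range(n)]
    for kp in range(k):                               # k' = 0, 1, ..., k-1
        while True:
            cands = [v for v in range(n) if len(adj[v]) == kp]
            if not cands:
                break                                 # no vertex left at degree k'
            v = random.choice(cands)
            dist = bfs_distances(adj, v, n)
            targets = [u for u in cands if u != v and u not in adj[v]]
            if not targets:
                break
            u = max(targets, key=lambda t: dist[t])   # most distant (unreachable = farthest)
            adj[v].add(u)
            adj[u].add(v)
    return adj
```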
Combinatorial optimization for ARP model
Reminder: Π(j) = P*j + Q(j) [mod N], where Q(j) ∈ {0, Q1, Q2, Q3} according to j [mod 4]. Shadow graphs: if only P is defined, one quarter of the final ARP graph is defined; if P and Q1 are defined, one half of the final graph is defined, and so on. Periodicity: the shadow graphs and the final graph are periodic with period 4. Algorithm (a code sketch follows the steps):
1. Choose P at random.
2. For j from 1 to 3 do
3. Compute the graph distance of every vertex from vertex j.
4. Select the most distant vertex i (with valency 2) and set i=Π(j).
5. Compute Q(j) and add the corresponding edges.
6. End for (step 2).
7. Compute the final girth and stop.
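A simplified sketch of this greedy selection, using a ring of weight-1 neighbor edges plus weight-0 interleaver edges as a stand-in for the correlation graph; the graph model, the fallback rules, and all names are assumptions of this sketch, not the exact procedure of the talk.

```python
# Simplified sketch of the greedy ARP parameter selection described above.
# Graph model (an assumption of this sketch): vertices are frame positions,
# ring edges j -- j+1 have weight 1, already-fixed interleaver edges j -- Pi(j)
# have weight 0, and distances are computed by 0/1 shortest paths.
from collections import deque
from math import gcd
import random

def zero_one_shortest_paths(adj, src, n):
    """Shortest distances when every edge weight is 0 or 1 (deque relaxation)."""
    dist = [float("inf")] * n
    dist[src] = 0
    dq = deque([src])
    while dq:
        x = dq.popleft()
        for y, w in adj[x]:
            if dist[x] + w < dist[y]:
                dist[y] = dist[x] + w
                if w == 0:
                    dq.appendleft(y)
                else:
                    dq.append(y)
    return dist

def select_arp_offsets(N):
    P = random.choice([p for p in range(1, N) if gcd(p, N) == 1])
    Q = [0, None, None, None]                        # Q(0) is fixed to 0
    adj = [[((v - 1) % N, 1), ((v + 1) % N, 1)] for v in range(N)]  # ring edges
    def add_interleaver_edges(r):                    # all positions j = r mod 4
        for j in range(r, N, 4):
            i = (P * j + Q[r]) % N
            adj[j].append((i, 0))
            adj[i].append((j, 0))
    add_interleaver_edges(0)                         # shadow graph defined by P alone
    for r in (1, 2, 3):
        dist = zero_one_shortest_paths(adj, r, N)
        # prefer vertices that still have only their two ring edges (valency 2)
        cands = [v for v in range(N) if all(w == 1 for _, w in adj[v])] or list(range(N))
        i = max(cands, key=lambda v: dist[v])        # most distant candidate
        Q[r] = (i - P * r) % N                       # so that Pi(r) = i; no mod-4 constraint
        add_interleaver_edges(r)                     # periodicity: N/4 edges at once
    return P, Q
```

In a full selection procedure one would repeat this for many random values of P, compute the resulting correlation girth each time, and keep the best parameter set.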
Remarks
At each step, N/4 edges are added, unlike in the PEG algorithm. There is no guarantee of optimality, nor even of efficiency. Selected vertices have no modulo constraint, unlike in the original ARP model. The method naturally generalizes to larger periods. Instead of selecting the most distant vertex, it is possible to select one amongst the most distant, as in Declercq et al. [2008]: this gives more flexibility in the constructed graphs.
Unconditional upper bound: Moore's bound
Moore's bound: a general bound on the diameter (and girth) of regular graphs.
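For reference, the standard statement of the bound for a k-regular graph on n vertices (not copied from the slide):

```latex
% Moore bound: a k-regular graph with girth g needs at least n_0(k,g) vertices.
\[
  n \;\ge\; n_0(k,g) \;=\;
  \begin{cases}
    1 + k \sum_{i=0}^{(g-3)/2} (k-1)^i, & g \text{ odd},\\[1ex]
    2 \sum_{i=0}^{g/2-1} (k-1)^i,       & g \text{ even}.
  \end{cases}
\]
% Read the other way: the girth of any k-regular graph on n vertices is at
% most roughly 2 \log_{k-1} n, which is the unconditional ceiling meant here.
```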
Complexity
Quite low, thanks to Dijkstra's algorithm [Dijkstra 1959]. Demo.
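A minimal Dijkstra sketch with a binary heap, as one would run it on the weighted correlation graph; the cost is O((V+E) log V) per source.

```python
# Minimal Dijkstra sketch (binary heap) for a graph given as adjacency lists
# of (neighbor, weight) pairs, e.g. the 0/1-weighted correlation graph.
import heapq

def dijkstra(adj, src):
    dist = {src: 0}
    heap = [(0, src)]
    while heap:
        d, v = heapq.heappop(heap)
        if d > dist.get(v, float("inf")):
            continue                       # stale heap entry, already improved
        for u, w in adj[v]:
            nd = d + w
            if nd < dist.get(u, float("inf")):
                dist[u] = nd
                heapq.heappush(heap, (nd, u))
    return dist
```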
Performance: binary turbo codes, Gaussian BPSK, Rate=1/3
ARP vs. DVB-S2 (I) Gaussian BPSK, 5400 bits, Rate=1/3
ARP vs. DVB-S2 (II) Rayleigh BPSK, 5400 bits, Rate=1/3
ARP vs. DVB-S2 (III) Rayleigh + 50 % erasures, BPSK, 5400 bits, Rate=1/3
Further research directions
Could we obtain an optimality result (like Chandran's) for structured graphs? Can we get closer to Moore's bound? Could new decoding algorithms be designed specifically to limit autocorrelation?
Thank you for your attention!