Selection procedure of turbo code parameters by combinatorial optimization Yannick Saouter 24/06/2019.


Positioning of the problem Turbo code encoding : the data frame X=(x1,…,xn) is permuted by Π to give X'=(xΠ(1),…,xΠ(n)). Both frames are encoded by an RSC encoder, which gives redundancies Y1 and Y2. Turbo code decoding : iterative MAP or sub-MAP. Permutation design is the key to good performance.

Design of turbo code permutation : state of the art (I) Pseudo-random generation of permutations is not satisfactory : error floors ! Criteria have been defined for permutations : a good permutation is a priori supposed to maximize them. Examples : spread (or Lee distance), weight-2 codeword girth, correlation, minimal distance… Selection procedure : generate permutations, select those with large criteria values, then verify performance by actual simulation. Trial and error : extremal permutations are unlikely to be found if they are rare !

Design of turbo code permutation : state of the art (II) Unstructured models : the permutation mapping is stored in an array. Examples : S-Random, 3GPP, quasi-cyclic model. Structured models : the permutation is defined by algebraic equations. Examples : DVB-RCS, ARP, QPP, DRP. Unstructured models are expensive in terms of memory area. Structured models are cheaper but less general.

A structured model : the Almost Regular Permutation (ARP) model. The ARP model was defined by Berrou et al. [2004] as a modification of the former DVB-RCS model. Frame length equal to N. If i=Π(j), then i=P*j+Q(j) [mod. N], where Q(j) takes the value 0, Q1, Q2 or Q3 according to j [mod 4]. The values P, Q1, Q2, Q3 are parameters. P is relatively prime with N. Q1, Q2, Q3 are divisible by 4 (to ensure bijectivity).
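The ARP equation above is simple enough to sketch directly. Below is a minimal illustration of the model, with placeholder parameter values (not taken from any standard); the asserts mirror the slide's constraints on P and the Q offsets.

```python
# Sketch of the ARP interleaver: Pi(j) = P*j + Q(j) mod N, where Q(j)
# cycles over {0, Q1, Q2, Q3} according to j mod 4. The parameter
# values used in the example are illustrative placeholders.
from math import gcd

def arp_interleaver(N, P, Q):
    """Return the ARP permutation Pi as a list of length N."""
    assert N % 4 == 0, "frame length must be a multiple of the period 4"
    assert gcd(P, N) == 1, "P must be relatively prime with N"
    assert all(q % 4 == 0 for q in Q), "Q offsets must be divisible by 4"
    pi = [(P * j + Q[j % 4]) % N for j in range(N)]
    assert len(set(pi)) == N, "the mapping is a bijection"
    return pi

# Toy example: N=16, P=3, offsets (0, 4, 8, 12).
perm = arp_interleaver(16, 3, (0, 4, 8, 12))
```

With these toy values the map is indeed a bijection of {0,…,15}, which illustrates why the divisibility-by-4 condition on the offsets matters: it keeps the four residue classes mod 4 disjoint under the map.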

Effects of correlation Iterative decoding implies that a bad decision can reinforce itself through a correlation loop. The autocorrelation effect is more important than minimal distance ! Hokfelt et al. [2001], 512-bit turbo code example : C1 with Dmin=32 ; C2 with Dmin=27 but lower autocorrelation.

Correlation girth The correlation girth is defined as a weighted girth of the correlation graph of the permutation. Neighbor-to-neighbor connections have a weight equal to 1, while permutation connections have weight 0. The correlation graph can be converted to a 4-regular graph.

Maximizing the girth of graphs : the Progressive Edge Growth (PEG) algorithm Problem : building a k-regular graph with n vertices. Applied by Hu et al. [2001] to bipartite graphs for the design of LDPC codes. Result : Chandran [2003], if n tends to ∞ while k is fixed, then the girth of the final graph also tends to ∞.
1. Start with an empty set of edges. Set k'=0.
2. While k' < k do
3. Select one vertex v with k' neighbors. If not possible, go to step 7.
4. Compute the graph distance between v and every vertex.
5. Add an edge between v and the most distant vertex of valency k'.
6. Go to step 3.
7. Set k'=k'+1. End while (step 2).
8. End.
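The steps above can be sketched as follows. This is a rough, illustrative implementation for plain k-regular graphs (not the bipartite LDPC variant of Hu et al.), and the tie-breaking and fallback choices are my own assumptions rather than part of the original algorithm.

```python
# PEG-style greedy construction: repeatedly connect a minimum-degree
# vertex to the vertex farthest from it in graph distance, which
# tends to keep the girth large.
from collections import deque

def bfs_distances(adj, src, n):
    """Graph distance from src to every vertex (inf if unreachable)."""
    dist = [float("inf")] * n
    dist[src] = 0
    queue = deque([src])
    while queue:
        u = queue.popleft()
        for v in adj[u]:
            if dist[v] == float("inf"):
                dist[v] = dist[u] + 1
                queue.append(v)
    return dist

def peg_regular_graph(n, k):
    """Greedily build a k-regular graph on n vertices (n*k even)."""
    adj = [set() for _ in range(n)]
    for kp in range(k):  # raise the minimum degree from kp to kp+1
        while True:
            cands = [v for v in range(n) if len(adj[v]) == kp]
            if not cands:          # step 3 fails: move to next k'
                break
            u = cands[0]
            dist = bfs_distances(adj, u, n)          # step 4
            far = [v for v in cands if v != u and v not in adj[u]]
            if not far:
                # fallback (assumption): any non-neighbor below degree k
                far = [v for v in range(n)
                       if v != u and v not in adj[u] and len(adj[v]) < k]
                if not far:
                    break
            v = max(far, key=lambda w: dist[w])      # step 5
            adj[u].add(v)
            adj[v].add(u)
    return adj
```

For k=2 this greedy procedure simply builds long cycles, which is the degenerate case of maximizing girth.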

Combinatorial optimization for the ARP model Reminder : Π(j)=P*j+Q(j) [mod. N], where Q(j)∈{0,Q1,Q2,Q3} according to j [mod 4]. Shadow graphs : if only P is defined, one quarter of the final ARP graph is defined ; if P and Q1 are defined, one half of the final graph is defined… Periodicity : shadow graphs and the final graph are periodic with period 4.
Algorithm :
1. Choose P at random.
2. For j from 1 to 3 do
3. Compute the graph distance of all vertices from vertex j.
4. Select the most distant vertex i (with valency 2) and set i=Π(j).
5. Compute Q(j) and add the corresponding edges.
6. End for (step 2).
7. Compute the final girth and stop.
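Step 5 above is a one-line computation: once the greedy search has picked a target vertex i for position j, the corresponding ARP offset follows directly from the model equation Π(j)=P*j+Q(j) [mod. N]. A minimal sketch (the function name and example values are illustrative):

```python
# Offset recovery for step 5: Q(j) = (i - P*j) mod N makes the ARP
# map send position j to the selected vertex i.
def offset_from_target(N, P, j, i):
    """Offset Q(j) such that Pi(j) = P*j + Q(j) = i (mod N)."""
    return (i - P * j) % N

# Example: with N=16 and P=3, sending j=1 to i=7 needs Q1=4, which
# then also fixes the images of positions 5, 9, 13 (period 4).
Q1 = offset_from_target(16, 3, 1, 7)
```

This is why each greedy choice adds N/4 edges at once: fixing Q(j) for one j fixes the image of every position congruent to j mod 4.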

Remarks At each step, N/4 edges are added, unlike in the PEG algorithm. There is no guarantee of optimality, nor even of efficiency. Selected vertices have no modulo constraint, unlike in the original ARP model. The method naturally generalizes to larger periods. Instead of selecting the most distant vertex, it is possible to select one amongst the most distant ones (Declercq et al. [2008]), giving more flexibility in the constructed graphs.

Unconditional upper bound : Moore's bound Moore's bound is a general bound on the diameter (and girth) of regular graphs.
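The slide only names the bound; its standard statement for a k-regular graph on n vertices with girth g can be written as:

```latex
% Moore bound for a k-regular graph on n vertices with girth g
% (standard textbook statement; not spelled out on the slide).
n \;\ge\; 1 + k \sum_{i=0}^{(g-3)/2} (k-1)^i \qquad (g \text{ odd}),
\qquad
n \;\ge\; 2 \sum_{i=0}^{g/2-1} (k-1)^i \qquad (g \text{ even}).
```

Inverting either inequality shows that the girth of a k-regular graph can grow at most logarithmically in n (roughly 2 log_{k-1} n), which gives the benchmark against which the constructed correlation graphs are measured.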

Complexity Quite low, thanks to Dijkstra's algorithm [Dijkstra 1959] : Demo.
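Since the correlation graph only carries edge weights 0 (permutation connections) and 1 (neighbor connections), the distance computations can use a deque-based 0-1 BFS, a standard linear-time special case of Dijkstra's algorithm. A sketch, assuming an adjacency-list representation (the example graph is a toy, not a real correlation graph):

```python
# 0-1 BFS: weight-0 edges push to the front of the deque, weight-1
# edges to the back, giving all shortest distances in O(V + E).
from collections import deque

def zero_one_bfs(adj, src):
    """adj[u] is a list of (v, w) pairs with w in {0, 1}."""
    dist = {u: float("inf") for u in adj}
    dist[src] = 0
    dq = deque([src])
    while dq:
        u = dq.popleft()
        for v, w in adj[u]:
            if dist[u] + w < dist[v]:
                dist[v] = dist[u] + w
                if w == 0:
                    dq.appendleft(v)
                else:
                    dq.append(v)
    return dist

# Toy example: a 4-cycle where the edge 0-3 costs 0.
g = {0: [(1, 1), (3, 0)], 1: [(0, 1), (2, 1)],
     2: [(1, 1), (3, 1)], 3: [(2, 1), (0, 0)]}
```

Plain Dijkstra with a binary heap would also work in O(E log V); the 0-1 specialization just exploits the two-valued weights of the correlation graph.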

Performance Binary turbo codes, Gaussian BPSK, Rate=1/3

ARP vs. DVB-S2 (I) Gaussian BPSK, 5400 bits, Rate=1/3

ARP vs. DVB-S2 (II) Rayleigh BPSK, 5400 bits, Rate=1/3

ARP vs. DVB-S2 (III) Rayleigh + 50 % erasures, BPSK, 5400 bits, Rate=1/3

Further research directions Could we obtain an optimality result (like Chandran's) in the case of structured graphs ? Can we get closer to Moore's bound ? Could new decoding algorithms be designed specifically to limit autocorrelation ?

Thank you for your attention !