Improving turbocode performance by cross-entropy


Improving turbocode performance by cross-entropy Yannick Saouter 24/02/2019

Turbocodes Turbocode encoding : the data frame X=(x1,…,xn) is permuted by an interleaver Π to give X’=(xΠ(1),…,xΠ(n)). The two frames are each encoded by a recursive systematic convolutional (RSC) encoder, giving the redundancies Y1 and Y2. Turbocodes were invented by Claude Berrou and Alain Glavieux [BerrouGlavieux93]

Soft decoding Turbo decoding : each decoder computes a posteriori estimates, which are passed as a priori information to the second decoder. SISO decoders : generally MAP or SUBMAP for convolutional turbocodes

Cross-entropy Definition : Let p, q be probability density functions over a sample space Ω; then: CE(p,q) = Σx∈Ω p(x)·log(p(x)/q(x)). Also known as the Kullback-Leibler divergence. Property : CE(p,q)≥0, with CE(p,q)=0 iff. p=q almost everywhere.
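The definition and its nonnegativity property can be checked numerically. A minimal sketch, where `cross_entropy` is an illustrative helper name, not from the slides:

```python
import math

def cross_entropy(p, q):
    """CE(p, q) = sum over x of p(x) * log(p(x) / q(x)).

    p and q are discrete probability distributions over the same sample
    space, given as lists; terms with p(x) = 0 contribute 0 by convention.
    """
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

p = [0.5, 0.3, 0.2]
q = [0.4, 0.4, 0.2]
assert cross_entropy(p, q) > 0             # positive since p != q
assert abs(cross_entropy(p, p)) < 1e-12    # zero iff p = q
```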

Cross-entropy decoding CE decoding : given code C and a priori distribution q(x), output the distribution p(x) such that [Battail87] : p(x) satisfies the parity constraints of C, and CE(p,q) is minimal. Generally computationally infeasible. Turbocode decoding is a suboptimal cross-entropy decoding [MoherGulliver98]. With SISO decoders, cross-entropy can give a numeric valuation of the quality of decoding: if CE(p,q) is low (resp. large), decoding is reliable (resp. unreliable).

Practical computation of cross-entropy X=(x1,…,xn) is the data frame and R=(r1,…,rn) the received convolutional frame, with input extrinsic values Lein=(Lein1,…,Leinn). The SISO decoder produces output extrinsic values Leout=(Leout1,…,Leoutn). A priori distribution Lin=Lc.X+Lein and a posteriori distribution Lout=Lc.X+Leout. Gaussian hypothesis : we have [Hagenauer96] : CE(p,q) = Σi [ (Leouti−Leini)/(1+e^−Louti) + log((1+e^Lini)/(1+e^Louti)) ]. Too computational: each sample requires exp and log evaluations.
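The exact per-sample cross-entropy follows from the LLR parameterization P(x=+1) = 1/(1+e^−L). A sketch, with illustrative helper names, showing why the exact computation is costly (several exp and log calls per sample):

```python
import math

def sigmoid(l):
    """P(x = +1) for a binary variable with log-likelihood ratio l."""
    return 1.0 / (1.0 + math.exp(-l))

def ce_bit(l_in, l_out):
    """Exact cross-entropy between the a posteriori distribution
    (LLR l_out) and the a priori distribution (LLR l_in) of one sample."""
    p_plus, p_minus = sigmoid(l_out), sigmoid(-l_out)
    q_plus, q_minus = sigmoid(l_in), sigmoid(-l_in)
    return (p_plus * math.log(p_plus / q_plus)
            + p_minus * math.log(p_minus / q_minus))

def ce_frame(llr_in, llr_out):
    """Frame cross-entropy: sum of the per-sample contributions."""
    return sum(ce_bit(li, lo) for li, lo in zip(llr_in, llr_out))
```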

Some hypotheses Goal : minimize the computational load, avoiding the use of exp and log functions. Hypothesis 1 : if Lini is large, then Louti is also large and has the same sign. This generally corresponds to correctly decoded samples. Hypothesis 2 : if Lini is low, then Louti is also low. Unreliable inputs cannot produce reliable outputs.

The « large » case Cross-entropy : assume Lini is large and positive. By Hypothesis 1, Louti is also large and positive; the terms e^−Lini and e^−Louti are then negligible, thus the contribution of sample i is CEi ≈ 0. The same result holds if Lini is large and negative.

The « low » case Hypothesis : Lini and Louti are near 0. Using Taylor expansions of the exact expression around 0, we obtain: CEi ≈ (Louti − Lini)²/8 = (Leouti − Leini)²/8.
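Both regimes can be checked numerically against the exact expression. In the sketch below, `ce_bit` re-derives the exact per-sample cross-entropy from the LLR parameterization, and the 1/8 factor is the quadratic Taylor term:

```python
import math

def ce_bit(l_in, l_out):
    """Exact per-sample cross-entropy, from P(x = +1) = 1/(1 + exp(-L))."""
    s = lambda l: 1.0 / (1.0 + math.exp(-l))
    p_plus, p_minus = s(l_out), s(-l_out)
    q_plus, q_minus = s(l_in), s(-l_in)
    return (p_plus * math.log(p_plus / q_plus)
            + p_minus * math.log(p_minus / q_minus))

# « low » case: the quadratic term (l_out - l_in)^2 / 8 is accurate
exact = ce_bit(0.05, 0.12)
approx = (0.12 - 0.05) ** 2 / 8.0
assert abs(exact - approx) / approx < 0.05

# « large » case: when both LLRs are large, the contribution is near 0
assert ce_bit(12.0, 15.0) < 1e-4
```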

Practical criterion Criterion : CE(p,q) ≈ (1/8) Σ{i : |Lini| < T} (Leouti − Leini)², where T is a positive threshold value, Lin=Lc.X+Lein and Lout=Lc.X+Leout. The definition is consistent with the « large » case, since samples with |Lini| ≥ T contribute nearly null terms.
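A minimal sketch of the thresholded criterion, assuming the small-LLR quadratic term above; function and parameter names are illustrative, not from the slides:

```python
def ce_criterion(l_c, x, le_in, le_out, T):
    """Simplified cross-entropy criterion: only samples whose a priori
    LLR magnitude |Lc*x_i + Lein_i| is below the threshold T contribute,
    each through the quadratic term (Leout_i - Lein_i)^2 / 8; reliable
    (large-LLR) samples are counted as 0, consistent with the large case."""
    total = 0.0
    for xi, lei, leo in zip(x, le_in, le_out):
        if abs(l_c * xi + lei) < T:    # « low » (unreliable) sample
            total += (leo - lei) ** 2 / 8.0
    return total

# All a priori LLRs far above T: the criterion reports a reliable frame.
assert ce_criterion(2.0, [1, -1], [8.0, -8.0], [9.0, -9.0], T=3.0) == 0.0
```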

Use with turbocodes Idea : use the cross-entropy criterion to favor good sets of extrinsic values and penalize bad sets. We set: (Lein)step+1 = α·(Lein)step, with α a scaling factor derived from the cross-entropy criterion.
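One plausible sketch of such an update; the mapping from the measured cross-entropy to α is a hypothetical assumption for illustration, not the rule from the slides (only the shape (Lein)step+1 = α·(Lein)step is taken from them):

```python
def next_extrinsics(le, ce_value, ce_ref=1.0, alpha_min=0.5):
    """Hypothetical per-iteration scaling (Lein)_{step+1} = alpha * (Lein)_step:
    alpha stays close to 1 when the measured cross-entropy is small
    (reliable decoding) and shrinks toward alpha_min when it is large.
    ce_ref and alpha_min are assumed tuning parameters."""
    alpha = max(alpha_min, ce_ref / (ce_ref + ce_value))
    return [alpha * l for l in le]

# Reliable frame (zero measured cross-entropy): extrinsics kept unchanged.
assert next_extrinsics([2.0, -1.5], 0.0) == [2.0, -1.5]
```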

Simulation results Details : frame length 2000 bits, ARP permutation model, SUBMAP decoding. Comparison of cross-entropy scaling vs. fixed scaling (8 iterations). Comparison of cross-entropy scaling (6 iterations) vs. fixed scaling (6-8 iterations).

Summary and further axes of research Summary : - Use of the cross-entropy principle in turbocode systems - Derivation of a simplified cross-entropy formula - Improvements in terms of speed of convergence and performance in the waterfall area Further axes of research : implementation issues; adaptation to duo-binary turbocodes; other types of codes (LDPC, product codes); using Gaussian mixtures for likelihood representation; more complex correction (Chebyshev interpolation).

Thank you for your attention !