Improving turbocode performance by cross-entropy
Yannick Saouter 24/02/2019
Turbocodes
Turbo code encoding: the data frame X=(x1,…,xn) is permuted by an interleaver π to give X'=(xπ(1),…,xπ(n)). The two frames are each encoded by an RSC (recursive systematic convolutional) encoder, giving the redundancies Y1 and Y2. Invented by Claude Berrou and Alain Glavieux [BerrouGlavieux93].
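A minimal sketch of this parallel concatenation, assuming an illustrative (7,5) RSC code and a random demo permutation; the generator polynomials and the permutation are assumptions, not those of the original system:

```python
import random

def rsc_parity(bits):
    # Rate-1/2 recursive systematic convolutional (RSC) encoder with
    # illustrative octal generators (7,5): feedback 1+D+D^2, forward 1+D^2.
    # Returns only the parity (redundancy) stream.
    a1 = a2 = 0                      # register contents a_{k-1}, a_{k-2}
    parity = []
    for u in bits:
        a = u ^ a1 ^ a2              # recursive feedback bit a_k
        parity.append(a ^ a2)        # parity output y_k = a_k + a_{k-2}
        a1, a2 = a, a1               # shift the register
    return parity

def turbo_encode(x, perm):
    # Parallel concatenation: Y1 from the natural-order frame X,
    # Y2 from the permuted frame X' = (x_pi(1), ..., x_pi(n)).
    y1 = rsc_parity(x)
    y2 = rsc_parity([x[p] for p in perm])
    return x, y1, y2                 # systematic bits plus both redundancies

n = 8
perm = random.sample(range(n), n)    # stand-in for a designed interleaver
x = [random.randint(0, 1) for _ in range(n)]
print(turbo_encode(x, perm))
```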
Soft decoding
Turbo decoding: the a posteriori estimates computed by one decoder are given as a priori information to the second decoder, and the exchange is iterated. SISO decoders: generally MAP or SUBMAP for convolutional turbocodes.
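A schematic sketch of this iterative exchange; `siso_decode` is a hypothetical stand-in for a MAP/SUBMAP decoder, and the sign convention of the hard decision is left to that implementation:

```python
import numpy as np

def turbo_decode(Lc_sys, Lc_par1, Lc_par2, perm, siso_decode, n_iter=8):
    # Schematic turbo-decoding loop.  `siso_decode(Ls, Lp, Le_in) -> Le_out`
    # is a hypothetical stand-in for a MAP/SUBMAP decoder: from channel LLRs
    # and a priori extrinsic values it produces a posteriori extrinsic values.
    Lc_sys = np.asarray(Lc_sys, float)
    perm = np.asarray(perm)
    inv = np.argsort(perm)               # inverse interleaver
    Le1 = Le2 = np.zeros_like(Lc_sys)
    for _ in range(n_iter):
        # Decoder 1 works in natural order, decoder 2 in permuted order;
        # each receives the other's extrinsic values as a priori information.
        Le1 = siso_decode(Lc_sys, Lc_par1, Le2)
        Le2 = siso_decode(Lc_sys[perm], Lc_par2, Le1[perm])[inv]
    return np.sign(Lc_sys + Le1 + Le2)   # hard decision on the total LLR
```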
Cross-entropy
Definition: Let p, q be probability density functions over the same space; then:

$$CE(p,q) = \sum_x p(x)\,\log\frac{p(x)}{q(x)}$$

Also known as the Kullback distance. Property: CE(p,q) ≥ 0, with CE(p,q) = 0 iff p = q almost everywhere.
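A direct numeric transcription of the definition for discrete distributions; a minimal sketch (the function name and the guard eps are mine):

```python
import numpy as np

def cross_entropy(p, q, eps=1e-300):
    # Kullback distance CE(p, q) = sum_x p(x) * log(p(x) / q(x)).
    # Nonnegative, and zero iff p == q on the support of p.
    p = np.asarray(p, float)
    q = np.asarray(q, float)
    m = p > 0                            # terms with p(x) = 0 contribute 0
    return float(np.sum(p[m] * np.log(p[m] / np.maximum(q[m], eps))))

print(cross_entropy([0.5, 0.5], [0.5, 0.5]))   # 0.0
print(cross_entropy([0.9, 0.1], [0.5, 0.5]))   # ~0.368: q far from p
```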
Cross-entropy decoding
CE decoding [Battail87]: given a code C and an a priori distribution q(x), output the distribution p(x) such that:
p(x) satisfies the parity constraints of C
CE(p,q) is minimal
This is generally computationally infeasible. Turbocode decoding is a suboptimal cross-entropy decoding [MoherGulliver98]. With SISO decoders, the cross entropy can give a numeric valuation of the quality of decoding: if CE(p,q) is low (resp. large), the decoding is reliable (resp. unreliable).
Practical computation of cross-entropy
X=(x1,…,xn) is a convolutional frame received as R=(r1,…,rn), with input extrinsic values Lein=(Lein1,…,Leinn). The SISO decoder produces output extrinsic values Leout=(Leout1,…,Leoutn). A priori distribution: Lin = Lc·X + Lein; a posteriori distribution: Lout = Lc·X + Leout. Under the Gaussian hypothesis, [Hagenauer96] gives an expression for the cross entropy, but it is too computational.
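The transcript does not reproduce the [Hagenauer96] expression; the sketch below is my reconstruction from the definitions above, computing the frame cross entropy from Lin and Lout under a bitwise-independence assumption, with p(x=1) = 1/(1+e^{-L}). The exp/log calls it needs are exactly what the simplified criterion later avoids:

```python
import numpy as np

def sigmoid(L):
    return 1.0 / (1.0 + np.exp(-L))

def bitwise_cross_entropy(L_out, L_in):
    # CE between the a posteriori distribution p (defined by L_out) and the
    # a priori distribution q (defined by L_in), assuming the frame
    # distribution factorizes over bits:
    #   CE = sum_i [ p_i log(p_i/q_i) + (1-p_i) log((1-p_i)/(1-q_i)) ]
    p = np.clip(sigmoid(np.asarray(L_out, float)), 1e-12, 1 - 1e-12)
    q = np.clip(sigmoid(np.asarray(L_in, float)), 1e-12, 1 - 1e-12)
    return float(np.sum(p * np.log(p / q) + (1 - p) * np.log((1 - p) / (1 - q))))
```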
Some hypotheses
Goal: minimize the computational load; avoid the use of exp and log functions.
Hypothesis 1: if Lini is large, then Louti is also large and has the same sign. This generally corresponds to correctly decoded samples.
Hypothesis 2: if Lini is low, then Louti is also low. Unreliable inputs cannot produce reliable outputs.
The « large » case
Cross entropy: assume Lini is large and positive. By Hypothesis 1, Louti is too, and the per-sample cross-entropy term simplifies (see the sketch below). The same result is obtained if Lini is large and negative.
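The simplified expression itself is not reproduced in this transcript; as a sketch, here is one way the limit can be carried out from the bitwise formula above (my reconstruction, with $\sigma(L)=1/(1+e^{-L})$, $p_i=\sigma(L^{out}_i)$, $q_i=\sigma(L^{in}_i)$):

$$
\begin{aligned}
CE_i &= p_i \log\frac{p_i}{q_i} + (1-p_i)\log\frac{1-p_i}{1-q_i}\\
&\approx e^{-L^{in}_i} - e^{-L^{out}_i} + e^{-L^{out}_i}\left(L^{in}_i - L^{out}_i\right) \qquad (L^{in}_i,\,L^{out}_i \gg 0)\\
&\approx \tfrac{1}{2}\left(L^{out}_i - L^{in}_i\right)^2 e^{-|L^{in}_i|} \qquad (\text{when } |L^{out}_i - L^{in}_i| \text{ is moderate}).
\end{aligned}
$$

This recovers the familiar $|\Delta L|^2\,e^{-|L|}$ shape of Hagenauer-style reliability measures; the constants on the original slide may differ.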
The « low » case
Hypothesis: Lini and Louti are near 0. Using Taylor expansions of the per-sample cross-entropy term, we obtain a simplified expression (see the sketch below).
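Again as a hedged reconstruction (the slide's expression is not reproduced): with $\sigma(L) \approx \tfrac{1}{2} + \tfrac{L}{4}$ near 0, the second-order Taylor expansion of the Kullback distance around $p = q$ gives, per sample,

$$
CE_i \approx \frac{(p_i - q_i)^2}{2\,q_i(1-q_i)} \approx \frac{\left(L^{out}_i - L^{in}_i\right)^2}{8}.
$$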
Practical criterion
Criterion: declare the decoding reliable when the simplified cross-entropy estimate computed from Lin and Lout falls below a positive threshold value T, where Lin = Lc·X + Lein and Lout = Lc·X + Leout. The definition is consistent with the « large » case.
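The criterion's exact formula is not reproduced in the transcript; the sketch below is one plausible exp/log-free reading that combines the two regimes above. The damping term and the combination rule are assumptions, not the paper's actual rule:

```python
import numpy as np

def reliable(L_in, L_out, T):
    # Hypothetical exp/log-free reliability criterion: per-sample score
    # proportional to the squared extrinsic update, damped for samples that
    # are already reliable (large |L_in|), echoing the "large"/"low" analysis.
    # The rational damping 1/(1+|L_in|) stands in for e^{-|L_in|} so that no
    # exp() is needed; T > 0 is the decision threshold.
    L_in = np.asarray(L_in, float)
    L_out = np.asarray(L_out, float)
    score = np.sum((L_out - L_in) ** 2 / (1.0 + np.abs(L_in)))
    return score < T                 # True: frame deemed reliably decoded
```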
Use with turbocodes
Idea: use the cross-entropy criterion to favor good sets of extrinsic values and penalize bad ones. We set (Lein)step+1 = α·(Lein)step, with the scaling factor α derived from the cross-entropy estimate (a sketch follows below).
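How α depends on the cross entropy is not reproduced in the transcript; the schedule below is a purely illustrative assumption showing where such a factor would enter, not the paper's rule:

```python
def scale_extrinsics(Le, ce_value, ce_ref, s_min=0.5, s_max=1.0):
    # Hypothetical scaling schedule: extrinsic sets whose cross-entropy
    # estimate is low (reliable) are passed on almost unchanged, while
    # high-cross-entropy (unreliable) sets are attenuated before the next
    # half-iteration.  Bounds s_min/s_max and the ramp are assumptions.
    if ce_value <= ce_ref:
        alpha = s_max
    else:
        alpha = max(s_min, s_max * ce_ref / ce_value)
    return [alpha * l for l in Le]
```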
Simulation results
Details: frame length 2000 bits, ARP permutation model, SUBMAP decoding.
Comparison: cross-entropy scaling vs. fixed scaling (8 iterations).
Comparison: cross-entropy scaling (6 iterations) vs. fixed scaling (6-8 iterations).
Further axes of research
Summary:
- Use of the cross-entropy principle in turbocode systems
- Derivation of a simplified cross-entropy formula
- Improvements in terms of speed of convergence and performance in the waterfall area
Further axes:
- Implementation issues
- Adaptation to duo-binary turbocodes
- Other types of codes: LDPC, product codes
- Using Gaussian mixtures for likelihood representation
- More complex correction: Chebyshev interpolation
Thank you for your attention!