
1 Iterative Joint Source-Channel Soft-Decision Sequential Decoding Algorithms for Parallel Concatenated Variable Length Code and Convolutional Code. Reporter: 林煜星. Advisor: Prof. Y.M. Huang

2 Outline: Introduction; Related Research (Transmission Model for BCJR, Simulation for BCJR Algorithm); Proposed Methodology (Transmission Model for Sequential, Simulation for Soft-Decision Sequential Algorithm); Conclusion

3 Introduction. Transmission chain: Discrete source → Source Encoder (data compression) → Channel Encoder (error-correcting code) → Modulator → Channel → Demodulator → Channel Decoder → Source Decoder → User. The Channel Decoder and Source Decoder together form the Joint Decoder.

4 Related Research: [1] L. Guivarch, J.C. Carlach and P. Siohan; [2] M. Jeanne, J.C. Carlach, P. Siohan and L. Guivarch; [3] M. Jeanne, J.C. Carlach, P. Siohan

5 Transmission Model for BCJR. Chain: Independent Source or first-order Markov Source → Huffman Coding → Turbo Coding (parallel concatenation) → Additive White Gaussian Noise Channel → Turbo decoding (utilization of the SUBMAP) → Huffman Decoding. The source emits P symbols, encoded into K bits and decoded back to P symbols; an a priori probability input feeds the turbo decoder.

6 Transmission Model for BCJR - Independent Source or first order Markov Source

7 Transmission Model for BCJR - Independent Source or first order Markov Source (1). Source statistics:
Symbol  Probability
A       0.75
B       0.125
C       0.125

8 Transmission Model for BCJR - Independent Source or first order Markov Source (2)

9 Transmission Model for BCJR - Independent Source or first order Markov Source (3). Example transition matrix P(Y | X) (each column sums to 1):
Y↓ \ X→   a      b      c
a         0.94   0.18   0.18
b         0.03   0.712  0.108
c         0.03   0.108  0.712
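The first-order Markov source above can be sketched as follows (illustrative only, not from the slides; the `generate` helper and its `rng` parameter are assumptions):

```python
import random

# Transition matrix from the slide: P[current][next] gives the
# next-symbol distribution, i.e. column X = current of P(Y | X).
P = {
    "a": {"a": 0.94, "b": 0.03, "c": 0.03},
    "b": {"a": 0.18, "b": 0.712, "c": 0.108},
    "c": {"a": 0.18, "b": 0.108, "c": 0.712},
}

def generate(n, start="a", rng=random):
    """Draw n symbols from the first-order Markov source."""
    out, cur = [], start
    for _ in range(n):
        dist = P[cur]
        cur = rng.choices(list(dist), weights=list(dist.values()))[0]
        out.append(cur)
    return out
```

Note that the symbol "a" is strongly self-correlated (0.94), which is exactly the source memory the Markov a priori probabilities exploit in the simulations later.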

10 Transmission Model for BCJR - Huffman Coding

11 Transmission Model for BCJR - Huffman Coding. Code table:
VLC  Symbol  Probability
0    A       0.75
10   B       0.125
11   C       0.125
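A quick check on the code table above: the average codeword length and the source entropy it is bounded by (a minimal sketch; the helper names are ours):

```python
import math

# Code table from the slide: symbol -> (codeword, probability)
code = {"A": ("0", 0.75), "B": ("10", 0.125), "C": ("11", 0.125)}

def average_length(code):
    """Expected codeword length in bits per symbol."""
    return sum(p * len(w) for w, p in code.values())

def entropy(code):
    """Source entropy H = -sum p*log2(p), a lower bound on average length."""
    return -sum(p * math.log2(p) for _, p in code.values())

# average_length(code) = 0.75*1 + 0.125*2 + 0.125*2 = 1.25 bits/symbol
# entropy(code) ≈ 1.0613 bits/symbol, so this Huffman code is near-optimal
```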

12 Transmission Model for BCJR - Turbo Coding parallel concatenation

13 Transmission Model for BCJR - Turbo Coding parallel concatenation (1). Non-systematic convolutional (NSC) code with input d = (1,1,1,0,1) and two coded outputs u and v.

14 Transmission Model for BCJR - Turbo Coding parallel concatenation (2). [Trellis diagram for d = (1,1,1,0,1): nodes (time, state) with state ∈ {0,1,2,3}; the input path produces branch outputs 11, 01, 10, 01, 00.]
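The slides' encoder diagram did not survive extraction, but a memory-2 NSC encoder with the common (7,5) octal generators reproduces the branch labels 11, 01, 10, 01, 00 for d = (1,1,1,0,1) shown on the trellis, so that is the assumption in this sketch:

```python
def nsc_encode(bits):
    """Rate-1/2 non-systematic convolutional encoder, memory 2,
    generators (7,5) octal (assumed): out1 = b ^ m1 ^ m2, out2 = b ^ m2.
    State (m1, m2) holds the two previous input bits."""
    m1 = m2 = 0
    out = []
    for b in bits:
        out += [b ^ m1 ^ m2, b ^ m2]  # two coded bits per input bit
        m1, m2 = b, m1                # shift the register
    return out

# nsc_encode([1, 1, 1, 0, 1]) yields the branch outputs 11 01 10 01 00
```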

15 Transmission Model for BCJR - Turbo Coding parallel concatenation (3). Recursive Systematic Convolutional (RSC) code, rate = 1/2.

16 Transmission Model for BCJR - Turbo Coding parallel concatenation (4). Rate = 1/4.

17 Transmission Model for BCJR - Turbo Coding parallel concatenation (5). 4×4 block interleaver: the input sequence is written row-wise
1   2   3   4
5   6   7   8
9   10  11  12
13  14  15  16
and read out column-wise:
1 5 9 13  2 6 10 14  3 7 11 15  4 8 12 16
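The 4×4 block interleaver above can be sketched in one line (an illustrative helper, not the slides' implementation):

```python
def block_interleave(seq, rows=4, cols=4):
    """Write the sequence row-wise into a rows×cols array and read it
    out column-wise, as in the slide's 4×4 example."""
    assert len(seq) == rows * cols
    return [seq[r * cols + c] for c in range(cols) for r in range(rows)]

# block_interleave(list(range(1, 17)))
# -> [1, 5, 9, 13, 2, 6, 10, 14, 3, 7, 11, 15, 4, 8, 12, 16]
```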

18 Transmission Model for BCJR - Turbo Coding parallel concatenation (6). Rate = 1/4; turbo code rate = 1/3.

19 Transmission Model for BCJR - Turbo Coding parallel concatenation (7). Turbo code rate = 1/2.

20 Transmission Model for BCJR - AWGN

21 Transmission Model for BCJR - AWGN (1)
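The AWGN channel stage can be sketched as BPSK transmission plus Gaussian noise (a minimal sketch assuming unit-energy symbols and Eb/N0 given in dB; the function names are ours):

```python
import random

def bpsk_awgn(bits, ebn0_db, rng):
    """BPSK over AWGN: map 0 -> +1, 1 -> -1, then add zero-mean
    Gaussian noise with variance N0/2 for the given Eb/N0 in dB."""
    ebn0 = 10 ** (ebn0_db / 10)
    sigma = (1 / (2 * ebn0)) ** 0.5
    return [(1 - 2 * b) + rng.gauss(0, sigma) for b in bits]

def hard_decision(r):
    """y_i = 1 if r_i < 0 else 0."""
    return [1 if x < 0 else 0 for x in r]
```

The soft values returned by `bpsk_awgn` are exactly what the BCJR (and later the sequential) decoder consumes; `hard_decision` recovers the uncoded decisions.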

22 Transmission Model for BCJR - Turbo decoding Utilization of the SUBMAP

23 Transmission Model for BCJR - Turbo decoding Utilization of the SUBMAP (5). Iterative structure: decoders BCJR1 and BCJR2 exchange a priori information.

24 Transmission Model for BCJR - Turbo decoding Utilization of the SUBMAP (1). MAP decoder: define the forward, backward and branch metrics.
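The MAP-decoder equations on this slide were lost in extraction; for reference, the standard BCJR quantities (notation assumed: $\alpha$ forward metric, $\beta$ backward metric, $\gamma$ branch metric, $u_k$ the information bit, $y_k$ the channel output, $s', s$ trellis states):

```latex
\gamma_k(s',s) = P(u_k)\, p(y_k \mid s', s), \qquad
\alpha_k(s) = \sum_{s'} \alpha_{k-1}(s')\, \gamma_k(s', s), \qquad
\beta_{k-1}(s') = \sum_{s} \beta_k(s)\, \gamma_k(s', s)

L(u_k) = \ln \frac{\displaystyle\sum_{(s',s):\,u_k=1} \alpha_{k-1}(s')\, \gamma_k(s',s)\, \beta_k(s)}
              {\displaystyle\sum_{(s',s):\,u_k=0} \alpha_{k-1}(s')\, \gamma_k(s',s)\, \beta_k(s)}
```

The a priori term $P(u_k)$ in $\gamma_k$ is where the source statistics (independent or Markov) enter the channel decoder in this joint scheme.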

25 Transmission Model for BCJR - Turbo decoding Utilization of the SUBMAP (2). Log-Likelihood Ratio (LLR); recall the MAP decoder definitions.

26 Transmission Model for BCJR - Turbo decoding Utilization of the SUBMAP (3). [Trellis section: branches between states (4,1), (4,3) and (5,1), (5,2), (5,3), labeled 00, 01, 10, 11, for input bits 0 and 1.]

27 Transmission Model for BCJR - Turbo decoding Utilization of the SUBMAP (4). [Trellis path through states (0,0), (2,3), (3,3), (5,0).]

28 Transmission Model for BCJR - Turbo decoding Utilization of the SUBMAP (5). Iterative structure: decoders BCJR1 and BCJR2 exchange a priori information.

29 Transmission Model for BCJR - Turbo decoding Utilization of the SUBMAP (6)

30 Simulation for BCJR Algorithm. Transmission ends when either the maximum bit-error count, fixed at 1000, or the maximum of 10,000,000 transmitted bits is reached. Input data is grouped into blocks of 4096 bits.
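The stopping rule above can be sketched as a driver loop (illustrative; `simulate_block` is a hypothetical callback returning the bit-error count of one simulated block):

```python
def run_simulation(simulate_block, block_bits=4096,
                   max_errors=1000, max_bits=10_000_000):
    """Stop when either 1000 bit errors or 10,000,000 transmitted
    bits are reached; data is processed in 4096-bit blocks."""
    errors = bits = 0
    while errors < max_errors and bits < max_bits:
        errors += simulate_block(block_bits)
        bits += block_bits
    return errors / bits  # estimated bit error rate
```

Capping on errors rather than bits keeps the BER estimate's relative accuracy roughly constant across Eb/N0 points.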

31 Simulation for BCJR Algorithm (1). 1NP: 1 iteration, independent source, no a priori probability used. 1NP: 1 iteration, independent source, a priori probability used.

32 Simulation for BCJR Algorithm (2). 1NP: 1 iteration, Markov source, no a priori probability used. 1MP: 1 iteration, Markov source, Markov a priori probability used.

33 Simulation for BCJR Algorithm (3). 12D: 1 iteration, independent source, a priori probability used; state = (bit time (level), convolutional state). 13D: 1 iteration, independent source, a priori probability used; state = (bit time (level), tree state, convolutional state).

34 Proposed Methodology. [4] Catherine Lamy, Lisa Perros-Meilhac

35 Transmission Model for Sequential - Sequential Decoding. Iterative structure: a sequential decoder replaces the first constituent decoder and exchanges a priori information with BCJR2.

36 Transmission Model for Sequential - Sequential Decoding (1). Hard decision on the code word bits: y_i = 1 if r_i < 0, 0 otherwise; the magnitude |r_i| serves as the bit reliability.

37 Transmission Model for Sequential - Sequential Decoding (2). Example: received r = (-1, 3, 2, 1, -2, -1, -3, -1, 1, 2), hard decisions y = (1,0,0,0,1,1,1,1,0,0), reliabilities |r| = (1,3,2,1,2,1,3,1,1,2). [Stack-decoding trace: starting from the origin node (0,0), nodes (time, state) such as (1,0), (1,1), (2,0), (2,1), (3,0), (3,1), (4,2), (4,3), (5,0), (5,1) are opened and closed in order of path metric, with branch labels 00, 01, 10, 11 and running metrics 2, 3, 4, 4, 4, 5.]
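A best-first (stack) sequential decoder in the spirit of the trace above can be sketched as follows. This is a minimal sketch, not the thesis's algorithm: it assumes the (7,5) octal NSC encoder from the earlier trellis, uses a simple correlation metric on the soft values r, and omits the stack-size limit and metric bias that practical sequential decoders add:

```python
import heapq

def nsc_encode(bits):
    """Rate-1/2 NSC encoder, memory 2, generators (7,5) octal (assumed)."""
    m1 = m2 = 0
    out = []
    for b in bits:
        out += [b ^ m1 ^ m2, b ^ m2]
        m1, m2 = b, m1
    return out

def stack_decode(r, n_bits):
    """Best-first (stack) sequential decoding. Each branch adds
    sum(r_i * (1 - 2*c_i)) over its two coded bits, so agreeing with a
    soft value raises the metric by its reliability |r_i|."""
    # Stack entries: (-metric, depth, state (m1, m2), decoded path)
    stack = [(0.0, 0, (0, 0), ())]
    while stack:
        neg_metric, depth, (m1, m2), path = heapq.heappop(stack)
        if depth == n_bits:          # first full-length pop = best path
            return list(path)
        for b in (0, 1):             # open both successor nodes
            c = (b ^ m1 ^ m2, b ^ m2)
            branch = sum(r[2 * depth + i] * (1 - 2 * ci)
                         for i, ci in enumerate(c))
            heapq.heappush(stack, (neg_metric - branch, depth + 1,
                                   (b, m1), path + (b,)))
    return None
```

Unlike BCJR, which visits every trellis branch, the stack decoder only extends the currently most promising node, which is the computational saving the proposed methodology builds on.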

38 Transmission Model for Sequential - Sequential Decoding (2) (continued)

39 Simulation for Sequential Algorithm. 2D1: 1 iteration, independent source, a priori probability used; state = (bit time (level), convolutional state). 3D1: 1 iteration, independent source, a priori probability used; state = (bit time (level), convolutional state, tree state).


41 Conclusion. A heuristic method for obtaining the sequential decoder's soft-output values is applied within the iterative decoding architecture; it reduces errors and saves computation time, but its decoding performance does not reach that of the turbo decoder. Future work will pursue better methods of obtaining the sequential decoder's soft-output values so that performance approaches the turbo decoder's.


