1
Iterative Equalization and Decoding
John G. Proakis COMSOC Distinguished Lecture Tour
2
Conventional Equalization
[Block diagram: input from the receiver filter → Equalizer → soft or hard output → Decoder]
Possible Equalizer Types: Linear Equalizer, Decision Feedback Equalizer (DFE), Maximum A Posteriori Probability (MAP) Equalizer, Soft-output Viterbi (MLSE) Equalizer
Possible Decoder Types: Maximum A Posteriori Probability (MAP) Decoder, Viterbi (MLSE) Decoder
3
Turbo Principle / Turbo Coding
Turbo Encoder: parallel concatenated recursive systematic convolutional encoders, separated by an interleaver
[Encoder diagram: d(n) drives Encoder 1 directly and Encoder 2 through the interleaver P (giving d'(n)); the systematic bits c_s(n) and parity bits c_p1(n), c_p2(n) pass through a puncturer and P/S converter to form b(n)]
Turbo Decoder: two Soft-Input Soft-Output (SISO) decoders separated by interleavers
SISO modules can be SOVA or MAP
Extrinsic information passed between modules
[Decoder diagram: SISO Decoder 1 (inputs x_s, x_p1) and SISO Decoder 2 (input x_p2) exchange extrinsic information L_e12 and L_e21 through the interleaver P and deinterleaver P⁻¹; the final output is L_d]
4
Serially Concatenated Systems
Serially Concatenated Coding: serially concatenated (recursive) convolutional encoders, separated by an interleaver
[Diagram: d(n) → Encoder 1 → c(n) → interleaver P → c'(n) → Encoder 2 → b(n)]
Coded Transmission over Multipath Channels: a (recursive) convolutional encoder, interleaved bits mapped to symbols, symbols passed through a multipath channel
[Diagram: d(n) → Encoder → c(n) → interleaver P → c'(n) → Symbol Mapper → x(n) → Multipath Channel → r(n)]
5
Channel Model / Precoding
Multipath Channel Model: the channel acts as a rate-1/1 convolutional code on the transmitted symbols
[Diagram: x(n) passes through a tapped delay line with taps h_0(n), h_1(n), ..., h_{L-1}(n); the tap outputs are summed with noise w(n) to give the received signal r(n)]
Precoded System: iteration gain only possible with a recursive inner code (channel) [1], [2]
Recursive rate-1/1 precoder is employed before transmission
Most common precoder: differential encoder
[Diagram: x(n) is differentially encoded, y(n) being formed from x(n) and y(n-1), and y(n) is transmitted through the same tapped-delay-line channel to give r(n)]
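The received-signal and precoder equations on this slide appear only as images in the transcript. A hedged reconstruction consistent with the two diagrams (x(n) the transmitted symbol, w(n) additive noise, y(n) the precoder output for BPSK-type symbols) is:

$$ r(n) = \sum_{l=0}^{L-1} h_l(n)\, x(n-l) + w(n), \qquad y(n) = x(n)\, y(n-1) $$

In the precoded system, y(n) replaces x(n) at the channel input, so the cascade of precoder and channel behaves as a recursive inner code.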
6
Iterative Equalization and Decoding (Turbo Equalizer)
Data bits are convolutionally encoded and interleaved
M-ary PSK modulated signals are transmitted through a multipath channel, which is treated as an encoder
Received signals are jointly equalized and decoded in a turbo structure
First proposed by Douillard et al. [3], where SOVA modules are employed
Bauch extended the idea by employing MAP modules [4]
[Transmitter diagram: d_n → Convolutional Encoder → c_n → interleaver P → c'_n → Symbol Mapper → x_n → Multipath Channel → r_n]
[Receiver diagram: the equalizer (aided by a channel estimator) and the MAP decoder exchange soft information through the interleaver P and deinterleaver P⁻¹; the a priori input is subtracted from each module's output, L_E(c') and L_D(c), to form the extrinsic quantities L_E^e(c') and L_D^e(c'), and the decoder also outputs L_D(d)]
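The labels in the receiver diagram suggest the usual turbo-equalization exchange rule; a hedged statement in that notation:

$$ L_E^{e}(c'_n) = L_E(c'_n) - L_D^{e}(c'_n), \qquad L_D^{e}(c_n) = L_D(c_n) - L_E^{e}(c_n) $$

Each module subtracts its a priori input from its output so that only extrinsic information is passed on; L_E^e(c') is deinterleaved to L_E^e(c) before decoding, and L_D^e(c) is re-interleaved to L_D^e(c') before the next equalization pass.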
7
Time-invariant Test Channel
Proakis C channel [5]
[Figure: impulse response (tap values 0.688, 0.460, 0.227 labeled) and frequency response]
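A quick way to see why this channel is hard to equalize is that its frequency response has a deep in-band null. A minimal sketch (mine, not from the lecture), using the standard Proakis C tap values 0.227, 0.460, 0.688, 0.460, 0.227 from [5]:

```python
import numpy as np

# Proakis C test channel: symmetric 5-tap impulse response with a spectral null.
h = np.array([0.227, 0.460, 0.688, 0.460, 0.227])

# Evaluate the DTFT H(e^{jw}) on a dense grid of frequencies in [0, pi].
w = np.linspace(0.0, np.pi, 1024)
H = np.exp(-1j * np.outer(w, np.arange(len(h)))) @ h

print("max |H| = %.3f" % np.abs(H).max())
print("min |H| = %.4f  (the in-band spectral null)" % np.abs(H).min())
```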
8
Low Complexity Alternative Equalizers: DFE and MLSE
Performance of DFE and MLSE over the Proakis C channel [5]
[Figure: bit error rate performance]
9
Iterative Equalization and Decoding Performance
Iterative equalization and decoding with MAP modules
Recursive systematic convolutional encoder with R=1/2, K=5
Time-invariant 5-tap channel with a spectral null (Proakis C [5])
Equalizer has perfect knowledge of the channel
Block length 4096
[Figure: bit error rate performance of the turbo equalizer [4]]
10
Hard Iterative DFE and MAP Decoding
[Block diagram: input from the receiver filter → forward filter → symbol detector → deinterleaver P⁻¹ → MAP decoder → output data; hard encoded symbols from the decoder are re-interleaved (P) and drive the feedback filter of the DFE (hard-input feedback)]
During the first pass, the symbol detector output is passed to the feedback filter
After the first pass, the hard encoded-symbol output of the decoder is used in the feedback filter (see the sketch below)
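An illustrative sketch (mine, not the lecture's implementation) of the two feedback modes for BPSK. The function and parameter names are hypothetical; the forward taps f and feedback taps b are assumed to come from elsewhere, e.g. an MMSE design or RLS adaptation, and decoder_symbols, if given, are the hard re-encoded, re-interleaved symbols from the previous decoding pass.

```python
import numpy as np

def dfe_pass(r, f, b, decoder_symbols=None):
    """One equalization pass over received samples r (BPSK, real-valued).
    If decoder_symbols is None, local hard decisions feed the feedback filter
    (first pass); otherwise the decoder's hard symbols are used (later passes)."""
    Nf, Nb = len(f), len(b)
    decisions = np.zeros(len(r))
    past = np.zeros(Nb)                              # most recent fed-back symbols, newest first
    for n in range(len(r)):
        window = r[max(0, n - Nf + 1):n + 1][::-1]   # forward-filter input, newest first
        z = np.dot(f[:len(window)], window) - np.dot(b, past)
        decisions[n] = 1.0 if z >= 0 else -1.0       # hard symbol decision
        fb = decisions[n] if decoder_symbols is None else decoder_symbols[n]
        past = np.concatenate(([fb], past[:-1]))     # shift the fed-back symbol in
    return decisions
```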
11
Performance of Hard Iterative DFE and MAP decoder
BPSK modulation
R=1/2, K=7 convolutional coding
Block length 2048
Channel: Proakis C
12
Soft Iterative DFE and MAP Decoding
[Block diagram: forward filter and decision device feed the deinterleaver P⁻¹ and MAP decoder; soft encoded symbols from the decoder are re-interleaved (P) and combined with the DFE quantities in the feedback filter]
Soft decisions of the decoder are combined with the soft outputs of the DFE: the feedback input is formed from the hard detected symbols and the soft APP from the last iteration (see the note below)
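The combining itself is shown only graphically on the slide. For BPSK, a common way (not necessarily the exact rule used here) to turn the decoder's APPs on the coded bits into soft encoded symbols for the feedback filter is the a posteriori mean

$$ \bar{x}_n = \mathrm{E}[x_n] = \tanh\!\big(L_D(c'_n)/2\big) $$

which is then used alongside the DFE's own hard detected symbols in the feedback path.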
13
Histogram of DFE Output
[Figure: histogram of the equalizer output estimates for SNR = 12 dB]
[Figure: histogram of the equalizer output estimates for SNR = 20 dB]
14
Modified Soft Iterative DFE and MAP Decoding
[Block diagram: the DFE output is converted to LLRs using per-component (Re/Im) variance estimators before deinterleaving (P⁻¹) and MAP decoding; the decoder's soft encoded symbols are re-interleaved (P) and combined with the hard detected symbols and the soft APP from the last iteration in the feedback filter]
Only extrinsic information is passed to the DFE from the decoder (a hedged form of the LLR conversion is given below)
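The "conversion to LLR" block with per-component variance estimators is consistent with a Gaussian approximation of the DFE output. A hedged form (my reconstruction, not the slide's exact expressions), with z_n the equalizer output, an estimated signal gain \hat{\mu}, and estimated per-component noise variances:

$$ L(c'_{n,\mathrm{Re}}) \approx \frac{2\,\hat{\mu}\,\mathrm{Re}\{z_n\}}{\hat{\sigma}^2_{\mathrm{Re}}}, \qquad L(c'_{n,\mathrm{Im}}) \approx \frac{2\,\hat{\mu}\,\mathrm{Im}\{z_n\}}{\hat{\sigma}^2_{\mathrm{Im}}} $$

The a priori input to the decoder is then subtracted from its output so that only the extrinsic part is returned to the DFE.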
15
Performance of Soft Iterative DFE and MAP Decoder
Recursive systematic convolutional encoder with R=1/2, K=5
Time-invariant 5-tap channel with a spectral null (Proakis C [5])
RLS updates at the DFE
Block length 4096
BPSK modulation
16
Performance of Soft Iterative DFE and MAP Decoder
Recursive systematic convolutional encoder with R=1/2, K=5
Time-invariant 5-tap channel with a spectral null (Proakis C [5])
RLS updates at the DFE
Block length 4096
QPSK modulation
17
Iterative Linear MMSE Equalization and Decoding
SISO Linear MMSE Equalizer [6], known-channel case
[Slide equations, rendered as images: the received-signal model, the channel matrix, the MMSE estimator output, and the likelihood ratio for the MMSE estimator output; a hedged reconstruction follows]
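The estimator of [6] has roughly the following sliding-window form (notation adapted, not the slide's exact symbols). With a received window \mathbf{r}_n = \mathbf{H}\mathbf{x}_n + \mathbf{w}_n, \mathbf{w}_n \sim \mathcal{CN}(0, \sigma_w^2 \mathbf{I}), a priori symbol means \bar{x}_k and variances v_k delivered by the decoder, \mathbf{V}_n = \mathrm{diag}(v_k), and \mathbf{e}_n the unit vector selecting the symbol of interest x_n:

$$ \hat{x}_n = \bar{x}_n + v_n\, \mathbf{e}_n^H \mathbf{H}^H \big( \sigma_w^2 \mathbf{I} + \mathbf{H}\,\mathbf{V}_n\,\mathbf{H}^H \big)^{-1} \big( \mathbf{r}_n - \mathbf{H}\,\bar{\mathbf{x}}_n \big) $$

Here \mathbf{H} is the banded Toeplitz matrix built from the channel taps; see [6] for the exact expressions, including the extrinsic treatment of the a priori information on x_n itself.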
18
Iterative Linear MMSE Equalization and Decoding
Steps to compute symbol estimates with the linear MMSE equalizer (the step-by-step equations are shown on the slide)
Soft output calculation assuming Gaussian-distributed estimates (a hedged form is given below)
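Modeling the estimate as Gaussian, \hat{x}_n \approx \mu_n x_n + \eta_n with \eta_n \sim \mathcal{CN}(0, \sigma_n^2), the extrinsic LLR for BPSK (x_n = ±1) takes the familiar form (a hedged reconstruction, not the slide's exact expression):

$$ L_E(x_n) = \ln \frac{p(\hat{x}_n \mid x_n = +1)}{p(\hat{x}_n \mid x_n = -1)} = \frac{4\,\mu_n\,\mathrm{Re}\{\hat{x}_n\}}{\sigma_n^2} $$

where \mu_n and \sigma_n^2 follow from the MMSE filter and the channel, or are estimated from the equalizer output.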
19
Performance of SISO MMSE Linear Iterative Equalizer
Recursive systematic convolutional encoder with R=1/2, K=5
Time-invariant 5-tap channel with a spectral null (Proakis C [5])
Equalizer has perfect knowledge of the channel
Block length 4096
20
Comparison of System Performances
Recursive systematic convolutional encoder with R=1/2, K=5
Time-invariant 5-tap channel with a spectral null (Proakis C [5])
BER results after 6 iterations
Block length 4096
21
Experimental Study of Iterative Equalizers
[Packet structure: channel probe, training symbols, information, dead time]
22
Joint MMSE Equalization and Turbo Decoding
[Receiver block diagram: an adaptive DFE (forward filter, feedback filter, decision device, carrier-phase rotation e^{-jθ(n)}, error e(n) driving the adaptive algorithm, with training symbols used for start-up) produces symbol estimates that a demapper & S/P converter splits into x_s, x_p1, x_p2; these feed a turbo decoder whose two MAP decoders exchange extrinsic information through the interleaver P and deinterleaver P⁻¹ and produce the final output L_d]
23
Decision Feedback Equalizer (DFE)
Soft output of the DFE (equation shown on the slide)
RLS algorithm is used to track channel variation (the standard recursion is sketched below)
Noise variance estimate (equation shown on the slide)
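The RLS and noise-variance expressions appear only as images. The standard exponentially weighted RLS recursion, with forgetting factor λ, regressor u(n), desired response d(n), and coefficient vector w(n), which is presumably what the slide shows in its own notation:

$$ k(n) = \frac{P(n-1)\,u(n)}{\lambda + u^H(n)\,P(n-1)\,u(n)}, \qquad e(n) = d(n) - w^H(n-1)\,u(n) $$

$$ w(n) = w(n-1) + k(n)\,e^*(n), \qquad P(n) = \lambda^{-1}\big[P(n-1) - k(n)\,u^H(n)\,P(n-1)\big] $$

A noise-variance estimate can then be formed, for example, as an exponentially weighted average of |e(n)|^2 (the slide's exact estimator is not reproduced here).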
24
MAP Decoding
Maximize the a posteriori probability
Decision variable written in the form of a log-likelihood ratio (the standard definitions are given below)
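The expressions on the slide are images; the standard definitions they refer to are

$$ \hat{d}_n = \arg\max_{d \in \{0,1\}} P(d_n = d \mid \mathbf{r}), \qquad L(d_n) = \ln \frac{P(d_n = 1 \mid \mathbf{r})}{P(d_n = 0 \mid \mathbf{r})} $$

so the hard decision is \hat{d}_n = 1 exactly when L(d_n) ≥ 0.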
25
MAP Decoding (BCJR Algorithm)
State transition probability (branch metric): composed of a channel value, a priori information, and an extrinsic term (the equation is an image on the slide; a hedged decomposition follows)
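A hedged reconstruction of the decomposition named on the slide, in standard BCJR notation with branch metric γ_n(s', s) for the transition s' → s:

$$ \gamma_n(s', s) \propto \underbrace{p(r_n \mid s', s)}_{\text{channel value}} \cdot \underbrace{P(d_n)}_{\text{a priori}}, \qquad L(d_n) = L_{\text{channel}}(d_n) + L_a(d_n) + L_e(d_n) $$

Only the extrinsic part L_e(d_n) is passed to the other module of the iterative loop.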
26
MAP Equalizer
Depends on the channel trellis defined by h_l(n), with 2^(L-1) states
If x_{n-l} for J < l < L-1 is known, the number of states is reduced to 2^(J-1)
27
Per-Survivor Processing
[Trellis diagram: survivor paths and discarded paths through states 00, 01, 10, 11 over successive time steps n]
Path metric and survivor path (equations shown on the slide; a typical form is sketched below)
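The path-metric and survivor expressions are slide images. A typical per-survivor-processing metric, assuming a squared-error branch metric and channel estimates \hat{h}_l^{(s')}(n) kept along the survivor into state s':

$$ \lambda_n(s) = \min_{s' \to s} \Big[ \lambda_{n-1}(s') + \Big| r(n) - \sum_{l=0}^{L-1} \hat{h}_l^{(s')}(n)\, \tilde{x}(n-l) \Big|^2 \Big] $$

where \tilde{x}(n-l) are the symbols along the surviving path into s' extended by the transition s' → s; the minimizing predecessor defines the survivor path into s.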
28
Each survivor path has a separate channel estimator
[Block diagram: an adaptive algorithm updates the channel tap estimates h_l(n) from the error between the received sample and the survivor-based prediction]
The input to the channel estimator is the symbol estimates within the survivors
RLS algorithm is employed (a hedged sketch follows)
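An illustrative sketch (mine, not the lecture's code) of the kind of RLS channel estimator that would be kept per survivor; the class and parameter names are hypothetical. x_hist holds the most recent symbol estimates taken from that survivor, newest first, and r_n is the current received sample.

```python
import numpy as np

class RLSChannelEstimator:
    """Tracks channel taps h so that r(n) ~ sum_l h[l] * x(n-l)."""
    def __init__(self, num_taps, lam=0.99, delta=100.0):
        self.h = np.zeros(num_taps, dtype=complex)   # channel tap estimates h_l(n)
        self.P = np.eye(num_taps) * delta            # inverse correlation matrix
        self.lam = lam                               # forgetting factor

    def update(self, x_hist, r_n):
        x = np.asarray(x_hist, dtype=complex)        # survivor symbols, newest first
        e = r_n - np.dot(self.h, x)                  # prediction error
        xc = np.conj(x)
        k = self.P @ xc / (self.lam + x @ self.P @ xc)
        self.h = self.h + k * np.conj(e)             # tap update
        self.P = (self.P - np.outer(k, x) @ self.P) / self.lam
        return e                                     # |e|^2 can feed a noise-variance estimate
```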
29
Channel Estimator
Initial channel estimate is based on the correlation of the preamble
RLS algorithm is employed to track the channel
Noise variance estimate (equation shown on the slide)
30
Experimental Results
[Figure: channel impulse response estimate for transducer 7 obtained using the channel probe]
[Figure: DFE results for transducer 7: eye pattern, filter coefficients, PLL phase estimate, bit error distribution]
31
Experimental Results
[Figure: channel impulse response estimate for transducer 7 obtained using the adaptive channel estimator]
[Figure: comparison of the received signal with the estimated received signal based on the channel estimate]
32
Experimental Results
Sparse channel with multipath delay on the order of 200 symbols
The DFE and channel-estimator filter lengths cannot cover the full channel
Sparse processing is needed
33
Results of DFE Turbo Decoder
34
Results of Iterative DFE Turbo Decoder
35
Results of Iterative DFE MAP Decoder
36
Results for Iterative MAP Equalizer Turbo Decoder
37
Results for Iterative MAP Equalizer MAP Decoder
38
Conclusions
Due to error propagation in the DFE, the turbo decoder cannot provide a performance improvement beyond the second iteration, which leads to an error floor
Joint DFE and turbo decoding adds an additional loop to the system and lowers the error floor
The joint channel estimator and iterative equalizer is able to decode low-SNR packets that cannot be decoded with the DFE alone
Tail cancellation is an effective way to reduce the computational complexity of the MAP equalizer
If the channel is sparse, the DFE is able to provide enough information to the turbo decoder even though the DFE filter lengths are short
A sparse DFE can be used to improve the performance of the DFE/MAP decoder and the DFE/turbo decoder
39
References
[1] S. Benedetto et al., "Serial concatenation of interleaved codes: Design and performance analysis," IEEE Trans. Info. Theory, vol. 42, pp , April 1998
[2] I. Lee, "The effect of a precoder on serially concatenated coding systems with ISI channel," IEEE Trans. Commun., pp , July 2001
[3] C. Douillard et al., "Iterative correction of intersymbol interference: Turbo-equalization," European Transactions on Telecommunications, vol. 6, pp , Sep.-Oct. 1995
[4] G. Bauch, H. Khorram, and J. Hagenauer, "Iterative equalization and decoding in mobile communications systems," in Proc. European Personal Mobile Commun. Conf., pp
[5] J. Proakis, Digital Communications, McGraw-Hill Inc., 2001
[6] M. Tüchler, A. Singer, and R. Koetter, "Minimum mean squared error equalization using a priori information," IEEE Trans. Signal Proc., vol. 50, pp , March 2002