
1 ECE 4371, Fall, 2014 Introduction to Telecommunication Engineering/Telecommunication Laboratory Zhu Han Department of Electrical and Computer Engineering Class 21 Nov. 12 th, 2014

2 Outline
Modern codes
 – Turbo code
 – LDPC code
 – Fountain code
MIMO / space-time coding
Coded modulation
 – Trellis-coded modulation
 – BICM

3 Turbo Codes
Background
 – Turbo codes were proposed by Berrou and Glavieux at the 1993 International Conference on Communications.
 – Performance within 0.5 dB of the channel capacity limit for BPSK was demonstrated.
Features of turbo codes
 – Parallel concatenated coding
 – Recursive convolutional encoders
 – Pseudo-random interleaving
 – Iterative decoding

4 Motivation: Performance of Turbo Codes
Comparison:
 – Rate 1/2 codes.
 – K = 5 turbo code.
 – K = 14 convolutional code.
Plot is from:
 – L. Perez, "Turbo Codes", chapter 8 of Trellis Coding by C. Schlegel, IEEE Press, 1997.
(Plot annotations: gain of almost 2 dB; theoretical limit.)

5 Concatenated Coding
A single error correction code does not always provide enough error protection with reasonable complexity.
Solution: concatenate two (or more) codes.
 – This creates a much more powerful code.
Serial concatenation (Forney, 1966):
 Outer Encoder -> Block Interleaver -> Inner Encoder -> Channel -> Inner Decoder -> Deinterleaver -> Outer Decoder

6 Parallel Concatenated Codes
Instead of concatenating in serial, codes can also be concatenated in parallel.
The original turbo code is a parallel concatenation of two recursive systematic convolutional (RSC) codes.
 – Systematic: one of the outputs is the input.
(Encoder diagram: the input feeds Encoder #1 directly and Encoder #2 through the interleaver; the systematic output is the input itself, and the two parity streams are multiplexed into the parity output.)

7 Pseudo-random Interleaving
The coding dilemma:
 – Shannon showed that large block-length random codes achieve channel capacity.
 – However, codes must have structure that permits decoding with reasonable complexity.
 – Codes with structure don't perform as well as random codes.
 – "Almost all codes are good, except those that we can think of."
Solution:
 – Make the code appear random, while maintaining enough structure to permit decoding.
 – This is the purpose of the pseudo-random interleaver.
 – Turbo codes possess random-like properties.
 – However, since the interleaving pattern is known, decoding is possible.
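As a rough illustration (not taken from the slides), a pseudo-random interleaver is simply a fixed permutation known to both ends: it scrambles the bit order so the code looks random to the channel, while the receiver can undo it exactly. A minimal sketch, assuming a seeded permutation stands in for the interleaving pattern:

```python
import numpy as np

def make_interleaver(length, seed=0):
    """A fixed pseudo-random permutation, known to both encoder and decoder."""
    return np.random.default_rng(seed).permutation(length)

def interleave(bits, perm):
    return bits[perm]

def deinterleave(bits, perm):
    out = np.empty_like(bits)
    out[perm] = bits          # invert the permutation
    return out

bits = np.array([1, 0, 1, 1, 0, 0, 1, 0])
perm = make_interleaver(len(bits), seed=42)
assert np.array_equal(deinterleave(interleave(bits, perm), perm), bits)
```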

8 Recursive Systematic Convolutional Encoding
An RSC encoder can be constructed from a standard convolutional encoder by feeding back one of the outputs.
An RSC encoder has an infinite impulse response.
An arbitrary input will cause a "good" (high-weight) output with high probability.
Some inputs will cause "bad" (low-weight) outputs.
(Encoder diagram: a shift register with two delay elements, constraint length K = 3.)
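The shift-register structure can be sketched in a few lines. This is an illustrative rate-1/2 RSC encoder with constraint length K = 3; the particular tap polynomials (feedback 1 + D + D^2, feedforward 1 + D^2) are an assumption for the example, not read off the slide.

```python
def rsc_encode(bits):
    """Rate-1/2 recursive systematic convolutional encoder, K = 3.
    Returns the systematic stream and the parity stream."""
    s1, s2 = 0, 0                     # two delay elements D, D
    systematic, parity = [], []
    for u in bits:
        fb = u ^ s1 ^ s2              # feedback term (1 + D + D^2)
        parity.append(fb ^ s2)        # feedforward taps (1 + D^2) on the register input
        systematic.append(u)          # systematic output = the input bit
        s2, s1 = s1, fb               # shift the register
    return systematic, parity

sys_bits, par_bits = rsc_encode([1, 0, 1, 1, 0])
print(sys_bits, par_bits)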

9 Why Interleaving and Recursive Encoding?
In a coded system:
 – Performance is dominated by low-weight codewords.
A "good" code:
 – will produce low-weight outputs with very low probability.
An RSC code:
 – produces low-weight outputs with fairly low probability.
 – However, some inputs still cause low-weight outputs.
Because of the interleaver:
 – The probability that both encoders have inputs that cause low-weight outputs is very low.
 – Therefore the parallel concatenation of the two encoders produces a "good" code.

10 Iterative Decoding
There is one decoder for each elementary encoder.
Each decoder estimates the a posteriori probability (APP) of each data bit.
The APPs are used as a priori information by the other decoder.
Decoding continues for a set number of iterations.
 – Performance generally improves from iteration to iteration, but follows a law of diminishing returns.
(Decoder diagram: the demultiplexed systematic and parity data feed Decoder #1 and Decoder #2, which exchange APPs through the interleaver/deinterleaver before the final hard bit decisions.)
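The flow of information between the two decoders can be written schematically as below. This is only a sketch of the message flow: `app_decode` is a stand-in for a real constituent APP decoder (e.g. BCJR/Log-MAP), which is not implemented here, and the LLR sign convention (positive means bit 0) is an assumption.

```python
import numpy as np

def turbo_decode(sys_llr, par1_llr, par2_llr, perm, app_decode, n_iter=8):
    """Schematic iterative turbo decoding: each constituent decoder produces
    extrinsic LLRs that the other decoder uses as a priori information."""
    inv = np.argsort(perm)                    # inverse of the interleaver permutation
    apriori = np.zeros_like(sys_llr)
    ext1 = np.zeros_like(sys_llr)
    for _ in range(n_iter):
        ext1 = app_decode(sys_llr, par1_llr, apriori)            # decoder #1
        ext2 = app_decode(sys_llr[perm], par2_llr, ext1[perm])   # decoder #2 (interleaved domain)
        apriori = ext2[inv]                                      # deinterleave, feed back
    total = sys_llr + ext1 + apriori
    return (total < 0).astype(int)            # hard bit decisions (LLR > 0 -> bit 0)
```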

11 The Turbo-Principle Turbo codes get their name because the decoder uses feedback, like a turbo engine.

12 Performance as a Function of Number of Iterations
(Plot parameters: K = 5, r = 1/2, L = 65,536.)

13 Performance Factors and Tradeoffs
Complexity vs. performance
 – Decoding algorithm.
 – Number of iterations.
 – Encoder constraint length.
Latency vs. performance
 – Frame size.
Spectral efficiency vs. performance
 – Overall code rate.
Other factors
 – Interleaver design.
 – Puncture pattern.
 – Trellis termination.

14 Influence of Interleaver Size
(Plot annotations: latency regimes for voice, video conferencing, replayed video, and data. Parameters: constraint length 5, rate r = 1/2, Log-MAP decoding, 18 iterations, AWGN channel.)

15 Power Efficiency of Existing Standards

16 Turbo Code Summary
Turbo code advantages:
 – Remarkable power efficiency in AWGN and flat-fading channels for moderately low BER.
 – Design tradeoffs suitable for delivery of multimedia services.
Turbo code disadvantages:
 – Long latency.
 – Poor performance at very low BER.
 – Because turbo codes operate at very low SNR, channel estimation and tracking are critical issues.
The principle of iterative or "turbo" processing can be applied to other problems.
 – Turbo multiuser detection can improve the performance of coded multiple-access systems.

17 LDPC Introduction
Low-density parity-check (LDPC) codes
History of LDPC codes
 – Proposed by Gallager in his 1960 MIT Ph.D. dissertation
 – Rediscovered by MacKay and by Richardson/Urbanke in 1999
Features of LDPC codes
 – Performance approaching the Shannon limit
 – Good block error correcting performance
 – Suitable for parallel implementation
Advantages over turbo codes
 – LDPC codes do not require a long interleaver
 – The LDPC error floor occurs at a lower BER
 – LDPC decoding is not trellis based

18 Tanner Graph (1/2)
Tanner graph (a kind of bipartite graph)
 – LDPC codes can be represented by a sparse bipartite graph
   – Si: the message nodes (also called symbol nodes)
   – Ci: the check nodes
   – Because the code generated by G is the null space of H, H · x^T = 0 for every codeword x
   – From this equation we can define relations among the message bits
 – Example
   – n = 7, k = 3, J = 4, λ = 1
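A Tanner graph can be read directly off the parity-check matrix H: check node Ci connects to symbol node Sj exactly when H[i][j] = 1. The small regular H below is made up for illustration; it is not the n = 7 example from the slide.

```python
import numpy as np

# A toy sparse parity-check matrix (rows = check nodes, columns = symbol nodes).
H = np.array([[1, 1, 0, 1, 0, 0],
              [0, 1, 1, 0, 1, 0],
              [1, 0, 0, 0, 1, 1],
              [0, 0, 1, 1, 0, 1]])

# Edges of the bipartite (Tanner) graph: check i -- symbol j wherever H[i, j] = 1.
edges = [(f"C{i}", f"S{j}") for i, j in zip(*np.nonzero(H))]
print(edges)

# Each check node encodes one parity relation: the symbols it touches must XOR to 0.
x = np.array([1, 1, 1, 0, 0, 1])              # a valid codeword of this H
print(H @ x % 2)                              # all-zero syndrome: x satisfies every check
```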

19 Decoding of LDPC Codes
For linear block codes
 – If c is a valid codeword, we have c H^T = 0
 – Otherwise the decoder needs to find the error vector e
Graph-based algorithms
 – Sum-product algorithm for general graph-based codes
 – MAP (BCJR) algorithm for trellis graph-based codes
 – Message-passing algorithm for bipartite graph-based codes
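As a concrete, if simplistic, stand-in for the message-passing decoders listed above, the hard-decision bit-flipping sketch below repeatedly flips the bit involved in the most unsatisfied parity checks until c H^T = 0. Practical LDPC decoders use soft-input sum-product or min-sum message passing instead; the H and error pattern here are illustrative assumptions.

```python
import numpy as np

def bit_flip_decode(H, r, max_iter=50):
    """Hard-decision bit-flipping decoding of a received word r (0/1 array)."""
    c = r.copy()
    for _ in range(max_iter):
        syndrome = H @ c % 2
        if not syndrome.any():            # c H^T = 0: c is a valid codeword
            return c, True
        unsatisfied = syndrome @ H        # per-bit count of failed checks
        c[np.argmax(unsatisfied)] ^= 1    # flip the most suspicious bit
    return c, False

H = np.array([[1, 1, 0, 1, 0, 0],
              [0, 1, 1, 0, 1, 0],
              [1, 0, 0, 0, 1, 1],
              [0, 0, 1, 1, 0, 1]])
sent = np.array([1, 1, 1, 0, 0, 1])                 # a codeword of H
received = sent ^ np.array([0, 0, 0, 1, 0, 0])      # one bit flipped by the channel
decoded, ok = bit_flip_decode(H, received)
print(decoded, ok)                                  # recovers the transmitted codeword
```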

20 Pros and Cons
Advantages
 – Near-capacity performance (approaching Shannon's limit)
 – Some LDPC codes perform better than turbo codes
 – Trellis diagrams for long turbo codes become very complex and computationally elaborate
 – Low error floor
 – Decoding in the log domain is quite fast
Disadvantages
 – Long time to converge to a good solution
 – Very long codeword lengths are needed for good decoding efficiency
 – Iterative convergence is slow
   – Takes ~1000 iterations to converge under standard conditions
 – For this reason the overall transmission time (encoding, transmission, and decoding) increases
 – Hence a large initial latency
   – A (4086,4608) LDPC codeword has a latency of almost 2 hours

21 Fountain Code
The sender sends a potentially limitless stream of encoded bits.
Receivers collect bits until they are reasonably sure that they can recover the content from the received bits, then send STOP feedback to the sender.
Automatic adaptation: receivers with a larger loss rate need longer to receive the required information.
Ideally, each receiver is able to recover the content from the minimum possible amount of received data, and to do so efficiently.
(Diagram: original content -> encoding engine -> encoded packets -> transmission; users reconstruct the original content as soon as they receive enough packets.)
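A minimal sketch of the encoding side of an LT-style fountain code. The uniform degree distribution and the integer-bitmask blocks are simplifying assumptions for illustration; real LT/Raptor codes use a robust soliton degree distribution, and the decoder (not shown) peels degree-1 packets.

```python
import random

def lt_encode_packet(source_blocks, rng):
    """Emit one rateless encoded packet: the XOR of a random subset of source blocks."""
    k = len(source_blocks)
    degree = rng.randint(1, k)                 # toy uniform degree distribution
    neighbors = rng.sample(range(k), degree)   # which source blocks are combined
    payload = 0
    for i in neighbors:
        payload ^= source_blocks[i]
    return neighbors, payload                  # the neighbor list is also conveyed (or seeded)

rng = random.Random(7)
source = [0b1010, 0b0111, 0b1100, 0b0001]      # four source blocks
stream = [lt_encode_packet(source, rng) for _ in range(8)]   # limitless in principle
print(stream)
```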

22 MIMO Model
(Channel model figure; T is the time index and W is the noise.)
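The slide's notation suggests the usual narrowband MIMO signal model y[t] = H s[t] + w[t]. The sketch below writes that standard model out under assumed dimensions (2x2, complex Gaussian channel and noise); it is not a reconstruction of the slide's figure.

```python
import numpy as np

rng = np.random.default_rng(1)
n_t, n_r = 2, 2                                  # transmit / receive antennas

# Flat-fading channel matrix H and additive noise w at one time index t.
H = (rng.standard_normal((n_r, n_t)) + 1j * rng.standard_normal((n_r, n_t))) / np.sqrt(2)
s = np.array([1 + 0j, -1 + 0j])                  # one symbol per transmit antenna
w = 0.1 * (rng.standard_normal(n_r) + 1j * rng.standard_normal(n_r))

y = H @ s + w                                    # received vector: y[t] = H s[t] + w[t]
print(y)
```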

23 Alamouti Space-Time Code
Transmitted signals are orthogonal => simplified receiver.
Redundancy in time and space => diversity.
Diversity gain equivalent to maximum ratio combining => smaller terminals.
Transmission matrix:
            Antenna 1   Antenna 2
 Time n        d0          d1
 Time n+T     -d1*         d0*
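A sketch of the Alamouti scheme for a single receive antenna, following the transmit table above. The combining step assumes the channel gains h0 and h1 are constant over the two slots and perfectly known at the receiver; the test values are arbitrary.

```python
import numpy as np

def alamouti_encode(d0, d1):
    """Rows = time slots (n, n+T), columns = antennas (1, 2)."""
    return np.array([[d0,            d1],
                     [-np.conj(d1),  np.conj(d0)]])

def alamouti_combine(r0, r1, h0, h1):
    """Linear combining; yields (|h0|^2 + |h1|^2) * d_i plus noise."""
    d0_hat = np.conj(h0) * r0 + h1 * np.conj(r1)
    d1_hat = np.conj(h1) * r0 - h0 * np.conj(r1)
    return d0_hat, d1_hat

# Quick check with a random flat-fading channel and no noise.
rng = np.random.default_rng(3)
h0, h1 = rng.standard_normal(2) + 1j * rng.standard_normal(2)
d0, d1 = 1 + 1j, -1 + 1j
X = alamouti_encode(d0, d1)
r0 = h0 * X[0, 0] + h1 * X[0, 1]          # received at time n
r1 = h0 * X[1, 0] + h1 * X[1, 1]          # received at time n+T
gain = abs(h0) ** 2 + abs(h1) ** 2
print(np.allclose(alamouti_combine(r0, r1, h0, h1), (gain * d0, gain * d1)))  # True
```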

24 Space-Time Code Performance
An STBC maps a block of K input symbols onto T output symbols transmitted over n_t transmit antennas (data -> constellation mapper -> STBC), with T >= K.
The code rate is R = K/T.
 – If R = 1 the STBC has full rate.
 – If T = n_t the code has minimum delay.
The detector is linear.

25 BLAST
Bell Labs Layered Space-Time architecture.
V-BLAST was implemented in 1998 by Bell Labs (40 bps/Hz).
Steps for V-BLAST detection (see the sketch below):
 1. Ordering: choosing the best channel
 2. Nulling: using ZF or MMSE
 3. Slicing: making a symbol decision
 4. Cancelling: subtracting the detected symbol
 5. Iteration: going back to the first step to detect the next symbol
(Diagram: symbol-to-antenna/time mapping for V-BLAST versus D-BLAST.)
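The five steps can be written as one short loop. This sketch uses zero-forcing (pseudo-inverse) nulling, ordering by post-nulling noise gain, and nearest-point slicing against a supplied constellation; MMSE nulling would replace the pseudo-inverse, and the 2x2 noise-free test case is an illustrative assumption.

```python
import numpy as np

def vblast_zf_detect(H, y, constellation):
    """Ordered zero-forcing nulling with successive interference cancellation."""
    y = y.astype(complex)
    n_t = H.shape[1]
    s_hat = np.zeros(n_t, dtype=complex)
    remaining = list(range(n_t))              # undetected transmit streams
    while remaining:
        G = np.linalg.pinv(H[:, remaining])   # nulling matrix (ZF)
        k = int(np.argmin(np.sum(np.abs(G) ** 2, axis=1)))          # ordering
        z = G[k] @ y                                                 # nulling
        s = constellation[np.argmin(np.abs(constellation - z))]     # slicing
        idx = remaining.pop(k)
        s_hat[idx] = s
        y = y - H[:, idx] * s                                        # cancelling
    return s_hat

# Toy 2x2 example with QPSK symbols and no noise.
qpsk = np.array([1 + 1j, 1 - 1j, -1 + 1j, -1 - 1j]) / np.sqrt(2)
rng = np.random.default_rng(5)
H = rng.standard_normal((2, 2)) + 1j * rng.standard_normal((2, 2))
s = qpsk[[0, 3]]
print(vblast_zf_detect(H, H @ s, qpsk), s)
```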

26 Trellis Coded Modulation
 1. Combines encoding and modulation (using Euclidean distance only).
 2. Allows parallel transitions in the trellis.
 3. Gives significant coding gain (3 to 4 dB) without bandwidth compromise.
 4. Has the same complexity (same amount of computation, same decoding time, and same amount of memory needed).
 5. Has great potential for fading channels.
 6. Widely used in modems.

27 Set Partitioning
 1. Branches diverging from the same state must have the largest distance.
 2. Branches merging into the same state must have the largest distance.
 3. Codes should be designed to maximize the length of the shortest error-event path for fading channels (equivalent to maximizing diversity).
 4. By satisfying the above criteria, the coding gain can be increased.
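The distance growth behind set partitioning can be checked numerically. The 8-PSK constellation below is an assumed example (not necessarily the one on the slide); each partition step increases the minimum intra-subset Euclidean distance, here 0.765 -> 1.414 -> 2.0 for unit-energy points.

```python
import numpy as np

pts = np.exp(2j * np.pi * np.arange(8) / 8)        # unit-energy 8-PSK constellation

def min_dist(subset):
    """Minimum pairwise Euclidean distance within a subset of points."""
    return min(abs(a - b) for i, a in enumerate(subset) for b in subset[i + 1:])

print(min_dist(pts))          # full 8-PSK:         2*sin(pi/8) ~ 0.765
print(min_dist(pts[::2]))     # one 4-point subset: sqrt(2)     ~ 1.414
print(min_dist(pts[::4]))     # one 2-point subset:               2.0
```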

28 Coding Gain
About 3 dB.

29 Bit-Interleaved Coded Modulation
Coded bits are interleaved prior to modulation.
The performance of this scheme is quite desirable, and it is relatively simple (from a complexity standpoint) to implement.
(Chain: binary encoder -> bitwise interleaver -> M-ary modulator -> channel -> soft demodulator -> bitwise deinterleaver -> soft decoder.)
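A sketch of the transmit side of the chain (encoder output -> bitwise interleaver -> M-ary mapping). M = 4 (Gray-mapped QPSK) is chosen arbitrarily for the example, and random bits stand in for an actual encoder output.

```python
import numpy as np

def bicm_modulate(coded_bits, perm, mapping):
    """Bitwise-interleave the coded bits, then map bit pairs to QPSK symbols."""
    interleaved = coded_bits[perm]
    pairs = interleaved.reshape(-1, 2)
    return mapping[pairs[:, 0] * 2 + pairs[:, 1]]

rng = np.random.default_rng(0)
coded_bits = rng.integers(0, 2, 16)                       # stand-in for encoder output
perm = rng.permutation(coded_bits.size)                   # bitwise interleaver
qpsk = np.array([1 + 1j, -1 + 1j, 1 - 1j, -1 - 1j]) / np.sqrt(2)   # Gray map: each bit drives one axis
print(bicm_modulate(coded_bits, perm, qpsk))
```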

30 BICM Performance
(Plot: minimum Eb/No in dB versus code rate R, comparing CM and BICM for M = 2, 4, 16, and 64, where M is the modulation alphabet size. AWGN channel, noncoherent detection.)

