Turbo Codes
Azmat Ali Pasha
Goals of Presentation
- Why coding, error correction, etc.?
- Basic terms and concepts
- Methods of handling noise
Error Control Coding / Channel Coding
- What can you do when data is transmitted over a noisy channel?
- Add redundancy to the information
- Use it to check for errors
- Use it to correct errors
Transmission
- Data is digitally recorded and compressed
- Data is encoded by error control coding
- Data is modulated from a digital data stream to an analog signal
Reception
- The analog signal is received and demodulated back to a digital signal
- Data is processed in the error control decoder
- Redundancy is used to check for errors and correct them
- Data is decompressed and presented
Transmission Process with Coding
- Transmit chain: application-layer data compression → channel coding (convolutional or turbo coding) → modulation → frequency up-conversion → power amplification
- Receive chain: frequency down-conversion → demodulation → channel decoding (Viterbi or turbo decoding) → application-layer data decompression at the receiver
Sensitivity to Error
- Media, ordered from low to high sensitivity to error: uncompressed voice, uncompressed video, compressed voice, compressed video, data
Repetition Code
- Simple repetition code: information sequence {0 1 0 0 1 1} becomes codeword {00 11 00 00 11 11}
- Code rate = 1/2
- Problems with repetition: bandwidth increase; decreased information rate
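To make the repetition example concrete, here is a minimal Python sketch (not from the slides) that encodes and majority-decodes the rate-1/2 code above; the function names are my own.

```python
def repetition_encode(bits, n=2):
    """Repeat each information bit n times (a rate-1/n repetition code)."""
    return [b for b in bits for _ in range(n)]

def repetition_decode(coded, n=2):
    """Majority-vote each group of n received bits back to one information bit."""
    return [1 if 2 * sum(coded[i:i + n]) > n else 0
            for i in range(0, len(coded), n)]

info = [0, 1, 0, 0, 1, 1]
codeword = repetition_encode(info)           # [0,0, 1,1, 0,0, 0,0, 1,1, 1,1]
assert repetition_decode(codeword) == info   # rate 1/2: twice the bandwidth for the same data
```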
Channel Coding
- When NOT to channel code: when transmitter power is not a constraint, or when there is no noise in the channel
- Best case for channel coding: the Shannon limit (ideal)
- The Shannon limit hasn't been reached yet (turbo codes come the closest)
Code Performance
- Bit Error Rate (BER): the probability of any particular bit being in error in a transmission
- Signal-to-Noise Ratio (SNR): the ratio of signal power to noise power
- Best case: low BER (fewer errors in the final data) at low SNR (less power required)
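As a rough illustration of BER (not taken from the slides), the sketch below flips each transmitted bit with a small probability to mimic channel noise and then measures the fraction of bits that arrive in error; the flip probability and sample size are arbitrary.

```python
import random

def bit_error_rate(sent, received):
    """Fraction of bit positions where the received bit differs from the sent bit."""
    return sum(s != r for s, r in zip(sent, received)) / len(sent)

random.seed(0)
p = 0.05                                              # arbitrary flip probability
sent = [random.randint(0, 1) for _ in range(10_000)]
received = [b ^ (random.random() < p) for b in sent]  # crude stand-in for a noisy channel
print(bit_error_rate(sent, received))                 # should land near p
```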
Coding System Comparison
Error Correction Codes
- Block codes
- Convolutional codes
- Turbo codes: technically a block code, but works like both block and convolutional codes
Block Code
- The most common is the Hamming code
- Take a block of length k (the information sequence) and encode it into a codeword of length n; the last (n - k) bits are called parity bits
- Parity bits are used for error checking and correcting
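As an illustration of the k-information-bits-plus-(n - k)-parity-bits idea, here is a sketch of the classic Hamming(7,4) code, where k = 4 and n = 7; the particular generator matrix is one common textbook convention, not necessarily the one used in the presentation.

```python
# Systematic Hamming(7,4): codeword = 4 data bits followed by 3 parity bits.
G = [
    [1, 0, 0, 0, 1, 1, 0],
    [0, 1, 0, 0, 1, 0, 1],
    [0, 0, 1, 0, 0, 1, 1],
    [0, 0, 0, 1, 1, 1, 1],
]

def hamming_encode(data):
    """Encode 4 information bits into a 7-bit codeword (mod-2 product with G)."""
    return [sum(d * g for d, g in zip(data, col)) % 2 for col in zip(*G)]

print(hamming_encode([1, 0, 1, 1]))   # last 3 bits are the parity bits
```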
Block Code (2D Mapping)
- The higher the minimum weight of the code, the higher the minimum distance between valid codewords
- The higher that distance, the better the decoder performance
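The weight/distance claim can be checked directly. Reusing hamming_encode from the sketch above, this snippet enumerates all 16 Hamming(7,4) codewords and computes the smallest pairwise Hamming distance (which, for a linear code, equals the smallest nonzero codeword weight).

```python
from itertools import product

codewords = [hamming_encode(list(bits)) for bits in product([0, 1], repeat=4)]

def hamming_distance(a, b):
    return sum(x != y for x, y in zip(a, b))

d_min = min(hamming_distance(a, b)
            for i, a in enumerate(codewords)
            for b in codewords[i + 1:])
print(d_min)   # 3 for Hamming(7,4), enough to correct any single-bit error
```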
Convolutional Codes
- Continuous or streaming coding
- Viterbi and Soft-Output Viterbi (SOVA) are the most common decoding algorithms
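A minimal sketch of the shift-register idea behind convolutional encoding, assuming the common rate-1/2, constraint-length-3 code with generator polynomials 7 and 5 (octal); this specific code is my assumption, not something stated in the slides.

```python
def conv_encode(bits, g1=0b111, g2=0b101):
    """Rate-1/2 convolutional encoder: each input bit produces two output bits,
    each a parity of the current bit and the two bits held in the shift register."""
    state = 0                                       # two previous input bits
    out = []
    for b in bits:
        reg = (b << 2) | state                      # current bit + register contents
        out.append(bin(reg & g1).count("1") % 2)    # parity selected by generator g1
        out.append(bin(reg & g2).count("1") % 2)    # parity selected by generator g2
        state = reg >> 1                            # shift, dropping the oldest bit
    return out

print(conv_encode([1, 0, 1, 1]))   # streaming: output is produced pair by pair
```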
Turbo Codes
- A mix between convolutional and block codes
- They require a block of data, HOWEVER, they use shift registers like convolutional codes
Turbo Codes (contd.)
- The most common form is the PCCC (Parallel Concatenated Convolutional Code)
- These produce high-weight codewords
- An interleaver shuffles the input sequence, u_k, in such a way that high-weight codewords are produced
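To show what the interleaver does to u_k, here is a sketch using a pseudo-random permutation; real turbo interleavers are carefully designed, so the random shuffle (and the seed) is purely illustrative.

```python
import random

def make_interleaver(length, seed=42):
    """Build a fixed pseudo-random permutation and its inverse.
    Encoder and decoder must agree on the same permutation."""
    rng = random.Random(seed)
    perm = list(range(length))
    rng.shuffle(perm)
    inverse = [0] * length
    for i, p in enumerate(perm):
        inverse[p] = i
    return perm, inverse

def interleave(seq, perm):
    return [seq[p] for p in perm]

u = [0, 1, 0, 0, 1, 1, 0, 1]
perm, inv = make_interleaver(len(u))
shuffled = interleave(u, perm)          # fed to the second constituent encoder
assert interleave(shuffled, inv) == u   # de-interleaving restores the original order
```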
Turbo Code Decoder
- Requires a soft-output decoder
- Soft output: assign a probability to each decoded bit (e.g., a 1 with 80% likelihood)
- Soft-decision algorithms outperform hard-decision algorithms
- MAP (Maximum A Posteriori) decoding
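To make "soft output" concrete: for BPSK over an AWGN channel, a standard per-bit soft value is the log-likelihood ratio LLR = 2r/sigma^2, whose sign is the hard decision and whose magnitude is the confidence. The sketch below assumes the usual mapping of bit 0 to +1 and bit 1 to -1; it is an illustration, not the decoder from the slides.

```python
import math

def llr_bpsk_awgn(r, sigma):
    """Soft decision for one received sample r: positive favours bit 0 (+1 sent),
    negative favours bit 1 (-1 sent); the magnitude is the confidence."""
    return 2.0 * r / sigma ** 2

def prob_bit_is_zero(llr):
    """Convert an LLR back to P(bit = 0), e.g. ~0.8 means '0 with 80% likelihood'."""
    return 1.0 / (1.0 + math.exp(-llr))

llr = llr_bpsk_awgn(r=0.3, sigma=1.0)   # weak positive sample: probably a 0
print(llr, prob_bit_is_zero(llr))
```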
Iterative Decoding
Turbo Decoding
- The decoder circulates estimates of the sent data the way a turbo engine circulates air
- The cycle continues until certain stopping conditions are met
- Once the decoder is ready, the hard decision is made
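The control flow of this loop is easy to sketch; the snippet below shows only that structure, with a dummy stand-in where a real MAP/BCJR component decoder would go, so it is not a working turbo decoder. All names here are mine, not from the presentation.

```python
def turbo_decode(channel_llrs, perm, inv, component_decoder, iterations=8):
    """Skeleton of iterative ("turbo") decoding: two soft-in/soft-out component
    decoders keep exchanging extrinsic LLRs through the interleaver, then a hard
    decision is taken from the sign of the final LLRs."""
    extrinsic = [0.0] * len(channel_llrs)
    for _ in range(iterations):                                # or stop on convergence
        extrinsic = component_decoder(extrinsic, channel_llrs)             # decoder 1
        extrinsic = [extrinsic[p] for p in perm]                           # interleave
        inter_ch = [channel_llrs[p] for p in perm]
        extrinsic = component_decoder(extrinsic, inter_ch)                 # decoder 2
        extrinsic = [extrinsic[p] for p in inv]                            # de-interleave
    return [0 if c + e >= 0 else 1 for c, e in zip(channel_llrs, extrinsic)]

# Exercise the control flow only: identity interleaver and a dummy component decoder.
llrs = [1.2, -0.4, 0.9, -2.1]
identity = list(range(len(llrs)))
dummy = lambda a_priori, channel: [0.5 * (a + c) for a, c in zip(a_priori, channel)]
print(turbo_decode(llrs, identity, identity, dummy))   # hard decisions: [0, 1, 0, 1]
```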
Error Correction, Old and New
Uses
- Cell phones
- Satellite communication
- Dial-up communication
- RF communication (AutoID? WiFi?)
Questions, Clarifications, and Comments
- Turbo coding method?
- Business implications?
  - Reduced power requirements
  - Higher bandwidth (lower redundancy)