The Cutoff Rate and Other Limits: Passing the Impassable. Richard E. Blahut, University of Illinois at Urbana-Champaign (UIUC).

Shannon's Ideal Channel. The channel model is stationary, discrete, and memoryless; example: the binary memoryless channel.
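The slide's channel-model formulas did not survive the transcript. As a reminder, the standard definitions (not copied from the slide) of a stationary discrete memoryless channel and its simplest binary example are:

```latex
% Stationary discrete memoryless channel (DMC): the same transition
% probabilities act independently at every channel use.
W(y_1,\dots,y_n \mid x_1,\dots,x_n) \;=\; \prod_{i=1}^{n} W(y_i \mid x_i).

% Example: binary symmetric channel with crossover probability p.
W(y \mid x) \;=\;
\begin{cases}
1-p, & y = x,\\
p,   & y \neq x.
\end{cases}
```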

A Large Code. [Figure: a code drawn as a large array of binary digits.]

A convolutional encoder. [Figure: a shift-register encoder with three modulo-2 adders.]
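The encoder figure itself is lost, so here is a minimal Python sketch of a generic rate-1/2 feed-forward convolutional encoder. The constraint length and the generator polynomials (7, 5 in octal) are illustrative assumptions, not read off the slide.

```python
def conv_encode(bits, g1=0b111, g2=0b101, K=3):
    """Rate-1/2 feed-forward convolutional encoder (illustrative generators)."""
    state = 0                                        # shift-register contents
    out = []
    for b in bits:
        state = ((state << 1) | b) & ((1 << K) - 1)  # shift in the new bit
        out.append(bin(state & g1).count("1") % 2)   # XOR of taps selected by g1
        out.append(bin(state & g2).count("1") % 2)   # XOR of taps selected by g2
    return out                                       # two output bits per input bit

print(conv_encode([1, 0, 1, 1]))  # -> [1, 1, 1, 0, 0, 0, 0, 1]
```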

Information theory asserts the existence of good codes. Coding theory wants practical codes and decoders. There are an enormous number of binary codes to choose from.

Brief History of Codes. Algebraic block codes, 1948; Reed-Solomon codes (1960). Convolutional codes, 1954; sequential decoding (1951); Viterbi algorithm (1967). Euclidean trellis codes, 1982. Turbo codes, 1993. Gallager (LDPC) codes (1960).

Decoders: maximum likelihood, maximum block posterior, maximum symbol posterior, typical sequence, iterative posterior, minimum distance, bounded distance.

My View: 1) channel capacity, 2) cutoff rate, 3) critical rate. Distance-based codes, likelihood-based codes, posterior-based codes, polar codes.

Channel Error Exponent. Fact #1: codes exist whose block error probability falls exponentially in the blocklength, P_e <= e^{-n E(R)}. Fact #2: every code satisfies a matching exponential lower bound on P_e. For any fixed rate R < C there is a sequence of codes for which P_e -> 0 exponentially in blocklength; the rate of this sequence does not approach C.
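The bounds themselves were dropped from the transcript. Gallager's random-coding bound, presumably what Fact #1 refers to, reads as follows in its standard form (my notation, rates in nats):

```latex
P_e \;\le\; e^{-n E_r(R)},
\qquad
E_r(R) \;=\; \max_{0 \le \rho \le 1}\;\max_{Q}\;\bigl[\,E_0(\rho,Q) - \rho R\,\bigr],
\qquad
E_0(\rho,Q) \;=\; -\ln \sum_{y}\Bigl(\sum_{x} Q(x)\,P(y \mid x)^{1/(1+\rho)}\Bigr)^{1+\rho}.
```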

E(R). [Figure: plot of the error exponent E(R) versus the rate R.]

A sequence of codes drawn from a set of ensembles.

Channel Capacity, Channel Critical Rate, Channel Cutoff Rate.
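The defining formulas were lost in the transcript. The usual textbook definitions of the three quantities, stated in terms of the Gallager function E_0(rho, Q) above (my notation, not necessarily the slide's), are:

```latex
C \;=\; \max_{Q} I(Q;P),
\qquad
R_0 \;=\; \max_{Q} E_0(1,Q) \;=\; -\ln\,\min_{Q}\sum_{y}\Bigl(\sum_{x} Q(x)\sqrt{P(y \mid x)}\Bigr)^{2},
\qquad
R_{\mathrm{crit}} \;=\; \left.\frac{\partial E_0(\rho, Q^{*})}{\partial \rho}\right|_{\rho=1},
```

with Q* the optimizing input distribution; in general R_crit <= R_0 <= C.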

Binary Hypothesis Testing: Type 1 error and Type 2 error.
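The error definitions appeared as formulas on the slide; in the usual notation (an assumption on my part), with hypotheses H_0, H_1 and acceptance region A for H_0:

```latex
\alpha \;=\; \Pr(\text{decide } H_1 \mid H_0) \;=\; P_0(A^{c}) \quad\text{(Type 1 error)},
\qquad
\beta \;=\; \Pr(\text{decide } H_0 \mid H_1) \;=\; P_1(A) \quad\text{(Type 2 error)}.
```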

Binary Hypothesis Testing: a change of notation.

Bounds on E(R). Upper bounds on E(R): the sphere-packing bound and the minimum-distance bound. Lower bounds on E(R): the random-coding bound and the expurgated bound.

Bhattacharyya Distance.
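The slide's formula is missing. The usual definition, its specialization to a binary symmetric channel with crossover probability p, and its link to the binary cutoff rate are (standard results, not copied from the slide):

```latex
d_B(P_0,P_1) \;=\; -\ln \sum_{y} \sqrt{P_0(y)\,P_1(y)},
\qquad
d_B \;=\; -\ln\!\bigl(2\sqrt{p(1-p)}\bigr) \ \text{for the BSC},
```

```latex
R_0 \;=\; 1 - \log_2\!\bigl(1 + e^{-d_B}\bigr) \;=\; 1 - \log_2\!\bigl(1 + 2\sqrt{p(1-p)}\bigr) \ \text{bits per channel use}.
```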

A Code Sequence Approaching Capacity. E(R) is quadratic near C. Let (R_n) be a sequence of rates with R_n -> C. Then P_e(n) <= e^{-n E(R_n)}, so the error probability still goes to zero if n (C - R_n)^2 -> infinity.
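A short worked version of that step, under the assumption stated on the slide that E(R) behaves quadratically near capacity, say E(R) ≈ a (C - R)^2 for some constant a > 0:

```latex
P_e(n) \;\le\; e^{-n E(R_n)} \;\approx\; e^{-a\,n\,(C - R_n)^2} \;\longrightarrow\; 0
\quad\text{whenever}\quad n\,(C - R_n)^2 \;\to\; \infty .
```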

Capacity C: Shannon (1948). Cutoff rate: Jacobs & Berlekamp (1968), Massey (1981), Arikan (1985/1988). Error exponent: Gallager (1965), Forney (1968), Blahut (1972).

Gallager (1965), Forney (1968), Blahut (1972): alternative expressions for the error exponent, where D(.||.) is the Kullback divergence.
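The slide's own expression is lost. One standard divergence form in the spirit of those papers (not necessarily the exact formula shown) is the sphere-packing exponent written with the conditional Kullback divergence:

```latex
E_{\mathrm{sp}}(R) \;=\; \max_{Q}\;\min_{V \,:\, I(Q;V) \le R} D\bigl(V \,\|\, P \mid Q\bigr),
\qquad
D\bigl(V \,\|\, P \mid Q\bigr) \;=\; \sum_{x} Q(x) \sum_{y} V(y \mid x)\,\ln\frac{V(y \mid x)}{P(y \mid x)} .
```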

Forney's List Decoding: the likelihood function and the likelihood ratio.
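The slide's formulas are gone; Forney's (1968) erasure/list decision rule, which compares a likelihood ratio against a threshold, is usually written as follows (standard form, my notation):

```latex
\text{accept message } m \quad\Longleftrightarrow\quad
\frac{p(\mathbf{y} \mid \mathbf{x}_m)}{\sum_{m' \neq m} p(\mathbf{y} \mid \mathbf{x}_{m'})} \;\ge\; e^{nT}.
```

Taking T >= 0 yields erasure decoding; T < 0 allows more than one candidate to pass, which yields list decoding.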

Sequential Decoding. Exponential waiting times make the decoding work exponential in time, so the computation has a Pareto distribution. The expected work is unbounded if R > R_0, and sequential decoding fails if the rate exceeds the cutoff rate. Is maximum likelihood decoding sequential decoding?
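The quantitative statement behind "work unbounded above R_0", as I understand it from the sequential-decoding literature (Jacobs & Berlekamp; an assumption that this matches the slide): the computation W per decoded branch has a Pareto tail whose exponent is tied to the rate, and the mean blows up once that exponent drops to 1, i.e. at the cutoff rate.

```latex
\Pr(W > w) \;\approx\; w^{-\rho},
\qquad
R \;=\; \frac{E_0(\rho)}{\rho}
\quad\Longrightarrow\quad
\mathbb{E}[W] = \infty \ \text{ when } \rho \le 1, \ \text{ i.e. when } R \ge E_0(1) = R_0 .
```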

The Pareto Distribution, described by two Pareto parameters (a scale and a tail exponent).
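The standard two-parameter form (an assumption; the slide's own parameterization is not recoverable from the transcript):

```latex
\Pr(W > w) \;=\; \Bigl(\frac{w}{w_{\min}}\Bigr)^{-\alpha}, \qquad w \ge w_{\min},
```

with scale w_min > 0 and tail exponent alpha > 0; the mean is finite only for alpha > 1.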

The Origin of a Pareto Distribution. Start with an exponential distribution: if X is exponentially distributed, then e^X has a Pareto distribution.
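A quick numerical check of that fact (an assumed illustration, not from the slides): sample an exponential, exponentiate, and compare the empirical tail to the power law w^(-a).

```python
import math
import random

# If X is exponential with rate a, then W = exp(X) has a Pareto tail:
# Pr(W > w) = Pr(X > ln w) = exp(-a * ln w) = w**(-a)   for w >= 1.
a = 1.5
samples = [math.exp(random.expovariate(a)) for _ in range(200_000)]
for w in (2.0, 4.0, 8.0):
    empirical = sum(s > w for s in samples) / len(samples)
    print(f"w={w}: empirical tail {empirical:.4f}  vs  w**(-a) = {w**-a:.4f}")
```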

The Origins of Graph-Based Codes. Brillouin, de Broglie, Shannon; Battail (1987), Hagenauer (1989), Berrou et al. (1993).

Coding Beyond the Cutoff Rate: parallel (Pinsker), hybrid (Jelinek), turbo (Berrou/Glavieux), LDPC (Gallager/Tanner/Wiberg), polar (Arikan).

The Massey Distraction (1981): a quaternary erasure channel splits into two binary erasure channels (QEC -> 2 BEC), and the two BECs together have a larger total cutoff rate than the original QEC.
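A small numerical sketch of that effect (my own illustration; the slide gives no numbers), using the uniform-input cutoff rate of a q-ary erasure channel, R_0 = log2(q / (1 + (q-1) eps)) bits per use:

```python
import math

def r0_erasure(q, eps):
    """Cutoff rate (bits/use) of a q-ary erasure channel, uniform inputs."""
    return math.log2(q / (1 + (q - 1) * eps))

eps = 0.3
r0_qec = r0_erasure(4, eps)        # one use of the quaternary erasure channel
r0_bec = r0_erasure(2, eps)        # one use of a binary erasure channel
print(f"QEC cutoff rate     : {r0_qec:.4f} bits per QEC use")
print(f"2 x BEC cutoff rate : {2 * r0_bec:.4f} bits for two BEC uses")
# Splitting raises the total cutoff rate, while the capacity is
# unchanged: 2*(1 - eps) bits either way.
```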

Performance Measures: bit error rate vs. message error rate.

The Arikan Retraction.

The Arikan Redistraction* (*rhetorical).