Raptor Codes
Amin Shokrollahi, EPFL

Communication on Multiple Unknown Channels
[figure: one sender transmitting to many receivers over erasure channels BEC(p1), BEC(p2), ..., BEC(p6)]

Example: Popular Download

Example: Peer-to-Peer

Example: Satellite

The erasure probabilities are unknown. We want to come arbitrarily close to capacity on each of the erasure channels, with a minimum amount of feedback. Traditional codes do not work in this setting, since their rate is fixed. We need codes that adapt automatically to the erasure rate of the channel.

Fountain Codes. The sender sends a potentially limitless stream of encoded bits. Receivers collect bits until they are reasonably sure that they can recover the content from the received bits, and then send STOP feedback to the sender. Automatic adaptation: receivers with a larger loss rate simply need longer to receive the required information. We want each receiver to be able to recover the content from the minimum possible amount of received data, and to do so efficiently.
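
A minimal Python sketch of the automatic adaptation (with an assumed decoding threshold of k(1+eps) received symbols, used here only for illustration): receivers on worse channels simply listen longer before sending STOP.

```python
import random

def transmissions_until_stop(k, eps, p, rng):
    """Count sender transmissions until a receiver on BEC(p) has
    collected k*(1+eps) encoded symbols, the (assumed) point at which
    it attempts decoding and sends STOP feedback."""
    needed = int(k * (1 + eps))
    received = sent = 0
    while received < needed:
        sent += 1
        if rng.random() > p:        # the symbol survives the erasure channel
            received += 1
    return sent

rng = random.Random(0)
k, eps = 1000, 0.05
for p in (0.1, 0.3, 0.5):
    runs = [transmissions_until_stop(k, eps, p, rng) for _ in range(20)]
    print(f"BEC({p}): ~{sum(runs) / len(runs):.0f} transmissions until STOP")
```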

Universality and Efficiency. [Universality] We want sequences of Fountain Codes for which the overhead is arbitrarily small. [Efficiency] We want per-symbol encoding to run in close to constant time, and decoding to run in time linear in the number of output symbols.

LT-Codes. Invented by Michael Luby in 1998. The first class of universal and almost efficient Fountain Codes. The output distribution has a very simple form, and encoding and decoding are very simple.

LT-Codes. LT-codes use a restricted distribution on F_2^k: fix a distribution (Ω_1, ..., Ω_k) on the weights {1, ..., k}; the distribution on F_2^k is then given by Ω(v) = Ω_w / C(k, w), where w is the Hamming weight of v and C(k, w) is the binomial coefficient. The parameters of the code are (k, Ω).

The LT Coding Process
[figure: choose a weight w from the weight table (a list of weight/probability pairs); choose w random original symbols among the input symbols; XOR them; insert a header identifying the chosen symbols, and send. In the example shown, w = 2 and two random original symbols are XORed.]
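
A minimal Python sketch of this encoding step; the weight table below is a toy example and not a designed distribution such as Luby's robust soliton.

```python
import random

def lt_encode_symbol(data, weights, probs, rng):
    """One LT output symbol: draw a weight w from the weight table,
    choose w distinct input symbols uniformly at random, and XOR them.
    The chosen index set plays the role of the header, so the decoder
    knows which input symbols were combined."""
    w = rng.choices(weights, weights=probs)[0]
    idx = rng.sample(range(len(data)), w)
    value = 0
    for i in idx:
        value ^= data[i]
    return idx, value

rng = random.Random(1)
data = [rng.randrange(256) for _ in range(10)]       # ten one-byte input symbols
weights, probs = [1, 2, 3, 4], [0.1, 0.5, 0.2, 0.2]  # toy weight table
for _ in range(3):
    print(lt_encode_symbol(data, weights, probs, rng))
```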

Raptor Codes
[figure: the input symbols are first encoded with a traditional pre-code; an "LT-light" code then generates the output symbols. The LT decoder recovers all but a δ-fraction of the intermediate symbols, and the pre-code corrects the remaining erasures.]
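
A toy sketch of this pipeline, reusing lt_encode_symbol from the previous sketch. The parity-based pre-code here is a hypothetical stand-in for the traditional erasure pre-code (e.g., an LDPC code) used in practice.

```python
import random

def toy_precode(data, n_parity, rng):
    """Hypothetical stand-in for the traditional pre-code: append
    parity symbols, each the XOR of a few random input symbols.  A
    real Raptor pre-code is chosen so that a small delta-fraction of
    missing intermediate symbols can be corrected."""
    out = list(data)
    for _ in range(n_parity):
        subset = rng.sample(range(len(data)), 3)
        parity = 0
        for i in subset:
            parity ^= data[i]
        out.append(parity)
    return out

rng = random.Random(2)
data = [rng.randrange(256) for _ in range(10)]
intermediate = toy_precode(data, 2, rng)             # pre-code first
weights, probs = [1, 2, 3], [0.1, 0.6, 0.3]          # "LT-light": low average degree
output = [lt_encode_symbol(intermediate, weights, probs, rng)
          for _ in range(15)]                        # then LT-encode
```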

Further Content. How do Raptor Codes designed for the BEC perform on other symmetric channels (with BP decoding)? Information-theoretic bounds, and the fraction of nodes of degrees one and two in capacity-achieving Raptor Codes. Some examples. Applications.

Parameters. Overhead ε: decoding is possible from (1 + ε)·k output symbols. A Raptor code with parameters (k, C, Ω(x)), where C is the pre-code and Ω(x) the output degree distribution, is used over a given channel. Measure the residual error probability as a function of the overhead for a given channel.
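
A sketch of such a measurement on the BEC, using a peeling (BP) decoder together with lt_encode_symbol from the sketch above; the truncated ideal-soliton-like weight table is only illustrative.

```python
import random

def peel_decode(k, received):
    """Peeling (BP) decoder for the erasure setting: repeatedly find an
    output symbol with exactly one unresolved neighbor, recover that
    input symbol, and substitute it into the remaining equations."""
    known = [None] * k
    eqs = [[set(idx), val] for idx, val in received]
    progress = True
    while progress:
        progress = False
        for eq in eqs:
            for i in list(eq[0]):                # substitute known symbols
                if known[i] is not None:
                    eq[0].discard(i)
                    eq[1] ^= known[i]
            if len(eq[0]) == 1:                  # degree one: recovers an input
                i = eq[0].pop()
                known[i] = eq[1]
                progress = True
    return known

rng = random.Random(3)
k = 500
data = [rng.randrange(256) for _ in range(k)]
# truncated ideal-soliton-like table: rho(1) = 1/k, rho(d) = 1/(d(d-1))
weights = list(range(1, 31))
probs = [1.0 / k] + [1.0 / (d * (d - 1)) for d in weights[1:]]
total = sum(probs)
probs = [p / total for p in probs]

for eps in (0.0, 0.05, 0.1, 0.2):
    n = int(k * (1 + eps))
    received = [lt_encode_symbol(data, weights, probs, rng) for _ in range(n)]
    known = peel_decode(k, received)
    failures = sum(v is None for v in known)
    print(f"overhead {eps:.2f}: residual erasure rate {failures / k:.3f}")
```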

Incremental Redundancy Codes Raptor codes are true incremental redundancy codes. A sender can generate as many output bits as necessary for successful decoding. Suitably designed Raptor codes are close to the Shannon capacity for a variety of channel conditions (from very good to rather bad). Raptor codes are competitive with the best LDPC codes under different channel conditions.

Sequences Designed for the BEC. Type: left-regular of degree 4, right Poisson, rate 0.98. Simulations done on AWGN(σ) for various σ.

Sequences Designed for the BEC
[plot: residual error rate versus normalized SNR Eb/N0 for the best designs (so far), with the Turbo code threshold marked for comparison]

Sequences Designed for the BEC. Not too bad, but the quality decreases as the amount of noise on the channel increases. We need to design better distributions. Idea: adapt the Gaussian approximation technique of Chung, Richardson, and Urbanke.

Gaussian Approximation. Assume that the messages sent from input to output nodes are Gaussian, and track the mean of these Gaussians from one round to the next. Degree distributions can be designed using this approximation; however, they do not perform that well. Anything else we can do with this?
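
For reference, a sketch of the one-parameter model behind this technique, in the notation of Chung, Richardson, and Urbanke: a consistent (symmetric) Gaussian LLR message with mean μ has variance 2μ, so tracking BP reduces to a recursion in μ, driven by the function

```latex
% Gaussian approximation: a consistent Gaussian LLR message has density
% N(\mu, 2\mu), so one mean per iteration suffices.  The recursion is
% expressed through
\[
  \phi(\mu) \;=\; 1 \;-\; \frac{1}{\sqrt{4\pi\mu}}
  \int_{-\infty}^{\infty} \tanh\!\left(\frac{u}{2}\right)
  e^{-\frac{(u-\mu)^{2}}{4\mu}}\, du ,
  \qquad \phi(0) = 1 .
\]
```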

Nodes of Degree 2. Take the fixed-point condition of this approximation with equality, differentiate, and compare values at 0; this yields a condition on the fraction of output nodes of degree 2. It can be rigorously proved that this condition is necessary for the error probability of BP to converge to zero.

Nodes of Degree 2. Use the graph induced on the input symbols by the output symbols of degree 2: each such output symbol connects the two input symbols it covers.

Nodes of Degree 2: BEC

Nodes of Degree 2: BEC
[figure: a new output node of degree 2 lands on two input symbols that are already connected in the induced graph, so its value is already determined by previously received symbols. Information loss!]

Fraction of Nodes of Degree 2. If the induced graph has a component of linear size (i.e., a giant component), then the next output node of degree 2 has constant probability of being useless. Therefore, the graph should not have a giant component. Since the degree-2 output symbols contribute about Ω_2·k·(1+ε) random edges on the k input symbols, and a giant component emerges once the number of edges exceeds k/2, capacity-achieving degree distributions must have Ω_2 ≤ 1/(2(1+ε)). On the other hand, if Ω_2 < 1/(2(1+ε)), then the decoding process cannot sustain itself when it starts. So, for capacity-achieving codes: Ω_2 → 1/2 as the overhead ε → 0.
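
A small experiment that makes the threshold visible (a sketch, assuming the edges fall on uniformly random pairs of input symbols): union-find over the induced graph shows the largest component jumping to linear size once Ω_2(1+ε) passes 1/2.

```python
import random

class DSU:
    """Union-find over the k input symbols, tracking component sizes."""
    def __init__(self, k):
        self.parent = list(range(k))
        self.size = [1] * k
    def find(self, x):
        while self.parent[x] != x:
            self.parent[x] = self.parent[self.parent[x]]  # path halving
            x = self.parent[x]
        return x
    def union(self, a, b):
        ra, rb = self.find(a), self.find(b)
        if ra == rb:
            return
        if self.size[ra] < self.size[rb]:
            ra, rb = rb, ra
        self.parent[rb] = ra
        self.size[ra] += self.size[rb]

def largest_component(k, omega2, eps, rng):
    """Degree-2 output symbols induce about omega2*k*(1+eps) random
    edges on the k input symbols; return the largest component size."""
    dsu = DSU(k)
    for _ in range(int(omega2 * k * (1 + eps))):
        a, b = rng.sample(range(k), 2)
        dsu.union(a, b)
    return max(dsu.size[dsu.find(i)] for i in range(k))

rng = random.Random(4)
k = 20000
for omega2 in (0.30, 0.45, 0.50, 0.55, 0.70):
    frac = largest_component(k, omega2, 0.0, rng) / k
    print(f"Omega_2 = {omega2:.2f}: largest component = {frac:.3f} of inputs")
```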

The q-ary symmetric channel (large q). Double verification decoding (Luby-Mitzenmacher): if two output symbols propose the same value for an input symbol and both are correct, then they verify each other; remove all of them from the graph and continue. It can be shown that the number of correct output symbols needs to be at least a certain constant times the number of input symbols. (Joint work with Karp and Luby.)

The q-ary symmetric channel (large q). More sophisticated algorithms use the induced graph: if two input symbols are connected by a correct output symbol, and each of them is also connected to a correct output symbol of degree one, then both input symbols are verified. Remove them from the graph.
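
A toy simulation of this consistency check (a sketch with hypothetical parameters, not the full Luby-Mitzenmacher algorithm): two degree-one observations and the degree-2 symbol connecting them verify each other exactly when their values are consistent, and a false verification requires corrupted values that still agree, which happens with probability on the order of 1/q.

```python
import random

q, p = 256, 0.2            # alphabet size; symbol corruption probability
rng = random.Random(5)

def qsc(x):
    """q-ary symmetric channel: with probability p, replace the symbol
    by a uniformly random different value."""
    if rng.random() < p:
        return x ^ rng.randrange(1, q)
    return x

x1, x2 = rng.randrange(q), rng.randrange(q)
trials, verified, false_verified = 100000, 0, 0
for _ in range(trials):
    o1, o2 = qsc(x1), qsc(x2)   # degree-1 observations of x1 and x2
    o12 = qsc(x1 ^ x2)          # degree-2 output symbol connecting them
    if (o1 ^ o2) == o12:        # consistency check
        verified += 1
        if (o1, o2) != (x1, x2):
            false_verified += 1 # agreement despite corruption (prob ~ 1/q)
print(f"verified: {verified/trials:.3f}, of which false: "
      f"{false_verified/max(verified, 1):.5f}")
```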

The q-ary symmetric channel (large q). More generally: if there is a path consisting of correct edges, and the two terminal nodes are connected to correct output symbols of degree one, then all the input symbols along the path get verified. (More complex algorithms.)

The q-ary symmetric channel (large q). Limiting case: a giant component consisting of correct edges, with two correct output symbols of degree one that "poke" the component. So the ideal distribution "achieves" capacity.

Nodes of Degree 2. What is the fraction of nodes of degree 2 for capacity-achieving Raptor Codes? It turns out that in the limit the necessary condition must hold with equality, which pins down this fraction through an explicit formula involving the distribution of the LLR Z of the channel.

General Symmetric Channels: Mimic Proof. The proof is information-theoretic: if the fraction of nodes of degree 2 is larger by a constant, then the expectation of the hyperbolic tangent of the messages passed from input to output symbols at a given round of BP is larger than a constant. This yields a bound showing that the code cannot achieve capacity.
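
A quick sanity check of this tanh quantity on the erasure channel (a standard computation, not from the slides): the LLR of a BEC(p) is +∞ when the bit arrives and 0 when it is erased, so

```latex
\[
  \mathbb{E}\!\left[\tanh\frac{Z}{2}\right]
  \;=\; (1-p)\cdot 1 \;+\; p\cdot 0
  \;=\; 1-p
  \;=\; \operatorname{Cap}\bigl(\mathrm{BEC}(p)\bigr).
\]
```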

General Symmetric Channels: Mimic Proof. The same style of argument bounds the fraction of nodes of degree one for capacity-achieving Raptor Codes: output nodes of degree one are merely noisy observations of single input symbols, so a code in which a constant fraction of the output nodes have degree one is essentially repeating symbols, and repetition is bounded away from capacity.

A Better Gaussian Approximation. Uses an adaptation of a method of Ardakani and Kschischang. Heuristic: messages passed from input to output symbols are Gaussian, but not vice versa. Find a recursion for the means of these Gaussians, and apply linear programming.
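
To illustrate the linear-programming step on the simplest channel, here is a sketch for the BEC. It assumes the standard fluid/tree-analysis condition (1+ε)·Ω'(x) ≥ −ln(1−x) on [0, 1−δ] for LT decoding, in place of the Gaussian-mean recursion the slide refers to; the constraints are linear in Ω, so a degree distribution minimizing the average degree can be found with an LP solver.

```python
import numpy as np
from scipy.optimize import linprog

# Design variables: Omega_1 .. Omega_D, a probability distribution on
# output degrees.  BP decoding of all but a delta-fraction of the input
# symbols from k(1+eps) output symbols requires (fluid analysis)
#     (1 + eps) * sum_d d * Omega_d * x**(d-1) >= -ln(1 - x)
# for all x in [0, 1 - delta].  We minimize the average output degree.
D, eps, delta = 30, 0.05, 0.05
xs = np.linspace(0.0, 1.0 - delta, 200)
degs = np.arange(1, D + 1)

A_ub = -(1.0 + eps) * degs * xs[:, None] ** (degs - 1)  # -(1+eps)*Omega'(x)
b_ub = np.log(1.0 - xs)                                  # <= ln(1 - x)
c = degs.astype(float)                                   # average degree
A_eq, b_eq = np.ones((1, D)), [1.0]                      # sum Omega_d = 1

res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq,
              bounds=[(0.0, 1.0)] * D)
if res.success:
    omega = res.x
    print("average output degree:", float(degs @ omega))
    top = sorted(enumerate(omega, start=1), key=lambda t: -t[1])[:5]
    print("heaviest degrees:", [(d, round(w, 3)) for d, w in top])
```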

Raptor Codes for any Channel. From the formula for the fraction of nodes of degree 2 we know that it is not possible to design truly universal Raptor codes. However, how does a Raptor code designed for one binary-input memoryless symmetric channel perform on another channel of this type? The question has been answered (joint work with O. Etesami): if a code achieves capacity on the BEC, then its overhead on any binary-input memoryless symmetric channel is no larger than log2(e) − 1 = 0.442...

Conclusions. For LT- and Raptor codes, some decoding algorithms can be phrased directly in terms of subgraphs of the graph induced by the output symbols of degree 2. This leads to a simpler analysis that does not use the tree assumption. For the BEC, and for the q-ary symmetric channel (large q), we obtain essentially the same limiting capacity-achieving degree distribution via the giant-component analysis. An information-theoretic analysis gives the optimal fraction of output nodes of degree 2 for general memoryless symmetric channels. A graph analysis reveals very good degree distributions, which perform very well experimentally.