Code and Decoder Design of LDPC Codes for Gbps Systems Jeremy Thorpe Presented to: Microsoft Research 2002.11.25.

Talk Overview  Introduction (13 slides)  Wiring Complexity (9 slides)  Logic Complexity (7 slides)

Reliable Communication over Unreliable Channels  The channel is the means by which information is communicated from sender to receiver  The sender chooses X  The channel generates Y from the conditional probability distribution P(Y|X)  The receiver observes Y
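As a concrete illustration (not from the original slides), here is a minimal binary symmetric channel in Python: one simple choice of P(Y|X), in which each transmitted bit is flipped independently with a crossover probability p_flip.

```python
import random

def bsc(x_bits, p_flip=0.1):
    """Binary symmetric channel: flips each input bit independently
    with probability p_flip. A minimal stand-in for P(Y|X)."""
    return [bit ^ (random.random() < p_flip) for bit in x_bits]

# The sender chooses X, the channel produces Y, the receiver sees only Y.
x = [0, 1, 1, 0, 1, 0, 0, 1]
y = bsc(x, p_flip=0.1)
print("sent:    ", x)
print("received:", y)
```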

Shannon’s Channel Coding Theorem  Using the channel n times, we can communicate k bits of information with probability of error as small as we like, as long as the rate R = k/n is less than C and n is large enough. C (the capacity) is a number that characterizes any channel.  The same is impossible if R > C.

The Coding Strategy  The encoder chooses the m-th codeword in codebook C and transmits it across the channel  The decoder observes the channel output y and generates m’ based on knowledge of the codebook C and the channel statistics.

Linear Codes  A linear code C can be defined in terms of either a generator matrix or a parity-check matrix.  Generator matrix G (k×n)  Parity-check matrix H ((n−k)×n)
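For a small worked example (not taken from the slides), the familiar (7,4) Hamming code gives a concrete G and H; every codeword xG satisfies H(xG)ᵀ = 0 over GF(2).

```python
import numpy as np

# Systematic generator and parity-check matrices of the (7,4) Hamming code.
G = np.array([[1, 0, 0, 0, 1, 1, 0],
              [0, 1, 0, 0, 1, 0, 1],
              [0, 0, 1, 0, 0, 1, 1],
              [0, 0, 0, 1, 1, 1, 1]])          # k x n
H = np.array([[1, 1, 0, 1, 1, 0, 0],
              [1, 0, 1, 1, 0, 1, 0],
              [0, 1, 1, 1, 0, 0, 1]])          # (n-k) x n

# Every row of G (hence every codeword) satisfies all parity checks.
assert not (G @ H.T % 2).any()

msg = np.array([1, 0, 1, 1])
codeword = msg @ G % 2
assert not (H @ codeword % 2).any()
print("codeword:", codeword)
```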

Regular LDPC Codes  LDPC codes are linear codes defined in terms of H.  The number of ones in each column of H is a fixed number λ.  The number of ones in each row of H is a fixed number ρ.  Typical parameters for regular LDPC codes are (λ,ρ)=(3,6).

Graph Representation of LDPC Codes  H is represented by a bipartite graph.  Nodes v (degree λ) on the left represent variables.  Nodes c (degree ρ) on the right represent parity-check equations.
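A minimal sketch of how such a regular bipartite graph can be generated at random (an illustration only; the function name and the socket-permutation model are assumptions, and the talk's own construction comes later via simulated annealing):

```python
import numpy as np

def random_regular_ldpc(n, lam=3, rho=6, seed=0):
    """Random regular LDPC parity-check matrix: every column has weight lam,
    every row has weight rho. A minimal sketch; real constructions also
    avoid repeated edges and short cycles."""
    assert (n * lam) % rho == 0
    m = n * lam // rho                      # number of check nodes
    rng = np.random.default_rng(seed)
    # Each variable node gets lam "sockets", each check node gets rho sockets;
    # a random matching of sockets defines the edges.
    var_sockets = np.repeat(np.arange(n), lam)
    chk_sockets = np.repeat(np.arange(m), rho)
    rng.shuffle(chk_sockets)
    H = np.zeros((m, n), dtype=int)
    H[chk_sockets, var_sockets] = 1         # parallel edges, if any, get merged
    return H

H = random_regular_ldpc(n=12, lam=3, rho=6)
print("column weights:", H.sum(axis=0))     # ~3 each
print("row weights:   ", H.sum(axis=1))     # ~6 each
```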

Message-Passing Decoding of LDPC Codes  Message Passing (or Belief Propagation) decoding is a low-complexity algorithm which approximately answers the question “what is the most likely x given y?”  MP recursively defines messages m_{v,c}^{(i)} and m_{c,v}^{(i)} from each variable node v to each adjacent check node c (and from c to v), for iterations i=0,1,...

Two Types of Messages...  Likelihood Ratio  For y_1,...,y_n independent conditionally on x:  Probability Difference  For x_1,...,x_n independent:

...Related by the Bilinear Transform  Definition:  Properties:
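The formulas on these two slides were figures in the original presentation and did not survive transcription; the standard forms they refer to are, in LaTeX (a reconstruction under the convention λ = P(·|x=0)/P(·|x=1) and δ = P(x=0) − P(x=1)):

```latex
% Likelihood ratio of the observations (conditionally independent given x):
\lambda(y_i) = \frac{P(y_i \mid x = 0)}{P(y_i \mid x = 1)},
\qquad
\lambda(y_1,\dots,y_n) = \prod_{i=1}^{n} \lambda(y_i).

% Probability difference of independent bits, and of their mod-2 sum:
\delta(x_i) = P(x_i = 0) - P(x_i = 1),
\qquad
\delta(x_1 \oplus \cdots \oplus x_n) = \prod_{i=1}^{n} \delta(x_i).

% Bilinear transform between the two representations:
\delta = \frac{\lambda - 1}{\lambda + 1},
\qquad
\lambda = \frac{1 + \delta}{1 - \delta}.
```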

Variable to Check Messages  On any iteration i, the message from v to c is the product of the channel likelihood ratio at v and the incoming check-to-variable messages from all checks adjacent to v except c.

Check to Variable Messages  On any iteration, the message from c to v is obtained by converting the incoming messages from all variables adjacent to c except v to probability differences, taking their product, and converting the result back to a likelihood ratio via the bilinear transform.

Decision Rule  After sufficiently many iterations, return the likelihood ratio formed from the channel message at v and all of the incoming check-to-variable messages at v.
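Putting the three preceding rules together, the sketch below shows one possible implementation of the decoder in the likelihood-ratio domain. It is an illustrative reimplementation rather than the talk's code; the function name, the dense inner loops, and the small constant added to avoid division by zero are all assumptions.

```python
import numpy as np

def decode_mp(H, lam_ch, num_iters=50):
    """Message-passing (belief-propagation) decoding sketch.

    H       : (m, n) binary parity-check matrix
    lam_ch  : length-n channel likelihood ratios P(y_v | x_v=0) / P(y_v | x_v=1)
    returns : hard-decision estimate of x
    """
    m, n = H.shape
    edges = [(c, v) for c in range(m) for v in range(n) if H[c, v]]
    m_vc = {e: lam_ch[e[1]] for e in edges}   # variable-to-check messages (LR), keyed by edge (c, v)
    m_cv = {e: 1.0 for e in edges}            # check-to-variable messages (LR)

    for _ in range(num_iters):
        # Check-to-variable: probability differences of the other neighbours multiply.
        for (c, v) in edges:
            delta = 1.0
            for v2 in range(n):
                if H[c, v2] and v2 != v:
                    t = m_vc[(c, v2)]
                    delta *= (t - 1.0) / (t + 1.0)        # bilinear transform: LR -> difference
            m_cv[(c, v)] = (1.0 + delta) / (1.0 - delta + 1e-12)  # back to LR

        # Variable-to-check: channel LR times the other incoming check messages.
        for (c, v) in edges:
            lam = lam_ch[v]
            for c2 in range(m):
                if H[c2, v] and c2 != c:
                    lam *= m_cv[(c2, v)]
            m_vc[(c, v)] = lam

    # Decision rule: total likelihood ratio at each variable node.
    lam_tot = np.array(lam_ch, dtype=float)
    for (c, v) in edges:
        lam_tot[v] *= m_cv[(c, v)]
    return (lam_tot < 1.0).astype(int)        # LR < 1 favours x = 1
```

In a real decoder the edge dictionaries would be replaced by adjacency lists or fixed wiring, but the three loops mirror the variable-to-check, check-to-variable, and decision-rule slides directly.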

Theorem about MP Algorithm  If the algorithm stops after r iterations, then it returns the maximum a posteriori probability estimate of x_v given the observations y within radius r of v.  However, this holds only if the variables within radius r of v are constrained only by the equations within radius r of v (i.e. the radius-r neighborhood of v is cycle-free).

Wiring Complexity

Physical Implementation (VLSI)  We have seen that the MP decoding algorithm for LDPC codes is defined in terms of a graph  Nodes perform local computation  Edges carry messages from v to c, and c to v  Instantiate this graph on a chip! Edges → wires, Nodes → logic units

Complexity vs. Performance  Longer codes provide: more efficient use of the channel (e.g., less power over the AWGN channel), and faster throughput for fixed technology and decoding parameters (number of iterations).  Longer codes demand: more logic resources, and far more wiring resources.

The Wiring Problem  The number of edges in the graph grows like the number of nodes n.  The typical length of an edge in a randomly constructed graph also grows with n (on the order of the chip's linear dimension). (Figure: a random graph.)

Graph Construction?  Idea: find a construction that has low wire-length and maintains good performance...  Drawback: it is difficult to construct any graph that matches the performance of a random graph.

A Better Solution:  Use an algorithm which generates a graph at random, but with a preference for: Short edge length Quantities related to code performance

Conventional Graph Wisdom  Short loops give rise to dependent messages (which are assumed to be independent) after a small number of iterations, and should be avoided.

Simulated Annealing!  Simulated annealing approximately minimizes an Energy Function over a Solution space.  Requires a good way to traverse the solution space.

Generating LDPC graphs with Simulated Annealing  Define an energy function with two components: wire length and loopiness.  Traverse the solution space by picking two edges at random and swapping a pair of their endpoints, accepting or rejecting the move according to the annealing rule (see the sketch below).
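A minimal sketch of this move, under stated assumptions (the energy callable, step count, and geometric cooling schedule are illustrative; the talk does not specify them): pick two edges, swap their check-node endpoints so that all node degrees are preserved, and accept the swap with the Metropolis rule.

```python
import math
import random

def anneal_graph(edges, energy, steps=100_000, T0=1.0, cooling=0.9999):
    """Degree-preserving simulated annealing over a Tanner graph.

    edges  : list of (v, c) pairs defining the bipartite graph
    energy : callable mapping an edge list to a real number
             (e.g. a weighted sum of total wire length and 'loopiness')
    """
    T = T0
    E = energy(edges)
    for _ in range(steps):
        # Propose: pick two edges and swap their check-node endpoints.
        i, j = random.sample(range(len(edges)), 2)
        (v1, c1), (v2, c2) = edges[i], edges[j]
        proposal = list(edges)
        proposal[i], proposal[j] = (v1, c2), (v2, c1)

        dE = energy(proposal) - E
        # Metropolis rule: always accept improvements, sometimes accept
        # uphill moves, with probability exp(-dE/T).
        if dE <= 0 or random.random() < math.exp(-dE / T):
            edges, E = proposal, E + dE
        T *= cooling                          # geometric cooling schedule
    return edges
```

Because endpoints are only exchanged, every variable node keeps degree λ and every check node keeps degree ρ, so the ensemble stays regular while wire length and short loops are traded off through the energy function.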

Results of Simulated Annealing  The graph on the right has nearly identical performance to the one shown previously. (Figure: a graph generated by simulated annealing.)

Logic Complexity

Complexity of Classical Algorithm  The original algorithm defines messages in terms of arithmetic operations over real numbers.  However, this implies floating-point addition, multiplication, and even division!

A Modified Algorithm  We define a modified algorithm in which each message is the logarithm of the corresponding message in the original scheme.  The channel message λ is similarly replaced by its logarithm.
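In the log domain the variable-node product becomes a sum, and the check-node rule takes the form below, where ℓ_v is the channel log-likelihood ratio at v and φ(x) = −ln tanh(x/2) is its own inverse for x > 0. This is the standard log-domain formulation (a reconstruction; the exact formulas on the slides are not in the transcript).

```latex
% Variable node: products of likelihood ratios become sums of log-likelihood ratios.
m_{v,c}^{(i)} = \ell_v + \sum_{c' \in N(v)\setminus\{c\}} m_{c',v}^{(i-1)}

% Check node: signs multiply, magnitudes combine through
% \phi(x) = -\ln\tanh(x/2), which satisfies \phi(\phi(x)) = x for x > 0.
m_{c,v}^{(i)} = \Bigl(\prod_{v' \in N(c)\setminus\{v\}} \operatorname{sign}\, m_{v',c}^{(i)}\Bigr)\;
                \phi\!\Bigl(\sum_{v' \in N(c)\setminus\{v\}} \phi\bigl(|m_{v',c}^{(i)}|\bigr)\Bigr)
```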

Quantization  Replaced a product by a sum, but now we have a transcendental function φ.  However, if we quantize the messages, we can pre-compute φ for all values!
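As an illustrative sketch (not the talk's implementation; the bit width, range, and mid-level reconstruction points are assumptions), with q-bit uniformly quantized message magnitudes, φ becomes a 2^q-entry lookup table built once, offline:

```python
import math

def build_phi_table(q_bits=4, max_mag=8.0):
    """Precompute phi(x) = -ln(tanh(x/2)) for every quantizer level.

    Message magnitudes are uniformly quantized to 2**q_bits levels on
    (0, max_mag]; phi is then just a table lookup at decode time.
    """
    levels = 2 ** q_bits
    step = max_mag / levels
    table = []
    for k in range(levels):
        x = (k + 0.5) * step                  # reconstruction point of level k
        table.append(-math.log(math.tanh(x / 2.0)))
    return table, step

def phi_quantized(x, table, step):
    """Look up phi(x) for a positive magnitude x, saturating at the last level."""
    k = min(int(x / step), len(table) - 1)
    return table[k]

table, step = build_phi_table(q_bits=4, max_mag=8.0)
print(phi_quantized(0.3, table, step), phi_quantized(5.0, table, step))
```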

Quantized MP Performance  The graph on the following page shows the bit error rate for a regular (3,6) code of length n=10,000 using between 2 and 4 bits of quantization.  (Some of the error floors are predicted by density evolution, some are not.)

Quantization Tradeoffs  A quantizer is characterized by its range and granularity.  For fixed channel quantization: a finely granulated quantizer (Q1) performs well at low SNR. However, the quantizer must be broadened (Q2) to avoid saturation and the resulting error floor.
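To make the range/granularity tradeoff concrete (an illustrative sketch; the parameters are assumptions): for a fixed number of message bits, widening the quantizer's range to avoid saturation necessarily coarsens its step size.

```python
def uniform_quantizer(x, n_bits, max_range):
    """Symmetric uniform quantizer with 2**n_bits levels on [-max_range, +max_range].
    For a fixed n_bits, a larger max_range (Q2) means a coarser step than a
    narrower one (Q1), but it saturates less often."""
    step = 2 * max_range / (2 ** n_bits)
    level = round(x / step)
    max_level = 2 ** (n_bits - 1) - 1
    level = max(-max_level, min(max_level, level))   # clip (saturate) at the ends
    return level * step

# Same 4-bit budget, two different ranges:
print(uniform_quantizer(0.3, 4, max_range=2.0))   # fine granularity, saturates above 2
print(uniform_quantizer(0.3, 4, max_range=8.0))   # coarser step, wider range
```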

Conclusion  There is a tradeoff between logic complexity and performance  Nearly optimal performance (+0.1 dB ≈ ×1.02 power) is achievable with 4-bit messages.  More work is needed to avoid error floors due to quantization.