EEM 467 DIGITAL COMMUNICATIONS
Linear Block Codes
Assist. Prof. Dr. Nuray At
Error Control Coding
Error control coding is the design of codes for the reliable transmission of digital information over a noisy channel. Codes can either correct or merely detect errors: codes that can detect errors are called error-detecting codes, and codes that can correct errors are called error-correcting codes. Error correction is more complex than error detection! Error control codes are classified into block codes and convolutional codes.
Recall that a channel is discrete when the alphabets of X and Y are both finite, and memoryless when the current output depends only on the current input and not on any of the previous inputs; p(y_j | x_i) = P(Y = y_j | X = x_i) is a channel transition probability.
Channel Coding
The channel encoder introduces systematic redundancy into the data stream. The combined objective of the channel encoder and decoder is to minimize the effect of channel noise.
Channel Coding Theorem: Given a DMS X with entropy H(X) and a DMC with capacity C, if H(X) ≤ C, there exists a coding scheme for which the source output can be transmitted over the channel with an arbitrarily small probability of error.
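As a quick numerical illustration of the theorem's condition (not part of the original slides), the sketch below compares the entropy of an assumed binary source with the capacity of an assumed binary symmetric channel; both parameter values are made up for the example.

```python
import math

def binary_entropy(p):
    """Entropy in bits of a binary source that emits 1 with probability p."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

# Hypothetical DMS: binary source emitting 1 with probability 0.11
H_X = binary_entropy(0.11)        # about 0.50 bits/symbol

# Hypothetical DMC: binary symmetric channel with crossover probability 0.05
C = 1 - binary_entropy(0.05)      # about 0.71 bits/channel use

# The channel coding theorem guarantees reliable transmission when H(X) <= C
print(f"H(X) = {H_X:.3f} bits, C = {C:.3f} bits, reliable coding exists: {H_X <= C}")
```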
Block Codes
The data sequence is divided into sequential blocks, each k bits long. Each k-bit block is converted into an n-bit block, where n > k. The resulting code is called an (n, k) block code, and the ratio k/n is called the code rate. For example, a (7, 4) block code maps each 4-bit data block into a 7-bit codeword and has code rate 4/7.
Linear Block Codes
Binary Field: The set K = {0, 1} is a binary field. The binary field has two operations, addition (modulo 2) and multiplication:
Addition: 0 + 0 = 0, 0 + 1 = 1 + 0 = 1, 1 + 1 = 0
Multiplication: 0 · 0 = 0 · 1 = 1 · 0 = 0, 1 · 1 = 1
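The two field operations correspond to bitwise XOR and AND. The small sketch below (an illustration, not from the slides) simply reproduces the tables above in code.

```python
# GF(2) arithmetic: addition is XOR, multiplication is AND
def gf2_add(a, b):
    return a ^ b

def gf2_mul(a, b):
    return a & b

for a in (0, 1):
    for b in (0, 1):
        print(f"{a} + {b} = {gf2_add(a, b)}   {a} * {b} = {gf2_mul(a, b)}")
```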
Linear Codes: Let c1 and c2 be two codewords in C. A code C is called linear if the sum of any two codewords is also a codeword in C. A linear code C must contain the zero codeword.
Hamming Weight and Distance: Let a, b, and c be codewords of length n. The Hamming weight of c, denoted by w(c), is the number of 1's in c. The Hamming distance between a and b, denoted by d(a, b), is the number of positions in which a and b differ.
Thus, the Hamming weight of a codeword c is the Hamming distance between c and the all-zero codeword, that is, w(c) = d(c, 0). Similarly, the Hamming distance can be written in terms of the Hamming weight as d(a, b) = w(a + b).
Minimum Distance: The minimum distance d_min of a linear code C is defined as the smallest Hamming distance between any pair of distinct codewords in C.
Theorem: The minimum distance d_min of a linear code C is the smallest Hamming weight of the nonzero codewords in C.
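A brief sketch of these definitions (illustrative only; the example code, a (3, 1) repetition code, is an assumption chosen for simplicity):

```python
def hamming_weight(c):
    """Number of 1's in a codeword given as a list of bits."""
    return sum(c)

def hamming_distance(a, b):
    """Number of positions in which a and b differ."""
    return sum(x ^ y for x, y in zip(a, b))

# Hypothetical linear code: the (3, 1) repetition code {000, 111}
code = [[0, 0, 0], [1, 1, 1]]

# d(a, b) = w(a + b): the distance equals the weight of the bitwise sum
a, b = code[0], code[1]
assert hamming_distance(a, b) == hamming_weight([x ^ y for x, y in zip(a, b)])

# For a linear code, d_min is the smallest weight of a nonzero codeword
d_min = min(hamming_weight(c) for c in code if any(c))
print("d_min =", d_min)   # 3 for the repetition code
```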
Error Detection and Correction Capabilities: The minimum distance d_min of a linear code C determines the error detection and correction capabilities of C. A linear code C of minimum distance d_min can detect up to t errors iff d_min ≥ t + 1. A linear code C of minimum distance d_min can correct up to t errors iff d_min ≥ 2t + 1.
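For instance, d_min = 3 allows detection of up to 2 errors and correction of 1. The short sketch below (not from the slides) just evaluates these two bounds for a few values of d_min.

```python
def detectable_errors(d_min):
    """Largest t satisfying d_min >= t + 1."""
    return d_min - 1

def correctable_errors(d_min):
    """Largest t satisfying d_min >= 2t + 1."""
    return (d_min - 1) // 2

for d_min in (3, 4, 5, 7):
    print(f"d_min = {d_min}: detects {detectable_errors(d_min)}, corrects {correctable_errors(d_min)} errors")
```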
Generator Matrix: In an (n, k) linear block code C, each codeword c is determined by the k-bit data block d. If the data bits appear in specified locations of c, the code C is called systematic. That is, c_i = d_i for i = 1, 2, ..., k. Here we assume that the first k bits of c are the data bits.
In matrix form, c = dG. Hence, for a systematic code, G = [I_k P], where I_k is the k × k identity matrix and P is a k × (n − k) matrix that generates the parity-check bits. The k × n matrix G is called the generator matrix.
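A minimal encoding sketch, assuming a systematic (7, 4) code; the parity sub-matrix P used here is one illustrative choice and is not taken from the slides.

```python
import numpy as np

# Assumed systematic generator matrix G = [I_k | P] for a (7, 4) code.
# The parity sub-matrix P below is just one illustrative choice.
P = np.array([[1, 1, 0],
              [0, 1, 1],
              [1, 1, 1],
              [1, 0, 1]])
G = np.hstack([np.eye(4, dtype=int), P])

def encode(d):
    """Encode a length-4 data vector d into a length-7 codeword c = dG (mod 2)."""
    return np.mod(np.array(d) @ G, 2)

d = [1, 0, 1, 1]
c = encode(d)
print("codeword:", c)   # the first 4 bits equal the data bits (systematic)
```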
Parity-Check Matrix: Let H denote an m × n matrix defined by H = [P^T I_m], where m = n − k and I_m is the m × m identity matrix. The matrix H is called the parity-check matrix of C. We have GH^T = P + P = 0 (mod 2). Thus, every codeword c = dG satisfies cH^T = dGH^T = 0.
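Continuing the assumed (7, 4) example (again an illustration, not the slides' example), the sketch below forms H = [P^T I_m] and checks that GH^T = 0 over GF(2).

```python
import numpy as np

# Same assumed parity sub-matrix P as in the encoding sketch above
P = np.array([[1, 1, 0],
              [0, 1, 1],
              [1, 1, 1],
              [1, 0, 1]])
G = np.hstack([np.eye(4, dtype=int), P])          # G = [I_k | P]
H = np.hstack([P.T, np.eye(3, dtype=int)])        # H = [P^T | I_m], m = n - k = 3

# Every codeword satisfies c H^T = 0, which follows from G H^T = 0 (mod 2)
print(np.mod(G @ H.T, 2))    # all-zero 4 x 3 matrix
```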
Syndrome Decoding: Let r denote the received word of length n when the codeword c of length n was sent over a noisy channel. Then r = c + e, where e is called the error pattern. Consider first the case of a single error in the ith position, so that e has a 1 in the ith position and 0's elsewhere. Evaluate rH^T as s = rH^T = (c + e)H^T = cH^T + eH^T = eH^T, where s is called the syndrome of r. Using s and noting that eH^T is the ith row of H^T, we can identify the error position by comparing s to the rows of H^T. Note that the zero syndrome indicates that r is a codeword and is presumably correct.
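A sketch of single-error syndrome decoding under the same assumed (7, 4) matrices (illustrative only): compute s = rH^T, match it against the rows of H^T, and flip the corresponding bit.

```python
import numpy as np

# Same assumed (7, 4) matrices as in the sketches above
P = np.array([[1, 1, 0],
              [0, 1, 1],
              [1, 1, 1],
              [1, 0, 1]])
G = np.hstack([np.eye(4, dtype=int), P])
H = np.hstack([P.T, np.eye(3, dtype=int)])

def syndrome_decode(r):
    """Correct at most one error in r by matching the syndrome to a row of H^T."""
    r = np.array(r)                        # work on a copy
    s = np.mod(r @ H.T, 2)                 # syndrome s = r H^T
    if not s.any():
        return r                           # zero syndrome: r is (presumably) a codeword
    for i, row in enumerate(H.T):          # the ith row of H^T corresponds to an error in position i
        if np.array_equal(s, row):
            r[i] ^= 1                      # flip the bit in the identified error position
            break
    return r

c = np.mod(np.array([1, 0, 1, 1]) @ G, 2)  # transmitted codeword
r = c.copy(); r[5] ^= 1                    # single error in position 6 (index 5)
c_hat = syndrome_decode(r)
print("decoded codeword:", c_hat, "recovered data bits:", c_hat[:4])
```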
Example: Consider a linear block code with the following parity-check matrix H. Determine the generator matrix G. Suppose that the received word is r = [ ]. Decode this received word, i.e., find c and d.
The Hamming Codes
Code length: n = 2^m - 1
Number of information symbols: k = 2^m - m - 1
Number of parity symbols: n - k = m
Error-correcting capability: t = 1
The parity-check matrices for binary Hamming codes are quite easy to construct. For a Hamming code of length n = 2^m - 1, construct an m × n matrix H whose columns consist of all nonzero m-tuples. For example, a parity-check matrix for a (15, 11) Hamming code is a 4 × 15 matrix whose columns are the 15 nonzero 4-tuples. The ordering of columns is arbitrary; another arrangement would still define a (15, 11) Hamming code.
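The construction can be expressed compactly in code. The sketch below (an illustration, not from the slides) takes the binary representations of 1 through 2^m - 1 as columns; this ordering is just one of the arbitrary arrangements mentioned above.

```python
import numpy as np

def hamming_parity_check(m):
    """Return an m x (2^m - 1) parity-check matrix whose columns are all nonzero m-tuples."""
    n = 2**m - 1
    # Column j holds the m-bit binary representation of j+1 (one ordering among many)
    cols = [[(j >> i) & 1 for i in range(m - 1, -1, -1)] for j in range(1, n + 1)]
    return np.array(cols, dtype=int).T

H = hamming_parity_check(4)       # parity-check matrix of a (15, 11) Hamming code
print(H.shape)                    # (4, 15)
```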