Channel Coding Part 1: Block Coding

Channel Coding Part 1: Block Coding Doug Young Suh

Physical Layer
To lower the bit error rate: ① increase the transmitting power, or ② reduce the noise power. Both are problems of hardware or operating cost, so an algorithmic (logical) approach is needed!
[Figure: ±1 V binary signal + AWGN → received signal with bit errors (BER: bit error rate)]

Datalink/transport layer
Error control coding by addition of redundancy: block coding, convolutional coding.
Error control in computer networks:
Error detection : ARQ (automatic repeat request)
Error correction : FEC (forward error correction)
Errors and erasures:
Error : 0 → 1 or 1 → 0 at an unknown location
Erasure : packet (frame) loss at a known location

Discrete memoryless channels
Decision between 0 and 1:
Hard decision : simple, but loses information
Soft decision : more complex

Channel model : BSC
BSC (binary symmetric channel).
[Figure: BSC transition diagram with crossover probability p and correct-transition probability 1-p; plot of the BPSK bit error rate p versus SNR in dB]
Bit error rate p of BPSK with Gaussian noise, determined by the noise density N0 and the signal energy Ec.
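How the BSC crossover probability follows from the physical channel can be sketched numerically. This is a minimal sketch in Python, assuming the standard relation p = Q(sqrt(2*Ec/N0)) for BPSK over AWGN; the slide itself only names the quantities Ec and N0.

import math

def q_function(x: float) -> float:
    # Gaussian tail probability Q(x) = P(N(0,1) > x)
    return 0.5 * math.erfc(x / math.sqrt(2))

def bsc_crossover(ec_over_n0_db: float) -> float:
    # BSC crossover probability p for BPSK over AWGN: p = Q(sqrt(2*Ec/N0))
    ec_over_n0 = 10 ** (ec_over_n0_db / 10)
    return q_function(math.sqrt(2 * ec_over_n0))

for snr_db in (0, 3, 6, 9):
    print(f"Ec/N0 = {snr_db} dB -> p = {bsc_crossover(snr_db):.3g}")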

Entropy and codeword
Example) Huffman coding
[Figure: Huffman coding trees built from symbol probabilities such as 1/2, 1/4, 1/8, showing the assigned codewords and the resulting average codeword length]
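The tree construction and the average-codeword-length calculation can be sketched as follows. A minimal sketch, assuming the illustrative probabilities {1/2, 1/4, 1/8, 1/8}; the exact distribution used on the slide is not recoverable from the transcript.

import heapq
import math

def huffman_lengths(probs):
    # Return codeword lengths from a simple Huffman construction.
    # Heap entries: (probability, unique id, symbol indices in the subtree)
    heap = [(p, i, [i]) for i, p in enumerate(probs)]
    heapq.heapify(heap)
    lengths = [0] * len(probs)
    uid = len(probs)
    while len(heap) > 1:
        p1, _, s1 = heapq.heappop(heap)
        p2, _, s2 = heapq.heappop(heap)
        for s in s1 + s2:      # every symbol in the merged subtree gets one bit deeper
            lengths[s] += 1
        heapq.heappush(heap, (p1 + p2, uid, s1 + s2))
        uid += 1
    return lengths

probs = [0.5, 0.25, 0.125, 0.125]   # assumed example distribution
lengths = huffman_lengths(probs)
avg_len = sum(p * l for p, l in zip(probs, lengths))
entropy = -sum(p * math.log2(p) for p in probs)
print("codeword lengths:", lengths)                 # e.g. [1, 2, 3, 3]
print("average length:", avg_len, "entropy:", entropy)   # both 1.75 bits here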

Fine-to-coarse Quantization
Dice vs. coin: each die face has probability 1/6, each coin face 1/2.
Quantization maps {1,2,3} → head and {4,5,6} → tail, e.g. the die sequence 3 5 2 1 5 4 ∙∙∙ becomes H T H H T T ∙∙∙
Effects of quantization: data compression; information is lost, but not all of it.

Entropy and bitrate R
Example) dice: p(i) = 1/6 for i = 1, …, 6
H(X) = Σ (1/6) log2(6) = 2.58 bits
Shannon coding theorem: no error, if H(X) < R(X) = 3 bits.
If R(X) = 2 bits, {00, 01, 10, 11} → {1, 2, {3,4}, {5,6}}.
With received information Y = "even number": H(X|Y) = Σ (1/3) log2(3) = 1.58 bits < R(X|Y) = 2, so if the receiver receives 2 more bits, the outcome is decodable.
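A minimal numeric check of these entropy values in Python, using nothing beyond the probabilities stated on the slide:

import math

def entropy(probs):
    # Shannon entropy in bits
    return -sum(p * math.log2(p) for p in probs if p > 0)

print(entropy([1/6] * 6))   # 2.585 bits -> R(X) = 3 bits per die roll suffices
print(entropy([1/3] * 3))   # 1.585 bits -> given Y = "even", R(X|Y) = 2 bits suffices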

BSC and mutual information
[Figure: BSC transition diagram, inputs X=0, X=1 to outputs Y=0, Y=1 with crossover probability p and correct-transition probability 1-p; plot of H(X|Y), up to 1 bit, as a function of p from p=0 to p=1]
H(X|Y) = - Σ Σ p(x,y) log2 p(x|y) = -p log2(p) - (1-p) log2(1-p)
p=0.1 → 0.47, p=0.2 → 0.72, p=0.5 → 1
Mutual information (channel capacity): I(X;Y) = H(X) - H(X|Y)
p=0.1 → 0.53, p=0.2 → 0.28, p=0.5 → 0
1 bit of transmission delivers I(X;Y) bits of information; H(X|Y) is the loss of information.
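These values are easy to verify. A minimal sketch, assuming equiprobable inputs so that H(X) = 1 bit:

import math

def binary_entropy(p: float) -> float:
    # H(p) = -p*log2(p) - (1-p)*log2(1-p), in bits
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

for p in (0.1, 0.2, 0.5):
    h_loss = binary_entropy(p)     # H(X|Y): information lost on the BSC
    i_xy = 1.0 - h_loss            # I(X;Y) = H(X) - H(X|Y) with H(X) = 1 bit
    print(f"p = {p}: H(X|Y) = {h_loss:.2f}, I(X;Y) = {i_xy:.2f}")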

Probability of non-detected errors
(n,k) code: k data bits plus n-k redundant bits (parity bits, check bits). Code rate r = k/n: information of k bits is delivered by transmission of n bits.
Parity symbol ⊃ parity bit (for RS codes, a byte is a symbol).
Example) (n,k) = (4,3) even-parity error-detection code with bit error rate p = 0.001. An error pattern escapes detection only when an even number of bits are flipped, so the probability of non-detected errors is the probability of 2 or 4 bit errors in the 4-bit block.
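A minimal sketch of that calculation, assuming (as just described) that the undetected patterns are exactly those with an even, non-zero number of bit errors:

from math import comb

def p_undetected_even_parity(n: int, p: float) -> float:
    # Undetected error patterns for a single even-parity bit:
    # an even, non-zero number of flipped bits in the n-bit block
    return sum(comb(n, j) * p**j * (1 - p)**(n - j)
               for j in range(2, n + 1, 2))

print(p_undetected_even_parity(4, 0.001))   # about 6.0e-06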

Trade-offs
Trade-off 1 : Error performance vs. bandwidth. A uses less bandwidth but has a higher error rate than C under the same channel condition.
Trade-off 2 : Coding gain (D-E).
Trade-off 3 : Capacity vs. bandwidth.
[Figure: BER (10^-2, 10^-4, 10^-6) versus Eb/N0 (8, 9, 14 dB) for a coded and an uncoded system, with operating points A, B, C, D, E]

Trade-off : an example
Example) Coded vs. uncoded performance: data rate R = 4800 bps, code (n, k) = (15, 11) correcting t = 1 error. What is the performance of the coding?
Sol) Without coding:
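The style of comparison the slide sets up can be sketched as follows. This is a minimal sketch where the channel bit error probability p is an assumed input; the slide derives it from Eb/N0, which is not shown in the transcript.

from math import comb

def block_error_uncoded(k: int, p: float) -> float:
    # Message error probability without coding: any of the k bits wrong
    return 1 - (1 - p) ** k

def block_error_coded(n: int, t: int, p: float) -> float:
    # Block error probability of a t-error-correcting (n,k) code:
    # more than t channel errors within the n-bit codeword
    return 1 - sum(comb(n, j) * p**j * (1 - p)**(n - j) for j in range(t + 1))

p = 1e-2   # assumed channel bit error probability, for illustration only
print(block_error_uncoded(11, p))    # uncoded 11-bit message
print(block_error_coded(15, 1, p))   # (15,11) code with t = 1

In a full comparison the coded system sees a somewhat worse channel bit error probability than the uncoded one, because the same transmit power is spread over 15/11 times as many bits; that is exactly the trade-off the example illustrates.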

(continued) Trade-off : an example
[Figure: sender stack: higher layer 11 kbps → datalink layer encoding (11 ⇒ 15) → physical layer 15 kbps; receiver stack: physical layer → datalink layer decoding (15 ⇒ 11) → higher layer]
Code performance degrades at low values of Eb/N0: too many errors to be corrected ⇒ Turbo codes.

Linear Block Code
(n, k) code: n is the length of a codeword, k is the number of message bits, n-k is the number of parity bits.
Example) Even parity check is an (8,7) code.
Systematic code : the message bits are left unchanged in the codeword. (The parity-check code is one example of a systematic code.)
GF(2) (GF: Galois field, /galoa/). A "field" is a set of elements closed under its operations; closed means that the result of an operation is also an element of the set.
Ex) The set of positive integers is closed under + and ×, but open (not closed) under - and /.

GF(2) : Galois Field
GF(2) is closed under the following two operations:
addition: 0+0=0, 0+1=1, 1+0=1, 1+1=0
multiplication: 0·0=0, 0·1=0, 1·0=0, 1·1=1
The two operations above are XOR and AND, respectively.
[Figure: block encoder and block decoder]
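A minimal sketch of a block encoder built from these GF(2) operations. The generator matrix G below is one common systematic choice for a Hamming (7,4) code and is an assumption of this sketch, not taken from the slides.

# Systematic generator matrix G = [I_4 | P] for a Hamming (7,4) code (assumed choice)
G = [
    [1, 0, 0, 0, 1, 1, 0],
    [0, 1, 0, 0, 0, 1, 1],
    [0, 0, 1, 0, 1, 1, 1],
    [0, 0, 0, 1, 1, 0, 1],
]

def encode(message, G):
    # Codeword c = m * G over GF(2): multiplication is AND, addition is XOR
    n = len(G[0])
    return [sum(m * G[i][j] for i, m in enumerate(message)) % 2 for j in range(n)]

print(encode([1, 0, 1, 0], G))   # 4 message bits -> 7-bit codeword [1, 0, 1, 0, 0, 0, 1]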

Error-Detecting/correcting capability
Hamming distance: how many bits should be changed to turn one word into the other?
Example) Effect of repetition codes: send (0, 1) by using (000, 111) or (00000, 11111).
Minimum distance d_min determines the maximum error detection capability (d_min - 1) and the maximum error correction capability (⌊(d_min - 1)/2⌋):
d_min = 2 : single error detection
d_min = 3 : double error detection OR single error correction
d_min = 4 : (double error detection AND single error correction) OR (triple error detection)
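A minimal sketch checking these relations for the repetition codes above, using plain exhaustive comparison of the codewords:

from itertools import combinations

def hamming_distance(a: str, b: str) -> int:
    return sum(x != y for x, y in zip(a, b))

def d_min(codewords) -> int:
    # Minimum pairwise Hamming distance of the code
    return min(hamming_distance(a, b) for a, b in combinations(codewords, 2))

for code in (["000", "111"], ["00000", "11111"]):
    d = d_min(code)
    print(code, "d_min =", d, "-> detect", d - 1, "errors, correct", (d - 1) // 2)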

Error-Detecting/correcting capability
Double error correction with k message bits requires 2^(n-k) ≥ 1 + n + C(n,2) (Hamming bound); a code that meets the bound with equality is a perfect code.
Example) Extended Hamming code: (7,4) + one parity bit = (8,4) ⇒ d_min = 4.
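A minimal sketch of the Hamming-bound check; the bound itself is standard, while the slide's own worked numbers are not in the transcript.

from math import comb

def hamming_bound(n: int, k: int, t: int):
    # 2^(n-k) must be >= sum_{i=0}^{t} C(n, i); equality means a perfect code
    spheres = sum(comb(n, i) for i in range(t + 1))
    return 2 ** (n - k), spheres

print(hamming_bound(7, 4, 1))   # (8, 8): equality -> Hamming (7,4) is a perfect code
print(hamming_bound(8, 4, 1))   # (16, 9): bound met with slack -> the extended code is not perfect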

Cyclic code
Cyclic codes ⊂ linear block codes.
For n=7, X^7+1 = (X+1)(X^3+X^2+1)(X^3+X+1) over GF(2).
Generator polynomial g(X) = X^3+X^2+1 or X^3+X+1. Note that X^7+1 = 0 when g(X)=0.
Example) mod operator (remainder operator): 7 mod 3 = 7 % 3 = 1.
Example) Calculate c(X) for m(X) = [1010].
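A minimal sketch of the systematic cyclic encoding c(X) = X^(n-k)·m(X) + [X^(n-k)·m(X) mod g(X)], assuming g(X) = X^3+X+1 is the chosen generator; the slide leaves the choice between the two factors open.

def gf2_mod(dividend, divisor):
    # Remainder of polynomial division over GF(2);
    # polynomials are bit lists, most significant coefficient first
    rem = dividend[:]
    for i in range(len(rem) - len(divisor) + 1):
        if rem[i]:
            for j, d in enumerate(divisor):
                rem[i + j] ^= d
    return rem[-(len(divisor) - 1):]

def cyclic_encode(msg, g):
    # Systematic encoding: shift the message by deg(g), append the remainder as parity
    shifted = msg + [0] * (len(g) - 1)      # X^(n-k) * m(X)
    return msg + gf2_mod(shifted, g)

g = [1, 0, 1, 1]   # g(X) = X^3 + X + 1 (assumed choice)
m = [1, 0, 1, 0]   # m(X) = X^3 + X
print(cyclic_encode(m, g))   # codeword [1, 0, 1, 0, 0, 1, 1]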

Hamming (7,4) Decoding
Example) Make the syndrome table of the Hamming (7,4) code.
[Table: the all-zero error pattern 0 0 0 0 0 0 0 and the seven single-bit error patterns 1 0 0 0 0 0 0 through 0 0 0 0 0 0 1, each with its syndrome]
Example) For Hamming (7,4), find the syndrome and the decoded word for the received words
C1: [1010001] [1110001] [1011001] [1110011]
C2: [1110010] [1110110]
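A minimal sketch of syndrome decoding. The parity-check matrix H below matches the assumed generator matrix in the earlier encoder sketch (H = [P^T | I_3]); the slide's own H is not in the transcript, so the specific syndromes may differ.

# Parity-check matrix H = [P^T | I_3] matching the assumed G = [I_4 | P] above
H = [
    [1, 0, 1, 1, 1, 0, 0],
    [1, 1, 1, 0, 0, 1, 0],
    [0, 1, 1, 1, 0, 0, 1],
]

def syndrome(word, H):
    # s = H * r^T over GF(2)
    return tuple(sum(h * r for h, r in zip(row, word)) % 2 for row in H)

# Syndrome table: map each single-bit error pattern to its error position
table = {}
for pos in range(7):
    e = [0] * 7
    e[pos] = 1
    table[syndrome(e, H)] = pos

def decode(word, H):
    # Correct at most one bit error using the syndrome table
    s = syndrome(word, H)
    if s == (0, 0, 0):
        return word
    corrected = list(word)
    corrected[table[s]] ^= 1
    return corrected

r = [1, 0, 1, 0, 0, 0, 1]   # codeword from the encoder sketch above
r[2] ^= 1                   # introduce a single bit error
print(decode(r, H))         # recovers [1, 0, 1, 0, 0, 0, 1]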

Example) Hamming (7,4) code
b) Parity check polynomial h(X): since X^7+1 = g(X)·h(X), there exists an h(X) which satisfies h(X) = (X^7+1)/g(X). If X is a root of g(X), it is also a root of X^7+1, and every codeword c(X) = m(X)g(X) then satisfies c(X)·h(X) = 0 mod (X^7+1).

Entropy and Hamming (7,4)
k=4, n=7. How many codewords? 2^k = 2^4 = 16. Their entropy? Each codeword has probability P = 1/2^k, so the entropy is k bits per codeword.
Information transmission rate = coding rate r = k/n [information/transmission]: the value of one transmitted bit is k/n.
The code is suitable when I(X;Y) = H(X) - H(X|Y) > k/n = 0.57.
I(X;Y): p=0.1 → 0.53, p=0.2 → 0.28, p=0.5 → 0. One bit of transmission delivers I(X;Y) bits of information.
Suitable at a BER of less than 10%.
[Figure: H(X|Y), up to 1 bit, as a function of p from p=0 to p=1]

Other Block Codes
(1) CRC codes : error detection only, for a long packet, e.g. the CRC-CCITT code.
Open question) How many combinations of non-detectable errors are there for the CRC-12 code used on 100-bit data? What is the probability of the non-detectable errors when the BER is 0.01?
(2) BCH codes (Bose-Chaudhuri-Hocquenghem). For t-error correction: block length n = 2^m - 1, number of message bits k ≥ n - m·t, minimum distance d_min ≥ 2t + 1.
Ex) (n, k, t) with generator polynomial in octal: (7,4,1) g(X)=13, (15,11,1) g(X)=23, (15,7,2) g(X)=721, (15,5,3) g(X)=2467, (255,171,11) g(X)=15416214212342356077061630637.
(3) Reed-Solomon codes : GF(2^m) arithmetic.