CY2G2 Information Theory 5


CY2G2 Information Theory 5

Channel capacity: C = max I(xy), i.e. the maximum information transfer across the channel.

Binary Symmetric Channels (BSC)

Because the noise in the system is random, the probability of an error in '0' equals the probability of an error in '1'. The channel is therefore characterised by a single value p, the binary error probability.

[Channel diagram: input x (transmit) takes values 0 and 1 with probabilities p(0) and p(1) = 1 - p(0); output y (receive); each digit is received in error with probability p.]

Channel capacity of this channel

Mutual information increases as the error rate decreases:

I(xy) = H(y) - H(y|x)

H(y|x) is the backward equivocation (error entropy); for a BSC it equals H(p) = -p log2 p - (1-p) log2(1-p), which is fixed once p is fixed, so I(xy) is maximum when H(y) is maximum. This occurs when p(0) = p(1) = 1/2 at the receiver (output), giving H(y) = 1. Hence

C = 1 - H(p)

Example. Find the capacity of a binary symmetric channel with a binary error probability of 0.125.

C = 1 - H(0.125) = 1 - (0.125 × 3 + 0.875 × 0.193) ≈ 1 - 0.544 = 0.456 bits per binary digit.

[Figures: (a) variation of information transfer with output probability; (b) variation of capacity with error probability.]
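The calculation is easy to reproduce. Below is a minimal Python sketch (the function names binary_entropy and bsc_capacity are my own, not from the slides):

```python
import math

def binary_entropy(p: float) -> float:
    """H(p) = -p*log2(p) - (1-p)*log2(1-p), in bits."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def bsc_capacity(p: float) -> float:
    """Capacity of a binary symmetric channel: C = 1 - H(p)."""
    return 1.0 - binary_entropy(p)

print(bsc_capacity(0.125))  # ~0.4564 bits per binary digit
```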

How can the problem of information loss in a noisy channel be overcome? (a) A physical solution? (b) A system solution: channel coding (purely computational).

Source coding: represent the source information with the minimum number of symbols, on the assumption that the channel is noise-free. When a code is transmitted over a channel in the presence of noise, errors will occur.

Channel coding: represent the source information in a manner that minimises the probability of error in decoding.

Redundancy: add an extra amount of information to compensate for the information loss (compare the temperature control of a room in winter under different outdoor temperatures).

A symbol error is an error at the level of the decision rule: a received code word (some of whose bits may be in error) is decoded as a symbol different from the one originally sent.

The binomial distribution plays an important role in channel coding. A binomial experiment consists of n identical trials; think of coding a symbol as a binary digit sequence (code word), so n is the length of the code word. Each trial has two possible outcomes, S or F, where S occurs with probability p. Here S can be defined as a transmission error (1→0 or 0→1), so p is the bit error rate. The probability of exactly r bit errors in an n-digit code word is

P(r) = C(n, r) p^r (1 - p)^(n - r)
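This binomial term is straightforward to evaluate; here is a minimal sketch (the helper name p_r_errors is illustrative):

```python
from math import comb

def p_r_errors(n: int, r: int, p: float) -> float:
    """Probability of exactly r bit errors in an n-digit code word,
    given an independent bit-error probability p (binomial distribution)."""
    return comb(n, r) * p ** r * (1 - p) ** (n - r)

# e.g. the probability of exactly 1 error in a 5-digit word at p = 0.01
print(p_r_errors(5, 1, 0.01))  # ~0.0480
```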

CY2G2 Information Theory 5

Coding in a noisy channel

Error protection: improve the tolerance of errors, by
-- error detection: indicate the occurrence of errors, or
-- error correction.

Binary coding for error protection. Example: assume a binary symmetric channel with error probability p = 0.01.

1) Coding by repetition. Code A = 00000, B = 11111, and use a majority decision rule: if there are more 0's than 1's, decode as A. Up to 2 errors are then tolerated without producing a symbol error. The binomial probability distribution gives the symbol error probability p(e): a symbol error requires 3 or more bit errors in the 5-digit word, so

p(e) = C(5,3) p^3 (1-p)^2 + C(5,4) p^4 (1-p) + C(5,5) p^5 ≈ 10^-5 for p = 0.01 (see the sketch after this slide).

Recall the redundancy analogy (temperature control of a room in winter under different outdoor temperatures): (i) compensate only as much as necessary, since too much costs energy (transfer speed); (ii) release the heating in a controlled way so that the temperature is maintained over a period of time (channel coding for all possible symbols). Note the difference between source coding and channel coding; they may be regarded simply as two stages of coding.

Key terms: error protection; code word; binary error rate (given, fixed); symbol error rate (depends on the decision rule, via the binomial distribution); binomial distribution; distribution function.
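The sketch referred to above: a minimal Python version of the majority-rule calculation (the function name is illustrative; it simply sums the binomial tail for more than n//2 errors):

```python
from math import comb

def repetition_symbol_error(n: int, p: float) -> float:
    """Symbol error probability of an n-digit repetition code under
    majority decoding: a symbol error needs more than n//2 bit errors."""
    return sum(comb(n, r) * p ** r * (1 - p) ** (n - r)
               for r in range(n // 2 + 1, n + 1))

print(repetition_symbol_error(5, 0.01))  # ~9.85e-06, i.e. about 10^-5
```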

Information rate

R = (log2 M) / n

where M is the number of equiprobable code words and n is the number of binary digits per code word. By Shannon's channel coding theorem, p(e) can be made arbitrarily small (as n → ∞) provided that R < C.

2) Coding by selection of code words

Using 5 digits there are 32 possible code words, but we don't have to use them all.

Two selections (i.e. repetition): A = 00000, B = 11111. This gives R = (log2 2)/5 = 0.2 and, as above, p(e) ≈ 10^-5.

Thirty-two selections: all 32 code words are used, so R = (log2 32)/5 = 1, but any single bit error now produces a symbol error: p(e) = 1 - (1 - p)^5 ≈ 0.049 for p = 0.01.
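A short sketch comparing the two extremes (assuming, as stated above, that with all 32 code words in use any bit error causes a symbol error; names are illustrative):

```python
from math import comb, log2

def info_rate(M: int, n: int) -> float:
    """Information rate R = log2(M)/n for M equiprobable n-digit code words."""
    return log2(M) / n

p = 0.01
# Two selections (repetition): a symbol error needs 3 or more bit errors.
p_e_two = sum(comb(5, r) * p ** r * (1 - p) ** (5 - r) for r in range(3, 6))
# Thirty-two selections: any bit error is a symbol error.
p_e_32 = 1 - (1 - p) ** 5

print(info_rate(2, 5), p_e_two)   # 0.2  ~9.85e-06
print(info_rate(32, 5), p_e_32)   # 1.0  ~0.049
```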

A compromise between the two extremes: 4 selections

We want enough code words to give a reasonable R (here R = (log2 4)/5 = 0.4), with the code words as different as possible to reduce p(e), e.g.

A 00000
B 00111
C 11001
D 11110

Each code word differs from all the others in at least three digit positions. The Hamming distance is the number of digit positions in which a pair of code words differ.
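The "at least three digit positions" claim can be checked mechanically; a minimal sketch (names are my own):

```python
from itertools import combinations

def hamming_distance(a: str, b: str) -> int:
    """Number of digit positions in which two equal-length code words differ."""
    return sum(x != y for x, y in zip(a, b))

code = ["00000", "00111", "11001", "11110"]
print(min(hamming_distance(a, b) for a, b in combinations(code, 2)))  # 3
```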

CY2G2 Information Theory 5

The minimum Hamming distance (MHD) is the smallest Hamming distance over all pairs of code words in the set. For the code above, MHD = 3, so one error can be tolerated: a word received with a single bit error remains closer to its original code word than to any other.
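A sketch of the corresponding decoding rule, minimum-distance (nearest code word) decoding, which is one natural reading of how "one error can be tolerated" is realised:

```python
def hamming_distance(a: str, b: str) -> int:
    return sum(x != y for x, y in zip(a, b))

def decode(received: str, code: list[str]) -> str:
    """Decode to the code word nearest (in Hamming distance) to the
    received word; with MHD = 3 this corrects any single bit error."""
    return min(code, key=lambda w: hamming_distance(received, w))

code = ["00000", "00111", "11001", "11110"]
print(decode("00101", code))  # '00111' (single error in B corrected)
```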

32 Selections              2 Selections    4 Selections
A 00000      Q 10000       00000           00000
B 00001      R 10001       11111           00111
C 00010      S 10010                       11001
D 00011      T 10011                       11110
E 00100      U 10100
F 00101      V 10101
G 00110      W 10110
H 00111      X 10111
I 01000      Y 11000
J 01001      Z 11001
K 01010        11010
L 01011      . 11011
M 01100      , 11100
N 01101      ; 11101
O 01110      : 11110
P 01111      ? 11111