
Chapter 10 Shannon’s Theorem

Shannon's Theorems. First theorem: H(S) ≤ L_n(S^n)/n < H(S) + 1/n, where L_n(S^n) is the average codeword length of an optimal code for the n-th extension S^n of the source. Second theorem: extends this idea to a channel with errors, allowing one to get arbitrarily close to the channel capacity while simultaneously correcting almost all of the errors. Proof: it does so without constructing a specific code; instead it relies on a randomly chosen code.
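As a quick illustration of the first theorem, here is a minimal sketch (not from the slides) that builds a Huffman code for the n-th extension of a toy source and checks H(S) ≤ L_n(S^n)/n < H(S) + 1/n; the source probabilities are assumptions chosen only for the demo.

```python
# Minimal sketch: Huffman-code the n-th extension S^n of an assumed toy source
# and verify H(S) <= L_n/n < H(S) + 1/n.
import heapq
import itertools
import math

def entropy(p):
    """Entropy of a distribution p (dict symbol -> probability), in bits."""
    return -sum(q * math.log2(q) for q in p.values() if q > 0)

def huffman_avg_length(p):
    """Average codeword length of a Huffman code for distribution p."""
    # heap items: (probability, tie-breaker, {symbol: current codeword length})
    heap = [(q, i, {s: 0}) for i, (s, q) in enumerate(p.items())]
    heapq.heapify(heap)
    counter = itertools.count(len(heap))
    while len(heap) > 1:
        q1, _, d1 = heapq.heappop(heap)
        q2, _, d2 = heapq.heappop(heap)
        merged = {s: l + 1 for s, l in {**d1, **d2}.items()}  # one level deeper
        heapq.heappush(heap, (q1 + q2, next(counter), merged))
    _, _, lengths = heap[0]
    return sum(p[s] * l for s, l in lengths.items())

source = {'a': 0.7, 'b': 0.2, 'c': 0.1}      # assumed toy source S
H = entropy(source)
for n in (1, 2, 3, 4):
    # distribution of the n-th extension S^n (i.i.d. blocks of n symbols)
    ext = {blk: math.prod(source[s] for s in blk)
           for blk in itertools.product(source, repeat=n)}
    Ln = huffman_avg_length(ext)
    print(f"n={n}:  H(S)={H:.4f} <= L_n/n={Ln/n:.4f} < H(S)+1/n={H + 1/n:.4f}")
```

The per-symbol length L_n/n squeezes down toward H(S) as n grows, which is exactly the content of the first theorem.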

Review/Example. Choose a decision rule based on maximum likelihood: d(b_1) = a_1; d(b_2) = arbitrary; d(b_3) = a_2. The probability of making a mistake when b_j is received is P(E | b_j) = 1 − P(d(b_j) | b_j). Assuming all source symbols are equally likely, the calculation for the example above is P_E = 1 − (1/3)(…) = 17/… . [Figure: channel diagram with source symbols a_1, a_2, a_3 mapped to received symbols b_1, b_2, b_3.]
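A hedged sketch of this calculation follows. The transition probabilities are assumptions (the original entries did not survive the transcript); they were chosen so that the resulting rule matches the one on the slide (d(b_1) = a_1, a tie at b_2, d(b_3) = a_2) and gives P_E = 17/30, which is consistent with the truncated "17/…" above, but the actual example may have used different numbers.

```python
# Maximum-likelihood decision rule for a discrete channel.
# The channel matrix is an assumption for illustration only.
import numpy as np

# P[i, j] = P(b_j received | a_i sent); rows: a_1..a_3, columns: b_1..b_3
P = np.array([[0.5, 0.3, 0.2],
              [0.2, 0.3, 0.5],
              [0.3, 0.3, 0.4]])

p_a = np.ones(3) / 3                       # source symbols equally likely

# ML rule: d(b_j) = the a_i maximizing P(b_j | a_i)  (ties broken arbitrarily)
decision = P.argmax(axis=0)

# P(correct) = sum_j P(a = d(b_j)) * P(b_j | d(b_j));  P_E = 1 - P(correct)
p_correct = sum(p_a[decision[j]] * P[decision[j], j] for j in range(3))
print("decision rule:", [f"d(b{j+1}) = a{decision[j]+1}" for j in range(3)])
print("P_E =", 1 - p_correct)              # 1 - (1/3)(0.5 + 0.3 + 0.5) = 17/30
```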

Random Codes (10.4). Send an n-bit block code through a binary symmetric channel with P(bit correct) = P and P(bit flipped) = Q, where Q < ½. We have M distinct, equiprobable n-bit blocks A = {a_i : i = 1, …, M}, so I_2(a_i) = log_2 M. Intuitively, each block comes through the channel with n·C bits of information, where C = 1 − H_2(Q). To signal close to capacity we want I_2(a_i) = n(C − ε) for some small ε > 0. Intuitively, roughly 2^{nC} messages can get through the channel, and by increasing n this number can be made arbitrarily large ⇒ we can choose M so that we use only a small fraction of the messages that could get through – redundancy. Excess redundancy gives us the room required to bring the error rate down. For large n, pick the M codewords at random from {0, 1}^n. The possible received words are B = {b_j : |b_j| = n, j = 1, …, 2^n}.
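A small numerical sketch of these quantities; the values of Q, n and ε below are assumptions for illustration.

```python
# BSC capacity C = 1 - H_2(Q) and the number of codewords M = 2^{n(C - eps)}.
import math

def H2(q):
    """Binary entropy function, in bits."""
    return 0.0 if q in (0.0, 1.0) else -q*math.log2(q) - (1-q)*math.log2(1-q)

Q, n, eps = 0.1, 1000, 0.05               # assumed example parameters
C = 1 - H2(Q)                             # capacity of the BSC, bits per use
k = n * (C - eps)                         # log2 of the number of codewords used
print(f"C = {C:.4f} bits/use")
print(f"M = 2^{k:.1f} codewords, out of 2^{n} possible n-bit blocks")
print(f"fraction of blocks used = 2^{k - n:.1f}")
```

The last line makes the "small fraction" remark concrete: the code occupies only a vanishing share of the 2^n blocks, and that slack is the redundancy used to fight errors.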

With high probability, almost all of the a_i will be a certain distance apart (provided M ≪ 2^n). Picture the a_i in n-dimensional Hamming space. As each a_i goes through the channel, we expect nQ bit errors on average. Consider a sphere of radius n(Q + ε′) about each a_i: by the law of large numbers, the probability that the received symbol falls outside this sphere can be made < δ for any given δ > 0 by taking n large enough. Similarly, consider a sphere of radius n(Q + ε′) around each received word b_j: it should contain the symbol a_i that was sent. What is the probability that an uncorrectable error occurs? Either there was too much noise (b_j falls outside the sphere around a_i), or another codeword a′ is also inside the sphere around b_j. [Figures: a sphere of radius n(Q + ε′) about the sent symbol a_i containing the received symbol at expected distance nQ, and a sphere of the same radius about the received symbol b_j containing the sent a_i and, in the bad case, another codeword a′.]
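A short simulation of the law-of-large-numbers claim (Q, n, ε′ and the trial count are assumed values): it estimates how often the received word lands outside the Hamming sphere of radius n(Q + ε′) around the sent word.

```python
# Simulate a BSC and estimate P(d(a, b) > n(Q + eps')).
import random

Q, n, eps_prime, trials = 0.1, 1000, 0.02, 10_000
radius = n * (Q + eps_prime)

outside = 0
for _ in range(trials):
    # number of flipped bits = Hamming distance d(a, b) between sent and received
    errors = sum(random.random() < Q for _ in range(n))
    if errors > radius:
        outside += 1

# Shrinks toward 0 as n grows, for any fixed eps' > 0.
print(f"P(d(a, b) > n(Q + eps')) ~= {outside / trials:.4f}")
```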

Idea. Pick the number of codewords M to be 2^{n(C−ε)}, where C is the channel capacity (the block size n is as yet undetermined and depends on how close, ε, we wish to get to the channel capacity). The number of possible random codes is (2^n)^M = 2^{nM}, each equally likely. Let P_E be the probability of error averaged over all random codes. The idea is to show that P_E → 0; i.e., if we pick a code at random, most of the time it will probably work!
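A toy Monte Carlo sketch of this random-coding idea: draw a random code of size M = 2^{n(C−ε)}, transmit over a BSC, decode to the nearest codeword, and estimate the block error rate. All parameters are assumptions, and n is deliberately tiny so the demo runs quickly; the error rate only becomes small for much larger n, so this illustrates the setup rather than the theorem's conclusion.

```python
# Random code + nearest-neighbour decoding over a BSC.
import math, random

def H2(q):
    return -q*math.log2(q) - (1-q)*math.log2(1-q)

n, Q, eps, trials = 16, 0.05, 0.15, 2000   # assumed toy parameters
C = 1 - H2(Q)
M = max(2, int(2 ** (n * (C - eps))))      # number of codewords

code = [random.getrandbits(n) for _ in range(M)]   # random n-bit codewords

def hamming(x, y):
    return bin(x ^ y).count("1")

errors = 0
for _ in range(trials):
    a = random.choice(code)                        # sent codeword
    noise = sum(1 << i for i in range(n) if random.random() < Q)
    b = a ^ noise                                  # received word
    decoded = min(code, key=lambda c: hamming(c, b))
    errors += (decoded != a)

print(f"M = {M} codewords, estimated block error rate ~= {errors / trials:.3f}")
```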

Proof. Suppose a is what's sent and b is what's received. Let X be a 0/1 random variable representing an error in one position of the channel: X = 0 with probability P and X = 1 with probability Q. So if the error vector is a ⊕ b = (X_1, …, X_n), then d(a, b) = X_1 + … + X_n ≈ nQ (by the law of large numbers). N.B. Q = E{X}; since Q < ½, pick ε′ so that Q + ε′ < ½.
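One standard way to make the law-of-large-numbers step explicit (an assumption about the argument the slide leaves implicit) is Chebyshev's inequality, using Var{X} = PQ:

```latex
% Chebyshev's inequality applied to d(a,b) = X_1 + \dots + X_n,
% with E\{X_i\} = Q and Var\{X_i\} = PQ:
\Pr\left[\, d(a,b) > n(Q + \varepsilon') \,\right]
  \;\le\; \Pr\left[\, \left| \frac{1}{n}\, d(a,b) - Q \right| \ge \varepsilon' \,\right]
  \;\le\; \frac{PQ}{n\,\varepsilon'^{2}}
  \;\longrightarrow\; 0 \quad (n \to \infty).
```

So the probability that the received word falls outside the sphere of radius n(Q + ε′) around a can indeed be made < δ for n large enough.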

Since the other codewords a′ are distributed randomly (uniformly) throughout {0, 1}^n, the chance that some particular codeword lands too close (inside the sphere of radius n(Q + ε′) about b) is the volume of that sphere divided by the volume of the whole space, which by the binomial bound is at most 2^{n·H_2(Q+ε′)} / 2^n (10.5). The chance that any one of the other M − 1 codewords is too close is therefore at most (M − 1) · 2^{−n(1 − H_2(Q+ε′))}. N.B. e = log_2(1/Q − 1) > 0 (since Q < ½), so we can choose ε′ with ε′·e < ε.
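A hedged reconstruction of how these pieces combine (the transcript stops here): with M = 2^{n(C−ε)}, C = 1 − H_2(Q), and the concavity of H_2 giving H_2(Q + ε′) ≤ H_2(Q) + ε′e, where e = log_2(1/Q − 1) is the slope of H_2 at Q, the averaged error probability is bounded as follows.

```latex
% "Too much noise" term plus "another codeword lands inside the sphere
% around b" term, averaged over random codes:
P_E \;\le\; \delta \;+\; (M-1)\, 2^{-n\left(1 - H_2(Q+\varepsilon')\right)}
    \;\le\; \delta \;+\; 2^{\,n(C-\varepsilon)}\, 2^{-n\left(1 - H_2(Q) - \varepsilon' e\right)}
    \;=\; \delta \;+\; 2^{-n\left(\varepsilon - \varepsilon' e\right)} .
```

Because ε′e < ε, the second term goes to 0 as n → ∞, and δ was arbitrary, so the code-averaged error probability can be made as small as we please; in particular, at least one specific code must do at least as well as the average, which is the claim of the second theorem.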