Chapter 8 Channel Capacity

Channel capacity measures the bits of useful information per bit actually sent: the change in entropy going through the channel, i.e., the drop in uncertainty about what was sent. The average uncertainty about the input A is H(A) before receiving and H(A | B) after receiving, so the useful information per symbol is

I(A; B) = H(A) − H(A | B),

and the capacity C is the maximum of I(A; B) over all input distributions p(a). (Section 8.1)
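As a sanity check, here is a minimal Python sketch (all example numbers assumed) that computes I(A; B) = H(A) − H(A | B) from a joint distribution p(a, b):

```python
import numpy as np

def entropy(p):
    """Shannon entropy in bits of a probability vector (0 log 0 = 0)."""
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

def mutual_information(joint):
    """I(A; B) = H(A) - H(A|B) for a joint distribution joint[a, b]."""
    p_a = joint.sum(axis=1)   # marginal distribution of the input A
    p_b = joint.sum(axis=0)   # marginal distribution of the output B
    # H(A | B) = sum over b of p(b) * H(A | B = b)
    h_a_given_b = sum(p_b[j] * entropy(joint[:, j] / p_b[j])
                      for j in range(joint.shape[1]) if p_b[j] > 0)
    return entropy(p_a) - h_a_given_b

# Assumed example: uniform input into a binary symmetric channel with P = 0.9,
# so p(a, b) = p(a) * p(b | a).
P, Q = 0.9, 0.1
joint = 0.5 * np.array([[P, Q],
                        [Q, P]])
print(mutual_information(joint))   # ~0.531 bits (= 1 - H2(0.9), see below)
```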

Uniform Channel (Section 8.2). The channel probabilities do not change from symbol to symbol, i.e., the rows of the transition matrix P(b | a) are permutations of each other. So the conditional entropy of the output for a given input,

W = H(B | a) = Σ_b P(b | a) log2( 1 / P(b | a) ),

is independent of a, and I(A; B) = H(B) − W. Consider no noise: P(b | a) = 1 for some b and 0 for all others, so W = 0 and I(A; B) = H(B) = H(A) (this conforms to intuition only if the transition matrix is a permutation matrix). All noise implies H(B) = W, so I(A; B) = 0.
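A short sketch (transition matrix assumed for illustration) confirming that permuted rows give the same H(B | a) for every a, the common value W:

```python
import numpy as np

def row_entropy(row):
    """H(B | a) in bits for one row p(. | a) of the transition matrix."""
    row = row[row > 0]
    return -np.sum(row * np.log2(row))

# Assumed uniform channel: each row is a permutation of (0.7, 0.2, 0.1).
T = np.array([[0.7, 0.2, 0.1],
              [0.1, 0.7, 0.2],
              [0.2, 0.1, 0.7]])

W_per_row = [row_entropy(r) for r in T]
print(W_per_row)   # identical values (~1.157 bits): this common value is W
# Then I(A; B) = H(B) - W; no noise gives W = 0, all noise gives H(B) = W.
```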

Capacity of the Binary Symmetric Channel (Section 8.5). The BSC delivers a bit correctly with probability P and flips it with probability Q = 1 − P. Writing p = p(a = 0), the output distribution is

p(b = 0) = pP + (1 − p)Q = x,  p(b = 1) = 1 − x,

so I(A; B) = H2(x) − H2(P), where H2 is the binary entropy function. The maximum occurs when x = ½, and then p = ½ also (unless the channel is all noise, P = ½). Hence

C = 1 − H2(P).
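A sketch of the closed form; sweeping the input probability p numerically confirms that the maximum of I(A; B) sits at p = ½ (P = 0.9 assumed):

```python
import numpy as np

def H2(x):
    """Binary entropy in bits."""
    if x in (0.0, 1.0):
        return 0.0
    return -x * np.log2(x) - (1 - x) * np.log2(1 - x)

def bsc_mutual_information(p, P):
    """I(A; B) = H2(x) - H2(P) with x = p*P + (1-p)*Q."""
    Q = 1 - P
    x = p * P + (1 - p) * Q
    return H2(x) - H2(P)

P = 0.9
ps = np.linspace(0.01, 0.99, 99)
best = max(ps, key=lambda p: bsc_mutual_information(p, P))
print(best)        # 0.5: capacity is achieved at p = 1/2
print(1 - H2(P))   # C = 1 - H2(0.9) ~ 0.531 bits
```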

Numerical Examples (Section 8.5). If P = ½ + ε, then C(P) ≈ 3ε² is a good approximation near P = ½ (the exact quadratic coefficient is 2/ln 2 ≈ 2.885).

P       Q       Capacity
0.5     0.5     0 %
0.6     0.4     ~3 %
0.7     0.3     ~12 %
0.8     0.2     ~28 %
0.9     0.1     ~53 %
0.99    0.01    ~92 %
0.999   0.001   ~99 %
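A quick sketch reproducing the table and comparing it with the 3ε² rule of thumb (which, as expected, is only accurate near P = ½):

```python
import math

def H2(x):
    """Binary entropy in bits."""
    return 0.0 if x in (0.0, 1.0) else -x*math.log2(x) - (1-x)*math.log2(1-x)

for P in (0.5, 0.6, 0.7, 0.8, 0.9, 0.99, 0.999):
    eps = P - 0.5
    exact = 1 - H2(P)       # C = 1 - H2(P)
    approx = 3 * eps**2     # the rule of thumb, good only for small eps
    print(f"P={P:<6} C={exact:7.4f}  3*eps^2={approx:7.4f}")
```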

Error Detecting Code (Section 8.3). Use a uniform channel with uniform input: p(a_1) = … = p(a_q). Apply this to n-bit single error detection, with one parity bit among the c_i ∈ {0, 1} and each bit passing through a BSC with transition matrix

P Q
Q P

The input alphabet is the even-parity words a = c_1 … c_n, so |A| = 2^(n−1); the output alphabet is all n-bit words b = c_1 … c_n (any parity), so |B| = 2^n. For blocks of size n, the probability of exactly k errors is C(n, k) P^(n−k) Q^k, and every b ∈ B can be obtained from any a ∈ A by some number k = 0 … n of errors, so:

W = H(B | a) = Σ_{k=0..n} C(n, k) P^(n−k) Q^k log2( 1 / (P^(n−k) Q^k) ).

Expanding log2( 1 / (P^(n−k) Q^k) ) = (n − k) log2(1/P) + k log2(1/Q), the nth term of the first sum and the 0th term of the second sum vanish, and the binomial means E[n − k] = nP and E[k] = nQ give

W = n [ P log2(1/P) + Q log2(1/Q) ] = n H2(P),

i.e., n times the W for one bit. Since |B| = 2^n, H(B) ≤ n, and so I(A; B) = H(B) − W ≤ n [1 − H2(P)].
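A numerical sketch (n and P assumed) verifying the identity W = n · H2(P) directly from the binomial sum:

```python
import math

def H2(x):
    """Binary entropy in bits."""
    return 0.0 if x in (0.0, 1.0) else -x*math.log2(x) - (1-x)*math.log2(1-x)

n, P = 8, 0.9
Q = 1 - P

# W = sum over k of C(n,k) P^(n-k) Q^k * log2( 1 / (P^(n-k) Q^k) )
W = 0.0
for k in range(n + 1):
    prob_pattern = P**(n - k) * Q**k   # one specific pattern of k errors
    W += math.comb(n, k) * prob_pattern * (-math.log2(prob_pattern))

print(W)            # equals n * H2(P)
print(n * H2(P))    # ~3.752 bits for n = 8, P = 0.9
```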

Error Correcting Code (Section 8.4). Triplicate each bit, send the three copies through the noisy channel, and decode by majority vote:

encode (triplicate) → noisy channel × 3 → decode (majority)

Think of this whole pipeline as a new channel. Over three uses of the original BSC:

prob. of no errors = P^3        (decoded correctly)
prob. of 1 error   = 3 P^2 Q    (decoded correctly)
prob. of 2 errors  = 3 P Q^2    (decoding error)
prob. of 3 errors  = Q^3        (decoding error)

Majority voting succeeds with 0 or 1 errors and fails with 2 or 3, so original vs. new, the result is again a BSC with transition matrix

P^3 + 3P^2Q    3PQ^2 + Q^3
3PQ^2 + Q^3    P^3 + 3P^2Q

Let P′ = P^3 + 3 P^2 Q = P^2 (P + 3Q).
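A Monte Carlo sketch (P and trial count assumed) confirming that triplication plus majority voting behaves like a new BSC with P′ = P²(P + 3Q):

```python
import random

def through_bsc(bit, P):
    """Send one bit through a BSC that is correct with probability P."""
    return bit if random.random() < P else 1 - bit

def send_triplicated(bit, P):
    """Encode by triplication, decode by majority vote."""
    received = [through_bsc(bit, P) for _ in range(3)]
    return 1 if sum(received) >= 2 else 0

P, Q = 0.9, 0.1
trials = 200_000
correct = sum(send_triplicated(0, P) == 0 for _ in range(trials))
print(correct / trials)      # Monte Carlo estimate of P'
print(P**3 + 3 * P**2 * Q)   # exact: P' = P^2 (P + 3Q) = 0.972
```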

P       C(P)     P′       C(P′)/3
0.99    92%      0.9997   33%
0.9     53%      0.972    27%
0.8     28%      0.896    17%
0.7     12%      0.784    8%
0.6     3%       0.648    2%
0.51    0.03%    0.515    0.02%

Triplication buys reliability (P′ > P) at the cost of dividing the capacity by 3, since three channel bits now carry one data bit. Shannon's Theorem will say that, unlike this n = 3 repetition code, there are codes that take P′ → 1 while the rate per channel use approaches C(P). (Section 8.4)
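The table above can be reproduced from the two formulas already derived, P′ = P²(P + 3Q) and C = 1 − H2; a sketch:

```python
import math

def H2(x):
    """Binary entropy in bits."""
    return 0.0 if x in (0.0, 1.0) else -x*math.log2(x) - (1-x)*math.log2(1-x)

for P in (0.99, 0.9, 0.8, 0.7, 0.6, 0.51):
    Pp = P * P * (P + 3 * (1 - P))   # P' for the triplication code
    print(f"P={P:<5} C(P)={1-H2(P):6.2%}  P'={Pp:.4f}  C(P')/3={(1-H2(Pp))/3:6.2%}")
```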