Presentation on theme: "Chapter 8 Channel Capacity" — Presentation transcript:

1 Chapter 8 Channel Capacity

2 Channel capacity measures the bits of useful information per bit actually sent: the change in entropy in going through the channel, i.e. the drop in uncertainty about the input. The average uncertainty about the symbol being sent is H(A) before receiving and H(A|B) after receiving, so the information that gets through is

I(A;B) = H(A) − H(A|B),

and the capacity C is the maximum of I(A;B) over all input distributions. (8.1)
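
A minimal Python sketch of this definition (the channel matrix and input distribution below are illustrative choices, not from the slides):

# Mutual information I(A;B) = H(A) - H(A|B) for a discrete channel.
from math import log2

def H(dist):
    # Entropy in bits of a probability distribution given as a list.
    return -sum(p * log2(p) for p in dist if p > 0)

p_a = [0.5, 0.5]                      # input distribution (assumed)
P = [[0.9, 0.1], [0.1, 0.9]]          # P(b|a): binary symmetric channel, P = 0.9
p_b = [sum(p_a[a] * P[a][b] for a in range(2)) for b in range(2)]
joint = [p_a[a] * P[a][b] for a in range(2) for b in range(2)]
# I(A;B) = H(A) + H(B) - H(A,B)
print(H(p_a) + H(p_b) - H(joint))     # ~0.531 bits; cf. C for P = 0.9 on slide 5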

3 Uniform Channel: the channel probabilities do not change from symbol to symbol, i.e. the rows of the channel probability matrix P(b|a) are permutations of each other. So the conditional entropy

W = −Σ_b P(b|a) log₂ P(b|a)

is independent of the input symbol a, and I(A;B) = H(B) − W. Consider no noise: P(b|a) = 1 for some b and 0 for all others. Then W = 0 and I(A;B) = H(B) = H(A) (the last equality conforms to intuition only if the matrix is a permutation matrix). All noise implies I(A;B) = 0, i.e. H(B) = W. (8.2)
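
A quick check of the uniform-channel property (the 3×3 matrix below is made up for illustration):

from math import log2

def H(dist):
    # Entropy in bits; for a channel row this is W = -sum_b P(b|a) log2 P(b|a).
    return -sum(p * log2(p) for p in dist if p > 0)

P = [[0.7, 0.2, 0.1],
     [0.1, 0.7, 0.2],
     [0.2, 0.1, 0.7]]    # rows are permutations of each other: a uniform channel
print([H(row) for row in P])          # the same W for every input symbol a
p_b = [sum(P[a][b] for a in range(3)) / 3 for b in range(3)]   # uniform input
print(H(p_b) - H(P[0]))               # I(A;B) = H(B) - W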

4 Capacity of the Binary Symmetric Channel: 0→0 and 1→1 each with probability P, 0→1 and 1→0 with probability Q = 1 − P. Write p = p(a = 0), so p(a = 1) = 1 − p, and x = p(b = 0) = pP + (1 − p)Q, p(b = 1) = 1 − x. Then

I(A;B) = H₂(x) − H₂(P), where H₂(t) = −t log₂ t − (1 − t) log₂(1 − t).

The maximum occurs when x = ½, which forces p = ½ as well (unless the channel is all noise, P = Q = ½). Hence

C = 1 − H₂(P). (8.5)
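
The capacity formula transcribes directly into code (the function names here are mine):

from math import log2

def h2(t):
    # Binary entropy H2(t) in bits.
    return 0.0 if t in (0.0, 1.0) else -t * log2(t) - (1 - t) * log2(1 - t)

def bsc_capacity(P):
    # C = 1 - H2(P) for the binary symmetric channel.
    return 1 - h2(P)

print(bsc_capacity(0.9))   # ~0.531 bits of useful info per bit sent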

5 Numerical Examples: if P = ½ + ε, then C(P) ≈ 3ε² is a good approximation (the exact coefficient is 2/ln 2 ≈ 2.885).

P        Q        Capacity
0.5      0.5      C = 0%
0.6      0.4      C ≈ 3%
0.7      0.3      C ≈ 12%
0.8      0.2      C ≈ 28%
0.9      0.1      C ≈ 53%
0.99     0.01     C ≈ 92%
0.999    0.001    C ≈ 99%
(8.5)
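
The table and the 3ε² rule of thumb can be checked with a few lines (repeating the capacity formula so the snippet runs on its own):

from math import log2

def bsc_capacity(P):
    # C = 1 + P log2 P + Q log2 Q = 1 - H2(P)
    Q = 1 - P
    return 1 + sum(t * log2(t) for t in (P, Q) if t > 0)

for P in (0.5, 0.6, 0.7, 0.8, 0.9, 0.99, 0.999):
    eps = P - 0.5
    print(P, round(bsc_capacity(P), 3), round(3 * eps**2, 3))
# P = 0.6 gives 0.029 vs. 0.030 -- the approximation is best near P = 1/2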

6 Error Detecting Code: use a uniform channel with uniform input, p(a₁) = … = p(a_q). Apply this to n-bit single-error detection, with one parity bit among the c_i ∈ {0, 1}, each bit sent through the binary symmetric channel with probabilities P and Q:

|A| = 2^(n−1)    a = c₁ … c_n (even parity)
|B| = 2^n        b = c₁ … c_n (any parity)

For blocks of size n, the probability of exactly k errors is C(n, k) P^(n−k) Q^k, and every b ∈ B can be obtained from any a ∈ A by some number of errors k = 0 … n. (8.3)
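
The block-error distribution is binomial; a quick sanity check (n and P chosen arbitrarily):

from math import comb

def p_k_errors(n, k, P):
    # Probability of exactly k bit errors in a block of n, with Q = 1 - P.
    return comb(n, k) * P ** (n - k) * (1 - P) ** k

n, P = 8, 0.9
print(sum(p_k_errors(n, k, P) for k in range(n + 1)))  # 1.0: k = 0 ... n is exhaustive
print(p_k_errors(n, 1, P))                             # chance of a single (detectable) error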

7 For this channel the conditional entropy of an n-bit block is

W = −Σ_{k=0..n} C(n, k) P^(n−k) Q^k log₂(P^(n−k) Q^k)
  = −Σ_k C(n, k) P^(n−k) Q^k [(n − k) log₂ P + k log₂ Q].

In the first sum the n-th term (k = n) contributes 0 and Σ_k (n − k) C(n, k) P^(n−k) Q^k = nP; in the second the 0-th term contributes 0 and Σ_k k C(n, k) P^(n−k) Q^k = nQ. Hence

W = n(−P log₂ P − Q log₂ Q) = n H₂(P),

which is n times the W for one bit. With |B| = 2^n, H(B) ≤ n. (8.3)
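
A numeric check of the identity W = n·H₂(P) (self-contained, with arbitrary n and P):

from math import comb, log2

def W_block(n, P):
    # W = -sum_k C(n,k) P^(n-k) Q^k log2(P^(n-k) Q^k)
    Q = 1 - P
    return -sum(comb(n, k) * P**(n - k) * Q**k * log2(P**(n - k) * Q**k)
                for k in range(n + 1))

def h2(t):
    return -t * log2(t) - (1 - t) * log2(1 - t)

n, P = 8, 0.9
print(W_block(n, P), n * h2(P))   # both ~3.752 bits: W is n times the one-bit W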

8 Error Correcting Code: triplicate. Encode each source bit as three copies, send them through the noisy channel, and decode by majority vote; think of encode → 3× noisy channel → decode as a single new channel.

prob. of no errors = P³      → decoded correctly
prob. of 1 error   = 3P²Q    → decoded correctly
prob. of 2 errors  = 3PQ²    → decoded wrongly
prob. of 3 errors  = Q³      → decoded wrongly

Original vs. new transition probabilities: where the uncoded channel has P and Q, the coded channel has

P′ = P³ + 3P²Q = P²(P + 3Q),  Q′ = 3PQ² + Q³. (8.4)
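
A small Monte Carlo sketch of the triplicated channel (the parameters are arbitrary):

import random

def triplicate_and_decode(bit, P, rng):
    # Send three copies through a BSC that delivers each bit correctly with
    # probability P, then decode by majority vote.
    copies = [bit if rng.random() < P else 1 - bit for _ in range(3)]
    return 1 if sum(copies) >= 2 else 0

rng = random.Random(0)
P, trials = 0.9, 100_000
ok = sum(triplicate_and_decode(0, P, rng) == 0 for _ in range(trials))
print(ok / trials)                 # ~0.972 by simulation
print(P**2 * (P + 3 * (1 - P)))    # P' = P^2 (P + 3Q) = 0.972 exactly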

9 Comparing the raw channel with the triplicated one (rate ⅓):

P      C(P)     P′       C(P′)/3
.99    92%      .9997    33.2%
.9     53%      .972     27%
.8     28%      .896     17%
.7     12%      .784     8.2%
.6     3%       .648     2%
.51    .03%     .515     .022%

Shannon's Theorem will say that, as n → ∞ (not just n = 3), there are codes that take the probability of correct decoding → 1 while the capacity per bit sent stays close to C(P), instead of shrinking like C(P′)/3 here. (8.4)
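
The table follows from the two formulas above; a sketch to reproduce it (the entropy helper is repeated so the snippet runs on its own):

from math import log2

def h2(t):
    return -t * log2(t) - (1 - t) * log2(1 - t)

for P in (0.99, 0.9, 0.8, 0.7, 0.6, 0.51):
    Pp = P**2 * (P + 3 * (1 - P))             # P' for the triplicated channel
    print(P, 1 - h2(P), Pp, (1 - h2(Pp)) / 3)
# e.g. P = 0.9 -> C(P) ~ 0.531, P' = 0.972, C(P')/3 ~ 0.272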

