1
§2 Discrete memoryless channels and their capacity function
§2.1 Channel capacity
§2.2 The channel coding theorem
2
R = I(X;Y) = H(X) – H(X|Y)
§2.1 Channel capacity 1. Definition
Review: R = I(X;Y) = H(X) - H(X|Y)  bit/symbol (R: the information rate)
Definition: for a discrete channel with input X and output Y, the channel capacity is C = max over p(x) of I(X;Y), in bit/symbol; if each symbol lasts t seconds, the rate per unit time is R_t = I(X;Y)/t bit/second and the corresponding capacity is C_t = max over p(x) of I(X;Y)/t bit/second.
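For reference, a minimal numerical sketch (not from the slides; the example channel and the helper names entropy / mutual_information are my own choices) of how R = I(X;Y) can be evaluated from p(x) and the transition matrix:

```python
# Sketch: I(X;Y) = H(X) - H(X|Y) = H(Y) - H(Y|X), computed from an input
# distribution p(x) and a transition matrix P (rows indexed by x, columns by y).
import numpy as np

def entropy(p):
    """Shannon entropy in bits; terms with zero probability contribute nothing."""
    return -sum(x * np.log2(x) for x in p if x > 0)

def mutual_information(px, P):
    """I(X;Y) in bits, using the form H(Y) - H(Y|X), which equals H(X) - H(X|Y)."""
    px, P = np.asarray(px, float), np.asarray(P, float)
    HY = entropy(px @ P)                                          # H(Y) from p(y) = p(x) P
    HY_given_X = sum(q * entropy(row) for q, row in zip(px, P))   # H(Y|X)
    return HY - HY_given_X

# Example: a BSC with crossover probability 0.1 and equiprobable inputs.
P_bsc = [[0.9, 0.1],
         [0.1, 0.9]]
print(mutual_information([0.5, 0.5], P_bsc))                      # ~ 0.531 bit/symbol
```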
3
§2.1 Channel capacity 1. Definition Example 2.1.1
Compute the capacity of the BSC (binary symmetric channel with crossover probability p). Solution: C = 1 - H(p), where H(p) = -p log2 p - (1-p) log2(1-p), attained by the uniform input distribution.
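As a small check of this example (my own arithmetic, not the textbook's derivation), the snippet below evaluates C = 1 - H(p) for a few values of p:

```python
# BSC capacity C = 1 - H(p) for several crossover probabilities.
from math import log2

def binary_entropy(p):
    return 0.0 if p in (0.0, 1.0) else -p * log2(p) - (1 - p) * log2(1 - p)

for p in (0.0, 0.01, 0.1, 0.5):
    print(p, 1 - binary_entropy(p))   # p = 0 gives C = 1 bit; p = 1/2 gives C = 0
```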
4
2. Simple discrete channel
§2.1 Channel capacity 2. Simple discrete channel: one-to-one channel
5
2. Simple discrete channel
§2.1 Channel capacity 2. Simple discrete channel: lossless channel
6
2. Simple discrete channel
§2.1 Channel capacity 2. Simple discrete channel: noiseless channel
7
3. Symmetric channel §2.1 Channel capacity
Definition: a channel with transition matrix P is symmetric if: 1) every row of P is a permutation of the same set {p1', p2', …, ps'}, i.e. of the first row; 2) every column of P is a permutation of the same set {q1', q2', …, qr'}, i.e. of the first column.
8
3. Symmetric channel §2.1 Channel capacity Properties:
1) H(Y|X) = H(Y|X = a_i) = H(p1', p2', …, ps'), i = 1, 2, …, r; 2) if the input X is uniformly distributed (equiprobable), then the output Y is also uniformly distributed.
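Together with the definition of capacity, these properties give the standard result C = log2(s) - H(p1', …, ps') for such channels, attained by the uniform input. A brief sketch (my own; the 3x3 matrix is an arbitrary example, not one from the slides):

```python
# Symmetric channel: check property 2 and evaluate C = log2(s) - H(row).
import numpy as np

def entropy(p):
    return -sum(x * np.log2(x) for x in p if x > 0)

# Each row and each column is a permutation of (0.7, 0.2, 0.1).
P = np.array([[0.7, 0.2, 0.1],
              [0.1, 0.7, 0.2],
              [0.2, 0.1, 0.7]])
r, s = P.shape

px_uniform = np.full(r, 1.0 / r)
print(px_uniform @ P)                 # property 2: uniform output [1/3, 1/3, 1/3]
print(np.log2(s) - entropy(P[0]))     # C = log2(3) - H(0.7, 0.2, 0.1), about 0.43 bit
```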
9
3. Symmetric channel §2.1 Channel capacity Example 2.1.2
Strongly symmetric channel
10
3. Symmetric channel §2.1 Channel capacity Example 2.1.3
Weakly symmetric channel
11
3. Symmetric channel §2.1 Channel capacity
Characteristic of a weakly symmetric channel: the columns of its transition matrix P can be partitioned into subsets C_i such that, for each i, in the matrix P_i formed by the columns in C_i, each row is a permutation of every other row, and the same is true of the columns. (p. 68, T2.2 in the textbook)
12
3. Symmetric channel §2.1 Channel capacity Example 2.1.4
Compute the capacity of the BEC (binary erasure channel). It is a weakly symmetric channel.
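The value itself is not reproduced on the slide; the well-known result is C = 1 - ε for erasure probability ε. A minimal numeric check (my own, with an arbitrary ε = 0.2):

```python
# BEC with erasure probability eps, outputs {0, e, 1}:
# brute-force maximization of I(X;Y) over binary inputs recovers C = 1 - eps.
import numpy as np

def entropy(p):
    return -sum(x * np.log2(x) for x in p if x > 0)

def mutual_information(px, P):
    px, P = np.asarray(px, float), np.asarray(P, float)
    return entropy(px @ P) - sum(q * entropy(row) for q, row in zip(px, P))

eps = 0.2                                   # arbitrary example value
P_bec = [[1 - eps, eps, 0.0],
         [0.0,     eps, 1 - eps]]
qs = np.linspace(0.0, 1.0, 2001)            # candidate values of P(X = 0)
print(max(mutual_information([q, 1 - q], P_bec) for q in qs))   # ~ 0.8
print(1 - eps)                              # closed form C = 1 - eps
```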
13
3. Symmetric channel §2.1 Channel capacity
Characteristic of a weakly symmetric channel: the columns of its transition matrix P can be partitioned into subsets C_i such that, for each i, in the matrix P_i formed by the columns in C_i, each row is a permutation of every other row, and the same is true of the columns.
14
§2.1 Channel capacity 3. Symmetric channel Problem: C = ?
15
4. Discrete memoryless extended channel
§2.1 Channel capacity 4. Discrete memoryless extended channel
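The slide's formulas for the extended channel are not reproduced; as a hedged illustration of the standard result that the n-th extension of a DMC has capacity nC, the sketch below (my own) builds the 2nd extension of a BSC and checks that i.i.d. capacity-achieving inputs give I(X^2;Y^2) = 2 I(X;Y):

```python
# 2nd extension of a BSC: transition matrix is the Kronecker product of P with itself.
import numpy as np

def entropy(p):
    return -sum(x * np.log2(x) for x in p if x > 0)

def mutual_information(px, P):
    px, P = np.asarray(px, float), np.asarray(P, float)
    return entropy(px @ P) - sum(q * entropy(row) for q, row in zip(px, P))

p = 0.1
P = np.array([[1 - p, p], [p, 1 - p]])   # single use of the BSC
P2 = np.kron(P, P)                       # 2nd extension: inputs/outputs 00, 01, 10, 11
q = np.array([0.5, 0.5])                 # capacity-achieving input for the BSC
q2 = np.kron(q, q)                       # i.i.d. input pairs

print(mutual_information(q, P))          # 1 - H(0.1), about 0.531
print(mutual_information(q2, P2))        # about 1.062 = 2 * (1 - H(0.1))
```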
16
Review keywords: Channel capacity, Simple DMC, Symmetric channel
(Strongly symmetric channel, Weakly symmetric channel)
17
Homework 1. p. 71: T2.19(a); 2. Calculate the channel capacity, and find the maximizing probability distribution.
18
Homework 3. Channel capacity. Consider the discrete memoryless channel Y = X + Z (mod 11), where Z has the distribution specified in the original problem and X takes values in {0, 1, …, 10}. Assume that Z is independent of X. Find the capacity. What is the maximizing p*(x)?
19
4. Find the capacity of the noisy typewriter channel.
Homework 4. Find the capacity of the noisy typewriter channel (each channel input is either received unchanged at the output with probability ½, or is transformed into the next letter with probability ½). [Instructor's note: up to here is session 8; discussion problem 4 was presented by Liu Tao, Class 3.]
20
§2 Discrete memoryless channels and their capacity function
§2.1 The capacity function
§2.2 The channel coding theorem
21
1. The concept of channel coding
§2.2 The channel coding theorem 1. The concept of channel coding
General digital communication system: Source -(M)-> Encoder -(C)-> Channel -(R)-> Decoder -(M')-> Sink
22
MLD 1. The concept of channel coding §2.2 The channel coding theorem
Example 2.2.1 Repeating code (3,1) with maximum-likelihood decoding (MLD) over the extended channel (the third extension of the BSC). Codewords: 000 and 111. MLD decoding rule: received words 000, 001, 010, 100 are decoded to 000; received words 011, 101, 110, 111 are decoded to 111.
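A brief implementation sketch of this decoding rule (my own; function names are illustrative, not from the textbook):

```python
# (3,1) repeating code over a BSC with p < 1/2: MLD is equivalent to minimum
# Hamming distance, i.e. a majority vote over the three received bits.
import random

def encode(bit):
    return [bit] * 3                        # 0 -> 000, 1 -> 111

def bsc(codeword, p):
    return [b ^ (random.random() < p) for b in codeword]   # flip each bit w.p. p

def mld_decode(received):
    return 1 if sum(received) >= 2 else 0   # majority vote = MLD when p < 1/2

# Decoding table from the example: first four -> 0 (i.e. 000), last four -> 1 (i.e. 111).
for r in [(0,0,0), (0,0,1), (0,1,0), (1,0,0), (0,1,1), (1,0,1), (1,1,0), (1,1,1)]:
    print(r, "->", mld_decode(list(r)))

# Quick simulation: error rate should be close to 3*p^2*(1-p) + p^3 = 0.028 for p = 0.1.
p, trials = 0.1, 100_000
errors = sum(mld_decode(bsc(encode(0), p)) != 0 for _ in range(trials))
print(errors / trials)
```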
23
1. The concept of channel coding
§2.2 The channel coding theorem 1. The concept of channel coding Example 2.2.2 Block code (5,2)
24
1. The concept of channel coding
§2.2 The channel coding theorem 1. The concept of channel coding Example 2.2.2 Block code (5,2)
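The (5,2) code table on the slide is not reproduced here; the sketch below uses a hypothetical set of four 5-bit codewords (my own choice, minimum distance 3, not the textbook's code) to illustrate block encoding and minimum-distance decoding, which coincides with MLD on a BSC with p < 1/2:

```python
# Hypothetical (5,2) block code: 4 messages mapped to 5-bit codewords.
CODE = {
    (0, 0): (0, 0, 0, 0, 0),
    (0, 1): (0, 1, 1, 0, 1),
    (1, 0): (1, 0, 1, 1, 0),
    (1, 1): (1, 1, 0, 1, 1),
}

def hamming(a, b):
    return sum(x != y for x, y in zip(a, b))

def decode(received):
    """Return the 2-bit message whose codeword is closest to the received 5-tuple."""
    return min(CODE, key=lambda m: hamming(CODE[m], received))

# Single-error correction: flip one bit of the codeword for (0, 1) and decode.
print(decode((0, 1, 1, 0, 0)))   # -> (0, 1)
```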
25
(p. 62, corollary in the textbook)
§2.2 The channel coding theorem 2. The channel coding theorem Theorem 2.1 (the channel coding theorem for DMCs). For any R < C and any ε > 0, for all sufficiently large n there exists a code [C] = {x1, …, xM} of length n and a decoding rule such that: 1) (1/n) log2 M ≥ R, i.e. M ≥ 2^{nR}; 2) P_E < ε. (p. 62, corollary in the textbook)
26
2. The channel coding theorem
Statement 2 (the channel coding theorem): all rates below capacity C are achievable. Specifically, for every rate R < C, there exists a sequence of (2^{nR}, n) codes with maximum probability of error λ^(n) → 0 as n → ∞. Conversely, any sequence of (2^{nR}, n) codes with λ^(n) → 0 must have R ≤ C. [Instructor's note: up to here is session 9. Students brought flowers today for Women's Day, which was very touching. There was spare time in this class; I rambled a bit at the end but still finished 3 to 5 minutes early to leave time for student questions.]
27
Homework (thinking problem): the repeating code (2n+1, 1), using an MLD decoder.
Show that its average error probability is P_E = Σ_{k=n+1}^{2n+1} C(2n+1, k) p^k (1-p)^{(2n+1)-k}, where p is the crossover (error) probability of the BSC; then compute P_E for p = 0.01 and n = 1, 2, 3, 4.
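A small arithmetic helper for this computation (my own sketch, not a provided solution):

```python
# With MLD, the (2n+1, 1) repeating code fails iff more than n of the 2n+1
# transmitted bits are flipped by the BSC.
from math import comb

def repetition_pe(n, p):
    N = 2 * n + 1
    return sum(comb(N, k) * p**k * (1 - p)**(N - k) for k in range(n + 1, N + 1))

for n in (1, 2, 3, 4):
    print(n, repetition_pe(n, 0.01))
```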