§4 Continuous source and Gaussian channel
§4.1 Continuous source
1. Differential entropy
Definition: Let $X$ be a random variable with cumulative distribution function $F(x) = \Pr(X \le x)$. If $F(x)$ is continuous, the random variable is said to be continuous. Let $p(x) = F'(x)$ when the derivative is defined. If $\int_{-\infty}^{\infty} p(x)\,dx = 1$, then $p(x)$ is called the probability density function for $X$. The set where $p(x) > 0$ is called the support set of $X$.
§4.1 Continuous source
1. Differential entropy
Definition: The differential entropy $h(X)$ of a continuous random variable $X$ with density $p(x)$ is defined as
$$h(X) = -\int_S p(x)\log p(x)\,dx,$$
where $S$ is the support set of the random variable.
§4.1 Continuous source
1. Differential entropy
Example 4.1.1: Let $X \sim N(m, \sigma^2)$ (normal distribution). Calculate the differential entropy $h(X)$.
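The slide leaves the calculation to the class; a short worked solution (standard result) follows. With $p(x) = \frac{1}{\sqrt{2\pi\sigma^2}}\, e^{-(x-m)^2/(2\sigma^2)}$,
$$h(X) = -\int p(x)\ln p(x)\,dx = \int p(x)\left[\frac{1}{2}\ln(2\pi\sigma^2) + \frac{(x-m)^2}{2\sigma^2}\right]dx = \frac{1}{2}\ln(2\pi\sigma^2) + \frac{1}{2} = \frac{1}{2}\ln(2\pi e\sigma^2)\ \text{nats},$$
or $\frac{1}{2}\log_2(2\pi e\sigma^2)$ bits. Note that the result depends only on the variance $\sigma^2$, not on the mean $m$.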
§4.1 Continuous source
1. Differential entropy
Definition: The differential entropy of a pair $X, Y$ of random variables with joint density $p(xy)$ is defined as
$$h(XY) = -\iint p(xy)\log p(xy)\,dx\,dy.$$
If $X, Y$ have a joint density $p(xy)$, we can define the conditional differential entropy $h(X|Y)$ as
$$h(X|Y) = -\iint p(xy)\log p(x|y)\,dx\,dy.$$
§4.1 Continuous source
2. Properties of differential entropy
1) Chain rule: $h(XY) = h(X) + h(Y|X) = h(Y) + h(X|Y)$.
§4.1 Continuous source
2. Properties of differential entropy
2) $h(X)$ can be negative.
Example 4.1.2: Consider a random variable distributed uniformly from $a$ to $b$. If $b - a < 1$, then $h(X) < 0$.
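A one-line computation for Example 4.1.2, added for completeness: for $X$ uniform on $(a,b)$, $p(x) = \frac{1}{b-a}$ on the support, so
$$h(X) = -\int_a^b \frac{1}{b-a}\log_2\frac{1}{b-a}\,dx = \log_2(b-a),$$
which is negative whenever $b - a < 1$. For instance, $b - a = 1/2$ gives $h(X) = -1$ bit.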
§4.1 Continuous source
2. Properties of differential entropy
3) $h(X)$ is a concave function of the density $p(x)$, so it attains a maximum.
Theorem 4.1: If the peak power of the random variable $X$ is restricted, the maximizing distribution is the uniform distribution. If the average power of $X$ is restricted, the maximizing distribution is the normal distribution.
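The corresponding maximum values, stated explicitly here (standard results; the interval $[-M, M]$ is an illustrative normalization of the peak constraint): if $X$ is confined to $[-M, M]$ (peak power $M^2$), the uniform density gives $h(X) = \log_2 2M$; if $E[X^2] \le P$ (average power $P$), the normal density $N(0, P)$ gives $h(X) = \frac{1}{2}\log_2(2\pi e P)$.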
§4.1 Continuous source
2. Properties of differential entropy
4) Let $Y = g(X)$; the differential entropy of $Y$ may differ from $h(X)$.
Example 4.1.3: Let $X$ be a random variable distributed uniformly from $-1$ to $1$, and $Y = 2X$. What are $h(X)$ and $h(Y)$?
Theorem 4.2: $h(X + c) = h(X)$ (translation does not change the differential entropy).
Theorem 4.3: $h(aX) = h(X) + \log|a|$.
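Working Example 4.1.3 out with these theorems: $X$ is uniform on $[-1, 1]$, so $h(X) = \log_2 2 = 1$ bit; $Y = 2X$ is uniform on $[-2, 2]$, so $h(Y) = \log_2 4 = 2$ bits. Equivalently, Theorem 4.3 gives $h(Y) = h(X) + \log_2|2| = 2$ bits, so differential entropy does change under scaling.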
Review
KeyWords: differential entropy; chain rule of differential entropy; conditioning reduces entropy; independence bound on differential entropy; differential entropy may be negative; concavity (maximum-entropy distributions); behavior under transformations.
Homework
Prove the following conclusion:
a) $h(XY) = h(X) + h(Y|X) = h(Y) + h(X|Y)$
§4 Continuous source and Gaussian channel
§4.2 Gaussian channel
1. The model of the Gaussian channel
$Y = X + Z$, where the noise $Z$ is normal with mean 0 and variance $\sigma_z^2$, and $X$ and $Z$ are independent.
[Figure: block diagram with input $X$ and noise $Z$ summed to give output $Y$.]
§4.2 Gaussian channel
2. Average mutual information
Since $Y = X + Z$ with $Z$ independent of $X$,
$$I(X;Y) = h(Y) - h(Y|X) = h(Y) - h(Z|X) = h(Y) - h(Z),$$
where the middle step holds because, given $X$, the output $Y$ is just $Z$ shifted by a constant (Theorem 4.2).
Example 4.2.1: Let $X \sim N(0, \sigma_x^2)$, so that $Y \sim N(0, \sigma_x^2 + \sigma_z^2)$. Calculate $I(X;Y)$.
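A worked solution for Example 4.2.1, using the Gaussian entropy from Example 4.1.1:
$$I(X;Y) = h(Y) - h(Z) = \frac{1}{2}\log_2 2\pi e(\sigma_x^2 + \sigma_z^2) - \frac{1}{2}\log_2 2\pi e\sigma_z^2 = \frac{1}{2}\log_2\left(1 + \frac{\sigma_x^2}{\sigma_z^2}\right)\ \text{bits}.$$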
§4.2 Gaussian channel
3. The channel capacity
Definition: The information capacity of the Gaussian channel with power constraint $P$ is
$$C = \max_{p(x):\,E[X^2] \le P} I(X;Y) = \max_{p(x):\,E[X^2] \le P}\,[h(Y) - h(Z)].$$
§4.2 Gaussian channel
3. The channel capacity
$$C = \frac{1}{2}\log_2\left(1 + \frac{P}{\sigma_z^2}\right)\ \text{bits per sample.}$$
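A sketch of the derivation (it follows the average-power case of Theorem 4.1): since $E[Y^2] = P + \sigma_z^2$,
$$h(Y) \le \frac{1}{2}\log_2 2\pi e(P + \sigma_z^2),$$
with equality iff $Y$ is Gaussian, which is achieved by choosing $X \sim N(0, P)$. Hence
$$C = \frac{1}{2}\log_2 2\pi e(P + \sigma_z^2) - \frac{1}{2}\log_2 2\pi e\sigma_z^2 = \frac{1}{2}\log_2\left(1 + \frac{P}{\sigma_z^2}\right).$$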
§4.2 Gaussian channel
3. The channel capacity
Considering band-limited channels with transmission bandwidth $W$: the capacity is $C = \frac{1}{2}\log_2\left(1 + \frac{P}{N}\right)$ bits/sample, and by the sampling theorem there are $2W$ samples per second, so $C_t = 2W \cdot C$ bits/sec.
§4.2 Gaussian channel
4. Shannon's formula
$$C_t = W \log_2\left(1 + \frac{P}{N}\right) = W \log_2\left(1 + \frac{P}{N_0 W}\right)\ \text{(bits/sec)},$$
where the noise power in bandwidth $W$ is $N = N_0 W$. This is Shannon's famous expression for the capacity of a band-limited, power-limited Gaussian channel.
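A quick numerical check of Shannon's formula; this is an illustrative sketch, and the bandwidth and SNR values in the example call are assumptions, not taken from the slides:

```python
import math

def gaussian_capacity(bandwidth_hz: float, snr_db: float) -> float:
    """Shannon's formula: C_t = W * log2(1 + P/N) in bits per second."""
    snr_linear = 10 ** (snr_db / 10)        # convert SNR from dB to a ratio
    return bandwidth_hz * math.log2(1 + snr_linear)

# Example: an assumed 3 kHz channel at 20 dB SNR
print(round(gaussian_capacity(3e3, 20)))   # ~19975 bits/sec
```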
§4.2 Gaussian channel
4. Shannon's formula
Remarks:
1) $C_t$, $W$ and SNR can be traded against one another.
2) For fixed $P/N_0$, $C_t$ grows with $W$, but only up to a finite limit (developed on the next slide).
§4.2 Gaussian channel
4. Shannon's formula
3) The Shannon limit. For infinite-bandwidth channels,
$$\lim_{W \to \infty} C_t = \frac{P}{N_0 \ln 2} = \frac{P}{N_0}\log_2 e \approx 1.44\,\frac{P}{N_0}\ \text{(bps)}.$$
[Figure: plot of $C_t$ (bps) versus $W$, showing $C_t$ saturating at this limit as $W$ grows.]
§4.2 Gaussian channel
4. Shannon's formula
Let $E_b$ be the energy per bit, so that $P = E_b C_t$. Then
$$\frac{C_t}{W} = \log_2\left(1 + \frac{E_b C_t}{N_0 W}\right).$$
As $W \to \infty$, the required $E_b/N_0$ approaches $\ln 2 \approx 0.693$, i.e. $-1.59$ dB: the Shannon limit on the energy per bit needed for reliable communication.
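A short derivation of this limit, filling in the step the slide elides: write $r = C_t/W$ for the spectral efficiency. Shannon's formula with $P = E_b C_t$ gives $r = \log_2(1 + r E_b/N_0)$, i.e.
$$\frac{E_b}{N_0} = \frac{2^r - 1}{r}.$$
As $W \to \infty$, $r \to 0$, and $\lim_{r \to 0} \frac{2^r - 1}{r} = \ln 2 \approx 0.693$, i.e. $-1.59$ dB.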
Review
KeyWords: information rate of the Gaussian channel; capacity of the Gaussian channel; Shannon's formula (band-limited, power-limited); Shannon limit.
Homework
1. In image transmission there are 2.25×10⁶ pixels per frame, and reproducing an image needs 4 bits per pixel (assume each bit is equally likely to be '0' or '1'). Compute the channel bandwidth needed to transmit 30 frames per second, given P/N = 30 dB.
2. Consider a power-limited Gaussian channel with bandwidth 3 kHz and (P + N)/N = 10 dB.
(1) Compute the maximum rate of this channel, in bps.
(2) If the SNR decreases to 5 dB, find the channel bandwidth that achieves the same maximum rate.