1
Data Compression
Instructor: 陳建源 (Chien-Yuan Chen)
Email: cychen07@nuk.edu.tw
Office: 法 401
Website: http://www.csie.nuk.edu.tw/~cychen/
Basic Concepts of Information Theory
2
1. Self-information
Let S be a system of events E_1, E_2, …, E_n in which the event E_k occurs with probability p_k.
Def: The self-information of the event E_k is written I(E_k):
I(E_k) = log(1/p_k) = −log p_k
The base of the logarithm: 2 (log), e (ln). Units: bit, nat.
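A quick check of the definition (an added sketch, assuming base-2 logarithms): a fair coin toss has p = 1/2, so I = log 2 = 1 bit; measured with natural logarithms the same event carries ln 2 ≈ 0.693 nat.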
3
1. Self-information
When p_k = 1, then I(E_k) = 0.
When p_k = 1/2, then I(E_k) = 1 bit.
When p_k → 0, then I(E_k) → ∞.
When p_j < p_k, then I(E_j) > I(E_k): the smaller the probability, the larger the self-information.
4
1. Self-information
Ex1. A letter is chosen at random from the English alphabet.
Ex2. A binary number of m digits is chosen at random.
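Worked solutions (a sketch, assuming all outcomes are equally likely):
Ex1: p = 1/26, so I = log 26 ≈ 4.70 bits.
Ex2: there are 2^m equally likely m-digit binary numbers, so p = 2^(−m) and I = log 2^m = m bits.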
5
1. Self-information
Ex3. 64 points are arranged in a square grid. Let E_j be the event that a point picked at random lies in the j-th column, and E_k the event that it lies in the k-th row. Then I(E_j ∩ E_k) = I(E_j) + I(E_k). Why?
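Answer sketch (added, assuming the grid is 8 × 8): each column event has p = 1/8, so I(E_j) = log 8 = 3 bits, and likewise I(E_k) = 3 bits. The row and column of a randomly picked point are statistically independent, so P(E_j ∩ E_k) = (1/8)(1/8) = 1/64 and I(E_j ∩ E_k) = log 64 = 6 bits = I(E_j) + I(E_k): the logarithm turns a product of probabilities into a sum of informations.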
6
2. Entropy
Let S be the system with events E_1, E_2, …, E_n, the associated probabilities being p_1, p_2, …, p_n.
Let f: E_k → f_k and let E(f) be the expectation (average, mean) of f:
E(f) = Σ_k p_k f_k
7
2. Entropy
Def: The entropy of S, called H(S), is the average of the self-information:
H(S) = E(I) = Σ_k p_k log(1/p_k) = −Σ_k p_k log p_k
Self-information of an event increases as its uncertainty grows; if some p_k = 1 we have certainty and H(S) = 0.
Observation: the minimum value 0 means the outcome is already determined. But what is the maximum?
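A small worked example (added, base-2 logs): for probabilities (1/2, 1/4, 1/4),
H(S) = (1/2) log 2 + (1/4) log 4 + (1/4) log 4 = 0.5 + 0.5 + 0.5 = 1.5 bits,
which lies between the minimum 0 and the maximum log 3 ≈ 1.585 bits.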
8
2. Entropy
Thm: H(S) ≤ log n, with equality only when p_k = 1/n for all k.
Proof: uses the lemma on the next slide.
9
2. Entropy
Thm 2.2: For x > 0, ln x ≤ x − 1, with equality only when x = 1.
Assume that p_k ≠ 0.
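Proof sketch for the bound H(S) ≤ log n (reconstructed along the standard lines, applying Thm 2.2 with x = 1/(n p_k)):
H(S) − log n = Σ_k p_k log(1/(n p_k)) ≤ (log e) Σ_k p_k (1/(n p_k) − 1) = (log e)(Σ_k 1/n − Σ_k p_k) = 0,
with equality only when 1/(n p_k) = 1 for every k, i.e. p_k = 1/n.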
11
2. Entropy
Exercise 1: In the system S the probabilities p_1 and p_2, where p_2 > p_1, are replaced by p_1 + ε and p_2 − ε respectively, under the proviso 0 < 2ε < p_2 − p_1. Prove that H(S) is increased.
Exercise 2: We know that entropy H(S) can be viewed as a measure of _____ about S. Please list 3 items for this blank. (Answer: information, uncertainty, randomness.)
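One way to prove Exercise 1 (an added sketch): only two terms of H(S) change, so consider
g(ε) = −(p_1 + ε) log(p_1 + ε) − (p_2 − ε) log(p_2 − ε).
Its derivative is g′(ε) = log[(p_2 − ε)/(p_1 + ε)], and the proviso 0 < 2ε < p_2 − p_1 gives p_1 + ε < p_2 − ε, so g′(ε) > 0 and H(S) strictly increases.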
12
3. Mutual information
Let S_1 be the system with events E_1, E_2, …, E_n, the associated probabilities being p_1, p_2, …, p_n.
Let S_2 be the system with events F_1, F_2, …, F_m, the associated probabilities being q_1, q_2, …, q_m.
13
3. Mutual information
Two systems S_1 and S_2 with joint probabilities P(E_j ∩ F_k) = p_{jk}, satisfying the relation
Σ_k p_{jk} = p_j and Σ_j p_{jk} = q_k.
14
3. Mutual information
Relation: Σ_j Σ_k p_{jk} = Σ_j p_j = Σ_k q_k = 1.
15
3. Mutual information
Conditional probability: P(E_j | F_k) = p_{jk} / q_k
Conditional self-information: I(E_j | F_k) = −log P(E_j | F_k)
Mutual information: I(E_j ; F_k) = log [P(E_j | F_k) / P(E_j)] = log [p_{jk} / (p_j q_k)]
NOTE: I(E_j ; F_k) = I(F_k ; E_j).
16
3. Mutual information
Conditional entropy: H(S_1 | S_2) = −Σ_j Σ_k p_{jk} log P(E_j | F_k)
Mutual information: I(S_1 ; S_2) = Σ_j Σ_k p_{jk} log [p_{jk} / (p_j q_k)]
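A numeric illustration (added, base-2 logs): take the joint probabilities p_{11} = p_{22} = 0.4 and p_{12} = p_{21} = 0.1. Both marginals are (0.5, 0.5), so
I(S_1 ; S_2) = 2(0.4) log(0.4/0.25) + 2(0.1) log(0.1/0.25) ≈ 0.542 − 0.264 ≈ 0.278 bits.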
17
3. Mutual information
Conditional self-information and mutual information:
I(E_j ; F_k) = I(E_j) − I(E_j | F_k) and I(E_j ; F_k) = I(F_k) − I(F_k | E_j)
If E_j and F_k are statistically independent: I(E_j ; F_k) = 0.
18
3. Mutual information
Joint entropy: H(S_1, S_2) = −Σ_j Σ_k p_{jk} log p_{jk}
Joint entropy and conditional entropy:
H(S_1, S_2) = H(S_2) + H(S_1 | S_2) = H(S_1) + H(S_2 | S_1)
19
3. Mutual information
Mutual information and conditional entropy:
I(S_1 ; S_2) = H(S_1) − H(S_1 | S_2) = H(S_2) − H(S_2 | S_1) = H(S_1) + H(S_2) − H(S_1, S_2)
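Derivation of the first identity (an added step, using the relations above):
I(S_1 ; S_2) = Σ_{j,k} p_{jk} log [p_{jk}/(p_j q_k)] = Σ_{j,k} p_{jk} log P(E_j | F_k) − Σ_{j,k} p_{jk} log p_j = −H(S_1 | S_2) + H(S_1).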
20
3. Mutual information
Thm: The mutual information of two systems cannot exceed the sum of their separate entropies:
I(S_1 ; S_2) ≤ H(S_1) + H(S_2)
21
3. Mutual information
Systems' independence: If S_1 and S_2 are statistically independent, then p_{jk} = p_j q_k for all j, k.
Thm: The joint entropy of two statistically independent systems is the sum of their separate entropies:
H(S_1, S_2) = H(S_1) + H(S_2)
22
3. Mutual information
Thm: I(S_1 ; S_2) ≥ 0, with equality only if S_1 and S_2 are statistically independent.
Proof: Assume that p_{jk} ≠ 0.
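Proof sketch (reconstructed along the standard lines, applying Thm 2.2 with x = p_j q_k / p_{jk}):
−I(S_1 ; S_2) = Σ_{j,k} p_{jk} log(p_j q_k / p_{jk}) ≤ (log e) Σ_{j,k} p_{jk} (p_j q_k / p_{jk} − 1) = (log e)(Σ_j p_j Σ_k q_k − 1) = 0,
with equality only when p_{jk} = p_j q_k for all j, k, i.e. only if S_1 and S_2 are statistically independent.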
23
3. Mutual information
Thm: H(S_1, S_2) ≤ H(S_1) + H(S_2), with equality only if S_1 and S_2 are statistically independent.
Proof: follows from I(S_1 ; S_2) = H(S_1) + H(S_2) − H(S_1, S_2) ≥ 0.
24
3. Mutual information
Ex: A binary symmetric channel with crossover probability ε.
Let S_1 be the input: E_0 = 0, E_1 = 1, and S_2 be the output: F_0 = 0, F_1 = 1.
[Channel diagram: 0 → 0 and 1 → 1 each with probability 1 − ε; 0 → 1 and 1 → 0 each with probability ε.]
25
3. Mutual information
Assume that P(E_0) = p and P(E_1) = 1 − p. Then the joint probabilities are
p_{00} = p(1 − ε), p_{01} = pε, p_{10} = (1 − p)ε, p_{11} = (1 − p)(1 − ε).
26
3. Mutual information
Compute the output probabilities:
q_0 = p(1 − ε) + (1 − p)ε, q_1 = pε + (1 − p)(1 − ε)
If p = 1/2, then q_0 = q_1 = 1/2.
27
3. Mutual information
Compute the mutual information:
I(S_1 ; S_2) = H(S_2) − H(S_2 | S_1), where H(S_2 | S_1) = −ε log ε − (1 − ε) log(1 − ε) = H_b(ε).
28
3. Mutual information
Compute the mutual information for p = 1/2:
I(S_1 ; S_2) = 1 + ε log ε + (1 − ε) log(1 − ε) = 1 − H_b(ε).
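Numeric check (added): for ε = 0.1, H_b(0.1) ≈ 0.469, so I(S_1 ; S_2) ≈ 0.531 bits; for ε = 0 the channel is noiseless and I = 1 bit; for ε = 1/2 the output is independent of the input and I = 0.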
29
4. Differential entropy
Def: The entropy of S, called H(S), is the average of the self-information. For a continuous random variable with density f(x), the differential entropy of f(x) is defined by
h(f) = −∫ f(x) log f(x) dx
whenever the integral exists.
NOTE: (1) The entropy of a continuous distribution need not exist. (2) Entropy may be negative.
30
4. Differential entropy
Example: Consider a random variable distributed uniformly from 0 to a, so that its density is 1/a from 0 to a and 0 elsewhere. Then its differential entropy is
h(f) = −∫_0^a (1/a) log(1/a) dx = log a.
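Remark (added): h(f) = log a is negative whenever a < 1; e.g. a = 1/2 gives h(f) = −1 bit, illustrating NOTE (2) on the previous slide.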