Data Compression
Instructor: 陳建源
Office: 法401
Website
Basic Concepts of Information Theory
1. Self-information

Let S be a system of events $E_1, E_2, \dots, E_n$ with associated probabilities $p_1, p_2, \dots, p_n$.
Def: The self-information of the event $E_k$ is written $I(E_k)$:
$$I(E_k) = \log \frac{1}{p_k} = -\log p_k,$$
in which the base of the logarithm is 2 (log) or e (ln); units: bit, nat.
When $p_k = 1$, then $I(E_k) = 0$; when $p_k \to 0$, then $I(E_k) \to \infty$; when $p_j < p_k$, then $I(E_j) > I(E_k)$. The smaller the probability of an event, the larger its self-information.
Ex1. A letter is chosen at random from the English alphabet: $I = \log 26 \approx 4.70$ bits.
Ex2. A binary number of m digits is chosen at random: each of the $2^m$ numbers has probability $2^{-m}$, so $I = \log 2^m = m$ bits.
Ex3. 64 points are arranged in a square grid (8 × 8). Let $E_j$ be the event that a point picked at random is in the j-th column, and $E_k$ the event that it is in the k-th row. Then $P(E_j) = P(E_k) = 1/8$, so $I(E_j) = I(E_k) = 3$ bits, while $I(E_j \cap E_k) = \log 64 = 6$ bits $= I(E_j) + I(E_k)$. Why? Because the row event and the column event are statistically independent.
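A quick numerical check of these examples (a minimal sketch in Python; the helper name self_information is mine, not from the lecture):

```python
import math

def self_information(p, base=2):
    """Self-information I(E) = -log_base(p) of an event with probability p."""
    return -math.log(p, base)

# Ex1: one letter out of 26, chosen uniformly at random
print(self_information(1 / 26))        # about 4.70 bits

# Ex2: an m-digit binary number, chosen uniformly at random
m = 10
print(self_information(2 ** -m))       # m = 10 bits (up to floating-point rounding)

# Ex3: 8x8 grid; column and row picks are independent
col = self_information(1 / 8)          # 3 bits
row = self_information(1 / 8)          # 3 bits
point = self_information(1 / 64)       # 6 bits = col + row
print(col, row, point)
```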
2. Entropy

Let S be the system with events $E_1, \dots, E_n$, the associated probabilities being $p_1, \dots, p_n$. For a function $f: E_k \to f_k$, let $E(f) = \sum_k p_k f_k$ be the expectation (average, mean) of f.
Def: The entropy of S, called H(S), is the average of the self-information:
$$H(S) = E(I) = \sum_{k=1}^{n} p_k \log \frac{1}{p_k}.$$
Self-information of an event increases as its uncertainty grows.
Observation: the minimum value of H(S) is 0, attained in the case of certainty (some $p_k = 1$). But what is the maximum?
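As a small illustration (a sketch in Python; entropy is my own helper name), H(S) can be computed directly from the probability list:

```python
import math

def entropy(probs, base=2):
    """H(S) = sum_k p_k * log(1/p_k), with the convention 0*log(1/0) = 0."""
    return sum(p * math.log(1 / p, base) for p in probs if p > 0)

print(entropy([1.0]))              # 0.0 -- certainty
print(entropy([0.5, 0.5]))         # 1.0 bit
print(entropy([0.25] * 4))         # 2.0 bits
print(entropy([0.9, 0.05, 0.05]))  # much less than log2(3) ~ 1.585
```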
Thm: $H(S) \le \log n$, with equality only when $p_1 = p_2 = \dots = p_n = 1/n$.
Thm 2.2: For $x > 0$, $\ln x \le x - 1$, with equality only when $x = 1$.
Proof of the bound: Assume that $p_k \ne 0$. Then
$$H(S) - \log n = \sum_k p_k \log \frac{1}{n p_k} \le \frac{1}{\ln 2} \sum_k p_k \left( \frac{1}{n p_k} - 1 \right) = \frac{1}{\ln 2} \left( \sum_k \frac{1}{n} - \sum_k p_k \right) = 0,$$
with equality only when $1/(n p_k) = 1$ for every k, i.e. $p_k = 1/n$.
Exercise: In the system S the probabilities $p_1$ and $p_2$, where $p_2 > p_1$, are replaced by $p_1 + \varepsilon$ and $p_2 - \varepsilon$ respectively, under the proviso $0 < 2\varepsilon < p_2 - p_1$. Prove that H(S) is increased.
Question: We know that entropy H(S) can be viewed as a measure of _____ about S. Please list 3 items for this blank: information, uncertainty, randomness.
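A numerical sanity check of the exercise and of the bound $H(S) \le \log n$ (a sketch only; the particular probabilities below are arbitrary):

```python
import math

def entropy(probs, base=2):
    """H(S) = sum_k p_k * log(1/p_k)."""
    return sum(p * math.log(1 / p, base) for p in probs if p > 0)

p1, p2, rest = 0.1, 0.5, [0.4]
eps = 0.15                                    # satisfies 0 < 2*eps < p2 - p1 = 0.4
before = entropy([p1, p2] + rest)
after = entropy([p1 + eps, p2 - eps] + rest)
print(before, after, after > before)          # entropy increases

n = 3
print(after <= math.log(n, 2))                # H(S) <= log n always holds
print(entropy([1 / n] * n), math.log(n, 2))   # equality at the uniform distribution
```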
3. Mutual information

Let $S_1$ be the system with events $E_1, \dots, E_n$, the associated probabilities being $p_1, \dots, p_n$.
Let $S_2$ be the system with events $F_1, \dots, F_m$, the associated probabilities being $q_1, \dots, q_m$.
Two systems $S_1$ and $S_2$ are described jointly by the probabilities $p_{jk} = P(E_j \cap F_k)$, satisfying the relations
$$\sum_{k} p_{jk} = p_j, \qquad \sum_{j} p_{jk} = q_k, \qquad \sum_{j}\sum_{k} p_{jk} = 1.$$
Conditional probability: $P(E_j \mid F_k) = \dfrac{p_{jk}}{q_k}$.
Conditional self-information: $I(E_j \mid F_k) = \log \dfrac{1}{P(E_j \mid F_k)}$.
Mutual information: $I(E_j ; F_k) = \log \dfrac{P(E_j \mid F_k)}{P(E_j)} = \log \dfrac{p_{jk}}{p_j q_k}$.
NOTE: $I(E_j ; F_k) = I(F_k ; E_j)$, i.e. mutual information is symmetric.
Conditional entropy: $H(S_1 \mid S_2) = \sum_j \sum_k p_{jk} \log \dfrac{1}{P(E_j \mid F_k)}$.
Mutual information of the two systems: $I(S_1 ; S_2) = \sum_j \sum_k p_{jk} \log \dfrac{p_{jk}}{p_j q_k}$.
Conditional self-information and mutual information satisfy $I(E_j ; F_k) = I(E_j) - I(E_j \mid F_k)$. If $E_j$ and $F_k$ are statistically independent, then $p_{jk} = p_j q_k$, so $I(E_j ; F_k) = 0$ and $I(E_j \mid F_k) = I(E_j)$.
Joint entropy: $H(S_1, S_2) = \sum_j \sum_k p_{jk} \log \dfrac{1}{p_{jk}}$.
Joint entropy and conditional entropy: $H(S_1, S_2) = H(S_2) + H(S_1 \mid S_2) = H(S_1) + H(S_2 \mid S_1)$.
Mutual information and conditional entropy: $I(S_1 ; S_2) = H(S_1) - H(S_1 \mid S_2) = H(S_2) - H(S_2 \mid S_1) = H(S_1) + H(S_2) - H(S_1, S_2)$.
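These identities can be checked numerically from any joint distribution (a sketch; the 2 × 3 joint table and the helper names below are illustrative only):

```python
import math

def H(probs):
    """Entropy in bits, ignoring zero entries."""
    return sum(p * math.log2(1 / p) for p in probs if p > 0)

# An arbitrary joint distribution p_jk (rows: E_j, columns: F_k)
joint = [[0.10, 0.20, 0.10],
         [0.25, 0.05, 0.30]]

p = [sum(row) for row in joint]                 # marginals of S1
q = [sum(col) for col in zip(*joint)]           # marginals of S2
flat = [x for row in joint for x in row]
H1, H2, H12 = H(p), H(q), H(flat)

# H(S1|S2) = sum_jk p_jk log(1/P(Ej|Fk)), with P(Ej|Fk) = p_jk / q_k
H1_given_2 = sum(pjk * math.log2(q[k] / pjk)
                 for j, row in enumerate(joint)
                 for k, pjk in enumerate(row) if pjk > 0)

I = sum(pjk * math.log2(pjk / (p[j] * q[k]))
        for j, row in enumerate(joint)
        for k, pjk in enumerate(row) if pjk > 0)

print(H12, H2 + H1_given_2)   # H(S1,S2) = H(S2) + H(S1|S2)
print(I, H1 - H1_given_2)     # I(S1;S2) = H(S1) - H(S1|S2)
print(I, H1 + H2 - H12)       # I(S1;S2) = H(S1) + H(S2) - H(S1,S2)
```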
Thm: The mutual information of two systems cannot exceed the sum of their separate entropies: $I(S_1 ; S_2) \le H(S_1) + H(S_2)$.
Thm: The joint entropy of two statistically independent systems is the sum of their separate entropies. If $S_1$ and $S_2$ are statistically independent, i.e. $p_{jk} = p_j q_k$ for all j, k, then $H(S_1, S_2) = H(S_1) + H(S_2)$.
Thm: $I(S_1 ; S_2) \ge 0$, equivalently $H(S_1, S_2) \le H(S_1) + H(S_2)$, with equality only if $S_1$ and $S_2$ are statistically independent.
Proof: Assume that $p_{jk} \ne 0$. By Thm 2.2,
$$-I(S_1 ; S_2) = \sum_j \sum_k p_{jk} \log \frac{p_j q_k}{p_{jk}} \le \frac{1}{\ln 2} \sum_j \sum_k p_{jk} \left( \frac{p_j q_k}{p_{jk}} - 1 \right) = \frac{1}{\ln 2} \left( \sum_j \sum_k p_j q_k - 1 \right) = 0,$$
with equality only when $p_{jk} = p_j q_k$ for all j, k.
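A quick check of the independence case (sketch; the product joint below is built from two arbitrary marginal distributions):

```python
import math

def H(probs):
    """Entropy in bits."""
    return sum(x * math.log2(1 / x) for x in probs if x > 0)

p = [0.2, 0.3, 0.5]                         # marginals of S1
q = [0.6, 0.4]                              # marginals of S2
joint = [pj * qk for pj in p for qk in q]   # independence: p_jk = p_j * q_k

print(H(joint), H(p) + H(q))                # equal: H(S1,S2) = H(S1) + H(S2)
print(H(p) + H(q) - H(joint))               # I(S1;S2) = 0 (up to rounding)
```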
Ex: A binary symmetric channel with crossover probability $\varepsilon$. Let $S_1$ be the input with events $E_0 = 0$, $E_1 = 1$, and $S_2$ be the output with events $F_0 = 0$, $F_1 = 1$. The channel flips each bit with probability $\varepsilon$:
$$P(F_1 \mid E_0) = P(F_0 \mid E_1) = \varepsilon, \qquad P(F_0 \mid E_0) = P(F_1 \mid E_1) = 1 - \varepsilon.$$
Assume that $P(E_0) = p$ and $P(E_1) = 1 - p$. Then the joint probabilities are
$$p_{00} = p(1-\varepsilon), \quad p_{01} = p\varepsilon, \quad p_{10} = (1-p)\varepsilon, \quad p_{11} = (1-p)(1-\varepsilon).$$
Compute the output probabilities:
$$q_0 = P(F_0) = p(1-\varepsilon) + (1-p)\varepsilon, \qquad q_1 = P(F_1) = p\varepsilon + (1-p)(1-\varepsilon).$$
If $p = 1/2$, then $q_0 = q_1 = 1/2$.
Compute the mutual information:
$$I(S_1 ; S_2) = H(S_2) - H(S_2 \mid S_1) = h(q_0) - h(\varepsilon),$$
where $h(x) = x \log \frac{1}{x} + (1-x) \log \frac{1}{1-x}$ is the binary entropy function. For $p = 1/2$ this gives $I(S_1 ; S_2) = 1 - h(\varepsilon)$ bits: it is largest (1 bit) when $\varepsilon = 0$ or $\varepsilon = 1$, and it vanishes when $\varepsilon = 1/2$.
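A sketch of this computation in Python (the function names are mine); it compares the closed form $h(q_0) - h(\varepsilon)$ against the mutual information computed directly from the joint table:

```python
import math

def h(x):
    """Binary entropy function in bits."""
    return sum(t * math.log2(1 / t) for t in (x, 1 - x) if t > 0)

def bsc_mutual_information(p, eps):
    """I(S1;S2) for a binary symmetric channel, computed from the joint p_jk."""
    joint = {(0, 0): p * (1 - eps), (0, 1): p * eps,
             (1, 0): (1 - p) * eps, (1, 1): (1 - p) * (1 - eps)}
    px = [p, 1 - p]
    qy = [joint[0, 0] + joint[1, 0], joint[0, 1] + joint[1, 1]]
    return sum(v * math.log2(v / (px[j] * qy[k]))
               for (j, k), v in joint.items() if v > 0)

p, eps = 0.5, 0.1
print(bsc_mutual_information(p, eps))              # direct computation
print(h(p * (1 - eps) + (1 - p) * eps) - h(eps))   # closed form h(q0) - h(eps)
print(1 - h(eps))                                  # same value, since p = 1/2
```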
4. Differential entropy

Recall Def: The entropy of S, called H(S), is the average of the self-information.
Def: The differential entropy of a density f(x) is defined by
$$h(f) = \int_{-\infty}^{\infty} f(x) \log \frac{1}{f(x)} \, dx,$$
whenever the integral exists.
NOTE: (1) The entropy of a continuous distribution need not exist. (2) Entropy may be negative.
Example: Consider a random variable distributed uniformly from 0 to a, so that its density is 1/a from 0 to a and 0 elsewhere. Then its differential entropy is
$$h(f) = \int_{0}^{a} \frac{1}{a} \log a \, dx = \log a,$$
whenever the integral exists; note that $\log a < 0$ when $a < 1$, so differential entropy can indeed be negative.
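A numerical check of this example (a sketch; the Riemann-sum approximation and function names below are just for illustration):

```python
import math

def differential_entropy(f, lo, hi, n=100000):
    """Riemann-sum approximation of h = integral of f(x) log2(1/f(x)) over [lo, hi]."""
    dx = (hi - lo) / n
    total = 0.0
    for i in range(n):
        x = lo + (i + 0.5) * dx
        fx = f(x)
        if fx > 0:
            total += fx * math.log2(1 / fx) * dx
    return total

for a in (4.0, 1.0, 0.25):
    uniform = lambda x, a=a: 1 / a if 0 <= x <= a else 0.0
    print(a, differential_entropy(uniform, 0, a), math.log2(a))
# For a < 1 the differential entropy is negative, matching the NOTE above.
```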