Information Theory (資訊理論)
Instructor: 陳建源
Office: 法 401
Course website
Ch2: Basic Concepts
2.1 Self-information

Let S be a system of events $E_1, E_2, \ldots, E_n$ with probabilities $p_k = P(E_k)$.

Def: The self-information of the event $E_k$ is written $I(E_k)$:

$$I(E_k) = \log \frac{1}{P(E_k)} = -\log P(E_k)$$

The base of the logarithm: 2 (log) or e (ln); units: bit or nat, respectively.
Properties:
when $P(E_k) = 1$, then $I(E_k) = 0$;
when $P(E_k) = 1/2$, then $I(E_k) = 1$ bit;
when $P(E_k) < P(E_j)$, then $I(E_k) > I(E_j)$;
when $P(E_k) \to 0$, then $I(E_k) \to \infty$.
The smaller the probability, the larger the self-information.
Ex1. A letter is chosen at random from the English alphabet: $I = \log_2 26 \approx 4.70$ bits.

Ex2. A binary number of m digits is chosen at random: each outcome has probability $2^{-m}$, so $I = \log_2 2^m = m$ bits.
Ex3. 64 points are arranged in a square (8×8) grid. Let $E_j$ be the event that a point picked at random lies in the j-th column, and $E_k$ the event that it lies in the k-th row. Then $I(E_j) = \log_2 8 = 3$ bits, $I(E_k) = 3$ bits, and $I(E_j \cap E_k) = \log_2 64 = 6$ bits $= I(E_j) + I(E_k)$. Why? Because the column event and the row event are statistically independent.
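A quick numerical check of the three examples (a minimal sketch; the helper name `self_information` is ours, not the course's):

```python
import math

def self_information(p: float, base: float = 2.0) -> float:
    """Self-information I(E) = -log_base P(E); bits for base 2."""
    return -math.log(p, base)

# Ex1: a random letter from the 26-letter English alphabet
print(self_information(1 / 26))        # ~4.70 bits

# Ex2: a random m-digit binary number (m = 5 here)
m = 5
print(self_information(2 ** -m))       # exactly m = 5.0 bits

# Ex3: an 8x8 grid; column and row picks are independent,
# so their self-informations add: 3 + 3 = 6 bits
print(self_information(1 / 8) + self_information(1 / 8))   # 6.0
print(self_information(1 / 64))                            # 6.0
```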
2.2 Entropy

Let S be the system with events $E_1, \ldots, E_n$, the associated probabilities being $p_1, \ldots, p_n$. For a function $f: E_k \mapsto f_k$, let $E(f)$ be the expectation (average, mean) of f:

$$E(f) = \sum_{k=1}^{n} p_k f_k$$
Def: The entropy of S, called H(S), is the average of the self-information:

$$H(S) = E(I) = \sum_{k=1}^{n} p_k \log \frac{1}{p_k}$$

Self-information of an event increases as its uncertainty grows. Observation: the minimum value of H(S) is 0, attained when some $p_k = 1$ (certainty). But what is the maximum?
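A direct transcription of the definition (a sketch; `entropy` is our name, and zero-probability terms are skipped since $p \log(1/p) \to 0$ as $p \to 0$):

```python
import math

def entropy(probs, base: float = 2.0) -> float:
    """H(S) = sum_k p_k * log(1/p_k); terms with p_k == 0 contribute 0."""
    return sum(p * math.log(1 / p, base) for p in probs if p > 0)

print(entropy([1.0, 0.0]))        # 0.0 -> certainty
print(entropy([0.5, 0.5]))        # 1.0 bit
print(entropy([0.25] * 4))        # 2.0 bits, the maximum for n = 4
```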
Thm: $0 \le H(S) \le \log n$, with equality on the right only when $p_k = 1/n$ for every k.

Thm 2.2: For $x > 0$, $\ln x \le x - 1$, with equality only when $x = 1$.

Proof: Assume that $p_k \ne 0$. Applying Thm 2.2 with $x = 1/(n p_k)$,

$$H(S) - \log n = \sum_{k=1}^{n} p_k \log \frac{1}{n p_k} \le \log e \sum_{k=1}^{n} p_k \left(\frac{1}{n p_k} - 1\right) = \log e \left(\sum_{k=1}^{n} \frac{1}{n} - \sum_{k=1}^{n} p_k\right) = 0,$$

with equality only when $n p_k = 1$, i.e. $p_k = 1/n$, for every k.
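A numeric sanity check of the bound $H(S) \le \log n$ (a sketch reusing the `entropy` helper above; the random-sample test is our own device):

```python
import random

# random distributions on n = 4 outcomes never beat the uniform one
n = 4
for _ in range(5):
    w = [random.random() for _ in range(n)]
    p = [x / sum(w) for x in w]
    assert entropy(p) <= entropy([1 / n] * n) + 1e-12
print("H(S) <= log n holds on all random samples")
```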
Exercise: In the system S the probabilities $p_1$ and $p_2$, where $p_2 > p_1$, are replaced by $p_1 + \varepsilon$ and $p_2 - \varepsilon$ respectively, under the proviso $0 < 2\varepsilon < p_2 - p_1$. Prove that H(S) is increased.

We know that entropy H(S) can be viewed as a measure of _____ about S. Please list 3 items for this blank: information, uncertainty, randomness.
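A quick empirical check of the exercise (a sketch under the stated proviso; the specific numbers are our choice, and `entropy` is the helper above):

```python
# moving probability mass from the larger p2 toward the smaller p1,
# without crossing the midpoint, raises the entropy
p1, p2, rest = 0.1, 0.5, [0.4]
for eps in (0.05, 0.10, 0.15):          # all satisfy 0 < 2*eps < p2 - p1
    assert 0 < 2 * eps < p2 - p1
    before = entropy([p1, p2] + rest)
    after = entropy([p1 + eps, p2 - eps] + rest)
    print(f"eps={eps}: H before={before:.4f}, after={after:.4f}")
    assert after > before
```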
2.3 Mutual information

Let $S_1$ be the system with events $E_1, \ldots, E_n$, the associated probabilities being $p_1, \ldots, p_n$. Let $S_2$ be the system with events $F_1, \ldots, F_m$, the associated probabilities being $q_1, \ldots, q_m$.
Two systems $S_1$ and $S_2$ are described jointly by the probabilities $p_{jk} = P(E_j \cap F_k)$, satisfying the relations

$$\sum_{k=1}^{m} p_{jk} = p_j, \qquad \sum_{j=1}^{n} p_{jk} = q_k, \qquad \sum_{j=1}^{n} \sum_{k=1}^{m} p_{jk} = 1.$$
Conditional probability: $P(E_j \mid F_k) = \dfrac{p_{jk}}{q_k}$.

Conditional self-information: $I(E_j \mid F_k) = \log \dfrac{1}{P(E_j \mid F_k)}$.

Mutual information: $I(E_j; F_k) = \log \dfrac{P(E_j \mid F_k)}{P(E_j)} = \log \dfrac{p_{jk}}{p_j q_k}$.

NOTE: $I(E_j; F_k) = I(F_k; E_j)$ (symmetry).
Conditional entropy: $H(S_1 \mid S_2) = \sum_{j}\sum_{k} p_{jk} \log \dfrac{1}{P(E_j \mid F_k)}$.

Mutual information of the systems: $I(S_1; S_2) = \sum_{j}\sum_{k} p_{jk} \log \dfrac{p_{jk}}{p_j q_k}$.
Conditional self-information and mutual information: $I(E_j; F_k) = I(E_j) - I(E_j \mid F_k)$. If $E_j$ and $F_k$ are statistically independent, then $P(E_j \mid F_k) = P(E_j)$ and $I(E_j; F_k) = 0$.
Joint entropy: $H(S_1, S_2) = \sum_{j}\sum_{k} p_{jk} \log \dfrac{1}{p_{jk}}$.

Joint entropy and conditional entropy: $H(S_1, S_2) = H(S_2) + H(S_1 \mid S_2) = H(S_1) + H(S_2 \mid S_1)$.
Mutual information and conditional entropy:

$$I(S_1; S_2) = H(S_1) - H(S_1 \mid S_2) = H(S_2) - H(S_2 \mid S_1) = H(S_1) + H(S_2) - H(S_1, S_2).$$
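These identities are easy to verify numerically. A sketch with an arbitrary 2×2 joint distribution of our choosing, reusing the `entropy` helper:

```python
import math

# joint probabilities p_jk (rows: E_j of S1, columns: F_k of S2)
P = [[0.30, 0.20],
     [0.10, 0.40]]

p = [sum(row) for row in P]                               # marginals p_j
q = [sum(P[j][k] for j in range(2)) for k in range(2)]    # marginals q_k

# I(S1;S2) straight from the definition
I = sum(P[j][k] * math.log2(P[j][k] / (p[j] * q[k]))
        for j in range(2) for k in range(2))

H1 = entropy(p)
H2 = entropy(q)
H12 = entropy([P[j][k] for j in range(2) for k in range(2)])

# the identity I(S1;S2) = H(S1) + H(S2) - H(S1,S2)
assert abs(I - (H1 + H2 - H12)) < 1e-12
print(f"I(S1;S2) = {I:.4f} bits")
```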
Thm: The mutual information of two systems cannot exceed the sum of their separate entropies: $I(S_1; S_2) \le H(S_1) + H(S_2)$.
Systems' independence: $S_1$ and $S_2$ are statistically independent if $p_{jk} = p_j q_k$ for all j, k. The joint entropy of two statistically independent systems is the sum of their separate entropies: if $S_1$ and $S_2$ are statistically independent, then $H(S_1, S_2) = H(S_1) + H(S_2)$.
Thm: $H(S_1, S_2) \le H(S_1) + H(S_2)$, with equality only if $S_1$ and $S_2$ are statistically independent.

Proof: Assume that $p_{jk} \ne 0$. By Thm 2.2 with $x = p_j q_k / p_{jk}$,

$$H(S_1, S_2) - H(S_1) - H(S_2) = \sum_{j}\sum_{k} p_{jk} \log \frac{p_j q_k}{p_{jk}} \le \log e \sum_{j}\sum_{k} p_{jk} \left(\frac{p_j q_k}{p_{jk}} - 1\right) = 0,$$

with equality only when $p_{jk} = p_j q_k$ for all j, k.
Thm: $I(S_1; S_2) \ge 0$, with equality only if $S_1$ and $S_2$ are statistically independent.

Proof: $I(S_1; S_2) = H(S_1) + H(S_2) - H(S_1, S_2) \ge 0$ by the previous theorem, with the same equality condition.
Ex: A binary symmetric channel with crossover probability ε. Let $S_1$ be the input, $E_0 = 0$, $E_1 = 1$, and $S_2$ be the output, $F_0 = 0$, $F_1 = 1$, so that $P(F_1 \mid E_0) = P(F_0 \mid E_1) = \varepsilon$ and $P(F_0 \mid E_0) = P(F_1 \mid E_1) = 1 - \varepsilon$.
Assume that $P(E_0) = p$ and $P(E_1) = 1 - p$. Then the joint probabilities are

$$p_{00} = p(1-\varepsilon), \quad p_{01} = p\varepsilon, \quad p_{10} = (1-p)\varepsilon, \quad p_{11} = (1-p)(1-\varepsilon).$$

Compute the output probabilities:

$$q_0 = P(F_0) = p(1-\varepsilon) + (1-p)\varepsilon, \qquad q_1 = P(F_1) = p\varepsilon + (1-p)(1-\varepsilon).$$

If $p = \tfrac{1}{2}$, then $q_0 = q_1 = \tfrac{1}{2}$.
Compute the mutual information:

$$I(S_1; S_2) = \sum_{j}\sum_{k} p_{jk} \log \frac{p_{jk}}{p_j q_k} = H(S_2) - H(S_2 \mid S_1).$$

Since each input digit is flipped with probability ε, $H(S_2 \mid S_1) = H(\varepsilon) = \varepsilon \log \frac{1}{\varepsilon} + (1-\varepsilon) \log \frac{1}{1-\varepsilon}$, the binary entropy function. For $p = \tfrac{1}{2}$, $H(S_2) = 1$ bit, so

$$I(S_1; S_2) = 1 - H(\varepsilon).$$

For $\varepsilon = 0$ or $\varepsilon = 1$ the channel is deterministic and $I = 1$ bit; for $\varepsilon = \tfrac{1}{2}$ the output is independent of the input and $I = 0$.
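A sketch computing $I(S_1; S_2)$ for the BSC directly from the joint distribution (the input probability p and crossover ε are parameters we pick; `bsc_mutual_information` is our name):

```python
import math

def bsc_mutual_information(p: float, eps: float) -> float:
    """I(S1;S2) in bits for a BSC with input P(E0)=p and crossover eps."""
    joint = {(0, 0): p * (1 - eps), (0, 1): p * eps,
             (1, 0): (1 - p) * eps, (1, 1): (1 - p) * (1 - eps)}
    px = {0: p, 1: 1 - p}
    qy = {y: sum(joint[(x, y)] for x in (0, 1)) for y in (0, 1)}
    return sum(pr * math.log2(pr / (px[x] * qy[y]))
               for (x, y), pr in joint.items() if pr > 0)

print(bsc_mutual_information(0.5, 0.0))   # 1.0 bit: noiseless channel
print(bsc_mutual_information(0.5, 0.1))   # ~0.531 = 1 - H(0.1)
print(bsc_mutual_information(0.5, 0.5))   # 0.0: output independent of input
```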
Ex: The following messages may be sent over a binary symmetric channel with crossover probability ε, all equally probable at the input. What is the mutual information between $M_1$ and the first output digit being 0? What additional mutual information is conveyed by the knowledge that the second output digit is also 0?
For the output 00, the chain rule splits the information between the two digits:

$$I(M_1; \text{first digit } 0, \text{second digit } 0) = I(M_1; \text{first digit } 0) + I(M_1; \text{second digit } 0 \mid \text{first digit } 0).$$

The extra mutual information is the conditional term.
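A sketch of the computation under an assumption of ours: we take the four equally probable two-digit messages 00, 01, 10, 11 (a hypothetical message set, not given in the source) with $M_1 = 00$, and the channel flips each digit independently.

```python
import math

eps = 0.1
messages = ["00", "01", "10", "11"]   # ASSUMED message set, not in the source

def p_out_given_in(out: str, msg: str) -> float:
    """BSC acting independently on each digit with crossover eps."""
    pr = 1.0
    for o, i in zip(out, msg):
        pr *= (1 - eps) if o == i else eps
    return pr

# P(first output digit = 0) and P(output = 00), averaged over messages
p_y1 = sum(0.25 * p_out_given_in("0" + b, m) for m in messages for b in "01")
p_y1y2 = sum(0.25 * p_out_given_in("00", m) for m in messages)

# mutual information between M1 = "00" and the observations, in bits
i_first = math.log2(p_out_given_in("0", "0") / p_y1)        # first digit only
i_both = math.log2(p_out_given_in("00", "00") / p_y1y2)     # both digits
print(f"I(M1; y1=0)     = {i_first:.4f}")
print(f"extra from y2=0 = {i_both - i_first:.4f}")
```

With independent digits the extra information equals the first-digit information, $\log_2 2(1-\varepsilon)$ again, so both prints agree.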
2.4 Data processing theorem

Topics: data processing theorem; convexity theorem.

Data processing theorem: If $S_1$ and $S_3$ are statistically independent when conditioned on $S_2$, then

$$I(S_1; S_3) \le I(S_1; S_2) \quad \text{and} \quad I(S_1; S_3) \le I(S_2; S_3).$$

Processing cannot create information: a later stage of the system can only lose, never gain, information about the input.

Proof: Expand $I(S_1; S_2, S_3)$ two ways by the chain rule:

$$I(S_1; S_2, S_3) = I(S_1; S_2) + I(S_1; S_3 \mid S_2) = I(S_1; S_3) + I(S_1; S_2 \mid S_3).$$

Conditional independence gives $I(S_1; S_3 \mid S_2) = 0$, and $I(S_1; S_2 \mid S_3) \ge 0$, so $I(S_1; S_3) \le I(S_1; S_2)$; the other inequality follows by symmetry.
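A numeric check of the theorem on a small Markov chain $S_1 \to S_2 \to S_3$ (a sketch; the transition matrices are arbitrary choices of ours):

```python
import math

def mi(joint):
    """I between the two coordinates of a joint distribution {(a, b): p}."""
    pa, pb = {}, {}
    for (a, b), p in joint.items():
        pa[a] = pa.get(a, 0) + p
        pb[b] = pb.get(b, 0) + p
    return sum(p * math.log2(p / (pa[a] * pb[b]))
               for (a, b), p in joint.items() if p > 0)

p1 = {0: 0.3, 1: 0.7}                              # distribution of S1
A = {0: {0: 0.9, 1: 0.1}, 1: {0: 0.2, 1: 0.8}}     # P(S2 | S1)
B = {0: {0: 0.7, 1: 0.3}, 1: {0: 0.4, 1: 0.6}}     # P(S3 | S2)

j12 = {(x, y): p1[x] * A[x][y] for x in (0, 1) for y in (0, 1)}
j13 = {(x, z): sum(p1[x] * A[x][y] * B[y][z] for y in (0, 1))
       for x in (0, 1) for z in (0, 1)}

assert mi(j13) <= mi(j12) + 1e-12                  # I(S1;S3) <= I(S1;S2)
print(f"I(S1;S2)={mi(j12):.4f}  I(S1;S3)={mi(j13):.4f}")
```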
2.5 Uniqueness theorem

Def: Let $f(p_1, \ldots, p_n)$ be a continuous function of its arguments, in which $p_k \ge 0$ and $\sum_k p_k = 1$, satisfying:
(a) f takes its largest value at $p_k = 1/n$;
(b) f is unaltered if an impossible event is added to the system: $f(p_1, \ldots, p_n, 0) = f(p_1, \ldots, p_n)$;
(c) the grouping property, stated here in its standard form:

$$f(p_1, \ldots, p_n) = f(p_1 + p_2, p_3, \ldots, p_n) + (p_1 + p_2)\, f\!\left(\frac{p_1}{p_1 + p_2}, \frac{p_2}{p_1 + p_2}\right).$$
Uniqueness theorem: The only function satisfying (a), (b), (c) is

$$f(p_1, \ldots, p_n) = C \sum_{k=1}^{n} p_k \log \frac{1}{p_k} = C \cdot H(S)$$

for a positive constant C.
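Entropy does satisfy the conditions; a quick numeric check of (b) and the grouping property (c) with an arbitrary distribution of ours, reusing the `entropy` helper:

```python
p = [0.5, 0.2, 0.3]

# (b) adding an impossible event changes nothing
assert abs(entropy(p) - entropy(p + [0.0])) < 1e-12

# (c) grouping the first two events
s = p[0] + p[1]
grouped = entropy([s, p[2]]) + s * entropy([p[0] / s, p[1] / s])
assert abs(entropy(p) - grouped) < 1e-12
print("conditions (b) and (c) hold for H")
```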