Information Theory (資訊理論)
Instructor: 陳建源. Office: 法 401.
Ch2: Basic Concepts

Ch2: Basic Concepts, 2.1 Self-information

Let S be a system of events E_1, E_2, ..., E_n with probabilities p_k = P(E_k).

Def: The self-information of the event E_k is written I(E_k):

    I(E_k) = -log p_k,

in which the base of the logarithm is 2 (log, unit: bit) or e (ln, unit: nat).
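As a quick companion to the definition (ours, not the slides'), a minimal Python sketch of self-information in bits and nats:

    import math

    def self_information(p, base=2):
        """I(E) = -log p for an event of probability p; base 2 gives bits, base e gives nats."""
        if not 0 < p <= 1:
            raise ValueError("p must be in (0, 1]")
        return -math.log(p, base)

    print(self_information(0.5))           # 1.0 bit: a fair coin flip
    print(self_information(0.5, math.e))   # about 0.693 nat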

Ch2: Basic Concepts, 2.1 Self-information

When p_k = 1, then I(E_k) = 0; when p_k = 1/2, then I(E_k) = 1 bit; when p_k → 0, then I(E_k) → ∞. The smaller p_k is, the larger I(E_k) becomes.

Ch2: Basic Concepts, 2.1 Self-information

Ex1. A letter is chosen at random from the English alphabet: p = 1/26, so I = log 26 ≈ 4.70 bits.

Ex2. A binary number of m digits is chosen at random: p = 2^(-m), so I = m bits.
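A quick numeric check of both answers (m = 5 is our arbitrary choice for Ex2):

    import math
    print(math.log2(26))      # Ex1: about 4.70 bits
    print(math.log2(2 ** 5))  # Ex2 with m = 5: exactly 5.0 bits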

Ch2: Basic Concepts, 2.1 Self-information

Ex3. 64 points are arranged in an 8 × 8 square grid. Let E_j be the event that a point picked at random lies in the j-th column, and E_k the event that it lies in the k-th row. Then I(E_j) = I(E_k) = log 8 = 3 bits, while I(E_j ∧ E_k) = log 64 = 6 bits = I(E_j) + I(E_k). Why? The column and row events are statistically independent, so their probabilities multiply and their self-informations add.
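A one-line verification of the additivity claim, using the grid's probabilities:

    import math
    col = -math.log2(1 / 8)     # I(E_j): the point lies in a given column
    row = -math.log2(1 / 8)     # I(E_k): the point lies in a given row
    joint = -math.log2(1 / 64)  # I(E_j and E_k): a single grid point
    assert joint == col + row   # independent events: self-information adds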

Ch2: Basic Concepts, 2.2 Entropy

Let S be the system with events E_1, ..., E_n, the associated probabilities being p_1, ..., p_n. For f: E_k → f_k, let E(f) be the expectation (average, mean) of f:

    E(f) = Σ_k p_k f_k.

Ch2: Basic Concepts, 2.2 Entropy

Self-information of an event increases as its uncertainty grows. Def: The entropy of S, written H(S), is the average of the self-information:

    H(S) = E(I) = -Σ_k p_k log p_k.

Observation: the minimum value of H(S) is 0, attained when some p_k = 1, i.e., the outcome is a certainty. But what is the maximum?
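A minimal sketch of the definition in Python (the helper name entropy is ours):

    import math

    def entropy(probs, base=2):
        """H(S) = -sum of p_k log p_k, skipping zero-probability events."""
        return -sum(p * math.log(p, base) for p in probs if p > 0)

    print(entropy([1.0]))        # zero: a certainty carries no uncertainty
    print(entropy([0.5, 0.5]))   # 1.0 bit: a fair coin
    print(entropy([0.9, 0.1]))   # about 0.469 bits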

Ch2: Basic Concepts, 2.2 Entropy

Thm: H(S) ≤ log n, with equality only when p_k = 1/n for all k.

Proof: via the inequality of Thm 2.2 below.

Ch2: Basic Concepts, 2.2 Entropy

Thm 2.2: For x > 0, ln x ≤ x - 1, with equality only when x = 1. (The function ln x - (x - 1) has derivative 1/x - 1, so it increases for x < 1, decreases for x > 1, and peaks at 0 when x = 1.)

Assume that p_k ≠ 0 and apply Thm 2.2 with x = 1/(n p_k).

Ch2: Basic Concepts, 2.2 Entropy

H(S) - log n = Σ_k p_k log(1/(n p_k)) ≤ (log e) Σ_k p_k (1/(n p_k) - 1) = (log e)(Σ_k 1/n - Σ_k p_k) = 0, with equality only when n p_k = 1 for every k, i.e., p_k = 1/n.

Ch2: Basic Concepts, 2.2 Entropy

Exercise: In the system S the probabilities p_1 and p_2, where p_2 > p_1, are replaced by p_1 + ε and p_2 - ε respectively, under the proviso 0 < 2ε < p_2 - p_1. Prove that H(S) is increased.

Exercise: We know that entropy H(S) can be viewed as a measure of _____ about S. Please list 3 items for this blank: information, uncertainty, randomness.
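A numeric check of the first exercise's claim (the distribution and ε below are our own choices satisfying 0 < 2ε < p_2 - p_1):

    import math

    def H(probs):
        return -sum(q * math.log2(q) for q in probs if q > 0)

    p1, p2, eps = 0.2, 0.6, 0.1
    before = H([p1, p2, 0.2])
    after = H([p1 + eps, p2 - eps, 0.2])
    assert after > before  # evening out unequal probabilities raises entropy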

Ch2: Basic Concepts, 2.3 Mutual information

Let S_1 be the system with events E_1, ..., E_n, the associated probabilities being p_1, ..., p_n. Let S_2 be the system with events F_1, ..., F_m, the associated probabilities being q_1, ..., q_m.

Ch2: Basic Concepts, 2.3 Mutual information

Two systems S_1 and S_2 are described jointly by the probabilities p_jk = P(E_j ∧ F_k), satisfying the relations

    Σ_k p_jk = p_j,   Σ_j p_jk = q_k,   Σ_j Σ_k p_jk = 1.

Ch2: Basic Concepts, 2.3 Mutual information

Conditional probability: P(E_j | F_k) = p_jk / q_k.
Conditional self-information: I(E_j | F_k) = -log P(E_j | F_k).
Mutual information: I(E_j ; F_k) = log [P(E_j | F_k) / p_j] = log [p_jk / (p_j q_k)].

NOTE: I(E_j ; F_k) = I(F_k ; E_j), and unlike self-information it can be negative.

Ch2: Basic Concepts, 2.3 Mutual information

Conditional entropy: H(S_1 | S_2) = -Σ_j Σ_k p_jk log P(E_j | F_k).
Mutual information of the systems: I(S_1 ; S_2) = Σ_j Σ_k p_jk log [p_jk / (p_j q_k)], the average of the event-level mutual information.
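A minimal sketch of these definitions over an explicit joint table (the 2 × 2 numbers are purely illustrative, not from the slides):

    import math

    # p[j][k] = P(E_j and F_k): an arbitrary dependent joint distribution
    p = [[0.30, 0.20],
         [0.10, 0.40]]

    p1 = [sum(row) for row in p]        # marginals of S_1
    p2 = [sum(col) for col in zip(*p)]  # marginals of S_2

    def H(probs):
        return -sum(q * math.log2(q) for q in probs if q > 0)

    H1, H2 = H(p1), H(p2)
    H12 = H([q for row in p for q in row])  # joint entropy H(S_1, S_2)

    H1_given_2 = H12 - H2  # H(S_1|S_2) = H(S_1,S_2) - H(S_2)
    I = H1 + H2 - H12      # I(S_1;S_2)
    print(H1_given_2, I)   # about 0.875 and 0.125 bits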

Ch2: Basic Concepts, 2.3 Mutual information

Conditional self-information and mutual information are related by

    I(E_j ; F_k) = I(E_j) - I(E_j | F_k),

and I(E_j ; F_k) = 0 if E_j and F_k are statistically independent.

Ch2: Basic Concepts, 2.3 Mutual information

Joint entropy: H(S_1, S_2) = -Σ_j Σ_k p_jk log p_jk.
Joint entropy and conditional entropy: H(S_1, S_2) = H(S_2) + H(S_1 | S_2) = H(S_1) + H(S_2 | S_1).

Ch2: Basic Concepts, 2.3 Mutual information

Mutual information and conditional entropy:

    I(S_1 ; S_2) = H(S_1) - H(S_1 | S_2) = H(S_2) - H(S_2 | S_1) = H(S_1) + H(S_2) - H(S_1, S_2).

Ch2: Basic Concepts, 2.3 Mutual information

Thm: The mutual information of two systems cannot exceed the sum of their separate entropies: I(S_1 ; S_2) ≤ H(S_1) + H(S_2).

Ch2: Basic Concepts, 2.3 Mutual information

If S_1 and S_2 are statistically independent, then p_jk = p_j q_k for all j, k, and the joint entropy of the two statistically independent systems is the sum of their separate entropies: H(S_1, S_2) = H(S_1) + H(S_2).

Ch2: Basic Concepts, 2.3 Mutual information

Thm: H(S_1, S_2) ≤ H(S_1) + H(S_2), with equality only if S_1 and S_2 are statistically independent.

Proof: Assume that p_jk ≠ 0 and apply ln x ≤ x - 1 with x = p_j q_k / p_jk.

Ch2: Basic Concepts, 2.3 Mutual information

Proof (continued): H(S_1) + H(S_2) - H(S_1, S_2) = Σ_j Σ_k p_jk log [p_jk / (p_j q_k)] ≥ 0, with equality only if p_jk = p_j q_k for all j, k, i.e., only if S_1 and S_2 are statistically independent.
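A numeric check of the theorem and its equality case (illustrative numbers again):

    import math

    def H(probs):
        return -sum(q * math.log2(q) for q in probs if q > 0)

    joint = [0.30, 0.20, 0.10, 0.40]  # a dependent 2x2 joint table, row-major
    p1 = [0.50, 0.50]                 # its S_1 marginals
    p2 = [0.40, 0.60]                 # its S_2 marginals

    assert H(joint) < H(p1) + H(p2)   # strict: these systems are dependent

    indep = [a * b for a in p1 for b in p2]  # the independent coupling
    assert abs(H(indep) - (H(p1) + H(p2))) < 1e-12  # equality when independent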

Ch2: Basic Concepts, 2.3 Mutual information

Ex: A binary symmetric channel with crossover probability ε. Let S_1 be the input with events E_0 = 0, E_1 = 1, and S_2 be the output with events F_0 = 0, F_1 = 1, so that P(F_1 | E_0) = P(F_0 | E_1) = ε and P(F_0 | E_0) = P(F_1 | E_1) = 1 - ε.

Ch2: Basic Concepts, 2.3 Mutual information

Assume that P(E_0) = p and P(E_1) = 1 - p. Then p_00 = p(1 - ε), p_01 = pε, p_10 = (1 - p)ε, p_11 = (1 - p)(1 - ε).

Ch2: Basic Concepts, 2.3 Mutual information

Compute the output probabilities. Then q_0 = p(1 - ε) + (1 - p)ε and q_1 = pε + (1 - p)(1 - ε). If p = 1/2, then q_0 = q_1 = 1/2.

Ch2: Basic Concepts, 2.3 Mutual information

Compute the mutual information: I(S_1 ; S_2) = H(S_2) - H(S_2 | S_1), where H(S_2 | S_1) = -ε log ε - (1 - ε) log(1 - ε), since the channel noise is independent of the input.

Ch2: Basic Concepts, 2.3 Mutual information

Compute the mutual information (continued):

    I(S_1 ; S_2) = -q_0 log q_0 - q_1 log q_1 + ε log ε + (1 - ε) log(1 - ε),

which for p = 1/2 reduces to 1 + ε log ε + (1 - ε) log(1 - ε) bits.
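The whole BSC computation, collected into one hedged Python sketch (the helper h2 and the parameter names are ours):

    import math

    def h2(p):
        """Binary entropy function, in bits."""
        return 0.0 if p in (0.0, 1.0) else -p * math.log2(p) - (1 - p) * math.log2(1 - p)

    def bsc_mutual_information(p, eps):
        """I(S_1;S_2) = H(S_2) - H(S_2|S_1) for a BSC with input P(E_0) = p."""
        q0 = p * (1 - eps) + (1 - p) * eps  # output probability P(F_0)
        return h2(q0) - h2(eps)             # the noise term is input-independent

    # Equally probable inputs (p = 1/2): I = 1 - h2(eps)
    print(bsc_mutual_information(0.5, 0.1))  # about 0.531 bits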

Ch2: Basic Concepts, 2.3 Mutual information

Ex: The following messages may be sent over a binary symmetric channel with crossover probability ε, and they are equally probable at the input. What is the mutual information between M_1 and the first output digit being 0? What additional mutual information is conveyed by the knowledge that the second output digit is also 0?

Ch2: Basic Concepts, 2.3 Mutual information

For the output 00, the extra mutual information is I(M_1 ; 00) - I(M_1 ; 0): the information the second digit adds beyond the first.

Ch2: Basic Concepts, 2.4 Data processing theorem

Data processing theorem: If S_1 and S_3 are statistically independent when conditioned on S_2, then I(S_1 ; S_3) ≤ I(S_1 ; S_2).

Convexity theorem: If S_1 and S_3 are statistically independent when conditioned on S_2, then I(S_1 ; S_3) ≤ I(S_2 ; S_3).

Ch2: Basic Concepts, 2.4 Data processing theorem

Data processing theorem: If S_1 and S_3 are statistically independent when conditioned on S_2, then I(S_1 ; S_3) ≤ I(S_1 ; S_2).

Proof: Conditional independence gives I(S_1 ; S_3 | S_2) = 0, so I(S_1 ; S_3) ≤ I(S_1 ; S_3) + I(S_1 ; S_2 | S_3) = I(S_1 ; (S_2, S_3)) = I(S_1 ; S_2) + I(S_1 ; S_3 | S_2) = I(S_1 ; S_2).
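A numerical illustration of the theorem: cascade two BSCs, so that S_1 → S_2 → S_3 is conditionally independent as required (all parameter values below are our own):

    import math

    def h2(p):
        return 0.0 if p in (0.0, 1.0) else -p * math.log2(p) - (1 - p) * math.log2(1 - p)

    def bsc_info(p, eps):
        """I(input; output) of a BSC with input bias p and crossover eps."""
        q0 = p * (1 - eps) + (1 - p) * eps
        return h2(q0) - h2(eps)

    p, e1, e2 = 0.5, 0.1, 0.2
    # Two cascaded BSCs form another BSC with crossover e1(1-e2) + (1-e1)e2.
    e13 = e1 * (1 - e2) + (1 - e1) * e2
    assert bsc_info(p, e13) <= bsc_info(p, e1)  # I(S_1;S_3) <= I(S_1;S_2)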

Ch2: Basic Concepts, 2.5 Uniqueness theorem

Def: Let f(p_1, ..., p_n), with p_k ≥ 0 and Σ_k p_k = 1, be a continuous function of its arguments satisfying:
(a) f takes its largest value at p_k = 1/n;
(b) f is unaltered if an impossible event is added to the system;
(c) f is additive over compound experiments (the grouping property): the value of f for a two-stage experiment is f of the first stage plus the probability-weighted average of f over the second stage.

Ch2: Basic Concepts, 2.5 Uniqueness theorem

Uniqueness theorem: Any f satisfying (a), (b), (c) has the form

    f(p_1, ..., p_n) = C Σ_k p_k log(1/p_k) = C · H(S)

for a positive constant C.

Proof:
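Separately from the proof, a small check that H itself satisfies axioms (a) and (b) numerically, so any C · H does too (the trial distributions are our own):

    import math

    def H(probs):
        return -sum(q * math.log2(q) for q in probs if q > 0)

    # (a) the uniform distribution beats a few non-uniform rivals
    assert H([0.25] * 4) > max(H([0.4, 0.3, 0.2, 0.1]), H([0.7, 0.1, 0.1, 0.1]))
    # (b) adding an impossible event leaves H unaltered
    assert H([0.5, 0.5]) == H([0.5, 0.5, 0.0])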