Basic Concepts of Information Theory: Entropy for Two-dimensional Discrete Finite Probability Schemes. Conditional Entropy. Communication Network. Noise Characteristics of a Communication Channel.


1 Basic Concepts of Information Theory: Entropy for Two-dimensional Discrete Finite Probability Schemes. Conditional Entropy. Communication Network. Noise Characteristics of a Communication Channel.

2 Entropy. Basic Properties
Continuity: if the probabilities of the occurrence of events are changed slightly, the entropy changes slightly as well.
Symmetry: the entropy is invariant under any reordering of its arguments: $H(p_1, p_2, \ldots, p_n) = H(p_{i_1}, p_{i_2}, \ldots, p_{i_n})$ for every permutation $(i_1, \ldots, i_n)$.
Extremal property: when all the events are equally likely, the average uncertainty has the largest value: $H(p_1, \ldots, p_n) \le H(1/n, \ldots, 1/n) = \log n$.
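A quick numeric check of the extremal property for $n = 2$, with illustrative numbers:
$H\left(\tfrac12, \tfrac12\right) = 1 \text{ bit}, \qquad H(0.9,\, 0.1) = -(0.9 \log_2 0.9 + 0.1 \log_2 0.1) \approx 0.469 \text{ bits} < 1 = \log_2 2.$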

3 Entropy. Basic Properties
Additivity. Let $H(p_1, p_2, \ldots, p_n)$ be the entropy associated with a complete set of events $E_1, E_2, \ldots, E_n$. Let the event $E_n$ be divided into $k$ disjoint subsets $F_1, \ldots, F_k$ with probabilities $q_1, \ldots, q_k$, so that $p_n = q_1 + \cdots + q_k$. Thus
$H(p_1, \ldots, p_{n-1}, q_1, \ldots, q_k) = H(p_1, \ldots, p_n) + p_n H\left(\frac{q_1}{p_n}, \ldots, \frac{q_k}{p_n}\right),$
where the second term is the uncertainty inside $E_n$, weighted by the probability of $E_n$.
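A worked instance of additivity with invented numbers: split an event of probability $1/2$ into two equally likely subsets ($k = 2$, $q_1 = q_2 = 1/4$):
$H\left(\tfrac12, \tfrac14, \tfrac14\right) = H\left(\tfrac12, \tfrac12\right) + \tfrac12\, H\left(\tfrac12, \tfrac12\right) = 1 + \tfrac12 = 1.5 \text{ bits},$
which agrees with the direct computation $-\left(\tfrac12 \log_2 \tfrac12 + 2 \cdot \tfrac14 \log_2 \tfrac14\right) = 0.5 + 1.0 = 1.5$ bits.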

4 Entropy. Basic Properties
In general, $H(p_1, \ldots, p_n) = -\sum_{i=1}^{n} p_i \log p_i$ is continuous in $p_i$ for all $0 \le p_i \le 1$.

5 ENTROPY FOR TWO-DIMENSIONAL DISCRETE FINITE PROBABILITY SCHEMES

6 Entropy for Two-dimensional Discrete Finite Probability Schemes
The two-dimensional probability scheme provides the simplest mathematical model for a communication system with a transmitter and a receiver. Consider two finite discrete sample spaces Ω_1 (the transmitter space) and Ω_2 (the receiver space), and their product space Ω.

7 Entropy for Two-dimensional Discrete Finite Probability Schemes
In Ω_1 and Ω_2 we select complete sets of events $E_1, E_2, \ldots, E_n$ and $F_1, F_2, \ldots, F_m$, respectively. Each event $E_i$ may occur in conjunction with any event $F_j$. Thus, for the product space Ω = Ω_1 × Ω_2, we obtain the following complete set of events:
$E_1 \cap F_1,\; E_1 \cap F_2,\; \ldots,\; E_n \cap F_m.$

8 Entropy for Two-dimensional Discrete Finite Probability Schemes
We may consider the following three complete sets of probability schemes: the joint scheme $\{E_i \cap F_j\}$, the marginal scheme $\{E_i\}$, and the marginal scheme $\{F_j\}$. Each one of them is, by assumption, a finite complete probability scheme: a set of events together with their probabilities, non-negative and summing to one.

9 Entropy for Two-dimensional Discrete Finite Probability Schemes
The joint probability matrix for the random variables X and Y associated with the spaces Ω_1 and Ω_2:
$[P\{X,Y\}] = \begin{pmatrix} p(x_1,y_1) & p(x_1,y_2) & \cdots & p(x_1,y_m) \\ p(x_2,y_1) & p(x_2,y_2) & \cdots & p(x_2,y_m) \\ \vdots & \vdots & & \vdots \\ p(x_n,y_1) & p(x_n,y_2) & \cdots & p(x_n,y_m) \end{pmatrix}$
Respectively, the marginal probabilities are $p(x_i) = \sum_{j=1}^{m} p(x_i,y_j)$ and $p(y_j) = \sum_{i=1}^{n} p(x_i,y_j)$.

10 Entropy for Two-dimensional Discrete Finite Probability Schemes
Complete Probability Scheme | Entropy
{X, Y} (joint)    | $H(X,Y) = -\sum_{i=1}^{n} \sum_{j=1}^{m} p(x_i,y_j) \log p(x_i,y_j)$
{X} (marginal)    | $H(X) = -\sum_{i=1}^{n} p(x_i) \log p(x_i)$
{Y} (marginal)    | $H(Y) = -\sum_{j=1}^{m} p(y_j) \log p(y_j)$

11 Entropy for Two-dimensional Discrete Finite Probability Schemes
If all the marginal probabilities $p(x_i)$, $i = 1, \ldots, n$, and $p(y_j)$, $j = 1, \ldots, m$, are known, then the marginal entropies can be expressed according to the entropy definition:
$H(X) = -\sum_{i=1}^{n} p(x_i) \log p(x_i), \qquad H(Y) = -\sum_{j=1}^{m} p(y_j) \log p(y_j).$
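As a minimal numeric sketch of these definitions (the 2×3 joint matrix below is an invented example, not from the slides), the marginals come from row and column sums and the entropies follow directly:

```python
import numpy as np

# Invented joint probability matrix p(x_i, y_j); all entries sum to 1.
P_XY = np.array([[0.30, 0.10, 0.10],
                 [0.05, 0.25, 0.20]])

p_x = P_XY.sum(axis=1)   # marginal p(x_i): sum over y
p_y = P_XY.sum(axis=0)   # marginal p(y_j): sum over x

def entropy(p):
    """Shannon entropy in bits; zero-probability entries contribute nothing."""
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

print(entropy(p_x))           # H(X), marginal entropy of the source
print(entropy(p_y))           # H(Y), marginal entropy at the receiver
print(entropy(P_XY.ravel()))  # H(X,Y), joint entropy of the system
```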

12 Conditional Entropies
Let now an event $F_j$ occur not independently, but in conjunction with one of the events $E_1, E_2, \ldots, E_n$. The conditional probability of $E_i$ given $F_j$ is
$p(x_i \mid y_j) = \frac{p(x_i, y_j)}{p(y_j)}.$

13 Conditional Entropies
Consider the following complete probability scheme: for a fixed $y_j$, the events $E_1, \ldots, E_n$ with the conditional probabilities $p(x_1 \mid y_j), \ldots, p(x_n \mid y_j)$, where $\sum_{i=1}^{n} p(x_i \mid y_j) = 1$. Hence
$H(X \mid y_j) = -\sum_{i=1}^{n} p(x_i \mid y_j) \log p(x_i \mid y_j).$

14 Conditional Entropies
Taking this conditional entropy over all admissible $y_j$, we obtain a measure of the average conditional entropy of the system:
$H(X \mid Y) = \sum_{j=1}^{m} p(y_j) H(X \mid y_j).$
Respectively,
$H(Y \mid X) = \sum_{i=1}^{n} p(x_i) H(Y \mid x_i).$

15 Conditional Entropies
Since $p(y_j)\, p(x_i \mid y_j) = p(x_i, y_j)$, the conditional entropies can finally be written as
$H(X \mid Y) = -\sum_{i=1}^{n} \sum_{j=1}^{m} p(x_i, y_j) \log p(x_i \mid y_j),$
$H(Y \mid X) = -\sum_{i=1}^{n} \sum_{j=1}^{m} p(x_i, y_j) \log p(y_j \mid x_i).$
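A sketch of the final formulas, reusing the same invented joint matrix as above: $p(x_i \mid y_j)$ is obtained by dividing each column by $p(y_j)$, and the double sum runs over the nonzero entries.

```python
import numpy as np

P_XY = np.array([[0.30, 0.10, 0.10],   # same invented joint matrix as above
                 [0.05, 0.25, 0.20]])
p_x = P_XY.sum(axis=1)
p_y = P_XY.sum(axis=0)

def cond_entropy(P_joint, p_cond):
    """H(A|B) = -sum_{i,j} p(a_i,b_j) log2 p(a_i|b_j), with p(a|b) = p(a,b)/p(b).
    p_cond must run along the last axis of P_joint (broadcast column-wise)."""
    P_cond = P_joint / p_cond
    mask = P_joint > 0
    return -np.sum(P_joint[mask] * np.log2(P_cond[mask]))

H_X_given_Y = cond_entropy(P_XY, p_y)      # condition on Y (columns)
H_Y_given_X = cond_entropy(P_XY.T, p_x)    # transpose so X runs along columns
print(H_X_given_Y, H_Y_given_X)
```

As a consistency check, H(X,Y) − H(Y) reproduces H(X|Y) up to floating-point error.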

16 Five Entropies Pertaining to Joint Distribution
Thus we have considered:
- two conditional entropies H(X|Y), H(Y|X);
- two marginal entropies H(X), H(Y);
- the joint entropy H(X,Y).
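These five quantities are tied together by the standard chain-rule identities, which follow directly from the formulas above:
$H(X,Y) = H(X) + H(Y \mid X) = H(Y) + H(X \mid Y),$
and conditioning never increases entropy: $H(X \mid Y) \le H(X)$, $H(Y \mid X) \le H(Y)$.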

17 COMMUNICATION NETWORK. NOISE CHARACTERISTICS OF A CHANNEL

18 Communication Network
Consider a source of communication with a given alphabet. The source is linked to the receiver via a channel. The system may be described by a joint probability matrix, i.e. by giving the probability of the joint occurrence of two symbols, one at the input and another at the output.

19 Communication Network
$x_i$ – a symbol that was sent; $y_j$ – a symbol that was received.
The joint probability matrix:
$[P\{X,Y\}] = \begin{pmatrix} p(x_1,y_1) & \cdots & p(x_1,y_m) \\ \vdots & & \vdots \\ p(x_n,y_1) & \cdots & p(x_n,y_m) \end{pmatrix}$

20 Communication Network: Probability Schemes
There are the following five probability schemes of interest in a product space of the random variables X and Y:
- [P{X,Y}] – joint probability matrix
- [P{X}] – marginal probability matrix of X
- [P{Y}] – marginal probability matrix of Y
- [P{X|Y}] – conditional probability matrix of X given Y
- [P{Y|X}] – conditional probability matrix of Y given X

21 Communication Network: Entropies
The five entropies corresponding to the five probability schemes above have the following interpretation:
- H(X,Y) – average information per pair of transmitted and received characters (the entropy of the system as a whole);
- H(X) – average information per character of the source (the entropy of the source);
- H(Y) – average information per character at the destination (the entropy at the receiver);
- H(Y|X) – average information per received character when it is known which character $x_k$ was transmitted and one of the permissible $y_j$ is received (a measure of information about the receiver when the transmitted character is known);
- H(X|Y) – average information per transmitted character when it is known which character $y_j$ was received; the received character may result from the transmission of any $x_k$ with a given probability (a measure of information about the source when the received character is known).

22 Communication Network: Entropies’ Meaning
H(X) and H(Y) give indications of the probabilistic nature of the transmitter and the receiver, respectively.
H(X,Y) gives the probabilistic nature of the communication channel as a whole.
H(Y|X) gives an indication of the noise (errors) in the channel.
H(X|Y) gives a measure of equivocation (how well one can recover the input content from the output).

23 Communication Network: Derivation of the Noise Characteristics
In general, the joint probability matrix is not given for a communication system. It is customary to specify the noise characteristics of a channel and the source alphabet probabilities; from these data the joint and the output probability matrices can be derived.

24 Communication Network: Derivation of the Noise Characteristics
Let us suppose that we have derived the joint probability matrix from the source probabilities and the noise characteristics, element by element:
$p(x_i, y_j) = p(x_i)\, p(y_j \mid x_i), \quad i = 1, \ldots, n, \; j = 1, \ldots, m.$

25 Communication Network: Derivation of the Noise Characteristics
In other words:
$[P\{X,Y\}] = [P\{X\}]_d \, [P\{Y \mid X\}],$
where $[P\{X\}]_d$ is the diagonal matrix whose main diagonal holds the source probabilities $p(x_1), \ldots, p(x_n)$.

26 Communication Network: Derivation of the Noise Characteristics
If [P{X}] is not diagonal, but a row matrix (an n-dimensional vector), then
$[P\{Y\}] = [P\{X\}]\, [P\{Y \mid X\}],$
where [P{Y}] is also a row matrix (an m-dimensional vector) designating the probabilities of the output alphabet.
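A sketch of this derivation for an invented binary source and noise matrix (the numbers are hypothetical):

```python
import numpy as np

p_x   = np.array([0.7, 0.3])      # source alphabet probabilities, [P{X}] as a row vector
P_YgX = np.array([[0.9, 0.1],     # noise matrix [P{Y|X}]: row i holds p(y_j | x_i),
                  [0.2, 0.8]])    # so every row sums to 1

p_y  = p_x @ P_YgX                # output probabilities: [P{Y}] = [P{X}] [P{Y|X}]
P_XY = np.diag(p_x) @ P_YgX       # joint matrix: [P{X,Y}] = [P{X}]_d [P{Y|X}]

print(p_y)    # [0.69 0.31]
print(P_XY)   # [[0.63 0.07]
              #  [0.06 0.24]]
```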

27 Communication Network: Derivation of the Noise Characteristics
Two discrete channels are of particular interest (contrasted in the sketch below):
- the discrete noise-free channel (an ideal channel);
- the discrete channel with independent input and output (errors occur in the channel, so noise is present).
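A minimal sketch contrasting the two cases, assuming a binary alphabet and invented probabilities: for the noise-free channel the noise matrix is the identity, so $H(Y \mid X) = 0$; for the independent channel every row of the noise matrix is the same, so knowing the input tells us nothing about the output.

```python
import numpy as np

def entropy(p):
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

def H_Y_given_X(p_x, P_YgX):
    """H(Y|X) = sum_i p(x_i) * H(i-th row of the noise matrix)."""
    return sum(p * entropy(row) for p, row in zip(p_x, P_YgX))

p_x = np.array([0.7, 0.3])

noise_free  = np.eye(2)               # ideal channel: y is fully determined by x
independent = np.array([[0.5, 0.5],   # output distribution does not depend
                        [0.5, 0.5]])  # on the input at all

print(H_Y_given_X(p_x, noise_free))   # 0.0 bits: no noise
print(H_Y_given_X(p_x, independent))  # 1.0 bit: the output is all "noise" w.r.t. x
```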

