Presentation on theme: "Subject Name: Information Theory Coding Subject Code: 10EC55"— Presentation transcript:

1 Subject Name: Information Theory Coding Subject Code: 10EC55
Prepared By: Shima Ramesh, Pavana H. Department: ECE Date: 10/11/2014

2 UNIT 3: Fundamental Limits on Performance

3 Topics to be covered: source coding theorem, Huffman coding, discrete memoryless channels, mutual information, channel capacity.

4 Discrete memoryless channel
A communication network may be described by a joint probability matrix (JPM), whose entry in the kth row and jth column is the joint probability P(x_k, y_j) that symbol x_k is transmitted and symbol y_j is received. The JPM of the above system can therefore be written as the matrix [P(X, Y)] of these joint probabilities.

5 Properties of the JPM
The sum of all entries of the JPM is unity. The sum of the entries of the JPM in the kth row gives the marginal probability P(x_k) of the input symbol x_k, and the sum of the entries in the jth column gives the marginal probability P(y_j) of the output symbol y_j.
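The formulas for these properties are not reproduced in the transcript; stated in the document's x_k / y_j notation, the standard relations are:

\sum_k \sum_j P(x_k, y_j) = 1, \qquad P(x_k) = \sum_j P(x_k, y_j), \qquad P(y_j) = \sum_k P(x_k, y_j).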

6 Source coding theorem: Huffman's minimum redundancy code
Huffman suggested a simple procedure that guarantees an optimal code. The procedure consists of a step-by-step reduction of the original source followed by code construction, starting with the final reduced source and working backward to the original source. The procedure has α steps:

7 The procedure is as follows:
1. List the source symbols in decreasing order of probability.
2. Check that q = r + α(r - 1) is satisfied and find the integer α.
3. Club the last r symbols into one composite symbol whose probability of occurrence is equal to the sum of the probabilities of occurrence of the last r symbols.
4. Repeat steps 1 and 3 until exactly r symbols are left.
5. Assign codes freely to the last r composite symbols and work backwards to the original source to arrive at the optimum codes.
6. Discard the codes of the dummy symbols.
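The slides do not include code for this procedure; the following is a minimal Python sketch of the binary case (r = 2, so no dummy symbols are needed). The function and variable names are illustrative, not from the source.

import heapq

def huffman_code(probabilities):
    # Binary Huffman coding: repeatedly merge the two least probable
    # groups of symbols, assigning one more leading bit at each merge.
    heap = [(p, i, [sym]) for i, (sym, p) in enumerate(probabilities.items())]
    heapq.heapify(heap)
    codes = {sym: "" for sym in probabilities}
    counter = len(heap)  # tie-breaker so equal probabilities compare cleanly
    while len(heap) > 1:
        p1, _, group1 = heapq.heappop(heap)  # least probable group
        p2, _, group2 = heapq.heappop(heap)  # next least probable group
        for s in group1:                     # working backwards: prepend bits
            codes[s] = "0" + codes[s]
        for s in group2:
            codes[s] = "1" + codes[s]
        heapq.heappush(heap, (p1 + p2, counter, group1 + group2))
        counter += 1
    return codes

# Example: a five-symbol source
print(huffman_code({"a": 0.4, "b": 0.2, "c": 0.2, "d": 0.1, "e": 0.1}))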

8 Five entropy functions
For simple communication networks there are five probability schemes of interest: P(X), P(Y), P(X,Y), P(X/Y) and P(Y/X). Accordingly, there are five entropy functions:
H(X): average information per character (symbol) transmitted by the source, i.e. the entropy of the source.
H(Y): average information received per character at the receiver, i.e. the entropy of the receiver.
H(X,Y): average information per pair of transmitted and received characters.
H(X/Y): a measure of the remaining uncertainty about the transmitted character after the received character is observed; H(Y/X) is the corresponding measure about the received character given the transmitted one.
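The defining formulas for these five entropies are not reproduced in the transcript; in the usual discrete notation (keeping the document's P(X/Y) style for conditional probabilities) they are:

H(X) = -\sum_k P(x_k) \log_2 P(x_k), \qquad H(Y) = -\sum_j P(y_j) \log_2 P(y_j),

H(X,Y) = -\sum_k \sum_j P(x_k, y_j) \log_2 P(x_k, y_j),

H(X/Y) = -\sum_k \sum_j P(x_k, y_j) \log_2 P(x_k/y_j), \qquad H(Y/X) = -\sum_k \sum_j P(x_k, y_j) \log_2 P(y_j/x_k).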

9 Joint and conditional entropies
Using the row and column sums of the JPM, H(X) and H(Y) can be written directly in terms of the joint probabilities P(x_k, y_j), and the conditional entropies are obtained similarly from the conditional probabilities P(X/Y) and P(Y/X).

10 The average conditional entropy, or equivocation
The equivocation H(X/Y) specifies the average amount of information needed to specify an input character, provided we are allowed to observe the output produced by that input.

11 The relations between the entropies are given as follows.
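The relations themselves are missing from the transcript; in standard form they are

H(X, Y) = H(X) + H(Y/X) = H(Y) + H(X/Y),

and, since conditioning cannot increase entropy, H(X/Y) <= H(X) and H(Y/X) <= H(Y).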

12 Mutual information or transinformation
Equality holds in the relations above if and only if the input symbols and output symbols are statistically independent of each other. The mutual information between the symbols x_k and y_j can be written as shown below.
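The formula is not reproduced in the transcript; in the usual notation, with logarithms to base 2,

I(x_k; y_j) = \log_2 \frac{P(x_k/y_j)}{P(x_k)} = \log_2 \frac{P(y_j/x_k)}{P(y_j)} = \log_2 \frac{P(x_k, y_j)}{P(x_k) P(y_j)}.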

13 Self-information and entropy relations
Self-information: the mutual information reduces to the self-information of a symbol when the observation removes all uncertainty about it. Mutual information is symmetrical with respect to its arguments. The entropy relations for the average mutual information are given below.
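The formulas behind these statements are not in the transcript; the standard ones, in the document's notation, are

I(x_k; y_j) = I(y_j; x_k) (symmetry),

I(x_k; y_j) = -\log_2 P(x_k) when P(x_k/y_j) = 1 (the self-information of x_k),

I(X; Y) = \sum_k \sum_j P(x_k, y_j) \log_2 \frac{P(x_k, y_j)}{P(x_k) P(y_j)} = H(X) - H(X/Y) = H(Y) - H(Y/X) = H(X) + H(Y) - H(X, Y).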

14 Entropy relations (Venn diagram)
The entropy of X is represented by the circle on the left and the entropy of Y by the circle on the right. The overlap between the two circles is the mutual information, so the remaining portions of H(X) and H(Y) represent the respective equivocations.

15 The joint entropy H(X,Y) is the sum of H(X) and H(Y), except that the overlap is added twice; subtracting the overlap once gives H(X,Y) = H(X) + H(Y) - I(X;Y). Also observe that H(X,Y) <= H(X) + H(Y), with equality only when X and Y are independent.

16 Shannon's theorem on channel capacity (channel coding theorem)
Coding is a technique by which a communication system can be made to transmit information with an arbitrarily small probability of error, provided the information rate R is less than or equal to the channel capacity C.

17 The Shannon theorem is split into two parts.
Positive statement: given a source of M equally likely messages, with M >> 1, which has an information rate R, and a channel of capacity C. If R <= C, then there exists a coding technique such that the output of the source may be transmitted with a probability of error in the received message that can be made arbitrarily small.
Negative statement: given a source of M equally likely messages, with M >> 1, which has an information rate R, and a channel of capacity C. If R > C, then the probability of error in the received message is close to unity for every set of M transmitted symbols.

18 Shannon defines the channel capacity of a communication channel as the maximum value of the transinformation.
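The defining formula is missing from the transcript; in standard form,

C = \max_{P(X)} I(X; Y),

where the maximization is over all possible input probability distributions P(X).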

19 Redundancy
A redundant source is one that produces dependent symbols. These redundant symbols do not convey any information, but they are required to facilitate error correction and detection. The redundancy of a sequence of symbols can be measured by noting the amount by which the entropy has been reduced. The difference H(X) - H(Y/X) is the net reduction in the entropy and is called the absolute redundancy. Dividing by H(X) gives the relative redundancy, or simply the redundancy, E:
E = (absolute redundancy) / H(X).
In what follows, R is the actual rate of transinformation and C is the channel capacity.

20 The efficiency of the channel is given by the following.
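The formula itself is missing from the transcript; using the notation just introduced (R the actual rate of transinformation, C the channel capacity), the standard definition is

\eta = \frac{R}{C}.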

21 Muroga's theorem
Muroga's theorem: for a channel whose noise characteristic P(Y/X) is a square, non-singular matrix, the channel capacity is given by an explicit formula in terms of quantities Q_i, where the Q_i are the solutions of a set of matrix equations whose right-hand sides involve the row entropies of P(Y/X).
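The formulas on this slide are not reproduced in the transcript. A minimal numerical sketch of the usual formulation of Muroga's method is given below in Python/NumPy: the Q_i are taken as the solutions of [P(Y/X)] Q = -h, where h_i is the entropy of the ith row, and C = log2(sum_i 2^(Q_i)). The function name is illustrative, and the method applies only when the implied input probabilities come out non-negative.

import numpy as np

def muroga_capacity(P):
    # Channel capacity of a square, non-singular channel matrix P = P(Y/X)
    # via Muroga's method.
    P = np.asarray(P, dtype=float)
    # Row entropies h_i = -sum_j p_ij * log2(p_ij), treating 0*log2(0) as 0.
    with np.errstate(divide="ignore", invalid="ignore"):
        terms = np.where(P > 0, P * np.log2(P), 0.0)
    h = -terms.sum(axis=1)
    # Solve P Q = -h for the auxiliary quantities Q_i.
    Q = np.linalg.solve(P, -h)
    # Capacity in bits per channel use.
    return np.log2(np.sum(2.0 ** Q))

# Example: binary symmetric channel with error probability 0.1
print(muroga_capacity([[0.9, 0.1], [0.1, 0.9]]))  # approx. 0.531 bits/symbol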


23 Symmetric channels
A channel is said to be symmetric, or uniform, if the second and subsequent rows of the channel matrix are permutations of the first row; that is, the elements of the second and subsequent rows are exactly the same as those of the first row except for their locations.

24 An important property of P(Y/X) is that the sum of all elements in any row is 1. The conditional entropy H(Y/X) of this channel is given by the following.
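The formula is missing from the transcript; for a symmetric channel whose rows are permutations of (p_1, p_2, ..., p_n), the standard result is

H(Y/X) = -\sum_{j=1}^{n} p_j \log_2 p_j,

which is the entropy of any single row and is therefore independent of the input probability distribution.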

25 Mutual information of symmetric channels
Because H(Y/X) does not depend on the input distribution, the mutual information I(X;Y) = H(Y) - H(Y/X) is maximized by maximizing H(Y). H(Y) is maximum if and only if all the received symbols are equally probable, and since there are n symbols at the output this maximum is log_2 n. Thus for a symmetric channel we have the capacity given below.
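The capacity expression is not reproduced in the transcript; using max H(Y) = log_2 n as stated above, the standard result for a symmetric channel whose rows are permutations of (p_1, ..., p_n) is

C = \log_2 n - H(Y/X) = \log_2 n + \sum_{j=1}^{n} p_j \log_2 p_j.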

26 For the channel to be symmetric, the conditional probability matrix (CPM) P(X/Y) should also have the property that its second and subsequent columns are permutations of the first column.

27 UNIT 4: Channel Coding Theorem

28 Topics to be covered: channel coding theorem, differential entropy, mutual information for continuous ensembles, channel capacity theorem.

29 Differential entropy
The entropy H(X) of a continuous source can be defined as an integral over the probability density function (p.d.f.) f(x) of the continuous random variable X. Consider an example: suppose X is a uniform random variable over the interval (0, 2), so that f(x) = 1/2 on (0, 2) and 0 elsewhere. Using the equation for the entropy, H(X) can then be evaluated directly.
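The integrals are not reproduced in the transcript; in the usual form,

H(X) = -\int_{-\infty}^{\infty} f(x) \log_2 f(x)\, dx,

and for the uniform example with f(x) = 1/2 on (0, 2),

H(X) = -\int_{0}^{2} \tfrac{1}{2} \log_2 \tfrac{1}{2}\, dx = \log_2 2 = 1 \text{ bit per sample}.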

30 Maximization of entropy
In practical systems the sources, for example radio transmitters, are constrained to either average-power or peak-power limitations. The objective then is to maximize the entropy under such restrictions. The general constraints may be listed as below:
5) Average power limitation, with unidirectional distribution.
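The list items other than the last are not reproduced in the transcript. One maximization result used later in this unit is worth recording here: for a fixed mean-square value (average power) \sigma^2 with no further constraint, the Gaussian density maximizes the differential entropy, giving

H_{\max}(X) = \tfrac{1}{2} \log_2 (2 \pi e \sigma^2) \text{ bits per sample}.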

31 Mutual information of a continuous channel
Consider a continuous channel with input X and output Y. The mutual information is defined as in the discrete case, with the sums replaced by integrals over the probability densities, and the channel capacity is given by the maximum of this mutual information over the input density.
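The defining integrals are not reproduced in the transcript; in standard form, with f(x, y) the joint density and f(x), f(y) the marginals,

I(X; Y) = \int \int f(x, y) \log_2 \frac{f(x, y)}{f(x) f(y)}\, dx\, dy, \qquad C = \max_{f(x)} I(X; Y).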

32 Amount of mutual information
Since the channel capacity is defined as the maximum of the mutual information over the input distribution, it follows that the actual mutual information of the channel can never exceed C.

33 Capacity of band-limited channels with AWGN and average power limitation of signals: the Shannon-Hartley law
The received signal is composed of the transmitted signal X and the noise n. The joint entropy at the transmitter end, assuming the signal and the noise are independent, is H(X, n) = H(X) + H(n).

34 The joint entropy at the receiver end is H(X, Y). Since the received signal is Y = X + n, the channel capacity in bits/second follows from the above two equations, as sketched below.
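The intermediate formulas are missing from the transcript; the standard argument runs as follows. Because Y = X + n with X and n independent, H(X, Y) = H(X, n) = H(X) + H(n). Combining this with the chain rule H(X, Y) = H(Y) + H(X/Y) gives the transinformation

I(X; Y) = H(X) - H(X/Y) = H(Y) - H(n),

and, with 2B independent samples per second for a channel of bandwidth B, the capacity in bits/second is C = 2B max [H(Y) - H(n)].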

35 If the additive noise is white and Gaussian and has a power of N in a bandwidth of B Hz, then we have its entropy per sample as the Gaussian maximum-entropy value. Further, if the input signal is also limited to an average power S over the bandwidth B, and X and n are independent, then, since for a given mean-square value the entropy becomes a maximum if the signal is Gaussian, the entropy of the output is maximized when the received signal is Gaussian, with mean-square value S + N.

36 If n is AWGN, Y will be Gaussian if and only if X is also Gaussian. Using all the above equations, this implies

C = B log_2 (1 + S/N)   .............(1)

Equation (1) is called the Shannon-Hartley law.
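The intermediate entropies leading to equation (1) are not shown in the transcript; in the standard derivation they are

H(n) = \tfrac{1}{2} \log_2 (2 \pi e N), \qquad H(Y)_{\max} = \tfrac{1}{2} \log_2 \big( 2 \pi e (S + N) \big),

so that

C = 2B \left[ H(Y)_{\max} - H(n) \right] = B \log_2 \Big( 1 + \frac{S}{N} \Big) \text{ bits/second}.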

37 Bandwidth-SNR trade-off
The primary significance of the Shannon-Hartley law is that it is possible to transmit over a channel of bandwidth B Hz, perturbed by AWGN, at a rate of C bits/sec with an arbitrarily small probability of error, provided the signal is encoded in such a manner that the samples are all Gaussian signals.

38 From the figure we find the noise power over (-B, B). That is, the noise power is directly proportional to the bandwidth B; thus the noise power will be reduced by reducing the bandwidth, and vice versa.
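The expression for the noise power is missing from the transcript; assuming a flat two-sided noise power spectral density of N_0/2 watts/Hz (the symbol N_0 is taken from the E_b = N_0 relation used later in the unit), the noise power over (-B, B) is

N = \int_{-B}^{B} \frac{N_0}{2}\, df = N_0 B.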

39 For a wideband system, where (S/N) >> 1, using the Shannon-Hartley law leads to the bandwidth-SNR trade-off relation. The resulting equation predicts an exponential improvement in the (S/N) ratio with the bandwidth for an ideal system.
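The trade-off relation itself is not reproduced in the transcript; a standard form, obtained by holding the capacity fixed while exchanging bandwidth B_1 at signal-to-noise ratio (S/N)_1 for bandwidth B_2 at (S/N)_2, is

B_1 \log_2 \big( 1 + (S/N)_1 \big) = B_2 \log_2 \big( 1 + (S/N)_2 \big),

which for (S/N) >> 1 reduces to (S/N)_2 \approx (S/N)_1^{\,B_1 / B_2}, i.e. an exponential dependence of the required (S/N) on the bandwidth ratio.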

40 Capacity of a channel of infinite bandwidth
The Shannon-Hartley formula predicts that a noiseless Gaussian channel, with (S/N) = infinity, has an infinite capacity. However, the channel capacity does not become infinite when the bandwidth is made infinite.

41 Accordingly, because the noise power N = N_0 B also grows with bandwidth, there is an upper limit on the channel capacity with increasing bandwidth.
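The limiting value is not reproduced in the transcript; in standard form, with N = N_0 B,

C_\infty = \lim_{B \to \infty} B \log_2 \Big( 1 + \frac{S}{N_0 B} \Big) = \frac{S}{N_0} \log_2 e \approx 1.44\, \frac{S}{N_0} \text{ bits/second}.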

42 Bandwidth efficiency: the Shannon limit
The average transmitted power is expressed in terms of the signal energy per bit, from which the Shannon formula becomes a relation between the energy per bit, the noise power spectral density and C/B; from this relation one can show the result given below. The ratio (C/B) is called the "bandwidth efficiency" of the system.
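The formulas are missing from the transcript; the usual development, assuming the average transmitted power is written as S = E_b C with E_b the signal energy per bit and N = N_0 B, is

\frac{C}{B} = \log_2 \Big( 1 + \frac{E_b}{N_0} \cdot \frac{C}{B} \Big) \quad \Longrightarrow \quad \frac{E_b}{N_0} = \frac{2^{C/B} - 1}{C/B}.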

43 If C/B = 1, then it follows that E_b = N_0. This implies that the signal power equals the noise power. Suppose B = B_0 is the bandwidth for which S = N; then the limiting capacity is C_\infty = 1.44 B_0. That is, the maximum signaling rate for a given S is about 1.44 bits/sec per Hz of the bandwidth over which the signal power can be spread without its falling below the noise level.

44 The limiting value given below is known as Shannon's limit for transmission capacity; communication with an arbitrarily small probability of error fails otherwise, that is, when E_b/N_0 falls below this limit.
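The limit itself is not reproduced in the transcript; from the relation E_b/N_0 = (2^{C/B} - 1)/(C/B), letting C/B -> 0 gives the standard value

\left( \frac{E_b}{N_0} \right)_{\min} = \ln 2 \approx 0.693 \quad (\text{about } -1.59 \text{ dB}),

below which error-free transmission is not possible no matter how large the bandwidth.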

