1
A Mathematical Theory of Communication by C. E. Shannon. Paper Review by Jin Woo Shin and Sang Joon Kim.
2
Contents: Introduction, Summary of Paper, Discussion
3
Introduction This paper founded the field of information theory. Before this paper, it was believed that the only way to make the error probability smaller was to reduce the data rate. This paper revealed that there is a positive data rate that is achievable with negligible error probability.
4
Summary of Paper: Preliminary; Discrete Source & Discrete Channel; Discrete Source & Cont. Channel; Cont. Source & Cont. Channel
5
[Summary of Paper] Preliminary: Entropy; Ergodic source (irreducible, aperiodic property); Capacity
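As a concrete reference for the entropy item above, here is a minimal Python sketch (not part of the original slides) that computes the entropy H = -sum p(x) log2 p(x) of a discrete memoryless source; the example distribution is an assumption for illustration.

```python
import math

def entropy(probs):
    """Entropy in bits per symbol of a discrete source distribution."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Hypothetical 4-symbol source; the uniform source attains the maximum log2(4) = 2 bits.
print(entropy([0.5, 0.25, 0.125, 0.125]))  # 1.75 bits/symbol
print(entropy([0.25] * 4))                 # 2.0  bits/symbol
```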
6
[Summary of Paper] Disc. Source & Disc. Channel Capacity theorem (Theorem 11, page 22), the most important result of this paper: if the entropy H of the discrete source is less than or equal to the channel capacity C, then there exists a code such that the source output can be transmitted over the channel with an arbitrarily small frequency of errors. If H > C, there is no method of encoding which gives an equivocation less than H - C.
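To make the comparison of H and C in Theorem 11 concrete, here is a hedged illustration using the binary symmetric channel, whose capacity C = 1 - H(p) is a standard result not spelled out on the slide; the crossover probability and source entropy below are assumed values.

```python
import math

def binary_entropy(p):
    """Binary entropy function H(p) in bits."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def bsc_capacity(p):
    """Capacity of a binary symmetric channel with crossover probability p."""
    return 1.0 - binary_entropy(p)

C = bsc_capacity(0.11)   # roughly 0.5 bits per channel use
H = 0.47                 # hypothetical source entropy, bits per channel use
if H <= C:
    print(f"H = {H} <= C = {C:.3f}: arbitrarily small error probability is achievable")
else:
    print(f"H = {H} > C = {C:.3f}: equivocation is at least H - C = {H - C:.3f}")
```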
7
[Summary of Paper] Disc. Source & Cont. Channel The domains of the channel input and output become infinite (continuous). The capacity of a continuous channel is C = \lim_{T \to \infty} \max_{P(x)} \frac{1}{T} \iint P(x,y) \log \frac{P(x,y)}{P(x)P(y)} \, dx \, dy. The transmission rate cannot exceed this channel capacity.
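For the special case of a band-limited channel with additive white Gaussian noise and an average power limit, the capacity reduces to the well-known C = W log2(1 + P/N) bits per second (Shannon's Theorem 17). The sketch below simply evaluates that formula; the bandwidth and SNR values are assumptions for illustration.

```python
import math

def awgn_capacity(bandwidth_hz, signal_power, noise_power):
    """Capacity C = W * log2(1 + P/N) of the band-limited AWGN channel, in bits/s."""
    return bandwidth_hz * math.log2(1.0 + signal_power / noise_power)

# Hypothetical numbers: 3 kHz of bandwidth at 30 dB SNR gives about 30 kbit/s,
# and the transmission rate cannot exceed this capacity.
print(awgn_capacity(3000.0, 1000.0, 1.0))  # ~29900 bits/s
```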
8
[Summary of Paper] Cont. Source & Cont. Channel A continuous source needs an infinite number of binary digits for exact specification. Fidelity: a measure of how much distortion we allow. The rate of a continuous source P(x) with fidelity constraint D is R = \min_{P(y|x)} \iint P(x,y) \log \frac{P(x,y)}{P(x)P(y)} \, dx \, dy, where the minimum is taken over all P(y|x) whose expected distortion does not exceed D. For a given fidelity constraint D, the source output can be transmitted over a channel of capacity C with fidelity D provided R \le C.
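The case Shannon works out explicitly is a white Gaussian source with mean-square-error fidelity, for which the rate per sample is R(D) = (1/2) log2(sigma^2 / D). The sketch below (an illustration added here, not from the slides) evaluates it, showing that exact reproduction (D approaching 0) would indeed require infinitely many binary digits.

```python
import math

def gaussian_rate_distortion(variance, distortion):
    """R(D) = 0.5 * log2(variance / D) bits/sample for a Gaussian source, squared error."""
    if distortion >= variance:
        return 0.0
    return 0.5 * math.log2(variance / distortion)

# Halving the allowed distortion costs an extra half bit per sample; D -> 0 needs R -> infinity.
for D in (1.0, 0.5, 0.25, 0.125):
    print(D, gaussian_rate_distortion(1.0, D))
```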
9
Discussion: Ergodic source; Practical approach; Rate distortion
10
[Discussion] Ergodic source The ergodic source assumption is the essential one in the paper: source is ergodic -> AEP holds -> capacity theorem. Finding a source that is not ergodic but still satisfies the AEP would be meaningful work. One example:
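The AEP itself is easy to see empirically in the simplest (i.i.d., hence ergodic) case: -(1/n) log2 p(X1,...,Xn) concentrates around the entropy H as n grows. The Monte Carlo sketch below is only an illustration of that statement with an assumed three-symbol source; it is not the non-ergodic example referred to above.

```python
import math
import random

probs = {"a": 0.5, "b": 0.25, "c": 0.25}                    # assumed i.i.d. source
H = -sum(p * math.log2(p) for p in probs.values())          # 1.5 bits/symbol

def empirical_rate(n, rng=random.Random(0)):
    """-(1/n) log2 of the probability of one sampled length-n sequence."""
    symbols = rng.choices(list(probs), weights=list(probs.values()), k=n)
    return -sum(math.log2(probs[s]) for s in symbols) / n

for n in (10, 100, 10000):
    print(n, round(empirical_rate(n), 3), "vs H =", H)
```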
11
[Discussion] Practical approach - 1 This paper provides the upper bound on the achievable data rate; finding a good encoding scheme is a separate problem. Turbo codes and LDPC codes are among the most efficient codes known. Block size, rate, BER, and decoding complexity are important factors when choosing a code for a specific system.
12
[Discussion] Practical approach - 2

Year   Rate-1/2 Code           SNR required for BER < 10^-5
1948   Shannon limit           0 dB
1967   (255,123) BCH code      5.4 dB
1977   Convolutional code      4.5 dB
1993   Iterative turbo code    0.7 dB
2001   Iterative LDPC code     0.0245 dB

** This graph and chart are modified from the presentation data of Engling Yeo, Jan 15, 2003.
C. Berrou and A. Glavieux, "Near Optimum Error Correcting Coding and Decoding: Turbo-Codes," IEEE Trans. Commun., vol. 44, no. 10, Oct. 1996.
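As a side note on the table's first row: the 0 dB figure is the Shannon limit on Eb/N0 for rate-1/2 transmission over the AWGN channel. Under the standard capacity calculation (not shown on the slide), the minimum Eb/N0 for a code of rate R information bits per real channel dimension is (2^(2R) - 1) / (2R), which equals exactly 1, i.e. 0 dB, at R = 1/2.

```python
import math

def shannon_limit_ebn0_db(rate):
    """Minimum Eb/N0 in dB for reliable AWGN transmission at `rate` bits per real dimension."""
    ebn0 = (2.0 ** (2.0 * rate) - 1.0) / (2.0 * rate)
    return 10.0 * math.log10(ebn0)

print(shannon_limit_ebn0_db(0.5))  # 0.0 dB, the table's "Shannon limit" row
print(shannon_limit_ebn0_db(0.9))  # ~1.4 dB: higher-rate codes need more SNR
```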
13
[Discussion] Rate distortion The 'fidelity' concept motivates rate-distortion theory. The rate of a discrete source P(x) with distortion (fidelity) D is defined as R(D) = \min_{P(y|x)} I(X;Y), subject to \sum_{x,y} P(x) P(y|x) d(x,y) \le D, where I(X;Y) is the mutual information between the source X and the reproduction Y. The entropy H is the rate with zero distortion, i.e. R(0) = H. (The rate-distortion theorem) A discrete source P(x) can be compressed down to rate R(D) when a distortion of D is allowed.
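A standard closed-form instance of this definition (an addition for illustration, not from the slides) is the Bernoulli(p) source under Hamming distortion, where R(D) = H(p) - H(D) for 0 <= D <= min(p, 1-p) and 0 otherwise, so R(0) = H(p) recovers "entropy is the rate with zero distortion".

```python
import math

def binary_entropy(p):
    """Binary entropy function H(p) in bits."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def bernoulli_rate_distortion(p, D):
    """R(D) for a Bernoulli(p) source with Hamming distortion."""
    if D >= min(p, 1.0 - p):
        return 0.0
    return binary_entropy(p) - binary_entropy(D)

p = 0.3  # assumed source parameter
for D in (0.0, 0.05, 0.1, 0.3):
    print(D, round(bernoulli_rate_distortion(p, D), 3))
```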