Theory of Information, Lecture 11: The Noiseless Coding Theorem (Section 3.4)
Idea
The average codeword length can never be better than the entropy (version 1 of the Noiseless Coding Theorem). It will also never be worse than the entropy plus 1 (version 2 of the Noiseless Coding Theorem). By encoding extensions of a source S, that is, blocks of symbols rather than individual symbols, we can bring the average codeword length as close to the entropy as desired (version 3 of the Noiseless Coding Theorem). In other words, entropy is the best we can achieve when seeking efficiency of encoding.
The Noiseless Coding Theorem
MinAveCodeLen(S) denotes the minimum average codeword length over all uniquely decipherable binary encoding schemes for a source S. For any source S we have:
Version 1. H(S) ≤ MinAveCodeLen(S)
Version 2. H(S) ≤ MinAveCodeLen(S) < H(S) + 1
Version 3. H(S) ≤ MinAveCodeLen(S^n)/n < H(S) + 1/n
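The inequalities above can be checked numerically: a Huffman code achieves the minimum average codeword length among uniquely decipherable binary codes, so its average length stands in for MinAveCodeLen. A minimal sketch in Python, using an illustrative two-symbol source with probabilities 0.9 and 0.1 (this example source is not from the lecture):

```python
import heapq
from math import log2
from itertools import product

def entropy(probs):
    """Shannon entropy H(S) in bits per source symbol."""
    return -sum(p * log2(p) for p in probs.values() if p > 0)

def huffman_lengths(probs):
    """Codeword lengths of an optimal (Huffman) binary prefix code."""
    # Heap entries: (probability, tiebreak counter, {symbol: depth}).
    heap = [(p, i, {s: 0}) for i, (s, p) in enumerate(probs.items())]
    heapq.heapify(heap)
    counter = len(heap)
    while len(heap) > 1:
        p1, _, d1 = heapq.heappop(heap)
        p2, _, d2 = heapq.heappop(heap)
        # Merging two subtrees pushes every leaf one level deeper.
        merged = {s: d + 1 for s, d in {**d1, **d2}.items()}
        heapq.heappush(heap, (p1 + p2, counter, merged))
        counter += 1
    return heap[0][2]

def avg_len(probs):
    """Average codeword length of the Huffman code for this source."""
    lengths = huffman_lengths(probs)
    return sum(probs[s] * lengths[s] for s in probs)

def extension(probs, n):
    """n-th extension S^n: blocks of n symbols with product probabilities."""
    ext = {}
    for block in product(probs, repeat=n):
        p = 1.0
        for s in block:
            p *= probs[s]
        ext[''.join(block)] = p
    return ext
```

For the source {a: 0.9, b: 0.1}, H(S) ≈ 0.469 while the Huffman code must spend a full bit per symbol, so the average length is 1.0 (version 2 holds: 0.469 ≤ 1.0 < 1.469). Encoding the second extension S² brings the per-symbol average down to 0.645, inside the tighter version-3 bound H(S) + 1/2.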
Homework
Exercises 1, 2, and 3 of Section 3.4.