Revision of Chapter III

For an information source {p_i, i = 1, 2, …, N}, its entropy is defined by

H = -\sum_{i=1}^{N} p_i \log_2 p_i   (bits/symbol)

Shannon's first theorem: for an instantaneous (prefix-free) code we have

H \le B_s

where B_s is the average length of the code.
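The two quantities above can be sketched in a few lines of Python. The source probabilities and codeword lengths below are a hypothetical example chosen so the bound H ≤ B_s is easy to check:

```python
from math import log2

def entropy(probs):
    """Shannon entropy H = -sum(p * log2(p)), in bits per symbol."""
    return -sum(p * log2(p) for p in probs if p > 0)

def average_length(probs, lengths):
    """Average codeword length B_s = sum(p_i * l_i)."""
    return sum(p * l for p, l in zip(probs, lengths))

# Hypothetical four-symbol source with an instantaneous code 0, 10, 110, 111.
p = [0.5, 0.25, 0.125, 0.125]
lengths = [1, 2, 3, 3]

H = entropy(p)                    # 1.75 bits/symbol
Bs = average_length(p, lengths)   # 1.75 bits/symbol
# H <= B_s holds, here with equality because each p_i is a power of 1/2.
```

When every probability is a negative power of two, as in this sketch, the code meets the entropy bound exactly; otherwise B_s is strictly larger than H.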
The optimal coding can be achieved via Huffman coding, which is constructed by repeatedly merging the two least-probable symbols into a single node until one binary tree remains; each codeword is then read off the path from the root to its symbol.
Shannon’s capacity: C = B \log_2(1 + S/N) b/s, where B is the channel bandwidth in Hz and S/N is the signal-to-noise ratio. The Hamming distance between two codewords X and Y is given by d(X, Y) = \sum_i x_i \oplus y_i, the number of bit positions in which they differ. A parity-check code is achieved by adding one bit to the actual code word so that the total number of 1s is even (or odd). Encryption is achieved via a matched-key approach, in which sender and receiver use matching keys to encrypt and decrypt.
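The three formulas above can be illustrated together. The channel figures below (3 kHz bandwidth, S/N = 1000, roughly a telephone line at 30 dB) are hypothetical numbers chosen for the example:

```python
from math import log2

def shannon_capacity(bandwidth_hz, snr):
    """Channel capacity C = B * log2(1 + S/N), in bits per second."""
    return bandwidth_hz * log2(1 + snr)

def hamming_distance(x, y):
    """Number of bit positions in which codewords x and y differ."""
    return sum(a != b for a, b in zip(x, y))

def add_even_parity(bits):
    """Append one parity bit so the total number of 1s is even."""
    return bits + [sum(bits) % 2]

C = shannon_capacity(3000, 1000)                   # about 29,900 b/s
d = hamming_distance([1, 0, 1, 1], [1, 1, 0, 1])   # 2
word = add_even_parity([1, 0, 1])                  # [1, 0, 1, 0]
```

A single parity bit gives the code a minimum Hamming distance of 2, which is enough to detect any single-bit error but not to correct it.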