Slide 1: EE 4780 Huffman Coding Example
Slide 2: Huffman Coding Example

Suppose X is a source producing symbols; the symbols come from the alphabet A = {a1, a2, a3, a4, a5}, with probabilities {0.4, 0.2, 0.2, 0.15, 0.05}. Form the Huffman tree by repeatedly merging the two least probable nodes and labeling the two branches at each merge with 0 and 1.

[Figure: Huffman tree over a1..a5, with merged-node probabilities 0.2, 0.4, 0.6, 1.0 and branch labels 0/1]

Symbol | Probability | Codeword
a1     | 0.4         | 0
a2     | 0.2         | 10
a3     | 0.2         | 110
a4     | 0.15        | 1110
a5     | 0.05        | 1111

Average codeword length = 0.4*1 + 0.2*2 + 0.2*3 + 0.15*4 + 0.05*4 = 2.2 bits per symbol

Entropy = -Σ_i p(a_i) log2 p(a_i) ≈ 2.08 bits per symbol
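For reference, the tree construction above can be reproduced in a few lines of code. The following is a minimal Python sketch (my addition, not from the slides) that builds a Huffman code with a priority queue. Ties between equal probabilities are broken arbitrarily, so the exact codewords may differ from the table above, but the average codeword length (2.2) and the entropy (2.08) match.

import heapq
from math import log2

def huffman_codes(probs):
    # Build a Huffman code for a dict {symbol: probability}.
    # Each heap entry is (probability, tie_breaker, {symbol: partial codeword});
    # the integer tie_breaker keeps tuple comparison away from the dicts.
    heap = [(p, i, {s: ""}) for i, (s, p) in enumerate(probs.items())]
    heapq.heapify(heap)
    next_id = len(heap)
    while len(heap) > 1:
        p0, _, codes0 = heapq.heappop(heap)   # least probable subtree
        p1, _, codes1 = heapq.heappop(heap)   # second least probable subtree
        # Prepend '0' to the codewords in one subtree and '1' in the other.
        merged = {s: "0" + c for s, c in codes0.items()}
        merged.update({s: "1" + c for s, c in codes1.items()})
        heapq.heappush(heap, (p0 + p1, next_id, merged))
        next_id += 1
    return heap[0][2]

probs = {"a1": 0.4, "a2": 0.2, "a3": 0.2, "a4": 0.15, "a5": 0.05}
codes = huffman_codes(probs)
avg_len = sum(probs[s] * len(c) for s, c in codes.items())
entropy = -sum(p * log2(p) for p in probs.values())
print(codes)                                            # one optimal codeword assignment
print("average length = %.2f bits/symbol" % avg_len)    # 2.20
print("entropy        = %.2f bits/symbol" % entropy)    # 2.08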
Slide 3: Huffman Coding Example

Another possible Huffman tree for the same source is:

[Figure: alternative Huffman tree over a1..a5, with merged-node probabilities 0.2, 0.4, 0.6, 1.0 and branch labels 0/1]

Symbol | Probability | Codeword
a1     | 0.4         | 0
a2     | 0.2         | 100
a3     | 0.2         | 101
a4     | 0.15        | 110
a5     | 0.05        | 111

Average codeword length = 0.4*1 + 0.2*3 + 0.2*3 + 0.15*3 + 0.05*3 = 2.2 bits per symbol

The Huffman tree is not unique, since ties between equal probabilities can be broken either way, but every Huffman code for this source achieves the same average codeword length of 2.2 bits per symbol.
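To check this equality directly, the short Python snippet below (my addition, not from the slides) computes the average codeword length of both codebooks listed on the slides.

probs = {"a1": 0.4, "a2": 0.2, "a3": 0.2, "a4": 0.15, "a5": 0.05}

# Codebooks exactly as given on slides 2 and 3.
code_slide2 = {"a1": "0", "a2": "10", "a3": "110", "a4": "1110", "a5": "1111"}
code_slide3 = {"a1": "0", "a2": "100", "a3": "101", "a4": "110", "a5": "111"}

def avg_length(codebook):
    # Expected codeword length: sum over symbols of p(symbol) * len(codeword).
    return sum(probs[s] * len(c) for s, c in codebook.items())

print(round(avg_length(code_slide2), 2))   # 2.2
print(round(avg_length(code_slide3), 2))   # 2.2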