Lecture 8 Huffman Encoding (Section 2.2)


1 Lecture 8 Huffman Encoding (Section 2.2)
Theory of Information, Lecture 8: Huffman Encoding (Section 2.2).

2 The Appeal of Huffman Encoding
THEOREM. All Huffman encoding schemes are instantaneous. Furthermore, they have the smallest average codeword length among all instantaneous encoding schemes.

Running example, used throughout this lecture:

  Symbol:       a     b     c     d     e     f
  Probability:  0.35  0.10  0.19  0.25  0.06  0.05
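As a quick illustration of the quantity the theorem says is minimized, here is a minimal Python sketch (not from the slides) that computes the average codeword length, the sum over symbols of p(symbol) times codeword length. It uses the probabilities above together with the codewords that will be read off the tree in Step 5 below.

  # Average codeword length: sum of p(symbol) * len(codeword).
  # Probabilities are from the slide; the codewords are the ones
  # derived from the Huffman tree in Step 5.
  probs = {'a': 0.35, 'b': 0.10, 'c': 0.19, 'd': 0.25, 'e': 0.06, 'f': 0.05}
  code = {'a': '11', 'b': '010', 'c': '00', 'd': '10', 'e': '0111', 'f': '0110'}

  avg_len = sum(p * len(code[s]) for s, p in probs.items())
  print(round(avg_len, 2))  # 2.32 bits per symbol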

3 Step 1
Place each symbol inside a node. Then label each node with the probability of occurrence of the symbol and arrange the nodes in order of increasing probability.

[Figure: the six nodes after sorting: f 0.05, e 0.06, b 0.10, c 0.19, d 0.25, a 0.35.]
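In Python this step is a one-line sort; a minimal sketch in which a node is just a (symbol, probability) pair:

  # Step 1: pair each symbol with its probability and sort the
  # nodes in order of increasing probability.
  probs = {'a': 0.35, 'b': 0.10, 'c': 0.19, 'd': 0.25, 'e': 0.06, 'f': 0.05}
  nodes = sorted(probs.items(), key=lambda kv: kv[1])
  print(nodes)
  # [('f', 0.05), ('e', 0.06), ('b', 0.1), ('c', 0.19), ('d', 0.25), ('a', 0.35)]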

4 Step 2
Connect the two leftmost nodes to a new node. Label the new node with the sum of the probabilities of those two nodes. Lower this portion of the figure so that the new node is in the top row.

[Figure: f 0.05 and e 0.06 now hang below a new node labeled 0.11; the top row reads 0.11, b 0.10, c 0.19, d 0.25, a 0.35.]
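One merge step in code, as a sketch; the (left, right, probability) triple used to represent a parent node is a choice made here for illustration, not dictated by the slides:

  # Step 2: join the two lowest-probability nodes (f and e) under a
  # new parent labeled with the sum of their probabilities.
  left, right = ('f', 0.05), ('e', 0.06)
  parent = (left, right, left[1] + right[1])
  print(round(parent[2], 2))  # 0.11 -- the label of the new node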

5 Step 3, 1st Iteration
Repeat the process of rearranging the figure so that the nodes in the top row are in increasing order of probability, and then connecting the two leftmost nodes, until only one node remains in the top row.

[Figure: b 0.10 and the 0.11 node are connected to a new node labeled 0.21; the rest of the top row is c 0.19, d 0.25, a 0.35.]

6 Step 3, 2nd Iteration
The same rule applied again: sort the top row, then connect the two leftmost nodes.

[Figure: c 0.19 and the 0.21 node are connected to a new node labeled 0.40; the rest of the top row is d 0.25, a 0.35.]

7 Step 3, 3rd Iteration
The same rule again.

[Figure: d 0.25 and a 0.35 are connected to a new node labeled 0.60; the rest of the top row is the 0.40 node.]

8 Step 3, 4th Iteration
One more application of the rule leaves a single node in the top row, and the construction stops.

[Figure: the 0.40 and 0.60 nodes are connected to a new root labeled 1.00.]
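Step 3 as a whole is the classic greedy loop: repeatedly take the two lowest-probability nodes and merge them until one node remains. A minimal self-contained sketch using Python's heapq, where the heap plays the role of the sorted top row (the insertion counter is only there to break probability ties):

  import heapq

  probs = {'a': 0.35, 'b': 0.10, 'c': 0.19, 'd': 0.25, 'e': 0.06, 'f': 0.05}

  # Heap entries are (probability, tiebreaker, tree); a tree is either
  # a bare symbol (a leaf) or a (left, right) pair (an internal node).
  heap = [(p, i, s) for i, (s, p) in enumerate(sorted(probs.items()))]
  heapq.heapify(heap)
  counter = len(heap)

  while len(heap) > 1:
      p1, _, t1 = heapq.heappop(heap)  # leftmost (lowest-probability) node
      p2, _, t2 = heapq.heappop(heap)  # next-lowest node
      print(round(p1 + p2, 2))         # label of the newly created node
      heapq.heappush(heap, (p1 + p2, counter, (t1, t2)))
      counter += 1
  # Prints 0.11, 0.21, 0.4, 0.6, 1.0 -- the node labels created in the
  # iterations above.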

9 Step 4
Discard all of the probabilities, and label each left arc by a 0 and each right arc by a 1. The result is the Huffman tree.

[Figure: the Huffman tree with 0/1 arc labels; its leaves, left to right, are c, b, f, e, d, a.]

10 Step 5
To determine the codeword associated with each source symbol, start at the root and write down the sequence of bits encountered en route to that symbol's leaf.

  Symbol:    a   b    c   d   e     f
  Codeword:  11  010  00  10  0111  0110
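Steps 4 and 5 together amount to a tree walk: a left arc contributes a 0, a right arc a 1, and the codeword of a symbol is the accumulated bit string at its leaf. A sketch, reusing the heap construction from Step 3:

  import heapq

  probs = {'a': 0.35, 'b': 0.10, 'c': 0.19, 'd': 0.25, 'e': 0.06, 'f': 0.05}

  # Rebuild the Huffman tree exactly as in Steps 1-3.
  heap = [(p, i, s) for i, (s, p) in enumerate(sorted(probs.items()))]
  heapq.heapify(heap)
  counter = len(heap)
  while len(heap) > 1:
      p1, _, t1 = heapq.heappop(heap)
      p2, _, t2 = heapq.heappop(heap)
      heapq.heappush(heap, (p1 + p2, counter, (t1, t2)))
      counter += 1
  root = heap[0][2]

  # Steps 4-5: a left arc contributes a 0 and a right arc a 1; the
  # codeword of a symbol is the bit string on its root-to-leaf path.
  def codewords(tree, prefix=''):
      if isinstance(tree, str):          # leaf: a source symbol
          return {tree: prefix}
      left, right = tree
      table = codewords(left, prefix + '0')
      table.update(codewords(right, prefix + '1'))
      return table

  print(codewords(root))
  # {'c': '00', 'b': '010', 'f': '0110', 'e': '0111', 'd': '10', 'a': '11'}

Because no codeword is a prefix of another (every symbol sits at a leaf), the resulting code is instantaneous, exactly as the theorem on slide 2 asserts.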

11 Homework
Exercises 2, 4, and 6 of Section 2.2.

