
1 Entropy coding Presented by 陳群元

2 outline

3 constraints  Compression efficiency  Computational efficiency  Error robustness

4 Encoding  DCT  Reordering  Run-level coding  Entropy coding

5 Encoding  DCT  Reordering  Run-level coding  Entropy coding

6 DCT

7 Reordering  The optimum method of reordering the quantised data depends on the distribution of the non-zero coefficients.

8 Even distribution

9 Zigzag reordering pattern
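
The zigzag pattern can be sketched in code. This is an illustrative Python sketch (the 4x4 block values are made up, not taken from the slides): coefficients are visited along anti-diagonals, alternating direction, so the low-frequency coefficients in the top-left corner come out first.

```python
def zigzag_order(n):
    """(row, col) visiting order for an n x n zigzag scan."""
    return sorted(
        ((r, c) for r in range(n) for c in range(n)),
        # primary key: anti-diagonal index r + c;
        # within a diagonal, alternate direction (down for odd, up for even)
        key=lambda rc: (rc[0] + rc[1],
                        rc[0] if (rc[0] + rc[1]) % 2 else rc[1]),
    )

def zigzag_scan(block):
    return [block[r][c] for r, c in zigzag_order(len(block))]

block = [
    [8, 3, 0, 0],
    [2, 0, 0, 0],
    [1, 0, 0, 0],
    [0, 0, 0, 0],
]
print(zigzag_scan(block))  # non-zero values are grouped at the front
```

Because typical quantised DCT blocks have their non-zero coefficients clustered in the top-left, the scan produces long runs of zeros at the end, which is exactly what the run-level coding stage exploits.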

10 Interlaced video: coefficients vary more in the vertical direction

11 Modified reordering pattern

12 Encoding  DCT  Reordering  Run-level coding  Entropy coding

13 Run-level coding  Long sequences of identical values (zeros in this case) can be represented as a (run, level) code  (run) indicates the number of zeros preceding a non-zero value  (level) indicates the sign and magnitude of the non-zero coefficient
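
The (run, level) construction above can be sketched as follows; the input coefficient sequence is made up for illustration.

```python
def run_level_encode(coeffs):
    """Represent a coefficient sequence as (run, level) pairs:
    run = number of zeros preceding each non-zero value,
    level = the non-zero value itself (sign and magnitude)."""
    pairs, run = [], 0
    for c in coeffs:
        if c == 0:
            run += 1
        else:
            pairs.append((run, c))
            run = 0
    # trailing zeros after the last non-zero value are not coded here;
    # a real codec signals them with e.g. an end-of-block code
    return pairs

print(run_level_encode([8, 3, 2, 0, 0, -1, 0, 0, 0, 2]))
```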

14 Run-level coding (ex.)

15 Encoding  DCT  Reordering  Run-level coding  Entropy coding

16 Huffman Coding  ‘True’ Huffman Coding  Modified Huffman Coding  Table Design  Entropy Coding Example  Variable Length Encoder Design  Variable Length Decoder Design  Dealing with Errors

17 Huffman Coding  ‘True’ Huffman Coding  Modified Huffman Coding  Table Design  Entropy Coding Example  Variable Length Encoder Design  Variable Length Decoder Design  Dealing with Errors

18 True Huffman Coding

19 Generating the Huffman code tree  1. Order the list of data in increasing order of probability.  2. Combine the two lowest-probability data items into a ‘node’ and assign the joint probability of the data items to this node.  3. Reorder the remaining data items and node(s) in increasing order of probability and repeat step 2.
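
The three steps above can be sketched with a priority queue; the symbol probabilities here are assumptions for illustration, not taken from the slides.

```python
import heapq
from itertools import count

def huffman_codes(probs):
    """Build Huffman codewords from a {symbol: probability} table."""
    tick = count()  # tie-breaker so the heap never compares dicts
    # Step 1: a heap ordered by increasing probability.
    heap = [(p, next(tick), {s: ""}) for s, p in probs.items()]
    heapq.heapify(heap)
    # Steps 2-3: merge the two lowest-probability items into a node,
    # assign it their joint probability, and repeat until one remains.
    while len(heap) > 1:
        p0, _, c0 = heapq.heappop(heap)
        p1, _, c1 = heapq.heappop(heap)
        merged = {s: "0" + code for s, code in c0.items()}
        merged.update({s: "1" + code for s, code in c1.items()})
        heapq.heappush(heap, (p0 + p1, next(tick), merged))
    return heap[0][2]

# assumed probabilities, for illustration only
codes = huffman_codes({"0": 0.5, "1": 0.2, "-1": 0.2, "2": 0.1})
print(codes)
```

The resulting code is prefix-free (no codeword is a prefix of another), which is what lets the decoder parse the bitstream without separators: frequent symbols get short codewords, rare ones get long codewords.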

20

21 encoding

22 decoding

23 Carphone vs. Claire

24 Claire

25 Disadvantages  The decoder must use the same codeword set as the encoder, which reduces compression efficiency  Calculating the probability table for a large video sequence is a significant computational overhead  The table cannot be calculated until after the video data is encoded

26 Huffman Coding  ‘True’ Huffman Coding  Modified Huffman Coding  Table Design  Entropy Coding Example  Variable Length Encoder Design  Variable Length Decoder Design  Dealing with Errors

27 Table design Final pair

28 Table design

29 H.263/MPEG-4 motion vector difference (MVD)  The H.263/MPEG-4 differentially coded motion vectors (MVD) are each encoded as a pair of VLCs, one for the x-component and one for the y-component

30 mvd

31

32 H.26L universal VLC (UVLC)  The emerging H.26L standard takes a step away from individually calculated Huffman tables by using a ‘universal’ set of VLCs for any coded element. Each codeword is generated from the following systematic list:
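
The systematic list can be sketched as follows, assuming the interleaved construction commonly described for the H.26L UVLC: the INFO bits of code_num + 1 (everything after its leading 1) are each prefixed by a 0, and the codeword ends in a 1. The codeword lengths match the exponential-Golomb codes later adopted when H.26L became H.264, though the bit ordering there differs.

```python
def uvlc(code_num):
    """UVLC codeword for a non-negative code index, assuming the
    interleaved form: each INFO bit of code_num + 1 prefixed by 0,
    terminated by a single 1."""
    bits = bin(code_num + 1)[2:]              # e.g. code_num = 3 -> '100'
    return "".join("0" + b for b in bits[1:]) + "1"

for n in range(6):
    print(n, uvlc(n))   # 1, 001, 011, 00001, 00011, 01001, ...
```

The appeal is that no per-element Huffman table needs to be designed, stored, or transmitted: every coded element maps its values onto this single systematic list.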

33

34

35 Huffman Coding  ‘True’ Huffman Coding  Modified Huffman Coding  Table Design  Entropy Coding Example  Variable Length Encoder Design  Variable Length Decoder Design  Dealing with Errors

36 Entropy Coding Example

37 Huffman Coding  ‘True’ Huffman Coding  Modified Huffman Coding  Table Design  Entropy Coding Example  Variable Length Encoder Design  Variable Length Decoder Design  Dealing with Errors

38 Variable Length Encoder Design

39 Huffman Coding  ‘True’ Huffman Coding  Modified Huffman Coding  Table Design  Entropy Coding Example  Variable Length Encoder Design  Variable Length Decoder Design  Dealing with Errors

40 Variable Length Decoder Design

41 Huffman Coding  ‘True’ Huffman Coding  Modified Huffman Coding  Table Design  Entropy Coding Example  Variable Length Encoder Design  Variable Length Decoder Design  Dealing with Errors

42 Dealing with Errors

43 Arithmetic Coding

44 Ex. encode (0, -1, 0, 2) → 0.394

45

46 Ex. decode (0,-1,0,2)

47 Arithmetic coding vs. Huffman  Ideal: 0.394 in this case, which can be represented as a fixed-point number with sufficient accuracy using 9 bits  Huffman: 1;0011;1;0000110 => 13 bits
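
A minimal interval-narrowing sketch of the arithmetic encoder follows. The symbol probabilities here are assumptions for illustration, not the slides' model, so the resulting interval differs from the 0.394 value above; the mechanism is the same.

```python
from fractions import Fraction

def narrow(symbols, model):
    """Narrow the interval [low, high) once per symbol, using the
    symbol's cumulative-probability sub-range within the model."""
    low, high = Fraction(0), Fraction(1)
    for s in symbols:
        rng = high - low
        c_lo, c_hi = model[s]
        low, high = low + rng * c_lo, low + rng * c_hi
    return low, high

# assumed model (NOT the slides' probabilities):
# P(0) = 0.5, P(-1) = 0.2, P(1) = 0.2, P(2) = 0.1
model = {
    0:  (Fraction(0),     Fraction(1, 2)),
    -1: (Fraction(1, 2),  Fraction(7, 10)),
    1:  (Fraction(7, 10), Fraction(9, 10)),
    2:  (Fraction(9, 10), Fraction(1)),
}
low, high = narrow([0, -1, 0, 2], model)
print(low, high)  # any number in [low, high) identifies the sequence
```

The final interval width equals the product of the symbol probabilities, so roughly -log2(width) bits suffice to pick a number inside it. This is why arithmetic coding can beat Huffman coding, which must spend a whole number of bits on every symbol.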

48  The end  Thank you

