
EE 4780 Huffman Coding Example
Bahadir K. Gunturk

Huffman Coding Example

Suppose X is a source producing symbols that come from the alphabet A = {a1, a2, a3, a4, a5}, with symbol probabilities {0.4, 0.2, 0.2, 0.15, 0.05}. Form the Huffman tree by repeatedly merging the two least probable nodes: a4 + a5 = 0.2; then 0.2 + a3 = 0.4; then 0.4 + a2 = 0.6; finally 0.6 + a1 = 1.0. Reading the branch labels from root to leaf gives one valid code assignment (which branch is labeled 0 and which 1 is a free choice):

Symbol | Probability | Codeword
a1     | 0.40        | 1
a2     | 0.20        | 01
a3     | 0.20        | 001
a4     | 0.15        | 0001
a5     | 0.05        | 0000

Average codeword length = 0.4*1 + 0.2*2 + 0.2*3 + 0.15*4 + 0.05*4 = 2.2 bits per symbol
Entropy = -Σ p_i log2 p_i ≈ 2.08 bits per symbol
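The construction on this slide is easy to check in code. Below is a minimal Python sketch (not from the slides; the function name huffman_lengths is mine) that builds the tree with a min-heap and tracks only codeword lengths. Depending on how ties between equal probabilities are broken, it may produce either of the two equally optimal trees in this example, but the average length and entropy come out the same.

```python
import heapq
from math import log2

def huffman_lengths(probs):
    """Return Huffman codeword lengths for a {symbol: probability} dict.

    Each heap entry is (probability, tie_breaker, {symbol: depth});
    merging two entries increases every contained symbol's depth by 1.
    """
    heap = [(p, i, {sym: 0}) for i, (sym, p) in enumerate(probs.items())]
    heapq.heapify(heap)
    next_id = len(heap)  # unique tie-breaker so the dicts are never compared
    while len(heap) > 1:
        p1, _, d1 = heapq.heappop(heap)
        p2, _, d2 = heapq.heappop(heap)
        merged = {sym: depth + 1 for sym, depth in {**d1, **d2}.items()}
        heapq.heappush(heap, (p1 + p2, next_id, merged))
        next_id += 1
    return heap[0][2]

probs = {"a1": 0.40, "a2": 0.20, "a3": 0.20, "a4": 0.15, "a5": 0.05}
lengths = huffman_lengths(probs)
avg = sum(probs[s] * lengths[s] for s in probs)
entropy = -sum(p * log2(p) for p in probs.values())

print(lengths)  # tie-breaking picks one of the equally optimal trees
print(f"average length = {avg:.2f} bits/symbol")  # 2.20
print(f"entropy        = {entropy:.2f} bits/symbol")  # 2.08
```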

Huffman Coding Example (cont.)

Another possible tree for the same source merges a2 with a3 instead: a4 + a5 = 0.2; a2 + a3 = 0.4; a1 + 0.2 = 0.6; finally 0.6 + 0.4 = 1.0. One valid code assignment for this tree:

Symbol | Probability | Codeword
a1     | 0.40        | 00
a2     | 0.20        | 10
a3     | 0.20        | 11
a4     | 0.15        | 010
a5     | 0.05        | 011

Average codeword length = 0.4*2 + 0.2*2 + 0.2*2 + 0.15*3 + 0.05*3 = 2.2 bits per symbol. The Huffman tree is not unique, but every Huffman tree for a given source achieves the same minimum average codeword length.
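As a quick check (not part of the slides), both sets of codeword lengths fill the binary tree exactly (Kraft sum equal to 1) and give the same 2.2-bit average:

```python
# Verify that both length assignments from the slides are complete
# prefix codes (sum of 2^-l over all symbols equals 1) and that they
# achieve the same optimal average codeword length.
probs    = [0.40, 0.20, 0.20, 0.15, 0.05]
lengths1 = [1, 2, 3, 4, 4]  # lengths from the first tree
lengths2 = [2, 2, 2, 3, 3]  # lengths from this slide's tree

for lengths in (lengths1, lengths2):
    kraft = sum(2 ** -l for l in lengths)
    avg = sum(p * l for p, l in zip(probs, lengths))
    print(f"Kraft sum = {kraft:.2f}, average = {avg:.2f} bits/symbol")
```

Both lines print "Kraft sum = 1.00, average = 2.20 bits/symbol", confirming that the two trees are equally good.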