Image Compression, Transform Coding & the Haar Transform
4C8 – Dr. David Corrigan

Entropy
- It all starts with entropy.
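The defining formula on this slide did not survive the transcript; the standard first-order entropy of a source whose symbols occur with probabilities p_i is:

```latex
H = -\sum_{i} p_i \log_2 p_i \quad \text{(bits/symbol)}
```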

Calculating the Entropy of an Image
- The entropy of Lenna is approximately 7.57 bits/pixel.
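A minimal sketch of how a figure like that is computed, assuming an 8-bit grayscale image held as a NumPy array (the function name and the use of NumPy are illustrative, not from the slides):

```python
import numpy as np

def image_entropy(img):
    """First-order entropy of an 8-bit grayscale image, in bits/pixel."""
    hist = np.bincount(img.ravel(), minlength=256).astype(np.float64)
    p = hist / hist.sum()      # symbol probabilities from the histogram
    p = p[p > 0]               # drop empty bins, using 0 * log2(0) = 0
    return -np.sum(p * np.log2(p))

# e.g. image_entropy(lenna) gives roughly 7.57 for the 8-bit Lenna image
```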

Huffman Coding
- Huffman is the simplest entropy coding scheme.
- It achieves an average code length no more than 1 bit/symbol above the entropy.
- A binary tree is built by repeatedly combining the two symbols with the lowest probability into a dummy node.
- The code length for each symbol is the number of branches between the root and the respective leaf.
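A minimal sketch of that tree construction, assuming Python and a sequence of source symbols (names are illustrative):

```python
import heapq
from collections import Counter

def huffman_code_lengths(symbols):
    """Build a Huffman tree bottom-up and return {symbol: code length}."""
    freq = Counter(symbols)
    # Heap entries: (weight, tie-breaker, [(symbol, depth), ...]);
    # the integer tie-breaker stops Python comparing the lists.
    heap = [(w, i, [(s, 0)]) for i, (s, w) in enumerate(freq.items())]
    heapq.heapify(heap)
    tie = len(heap)
    while len(heap) > 1:
        w1, _, a = heapq.heappop(heap)           # the two least probable nodes...
        w2, _, b = heapq.heappop(heap)
        merged = [(s, d + 1) for s, d in a + b]  # ...sit one branch deeper under the dummy node
        heapq.heappush(heap, (w1 + w2, tie, merged))
        tie += 1
    return dict(heap[0][2])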

Huffman Coding of Lenna
(The table of symbols and their Huffman code lengths did not survive the transcript.)
- So the average code length is not much greater than the entropy.
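Continuing the sketch above, the average code length can be compared against the entropy (again with illustrative names, and reusing huffman_code_lengths from the previous block):

```python
from collections import Counter

def average_code_length(symbols):
    """Expected Huffman code length in bits/symbol."""
    lengths = huffman_code_lengths(symbols)
    freq = Counter(symbols)
    n = sum(freq.values())
    return sum(freq[s] / n * lengths[s] for s in freq)

# For Lenna's pixel values this comes out slightly above
# the 7.57 bits/pixel entropy, as the slide notes.
```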

But this is not very good. Why?
- Entropy is not the minimum average codeword length for a source with memory.
- If the other pixel values are known, we can predict the unknown pixel with much greater certainty, and hence the effective (i.e. conditional) entropy is much less.

Entropy Rate
- The minimum average codeword length for any source.
- It is defined as

```latex
H_{\text{rate}} = \lim_{n \to \infty} \frac{1}{n} H(X_1, X_2, \ldots, X_n)
```

Coding Sources with Memory
- It is very difficult to achieve codeword lengths close to the entropy rate.
- In fact it is difficult to calculate the entropy rate itself.
- We looked at LZW as a practical coding algorithm (a sketch follows below).
- The average codeword length tends to the entropy rate if the file is large enough.
- Efficiency is improved if we use Huffman to encode the output of LZW.
- LZ algorithms are used in lossless compression formats (e.g. .tiff, .png, .gif, .zip, .gz, .rar, ...).
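A minimal sketch of LZW encoding over bytes (real .gif/.tiff encoders add variable code widths and dictionary resets, which are omitted here):

```python
def lzw_encode(data):
    """Minimal LZW encoder for a bytes object; returns a list of integer codes."""
    table = {bytes([i]): i for i in range(256)}   # dictionary starts with all single bytes
    next_code = 256
    codes, s = [], b""
    for ch in data:
        ch = bytes([ch])
        if s + ch in table:
            s += ch                     # keep extending the current match
        else:
            codes.append(table[s])      # emit the longest string already in the table
            table[s + ch] = next_code   # learn the new string for later
            next_code += 1
            s = ch
    if s:
        codes.append(table[s])
    return codes

# Repeated patterns map to single codes, so the output shrinks
# as the dictionary adapts to the source's memory.
```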

Efficiency of Lossless Compression
- Lenna (256 x 256) file sizes:
  - Uncompressed tiff – (figure missing from the transcript; 256 x 256 pixels at 8 bits/pixel is 65.5 kB of raw data)
  - LZW tiff – 69.0 kB
  - Deflate (LZ77 + Huffman) – 58 kB
- Green Screen (1920 x 1080) file sizes:
  - Uncompressed – 5.93 MB
  - LZW – 4.85 MB
  - Deflate – 3.7 MB

Differential Coding
- Key idea: code the differences in intensity, G(x, y) = I(x, y) - I(x-1, y).
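A minimal sketch of that differencing and its exact inverse, assuming a NumPy uint8 image (function names are illustrative):

```python
import numpy as np

def difference_image(img):
    """G(x, y) = I(x, y) - I(x-1, y); the first column is kept unchanged."""
    g = img.astype(np.int16)          # differences need a signed range (-255..255)
    d = np.empty_like(g)
    d[:, 0] = g[:, 0]
    d[:, 1:] = g[:, 1:] - g[:, :-1]   # horizontal neighbour differences
    return d

def reconstruct(d):
    """Invert the differencing with a cumulative sum along each row."""
    return np.cumsum(d, axis=1).astype(np.uint8)

# The difference histogram is sharply peaked around 0, which is why its
# entropy (~5.6 bits/pixel for Lenna) is well below the original 7.57.
```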

Differential Coding
- The entropy is now 5.60 bits/pixel, which is much less than the 7.57 bits/pixel we had before (despite having twice as many symbols, since the differences range from -255 to 255).
- Pipeline: Calculate Difference Image → Huffman Encoding → Channel → Huffman Decoding → Image Reconstruction.

So why does this work?
- Plot a graph of the binary entropy H(p) = -p log2(p) - (1 - p) log2(1 - p) against p: it peaks at 1 bit when p = 0.5 and falls towards 0 as p approaches 0 or 1.
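The slide's plot did not survive the transcript; a small sketch for evaluating (or plotting) the curve, with illustrative names:

```python
import numpy as np

def binary_entropy(p):
    """H(p) in bits; p is clipped so H(0) and H(1) evaluate to ~0."""
    p = np.clip(np.asarray(p, dtype=float), 1e-12, 1 - 1e-12)
    return -p * np.log2(p) - (1 - p) * np.log2(1 - p)

# binary_entropy(0.5) == 1.0 (the maximum); binary_entropy(0.1) ~ 0.47,
# so a heavily skewed two-symbol source costs under half a bit per symbol.
```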

In general
- Entropy of a source is maximised when all symbols are equiprobable, and is less when a few symbols are much more probable than the others.
(Figures: histogram of the original image, entropy = 7.57 bits/pixel; histogram of the difference image, entropy = 5.6 bits/pixel.)

Lossy Compression
- But this is still not enough compression.
- The trick is to throw away data that has the least perceptual significance.
(Comparison figures: original at an effective bit rate of 8 bits/pixel; compressed at an effective bit rate of approximately 1 bit/pixel.)
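The lecture's route to ~1 bit/pixel is transform coding (the Haar transform, next). As a baseline for contrast only, and not the method on the slides, naive uniform requantisation to 1 bit/pixel simply keeps the most significant bit and looks far worse at the same rate:

```python
import numpy as np

def requantize(img, bits):
    """Naive uniform requantisation of an 8-bit image to `bits` bits/pixel."""
    step = 256 // (2 ** bits)                 # quantizer step size
    return ((img // step) * step + step // 2).astype(np.uint8)

# requantize(lenna, 1) leaves only 2 grey levels, with severe banding;
# transform coding spends the same 1 bit/pixel where it matters perceptually.
```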