Digital Multimedia Coding


Digital Multimedia Coding Xiaolin Wu McMaster University Hamilton, Ontario, Canada Part 1. Basics

What is data compression?
Data compression is the art and science of representing information in a compact form. Data is a sequence of symbols taken from a discrete alphabet. We focus here on visual media (image/video): a digital image or frame is a collection of arrays (one per color plane) of values representing the intensity (color) at each spatial location (pixel).
11/17/2018

Why do we need data compression?
Still image
- An 8.5 x 11 inch page at 600 dpi is > 100 MB.
- Twenty 1K x 1K images in a digital camera generate 60 MB.
- A scanned 3 x 7 photograph at 300 dpi is 30 MB.
Digital cinema
- 4K x 2K x 3 x 12 bits/pel = 48 MB/frame, or 1.15 GB/sec, 69 GB/min!
Scientific/medical visualization
- fMRI data is width x height x depth x time!
More than just storage: consider the burden on transmission bandwidth and I/O throughput.
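The slide's raw-size arithmetic can be reproduced in a few lines of Python. This is a sketch: the 48 MB/frame figure works out exactly if each 12-bit sample is assumed to be stored padded to 2 bytes, and the per-second rate assumes 24 frames/sec — neither assumption is stated on the slide.

```python
# Raw (uncompressed) sizes from the slide, in bytes.
MiB = 2 ** 20

page = 8.5 * 600 * 11 * 600 * 3    # 600 dpi RGB page, 1 byte per channel
camera = 20 * 1024 * 1024 * 3      # twenty 1K x 1K RGB images
frame = 4096 * 2048 * 3 * 2        # 4K x 2K, 3 planes, 12 bits padded to 2 bytes

print(page / 10 ** 6)              # ~101 MB, i.e. "> 100 MB"
print(camera / MiB)                # 60 MiB
print(frame / MiB)                 # 48 MiB per frame
print(frame * 24 / 2 ** 30)        # ~1.125 GiB/sec at an assumed 24 frames/sec
```

The point of the exercise is that even one uncompressed cinema frame dwarfs typical channel capacities, which is what motivates everything that follows.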

What makes compression possible?
Statistical redundancy
- Spatial correlation: local (pixels at neighboring locations have similar intensities) and global (recurring patterns).
- Spectral correlation: between color planes.
- Temporal correlation: between consecutive frames.
Tolerance to loss of fidelity
- Perceptual redundancy.
- Limitations of the rendering hardware.

Elements of a compression algorithm
A typical encoder maps the source sequence to a compressed bit stream through three stages, each guided by a source model:
Transform -> Quantization -> Entropy Coding

Measures of performance
Compression measures
- Compression ratio = (size of original data) / (size of compressed data)
- Bits per symbol (rate)
Fidelity measures
- Mean square error (MSE)
- SNR: signal-to-noise ratio
- PSNR: peak signal-to-noise ratio
- HVS-based (human visual system) measures
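The fidelity measures above are straightforward to compute. A minimal sketch for MSE and PSNR on 8-bit samples (the function names are my own):

```python
import math

def mse(original, reconstructed):
    """Mean square error between two equal-length sample sequences."""
    return sum((a - b) ** 2 for a, b in zip(original, reconstructed)) / len(original)

def psnr(original, reconstructed, peak=255):
    """Peak signal-to-noise ratio in dB; peak=255 for 8-bit samples."""
    m = mse(original, reconstructed)
    return float("inf") if m == 0 else 10 * math.log10(peak ** 2 / m)
```

Higher PSNR means smaller distortion; identical signals give infinite PSNR, which is why PSNR is only used for lossy coding.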

Other issues
- Encoder and decoder computational complexity
- Memory requirements
- Fixed rate or variable rate
- Error resilience
- Symmetric or asymmetric
- Decompression at multiple resolutions
- Decompression at various bit rates
- Standard or proprietary

What is information?
- Semantic interpretation is subjective.
- Statistical interpretation (Shannon, 1948): the self-information i(A) associated with an event A of probability P(A) is
  i(A) = -log2 P(A)
- More probable events carry less information, and less probable events carry more.
- If A and B are two independent events, then the self-information satisfies i(AB) = i(A) + i(B).
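The self-information formula and its additivity over independent events can be checked directly (a small sketch; `self_info` is my own name):

```python
import math

def self_info(p):
    """Self-information in bits of an event with probability p."""
    return -math.log2(p)

# A fair coin flip carries 1 bit; a 1-in-4 event carries 2 bits.
print(self_info(0.5), self_info(0.25))  # 1.0 2.0

# Independence: P(AB) = P(A) P(B), so i(AB) = i(A) + i(B).
print(self_info(0.5 * 0.25))  # 3.0 = 1.0 + 2.0
```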

Entropy of a random variable
The entropy of a random variable X with alphabet {x1, ..., xn} and pmf p is defined as
  H(X) = -sum_i p(xi) log2 p(xi)
This is the average self-information of the r.v. X. The average number of bits needed to describe an instance of X is bounded below by its entropy, and this bound is tight: codes exist whose average length comes within one bit of H(X) (Shannon's noiseless source coding theorem).
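Entropy as average self-information can be sketched in a few lines (the function name is my own):

```python
import math

def entropy(pmf):
    """Entropy in bits of a pmf given as an iterable of probabilities."""
    return -sum(p * math.log2(p) for p in pmf if p > 0)

# A uniform 8-symbol alphabet needs 3 bits/symbol on average --
# exactly the fixed-length code, so no compression is possible.
print(entropy([1 / 8] * 8))  # 3.0

# A skewed source has lower entropy, leaving room for compression.
print(entropy([0.5, 0.25, 0.125, 0.125]))  # 1.75
```

The second pmf is dyadic, so a Huffman code with lengths 1, 2, 3, 3 achieves its entropy exactly, illustrating the tightness of the bound.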

Entropy of a binary-valued r.v.
Let X be an r.v. whose set of outcomes is {0, 1}, with p(0) = p and p(1) = 1 - p. Plotting
  H(X) = -p log2 p - (1-p) log2 (1-p)
shows that:
- H(X) is maximal (1 bit) when p = 1/2.
- H(X) = 0 if and only if p = 0 or p = 1.
- H(X) is continuous in p.
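The three claims about the binary entropy function can be verified numerically (a sketch; the grid resolution is arbitrary):

```python
import math

def h(p):
    """Binary entropy in bits."""
    if p in (0, 1):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

# Endpoints carry no uncertainty; a fair coin carries the maximum, 1 bit.
print(h(0), h(1), h(0.5))  # 0.0 0.0 1.0

# The maximum over a fine grid sits at p = 1/2.
grid = [i / 1000 for i in range(1001)]
print(max(grid, key=h))  # 0.5
```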

Properties of the entropy function
Entropy can also be viewed as a measure of the uncertainty in X. It can be shown to be the only function (up to a constant factor) satisfying the following:
- If all events are equally likely, entropy increases with the number of events.
- If X and Y are independent, then H(XY) = H(X) + H(Y).
- The information content of an event does not depend on the manner in which the event is specified.
- The information measure is continuous.

Entropy of a stochastic process
A stochastic process S = {Xi} is an indexed sequence of r.v.'s characterized by its joint pmfs. The entropy of S, a measure of average information per symbol, is defined as
  H(S) = lim_{n -> inf} (1/n) H(X1, X2, ..., Xn)
In practice this is difficult to determine, since knowledge of the source statistics is never complete.
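For a source with memory, the per-symbol entropy rate falls below the entropy of the marginal distribution. A sketch for a two-state Markov chain (the transition matrix is an invented example), whose entropy rate is the stationary average of the row entropies:

```python
import math

def entropy(pmf):
    return -sum(p * math.log2(p) for p in pmf if p > 0)

# Transition probabilities P[i][j] = Pr(next = j | current = i).
P = [[0.9, 0.1],
     [0.2, 0.8]]

# Stationary distribution, solved by hand for this 2-state chain:
# pi0 * 0.1 = pi1 * 0.2  =>  pi = (2/3, 1/3).
pi = [2 / 3, 1 / 3]

rate = sum(pi[i] * entropy(P[i]) for i in range(2))  # bits/symbol with memory
marginal = entropy(pi)                               # bits/symbol ignoring memory

print(rate, marginal)  # the rate is strictly smaller
```

The gap between the two numbers is exactly what a coder exploiting temporal context can save over one that codes each symbol independently.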

Joint entropy and conditional entropy
For r.v.'s X and Y with joint pmf p(x, y):
- The joint entropy H(X,Y) is defined as
  H(X,Y) = -sum_{x,y} p(x,y) log2 p(x,y)
- The conditional entropy H(Y|X) is defined as
  H(Y|X) = -sum_{x,y} p(x,y) log2 p(y|x)
- It is easy to show the chain rule H(X,Y) = H(X) + H(Y|X).
- The mutual information I(X;Y) is defined as
  I(X;Y) = H(Y) - H(Y|X) = H(X) + H(Y) - H(X,Y)
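These definitions and the chain rule H(X,Y) = H(X) + H(Y|X) can be verified on a small joint pmf (the table values are an invented example):

```python
import math

# Joint pmf p(x, y) over X, Y in {0, 1} (invented example values).
joint = {(0, 0): 0.5, (0, 1): 0.25, (1, 0): 0.125, (1, 1): 0.125}

def H(pmf):
    return -sum(p * math.log2(p) for p in pmf if p > 0)

hxy = H(joint.values())  # joint entropy H(X,Y)
px = [sum(p for (x, _), p in joint.items() if x == v) for v in (0, 1)]
py = [sum(p for (_, y), p in joint.items() if y == v) for v in (0, 1)]

# Conditional entropy H(Y|X) = -sum p(x,y) log2 p(y|x), with p(y|x) = p(x,y)/p(x).
hy_given_x = -sum(p * math.log2(p / px[x]) for (x, y), p in joint.items())

print(hxy, H(px) + hy_given_x)  # chain rule: the two agree
print(H(px) + H(py) - hxy)      # mutual information I(X;Y), > 0 here
```

Since I(X;Y) > 0 for this table, X and Y are dependent: knowing X reduces the bits needed to code Y, which is the formal basis for context-based coding.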

General references on data compression
- V. Bhaskaran and K. Konstantinides, Image and Video Compression Standards, Kluwer. Excellent reference for engineers.
- K. Sayood, Introduction to Data Compression, Morgan Kaufmann. Excellent introductory text.
- T. Cover and J. Thomas, Elements of Information Theory, Wiley-Interscience. Excellent introduction to the theoretical aspects.