Entropy and Information Theory

Aida Austin
4/24/2009

Overview
- What is information theory?
- Random variables and entropy
- Entropy in information theory
- Applications
  - Compression
  - Data transmission

Information Theory
- Developed in 1948 by Claude E. Shannon at Bell Laboratories
- Introduced in "A Mathematical Theory of Communication"
- Goal: efficient and reliable transmission of information over a noisy channel
- Defines fundamental limits on data compression and on the rate of reliable communication

Random Variables
- A random variable is a function: it assigns a numerical value to each possible outcome (event) of an experiment.
- Example: a fair coin is tossed. Let X be a random variable on the outcomes.
- Possible outcomes: heads and tails, each with probability 1/2; by convention, say X = 1 for heads and X = 0 for tails.
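
A small Python sketch (my own illustration, not from the slides; the heads -> 1, tails -> 0 assignment is the assumed convention mentioned above) of a random variable as a function from outcomes to numbers:

```python
import random

def X(outcome):
    """Random variable for a fair coin toss: maps each outcome to a number."""
    return {"heads": 1, "tails": 0}[outcome]

outcome = random.choice(["heads", "tails"])  # each outcome has probability 1/2
print(outcome, "->", X(outcome))             # e.g. "heads -> 1"
```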

Entropy in Information Theory
- Entropy measures the average information content missing from a set of data when the value of the random variable is not known: H(X) = -Σ p(x) log2 p(x).
- It determines the average number of bits needed to store or communicate a signal.
- The more (equally likely) possible outcomes a random variable has, the higher its entropy.
- As entropy increases, redundancy decreases: the data becomes less predictable and harder to compress further.
- Example: a 128 kbps MP3 carries more information per bit than a 320 kbps MP3 of the same audio, because heavier compression has removed more of the redundancy.
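
To make the definition concrete, here is a minimal Python sketch (not from the original slides) of Shannon's formula:

```python
import math

def entropy(probs):
    """Shannon entropy H(X) = -sum(p * log2(p)), in bits."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

print(entropy([0.5, 0.5]))   # fair coin: 1.0 bit (maximum for two outcomes)
print(entropy([0.9, 0.1]))   # biased coin: ~0.47 bits (more predictable)
print(entropy([0.25] * 4))   # fair four-sided die: 2.0 bits (more outcomes, higher entropy)
```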

Applications
- Data compression: MP3 (lossy), JPEG (lossy), ZIP (lossless)
- Cryptography: encryption and decryption
- Signal transmission across a network: email, text messages, cell phones

Data Compression
- "Shrinks" the size of a signal, file, etc. to reduce the cost of storage and transmission.
- Compression works by removing redundancy, so the encoded size approaches the entropy of the data.
- Entropy gives the minimum average number of bits per symbol achievable by any lossless code.
- A code stays lossless (no data lost) only if its rate is at least the entropy rate; compressing below the entropy rate forces information loss.
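
As a concrete illustration of the entropy bound, here is a minimal Huffman coding sketch (my own example with a toy input string, not taken from the slides); the average codeword length of this classic lossless code comes within one bit per symbol of the entropy:

```python
import heapq
import math
from collections import Counter

def huffman_code(freqs):
    """Build a prefix code from symbol frequencies using a min-heap."""
    # Heap entries: (weight, tiebreaker, {symbol: partial codeword}).
    heap = [(w, i, {sym: ""}) for i, (sym, w) in enumerate(freqs.items())]
    heapq.heapify(heap)
    count = len(heap)
    while len(heap) > 1:
        w1, _, code1 = heapq.heappop(heap)
        w2, _, code2 = heapq.heappop(heap)
        # Prepend '0' to one subtree's codewords and '1' to the other's.
        merged = {s: "0" + c for s, c in code1.items()}
        merged.update({s: "1" + c for s, c in code2.items()})
        heapq.heappush(heap, (w1 + w2, count, merged))
        count += 1
    return heap[0][2]

text = "abracadabra"
freqs = Counter(text)
n = len(text)
code = huffman_code(freqs)

entropy = -sum((w / n) * math.log2(w / n) for w in freqs.values())
avg_len = sum((w / n) * len(code[s]) for s, w in freqs.items())
print(code)
print(f"entropy = {entropy:.3f} bits/symbol, Huffman average = {avg_len:.3f}")
```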

Signal/Data Transmission
- Channel coding adds structured redundancy to reduce bit errors and bit loss caused by noise in a network.
- As the noise entropy of the channel increases, less of the transmitted information survives to the receiver.
- Example: consider a signal modeled as a sequence of random variables. We may know the probabilities of the values being transmitted, but noise means we cannot be certain what will be received; Shannon showed that reliable transmission is possible only when the transmission rate stays below the channel capacity.
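
A minimal simulation sketch (my own illustration, not from the slides): random bits pass through a binary symmetric channel that flips each bit with probability p = 0.1, and a 3x repetition code with majority-vote decoding shows how channel coding trades rate for a lower error probability:

```python
import random

def bsc(bits, p):
    """Binary symmetric channel: flip each bit independently with probability p."""
    return [b ^ (random.random() < p) for b in bits]

random.seed(0)
p = 0.1
message = [random.randint(0, 1) for _ in range(10000)]

# Uncoded transmission: errors occur at rate ~p.
received = bsc(message, p)
uncoded = sum(m != r for m, r in zip(message, received)) / len(message)

# Repetition code: send each bit three times, decode by majority vote.
encoded = [b for b in message for _ in range(3)]
noisy = bsc(encoded, p)
decoded = [int(sum(noisy[3 * i:3 * i + 3]) >= 2) for i in range(len(message))]
coded = sum(m != d for m, d in zip(message, decoded)) / len(message)

print(f"uncoded error rate: {uncoded:.4f}")  # close to 0.10
print(f"coded error rate:   {coded:.4f}")    # close to 3p^2 - 2p^3 = 0.028
```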

Questions?

Resources
- http://www.gap-system.org/~history/PictDisplay/Shannon.html
- http://en.wikipedia.org/wiki/Information_entropy
- http://en.wikipedia.org/wiki/Bit_rate
- http://lcni.uoregon.edu/~mark/Stat_mech/thermodynamic_entropy_and_information.html
- http://www.data-compression.com/theory.html