Introduction to Information Theory: Entropy


Introduction to Information Theory: Entropy
Subject: Information Theory and Coding
By Professor Dr. Ayad Ghany Ismaeel, Dept. of Computer Technical Eng. / Computer Comm. Network
6/25/2018

Topics
Model of a Digital Communication System
Communication Channel
Shannon's Definition of Communication
Shannon wants to maximize the speed of ADSL at your home
"Information Theory" or "The Shannon Theory"
In Terms of Information Theory Terminology
Measurement of Information
Entropy: Discrete and Continuous
References

Model of a Digital Communication System
Shannon: the father of digital communication. The model:
Information Source (message, e.g. English symbols) → Encoder / Coding (e.g. English to a 0,1 sequence) → Communication Channel (can have noise or distortion) → Decoder / Decoding (e.g. 0,1 sequence back to English) → Destination

Communication Channel Includes

And even this…

Shannon's Definition of Communication
"The fundamental problem of communication is that of reproducing at one point either exactly or approximately a message selected at another point."
"Frequently the messages have meaning"
"... [which is] irrelevant to the engineering problem."

Shannon Wants to…
Shannon wants to find a way to transmit data "reliably" through the channel at the "maximal" possible rate.
Information Source → Coding → Communication Channel → Decoding → Destination
For example, maximizing the speed of ADSL at your home.

And he thought about this problem for a while… He later found a solution and published it in his 1948 paper.

In his 1948 paper he built a rich theory for the problem of reliable communication, now called "Information Theory" or "The Shannon Theory" in his honor.

Shannon's Vision
Data → Source Encoding → Channel Encoding → Channel → Channel Decoding → Source Decoding → User

Example: Disk Storage
Data → Zip → Add CRC → Channel → Verify CRC → Unzip → User

In Terms of Information Theory Terminology
Zip = Source Encoding = Data Compression
Unzip = Source Decoding = Data Decompression
Add CRC = Channel Encoding = Error Protection (a cyclic redundancy check; a CRC error indicates that some data in your Zip file (.zip or .zipx) is damaged)
Verify CRC = Channel Decoding = Error Correction
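As a rough illustration of this pipeline (a minimal sketch, not part of the original slides), Python's standard zlib module can play both roles: compress/decompress act as source coding, and crc32 acts as a simple channel code. Note that a plain CRC only detects errors; correcting them requires a stronger channel code.

import zlib

message = b"information theory example"

# Source encoding: compress the data (like Zip).
compressed = zlib.compress(message)

# Channel encoding: attach a CRC-32 checksum for error protection.
crc = zlib.crc32(compressed)

# ... the (compressed, crc) pair is stored on disk or sent over a channel ...

# Channel decoding: verify the CRC to detect corruption.
if zlib.crc32(compressed) != crc:
    raise ValueError("CRC error: data was damaged in the channel")

# Source decoding: decompress (like Unzip).
recovered = zlib.decompress(compressed)
assert recovered == message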

Shannon Theory
The original 1948 Shannon theory contains:
Measurement of Information
Source Coding Theory
Channel Coding Theory

Measurement of Information
Shannon's first question is "How to measure information in terms of bits?"

All events are probabilistic! Using probability theory, Shannon showed that there is only one way to measure information in terms of a number of bits, called the entropy function:
H = – Σ pi log2(pi)
which is the expected value (average) of the information contained in each message.
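A small sketch of this formula in Python (the function and variable names below are my own, not from the slides):

import math

def entropy(probs, base=2):
    # Shannon entropy H = -sum(p * log(p)) of a discrete distribution.
    return -sum(p * math.log(p, base) for p in probs if p > 0)

# Fair coin: two outcomes with probability 1/2 each -> 1 bit.
print(entropy([0.5, 0.5]))   # 1.0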

For Example
Rolling a die: the outcomes are 1, 2, 3, 4, 5, 6, each occurring with probability 1/6.
The information provided by rolling the die is
H = – Σ (1/6) log2(1/6) = log2 6 ≈ 2.585 bits
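Checking this numerically with the entropy sketch above (again illustrative, not part of the slides):

# Fair six-sided die: six equally likely outcomes.
print(entropy([1/6] * 6))   # 2.584962500721156
print(math.log2(6))         # same value: for a uniform distribution H = log2(number of outcomes)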

Wait! Is this nonsense? The number 2.585 bits is not an integer!! What does that mean?

For example, consider tossing an ordinary coin. The probability of getting tails is equal to the probability of getting heads, so (using logarithm base 2):
p1 = 0.5, h1 = i(p1) = –log 0.5 = 1
p2 = 0.5, h2 = i(p2) = –log 0.5 = 1
The average amount of information, i.e. the entropy, is therefore:
H = p1·h1 + p2·h2 = 0.5(1) + 0.5(1) = 1
In general, the average amount of information is computed from the relationship H = – Σ pi log2(pi). Shannon proposed the bit as the unit when logarithm base 2 is used to compute the entropy; a bit here means the amount of information we obtain from one binary digit.

Example
Suppose you have:
p1 = 0.5: the chance of rain tomorrow
p2 = 0.001: the chance that I treat you to dinner at my expense
Does the logarithm capture this idea of an amount of information? Yes, fortunately. It satisfies the following:
log(1) = 0
log(0) = –infinity
Thus it can be used as a measure of information if multiplied by (–1), which gives zero when the event contains no surprise (an outcome we already know is certain):
i(p) = –log(p)
To see how it behaves in this example, we temporarily use the natural logarithm:
h1 = i(p1) = –ln(0.5) = 0.693
h2 = i(p2) = –ln(0.001) = 6.9
h = h1 + h2 = 0.693 + 6.9 = 7.593
h' = i(p1·p2) = –ln(p1·p2) = –ln(p1) – ln(p2) = h1 + h2 = h
That is, the information from independent events adds up.

Continuous Entropy
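The original slide gives the definition as an image. The standard definition of the continuous (differential) entropy of a random variable X with probability density f is:

h(X) = -\int f(x) \log_2 f(x)\, dx

with the integral taken over the support of f.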

Continuous Entropy (Cont.): An Example
The uniform distribution: let f be the uniform density on [a, b]. That is:
f(x) = 1/(b – a) for a ≤ x ≤ b, and f(x) = 0 otherwise, which gives h(X) = log2(b – a).
Informally, the continuous entropy of the uniform distribution equals the log of the width of the interval. Note that h(X) can be negative! For example, if X is uniformly distributed on [0, 1/2], then h(X) = log2(1/2 – 0) = log2(1/2) = –1. So, unlike discrete entropy, continuous entropy can be negative.
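A quick numerical check of this example (an illustrative sketch, not from the slides), approximating -∫ f(x) log2 f(x) dx over [0, 1/2] with a Riemann sum:

import math

a, b = 0.0, 0.5
n = 100000                     # subintervals for the Riemann sum
dx = (b - a) / n
f = 1.0 / (b - a)              # uniform density on [a, b]

# Differential entropy h(X) = -integral of f(x) * log2(f(x)) dx over [a, b].
h = -sum(f * math.log2(f) * dx for _ in range(n))

print(h)                 # ≈ -1.0
print(math.log2(b - a))  # exact value: log2(0.5) = -1.0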

References
Richard W. Hamming, "Coding and Information Theory", 2nd Edition, Prentice-Hall, New Jersey, USA, 1986.
Mauro Barni and Benedetta Tondi, "Lecture Notes on Information Theory and Coding", Università degli Studi di Siena, Facoltà di Ingegneria, 2012.