Information Theory
Prepared by: Amit Degada, Teaching Assistant, ECED, NIT Surat

Goal of Today's Lecture
- Information theory: some introduction
- Information measure
- Determining a function for information
- Average information per symbol
- Information rate
- Coding
- Shannon-Fano coding

Information Theory
It is the study of communication engineering plus mathematics. A communication engineer has to fight with:
- Limited power
- Inevitable background noise
- Limited bandwidth

Information Theory deals with:
- The measure of source information
- The information capacity of the channel
- Coding
If the rate of information from a source does not exceed the capacity of the channel, then there exists a coding scheme such that information can be transmitted over the communication channel with an arbitrarily small amount of error, despite the presence of noise.

Information Measure
This is used to determine the information rate of discrete sources. Consider two messages:
- "A dog bites a man": high probability, less information.
- "A man bites a dog": low probability, high information.
So we can say that information ∝ 1/(probability of occurrence).

Information Measure
We can also state three rules from intuition.
Rule 1: The information I(mk) approaches 0 as Pk approaches 1. Mathematically, I(mk) → 0 as Pk → 1.
e.g. "The sun rises in the east."

Information Measure
Rule 2: The information content I(mk) must be a non-negative quantity; it may be zero. Mathematically, I(mk) ≥ 0 for 0 ≤ Pk ≤ 1.
e.g. "The sun rises in the west."

Information Measure
Rule 3: The information content of a message with higher probability is less than that of a message with lower probability. Mathematically, I(mk) > I(mj) if Pk < Pj.

Information Measure
We can also state, for two messages, that the information content of the combined messages equals the sum of the information contents of each message, provided the occurrences are mutually independent.
e.g. "There will be sunny weather today." "There will be cloudy weather tomorrow."
Mathematically, I(mk and mj) = I(mk, mj) = I(mk) + I(mj).

Information Measure
So the question is: which function can we use to measure information?
Information = F(1/Probability)
Requirements the function must satisfy:
- Its output must be a non-negative quantity, with minimum value 0.
- It must turn a product (of probabilities) into a summation.
This gives I(mk) = log_b(1/Pk).
Here b may be 2, e, or 10:
- b = 2: unit is bits
- b = e: unit is nats
- b = 10: unit is decits
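A quick Python illustration (my own sketch, not from the slides; the function name self_information is just a label):

import math

def self_information(p, base=2):
    # Information content I = log_b(1/p) of a message with probability p
    if not 0 < p <= 1:
        raise ValueError("p must lie in (0, 1]")
    return math.log(1.0 / p, base)

# The same message measured in bits, nats and decits
p = 0.25
print(self_information(p, 2))        # 2.0 bits
print(self_information(p, math.e))   # about 1.386 nats
print(self_information(p, 10))       # about 0.602 decits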

Conversion Between Units
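For reference, the change-of-base rule log_b(x) = log_a(x) / log_a(b) gives the standard conversions:
1 nat = log2(e) ≈ 1.443 bits
1 decit = log2(10) ≈ 3.322 bits
1 bit = ln(2) ≈ 0.693 nats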

Example: A source generates one of four symbols during each interval with probabilities P1 = 1/2, P2 = 1/4, P3 = P4 = 1/8. Find the information content of each message.
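Working this out with I(mk) = log2(1/Pk): I1 = log2(2) = 1 bit, I2 = log2(4) = 2 bits, and I3 = I4 = log2(8) = 3 bits.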

Average Information Content
It is necessary to define the information content per symbol, since the communication channel deals with symbols. Here we make the following assumptions:
- The source is stationary, so the probabilities remain constant with time.
- Successive symbols are statistically independent and are emitted at an average rate of r symbols per second.

Average Information Content
Suppose a source emits M possible symbols s1, s2, ..., sM with probabilities of occurrence p1, p2, ..., pM. In a long message containing N (>> M) symbols, s1 will occur about p1N times, s2 will occur about p2N times, and so on.

Average Information Content
Since s1 occurs p1N times, the information contributed by s1 is p1N log2(1/p1). Similarly, the information contributed by s2 is p2N log2(1/p2), and so on. Hence the total information content of the message is

I_total = Σ pi N log2(1/pi)

and dividing by the N symbols gives the average information per symbol

H = Σ pi log2(1/pi)   bits/symbol

It means that in a long message we can expect H bits of information per symbol. Another name for H is entropy.
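A short Python sketch of this calculation (illustrative only; the function name is mine):

import math

def entropy(probs):
    # Average information H = sum(p * log2(1/p)) in bits per symbol
    assert abs(sum(probs) - 1.0) < 1e-9, "probabilities must sum to 1"
    return sum(p * math.log2(1.0 / p) for p in probs if p > 0)

# Source from the earlier example: P = 1/2, 1/4, 1/8, 1/8
print(entropy([0.5, 0.25, 0.125, 0.125]))   # 1.75 bits/symbol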

Information Rate
Information rate = total information / time taken. If n symbols are transmitted at r symbols per second, the time taken is n/r seconds and the total information is nH. Hence the information rate is

R = nH / (n/r) = rH   bits/sec
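For example, for the four-symbol source of the earlier example (H = 1.75 bits/symbol), taking, say, r = 1000 symbols per second gives R = rH = 1750 bits/sec.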

Some Maths
H satisfies the following bounds:

0 ≤ H ≤ log2(M)

The maximum H occurs when all the messages are equally probable. Hence H also measures the uncertainty about which symbol will occur: as H approaches its maximum value, we cannot tell which message will occur. Consider a system that transmits only 2 messages, each with probability of occurrence 0.5. Then H = 1, and at any instant we cannot say which of the two messages will occur. So what happens for a source with more than two symbols?

Variation of H vs. p
Let's consider a binary source, i.e. M = 2, whose two symbols occur with probabilities p and 1 - p respectively, where 0 < p < 1. The entropy is then

H = p log2(1/p) + (1 - p) log2(1/(1 - p))

whose plot is the horseshoe-shaped curve.

Variation of H vs. p
Now we want to obtain the shape of the curve. Differentiating,

dH/dp = log2((1 - p)/p) = 0   gives   p = 1/2, where H = 1 bit/symbol.

Verify it is a maximum by double differentiation: d²H/dp² = -1 / (p(1 - p) ln 2) < 0.
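A quick numerical check (my own illustration):

import math

def binary_entropy(p):
    # H(p) = p*log2(1/p) + (1-p)*log2(1/(1-p)) for a binary source
    if p in (0.0, 1.0):
        return 0.0
    return p * math.log2(1 / p) + (1 - p) * math.log2(1 / (1 - p))

for p in (0.1, 0.3, 0.5, 0.7, 0.9):
    print(p, round(binary_entropy(p), 4))
# H peaks at 1.0 bit when p = 0.5 and falls off symmetrically on either side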

Example

Maximum Information Rate
We know that R = rH. Also, the maximum entropy is H_max = log2(M), reached when all symbols are equally likely. Hence the maximum information rate is

R_max = r H_max = r log2(M)   bits/sec
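For the binary source above, for instance, R_max = r log2(2) = r bits/sec.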

Coding for a Discrete Memoryless Source
Here "discrete" means the source emits symbols from a fixed set; "memoryless" means the occurrence of the present symbol is independent of the previous symbols.
Average code length:

N̄ = Σ pi Ni

where Ni is the code length of the i-th symbol in binary digits (binits).

Coding for a Discrete Memoryless Source
Code efficiency:

η = H / N̄

i.e. the ratio of the source entropy (bits/symbol) to the average code length (binits/symbol), with η ≤ 1.
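A short Python sketch combining the two formulas (illustrative; the names are my own):

import math

def code_efficiency(probs, lengths):
    # Efficiency = H / average code length, for a binary code
    H = sum(p * math.log2(1.0 / p) for p in probs if p > 0)
    avg_len = sum(p * n for p, n in zip(probs, lengths))
    return H, avg_len, H / avg_len

# Source P = 1/2, 1/4, 1/8, 1/8 coded with lengths 1, 2, 3, 3
H, avg_len, eta = code_efficiency([0.5, 0.25, 0.125, 0.125], [1, 2, 3, 3])
print(H, avg_len, eta)   # 1.75 bits/symbol, 1.75 binits/symbol, efficiency 1.0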

Coding for a Discrete Memoryless Source
Kraft's inequality:

K = Σ 2^(-Ni) ≤ 1

Only if this is satisfied can the coding be uniquely decipherable (separable).
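A one-line check in Python (my own illustration):

def kraft_sum(lengths):
    # K = sum(2 ** -Ni); K <= 1 is necessary for a uniquely decipherable binary code
    return sum(2.0 ** -n for n in lengths)

print(kraft_sum([1, 2, 3, 3]))   # 1.0  -> inequality satisfied
print(kraft_sum([1, 2, 2, 2]))   # 1.25 -> cannot be uniquely decipherable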

Example: Find the efficiency of each code and check Kraft's inequality.

mi   pi    Code I   Code II   Code III   Code IV
A    1/2   00       0         0          0
B    1/4   01       1         10         01
C    1/8   10       10        110        011
D    1/8   11       11        111        0111

Code II is not uniquely decipherable.
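Working through the table (assuming the probabilities and codewords shown above): the Kraft sums Σ 2^(-Ni) are 1 for Code I, 1.5 for Code II, 1 for Code III and 0.9375 for Code IV, so Code II violates the inequality and cannot be uniquely decipherable. For Code III, N̄ = 1(1/2) + 2(1/4) + 3(1/8) + 3(1/8) = 1.75 binits/symbol = H, giving an efficiency of 100%.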

Shannon-Fano Coding Technique
Algorithm (see the Python sketch after these steps):
Step 1: Arrange all messages in descending order of probability.
Step 2: Divide the sequence into two groups such that the sum of probabilities in each group is as nearly equal as possible.
Step 3: Assign 0 to the upper group and 1 to the lower group.
Step 4: Repeat steps 2 and 3 within each group until every group contains a single message.
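A possible Python sketch of this procedure (my own illustration, not from the slides; it is run on an eight-symbol source like the one in the example that follows):

def shannon_fano(symbols):
    # symbols: list of (name, probability) pairs.
    # Returns a dict mapping each name to its binary code string.
    # Step 1: arrange in descending order of probability
    ordered = sorted(symbols, key=lambda sp: sp[1], reverse=True)
    codes = {name: "" for name, _ in ordered}

    def split(group):
        if len(group) <= 1:
            return
        total = sum(p for _, p in group)
        # Step 2: choose the split point that makes the two sums most nearly equal
        running, best_i, best_diff = 0.0, 1, float("inf")
        for i in range(1, len(group)):
            running += group[i - 1][1]
            diff = abs(running - (total - running))
            if diff < best_diff:
                best_diff, best_i = diff, i
        upper, lower = group[:best_i], group[best_i:]
        # Step 3: assign 0 to the upper group and 1 to the lower group
        for name, _ in upper:
            codes[name] += "0"
        for name, _ in lower:
            codes[name] += "1"
        # Step 4: repeat within each group
        split(upper)
        split(lower)

    split(ordered)
    return codes

source = [("M1", 1/2), ("M2", 1/8), ("M3", 1/8), ("M4", 1/16),
          ("M5", 1/16), ("M6", 1/16), ("M7", 1/32), ("M8", 1/32)]
print(shannon_fano(source))
# expected codes: M1->0, M2->100, M3->101, M4->1100, M5->1101,
#                 M6->1110, M7->11110, M8->11111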

Example

Messages Mi    Pi      Code     No. of bits
M1             1/2     0        1
M2             1/8     100      3
M3             1/8     101      3
M4             1/16    1100     4
M5             1/16    1101     4
M6             1/16    1110     4
M7             1/32    11110    5
M8             1/32    11111    5
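As a check (assuming the probabilities above): H = Σ pi log2(1/pi) = 2.3125 bits/symbol and N̄ = Σ pi Ni = 2.3125 binits/symbol, so this Shannon-Fano code has an efficiency of 100%.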

These slides can be downloaded from www.amitdegada.weebly.com/download after 5:30 today.

Questions

Thank You