MODULE 2: ENTROPY
Prepared by: Engr. Jo-Ann C. Viñas


OBJECTIVES:
1. Define Information
2. Discuss the Characteristics of Information
3. Introduce Types of Information Sources and Types of Communication Systems
4. Explain Information Theory

INFORMATION
- Knowledge or intelligence communicated or received
- A numerical quantity that measures the uncertainty in the outcome of an experiment to be performed

CHARACTERISTICS OF INFORMATION
1. Only the quantity of information and its integrity are important, not the meaning
2. No information is transmitted by a continuous symbol
3. Information requires change

TYPES OF INFORMATION SOURCES
1. Analog Information Source - produces messages that are defined on a continuum
2. Digital Information Source - produces a finite set of possible messages

UNITS OF INFORMATION
1. Bits - logarithm taken to base 2
2. Dits (hartleys) - logarithm taken to base 10
3. Nats - logarithm taken to base e

INFORMATION TRANSFER RATE
- The number of binary digits (bits) transmitted per unit of time
- Expressed in bits per second (bps)
- Often called the bit rate

SIGNALLING RATE
- The rate at which transmission changes occur
- Specifies how fast the signal states change in a communication channel
- Often called the baud rate

CHARACTERISTICS OF A SQUARE WAVE
1. Has a frequency spectrum that contains the fundamental frequency and all its odd harmonics
2. The magnitude of the harmonic components decreases as the order goes up
3. In practice, harmonics of significant amplitude are limited to about the ninth

CHARACTERISTICS OF A PULSE TRAIN
1. Has a spectrum with many components
2. The frequencies of these components, and the bandwidth required to convey the signal, depend on the pattern of bits in the pulse train
3. The pattern that requires the greatest bandwidth is the one in which alternate bits change state

THINGS TO REMEMBER
1. A practical communication channel has to be capable of conveying any pattern of data transmitted
2. The absolute minimum bandwidth of a channel is theoretically the fundamental frequency of the square waveform
3. In practice, the bandwidth used is greater than this absolute minimum

DATA
- A form of information that is suitable for storage in, or processing by, a computer

INFORMATION THEORY
- The study of the ideal amount of data that should be transmitted so that the data can be conveyed efficiently, without transmitting redundant data

PROPERTIES OF A QUANTITATIVE MEASURE OF INFORMATION
1. If a particular message is known by the user prior to being transmitted, the message contains zero information.
2. If the potential messages from a source are all equally likely, then the information contained in each particular message should be equal to the number of 1s and 0s required to uniquely identify the message.
3. If two potential messages are not equally likely, the one with the lesser probability contains the greater amount of information.

I. INFORMATION MEASURE (I_i)
The information sent from a digital source when the i-th message is transmitted is given by:

I_i = log_b (1 / P_i) = -log_b P_i

where:
P_i - probability of transmitting the i-th message
b - base of the logarithm (2 for bits, 10 for dits, e for nats)
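The information measure can be checked with a short Python sketch (the probabilities below are illustrative, not taken from the module):

```python
import math

def information(p, base=2):
    """Self-information I = log_b(1/p) of a message with probability p."""
    return math.log(1.0 / p, base)

# A fair coin flip (p = 1/2) carries exactly 1 bit.
print(information(0.5))
# A rarer message (p = 1/8) carries more information: 3 bits.
print(information(1 / 8))
# The same message measured in dits (base 10) and nats (base e).
print(information(0.5, base=10), information(0.5, base=math.e))
```

Changing `base` switches the unit, matching the units slide: 2 for bits, 10 for dits, e for nats.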

EXAMPLE 1
Suppose that equal numbers of letter grades A, B, C, D, and F are given in a certain course. How much information in bits have you received when the instructor tells you that your grade is:
a. not F?
b. either A or B?
c. Repeat (a) and (b), solving for the amount of information in terms of dits.

EXAMPLE 2
A card is drawn at random from an ordinary deck of 52 playing cards. Find the information in bits that you receive when you are told that the card is:
a) a heart
b) a face card
c) a heart face card

EXAMPLE 3
Find the information content of a message that consists of a digital word 12 digits long, in which each digit may take on one of four possible levels. The probability of sending any of the four levels is assumed to be equal, and the level in any digit does not depend on the values taken on by previous digits.

EXAMPLE 4
Consider a source flipping a coin. How much information is contained in the message “the coin landed heads up”?

EXAMPLE 5
Consider a fast-food restaurant in which a customer is nine times as likely to order a hamburger as a fish sandwich. How much information is contained in the message “the customer wants a hamburger”? How much information is contained in the message “the customer wants a fish sandwich”?

EXAMPLE 6
How much information is contained in the message “you are reading this example”?

II. AVERAGE INFORMATION (ENTROPY, H)
- The average information content of a message from a particular source
- The expected information per symbol, in bits/symbol for base 2:

H = Σ P_i log_b (1 / P_i)
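The average-information calculation can be sketched in Python (a minimal illustration; the distributions are made up):

```python
import math

def entropy(probs, base=2):
    """Average information H = sum of P_i * log_b(1/P_i) over the source's symbols."""
    return sum(p * math.log(1.0 / p, base) for p in probs if p > 0)

# Two equally likely symbols: H = 1 bit/symbol, the maximum for a binary source.
print(entropy([0.5, 0.5]))
# A biased source (p = 0.9 / 0.1) averages well under 1 bit/symbol.
print(entropy([0.9, 0.1]))
```

Note that a certain outcome (p = 1) contributes zero, consistent with the first property of the information measure.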

III. RELATIVE ENTROPY
The ratio of the entropy of a source to the maximum value the entropy could take for the same source alphabet:

H_r = H / H_max

where:
H_max = log_b N
N = total number of symbols

IV. REDUNDANCY
The fraction of a message that could be removed without loss of information:

R = 1 - H / H_max
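Relative entropy and redundancy follow directly from the entropy; a minimal sketch (function names are my own):

```python
import math

def entropy(probs, base=2):
    """Average information per symbol."""
    return sum(p * math.log(1.0 / p, base) for p in probs if p > 0)

def relative_entropy(probs, base=2):
    """H / H_max, where H_max = log_b(N) for an N-symbol alphabet."""
    return entropy(probs, base) / math.log(len(probs), base)

def redundancy(probs, base=2):
    """R = 1 - H/H_max."""
    return 1.0 - relative_entropy(probs, base)

# Equally likely symbols: relative entropy 1, redundancy 0.
print(relative_entropy([0.25] * 4), redundancy([0.25] * 4))
# A skewed source: redundancy grows as the distribution departs from uniform.
print(relative_entropy([0.7, 0.1, 0.1, 0.1]), redundancy([0.7, 0.1, 0.1, 0.1]))
```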

V. RATE OF INFORMATION
The average information conveyed per unit time: the product of the symbol rate and the entropy of the source:

R = rH

where:
r = symbol rate, in symbols per second
H = entropy, in bits per symbol
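The rate computation is a one-line extension of the entropy (a sketch with illustrative numbers: 2000 symbols/sec from a 4-symbol alphabet, probabilities assumed equal):

```python
import math

def entropy(probs, base=2):
    """Average information per symbol, in bits/symbol for base 2."""
    return sum(p * math.log(1.0 / p, base) for p in probs if p > 0)

def information_rate(symbol_rate, probs):
    """R = r * H, in bits per second."""
    return symbol_rate * entropy(probs)

# 4 equally likely symbols: H = log2(4) = 2 bits/symbol,
# so at 2000 symbols/sec the rate is 4000 bits/sec.
print(information_rate(2000, [0.25] * 4))
```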

EXAMPLE 1
A telephone touch-tone keypad has the digits 0 to 9, plus the * and # keys. Assume the probability of sending * and # is and the probability of sending 0 to 9 is each. If the keys are pressed at a rate of 2 keys/sec, compute the entropy and data rate for this source.

EXAMPLE 2
Determine the following:
a) Entropy
b) Relative Entropy
c) Rate of Information

EXAMPLE 3
Determine the ideal number of bits that should be allocated to each of the following characters with the probabilities given.

EXAMPLE 4
Consider a transmitter that is to transmit only the first 6 characters of the alphabet. Each will be expressed as a digital signal. By convention, each letter would be allocated 3 bits. Find the entropy to determine the most economical way of transmitting the data.

EXAMPLE 5
Suppose our fast-food restaurant serves an average of eight customers per minute. What is the information rate of the food orders?

EXAMPLE 6
Suppose that in the fast-food restaurant mentioned previously each customer orders either one hamburger or one fish sandwich. What is the average information content in a customer’s order?

VI. OTHER PARAMETERS
1. Code Word Length: n_i = log_b (1 / P_i)
2. Average Code Word Length: N = Σ P_i n_i
3. Coding Efficiency: η = H / N
4. Coding Redundancy: 1 - η
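These coding parameters can be sketched in Python. The example assumes a fixed-length 5-bit code for 26 equally likely letters, as in the alphabet example that follows:

```python
import math

def entropy(probs, base=2):
    """Average information per symbol."""
    return sum(p * math.log(1.0 / p, base) for p in probs if p > 0)

def coding_efficiency(probs, code_lengths, base=2):
    """Efficiency = H / N, where N = sum of P_i * n_i is the average code word length."""
    avg_len = sum(p * n for p, n in zip(probs, code_lengths))
    return entropy(probs, base) / avg_len

# 26 equally likely letters need ceil(log2(26)) = 5 bits per fixed-length code word,
# while H = log2(26) is about 4.70 bits, so the efficiency is roughly 0.94.
probs = [1 / 26] * 26
eff = coding_efficiency(probs, [5] * 26)
print(eff)
print(1 - eff)  # coding redundancy
```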

EXAMPLE
Calculate the coding efficiency in representing the 26 letters of the alphabet using a binary and a decimal system.

SEATWORK
1. Calculate H(x) for a discrete memoryless channel having six symbols with probabilities:
P(A) = 1/2, P(B) = 1/4, P(C) = 1/8, P(D) = P(E) = 1/20, P(F) = 1/40
Find the amount of information contained in the messages BAD, BED, BEEF, CAB, FACE, BABAE, ABACADA, BEAD, FADE.
2. Determine the entropy for the word “YABBBADDDABBBADDDOOOOO”.

3. Suppose a source emits r = 2000 symbols/sec selected from an alphabet of size M = 4 with the symbol probabilities x_i listed below. Find the information rate.

ASSIGNMENT
Answer Problems 9.1 and 9.7(a-b) in Communication Systems: Analysis and Design by Harold P. E. Stern & Samy A. Mahmoud.