CSC 350 - Dr. Gary Locklair Exam #4 …

update date on slides 5, 6, 7

III. Information Theory
Computers are informational tools.
Information (“I”) comes from Intelligence.

A. Shannon (Info) Theory
Claude Shannon (1916–2001) – American computer scientist of the 20th century.

Two Statements:
“It is Thursday, December 8, 2005, 1:15pm, S118B, CUW …”
“Plasma beings from the planet Threa have infiltrated CUW. One of them has taken the shape of Dr. Ferry and assumed his role as president.”

Two Statements:
One is dull: “It is Thursday, December 8, 2005, 1:15pm, S118B, CUW …”
The other is bizarre: “Plasma beings from the planet Threa have infiltrated CUW. One of them has taken the shape of Dr. Ferry and assumed his role as president.”

Two Statements:
One doesn’t convey any (Shannon) information: “It is Thursday, December 8, 2005, 1:15pm, S118B, CUW …”
Why? Because its probability is high: it is indeed Thursday, etc.

Two Statements:
The other conveys lots of (Shannon) information: “Plasma beings from the planet Threa have infiltrated CUW. One of them has taken the shape of Dr. Ferry and assumed his role as president.”
Why? Because its probability is low.

A. Shannon (Info) Theory deals only with syntax; it does not deal with semantics.

A. Shannon (Info) Theory
As one example, the truth or falsity of the statement isn’t considered. For the moment, we won’t care if Plasma beings have really taken over Dr. Ferry or not! :-)

Dr. A. E. Wilder-Smith called Information the “surprise effect”.

Shannon (Info) Theory attempts to:
1. Quantify, or measure, I
2. Set theoretical limits (define what’s possible) for conveying (transmitting) I

Shannon was concerned with transmitting I over phone lines. He wanted to reliably convey I from source to destination. In other words, we don’t want the I corrupted during transmission.

What would corrupt the I? [student responses here :-]
How might you deal with a noisy channel? [student responses here :-]
Notice that all require more effort.

Shannon asked:
1. Is it possible to detect and correct a corrupted message (within what limits)?
2. How can a garbled message be recovered?

Deal with noise by:
1. Recognizing the problem
2. Compensating for it to begin with

Situation: I must be conveyed from A → B over a noisy channel.
1. Reliability – the message at B should be identical to the message at A

Situation: I must be conveyed from A → B over a noisy channel.
2. Maximize rate – effective transmission time
{These are the real-world tradeoffs}

We don’t want to give up channel capacity … e.g., we don’t want to repeat a message 5 times to ensure it arrives, since no new I is conveyed during transmissions 2–5. A concrete sketch of this brute-force approach follows.
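To make the tradeoff concrete, here is a minimal sketch (the helper names, seed, and 10% error rate are mine, not the lecture’s) of the brute-force fix the slide warns about: a repetition code decoded by majority vote. It buys reliability, but two-thirds of the channel capacity is spent on repeats.

```python
# A 3x repetition code over a bit-flip channel, decoded by majority vote.
import random

def encode(bits):
    """Repeat each bit three times: 1 0 -> 1 1 1 0 0 0."""
    return [b for b in bits for _ in range(3)]

def channel(bits, p_flip=0.1, rng=random.Random(42)):
    """Flip each transmitted bit independently with probability p_flip."""
    return [b ^ (rng.random() < p_flip) for b in bits]

def decode(bits):
    """Majority vote over each block of three received bits."""
    return [int(sum(bits[i:i + 3]) >= 2) for i in range(0, len(bits), 3)]

message = [1, 0, 1, 1, 0, 0, 1, 0]
received = channel(encode(message))
# Usually True: majority voting corrects isolated flips. It fails only
# when 2+ flips land in the same block -- reliability, at 1/3 the rate.
print(decode(received) == message)
```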

Shannon (Info) Theory shows that we don’t have to give up rate to gain reliability. However, there is a price: delay – the time to recover or decode the message increases due to (perhaps) longer messages.

Actually two subjects:
Info Theory – what is possible
Coding Theory – how to do it

Theory (fun!)
Mutual Information – the I provided about event X by the occurrence of event Y:
I(X;Y) = LOG [ P(X|Y) / P(X) ]
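A quick numeric check of this formula; the probability values below are made up for illustration, not from the lecture:

```python
from math import log2

p_x = 0.5          # P(X): prior probability of event X (assumed value)
p_x_given_y = 0.9  # P(X|Y): probability of X once Y is seen (assumed value)

# I(X;Y) = log2( P(X|Y) / P(X) )
i_xy = log2(p_x_given_y / p_x)
print(f"I(X;Y) = {i_xy:.3f} bits")  # ~0.848 bits: Y tells us a lot about X
```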

Theory (fun!)
Self Info – I(X) = LOG [ 1 / P(X) ]
Entropy – “average” I: H(X) = Σ P(X) * LOG [ 1 / P(X) ]

Remember:
1. Don’t consider if the message is T or F (yet)
2. Shannon (Info) Theory depends upon the probability of the message

Example
Self Info – I(X) = LOG [ 1 / P(X) ]

If the message is certain, Info should be …?
I(X) = LOG [ 1 / P(X) ] … P(X) = 1 (100%)
We usually use LOG 2, and the unit of I is the bit.
LOG 2 (1) = 0 … makes sense; there is no “info” in a certain message. The sketch below checks this.
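A minimal sketch of self-information in code (the function name is my own), confirming that a certain message carries zero bits:

```python
from math import log2

def self_info(p):
    """I(X) = log2(1 / P(X)), measured in bits."""
    return log2(1 / p)

print(self_info(1.0))    # 0.0 -- a certain message carries no information
print(self_info(0.5))    # 1.0 bit -- like a fair coin flip
print(self_info(0.125))  # 3.0 bits -- rarer messages carry more information
```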

Practical Example
Using Shannon (Info) Theory along with Coding Theory to see how to efficiently (and reliably) transmit I …

What if we know some messages are more likely than others?
Ex: a weather forecaster with 4 possible forecasts:

Ex: a weather forecaster with 4 possible forecasts:
Cloudy = 50% (½)
Sunny = 25% (¼)
Rainy = 12.5% (⅛)
Tornado = 12.5% (⅛)

Normally, for 4 different messages we’d need at least … how many bits?
2 bits (encode the 4 possibilities as 00, 01, 10, 11)

Shannon (Info) Theory says the “average info” (entropy) of this “alphabet” (set of messages) is:
H(X) = Σ P(X) * LOG [ 1 / P(X) ]
Sum up (probability * self-I):

CSC Dr. Gary Locklair (½ * LOG 2 1/(½)) + (¼ * LOG 2 1/(¼)) + ( 1 / 8 * LOG 2 1/( 1 / 8 )) + ( 1 / 8 * LOG 2 1/( 1 / 8 )) = 1/2+ 2/4 + 3/8+ 3/8 = 1¾

What? How can we transmit in less than 2 bits? Shannon says only 1¾ bits on average! One message is much more likely than the others; therefore, encode it in a shorter bit string.

Huffman encoding – variable-length codes; Coding Theory (a prefix code):
0 = cloudy
10 = sunny
110 = rainy
111 = tornado
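A minimal sketch (not the course’s implementation) showing that this prefix code averages the promised 1¾ bits per forecast and decodes unambiguously:

```python
code = {"cloudy": "0", "sunny": "10", "rainy": "110", "tornado": "111"}
probs = {"cloudy": 0.5, "sunny": 0.25, "rainy": 0.125, "tornado": 0.125}

# Expected bits per forecast: sum of P(m) * len(codeword for m)
avg_len = sum(probs[m] * len(code[m]) for m in code)
print(avg_len)  # 1.75 -- matches the entropy computed above

table = {v: k for k, v in code.items()}  # codeword -> forecast

def decode(bits):
    """Scan left to right; no codeword is a prefix of another, so the
    first buffer that matches a codeword is a complete message."""
    out, buf = [], ""
    for b in bits:
        buf += b
        if buf in table:
            out.append(table[buf])
            buf = ""
    return out

print(decode("010110111"))  # ['cloudy', 'sunny', 'rainy', 'tornado']
```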

Zeb/Zeke Joke
Zeb and Zeke were sadly returning from an expensive fishing trip which only produced one fish.

Zeb/Zeke Joke
“The way I figure it,” said Zeke, “that lousy fish cost us $400!”
“Wow,” replied Zeb, “it’s a good thing we didn’t catch more!”

B. Information Theory
Gary Locklair – American computer scientist of the 21st century. (Who?)
Information only comes from Intelligence.

Dr. A. E. Wilder-Smith – British scientist of the 20th century
Life is matter + teleonomy (Information Content)
The ultimate source of teleonomy is an omnipotent God.

Dr. Werner Gitt – German computer scientist of the 21st century
“Laws of Information”
Information consists of syntax and semantics.

Dr. Werner Gitt: “According to Shannon’s Theory any random sequence of symbols is regarded as information, without regard to its origin, nor whether it is meaningful or not.”

Gitt’s Levels of Information
1 - Statistics (transmitted signal)
2 - Syntax (coding method)
3 - Semantics (meaning)
4 - Pragmatics (action)
5 - Apobetics (purpose)

Information Theory is more than just statistics (Shannon).
There must be an associated meaning for information to be present.
Example: a computer program …

In Shannon (Info) Theory the truth or falsity of the statement isn’t considered!
Although the “Plasma being” message has high Shannon (Info) Content, it’s a bloomin’ lie! :-)

In Shannon (Info) Theory the truth or falsity of the statement isn’t considered!
Although the “Today is …” message has low Shannon (Info) Content, it’s the truth! :-)

Information Theory implies a sender and a receiver.
Sender: I have a purpose in mind that will require some action, so I will communicate my idea using a particular code and then transmit it.

Information Theory implies a sender and a receiver.
Receiver: I received a signal; now I must decode it. I can then understand the idea and implement some action to achieve the desired result.

This is Information:
“For God so loved the world that he gave his one and only Son, that whoever believes in him shall not perish but have eternal life.”

From Computer Scientist Don Knuth