
This file contains figures from the book: Information Theory: A Tutorial Introduction by Dr James V Stone, 2015, Sebtel Press. Copyright JV Stone. These figures are released for use under the Creative Commons License specified on the next slide. A copy of this file can be obtained from

Chapter 1

[Figure: a binary tree. Repeated left/right choices (at forks labelled A, B, C and D) lead to one of eight leaves, whose binary labels 000 to 111 correspond to the numbers 0 to 7.]

[Figure: the guessing game as a decision tree. Three yes/no questions (Q1, Q2, Q3) identify one of eight objects: Fish, Bird, Dog, Cat, Car, Van, Truck, Bus. Writing no = 0 and yes = 1 turns each path into a 3-bit codeword, so the eight codewords 000 to 111 correspond to the numbers 0 to 7 (e.g. the answers 0 1 1 identify Cat).]
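A minimal sketch of the arithmetic behind these trees (the pairing of codewords with objects is an assumption for illustration): with eight equally likely objects, log2(8) = 3 yes/no questions suffice, and each object gets a 3-bit codeword.

```python
from math import log2

# Eight equally likely objects, as in the decision-tree figure.
objects = ["Fish", "Bird", "Dog", "Cat", "Car", "Van", "Truck", "Bus"]

# One bit per yes/no question: object i gets the 3-bit codeword for i.
codewords = {obj: format(i, "03b") for i, obj in enumerate(objects)}

print(log2(len(objects)))   # 3.0 questions (bits) needed
print(codewords["Cat"])     # '011': the answers to Q1, Q2, Q3
```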

Chapter 2

[Figure: a random variable X is a mapping from experimental outcomes to numbers. Experiment: a coin flip. If the outcome is a head, X(head) = 1; if the outcome is a tail, X(tail) = 0.]
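As a minimal sketch of this mapping, a random variable is just a function from outcomes to numbers:

```python
import random

def X(outcome: str) -> int:
    """The random variable X: a mapping from outcomes to numbers."""
    return 1 if outcome == "head" else 0

outcome = random.choice(["head", "tail"])  # the experiment: one coin flip
print(outcome, "->", X(outcome))           # X(head) = 1, X(tail) = 0
```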

[Figure: a communication system. Data s is encoded as the channel input x = g(s); the channel adds noise η; the decoder recovers an estimate of the data from the output y.]
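A toy numerical version of this diagram, as a sketch only (the encoder gain and Gaussian noise are assumptions, not the book's example):

```python
import random

def g(s: float) -> float:
    """Encoder: map the data s to a channel input x (a simple gain here)."""
    return 2.0 * s

def channel(x: float, sigma: float = 0.1) -> float:
    """Noisy channel: output y = x + eta, with Gaussian noise eta."""
    return x + random.gauss(0.0, sigma)

def decode(y: float) -> float:
    """Decoder: invert the encoder to estimate s from the output y."""
    return y / 2.0

s = 0.7              # data
y = channel(g(s))    # input x = g(s); the channel adds noise
print(s, decode(y))  # the estimate is close to s when the noise is small
```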

Chapter 3

[Figure: a communication system with channel input X: data is encoded, sent through a noisy channel, and decoded from the output.]

[Table: symbol probabilities. A: 0.87, B: 0.04, C: 0.04, D: 0.03, E: 0.02.]
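A quick check of how little entropy this skewed distribution has (a sketch; the truncated value for E is assumed to be 0.02 so that the probabilities sum to one): five equally likely symbols would need log2(5) ≈ 2.32 bits, but here the entropy is far lower.

```python
from math import log2

p = {"A": 0.87, "B": 0.04, "C": 0.04, "D": 0.03, "E": 0.02}

# Shannon entropy H = -sum p(x) log2 p(x), in bits per symbol.
H = -sum(pi * log2(pi) for pi in p.values())
print(f"H = {H:.2f} bits")         # about 0.81 bits per symbol
print(f"log2(5) = {log2(5):.2f}")  # 2.32 bits if all five were equally likely
```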

[Figure: cartoon from xkcd.com.]

[Figure: cartoon from xkcd.com.]

Chapter 4

[Figure, repeated from Chapter 2: a communication system. Data s is encoded as the channel input x = g(s); the channel adds noise η; the decoder recovers an estimate of the data from the output y.]


[Figure: Venn diagram of the joint entropy H(X,Y), showing the marginal entropies H(X) and H(Y), the conditional entropies H(X|Y) and H(Y|X), and their overlap, the mutual information I(X,Y).]
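The relationships in this diagram can be verified numerically. A minimal sketch, using an invented 2 x 2 joint distribution:

```python
from math import log2

# An invented 2x2 joint distribution p(x, y), for illustration only.
pxy = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.1, (1, 1): 0.4}
px = {x: sum(p for (xi, _), p in pxy.items() if xi == x) for x in (0, 1)}
py = {y: sum(p for (_, yi), p in pxy.items() if yi == y) for y in (0, 1)}

def H(dist):
    """Shannon entropy of a distribution, in bits."""
    return -sum(p * log2(p) for p in dist.values() if p > 0)

Hx, Hy, Hxy = H(px), H(py), H(pxy)
print(f"H(X|Y) = {Hxy - Hy:.3f}")       # conditional entropy, 0.722
print(f"H(Y|X) = {Hxy - Hx:.3f}")       # conditional entropy, 0.722
print(f"I(X,Y) = {Hx + Hy - Hxy:.3f}")  # the overlap in the diagram, 0.278
```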

[Figure: fan diagram of inputs X and outputs Y. Number of inputs = 2^H(X); number of outputs = 2^H(Y); number of possible outputs for one input = 2^H(Y|X); number of possible inputs for one output = 2^H(X|Y).]
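Exponentiating the entropies from the previous sketch (values hard-coded below) turns them into counts, and the ratio of total outputs to outputs-per-input gives the number of distinguishable signals, 2^I(X,Y):

```python
# Entropies from the previous sketch, hard-coded (bits):
Hx, Hy, Hxy = 1.0, 1.0, 1.722
Hy_given_x = Hxy - Hx                # H(Y|X) = 0.722

outputs_total = 2 ** Hy              # 2^H(Y) = 2.0 outputs
outputs_per_input = 2 ** Hy_given_x  # 2^H(Y|X), about 1.65
print(outputs_total / outputs_per_input)  # 2^I(X,Y), about 1.21
```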

[Figure: (a) the output entropy H(Y) is the mutual information I(X,Y) plus the output noise entropy H(Y|X); (b) the input entropy H(X) is the mutual information I(X,Y) plus the input noise entropy H(X|Y).]

[Figure: a noisy channel with inputs A to I and outputs A to I; the inputs B, E and H are highlighted.]

Chapter 5

[Figure: residual uncertainty U after partial information: (a) after receiving 2 bits, U = 25%; (b) after receiving 1 bit, U = 50%; (c) after receiving half a bit, U = 71%.]
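These percentages follow from U = 2^(-b) for b received bits, which a one-liner confirms:

```python
for b in (2, 1, 0.5):
    U = 2 ** -b  # residual uncertainty after receiving b bits
    print(f"{b} bit(s): U = {U:.0%}")  # 25%, 50%, 71%
```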

[Figure: a communication system: data, encoder, channel input, noise, channel output, decoder.]

Chapter 6

[Figure, repeated from Chapter 2: a communication system. Data s is encoded as the channel input x = g(s); the channel adds noise η; the decoder recovers an estimate of the data from the output y.]

[Figure, repeated from Chapter 4: Venn diagram of H(X,Y), H(X), H(Y), H(X|Y), H(Y|X) and I(X,Y).]

Chapter 7

Chapter 8

[Figure: two containers, A and B, both at the same warm temperature.]

[Figure: two containers, A cold and B hot.]

Chapter 9



[Figure: efficient coding in human vision. The outputs of the S-, M- and L-cones are recoded into a luminance channel, a red-green channel and a blue-yellow channel.]
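A minimal sketch of this recoding (the channel weights below are illustrative assumptions, not the book's model): the opponent channels are linear combinations of the cone outputs.

```python
def opponent_channels(L: float, M: float, S: float):
    """Recode cone responses into opponent channels (illustrative weights)."""
    luminance = L + M              # overall brightness
    red_green = L - M              # red vs green opponency
    blue_yellow = S - (L + M) / 2  # blue vs yellow opponency
    return luminance, red_green, blue_yellow

# Cone responses to a reddish stimulus (invented values).
print(opponent_channels(L=0.9, M=0.4, S=0.2))
```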

THE END.