Claude Shannon’s Contributions to Computer Science, Telecommunications, and Information Theory
Dr. Charles C. Tappert, Professor of Computer Science, Seidenberg School of CSIS
2016 Research Day
Life Overview
1916 (April 30) – born in Petoskey, Michigan
1936 – BS EE and BS Math, University of Michigan
1937 – MS EE, MIT; thesis applied Boole’s logic to circuit design, considered “the best MS thesis of the century”
1940 – PhD, MIT, Department of Mathematics
1940 – National Research Fellow, Institute for Advanced Study, Princeton, NJ (Hermann Weyl, von Neumann, Einstein, Gödel)
1941 – Bell Labs, Murray Hill, New Jersey
During World War II – worked on fire-control systems and cryptography; proved the cryptographic one-time pad unbreakable
1943 – met and shared ideas with Alan Turing at Bell Labs
1948 – invented information theory with his key paper “A Mathematical Theory of Communication”
1956 – joined the MIT Research Laboratory of Electronics
2001 (February 24) – died at age 84 (almost 85)
Shannon’s Source Coding Theorem: The Communication Channel
The source generates a message s, a sequence of symbols. The message s is encoded into channel input codewords x = g(s). The channel carries input x to output y, which is decoded back into an estimate of s.
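To make the pipeline concrete, here is a minimal sketch (my illustration, not from the slides) assuming a noiseless channel and a fixed-length binary code playing the role of g:

```python
import math

def make_code(symbols):
    """Assign each symbol a fixed-length binary codeword, playing the role of g."""
    width = math.ceil(math.log2(len(symbols)))
    return {s: format(i, f"0{width}b") for i, s in enumerate(symbols)}

code = make_code(["a", "b", "c", "d"])
decode = {codeword: s for s, codeword in code.items()}

message = ["a", "c", "b", "d"]           # source message s
x = [code[s] for s in message]           # channel input x = g(s)
y = x                                    # noiseless channel: output y equals input x
recovered = [decode[w] for w in y]       # receiver decodes y back to s
assert recovered == message
```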
Shannon’s Source Coding Theorem: Channel Capacity
Channel capacity is the maximum information that can be communicated from channel input to output.
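In standard notation (my addition; the slide states only the phrase above), capacity is the mutual information between input and output, maximized over input distributions; for the noiseless discrete channel treated here it reduces to the maximum entropy of the input:

```latex
C = \max_{p(x)} I(X;Y)
\qquad\text{noiseless channel: } I(X;Y) = H(X) \;\Rightarrow\; C = \max_{p(x)} H(X)
```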
Shannon’s Source Coding Theorem on Channel Capacity
Shannon’s 1948 paper “A Mathematical Theory of Communication” studied the theoretical limits on transmitting information over a channel from source to receiver, where the source is encoded before transmission. He considered two types of channels, continuous and discrete, and he dealt with added noise. Only the discrete channel without noise is described in this brief presentation.
Preliminaries to Shannon’s Theorem Example: Fair and Biased Coin
[Figure: a fair coin with 50%/50% outcomes and a biased coin with 90%/10% outcomes]
Shannon Information = log2(1/p(x)): a Measure of the Surprise of One Outcome
90-10 coin, p(tail) = 0.1 (surprise!): Shannon information = 3.32 bits
50-50 coin, p(head) = p(tail) = 0.5: Shannon information = 1.00 bit
90-10 coin, p(head) = 0.9: Shannon information = 0.15 bits
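These three values can be checked directly (a minimal sketch; the function name is my own):

```python
import math

def surprise_bits(p):
    """Shannon information log2(1/p) of one outcome with probability p, in bits."""
    return math.log2(1 / p)

print(surprise_bits(0.1))   # 90-10 coin, tail: 3.32 bits
print(surprise_bits(0.5))   # 50-50 coin: 1.00 bit
print(surprise_bits(0.9))   # 90-10 coin, head: 0.15 bits
```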
Shannon Entropy H(X) is the Average Shannon Information
For a probability distribution p(x): H(X) = Σ p(x) log2(1/p(x)), summed over the outcomes x
Examples:
50-50 coin: H(X) = 0.5·log2(1/0.5) + 0.5·log2(1/0.5) = 1.00 bit
90-10 coin: H(X) = 0.9·log2(1/0.9) + 0.1·log2(1/0.1) = 0.47 bits
H(X) obeys the properties a measure of information should possess: continuity, symmetry, maximum value, and additivity
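The two coin examples in code (a small sketch; the function name is mine):

```python
import math

def entropy_bits(probs):
    """Shannon entropy H(X) = sum over x of p(x) * log2(1/p(x)), in bits."""
    return sum(p * math.log2(1 / p) for p in probs if p > 0)

print(entropy_bits([0.5, 0.5]))   # 50-50 coin: 1.00 bit
print(entropy_bits([0.9, 0.1]))   # 90-10 coin: ~0.47 bits
```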
Graph of Shannon Entropy H(X) versus coin bias (probability of a head)
[Figure: H(X) peaks at 1.00 bit for a fair coin (bias 0.5) and falls to 0.47 bits for the 90-10 coin]
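The curve is easy to reproduce (a sketch assuming matplotlib is available; not part of the original slides):

```python
import math
import matplotlib.pyplot as plt

def binary_entropy(p):
    """Entropy in bits of a coin whose probability of a head is p."""
    if p in (0.0, 1.0):
        return 0.0   # a certain outcome carries no information
    return p * math.log2(1 / p) + (1 - p) * math.log2(1 / (1 - p))

bias = [i / 100 for i in range(101)]
plt.plot(bias, [binary_entropy(p) for p in bias])
plt.xlabel("coin bias (probability of a head)")
plt.ylabel("H(X) in bits")   # peaks at 1.00 bit when the coin is fair
plt.show()
```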
Shannon’s Source Coding Theorem: The Communication Channel (recap)
The source generates a message s, a sequence of symbols. The message s is encoded into channel input codewords x = g(s). The channel carries input x to output y, which is decoded back into an estimate of s.
Shannon’s Source Coding Theorem
Shannon’s optimal channel capacity equals the maximum entropy over the possible probability distributions. To guarantee that each binary digit carries maximum information, the distribution of code values should correspond to the independently and identically distributed case. But natural signals convey information in dilute form: pixels close to each other in images tend to have similar values, and successive characters in English text are related to each other. Thus the theorem is really about the coding of messages before transmission through the channel, but it does not state how such recoding can be achieved. It inspired the creation of coding methods such as Huffman coding, sketched below.
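Huffman coding, mentioned above, can be sketched briefly. The following is my minimal illustration (not Shannon’s or the slides’ code) of building a prefix code whose average length approaches the source entropy:

```python
import heapq

def huffman_code(probs):
    """Build a binary prefix code from {symbol: probability} via Huffman's algorithm."""
    # Heap entries: (probability, unique tie-breaker, {symbol: partial codeword}).
    heap = [(p, i, {s: ""}) for i, (s, p) in enumerate(probs.items())]
    heapq.heapify(heap)
    tie = len(heap)
    while len(heap) > 1:
        p1, _, lo = heapq.heappop(heap)   # two least probable subtrees
        p2, _, hi = heapq.heappop(heap)
        merged = {s: "0" + c for s, c in lo.items()}
        merged.update({s: "1" + c for s, c in hi.items()})
        heapq.heappush(heap, (p1 + p2, tie, merged))
        tie += 1
    return heap[0][2]

# Skewed source: frequent symbols get short codewords, rare ones long.
probs = {"e": 0.5, "t": 0.25, "a": 0.15, "q": 0.10}
code = huffman_code(probs)
avg = sum(p * len(code[s]) for s, p in probs.items())
print(code)                                  # e.g. {'e': '0', 't': '10', ...}
print(f"average length = {avg:.2f} bits")    # 1.75, close to the entropy ~1.74
```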
Shannon’s Source Coding Theorem
So far we have dealt mostly with variables having a uniform distribution, but the importance of the theorem becomes apparent only for non-uniform distributions. Consider the numeric outcome of throwing a pair of 6-sided dice.
Shannon’s Source Coding Theorem
The sum of two dice takes 11 values (2 through 12), so a simple fixed-length binary code needs 4 bits, i.e., 4.00 bits/symbol. The entropy of this non-uniform distribution is only about 3.27 bits/symbol, so a variable-length code can do better on average (see the sketch below).
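The 11-symbol distribution and its entropy, computed directly (a sketch; not from the slides):

```python
import math
from collections import Counter

# Probability distribution of the sum of two fair 6-sided dice.
counts = Counter(d1 + d2 for d1 in range(1, 7) for d2 in range(1, 7))
probs = [n / 36 for n in counts.values()]

entropy = sum(p * math.log2(1 / p) for p in probs)
print(f"{len(counts)} symbols, H(X) = {entropy:.2f} bits")   # 11 symbols, 3.27 bits
```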
Shannon’s Information Entropy vs the Boltzmann-Gibbs Thermodynamic Entropy
Shannon’s information entropy measures information. The Boltzmann-Gibbs thermodynamic entropy measures the number of possible states of a physical system (e.g., a jar of gas molecules). Note the syntactic similarity of the two equations.
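The two equations, reconstructed in their standard forms since the slide’s formula images did not survive the transcript:

```latex
H(X) = -\sum_x p(x)\,\log_2 p(x)
\qquad\qquad
S = -k_B \sum_i p_i \ln p_i
```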
Shannon’s Major Contributions
Father of the electronic communications age
Applied Boolean algebra to electrical systems
Inventor of information theory
Foundation of the digital revolution
Shannon’s Hobbies and Inventions
Hobbies: juggling, unicycling, and chess; card counting at blackjack in Las Vegas; portfolio investment management, which made him wealthy
Inventions:
Signal flow graphs
Roman numeral computer called THROBAC
Juggling machines and a flame-throwing trumpet
First wearable computer, built to calculate roulette odds
Ultimate “useless” machine
Device to solve the Rubik’s Cube puzzle
Magnetic maze-solving mouse called Theseus
Shannon’s maxim, a reformulation of Kerckhoffs’s principle
Computer chess program using the minimax procedure
Minivac 601 Digital Computer Kit
Shannon’s Maze-Solving Mouse: an Early Example of Machine Learning
Shannon’s Minivac 601 Digital Computer Kit
Shannon’s Ultimate “Useless” Machine
Shannon Quotes
“We know the past but cannot control it. We control the future but cannot know it.”
“I am very seldom interested in applications. I am more interested in the elegance of a problem. Is it a good problem, an interesting problem?”
“I visualize a time when we will be to robots what dogs are to humans. And I am rooting for the machines.”
On naming entropy: “My greatest concern was what to call it. I thought of calling it ‘information’, but the word was overly used, so I decided to call it ‘uncertainty’. When I discussed it with John von Neumann, he had a better idea. Von Neumann told me, ‘You should call it entropy, for two reasons. In the first place your uncertainty function has been used in statistical mechanics under that name, so it already has a name. In the second place, and more important, nobody knows what entropy really is, so in a debate you will always have the advantage.’”
Shannon Videos
Information Theory Society: Father of the information age
Tech icons: Claude Shannon
The man who turned paper into pixels
Shannon demonstrates machine learning
Juggling machines
Many Shannon Centenary Celebrations
IEEE Information Theory Society
Bell Labs
U.S. Postal Stamp Proposal
Primary References
James V. Stone, Information Theory: A Tutorial Introduction, Sebtel Press, 2015
Paul J. Nahin, The Logician and the Engineer, Princeton University Press, 2013
Wikipedia, “Claude Shannon”