Information Theory in the Modern Information Society
A.J. Han Vinck, University of Duisburg-Essen, January 2003
Content: What is information theory? Why is it important? Where do we find it?
Claude Elwood Shannon: April 30, 1916 – February 24, 2001. Shannon founded information theory with his 1948 paper "A Mathematical Theory of Communication".
What is information theory about? Information: knowledge that can be used. Communication: the exchange of information. Our goals: efficient, reliable, and secure communication.
Express everything in 0 and 1. Discrete ensemble: a, b, c, d → 00, 01, 10, 11; in general, k binary digits specify 2^k messages. Analogue signal: 1) sample it, then 2) represent each sample value in binary. Example output: 00, 10, 01, 01, 11.
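The two steps above, fixed-length labeling of a discrete ensemble and binary quantization of an analogue sample, can be sketched in a few lines of Python. The function names and the value range are illustrative assumptions, not from the slides:

```python
import math

# k binary digits can label 2**k distinct messages, so an ensemble
# of M symbols needs ceil(log2(M)) bits per symbol.
def fixed_length_code(symbols):
    k = math.ceil(math.log2(len(symbols)))
    return {s: format(i, f"0{k}b") for i, s in enumerate(symbols)}

code = fixed_length_code(["a", "b", "c", "d"])
# → {'a': '00', 'b': '01', 'c': '10', 'd': '11'}

# An analogue sample is digitized in two steps:
# 1) sample the signal, 2) quantize the value to one of 2**k levels.
def quantize(v, v_min, v_max, k):
    levels = 2 ** k
    step = (v_max - v_min) / levels
    index = min(int((v - v_min) / step), levels - 1)
    return format(index, f"0{k}b")

quantize(0.3, 0.0, 1.0, 2)  # value 0.3 falls in level 1 of 4 → '01'
```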
Shannon's contributions. Modeling: how to go from analogue to digital; fundamental communication models. Bounds: how far can we go? achievability and impossibility. Constructions: constructive communication methods with optimum performance. And many more!
Efficient: general problem statement. Remove redundancy: exact, no errors! Remove irrelevance: distortion allowed! Topics: how? how good? how fast? how complex?
Efficient: text. Represent every symbol with 8 bits. 1 book: 8 * 500 pages * 1000 symbols = 4 Mbit. Compression possible to 1 Mbit (1:4).
Efficient: speech. Sampling rate 8000 samples/s, accuracy 8 bits/sample → 64 kbit/s. A 45-minute lecture = 45 * 60 * 64 kbit ≈ 173 Mbit ≈ 45 books. Compression possible to 4.8 kbit/s (roughly 1:13).
Efficient: CD music. Sampling rate 44.1 k samples/s, accuracy 16 bits/sample. Storage capacity for one hour of stereo: about 5 Gbit ≈ 1250 books. Compression possible to 4 bits/sample (1:4).
Efficient: digital pictures. 300 x 400 pixels x 3 colors x 8 bits/sample ≈ 2.9 Mbit/picture; at 25 images/second we need about 75 Mbit/s. 2 hours of pictures need about 540 Gbit ≈ 135 000 books. Compression needed (1:100).
Efficient: summary.
Text: 1 book = 4 Mbit (1 book)
Speech: 45-minute lecture ≈ 173 Mbit (≈ 45 books)
CD music: one hour of stereo ≈ 5 Gbit (≈ 1250 books)
Digital pictures: 2 hours ≈ 540 Gbit (≈ 135 000 books)
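The bit budgets above can be rechecked with a few lines of Python. These are the exact products; the slides round some figures (172.8 Mbit appears as 180 Mbit, and 518.4 Gbit becomes 540 Gbit once 72 Mbit/s is rounded up to 75 Mbit/s):

```python
# Recompute the slide's storage figures (1 book = 4 Mbit).
BOOK = 8 * 500 * 1000                    # 8 bit * 500 pages * 1000 symbols

speech = 8000 * 8 * 45 * 60              # 64 kbit/s for 45 min = 172.8 Mbit
cd     = 44_100 * 16 * 2 * 3600          # stereo, one hour ≈ 5.08 Gbit
video  = 300 * 400 * 3 * 8 * 25 * 7200   # 25 frames/s for 2 h = 518.4 Gbit

print(speech / BOOK)   # ≈ 43 books
print(cd / BOOK)       # ≈ 1270 books
print(video / BOOK)    # ≈ 130 000 books (135 000 with the rounded 540 Gbit)
```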
Efficient: general idea. Represent likely symbols with short binary words, where "likely" is derived from: prediction of the next symbol in the source output; context between the source symbols (words, sounds, context in pictures). Example: in English, "q" is almost always followed by "u" (q-ue, q-ua, q-ui, q-uo).
Morse code: an early example of efficient coding; frequent letters get short codewords (E is a single dot, T a single dash).
Efficient: applications. Text: ZIP, GIF, etc. Music: MP3. Pictures: JPEG, MPEG. Contributors in data reduction/compression include the information theorists Abraham Lempel, Jacob Ziv, and David Huffman, among many more.
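The "likely symbols get short words" idea is exactly what Huffman's algorithm does, and it sits underneath several of the applications listed. A minimal sketch, assuming a simple frequency table (the function name and example string are illustrative):

```python
import heapq
from collections import Counter

# A minimal Huffman coder: repeatedly merge the two least likely
# subtrees; frequent symbols end up near the root with short codewords.
def huffman_code(freqs):
    # heap entries: (weight, tiebreak, {symbol: codeword-so-far})
    heap = [(w, i, {s: ""}) for i, (s, w) in enumerate(freqs.items())]
    heapq.heapify(heap)
    count = len(heap)
    while len(heap) > 1:
        w1, _, c1 = heapq.heappop(heap)
        w2, _, c2 = heapq.heappop(heap)
        merged = {s: "0" + c for s, c in c1.items()}
        merged.update({s: "1" + c for s, c in c2.items()})
        heapq.heappush(heap, (w1 + w2, count, merged))
        count += 1
    return heap[0][2]

freqs = Counter("abracadabra")   # a:5, b:2, r:2, c:1, d:1
code = huffman_code(freqs)
# 'a', the most frequent symbol, gets a 1-bit codeword; the 11-symbol
# string costs 23 bits instead of 33 with a fixed 3-bit code.
```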
Efficient: example JPEG. The same image stored at decreasing file sizes: 4.566 MB, 3.267 MB, 2.351 MB.
Secure: example 1. Problem: is B the owner of the open lock?
Secure: classical. Problem: is the key present at B?
Secure: example 2.
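The classical setting above assumes A and B share a secret key. One textbook way to use such a key, not spelled out on the slides and shown here only as an illustrative sketch, is the one-time pad: XOR the message with a random key of the same length, which makes the ciphertext statistically independent of the message:

```python
import secrets

# One-time pad sketch: XOR with a uniformly random, single-use key.
def xor_bytes(data, key):
    return bytes(d ^ k for d, k in zip(data, key))

message = b"attack at dawn"
key = secrets.token_bytes(len(message))   # shared secret, used only once

ciphertext = xor_bytes(message, key)      # what an eavesdropper sees
recovered  = xor_bytes(ciphertext, key)   # only the key holder can undo it
assert recovered == message
```

Decryption is the same XOR operation, which is why possession of the key at B is the whole question.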
Reliable: transmit 0 or 1, receive 0 or 1. Sometimes the channel flips a bit: a transmitted 0 arrives as 1, or a 1 as 0. What can we do about it?
Reliable: 2 examples. Detection: transmit A = 00 or B = 11; receiving 00 or 11 is OK, receiving 01 or 10 is not OK: 1 error detected! Correction: A = 000 (decoded from 000, 001, 010, 100), B = 111 (decoded from 111, 110, 101, 011): 1 error corrected!
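The two repetition codes above can be sketched directly, assuming at most one bit error per codeword (the function names are illustrative):

```python
# Code {00, 11}: any single bit error lands outside the code,
# so it can be detected but not located.
def detect(word):
    return "OK" if word in ("00", "11") else "error detected"

# Code {000, 111}: a single bit error leaves the majority intact,
# so a majority vote recovers the transmitted bit.
def correct(word):
    return "0" if word.count("0") >= 2 else "1"

detect("01")    # → 'error detected'
correct("010")  # one flipped bit, majority vote restores → '0'
correct("110")  # → '1'
```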
Error sensitivity: illustration. The same picture transmitted at bit-error rates of 0.01% and 0.05%.
Optical storage: the DVD's seven-fold increase in data capacity over the CD was largely achieved by tightening the tolerances of the predecessor system. The data structure was also made more efficient by using a better error-correction code.
Errors in networking
no comment
A meshed structure: with 3 links down, connectivity is only partial. Fundamental problems to consider: fast re-routing of information; how to include redundancy? how much redundancy? cost versus reliability.
The success story of Qualcomm CDMA. Founding information theorists: Irwin Jacobs and Andrew Viterbi.
Narrow-band and broad-band noise. Solution: frequency and time division.
PPM code example: 6 code words with a guaranteed minimum distance between them.
Code Division Multiple Access (CDMA): 6 users transmit at the same time.
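How several users can transmit simultaneously and still be separated can be sketched with orthogonal spreading sequences. The slide's 6-user code is not reproduced here, so this is an illustrative assumption using 4 users with length-4 Walsh codes:

```python
# Synchronous CDMA sketch: each user spreads its bit with an
# orthogonal code; the receiver separates users by correlation.
WALSH4 = [
    [+1, +1, +1, +1],
    [+1, -1, +1, -1],
    [+1, +1, -1, -1],
    [+1, -1, -1, +1],
]

def spread(bit, code):            # bit in {0, 1} → chip sequence
    s = 1 if bit else -1
    return [s * c for c in code]

def despread(channel, code):      # correlate with one code, threshold
    corr = sum(x * c for x, c in zip(channel, code))
    return 1 if corr > 0 else 0

bits = [1, 0, 1, 1]               # one bit per user, sent simultaneously
chips = [spread(b, c) for b, c in zip(bits, WALSH4)]
channel = [sum(col) for col in zip(*chips)]   # all users add up on the air

[despread(channel, c) for c in WALSH4]        # → [1, 0, 1, 1]
```

Because the codes are mutually orthogonal, each correlation cancels the other users' contributions exactly.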
Why information theory at this university? It is fundamental and the theory is well established. Based on: discrete mathematics and algorithms; physics. Applications: communications and networking; computer science; multimedia, medical imaging, biology, languages; information retrieval and information control.
Other application: powerline communications
Information theory: in 1948, Bell Labs scientist Claude Shannon developed information theory, and the world of communications technology has never been the same.