Information theory
서 덕 영, MediaLab, Kyung Hee University (suh@khu.ac.kr)
Digital Multimedia
MediaLab, Kyung Hee University
Information theory
- Analog-to-digital conversion (ADC): "being digital", integrated service
- Coding (compression): *.zip, *.mp3, *.avi, ...
- Channel capacity: LTE ("Fast, fast, fast", as the Korean ad put it); error correction
Information theory: Entropy
- Amount of information = "degree of surprise"
- Entropy and average code length
- Information source and coding
- Memoryless source: no correlation, e.g. ... red blue yellow yellow red black red ...
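For instance, the entropy of this memoryless color source follows directly from the symbol probabilities; the numbers below are illustrative assumptions, since the slide only shows the sequence:

    import math

    # Hypothetical probabilities for the memoryless color source (assumed).
    probs = {"red": 0.5, "yellow": 0.25, "blue": 0.125, "black": 0.125}

    # Entropy H = sum_x p(x) * log2(1/p(x)): the average "degree of surprise".
    H = sum(p * math.log2(1.0 / p) for p in probs.values())
    print(f"H = {H:.2f} bits/symbol")   # H = 1.75 bits/symbol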
Analog-to-digital conversion
ADC: Quantization?
- Analog-to-digital quantization: needed to "cook" signals in binary computers (digital TV, digital communications, digital control, ...)
- Fine-to-coarse digital quantization: ADC maps infinitely many values to finitely many
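A minimal sketch of the fine-to-coarse step, mapping real-valued samples onto a finite grid (the step size is an arbitrary choice):

    import numpy as np

    def quantize(x, step):
        """Uniform quantizer: map (conceptually infinite) analog values
        onto a finite set of levels spaced `step` apart."""
        return step * np.round(x / step)

    samples = np.array([0.07, -0.33, 0.91])   # illustrative "analog" samples
    print(quantize(samples, step=0.25))       # [ 0.   -0.25  1.  ]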
ADC: Fine-to-coarse Quantization
- Dice vs. coin: quantize {1,2,3} -> head, {4,5,6} -> tail (each face has probability 1/6, each side 1/2)
- Effects of quantization: data compression; information is lost, but not all of it
- Example output after quantization: ... H T H H T T ...
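In numbers (standard entropy values, not shown on the slide):

    H_{\text{die}} = \log_2 6 \approx 2.58 \text{ bits/throw}, \qquad
    H_{\text{coin}} = \log_2 2 = 1 \text{ bit/toss}

The quantization keeps the one head/tail bit of every throw and discards the remaining \log_2 3 \approx 1.58 bits that identified the face within {1,2,3} or {4,5,6}: compression with partial, but not total, information loss.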
pdf and information loss
- pdf: probability density function
- The narrower the pdf, the smaller the error at the same number of bits
- The narrower the pdf, the fewer bits needed at the same error
- Non-uniform pdf with a fixed step size: more error
- Non-uniform pdf with a variable step size (fine steps where the pdf is concentrated): less error
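A small experiment illustrating the first claim, assuming a Gaussian source and a 3-bit uniform quantizer whose range is matched to the source width:

    import numpy as np

    rng = np.random.default_rng(0)
    LEVELS = 8                                 # 3 bits per sample in both cases

    def quantize(x, sigma, levels=LEVELS):
        """Uniform quantizer whose levels span +/- 4*sigma of the source."""
        step = 8.0 * sigma / levels
        return np.clip(step * np.round(x / step), -4.0 * sigma, 4.0 * sigma)

    for sigma in (1.0, 0.2):                   # wide pdf vs. narrow pdf
        x = rng.normal(0.0, sigma, 100_000)
        mse = np.mean((x - quantize(x, sigma)) ** 2)
        print(f"sigma = {sigma}: MSE = {mse:.5f}")
    # Narrower pdf -> smaller step at the same 3 bits -> less error.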
Correlation in text: memoryless vs. with memory
- I(x) = log2(1/p_x) = "degree of surprise"
- qu-, re-, th-, -tion: less uncertain (of course, there are exceptions: Qatar, Qantas)
- Conditional probability: p(u|q) >> p(u), so I(u|q) << I(u); likewise I(n|tio) << I(n)
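In numbers, with assumed letter statistics (roughly in line with published English frequency tables, not given on the slide):

    import math

    p_u = 0.028            # P(u) in ordinary English text (assumed)
    p_u_given_q = 0.98     # P(u | previous letter is q): nearly certain (assumed)

    I_u = math.log2(1.0 / p_u)                  # ~5.2 bits of surprise
    I_u_given_q = math.log2(1.0 / p_u_given_q)  # ~0.03 bits: almost none
    print(f"I(u) = {I_u:.2f} bits, I(u|q) = {I_u_given_q:.3f} bits")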
Differential Pulse-Coded Modulation (DPCM)
- Quantize not x[n] but the prediction error d[n] = x[n] - x̂[n], where x̂[n] is predicted from past samples
- Principle: the pdf of d[n] is narrower than that of x[n]
- Less error at the same number of bits; less data at the same error
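A minimal DPCM sketch, assuming the simplest predictor (the previous reconstructed sample) and an illustrative step size:

    def dpcm_encode(x, step):
        """Quantize the prediction error d[n] = x[n] - x_hat[n], where
        x_hat[n] is the previous *reconstructed* sample (so encoder and
        decoder predictions stay in sync)."""
        prev, codes = 0.0, []
        for sample in x:
            d = sample - prev                  # prediction error: narrow pdf
            q = int(round(d / step))           # quantize d[n], not x[n]
            codes.append(q)
            prev = prev + q * step             # decoder-side reconstruction
        return codes

    def dpcm_decode(codes, step):
        prev, recon = 0.0, []
        for q in codes:
            prev = prev + q * step
            recon.append(prev)
        return recon

    x = [0.0, 0.1, 0.25, 0.3, 0.28, 0.2]       # slowly varying media signal
    codes = dpcm_encode(x, step=0.05)
    print(codes)                               # [0, 2, 3, 1, 0, -2]: small integers
    print(dpcm_decode(codes, step=0.05))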
Coding *.zip
- Coding: a series of symbols -> bits
- Requirements: uniquely decodable; as few bits as possible
- Example source: ... red blue yellow yellow red black red ...
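A sketch of a uniquely decodable code for this source; the codeword table is an assumption for illustration, with the prefix property (no codeword is a prefix of another) guaranteeing one-way parsing:

    CODE = {"red": "0", "yellow": "10", "blue": "110", "black": "111"}

    def encode(symbols):
        return "".join(CODE[s] for s in symbols)

    def decode(bits):
        inverse = {v: k for k, v in CODE.items()}
        out, buf = [], ""
        for b in bits:
            buf += b
            if buf in inverse:        # prefix property: first hit is the symbol
                out.append(inverse[buf])
                buf = ""
        return out

    msg = ["red", "blue", "yellow", "yellow", "red", "black", "red"]
    bits = encode(msg)
    assert decode(bits) == msg
    print(bits)                       # 0110101001110: 13 bits for 7 symbols

By contrast, a table like {red: 0, blue: 01, yellow: 1} is not uniquely decodable: the stream "01" could mean "red yellow" or "blue".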
Huffman code
- How close can the average code length get to the entropy? A solution is the Huffman code
- Used in *.mp3, *.avi, *.mp4
- Ex) Encode/decode AADHBEA
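A minimal Huffman sketch for the eight symbols A-H; the probabilities are assumptions, since the slide's table did not survive the transcript:

    import heapq
    import math
    from itertools import count

    # Hypothetical symbol probabilities (illustrative only).
    probs = {"A": 0.30, "B": 0.20, "C": 0.15, "D": 0.12,
             "E": 0.10, "F": 0.06, "G": 0.04, "H": 0.03}

    def huffman(probs):
        """Repeatedly merge the two least probable subtrees, prepending
        0/1 to the codewords on each side."""
        tiebreak = count()                     # keeps heap comparisons valid
        heap = [(p, next(tiebreak), {s: ""}) for s, p in probs.items()]
        heapq.heapify(heap)
        while len(heap) > 1:
            p1, _, c1 = heapq.heappop(heap)
            p2, _, c2 = heapq.heappop(heap)
            merged = {s: "0" + w for s, w in c1.items()}
            merged.update({s: "1" + w for s, w in c2.items()})
            heapq.heappush(heap, (p1 + p2, next(tiebreak), merged))
        return heap[0][2]

    code = huffman(probs)
    bits = "".join(code[s] for s in "AADHBEA")  # encode the slide's example
    avg = sum(p * len(code[s]) for s, p in probs.items())
    H = sum(p * math.log2(1.0 / p) for p in probs.values())
    print(bits)
    print(f"average length = {avg:.2f} bits/symbol vs. entropy H = {H:.2f}")
    # Decoding walks the bit stream using the prefix property, as sketched earlier.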
Other codes
- Arithmetic code: used in HDTV
- ZIP-family codes: winzip, compress, etc.
Channel Capacity
Digital communications
Spectrum auction???
Claude Shannon (1916-2001): American electrical engineer who founded information theory with his 1948 paper "A Mathematical Theory of Communication"
Channel model
- Additive Gaussian noise: y(t) = x(t) + n(t), SNR = P/No
- n(t): 2-D Gaussian noise
- [Figure: two equiprobable levels, A = -5V and B = +5V, each spread by Gaussian noise of width σ]
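A minimal simulation of this channel, assuming binary signaling at the slide's ±5V levels and an illustrative noise width:

    import numpy as np
    from math import erfc, sqrt

    rng = np.random.default_rng(0)
    sigma = 2.0                                # illustrative noise std dev

    bits = rng.integers(0, 2, 100_000)
    x = np.where(bits == 1, 5.0, -5.0)         # A = -5V, B = +5V, each w.p. 1/2
    y = x + rng.normal(0.0, sigma, x.shape)    # y(t) = x(t) + n(t)

    decided = (y > 0).astype(int)              # threshold detector at 0V
    print(f"simulated BER = {np.mean(decided != bits):.4f}")
    print(f"theory Q(5/σ) = {0.5 * erfc(5.0 / (sigma * sqrt(2))):.4f}")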
Multi-dimensional Gaussian channel
- Noise power σ² = N0·W
- Count the small (noise) spheres that fit inside the large (signal + noise) sphere
- Real/complex: W trials per second over a W Hz band
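The standard sphere-packing count behind these bullets (a sketch of the usual derivation, with n = 2WT signal dimensions sent over T seconds):

    M \le \left( \frac{\sqrt{n(P + \sigma^2)}}{\sqrt{n\sigma^2}} \right)^{n}
      = \left( 1 + \frac{P}{\sigma^2} \right)^{n/2},
    \qquad
    C = \frac{\log_2 M}{T} = W \log_2\!\left( 1 + \frac{P}{N_0 W} \right)

Each codeword carries a noise sphere of radius \sqrt{n\sigma^2}, and all received vectors lie in a sphere of radius \sqrt{n(P+\sigma^2)}; substituting \sigma^2 = N_0 W and n = 2WT gives the Shannon capacity.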
Digital communications in an AWGN channel
- Shannon equation: C = W log2(1 + SNR) [bps], spectral efficiency C/W [bps/Hz]
- CATV (5 MHz, 80 Mbps)
- Satellite TV (5 MHz, 30 Mbps)
- Terrestrial TV (5 MHz, 20 Mbps)
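A quick check of what these throughputs demand, inverting the Shannon equation (bandwidths and rates as quoted on the slide):

    import math

    def required_snr_db(rate_bps, bandwidth_hz):
        """Invert C = W*log2(1 + SNR): minimum SNR needed to reach the rate."""
        snr = 2.0 ** (rate_bps / bandwidth_hz) - 1.0
        return 10.0 * math.log10(snr)

    for name, rate in [("CATV", 80e6), ("satellite TV", 30e6),
                       ("terrestrial TV", 20e6)]:
        eff = rate / 5e6
        print(f"{name}: {eff:.0f} bps/Hz -> needs SNR >= "
              f"{required_snr_db(rate, 5e6):.1f} dB")
    # CATV's 16 bps/Hz requires roughly 48 dB of SNR; terrestrial's 4 bps/Hz
    # only about 12 dB, which is why cable can carry so much more.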
Conclusion
- Amount of information: entropy
- Shannon's channel capacity
- Digital communications, digital multimedia!!
Q&A