Communication Amid Uncertainty — Presentation transcript

1 Communication Amid Uncertainty
Madhu Sudan, Harvard University
Based on many joint works …
January 5, 2018

2 Theories of Communication & Computation
- Computing theory (Turing '36): fundamental principle = universality. You can program your computer to do whatever you want.
- Communication theory (Shannon '48): fundamental principle = centralized design (encoder, decoder, compression, IPv4, TCP/IP). You can NOT program your device!

3 Behavior of “intelligent” systems
- Players: humans/computers.
- Aspects: acquisition of knowledge; analysis/processing; communication/dissemination.
- Mathematical modelling: explains limits; highlights non-trivial phenomena/mechanisms.
- Limits apply also to human behavior!

4 Contribution of Shannon theory: Entropy!
- Thermodynamics (Clausius/Boltzmann): $H = \ln \Omega$
- Quantum mechanics (von Neumann): $S(\rho) = -\mathrm{Tr}(\rho \ln \rho)$
- Random variables (Shannon): $H(P) = -\sum_x P(x) \log P(x)$
Profound impact on the technology of communication and data, and on linguistics, philosophy, sociology, neuroscience (see "The Information" by James Gleick).
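As a concrete companion to these formulas, here is a minimal Python sketch (assuming NumPy; the example distribution is made up for illustration) computing Shannon entropy from a probability vector and von Neumann entropy from the eigenvalues of a density matrix. For a diagonal $\rho$ the two coincide up to the base of the logarithm.

```python
import numpy as np

def shannon_entropy(P):
    """H(P) = -sum_x P(x) log2 P(x), in bits."""
    P = np.asarray(P, dtype=float)
    P = P[P > 0]                      # convention: 0 log 0 = 0
    return -np.sum(P * np.log2(P))

def von_neumann_entropy(rho):
    """S(rho) = -Tr(rho ln rho), in nats, via the eigenvalues of rho."""
    lam = np.linalg.eigvalsh(rho)
    lam = lam[lam > 1e-12]
    return -np.sum(lam * np.log(lam))

P = [0.5, 0.25, 0.125, 0.125]
print(shannon_entropy(P))             # 1.75 bits
rho = np.diag(P)                      # a diagonal density matrix with spectrum P
print(von_neumann_entropy(rho))       # 1.75 * ln(2) nats
print(shannon_entropy(P) * np.log(2)) # same value
```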

5 Communication Amid Uncertainty
Entropy, operational view: for a random variable $m$,
- Alice and Bob both know the distribution $P$ of $m$.
- Alice observes $m \sim P$ and is tasked with communicating $m$ to Bob.
- How many bits (in expectation) does she need to send?
Theorem [Shannon/Huffman]: Entropy!
$$H(P) \le \mathrm{Communication} \le H(P) + 1$$
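A minimal sketch of the achievability side, assuming the standard greedy Huffman construction (plain Python, toy distribution): the resulting expected codeword length lands between $H(P)$ and $H(P)+1$, as the theorem promises.

```python
import heapq
from math import log2

def huffman_lengths(P):
    """Codeword lengths of a Huffman code for P (dict: symbol -> probability)."""
    # Heap entries: (subtree probability, tiebreaker, {symbol: depth in subtree}).
    heap = [(p, i, {x: 0}) for i, (x, p) in enumerate(P.items())]
    heapq.heapify(heap)
    tie = len(heap)
    while len(heap) > 1:
        p1, _, d1 = heapq.heappop(heap)   # merge the two least likely subtrees;
        p2, _, d2 = heapq.heappop(heap)   # every symbol in them moves one bit deeper
        heapq.heappush(heap, (p1 + p2, tie,
                              {x: d + 1 for x, d in {**d1, **d2}.items()}))
        tie += 1
    return heap[0][2]

P = {'a': 0.4, 'b': 0.3, 'c': 0.2, 'd': 0.1}
lengths = huffman_lengths(P)
H = -sum(p * log2(p) for p in P.values())
avg = sum(P[x] * lengths[x] for x in P)
print(f"H(P) = {H:.3f} <= E[length] = {avg:.3f} <= H(P)+1 = {H + 1:.3f}")
```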

6 E.g. “Series of approx. to English”
"We can also approximate to a natural language by means of a series of simple artificial languages."
- $i$-th order approximation: given the preceding $i-1$ symbols, choose the $i$-th according to the empirical distribution of the language conditioned on that length-$(i-1)$ prefix.
- Third-order (letter) approximation: "IN NO IST LAT WHEY CRATICT FROURE BIRS GROCID PONDENOME OF DEMONSTURES OF THE REPTAGIN IS REGOACTIONA OF CRE."
- Second-order word approximation: "THE HEAD AND IN FRONTAL ATTACK ON AN ENGLISH WRITER THAT THE CHARACTER OF THIS POINT IS THEREFORE ANOTHER METHOD FOR THE LETTERS THAT THE TIME OF WHO EVER TOLD THE PROBLEM FOR AN UNEXPECTED."
- The $i$-th order approximation produces plausible sequences of length about $2i$. A sketch of the construction follows below.
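A toy Python sketch of the $i$-th order letter approximation: tabulate the empirical next-letter distribution for every $(i-1)$-letter prefix of a corpus, then sample from it. The file `corpus.txt` is a hypothetical placeholder; any long English text would do.

```python
import random
from collections import Counter, defaultdict

def ith_order_sample(text, i, length=200, seed=1):
    """Generate `length` symbols from the empirical i-th order letter model:
    each symbol is drawn conditioned on the preceding i-1 symbols."""
    rng = random.Random(seed)
    nexts = defaultdict(Counter)
    for k in range(len(text) - i + 1):
        prefix, nxt = text[k:k + i - 1], text[k + i - 1]
        nexts[prefix][nxt] += 1
    out = text[:i - 1]                    # seed with a prefix from the corpus
    for _ in range(length):
        key = out[len(out) - (i - 1):] if i > 1 else ""
        counts = nexts.get(key)
        if not counts:                    # prefix never seen: stop early
            break
        symbols, weights = zip(*counts.items())
        out += rng.choices(symbols, weights=weights)[0]
    return out

corpus = open("corpus.txt").read().upper()   # hypothetical corpus file
print(ith_order_sample(corpus, i=3))
```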

7 Entropy applies to human communication?
Ideal world:
- Language = a collection of messages we send each other + a probability distribution over messages.
- Dictionary = a map from messages to words.
- An optimal dictionary would achieve the entropy of the distribution.
Real world: context!
- Language = a distribution for every context.
- Dictionary = a map from (message, context) pairs to words.
Challenge: context is not perfectly shared!

8 Uncertainty in communication
A repeating theme in human communication (and increasingly in devices): the communication task comes with context.
- Ignore context: task achievable, but inefficiently.
- Perfectly shared context (designed settings): task achievable efficiently.
- Imperfectly shared context (humans): task achievable moderately efficiently? Non-trivial; room for creative (robust) solutions.

9 Uncertain Compression
Design encoding/decoding schemes $(E, D)$ such that:
- Sender has distribution $P$ on $[N]$; receiver has distribution $Q$ on $[N]$.
- Sender gets $m \in [N]$ and sends $E(P, m)$ to the receiver.
- Receiver receives $y = E(P, m)$ and decodes it to $\hat{m} = D(Q, y)$.
Want: $\hat{m} = m$ (provided $P, Q$ are close), while minimizing $\mathbb{E}_{m \sim P}\left[\,|E(P, m)|\,\right]$, where closeness is measured by
$$\Delta(P, Q) = \max_{m \in [N]} \max\left\{ \log \frac{P(m)}{Q(m)},\ \log \frac{Q(m)}{P(m)} \right\}.$$
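For intuition, $\Delta(P,Q)$ is just the worst-case absolute log-ratio between the two priors. A tiny Python sketch (toy distributions, full support assumed):

```python
from math import log2

def delta(P, Q):
    """Delta(P, Q) = max_m max(log2 P(m)/Q(m), log2 Q(m)/P(m)),
    i.e. the worst-case absolute log-ratio (full support assumed)."""
    return max(abs(log2(P[m] / Q[m])) for m in P)

P = {1: 0.5, 2: 0.25, 3: 0.25}
Q = {1: 0.25, 2: 0.5, 3: 0.25}
print(delta(P, Q))   # 1.0: the priors disagree by at most a factor of 2
```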

10 Communication Amid Uncertainty
Natural compression dictionary:
- Words: $\{ w_{m,j} \mid m \in [N], j \in \mathbb{N} \}$ with $w_{m,j}$ of length $j$: one word of each length $j$ for each message $m$.
- Encoding/expression: given $m, P$: pick a "large enough" $j$ and send $w_{m,j}$.
- Decoding/understanding: given $w, Q$: among all $m$ with $w_{m,j} = w$ (where $j = |w|$), output the one maximizing $Q(m)$.
Theorem [JKKS]: if the dictionary is random, then the expected length is $H(P) + 2\Delta(P, Q)$.
Deterministic dictionary? Open! [Haramaty+S]
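Below is a toy Python simulation of this scheme, under loudly stated assumptions: words are modeled as random bit strings, the sender is handed the exact $\Delta(P,Q)$ (in the model she would only know a bound on it), and "large enough" is taken as $j \approx \log_2(1/P(m)) + 2\Delta + \mathrm{slack}$. This parameter choice is my illustration, not the exact [JKKS] scheme.

```python
import math, random

def make_dictionary(N, max_len, seed=0):
    """One random word of each length j per message m; a length-j word is j random bits."""
    rng = random.Random(seed)
    return [[rng.getrandbits(j) if j > 0 else 0 for j in range(max_len + 1)]
            for _ in range(N)]

def encode(words, P, m, delta, slack=2):
    # Hypothetical "large enough" length: ceil(log2(1/P(m)) + 2*delta + slack).
    j = min(len(words[m]) - 1, math.ceil(math.log2(1 / P[m]) + 2 * delta + slack))
    return j, words[m][j]

def decode(words, Q, j, w):
    # Among all messages whose length-j word matches w, output the Q-likeliest.
    return max((m for m in range(len(words)) if words[m][j] == w),
               key=lambda m: Q[m])

rng = random.Random(1)
N = 64
raw = [rng.random() + 0.1 for _ in range(N)]      # +0.1 avoids tiny probabilities
P = [x / sum(raw) for x in raw]
Q_raw = [p * rng.uniform(0.75, 1.5) for p in P]   # receiver's perturbed prior
Q = [q / sum(Q_raw) for q in Q_raw]
d = max(abs(math.log2(p / q)) for p, q in zip(P, Q))

words = make_dictionary(N, max_len=24)
trials, avg_len, errors = 2000, 0.0, 0
for _ in range(trials):
    m = rng.choices(range(N), weights=P)[0]
    j, w = encode(words, P, m, d)
    avg_len += j / trials
    errors += decode(words, Q, j, w) != m

H = -sum(p * math.log2(p) for p in P)
print(f"H(P)={H:.2f}  delta={d:.2f}  avg length={avg_len:.2f}  errors={errors}/{trials}")
```

The printed average length should sit near $H(P) + 2\Delta + \mathrm{slack}$, with rare decoding errors caused by random-word collisions against higher-$Q$ messages.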

11 Other Contexts in Communication
Example 1: Common randomness.
- Randomness shared between sender and receiver often makes communication more efficient.
- Context = randomness; imperfect sharing = shared correlations.
- Theorem [CGKS]: communication with imperfect sharing is bounded in terms of communication with perfect sharing!
Example 2: Uncertain functionality.
- Conversations are often short when the goal of the communication is known and incorporated into the conversation (formalized by [Yao '80]).
- What if the goal is not perfectly understood by sender and receiver?
- Theorem [GKKS]: one-way communication is roughly preserved.

12 Communication Amid Uncertainty
Conclusions
- Pressing need to understand human communication.
- Context plays a huge role in communication.
- Uncertainty in context is a consequence of "intelligence" (universality); it injects ambiguity, misunderstanding, vulnerabilities …
- Needs new exploration to resolve.

13 Communication Amid Uncertainty
Thank You!

