Communication Amid Uncertainty

Presentation transcript:

Communication Amid Uncertainty
Madhu Sudan, Harvard University
Based on many joint works …
January 5, 2018

Theories of Communication & Computation
- Computing theory (Turing ‘36): Fundamental principle = Universality. You can program your computer to do whatever you want.
- Communication principle (Shannon ‘48): Centralized design (encoder, decoder, compression, IPv4, TCP/IP). You can NOT program your device!

Behavior of “intelligent” systems
- Players: humans/computers
- Aspects: acquisition of knowledge; analysis/processing; communication/dissemination
- Mathematical modelling: explains limits; highlights non-trivial phenomena/mechanisms
- Limits apply also to human behavior!

Contribution of Shannon theory: Entropy!
- Thermodynamics (Clausius/Boltzmann): $H = \ln \Omega$
- Quantum mechanics (von Neumann): $S(\rho) = -\mathrm{Tr}(\rho \ln \rho)$
- Random variables (Shannon): $H(P) = -\sum_x P(x) \log P(x)$
- Profound impact: on the technology of communication/data; on linguistics, philosophy, sociology, neuroscience. See Information by James Gleick.
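As a quick worked instance of the Shannon formula: for $P = (1/2, 1/4, 1/4)$, $H(P) = \frac{1}{2}\log_2 2 + \frac{1}{4}\log_2 4 + \frac{1}{4}\log_2 4 = 1.5$ bits.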

Entropy: Operational view
- For a random variable $m$: Alice and Bob both know the distribution $P$ of $m$.
- Alice observes $m \sim P$ and is tasked with communicating $m$ to Bob.
- How many bits (in expectation) does she need to send?
- Theorem [Shannon/Huffman]: Entropy! $H(P) \le \text{Communication} \le H(P) + 1$
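A minimal Python sketch of this bound (illustrative only; the distribution `P` below is example data, not from the talk): it builds a Huffman code for `P` and checks that the expected code length lies between $H(P)$ and $H(P)+1$.

```python
import heapq
import math

def huffman_lengths(P):
    """Return code lengths of an optimal prefix-free (Huffman) code for P."""
    # Heap items: (probability, tiebreak id, list of symbols in this subtree).
    heap = [(p, i, [i]) for i, p in enumerate(P)]
    heapq.heapify(heap)
    lengths = [0] * len(P)
    while len(heap) > 1:
        p1, _, s1 = heapq.heappop(heap)
        p2, i2, s2 = heapq.heappop(heap)
        for s in s1 + s2:          # every merge adds one bit to the merged symbols
            lengths[s] += 1
        heapq.heappush(heap, (p1 + p2, i2, s1 + s2))
    return lengths

def entropy(P):
    return -sum(p * math.log2(p) for p in P if p > 0)

P = [0.5, 0.2, 0.15, 0.1, 0.05]
L = huffman_lengths(P)
expected = sum(p * l for p, l in zip(P, L))
print(f"H(P) = {entropy(P):.3f}, E[length] = {expected:.3f}")
# Shannon/Huffman bound: H(P) <= E[length] < H(P) + 1
assert entropy(P) <= expected < entropy(P) + 1
```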

E.g., “Series of approximations to English”: “We can also approximate to a natural language by means of a series of simple artificial languages.”
- $i$th-order approximation: given the previous $i-1$ symbols, choose the $i$th according to the empirical distribution of the language conditioned on that length-$(i-1)$ prefix.
- Third-order (letter) approximation: “IN NO IST LAT WHEY CRATICT FROURE BIRS GROCID PONDENOME OF DEMONSTURES OF THE REPTAGIN IS REGOACTIONA OF CRE.”
- Second-order (word) approximation: “THE HEAD AND IN FRONTAL ATTACK ON AN ENGLISH WRITER THAT THE CHARACTER OF THIS POINT IS THEREFORE ANOTHER METHOD FOR THE LETTERS THAT THE TIME OF WHO EVER TOLD THE PROBLEM FOR AN UNEXPECTED.”
- “The $i$th-order approximation produces plausible sequences of length $2i$.”
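A minimal Python sketch of such an $i$th-order (Markov) approximation; the training string and the chosen order are placeholders for illustration, not from the talk.

```python
import random
from collections import defaultdict

def ith_order_sample(text, i, length, seed=0):
    """Generate `length` characters, each drawn from the empirical distribution
    of `text` conditioned on the preceding (i-1)-character prefix."""
    rng = random.Random(seed)
    # Count, for every (i-1)-length prefix, how often each next symbol follows it.
    counts = defaultdict(lambda: defaultdict(int))
    for k in range(len(text) - i + 1):
        prefix, nxt = text[k:k + i - 1], text[k + i - 1]
        counts[prefix][nxt] += 1
    out = text[:i - 1]                       # seed with an observed prefix
    while len(out) < length:
        dist = counts.get(out[-(i - 1):] if i > 1 else "")
        if not dist:                         # unseen prefix: restart from the seed
            out += text[:i - 1]
            continue
        symbols, weights = zip(*dist.items())
        out += rng.choices(symbols, weights=weights)[0]
    return out

corpus = "communication amid uncertainty requires context shared between sender and receiver "
print(ith_order_sample(corpus * 5, i=3, length=80))
```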

Entropy applies to human communication?
- Ideal world: Language = the collection of messages we send each other + a probability distribution over messages. Dictionary = messages → words. The optimal dictionary would achieve the entropy of the distribution.
- Real world: Context! Language = a distribution for every context. Dictionary = (message, context) → word.
- Challenge: context is not perfectly shared!

Uncertainty in communication
A repeating theme in human communication (and increasingly in devices): the communication task comes with context.
- Ignore context: task achievable, but inefficiently.
- Perfectly shared context (designed setting): task achievable efficiently.
- Imperfectly shared context (humans): task achievable moderately efficiently? Non-trivial; room for creative (robust) solutions.

Uncertain Compression
Design encoding/decoding schemes $(E, D)$ such that:
- Sender has distribution $P$ on $[N]$; receiver has distribution $Q$ on $[N]$.
- Sender gets $m \in [N]$ and sends $E(P, m)$ to the receiver.
- Receiver receives $y = E(P, m)$ and decodes it to $\hat{m} = D(Q, y)$.
- Want: $\hat{m} = m$ (provided $P, Q$ are close), while minimizing $\mathbb{E}_{m \sim P}\left[|E(P, m)|\right]$.
- Closeness measure: $\Delta(P, Q) = \max_{m \in [N]} \max\left\{ \log \frac{P(m)}{Q(m)}, \log \frac{Q(m)}{P(m)} \right\}$
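A quick worked instance of the closeness measure: for $P = (1/2, 1/4, 1/4)$ and $Q = (1/4, 1/2, 1/4)$, the largest log-ratio is $\log_2 \frac{1/2}{1/4} = 1$ (attained at either of the first two messages), so $\Delta(P, Q) = 1$ bit.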

Natural Compression
- Dictionary: words $\{w_{m,j} \mid m \in [N], j \in \mathbb{N}\}$ with $w_{m,j}$ of length $j$ — one word of each length $j$ for each message $m$.
- Encoding/Expression: given $m, P$: pick a “large enough” $j$ and send $w_{m,j}$.
- Decoding/Understanding: given $w, Q$: among all $m$ with $w_{m,j} = w$ (where $j = |w|$), output the one that maximizes $Q(m)$.
- Theorem [JKKS]: if the dictionary is random, then the expected length is $H(P) + 2\Delta(P, Q)$.
- Deterministic dictionary? Open! [Haramaty+S]
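A minimal Python simulation of this scheme with a random dictionary. This is an illustrative sketch only: the specific length rule $j \approx \log_2(1/P(m)) + 2\Delta + 1$ and the example priors are assumptions, not the exact choices analyzed in [JKKS].

```python
import math
import random

def random_dictionary(N, max_len, seed=0):
    """One uniformly random binary word of each length j for every message m."""
    rng = random.Random(seed)
    return {(m, j): tuple(rng.getrandbits(1) for _ in range(j))
            for m in range(N) for j in range(1, max_len + 1)}

def delta(P, Q):
    """Delta(P, Q) = max_m max{log P(m)/Q(m), log Q(m)/P(m)}, in bits."""
    return max(abs(math.log2(P[m] / Q[m])) for m in range(len(P)))

def encode(m, P, d, dictionary):
    """Pick a word just long enough that a decoder whose prior is within
    distance d of P is unlikely to prefer a colliding message (heuristic rule)."""
    j = max(1, math.ceil(math.log2(1 / P[m]) + 2 * d + 1))
    return dictionary[(m, j)]

def decode(w, Q, dictionary):
    """Among messages whose length-|w| word equals w, return the Q-likeliest."""
    j = len(w)
    candidates = [m for m in range(len(Q)) if dictionary[(m, j)] == w]
    return max(candidates, key=lambda m: Q[m]) if candidates else None

P = [0.45, 0.25, 0.15, 0.10, 0.05]   # sender's prior
Q = [0.30, 0.30, 0.20, 0.12, 0.08]   # receiver's close-but-different prior
d = delta(P, Q)
D = random_dictionary(N=len(P), max_len=16)
for m in range(len(P)):
    w = encode(m, P, d, D)
    print(m, decode(w, Q, D), len(w))
```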

Other Contexts in Communication
- Example 1: Common randomness. Randomness shared between sender and receiver often makes communication efficient. Context = randomness; imperfect sharing = shared correlations. Theorem [CGKS]: communication with imperfect sharing is bounded by communication with perfect sharing!
- Example 2: Uncertain functionality. Conversations are often short if the goal of the communication is known and incorporated into the conversation (formalized by [Yao ’80]). What if the goal is not perfectly understood by sender and receiver? Theorem [GKKS]: one-way communication is roughly preserved.

Conclusions
- Pressing need to understand human communication.
- Context in communication is huge and plays a huge role.
- Uncertainty about context is a consequence of “intelligence” (universality). It injects ambiguity, misunderstanding, vulnerabilities …
- Needs new exploration to resolve.

Thank You!