Communication Amid Uncertainty


Communication Amid Uncertainty
Madhu Sudan, Microsoft Research
Based on joint works with Brendan Juba, Oded Goldreich, Adam Kalai, Sanjeev Khanna, Elad Haramaty, Jacob Leshno, Clement Canonne, Venkatesan Guruswami, Badih Ghazi, Pritish Kamath, Ilan Komargodski, and Pravesh Kothari.
August 31, 2015, Harvard.

Context in Communication
- Sender and receiver share (huuuge) context.
  - In human communication: language, news, social setting.
  - In computer communication: protocols, codes, distributions.
- Shared context helps compress communication.
- Perfectly shared ⇒ can be abstracted away.
- Imperfectly shared ⇒ what is the cost? How to study it?

Communication Complexity
The model (with shared randomness $R$): Alice holds $x$, Bob holds $y$, and they want to compute $f:(x,y) \mapsto \Sigma$; the protocol must output $f(x,y)$ with probability $2/3$. $CC(f)$ = number of bits exchanged by the best such protocol.
Communication complexity is usually studied for lower bounds. This talk: CC as a positive model.

Modelling Shared Context + Imperfection
Many possibilities; an ongoing effort. Three kinds of imperfection:
1. Inputs: Alice and Bob may have only estimates of $x$ and $y$; more generally, $x$ and $y$ are correlated.
2. Shared randomness: Alice and Bob may not have identical copies.
3. Knowledge of $f$: the function Bob wants to compute may not be exactly known to Alice!

Part 1: Uncertain Compression

Classical (One-Shot) Compression
- Sender and receiver share a distribution $P$ on $[N]$, and agree on an encoder/decoder pair $E/D$.
- Sender gets $X \in [N]$; sends $E(X)$.
- Receiver gets $Y = E(X)$; decodes $\hat{X} = D(Y)$.
- Requirement: $\hat{X} = X$ (always).
- Performance: $\mathbb{E}_{X \leftarrow P}[|E(X)|]$.
- Trivial solution: $\mathbb{E}_{X \leftarrow P}[|E(X)|] = \log N$.
- Huffman coding achieves $\mathbb{E}_{X \leftarrow P}[|E(X)|] \le H(P) + 1$.
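To make the bound concrete, here is a minimal Python sketch of Huffman coding (an illustration added here, not from the slides; the example distribution is arbitrary):

```python
import heapq
import math

def huffman_code(P):
    """Build a Huffman code for a distribution P (dict: symbol -> prob).
    Returns a dict: symbol -> bitstring."""
    # Heap entries: (probability, tiebreaker, {symbol: codeword-so-far}).
    heap = [(p, i, {x: ""}) for i, (x, p) in enumerate(P.items())]
    heapq.heapify(heap)
    while len(heap) > 1:
        p0, _, c0 = heapq.heappop(heap)   # two least likely subtrees
        p1, i1, c1 = heapq.heappop(heap)
        merged = {x: "0" + w for x, w in c0.items()}   # extend codewords
        merged.update({x: "1" + w for x, w in c1.items()})
        heapq.heappush(heap, (p0 + p1, i1, merged))
    return heap[0][2]

P = {1: 0.5, 2: 0.25, 3: 0.125, 4: 0.125}
code = huffman_code(P)
avg_len = sum(P[x] * len(code[x]) for x in P)
entropy = -sum(p * math.log2(p) for p in P.values())
assert avg_len <= entropy + 1   # Huffman guarantee: E|E(X)| <= H(P) + 1
```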

The Uncertain Compression Problem [Juba, Kalai, Khanna, S. '11]
Design encoding/decoding schemes $E/D$ such that:
- Sender has distribution $P$ on $[N]$; receiver has distribution $Q$ on $[N]$.
- Sender gets $X \in [N]$; sends $E(P,X)$ to the receiver.
- Receiver gets $Y = E(P,X)$; decodes $\hat{X} = D(Q,Y)$.
- Want: $\hat{X} = X$ (provided $P, Q$ are close), while minimizing $\mathbb{E}_{X \leftarrow P}[|E(P,X)|]$.
- Closeness: $\Delta(P,Q) \le \Delta$ if $2^{-\Delta} \le P(x)/Q(x) \le 2^{\Delta}$ for all $x$.
Motivation: models natural communication?

Solution (a variant of arithmetic coding)
- Uses shared randomness: sender and receiver share $r \in \{0,1\}^*$, which defines a "dictionary" of random infinite strings $r_1, r_2, \ldots, r_N$, one per element of $[N]$.
- Sender sends a prefix $r_x[1 \ldots L]$ as the encoding of $x$.
- Receiver outputs $\arg\max_z \{ Q(z) : r_z[1 \ldots L] = r_x[1 \ldots L] \}$.
- Want $L$ such that $r_z[1 \ldots L] = r_x[1 \ldots L] \Rightarrow Q(z) < Q(x)$; equivalently, $Q(z) > Q(x) \Rightarrow r_z[1 \ldots L] \ne r_x[1 \ldots L]$; which is implied by: $P(z) > 4^{-\Delta} P(x) \Rightarrow r_z[1 \ldots L] \ne r_x[1 \ldots L]$ (see the sketch below).
- Analysis: $\mathbb{E}_r[L] = 2\Delta + \log\frac{1}{P(x)}$, so $\mathbb{E}_{x,r}[L] = 2\Delta + H(P)$.
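A small simulation of this scheme (a sketch with illustrative names: the infinite dictionary is truncated to 64 bits per entry, and the sender picks the shortest prefix meeting the sufficient condition above):

```python
import random

def make_dictionary(N, length, seed=0):
    """Shared random 'dictionary': one (ideally infinite) bit string r_z per
    z in [N], truncated to `length` bits for this simulation."""
    rng = random.Random(seed)
    return [[rng.randint(0, 1) for _ in range(length)] for _ in range(N)]

def encode(P, x, r, delta):
    """Sender: shortest prefix of r[x] differing from r[z] for every z with
    P(z) > 4^(-delta) * P(x) -- which covers every z with Q(z) > Q(x)."""
    rivals = [z for z in range(len(P)) if z != x and P[z] > 4 ** (-delta) * P[x]]
    L = 0
    while any(r[z][:L] == r[x][:L] for z in rivals):
        L += 1
    return r[x][:L]

def decode(Q, msg, r):
    """Receiver: the most likely z under Q whose dictionary entry matches."""
    L = len(msg)
    return max((z for z in range(len(Q)) if r[z][:L] == msg),
               key=lambda z: Q[z])

# Demo with example priors P, Q satisfying the closeness condition for delta=1.
P = [0.5, 0.25, 0.125, 0.125]
Q = [0.4, 0.3, 0.2, 0.1]
r = make_dictionary(len(P), 64)
for x in range(len(P)):
    assert decode(Q, encode(P, x, r, delta=1), r) == x
```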

Implications
- The coding scheme reflects the nature of human communication: extend messages until they feel unambiguous.
- Reflects the tension between ambiguity resolution and compression: the larger the (estimated) gap in context, the larger the encoding length.
- Entropy is still a valid measure!
- The "shared randomness" assumption is a convenient starting point for discussion. But is the dictionary really independent of context? This is problematic.

Deterministic Compression: A Challenge
- Say Alice and Bob have rankings of $N$ players: bijections $\pi, \sigma : [N] \to [N]$, where $\pi(i)$ = rank of the $i$-th player in Alice's ranking.
- Further suppose they know their rankings are close: $\forall i \in [N] : |\pi(i) - \sigma(i)| \le 2$.
- Bob wants to know: is $\pi^{-1}(1) = \sigma^{-1}(1)$? How many bits does Alice need to send (non-interactively)?
- With shared randomness: $O(1)$. Deterministically? $O(1)$? $O(\log N)$? $O(\log\log\log N)$?
- With Elad Haramaty: $O(\log^* n)$.

Part 2: Imperfectly Shared Randomness

Model: Imperfectly Shared Randomness
- Alice gets $r$ and Bob gets $s$, where $(r,s)$ is an i.i.d. sequence of correlated pairs $(r_i, s_i)_i$: $r_i, s_i \in \{-1,+1\}$; $\mathbb{E}[r_i] = \mathbb{E}[s_i] = 0$; $\mathbb{E}[r_i s_i] = \rho \ge 0$.
- Notation: $isr_\rho(f)$ = cc of $f$ with $\rho$-correlated bits; $cc(f)$ = cc with perfectly shared randomness ($= isr_1(f)$); $priv(f)$ = cc with private randomness ($= isr_0(f)$). Note $\rho \le \tau \Rightarrow isr_\rho(f) \ge isr_\tau(f)$.
- Starting point: for Boolean functions $f$, $cc(f) \le isr_\rho(f) \le priv(f) \le cc(f) + \log n$.
- What if $cc(f) \ll \log n$? E.g., $cc(f) = O(1)$?
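The model's correlated bits can be sampled as follows (a minimal sketch, not part of the model's definition): Bob's bit agrees with Alice's independently with probability $(1+\rho)/2$, which gives $\mathbb{E}[r_i s_i] = \rho$.

```python
import random

def correlated_bits(n, rho, rng=random):
    """Sample (r, s): i.i.d. pairs of +/-1 bits with E[r_i] = E[s_i] = 0
    and E[r_i s_i] = rho (Bob agrees with Alice w.p. (1 + rho)/2)."""
    r = [rng.choice((-1, 1)) for _ in range(n)]
    s = [ri if rng.random() < (1 + rho) / 2 else -ri for ri in r]
    return r, s

r, s = correlated_bits(100_000, rho=0.5)
print(sum(ri * si for ri, si in zip(r, s)) / len(r))   # empirically ~0.5
```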

Imperfectly Shared Randomness: Results
- Model first studied by [Bavarian, Gavinsky, Ito '14] ("independently and earlier"). Their focus: simultaneous communication and general models of correlation. They show $isr(\text{Equality}) = O(1)$ (among other things).
- Our results [Canonne, Guruswami, Meka, S. '15]:
  - Generally: $cc(f) \le k \Rightarrow isr(f) \le 2^k$.
  - Converse: $\exists f$ with $cc(f) \le k$ and $isr(f) \ge 2^k$.

Aside: Easy CC Problems [Ghazi, Kamath, S. '15]
Notation: for $x = (x_1, \ldots, x_n)$, $y = (y_1, \ldots, y_n)$, $\langle x,y \rangle \triangleq \sum_i x_i y_i$.
- Equality testing: $EQ(x,y) = 1 \Leftrightarrow x = y$; $CC(EQ) = O(1)$. Protocol: fix an error-correcting code $E : \{0,1\}^n \to \{0,1\}^N$; use shared randomness to pick $i \leftarrow [N]$; exchange $E(x)_i$, $E(y)_i$; accept iff $E(x)_i = E(y)_i$.
- Hamming distance: $H_k(x,y) = 1 \Leftrightarrow \Delta(x,y) \le k$; $CC(H_k) = O(k \log k)$ [Huang et al.]. $poly(k)$ protocol: use common randomness to hash $[n] \to [k^2]$.
- Small set intersection: $\cap_k(x,y) = 1 \Leftrightarrow wt(x), wt(y) \le k$ and $\exists i$ s.t. $x_i = y_i = 1$; $CC(\cap_k) = O(k)$ [Håstad, Wigderson].
- Gap (real) inner product: $x, y \in \mathbb{R}^n$, $\|x\|_2 = \|y\|_2 = 1$; $GIP_{c,s}(x,y) = 1$ if $\langle x,y \rangle \ge c$, $= 0$ if $\langle x,y \rangle \le s$; $CC(GIP_{c,s}) = O\!\left(\frac{1}{(c-s)^2}\right)$ [Alon, Matias, Szegedy]. Main insight: if $G \leftarrow N(0,1)^n$, then $\mathbb{E}[\langle G,x \rangle \cdot \langle G,y \rangle] = \langle x,y \rangle$.
Unstated philosophical contribution of CC à la Yao: communication with a focus ("only need to determine $f(x,y)$") can be more effective (shorter than $|x|$, $H(x)$, $H(y)$, $I(x;y)$, ...).
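As an illustration of the equality protocol, here is a sketch that instantiates $E$ as the Hadamard code (coordinate $i \in \{0,1\}^n$ of $E(x)$ is $\langle x,i \rangle \bmod 2$, so distinct inputs disagree on half the coordinates); the repetition count used to amplify the success probability is our choice, not from the slides:

```python
import random

def equality_protocol(x, y, reps=20, seed=42):
    """Shared-randomness equality test. ECC = Hadamard code: E(x)_i = <x, i>
    mod 2. Compare `reps` shared-random coordinates; if x != y, each
    comparison catches the difference w.p. 1/2."""
    rng = random.Random(seed)   # stand-in for the shared random string
    for _ in range(reps):
        i = [rng.randint(0, 1) for _ in range(len(x))]   # random coordinate
        if sum(a * b for a, b in zip(x, i)) % 2 != \
           sum(a * b for a, b in zip(y, i)) % 2:
            return False        # definitely unequal
    return True                 # equal; correct w.p. 1 - 2^(-reps)

x = [1, 0, 1, 1, 0, 1]
assert equality_protocol(x, x)
assert not equality_protocol(x, [1, 0, 1, 1, 0, 0])
```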

Equality Testing (our proof)
- Key idea: think inner products. Encode $x \mapsto X = E(x)$, $y \mapsto Y = E(y)$ with $X, Y \in \{-1,+1\}^N$, so that $x = y \Rightarrow \langle X,Y \rangle = N$ and $x \ne y \Rightarrow \langle X,Y \rangle \le N/2$.
- Estimating inner products, building on sketching protocols (the "Gaussian protocol", sketched below):
  - Alice: picks Gaussians $G_1, \ldots, G_t \in \mathbb{R}^N$ and sends the $i \in [t]$ maximizing $\langle G_i, X \rangle$ to Bob.
  - Bob: accepts iff $\langle G'_i, Y \rangle \ge 0$.
- Analysis: $O_\rho(1)$ bits suffice if $G \approx_\rho G'$.
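A simulation sketch of the Gaussian protocol (the coordinate-wise correlation model for $G'$, namely $G'_j = \rho G_j + \sqrt{1-\rho^2}\cdot$ fresh noise, and all parameters are illustrative assumptions); Alice communicates only the $\log t$ bits of the index $i$:

```python
import math
import random

def gaussian_protocol(X, Y, rho=0.9, t=64, seed=1):
    """Alice and Bob hold rho-correlated Gaussian vectors G_i, G'_i.
    Alice sends i = argmax_j <G_j, X> (log t bits); Bob accepts iff
    <G'_i, Y> >= 0."""
    rng = random.Random(seed)
    G, Gp = [], []
    for _ in range(t):
        g = [rng.gauss(0, 1) for _ in range(len(X))]
        gp = [rho * gj + math.sqrt(1 - rho ** 2) * rng.gauss(0, 1) for gj in g]
        G.append(g); Gp.append(gp)
    i = max(range(t), key=lambda j: sum(gj * xj for gj, xj in zip(G[j], X)))
    return sum(gj * yj for gj, yj in zip(Gp[i], Y)) >= 0

N = 200
X = [random.choice((-1, 1)) for _ in range(N)]
Y = [-x for x in X]                # <X, Y> = -N: far from equal
print(gaussian_protocol(X, X))     # True w.h.p. (equal inputs)
print(gaussian_protocol(X, Y))     # False w.h.p. (far inputs)
```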

General One-Way Communication
Idea: all communication reduces to inner products. (For now, assume the one-way cc of $f$ is $\le k$.)
For each random string $R$:
- Alice's message $= i_R \in [2^k]$;
- Bob's output $= f_R(i_R)$, where $f_R : [2^k] \to \{0,1\}$;
- with probability $\ge 2/3$ over $R$, $f_R(i_R)$ is the right answer.

General One-Way Communication (continued)
- Vector representation: $i_R \mapsto x_R \in \{0,1\}^{2^k}$ (unit coordinate vector); $f_R \mapsto y_R \in \{0,1\}^{2^k}$ (truth table of $f_R$). Then $f_R(i_R) = \langle x_R, y_R \rangle$, and the acceptance probability is proportional to $\langle X, Y \rangle$, where $X = (x_R)_R$ and $Y = (y_R)_R$.
- The Gaussian protocol estimates inner products of unit vectors to within $\pm\epsilon$ with $O_\rho\!\left(\frac{1}{\epsilon^2}\right)$ communication.
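A toy illustration of this vector representation (our example: a 1-bit one-way equality protocol in which Alice sends $\langle R, x \rangle \bmod 2$); the acceptance probability comes out as an average of inner products $\langle x_R, y_R \rangle$:

```python
import random

def dot2(R, v):
    return sum(r * vi for r, vi in zip(R, v)) % 2

def accept_prob(x, y, Rs):
    """Acceptance probability of the toy protocol, computed via the vector
    representation: message i_R -> unit vector x_R in {0,1}^(2^k); Bob's
    decision rule f_R -> its truth table y_R; then f_R(i_R) = <x_R, y_R>."""
    total = 0
    for R in Rs:
        i_R = dot2(R, x)                                # Alice's message (k = 1)
        x_R = [0, 0]; x_R[i_R] = 1                      # unit coordinate vector
        y_R = [int(b == dot2(R, y)) for b in range(2)]  # truth table of f_R
        total += sum(u * v for u, v in zip(x_R, y_R))   # <x_R, y_R> = f_R(i_R)
    return total / len(Rs)

rng = random.Random(7)
n = 8
Rs = [[rng.randint(0, 1) for _ in range(n)] for _ in range(1000)]
x = [rng.randint(0, 1) for _ in range(n)]
y = x[:]; y[0] ^= 1
print(accept_prob(x, x, Rs))   # 1.0: equal inputs always accepted
print(accept_prob(x, y, Rs))   # ~0.5: unequal inputs accepted w.p. ~1/2
```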

Two-Way Communication
- Still decided by inner products.
- Simple lemma: there exist convex sets $K_A^k, K_B^k \subseteq \mathbb{R}^{2^k}$ describing private-coin $k$-bit communication strategies for Alice and Bob, such that the acceptance probability of $\pi_A \in K_A^k$, $\pi_B \in K_B^k$ equals $\langle \pi_A, \pi_B \rangle$.
- Putting things together, Theorem: $cc(f) \le k \Rightarrow isr(f) \le O_\rho(2^k)$.

Part 3: Uncertain Functionality

Model
- Bob wishes to compute $f(x,y)$; Alice knows $g \approx f$. Alice and Bob are given $g, f$ explicitly (input size $\sim 2^n$).
- Modelling questions: What is $\approx$? Is it reasonable to expect to compute $f(x,y)$? E.g., if $f(x,y) = f'(x)$, Bob can't compute $f(x,y)$ without Alice communicating $x$.
- Answers: Assume $(x,y) \sim \{0,1\}^n \times \{0,1\}^n$ uniformly; $f \approx_\delta g$ if $\delta(f,g) \le \delta$; and it suffices to compute $h(x,y)$ for some $h \approx_\epsilon f$.

Results - 1
- Thm [Ghazi, Komargodski, Kothari, S.]: $\exists f, g, \mu$ s.t. $cc^{1way}_{\mu,.1}(f),\ cc^{1way}_{\mu,.1}(g) = 1$ and $\delta_\mu(f,g) = o(1)$, but the uncertain communication is $\Omega(\sqrt{n})$.
- Thm [GKKS]: But not if $x \perp y$ (in the 1-way setting). (2-way, even 2-round: open!)
- Main idea: a canonical 1-way protocol for $f$: Alice and Bob share random $y_1, \ldots, y_m \in \{0,1\}^n$; Alice sends $f(x,y_1), \ldots, f(x,y_m)$ to Bob (see the sketch below). The protocol was used previously, but not as "canonical". The canonical protocol is robust when $f \approx g$.
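A sketch of the canonical protocol (our illustration: Bob matches fingerprints tolerantly by brute force, and the demo takes $g = f$ for simplicity; the robustness claim is that tolerant matching still works when $\delta_\mu(f,g)$ is small):

```python
import random
from itertools import product

def canonical_one_way(g, f, x, y, m=32, n=6, seed=3):
    """Alice (who knows g) sends g(x, y_1), ..., g(x, y_m) for shared random
    y_i. Bob (who knows f) picks the x' whose f-fingerprint is closest to the
    message and outputs f(x', y). Brute-force search: only the communication
    (m bits) is small, not the computation."""
    rng = random.Random(seed)
    ys = [tuple(rng.randint(0, 1) for _ in range(n)) for _ in range(m)]
    msg = [g(x, yi) for yi in ys]                      # Alice's m-bit message
    best = min(product((0, 1), repeat=n),
               key=lambda xp: sum(f(xp, yi) != b for yi, b in zip(ys, msg)))
    return f(best, y)

f = lambda x, y: sum(a & b for a, b in zip(x, y)) % 2   # inner product mod 2
g = f   # stand-in for a function close to f (identical here, for the demo)
x, y = (1, 0, 1, 1, 0, 1), (0, 1, 1, 0, 1, 0)
assert canonical_one_way(g, f, x, y) == f(x, y)
```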

Conclusions
- A positive view of communication complexity: communication with a focus can be effective!
- Context is important: a new layer of uncertainty; a new notion of scale (context is LARGE); importance of $o(\log n)$ additive factors.
- Many "uncertain" problems can be solved without resolving the uncertainty.
- Many open directions and questions (which is a good thing).

Thank You!