
1 Communication Amid Uncertainty
Madhu Sudan, Harvard University. Based on joint works with Brendan Juba, Oded Goldreich, Adam Kalai, Sanjeev Khanna, Elad Haramaty, Jacob Leshno, Clement Canonne, Venkatesan Guruswami, Badih Ghazi, Pritish Kamath, Ilan Komargodski, and Pravesh Kothari. April 11, 2016, Skoltech.

2 Obligatory Sales Pitch
Most of communication theory [à la Shannon, Hamming] is built around a sender and receiver that are perfectly synchronized. Most human communication does not assume perfect synchronization. And increasingly, device-to-device communication cannot rely on it either. Can we build a mathematical theory of imperfectly synchronized communication? What are the questions, and what are the answers?

3 Context in Communication
In most forms of communication, sender and receiver share a (huge) amount of context. In human communication: language, news, social setting. In computer communication: protocols, codes, distributions. Context helps compress communication. Perfectly shared context can be abstracted away. Imperfectly shared context: what is the cost? How do we study it?

4 Communication Complexity
The model (with shared randomness): Alice holds $x$ and Bob holds $y$; they share randomness $R$ and wish to compute $f : (x,y) \mapsto \Sigma$, outputting $f(x,y)$ with probability $2/3$. $CC(f)$ = number of bits exchanged by the best protocol. Usually studied for lower bounds; this talk uses CC as a positive model.

5 Aside: Easy CC Problems [Ghazi, Kamath, S. '15]
Do there exist problems with large inputs and small communication? Yes:
Equality testing: $EQ(x,y)=1 \Leftrightarrow x=y$; $CC(EQ)=O(1)$.
Hamming distance: $H_k(x,y)=1 \Leftrightarrow \Delta(x,y)\le k$; $CC(H_k)=O(k\log k)$ [Huang et al.].
Small set intersection: $\cap_k(x,y)=1 \Leftrightarrow \mathrm{wt}(x),\mathrm{wt}(y)\le k$ and $\exists i$ s.t. $x_i=y_i=1$; $CC(\cap_k)=O(k)$ [Håstad, Wigderson].
Gap (real) inner product: $x,y\in\mathbb{R}^n$ with $\|x\|_2=\|y\|_2=1$ and $\langle x,y\rangle \triangleq \sum_i x_i y_i$; $GIP_{c,s}(x,y)=1$ if $\langle x,y\rangle\ge c$, $=0$ if $\langle x,y\rangle\le s$; $CC(GIP_{c,s})=O(1/(c-s)^2)$ [Alon, Matias, Szegedy].
Protocol for equality: fix an ECC $E:\{0,1\}^n\to\{0,1\}^N$; use shared randomness to pick $i\leftarrow[N]$; exchange $E(x)_i$ and $E(y)_i$; accept iff $E(x)_i=E(y)_i$.
Protocol for Hamming distance: use common randomness to hash $[n]\to[k^2]$; $\mathrm{poly}(k)$ communication.
Main insight for inner product: if $G\leftarrow N(0,1)^n$, then $\mathbb{E}[\langle G,x\rangle\cdot\langle G,y\rangle]=\langle x,y\rangle$.
Unstated philosophical contribution of CC à la Yao: communication with a focus ("only need to determine $f(x,y)$") can be more effective (shorter than $|x|$, $H(x)$, $H(y)$, $I(x;y)$, ...).
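To make the equality protocol concrete, here is a minimal Python sketch, assuming a shared random linear code as the ECC and repeating the coordinate test $t$ times (the code, the sizes, and $t$ are illustrative choices, not from the slides):

```python
import numpy as np

rng = np.random.default_rng(0)   # stands in for Alice and Bob's shared randomness

n, N = 64, 4096                  # message length and code length (illustrative)
# Shared ECC E(x) = Ax mod 2 for a random binary matrix A: for x != y,
# each coordinate of E(x) and E(y) differs with probability 1/2 over A.
A = rng.integers(0, 2, size=(N, n))

def encode(x):
    return (A @ x) % 2

def equality(x, y, t=20):
    """Compare t shared random coordinates of E(x) and E(y).
    Always accepts if x == y; rejects w.p. about 1 - 2**-t otherwise."""
    idx = rng.integers(0, N, size=t)     # shared random indices i <- [N]
    return bool(np.array_equal(encode(x)[idx], encode(y)[idx]))

x = rng.integers(0, 2, size=n)
y = x.copy(); y[0] ^= 1                  # differs from x in one bit
print(equality(x, x), equality(x, y))    # True False (w.h.p.)
```

The communication is the $t$ exchanged coordinate values; the indices cost nothing since they come from the shared randomness.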

6 Modelling Shared Context + Imperfection
Many possibilities; an ongoing effort. Three sources of imperfection: (1) Inputs: Alice and Bob may only have estimates of $x$ and $y$; more generally, $x$ and $y$ are correlated. (2) Functionality: the function $f$ that Bob wants to compute may not be exactly known to Alice. (3) Shared randomness: Alice and Bob may not have identical copies.

7 Part 1: Uncertain Compression

8 Classical (One-Shot) Compression
Sender and receiver share a distribution $P$ over $[N]$ and agree on an encoder/decoder pair $E/D$. Sender gets $X\in[N]$ and sends $E(X)$. Receiver gets $Y=E(X)$ and decodes $\hat{X}=D(Y)$. Requirement: $\hat{X}=X$ (always). Performance measure: $\mathbb{E}_{X\leftarrow P}[|E(X)|]$. Trivial solution: $\mathbb{E}_{X\leftarrow P}[|E(X)|]=\log N$. Huffman coding achieves $\mathbb{E}_{X\leftarrow P}[|E(X)|]\le H(P)+1$.
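For reference, a short Python sketch of Huffman coding via a binary heap, checking the $H(P)+1$ guarantee empirically (the example distribution is an arbitrary assumption):

```python
import heapq, math

def huffman_lengths(P):
    """Codeword lengths of a Huffman code for distribution P over [N]."""
    # heap entries: (probability, unique counter, symbols in this subtree)
    heap = [(p, i, [i]) for i, p in enumerate(P)]
    heapq.heapify(heap)
    length = [0] * len(P)
    cnt = len(P)
    while len(heap) > 1:
        p1, _, s1 = heapq.heappop(heap)
        p2, _, s2 = heapq.heappop(heap)
        for s in s1 + s2:      # merging two subtrees deepens all their symbols
            length[s] += 1
        heapq.heappush(heap, (p1 + p2, cnt, s1 + s2)); cnt += 1
    return length

P = [0.5, 0.25, 0.125, 0.0625, 0.0625]
L = huffman_lengths(P)
avg = sum(p * l for p, l in zip(P, L))
H = -sum(p * math.log2(p) for p in P)
print(avg, H)   # avg <= H + 1; here avg == H since P is dyadic
```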

9 The (Uncertain Compression) problem
[Juba, Kalai, Khanna, S. '11] Design encoding/decoding schemes $E/D$ such that: sender has a distribution $P$ over $[N]$; receiver has a distribution $Q$ over $[N]$. Sender gets $X\in[N]$ and sends $E(P,X)$ to the receiver. Receiver gets $Y=E(P,X)$ and decodes $\hat{X}=D(Q,Y)$. Want $\hat{X}=X$ (provided $P$ and $Q$ are close), while minimizing $\mathbb{E}_{X\leftarrow P}[|E(P,X)|]$. Closeness: $\Delta(P,Q)\le\Delta$ if $|\log(P(x)/Q(x))|\le\Delta$ for all $x$. Motivation: models natural communication?

10 Solution (variant of Arith. Coding)
Uses shared randomness: sender and receiver both get $r\in\{0,1\}^*$. Use $r$ to define a "dictionary" of random sequences, one per symbol: $r_z = r_z[1], r_z[2], r_z[3], \ldots$ for each $z\in[N]$. Sender sends the prefix $r_x[1\ldots L]$ as the encoding of $x$. Receiver outputs $\mathrm{argmax}_z\,\{Q(z) : r_z[1\ldots L]=r_x[1\ldots L]\}$. Want $L$ such that: $r_z[1\ldots L]=r_x[1\ldots L] \Rightarrow Q(z)<Q(x)$; equivalently, $Q(z)>Q(x) \Rightarrow r_z[1\ldots L]\ne r_x[1\ldots L]$; which is implied by $P(z)>4^{-\Delta}P(x) \Rightarrow r_z[1\ldots L]\ne r_x[1\ldots L]$ (using $\Delta(P,Q)\le\Delta$). Analysis: $\mathbb{E}_r[L]=2\Delta+\log(1/P(x))$, so $\mathbb{E}_{x,r}[L]=2\Delta+H(P)$.
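A toy Python sketch of this scheme, assuming a truncated shared dictionary and a concrete $\Delta$-close pair $P,Q$ (all sizes and distributions below are illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(1)
N, MAXLEN, DELTA = 16, 200, 1.0
# Shared dictionary: a long random bit string r_z per symbol z
R = rng.integers(0, 2, size=(N, MAXLEN))

def encode(P, x):
    # shortest prefix length L killing every z with P(z) > 4**(-DELTA) * P(x)
    for L in range(1, MAXLEN + 1):
        rivals = [z for z in range(N) if z != x
                  and P[z] > 4.0 ** (-DELTA) * P[x]
                  and np.array_equal(R[z, :L], R[x, :L])]
        if not rivals:
            return R[x, :L]
    raise RuntimeError("MAXLEN too small")

def decode(Q, msg):
    L = len(msg)
    match = [z for z in range(N) if np.array_equal(R[z, :L], msg)]
    return max(match, key=lambda z: Q[z])   # argmax_z Q(z) among matches

P = rng.dirichlet(np.ones(N))
# Perturb so that |log2(P(z)/Q(z))| <= DELTA for all z after renormalizing
noise = 2.0 ** rng.uniform(-DELTA / 2, DELTA / 2, size=N)
Q = P * noise; Q /= Q.sum()
x = rng.choice(N, p=P)
assert decode(Q, encode(P, x)) == x
```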

11 Implications
The coding scheme reflects the nature of human communication: extend messages until they feel unambiguous. It captures the tension between ambiguity resolution and compression: the larger the (estimated) gap in context, the longer the encoding. Entropy is still a valid measure! The "shared randomness" assumption is a convenient starting point for discussion, but is the dictionary really independent of context? This is problematic.

12 Deterministic Compression: Challenge
Say Alice and Bob each have a ranking of $N$ players: bijections $\pi,\sigma:[N]\to[N]$, where $\pi(i)$ = rank of the $i$-th player in Alice's ranking. Further suppose they know the rankings are close: $\forall i\in[N]: |\pi(i)-\sigma(i)|\le 2$. Bob wants to know: is $\pi^{-1}(1)=\sigma^{-1}(1)$? (Do they agree on who is ranked first?) How many bits does Alice need to send (non-interactively)? With shared randomness: $O(1)$. Deterministically? $O(1)$? $O(\log N)$? $O(\log\log\log N)$? With Elad Haramaty: $O(\log^* N)$.

13 Part 2: Imperfectly Shared Randomness

14 Model: Imperfectly Shared Randomness
Alice gets $r$ and Bob gets $s$, where $(r,s)$ is an i.i.d. sequence of correlated pairs $(r_i,s_i)_i$: $r_i,s_i\in\{-1,+1\}$, $\mathbb{E}[r_i]=\mathbb{E}[s_i]=0$, $\mathbb{E}[r_i s_i]=\rho\ge 0$. Notation: $isr_\rho(f)$ = cc of $f$ with $\rho$-correlated bits; $cc(f)$ = cc with Perfectly Shared Randomness (so $cc(f)=isr_1(f)$); $priv(f)$ = cc with PRIVate randomness (so $priv(f)=isr_0(f)$). Starting point: for Boolean functions $f$, $cc(f)\le isr_\rho(f)\le priv(f)\le cc(f)+\log n$, and $\rho\le\tau \Rightarrow isr_\rho(f)\ge isr_\tau(f)$. What if $cc(f)\ll\log n$? E.g., $cc(f)=O(1)$?
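One standard way to sample such $\rho$-correlated bits (the slides don't fix a construction; this is a common one): set $s_i=r_i$ with probability $(1+\rho)/2$ and $s_i=-r_i$ otherwise, giving $\mathbb{E}[r_i s_i]=\rho$. A minimal sketch:

```python
import numpy as np

def correlated_bits(m, rho, rng):
    """i.i.d. pairs (r_i, s_i) in {-1,+1}^2 with E[r]=E[s]=0, E[r s]=rho."""
    r = rng.choice([-1, 1], size=m)
    flip = rng.random(m) < (1 - rho) / 2   # flip with probability (1-rho)/2
    s = np.where(flip, -r, r)
    return r, s

rng = np.random.default_rng(3)
r, s = correlated_bits(10**6, rho=0.5, rng=rng)
print(np.mean(r * s))   # ~0.5
```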

15 Imperfectly Shared Randomness: Results
Model first studied by [Bavarian, Gavinsky, Ito '14] ("independently and earlier"). Their focus: simultaneous communication; general models of correlation. They show $isr(\text{Equality})=O(1)$ (among other things). Our results [Canonne, Guruswami, Meka, S. '15]: Generally, $cc(f)\le k \Rightarrow isr(f)\le 2^k$. Converse: $\exists f$ with $cc(f)\le k$ and $isr(f)\ge 2^k$.

16 Equality Testing (our proof)
Key idea: think inner products. Encode $x\mapsto X=E(x)$, $y\mapsto Y=E(y)$, with $X,Y\in\{-1,+1\}^N$: $x=y \Rightarrow \langle X,Y\rangle=N$, and $x\ne y \Rightarrow \langle X,Y\rangle\le N/2$. Estimating inner products (the "Gaussian protocol", building on sketching protocols): Alice picks Gaussians $G_1,\ldots,G_t\in\mathbb{R}^N$ and sends the index $i\in[t]$ maximizing $\langle G_i,X\rangle$ to Bob. Bob accepts iff $\langle G'_i,Y\rangle\ge 0$, where $G'_1,\ldots,G'_t$ are his imperfect copies of the Gaussians. Analysis: $O_\rho(1)$ bits suffice if $G\approx_\rho G'$.
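A toy Python sketch of the Gaussian protocol; the random sign code below stands in for an unspecified encoding $E$, Bob's Gaussians are sampled $\rho$-correlated coordinate-wise, and all sizes are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(4)
n, N, t, rho = 32, 1024, 64, 0.9

# Stand-in encoding E (a shared random sign code): X = E(x) in {-1,+1}^N,
# so <X,Y> = N if x == y, and <X,Y> is near 0 here if x != y.
S = rng.integers(0, 2, size=(N, n))
def E(x):
    return 1 - 2 * ((S @ x) % 2)

def one_round(X, Y):
    G = rng.standard_normal((t, N))                                  # Alice's
    Gp = rho * G + np.sqrt(1 - rho**2) * rng.standard_normal((t, N)) # Bob's
    i = int(np.argmax(G @ X))      # Alice sends i: log t bits
    return Gp[i] @ Y >= 0          # Bob accepts iff <G'_i, Y> >= 0

x = rng.integers(0, 2, size=n); y = x.copy(); y[0] ^= 1
eq  = np.mean([one_round(E(x), E(x)) for _ in range(200)])
neq = np.mean([one_round(E(x), E(y)) for _ in range(200)])
print(eq, neq)   # acceptance near 1 when x == y, near 1/2 otherwise
```

Repeating the round a constant number of times amplifies the gap between the two acceptance rates to the desired $2/3$ correctness.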

17 General One-Way Communication
Idea: all communication reduces to inner products. (For now, assume one-way $cc(f)\le k$.) For each random string $R$: Alice's message is $i_R\in[2^k]$; Bob's output is $f_R(i_R)$, where $f_R:[2^k]\to\{0,1\}$. With probability $\ge 2/3$ over $R$, $f_R(i_R)$ is the right answer.

18 General One-Way Communication
For each random string $R$: Alice's message is $i_R\in[2^k]$; Bob's output is $f_R(i_R)$, where $f_R:[2^k]\to\{0,1\}$; with probability $\ge 2/3$ over $R$, $f_R(i_R)$ is the right answer. Vector representation: $i_R\mapsto x_R\in\{0,1\}^{2^k}$ (unit coordinate vector); $f_R\mapsto y_R\in\{0,1\}^{2^k}$ (truth table of $f_R$). Then $f_R(i_R)=\langle x_R,y_R\rangle$, and the acceptance probability is proportional to $\langle X,Y\rangle$ where $X=(x_R)_R$, $Y=(y_R)_R$. The Gaussian protocol estimates inner products of unit vectors to within $\pm\epsilon$ with $O_\rho(1/\epsilon^2)$ communication.
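The reduction is mechanical; a few lines of Python (a toy $k$ and an arbitrary $f_R$, both assumptions for illustration) make the identity $f_R(i_R)=\langle x_R,y_R\rangle$ concrete:

```python
import numpy as np

rng = np.random.default_rng(5)
k = 3
i_R = rng.integers(0, 2**k)            # Alice's k-bit message for this R
f_R = rng.integers(0, 2, size=2**k)    # truth table of Bob's decision map

x_R = np.zeros(2**k); x_R[i_R] = 1     # unit coordinate vector
y_R = f_R.astype(float)                # truth table as a vector

assert f_R[i_R] == x_R @ y_R           # Bob's output is an inner product
```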

19 Two-way communication
Still decided by inner products. Simple lemma: there exist convex sets $K_A^k, K_B^k\subseteq\mathbb{R}^{2^k}$ describing private-coin $k$-bit communication strategies for Alice and Bob, such that the acceptance probability of strategies $\pi_A\in K_A^k$, $\pi_B\in K_B^k$ equals $\langle\pi_A,\pi_B\rangle$. Putting things together: Theorem: $cc(f)\le k \Rightarrow isr(f)\le O_\rho(2^k)$.

20 Part 3: Uncertain Functionality

21 Model
Bob wishes to compute $f(x,y)$; Alice knows $g\approx f$. Alice and Bob are given $g$ and $f$ explicitly (input size $\sim 2^n$). Modelling questions: What is $\approx$? Is it reasonable to expect to compute $f(x,y)$ exactly? E.g., if $f(x,y)=f'(x)$, Bob can't compute $f(x,y)$ without Alice communicating $x$. Answers: Assume $(x,y)\sim\{0,1\}^n\times\{0,1\}^n$ uniformly; $f\approx_\delta g$ if $\delta(f,g)\le\delta$; and it suffices to compute $h(x,y)$ for some $h\approx_\epsilon f$.

22 Results
Thm [Ghazi, Komargodski, Kothari, S.]: $\exists f,g,\mu$ s.t. $cc_{\mu,0.1}^{1way}(f), cc_{\mu,0.1}^{1way}(g)=1$ and $\delta_\mu(f,g)=o(1)$, but the uncertain communication is $\Omega(\sqrt{n})$. Thm [GKKS]: but not if $x\perp y$ (in the one-way setting; two-way, even two-round, is open!). Main idea: a canonical one-way protocol for $f$: Alice and Bob share random $y_1,\ldots,y_m\in\{0,1\}^n$; Alice sends $f(x,y_1),\ldots,f(x,y_m)$ to Bob. The protocol was used previously, but not as "canonical". The canonical protocol is robust when $f\approx g$.
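A brute-force Python sketch of the canonical protocol in the certain case ($f=g$), with toy sizes and a random $f$ given as an explicit truth table (all concrete choices below are illustrative assumptions). Bob decodes by picking any $x'$ consistent with Alice's bits; in the uncertain case Alice would send $g(x,y_j)$ instead and Bob would pick the $x'$ that best matches.

```python
import numpy as np

rng = np.random.default_rng(6)
n, m = 8, 24                    # toy: n-bit inputs, m shared samples
f = rng.integers(0, 2, size=(2**n, 2**n))   # f as an explicit truth table

def canonical(x, y, table=f):
    ys = rng.integers(0, 2**n, size=m)   # shared random y_1..y_m
    msg = table[x, ys]                   # Alice sends f(x, y_1), ..., f(x, y_m)
    # Bob: any x' agreeing with the message on y_1..y_m looks like x
    xhat = int(np.flatnonzero((table[:, ys] == msg).all(axis=1))[0])
    return table[xhat, y]

x, y = rng.integers(0, 2**n), rng.integers(0, 2**n)
assert canonical(x, y) == f[x, y]   # w.h.p.: an impostor survives w.p. 2**-m
```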

23 Conclusions
A positive view of communication complexity: communication with a focus can be effective! Context is important: it adds a new layer of uncertainty, brings a new notion of scale (context is LARGE), and makes $o(\log n)$ additive factors matter. Many "uncertain" problems can be solved without resolving the uncertainty. Many open directions and questions (which is a good thing).

24 Thank You!

