Imperfectly Shared Randomness in Communication
Madhu Sudan (Microsoft Research)
Joint work with Clément Canonne (Columbia), Venkatesan Guruswami (CMU), and Raghu Meka (UCLA).
03/26/2015, MSR Theory Day
Communication Complexity: the model (with shared randomness)
Alice holds x, Bob holds y; they share randomness R = $$$ and want to compute f: (x,y) ↦ Σ.
CC(f) = # bits exchanged by the best protocol that outputs f(x,y) w.p. 2/3.
Communication Complexity: Motivation
- Standard use: lower bounds (circuit complexity, streaming, data structures, extended formulations, ...).
- This talk: communication complexity as a model for communication itself.
- Food for thought: Shannon vs. Yao? Which is the right model?
- Typical (human-human or computer-computer) communication involves large context and brief communication; contexts are imperfectly shared.
This Talk
- Example problems with low communication complexity.
- A new form of uncertainty, and how to overcome it.
Aside: Easy CC Problems
- Equality testing: EQ(x,y) = 1 ⇔ x = y; CC(EQ) = O(1).
  Protocol: fix an ECC E: {0,1}^n → {0,1}^N; shared randomness: i ← [N]; exchange E(x)_i, E(y)_i; accept iff E(x)_i = E(y)_i.
- Hamming distance: H_k(x,y) = 1 ⇔ Δ(x,y) ≤ k; CC(H_k) = O(k log k) [Huang et al.].
- Small set intersection: ∩_k(x,y) = 1 ⇔ wt(x), wt(y) ≤ k and ∃i s.t. x_i = y_i = 1; CC(∩_k) = O(k) [Håstad, Wigderson].
  Protocol: use common randomness to hash [n] → [poly(k)] (e.g. [k^2]).
- Gap (Real) Inner Product: x, y ∈ ℝ^n with ‖x‖_2 = ‖y‖_2 = 1; GIP_{c,s}(x,y) = 1 if ⟨x,y⟩ ≥ c, = 0 if ⟨x,y⟩ ≤ s, where ⟨x,y⟩ ≜ Σ_i x_i y_i; CC(GIP_{c,s}) = O(1/(c−s)^2) [Alon, Matias, Szegedy].
  Main insight: if G ← N(0,1)^n, then 𝔼[⟨G,x⟩ ⋅ ⟨G,y⟩] = ⟨x,y⟩.
Thanks to Badih Ghazi and Pritish Kamath.
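A minimal sketch of the equality protocol above, with one illustrative choice of code: taking E(x)_r = ⟨x,r⟩ mod 2 over shared random r (a Hadamard-code coordinate), so unequal inputs disagree on a random coordinate w.p. 1/2, and a few repetitions drive the error down. The function name and the use of a seeded generator to stand in for shared randomness are this sketch's own conventions, not the talk's.

```python
import random

def equality_test(x, y, reps=20, seed=0):
    """EQ protocol sketch: shared randomness picks coordinates of a code.
    Here E(x)_r = <x, r> mod 2 (Hadamard-code coordinate), repeated
    `reps` times, so error <= 2^-reps while communication is O(reps)."""
    rng = random.Random(seed)  # stands in for the shared random string
    n = len(x)
    for _ in range(reps):
        r = [rng.randint(0, 1) for _ in range(n)]      # shared random coordinate
        ax = sum(xi & ri for xi, ri in zip(x, r)) % 2  # Alice's 1-bit message
        by = sum(yi & ri for yi, ri in zip(y, r)) % 2  # Bob's 1-bit message
        if ax != by:
            return False  # coordinates differ, so x != y for sure
    return True           # accept; wrong only w.p. <= 2^-reps when x != y
```

Note the communication is 2·reps bits, independent of n, matching CC(EQ) = O(1).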
Uncertainty in Communication
Overarching question: are there communication mechanisms that can overcome uncertainty?
What is uncertainty? In this talk: Alice and Bob don't share randomness perfectly, only approximately.
[Figure: Alice (A) and Bob (B) computing f with shared randomness R.]
Rest of This Talk
- Model: imperfectly shared randomness.
- Positive results: coping with imperfectly shared randomness.
- Negative results: analyzing the weakness of imperfectly shared randomness.
Model: Imperfectly Shared Randomness
Alice ← r and Bob ← s, where (r,s) is an i.i.d. sequence of correlated pairs (r_i, s_i)_i: r_i, s_i ∈ {−1,+1}; 𝔼[r_i] = 𝔼[s_i] = 0; 𝔼[r_i s_i] = ρ ≥ 0.
Notation:
- isr_ρ(f): cc of f with ρ-correlated bits.
- cc(f): Perfectly Shared Randomness cc (= isr_1(f)).
- priv(f): cc with PRIVate randomness (= isr_0(f)).
Starting point: for Boolean functions f,
cc(f) ≤ isr_ρ(f) ≤ priv(f) ≤ cc(f) + log n,
and ρ ≤ τ ⇒ isr_ρ(f) ≥ isr_τ(f).
What if cc(f) ≪ log n? E.g. cc(f) = O(1)?
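The correlated-pairs distribution above is easy to sample: draw r_i uniform in {−1,+1} and set s_i = r_i with probability (1+ρ)/2, so that 𝔼[r_i s_i] = (1+ρ)/2 − (1−ρ)/2 = ρ while both marginals stay uniform. A small stdlib sketch (function name is ours):

```python
import random

def correlated_pairs(rho, n, seed=0):
    """Sample n pairs (r_i, s_i) in {-1,+1}^2 with E[r_i] = E[s_i] = 0
    and E[r_i s_i] = rho: keep s_i = r_i w.p. (1+rho)/2, flip otherwise."""
    rng = random.Random(seed)
    pairs = []
    for _ in range(n):
        r = rng.choice((-1, 1))                          # Alice's bit, uniform
        s = r if rng.random() < (1 + rho) / 2 else -r    # Bob's correlated copy
        pairs.append((r, s))
    return pairs
```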
Results
Model first studied ("independently and earlier") by [Bavarian, Gavinsky, Ito '14]. Their focus: simultaneous communication and general models of correlation. They show isr(Equality) = O(1), among other things.
Our results:
- Generally: cc(f) ≤ k ⇒ isr(f) ≤ 2^k.
- Converse: ∃f with cc(f) ≤ k and isr(f) ≥ 2^k.
Equality Testing (our proof)
Key idea: think inner products. Encode x ↦ X = E(x), y ↦ Y = E(y), with X, Y ∈ {−1,+1}^N:
- x = y ⇒ ⟨X,Y⟩ = N
- x ≠ y ⇒ ⟨X,Y⟩ ≤ N/2
Estimating inner products (building on sketching protocols), the Gaussian Protocol:
- Alice: picks Gaussians G_1, ..., G_t ∈ ℝ^N; sends the i ∈ [t] maximizing ⟨G_i, X⟩ to Bob.
- Bob: accepts iff ⟨G'_i, Y⟩ ≥ 0.
Analysis: O_ρ(1) bits suffice if G ≈_ρ G'.
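A simulation sketch of one run of the Gaussian protocol. The coordinate-wise ρ-correlated Gaussians are generated as G' = ρG + √(1−ρ²)·H for independent H; the function name, parameter defaults, and this particular way of producing the correlated copies are illustrative choices, not part of the talk.

```python
import math
import random

def gaussian_protocol(X, Y, rho, t=64, seed=0):
    """One run of the Gaussian protocol under imperfectly shared
    randomness. Alice's G_i and Bob's G'_i are coordinate-wise
    rho-correlated Gaussians; Alice sends the index i maximizing
    <G_i, X> (log t bits), and Bob accepts iff <G'_i, Y> >= 0."""
    rng = random.Random(seed)
    best_i, best_val, bob_vals = 0, -math.inf, []
    for i in range(t):
        G = [rng.gauss(0, 1) for _ in X]              # Alice's Gaussian
        H = [rng.gauss(0, 1) for _ in X]              # independent noise
        Gp = [rho * g + math.sqrt(1 - rho * rho) * h  # Bob's rho-correlated copy
              for g, h in zip(G, H)]
        a = sum(g * x for g, x in zip(G, X))          # Alice's projection
        bob_vals.append(sum(gp * y for gp, y in zip(Gp, Y)))
        if a > best_val:
            best_i, best_val = i, a                   # index Alice sends
    return bob_vals[best_i] >= 0                      # Bob's decision
```

For X = Y the maximized ⟨G_i, X⟩ is large and positive, so ⟨G'_i, Y⟩ ≈ ρ·⟨G_i, X⟩ is positive w.h.p.; for ⟨X,Y⟩ strongly negative the sign flips, giving the gap the analysis exploits.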
General One-Way Communication
Idea: all communication reduces to inner products. (For now: assume one-way-cc(f) ≤ k.)
For each random string R:
- Alice's message: i_R ∈ [2^k].
- Bob's output: f_R(i_R), where f_R: [2^k] → {0,1}.
W.p. ≥ 2/3 over R, f_R(i_R) is the right answer.
General One-Way Communication (contd.)
Vector representation:
- i_R ↦ x_R ∈ {0,1}^{2^k} (unit coordinate vector)
- f_R ↦ y_R ∈ {0,1}^{2^k} (truth table of f_R)
Then f_R(i_R) = ⟨x_R, y_R⟩, and the acceptance probability ∝ ⟨X,Y⟩, where X = (x_R)_R and Y = (y_R)_R.
The Gaussian protocol estimates inner products of unit vectors to within ±ε with O_ρ(1/ε²) communication.
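The vector representation for a single random string R can be checked in a few lines: the unit vector for Alice's message paired with the truth table of Bob's post-processing recovers f_R(i_R) as an inner product. The function name and the tiny k = 2 example are ours.

```python
def protocol_as_inner_product(i_R, f_R_table):
    """For one shared random string R: Alice's message i_R becomes the
    unit coordinate vector x_R, Bob's post-processing f_R becomes its
    truth table y_R, and f_R(i_R) = <x_R, y_R>."""
    K = len(f_R_table)                                 # K = 2^k possible messages
    x_R = [1 if j == i_R else 0 for j in range(K)]     # unit coordinate vector
    y_R = f_R_table                                    # truth table of f_R
    return sum(a * b for a, b in zip(x_R, y_R))        # equals f_R(i_R)

# hypothetical k = 2 round: message i_R = 3, truth table f_R = [0, 1, 1, 0]
```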
Two-Way Communication
Still decided by inner products.
Simple lemma: ∃ convex sets K_A^k, K_B^k ⊆ ℝ^{2^k} that describe private-coin k-bit communication strategies for Alice and Bob, s.t. the acceptance probability of π_A ∈ K_A^k, π_B ∈ K_B^k equals ⟨π_A, π_B⟩.
Putting things together:
Theorem: cc(f) ≤ k ⇒ isr(f) ≤ O_ρ(2^k).
Main Technical Result: Matching Lower Bound
Theorem: there exists a (promise) problem f s.t. cc(f) ≤ k but isr_ρ(f) ≥ exp(k).
The problem: Gap Sparse Inner Product (G-Sparse-IP).
- Alice gets sparse x ∈ {0,1}^n with wt(x) ≈ 2^{−k}·n.
- Bob gets y ∈ {0,1}^n.
- Promise: ⟨x,y⟩ ≥ (.9)·2^{−k}·n or ⟨x,y⟩ ≤ (.6)·2^{−k}·n. Decide which.
psr Protocol for G-Sparse-IP
Note: the Gaussian protocol takes O(2^k) bits; we need to do exponentially better.
Idea: x_i ≠ 0 ⇒ y_i is correlated with the answer. Use (perfectly) shared randomness to find a random index i s.t. x_i ≠ 0:
- Shared randomness: i_1, i_2, i_3, ... uniform over [n].
- Alice → Bob: the smallest j s.t. x_{i_j} ≠ 0.
- Bob: accept iff y_{i_j} = 1.
Expect j ≈ 2^k, so cc ≤ k.
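A sketch of the psr protocol above, with a shared seed standing in for the common random index stream (the function name is ours). Conditioned on x_{i_j} ≠ 0, the shared index is uniform over x's support, so the acceptance probability is ⟨x,y⟩/wt(x), separating the two promise cases (≥ .9 vs ≤ .6).

```python
import random

def psr_sparse_ip(x, y, shared_seed=0):
    """psr protocol for G-Sparse-IP: the shared randomness is a stream
    of uniform indices i_1, i_2, ...; Alice sends the smallest j with
    x[i_j] != 0 (about k bits, since E[j] ~ 2^k when wt(x) ~ 2^-k * n),
    and Bob accepts iff y[i_j] = 1."""
    rng = random.Random(shared_seed)  # perfectly shared random index stream
    n = len(x)
    while True:
        i = rng.randrange(n)          # next shared index i_j
        if x[i] != 0:                 # Alice's stopping rule
            return y[i] == 1          # Bob's decision

# acceptance probability = <x, y> / wt(x), since the stopping index
# is uniform over the support of x
```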
Lower Bound Idea
Catch: ∀ distribution, ∃ a deterministic protocol with communication ≤ k. So we must fix the strategy first and construct the distribution later.
Main idea: what can a low-isr protocol look like?
- It can look like the G-Sparse-IP protocol, but that implies the players can agree on a common index to focus on, and agreement is hard.
- It can ignore sparsity, but that requires 2^k bits.
- It can look like anything! Handled via the invariance principle [MOO, Mossel, ...].
Summarizing
- k bits of communication with perfect sharing → 2^k bits with imperfect sharing; this is tight.
- Tools: an invariance principle for communication, agreement distillation, low-influence strategies.
Conclusions
- Imperfect agreement of context is important: a new layer of uncertainty to deal with.
- Notion of scale (context is LARGE).
Many open directions and questions for imperfectly shared randomness:
- One-sided error?
- Does interaction ever help?
- How much randomness is needed?
- More general forms of correlation?
Thank You!