Imperfectly Shared Randomness in Communication
Madhu Sudan (Harvard)
Joint work with Clément Canonne (Columbia), Venkatesan Guruswami (CMU) and Raghu Meka (UCLA).
UofT, 11/16/2016
Classical theory of communication
Shannon (1948): a clean architecture for reliable communication. Remarkable mathematical discoveries: the probabilistic method, entropy, (mutual) information, compression schemes, error-correcting codes. But does Bob really need all of $x$?
[Diagram: Alice feeds $x$ into an encoder; the channel output is decoded for Bob.]
[Yao'80] Communication Complexity
The model (with shared randomness): Alice holds $x$, Bob holds $y$, and they share randomness $R$; they wish to compute $f : (x,y) \mapsto \Sigma$. $CC(f)$ = number of bits exchanged by the best protocol that outputs $f(x,y)$ w.p. 2/3.
Communication Complexity - Motivation
Standard use: lower bounds for circuit complexity, streaming, data structures, extended formulations, ... This talk: communication complexity as a model for communication. Food for thought: Shannon vs. Yao, which is the right model? Typical (human-human or computer-computer) communication involves large context and brief communication, and the contexts are imperfectly shared.
This talk
Example problems with low communication complexity. A new form of uncertainty, and how to overcome it.
Aside: Easy CC Problems
Equality testing: $Eq(x,y)=1 \Leftrightarrow x=y$; $CC(Eq)=O(1)$.
Hamming distance: $H_k(x,y)=1 \Leftrightarrow \Delta(x,y) \le k$; $CC(H_k)=O(k \log k)$ [Huang et al.].
Small set intersection: $\cap_k(x,y)=1 \Leftrightarrow \mathrm{wt}(x), \mathrm{wt}(y) \le k$ and $\exists i$ s.t. $x_i = y_i = 1$; $CC(\cap_k)=O(k)$ [Håstad, Wigderson]. Protocol: use common randomness to hash $[n] \to [k^2]$.
Gap (real) inner product: $x, y \in \mathbb{R}^n$ with $\|x\|_2 = \|y\|_2 = 1$; $GIP(x,y)=1$ if $\langle x,y \rangle \ge \epsilon$, $=0$ if $\langle x,y \rangle \le 0$, where $\langle x,y \rangle = \sum_i x_i y_i$; $CC(GIP)=O(1/\epsilon^2)$ [Alon, Matias, Szegedy].
Equality protocol: fix an ECC $E: \{0,1\}^n \to \{0,1\}^N$; use shared randomness to pick $i \in [N]$; exchange $E(x)_i$, $E(y)_i$; accept iff $E(x)_i = E(y)_i$.
Gap inner product protocol: sketch with common randomness. Main insight: if $G \sim N(0,1)^n$, then $\mathbb{E}[\langle G,x \rangle \cdot \langle G,y \rangle] = \langle x,y \rangle$.
Summary from [Ghazi, Kamath, S.'16].
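A runnable sketch of the equality protocol (one assumption: in place of a generic ECC we use random parities, i.e. the Hadamard-style encoding $E(x)_r = \langle x,r \rangle \bmod 2$, so each round catches $x \ne y$ with probability 1/2):

```python
import random

def parity(x, r):
    # One coordinate of the encoding: inner product mod 2.
    return sum(xi & ri for xi, ri in zip(x, r)) % 2

def equality_protocol(x, y, n, rounds=10, seed=42):
    rng = random.Random(seed)  # shared randomness: both parties use the same seed
    for _ in range(rounds):
        r = [rng.randint(0, 1) for _ in range(n)]
        if parity(x, r) != parity(y, r):  # Alice's bit vs. Bob's bit
            return False                  # definitely unequal
    return True                           # equal, or error w.p. <= 2^-rounds

x = [1, 0, 1, 1]; y = [1, 0, 1, 1]; z = [1, 1, 1, 1]
print(equality_protocol(x, y, 4))  # True
print(equality_protocol(x, z, 4))  # False w.h.p.
```

The communication is just `rounds` bits in each direction, independent of $n$.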
Uncertainty in Communication
Overarching question: are there communication mechanisms that can overcome uncertainty? What is uncertainty? This talk: Alice and Bob don't share randomness perfectly, only approximately.
[Diagram: Alice (A) and Bob (B) compute $f$ using correlated randomness $R$.]
Rest of this talk
Model: imperfectly shared randomness. Positive results: coping with imperfectly shared randomness. Negative results: analyzing the weakness of imperfectly shared randomness.
Model: Imperfectly Shared Randomness
Alice has $r$; Bob has $s$; $(r,s)$ is an i.i.d. sequence of correlated pairs $(r_i, s_i)_i$ with $r_i, s_i \in \{-1,+1\}$, $\mathbb{E}[r_i] = \mathbb{E}[s_i] = 0$, and $\mathbb{E}[r_i s_i] = \rho \ge 0$.
Notation: $isr_\rho(f)$ = cc of $f$ with $\rho$-correlated bits; $psr(f)$ = cc with Perfectly Shared Randomness; $priv(f)$ = cc with PRIVate randomness.
Starting point: for Boolean functions $f$, $psr(f) \le isr_\rho(f) \le priv(f) \le psr(f) + \log n$. What if $psr(f) \ll \log n$? E.g., $psr(f) = O(1)$.
Note: $psr(f) = isr_1(f)$ and $priv(f) = isr_0(f)$; $\rho \le \rho' \Rightarrow isr_\rho(f) \ge isr_{\rho'}(f)$.
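A sketch of how this randomness can be sampled (assuming the standard construction: $s_i = r_i$ with probability $(1+\rho)/2$, else $-r_i$, which gives $\mathbb{E}[r_i s_i] = \rho$):

```python
import random

def correlated_bits(n, rho, seed=None):
    # Returns r (Alice's string) and s (Bob's rho-correlated string).
    rng = random.Random(seed)
    r, s = [], []
    for _ in range(n):
        ri = rng.choice([-1, +1])                            # uniform +-1 bit
        si = ri if rng.random() < (1 + rho) / 2 else -ri     # flip w.p. (1-rho)/2
        r.append(ri); s.append(si)
    return r, s

r, s = correlated_bits(100000, rho=0.8, seed=1)
print(sum(ri * si for ri, si in zip(r, s)) / len(r))  # empirical correlation ~0.8
```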
Distill Perfect Randomness from ISR?
Agreement distillation: Alice has $r$, Bob has $s$; $(r,s)$ are $\rho$-correlated unbiased bits. Outputs: Alice produces $u$, Bob produces $v$, with $H_\infty(u), H_\infty(v) \ge k$. Communication = $c$ bits. What is the maximum probability $\gamma$ of agreement ($u = v$)?
Well studied in the literature!
[Ahlswede-Csiszár '70s]: $\gamma \to 1 \Rightarrow c = \Omega(k)$ (one-way).
[Bogdanov-Mossel '2000s]: $c = 0 \Rightarrow \gamma \le \exp(-k)$.
[Our work]: $\gamma = \exp(-k + O(c))$ (two-way).
Summary: can't distill randomness!
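For intuition, a sketch of one naive zero-communication strategy (an illustration, not from the talk): each party simply outputs its own first $k$ bits. The agreement probability decays like $((1+\rho)/2)^k$, consistent in spirit with the $\exp(-k)$ bound.

```python
import random

def naive_agreement(rho, k, trials=20000, seed=3):
    # Both parties output their own first k bits; estimate Pr[u = v].
    rng = random.Random(seed)
    agree = 0
    for _ in range(trials):
        ok = True
        for _ in range(k):
            r = rng.choice([-1, +1])
            s = r if rng.random() < (1 + rho) / 2 else -r
            ok &= (r == s)
        agree += ok
    return agree / trials

print(naive_agreement(rho=0.9, k=10))  # ~0.95^10 ~ 0.60
print(naive_agreement(rho=0.9, k=50))  # ~0.95^50 ~ 0.08
```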
Results
Model first studied by [Bavarian, Gavinsky, Ito '14] ("independently and earlier"). Their focus: simultaneous communication; general models of correlation. They show $isr(\text{Equality}) = O(1)$ (among other things).
Our results:
Generally: $psr(f) \le k \Rightarrow isr_\rho(f) \le 2^k$.
Converse: $\exists f$ with $psr(f) \le k$ and $isr_\rho(f) \ge 2^k$.
Equality Testing (our proof)
Key idea: think inner products. Encode $x \mapsto X = E(x)$ and $y \mapsto Y = E(y)$ with $X, Y \in \{-1,+1\}^N$, so that
$x = y \Rightarrow \langle X, Y \rangle = N$; $x \ne y \Rightarrow \langle X, Y \rangle \le N/2$.
Estimating inner products: building on sketching protocols ...
Gaussian protocol: Alice picks Gaussians $G_1, \dots, G_t \in \mathbb{R}^N$ and sends the index $i \in [t]$ maximizing $\langle G_i, X \rangle$ to Bob. Bob accepts iff $\langle G'_i, Y \rangle \ge 0$.
Analysis: $O_\rho(1)$ bits suffice if $G \approx_\rho G'$ (Bob's Gaussians are $\rho$-correlated with Alice's).
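A runnable sketch of the Gaussian protocol under imperfectly shared Gaussians (assumptions: $t = 16$ and the $\pm 1$ test strings are toy choices; Bob's $G'$ is modeled as $\rho G + \sqrt{1-\rho^2}\, H$ for fresh Gaussian noise $H$):

```python
import math, random

def gaussian_protocol(X, Y, t=16, rho=0.9, seed=0):
    rng = random.Random(seed)
    N = len(X)
    # Alice's Gaussians, and Bob's noisy (rho-correlated) copies of them.
    G = [[rng.gauss(0, 1) for _ in range(N)] for _ in range(t)]
    H = [[rng.gauss(0, 1) for _ in range(N)] for _ in range(t)]
    Gp = [[rho * g + math.sqrt(1 - rho**2) * h for g, h in zip(G[i], H[i])]
          for i in range(t)]
    dot = lambda u, v: sum(a * b for a, b in zip(u, v))
    i = max(range(t), key=lambda j: dot(G[j], X))  # Alice sends i: log t bits
    return dot(Gp[i], Y) >= 0                      # Bob's accept rule

N = 200
X = [random.choice([-1, 1]) for _ in range(N)]
Z = [random.choice([-1, 1]) for _ in range(N)]
print(gaussian_protocol(X, X))  # <X,X> = N: accepts w.h.p.
print(gaussian_protocol(X, Z))  # uncorrelated Z: accepts w.p. ~1/2 only
```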
General One-Way Communication
Idea: all communication reduces to inner products. (For now: assume the one-way cc of $f$ is at most $k$.) For each random string $R$:
Alice's message = $i_R \in [2^k]$;
Bob's output = $f_R(i_R)$, where $f_R : [2^k] \to \{0,1\}$.
W.p. $\ge 2/3$ over $R$, $f_R(i_R)$ is the right answer.
Vector representation:
$i_R \mapsto x_R \in \{0,1\}^{2^k}$ (unit coordinate vector);
$f_R \mapsto y_R \in \{0,1\}^{2^k}$ (truth table of $f_R$).
Then $f_R(i_R) = \langle x_R, y_R \rangle$, and the acceptance probability equals $\langle X, Y \rangle$, where $X = (x_R)_R$ and $Y = (y_R)_R$.
The Gaussian protocol estimates inner products of unit vectors to within $\pm\epsilon$ with $O_\rho(1/\epsilon^2)$ communication.
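A toy instance of the vector representation (assumption: a 1-bit one-way protocol for Equality on 2-bit inputs, where the shared random string $R$ selects a parity; this protocol is a hypothetical example, not from the talk):

```python
import itertools

k = 1
def msg(x, R):                 # Alice's message i_R in [2^k]: one parity bit
    return sum(xi & ri for xi, ri in zip(x, R)) % 2

def acceptance_probability(x, y):
    Rs = list(itertools.product([0, 1], repeat=len(x)))  # all random strings
    acc = 0.0
    for R in Rs:
        x_R = [0] * 2**k; x_R[msg(x, R)] = 1                     # unit vector
        y_R = [1 if msg(y, R) == m else 0 for m in range(2**k)]  # truth table of f_R
        acc += sum(a * b for a, b in zip(x_R, y_R))              # <x_R, y_R> = f_R(i_R)
    return acc / len(Rs)       # = <X, Y>, averaged over R

print(acceptance_probability((1, 0), (1, 0)))  # 1.0: equal inputs always accepted
print(acceptance_probability((1, 0), (0, 1)))  # 0.5: unequal inputs
```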
Two-way communication
Still decided by inner products. Simple lemma: there exist convex sets $K_A^k, K_B^k \subseteq \mathbb{R}^{2^k}$ that describe private-coin $k$-bit communication strategies for Alice and Bob, such that the acceptance probability of strategies $P_A \in K_A^k$, $P_B \in K_B^k$ equals $\langle P_A, P_B \rangle$.
Putting things together: Theorem: $psr(f) \le k \Rightarrow isr_\rho(f) \le O_\rho(2^k)$.
Main Technical Result: Matching lower bound
Theorem: there exists a (promise) problem $f$ s.t. $psr(f) \le k$ but $isr_\rho(f) \ge \exp(k)$.
The problem: Gap Sparse Inner Product (G-Sparse-IP). Alice gets a sparse $x \in \{0,1\}^n$ with $\mathrm{wt}(x) \approx 2^{-k} \cdot n$; Bob gets $y \in \{0,1\}^n$.
Promise: $\langle x,y \rangle \ge (.9)\, 2^{-k} \cdot n$ or $\langle x,y \rangle \le (.6)\, 2^{-k} \cdot n$. Decide which.
psr Protocol for G-Sparse-IP
Note: the Gaussian protocol takes $\sim 2^{2k}$ bits; we need to do exponentially better.
Idea: $x_i \ne 0 \Rightarrow y_i$ is correlated with the answer. Use (perfectly) shared randomness to find a random index $i$ with $x_i \ne 0$:
Shared randomness: $i_1, i_2, i_3, \dots$, uniform over $[n]$.
Alice $\to$ Bob: smallest index $j$ s.t. $x_{i_j} \ne 0$.
Bob: accept iff $y_{i_j} = 1$.
Expect $j \approx 2^k$; cc $= O(k)$.
Lower Bound Idea
Catch: for every distribution, there exists a deterministic protocol with communication $\le k$. So we need to fix the strategy first and construct the distribution later.
Main idea:
The protocol can look like the G-Sparse-IP protocol, but that implies the players can agree on a common index to focus on ... and agreement is hard.
The protocol can ignore sparsity, but that requires $2^k$ bits.
The protocol can look like anything! Invariance principle [MOO, Mossel ...].
Invariance Principle [MOO'08]
Informally: analyzing probabilistic processes over bits is often hard; analyzing related processes over Gaussian variables is easy. "Invariance principle": under sufficiently general conditions, probabilities of events are "invariant" when switching from bits to Gaussians.
Theorem: for every convex $K_1, K_2 \subseteq [-1,1]^\ell$ there exist transformations $T_1, T_2$ s.t. if $f: \{0,1\}^n \to K_1$ and $g: \{0,1\}^n \to K_2$ have no common influential variable, then $F = T_1(f): \mathbb{R}^n \to K_1$ and $G = T_2(g): \mathbb{R}^n \to K_2$ satisfy
$\mathbb{E}_{x,y}[\langle f(x), g(y) \rangle] \approx \mathbb{E}_{X,Y}[\langle F(X), G(Y) \rangle]$.
Summarizing
$k$ bits of communication with perfect sharing $\to$ $2^k$ bits with imperfect sharing. This is tight.
Key ingredients: an invariance principle for communication, agreement distillation, low-influence strategies.
Conclusions
Imperfect agreement of context is important: it introduces a new layer of uncertainty, and a notion of scale (the context is LARGE). Many open directions and questions for imperfectly shared randomness: One-sided error? Does interaction ever help? How much randomness is needed? More general forms of correlation?
Thank You!
Towards a lower bound: Ruling out a natural approach
Can Alice and Bob use (many) correlated bits to agree perfectly on a few random bits? For G-Sparse-IP they would need $O(2^k \log n)$ random bits.
Agreement distillation problem: Alice and Bob exchange $t$ bits and each generate $k$ random bits, agreeing with probability $\gamma$.
Lower bound [Bogdanov, Mossel]: $t \ge k - O(\log(1/\gamma))$.
Towards Lower Bound
Explaining two natural protocols. First, the Gaussian inner product protocol: ignore sparsity and just estimate the inner product. It uses $\sim 2^{2k}$ bits; we need to prove this can't be improved!
Optimality of Gaussian Protocol
Problem: $(x,y) \sim D^n$, where $D = D_{YES}$ or $D_{NO}$ is supported on $\mathbb{R} \times \mathbb{R}$: $D_{YES}$ = $\epsilon$-correlated Gaussians; $D_{NO}$ = uncorrelated Gaussians.
Lemma: separating $D_{YES}^n$ vs. $D_{NO}^n$ requires $\Omega(\epsilon^{-1})$ bits of communication. Proof: reduction from Disjointness.
Conclusion: can't ignore sparsity!
Towards Lower Bound (continued)
The second natural protocol, with perfectly shared randomness: Alice and Bob agree on coordinates to focus on: $(i_1, i_2, \dots, i_{2^k}, \dots)$. Either $i_1$ has high entropy (over the choice of $(r,s)$), which violates the agreement distillation bound; or it has low entropy, in which case we can fix distributions of $x, y$ s.t. $x_{i_1} \perp y_{i_1}$.
Aside: Distributional lower bounds
Challenge: usual CC lower bounds are distributional.
$psr$(G-Sparse-IP) $\le k$ for all inputs
$\Rightarrow$ dist-cc(G-Sparse-IP) $\le k$ for all distributions
$\Rightarrow$ det-cc(G-Sparse-IP) $\le k$ for all distributions.
So the usual approach can't work ... we need to fix the strategy first and then "identify" a hard distribution for that strategy.
Towards lower bound
Summary so far:
A symmetric strategy requires $2^k$ bits of communication.
If the strategy is asymmetric and $(x_1, y_1), \dots, (x_c, y_c)$ have high influence: fix the distribution so that these coordinates do not influence the answer.
If the strategy is asymmetric with a random coordinate having high influence: this violates the agreement lower bound.
Are these cases exhaustive? How do we prove it? The invariance principle!! [Mossel, O'Donnell, Oleszkiewicz], [Mossel] ...
ISR lower bound for G-Sparse-IP
One-way setting (for now). Strategies: Alice sends $f_r(x) \in [K]$; Bob computes $g_s(y) \in \{0,1\}^K$.
Distributions: if $x_i, y_i$ have high influence on $(f_r, g_s)$ w.h.p. over $(r,s)$, then set $x_i = y_i = 0$ [$i$ is BAD]. Else $y_i$ is correlated with $x_i$ in the YES case and independent in the NO case.
Analysis:
$i \in \mathrm{BAD}$ influential in both $f_r, g_s$ $\Rightarrow$ no help.
$i \notin \mathrm{BAD}$ influential $\Rightarrow$ violates the agreement lower bound.
No common influential variable $\Rightarrow$ $x, y$ can be replaced by Gaussians $\Rightarrow$ $2^k$ bits needed.
Invariance Principle + Challenges
Informal invariance principle: if $f, g$ are low-degree polynomials with no common influential variable, then $\mathbb{E}_{x,y}[f(x) g(y)] \approx \mathbb{E}_{X,Y}[F(X) G(Y)]$ (caveat: $f \to F$, $g \to G$), where $(x,y)$ is a Boolean $\rho$-correlated product distribution and $(X,Y)$ a Gaussian $\rho$-correlated product distribution.
Challenges [+ solutions]:
Our functions are not low-degree [smoothen].
Our functions are not real-valued: $g: \{0,1\}^n \to \{0,1\}^\ell$ [truncate the range to $[0,1]^\ell$]; $f: \{0,1\}^n \to [\ell]$ [???; work with the simplex $\Delta_\ell$].
Invariance Principle + Challenges (continued)
Informal invariance principle: $f, g$ low-degree polynomials with no common influential variable $\Rightarrow \mathbb{E}_{x,y}[f(x) g(y)] \approx \mathbb{E}_{X,Y}[F(X) G(Y)]$ (caveat: $f \to F$, $g \to G$).
Challenges:
Our functions are not low-degree [smoothen].
Our functions are not real-valued [truncate].
The quantity of interest is not $f(x) \cdot g(y)$ ... [but it can be expressed as an inner product] ... (lots of grunge work ...).
The result is a relevant invariance principle (next).
Invariance Principle for CC
Theorem: for every convex $K_1, K_2 \subseteq [-1,1]^\ell$ there exist transformations $T_1, T_2$ s.t. if $f: \{0,1\}^n \to K_1$ and $g: \{0,1\}^n \to K_2$ have no common influential variable, then $F = T_1(f): \mathbb{R}^n \to K_1$ and $G = T_2(g): \mathbb{R}^n \to K_2$ satisfy
$\mathbb{E}_{x,y}[\langle f(x), g(y) \rangle] \approx \mathbb{E}_{X,Y}[\langle F(X), G(Y) \rangle]$.
Main differences: $f, g$ are vector-valued; the functions are transformed ($f \mapsto F$, $g \mapsto G$); the range is preserved exactly ($K_1 = \Delta_\ell$, $K_2 = [0,1]^\ell$)! So $F, G$ are still communication strategies!