Imperfectly Shared Randomness

1 Imperfectly Shared Randomness
in Communication. Madhu Sudan (Harvard). Joint work with Clément Canonne (Columbia), Venkatesan Guruswami (CMU), and Raghu Meka (UCLA). 11/16/2016, UofT: ISR in Communication

2 Classical theory of communication
Shannon (1948): a clean architecture for reliable communication, and remarkable mathematical discoveries: the probabilistic method, entropy, (mutual) information, compression schemes, error-correcting codes. But does Bob really need all of x? [Figure: Alice passes x through an encoder; a decoder delivers it to Bob.]

3 [Yao'80] Communication Complexity
The model (with shared randomness): Alice holds x, Bob holds y, and both see shared randomness R. They wish to compute f : (x, y) ↦ Σ. CC(f) = number of bits exchanged by the best protocol that outputs f(x, y) w.p. 2/3.

4 Communication Complexity - Motivation
Standard use: lower bounds (circuit complexity, streaming, data structures, extended formulations, ...). This talk: communication complexity as a model for communication. Food for thought: Shannon vs. Yao? Which is the right model? Typical (human-human or computer-computer) communication involves a large context and brief communication, and the contexts are imperfectly shared.

5 This talk
Example problems with low communication complexity. A new form of uncertainty, and how to overcome it.

6 Aside: Easy CC Problems
Equality testing: EQ(x, y) = 1 ⇔ x = y; CC(EQ) = O(1). Protocol: fix an error-correcting code E : {0,1}^n → {0,1}^N; shared randomness: i ← [N]; exchange E(x)_i, E(y)_i; accept iff E(x)_i = E(y)_i.
Hamming distance: H_k(x, y) = 1 ⇔ Δ(x, y) ≤ k; CC(H_k) = O(k log k) [Huang et al.]. Protocol: use common randomness to hash [n] → [k^2]; communication poly(k).
Small set intersection: ∩_k(x, y) = 1 ⇔ wt(x), wt(y) ≤ k and ∃i s.t. x_i = y_i = 1; CC(∩_k) = O(k) [Håstad-Wigderson].
Gap (real) inner product: x, y ∈ ℝ^n, ‖x‖_2 = ‖y‖_2 = 1; GIP_{c,s}(x, y) = 1 if ⟨x, y⟩ ≥ ε, = 0 if ⟨x, y⟩ ≤ 0, where ⟨x, y⟩ ≜ Σ_i x_i y_i; CC(GIP_{c,s}) = O(1/ε^2) [Alon-Matias-Szegedy]. Main insight: if G ← N(0,1)^n, then E[⟨G, x⟩ · ⟨G, y⟩] = ⟨x, y⟩.
Summary from [Ghazi, Kamath, S.'16].
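The equality protocol above can be made concrete with the Hadamard code, whose coordinate indexed by r is E(x)_r = ⟨x, r⟩ mod 2. A minimal sketch; the repetition over several trials and all names here are illustrative additions, not from the slides:

```python
import random

def eq_protocol(x, y, trials=40, seed=0):
    """Equality test with shared randomness: each trial exchanges one
    coordinate of the Hadamard encoding E(x)_r = <x, r> mod 2.
    If x != y, a uniformly random r catches the difference w.p. 1/2."""
    rng = random.Random(seed)  # stands in for the shared random string
    for _ in range(trials):
        r = [rng.randint(0, 1) for _ in x]
        a = sum(xi & ri for xi, ri in zip(x, r)) % 2  # Alice's 1-bit message
        b = sum(yi & ri for yi, ri in zip(y, r)) % 2  # Bob's local bit
        if a != b:
            return False  # certainly unequal
    return True  # equal w.p. >= 1 - 2^-trials

assert eq_protocol([1, 0, 1, 1], [1, 0, 1, 1])
assert not eq_protocol([0] * 50, [0] * 49 + [1])
```

Each trial costs two bits of communication, so the total is O(1) for any fixed error probability, independent of n.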

7 Uncertainty in Communication
Overarching question: are there communication mechanisms that can overcome uncertainty? What is uncertainty? This talk: Alice and Bob don't share randomness perfectly, only approximately. [Figure: Alice (A) and Bob (B) computing f with randomness R.]

8 Rest of this talk
Model: imperfectly shared randomness. Positive results: coping with imperfectly shared randomness. Negative results: analyzing the weakness of imperfectly shared randomness.

9 Model: Imperfectly Shared Randomness
Alice β†π‘Ÿ ; and Bob ←𝑠 where π‘Ÿ,𝑠 = i.i.d. sequence of correlated pairs π‘Ÿ 𝑖 , 𝑠 𝑖 𝑖 ; π‘Ÿ 𝑖 , 𝑠 𝑖 ∈{βˆ’1,+1};𝔼 π‘Ÿ 𝑖 =𝔼 𝑠 𝑖 =0;𝔼 π‘Ÿ 𝑖 𝑠 𝑖 =𝜌β‰₯0 . Notation: 𝑖𝑠 π‘Ÿ 𝜌 (𝑓) = cc of 𝑓 with 𝜌-correlated bits. 𝑐𝑐(𝑓): Perfectly Shared Randomness cc. π‘π‘Ÿπ‘–π‘£ 𝑓 : cc with PRIVate randomness Starting point: for Boolean functions 𝑓 𝑐𝑐 𝑓 ≀𝑖𝑠 π‘Ÿ 𝜌 𝑓 β‰€π‘π‘Ÿπ‘–π‘£ 𝑓 ≀𝑐𝑐 𝑓 + log 𝑛 What if 𝑐𝑐 𝑓 β‰ͺlog 𝑛? E.g. 𝑐𝑐 𝑓 =𝑂(1) =𝑖𝑠 π‘Ÿ 1 𝑓 =𝑖𝑠 π‘Ÿ 0 𝑓 πœŒβ‰€πœβ‡’ 𝑖𝑠 π‘Ÿ 𝜌 𝑓 β‰₯𝑖𝑠 π‘Ÿ 𝜏 𝑓 11/16/2016 UofT: ISR in Communication

10 Distill Perfect Randomness from ISR?
Agreement distillation: Alice ← r; Bob ← s; (r, s) ρ-correlated unbiased bits. Outputs: Alice → u; Bob → v; H_∞(u), H_∞(v) ≥ k. Communication = c bits. What is the maximum probability τ of agreement (u = v)? Well studied in the literature! [Ahlswede-Csiszár '70s]: τ → 1 ⇒ c = Ω(k) (one-way). [Bogdanov-Mossel '2000s]: c = 0 ⇒ τ ≤ exp(−k). [Our work]: τ = exp(−k + O(c)) (two-way). Summary: can't distill randomness!

11 Results
Model first studied by [Bavarian, Gavinsky, Ito '14] ("independently and earlier"). Their focus: simultaneous communication and general models of correlation. They show isr(Equality) = O(1) (among other things). Our results: Generally, cc(f) ≤ k ⇒ isr(f) ≤ 2^k. Converse: ∃f with cc(f) ≤ k and isr(f) ≥ 2^k.

12 Equality Testing (our proof)
Key idea: think inner products. Encode x ↦ X = E(x), y ↦ Y = E(y), with X, Y ∈ {−1, +1}^N: x = y ⇒ ⟨X, Y⟩ = N; x ≠ y ⇒ ⟨X, Y⟩ ≤ N/2. Estimating inner products, building on sketching protocols (the "Gaussian protocol"): Alice picks Gaussians G_1, ..., G_t ∈ ℝ^N and sends the i ∈ [t] maximizing ⟨G_i, X⟩ to Bob. Bob accepts iff ⟨G'_i, Y⟩ ≥ 0. Analysis: O_ρ(1) bits suffice if G ≈_ρ G'.
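A runnable sketch of the Gaussian protocol. Modeling the imperfect sharing by giving Bob coordinate-wise ρ-correlated copies G'_i of Alice's Gaussians is an assumption for illustration, as are the parameter values:

```python
import random, math

def gaussian_protocol(X, Y, t=64, rho=0.9, seed=2):
    """Sketch of the Gaussian protocol under imperfectly shared randomness.
    Alice's Gaussians G_i and Bob's G'_i are rho-correlated coordinate-wise.
    Alice sends argmax_i <G_i, X> (log t bits); Bob accepts iff <G'_i, Y> >= 0."""
    rng = random.Random(seed)
    N = len(X)
    G = [[rng.gauss(0, 1) for _ in range(N)] for _ in range(t)]
    Gp = [[rho * g + math.sqrt(1 - rho ** 2) * rng.gauss(0, 1) for g in row]
          for row in G]  # Bob's noisy copy of each G_i
    i = max(range(t), key=lambda j: sum(g * x for g, x in zip(G[j], X)))
    return sum(g * y for g, y in zip(Gp[i], Y)) >= 0

assert gaussian_protocol([1] * 100, [1] * 100)        # X = Y: accepts w.h.p.
assert not gaussian_protocol([1] * 100, [-1] * 100)   # anti-correlated: rejects w.h.p.
```

A single round errs with constant probability on borderline inputs; as on the slide, O_ρ(1) repetitions suffice for any fixed error.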

13 General One-Way Communication
Idea: all communication ≤ inner products. (For now, assume one-way-cc(f) ≤ k.) For each random string R: Alice's message = i_R ∈ [2^k]; Bob's output = f_R(i_R), where f_R : [2^k] → {0,1}. W.p. ≥ 2/3 over R, f_R(i_R) is the right answer.

14 General One-Way Communication
For each random string R: Alice's message = i_R ∈ [2^k]; Bob's output = f_R(i_R), where f_R : [2^k] → {0,1}; w.p. ≥ 2/3, f_R(i_R) is the right answer. Vector representation: i_R ↦ x_R ∈ {0,1}^{2^k} (unit coordinate vector); f_R ↦ y_R ∈ {0,1}^{2^k} (truth table of f_R). Then f_R(i_R) = ⟨x_R, y_R⟩, and the acceptance probability ∝ ⟨X, Y⟩, where X = (x_R)_R and Y = (y_R)_R. The Gaussian protocol estimates inner products of unit vectors to within ±ε with O_ρ(1/ε^2) communication.
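The vector representation can be checked numerically: for a toy one-way protocol (all parameters hypothetical), the acceptance probability computed directly equals the normalized inner product of the concatenated vectors X = (x_R)_R and Y = (y_R)_R.

```python
import random

# Toy one-way protocol: k = 2 (messages in [4]), 8 random strings R.
k, num_R = 2, 8
rng = random.Random(3)
i_R = [rng.randrange(2 ** k) for _ in range(num_R)]                      # Alice's messages
f_R = [[rng.randint(0, 1) for _ in range(2 ** k)] for _ in range(num_R)]  # Bob's truth tables

# Direct simulation: accept iff f_R(i_R) = 1, averaged over R.
p_direct = sum(f_R[R][i_R[R]] for R in range(num_R)) / num_R

# Vector view: x_R = unit vector e_{i_R}, y_R = truth table of f_R.
X = [1 if j == i_R[R] else 0 for R in range(num_R) for j in range(2 ** k)]
Y = [f_R[R][j] for R in range(num_R) for j in range(2 ** k)]
p_inner = sum(a * b for a, b in zip(X, Y)) / num_R

assert p_direct == p_inner  # acceptance probability = normalized <X, Y>
```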

15 Two-way communication
Still decided by inner products. Simple lemma: ∃ convex K_A^k, K_B^k ⊆ ℝ^{2^k} that describe private-coin k-bit communication strategies for Alice and Bob, s.t. the acceptance probability of π_A ∈ K_A^k, π_B ∈ K_B^k equals ⟨π_A, π_B⟩. Putting things together: Theorem: cc(f) ≤ k ⇒ isr(f) ≤ O_ρ(2^k).

16 Main Technical Result: Matching lower bound
Theorem: there exists a (promise) problem f s.t. cc(f) ≤ k but isr_ρ(f) ≥ exp(k). The problem: Gap Sparse Inner Product (G-Sparse-IP). Alice gets sparse x ∈ {0,1}^n with wt(x) ≈ 2^{−k}·n; Bob gets y ∈ {0,1}^n. Decide: ⟨x, y⟩ ≥ (.9)·2^{−k}·n or ⟨x, y⟩ ≤ (.6)·2^{−k}·n?

17 psr Protocol for G-Sparse-IP
Note: the Gaussian protocol takes O(2^k) bits; we need to do exponentially better. Idea: x_i ≠ 0 ⇒ y_i is correlated with the answer. Use (perfectly) shared randomness to find a random index i s.t. x_i ≠ 0. Shared randomness: i_1, i_2, i_3, ... uniform over [n]. Alice → Bob: the smallest index j s.t. x_{i_j} ≠ 0. Bob: accept iff y_{i_j} = 1. Expect j ≈ 2^k, so cc ≈ k.
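A minimal sketch of this psr protocol; the toy instance at the bottom (y_i = 1 wherever x_i = 1) is an illustrative assumption sitting squarely on the YES side of the gap:

```python
import random

def gsip_protocol(x, y, seed=4):
    """Perfectly-shared-randomness protocol for G-Sparse-IP: shared
    randomness gives indices i_1, i_2, ... uniform over [n]; Alice sends
    the smallest j with x[i_j] != 0 (about k bits, since j is typically
    around 2^k); Bob accepts iff y[i_j] == 1."""
    rng = random.Random(seed)  # stands in for the shared random string
    n = len(x)
    while True:
        i = rng.randrange(n)   # next shared index i_j
        if x[i] != 0:
            return y[i] == 1   # Bob's decision

# YES-side toy instance: wherever x_i = 1, y_i = 1 too.
assert gsip_protocol([1, 0, 0, 0] * 25, [1, 1, 0, 0] * 25)
# NO-side toy instance: supports of x and y are disjoint.
assert not gsip_protocol([1, 0] * 50, [0, 1] * 50)
```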

18 Lower Bound Idea
Catch: ∀ distributions, ∃ a deterministic protocol with communication ≤ k. Need to fix the strategy first and construct the distribution later. Main idea: if the protocol looks like the one for G-Sparse-IP, it implies the players can agree on a common index to focus on, but agreement is hard; if the protocol ignores sparsity, it requires 2^k bits; but the protocol can look like anything! Tool: the invariance principle [MOO, Mossel, ...].

19 Invariance Principle [MOO'08]
Informally: analyzing probabilistic processes with bits is often hard; analyzing related processes with Gaussian variables is easy. The "invariance principle": under sufficiently general conditions, probabilities of events are invariant when switching from bits to Gaussians. Theorem: for every convex K_1, K_2 ⊆ [−1, 1]^ℓ, ∃ transformations T_1, T_2 s.t. if f : {0,1}^n → K_1 and g : {0,1}^n → K_2 have no common influential variable, then F = T_1(f) : ℝ^n → K_1 and G = T_2(g) : ℝ^n → K_2 satisfy E_{x,y}[⟨f(x), g(y)⟩] ≈ E_{X,Y}[⟨F(X), G(Y)⟩].

20 Summarizing
k bits of communication with perfect sharing → 2^k bits with imperfect sharing, and this is tight. Ingredients: an invariance principle for communication, agreement distillation, low-influence strategies.

21 Conclusions
Imperfect agreement on context is important: it is a new layer of uncertainty to deal with, and it brings a notion of scale (the context is LARGE). Many open directions and questions for imperfectly shared randomness: One-sided error? Does interaction ever help? How much randomness is needed? More general forms of correlation?

22 Thank You!

23 Towards a lower bound: Ruling out a natural approach
Can Alice and Bob use (many) correlated bits to agree perfectly on a few random bits? For G-Sparse-IP they would need O(2^k · log n) random bits. Agreement distillation problem: Alice and Bob exchange c bits and each generate k random bits, agreeing with probability γ. Lower bound [Bogdanov, Mossel]: c ≥ k − O(log(1/γ)).

24 Towards Lower Bound
Explaining two natural protocols. Gaussian inner product protocol: ignore sparsity and just estimate the inner product; uses ~2^{2k} bits. Need to prove it can't be improved!

25 Optimality of Gaussian Protocol
Problem: π‘₯,𝑦 ← πœ‡ 𝑛 : πœ‡= πœ‡ π‘ŒπΈπ‘† π‘œπ‘Ÿ πœ‡ 𝑁𝑂 supported on ℝ×ℝ πœ‡ π‘ŒπΈπ‘† :πœ–-correlated Gaussians πœ‡ 𝑁𝑂 : uncorrelated Gaussians Lemma: Separating πœ‡ π‘ŒπΈπ‘† 𝑛 𝑣𝑠. πœ‡ 𝑁𝑂 𝑛 requires Ξ© πœ– βˆ’1 bits of communication. Proof: Reduction from Disjointness Conclusion: Can’t ignore sparsity! G-Sparse-IP: 𝒙, π’š ∈ 𝟎,𝟏 𝒏 ;π’˜π’• 𝒙 β‰ˆ 𝟐 βˆ’π’Œ ⋅𝒏 Decide 𝒙,π’š β‰₯(.πŸ—) 𝟐 βˆ’π’Œ ⋅𝒏 or 𝒙,π’š ≀(.πŸ”) 𝟐 βˆ’π’Œ ⋅𝒏? 11/16/2016 UofT: ISR in Communication

26 Towards Lower Bound
Explaining two natural protocols. Gaussian inner product protocol: ignore sparsity and just estimate the inner product; uses ~2^{2k} bits; need to prove it can't be improved! Protocol with perfectly shared randomness: Alice and Bob agree on coordinates to focus on: (i_1, i_2, ..., i_{2^k}, ...). Either i_1 has high entropy (over the choice of r, s), which violates the agreement distillation bound; or it has low entropy: fix distributions of x, y s.t. x_{i_1} ⊥ y_{i_1}.

27 Aside: Distributional lower bounds
Challenge: usual CC lower bounds are distributional. cc(G-Sparse-IP) ≤ k on all inputs ⇒ cc(G-Sparse-IP) ≤ k for all distributions ⇒ det-cc(G-Sparse-IP) ≤ k for all distributions. So the usual approach can't work: we need to fix the strategy first and then "identify" a hard distribution for that strategy.

28 Towards lower bound
Summary so far: a symmetric strategy ⇒ 2^k bits of communication. Strategy asymmetric, with x_1, y_1, ..., x_k, y_k having high influence ⇒ fix the distribution so these coordinates do not influence the answer. Strategy asymmetric, with a random coordinate having high influence ⇒ violates the agreement lower bound. Are these exhaustive? How to prove this? The invariance principle!! [Mossel, O'Donnell, Oleszkiewicz], [Mossel], ...

29 ISR lower bound for GSIP.
One-way setting (for now). Strategies: Alice f_r(x) ∈ [K]; Bob g_s(y) ∈ {0,1}^K. Distributions: if (x_i, y_i) have high influence on (f_r, g_s) w.h.p. over (r, s), set x_i = y_i = 0 [i is BAD]; else y_i is correlated with x_i in the YES case and independent in the NO case. Analysis: i ∈ BAD influential in both f_r, g_s ⇒ no help; i ∉ BAD influential ⇒ violates the agreement lower bound; no common influential variable ⇒ x, y can be replaced by Gaussians ⇒ 2^k bits needed.

30 Invariance Principle + Challenges
Informal invariance principle: f, g low-degree polynomials with no common influential variable ⇒ E_{x,y}[f(x) g(y)] ≈ E_{X,Y}[f(X) g(Y)] (caveat: up to replacing f, g by nearby f̃, g̃), where (x, y) is a Boolean n-wise product distribution and (X, Y) a Gaussian n-wise product distribution. Challenges [+ solutions]: our functions are not low-degree [smoothing]; our functions are not real-valued: g : {0,1}^n → {0,1}^ℓ [truncate the range to [0,1]^ℓ]; f : {0,1}^n → [ℓ] [work with the simplex Δ_ℓ].

31 Invariance Principle + Challenges
Informal invariance principle (as above): f, g low-degree polynomials with no common influential variable ⇒ E_{x,y}[f(x) g(y)] ≈ E_{X,Y}[f(X) g(Y)] (caveat: up to replacing f, g by nearby f̃, g̃). Challenges: our functions are not low-degree [smoothing]; not real-valued [truncate]; and the quantity of interest is not f(x)·g(y) [but it can be expressed as an inner product]. ... (lots of grunge work ...) ⇒ a relevant invariance principle (next).

32 Invariance Principle for CC
Theorem: for every convex K_1, K_2 ⊆ [−1, 1]^ℓ, ∃ transformations T_1, T_2 s.t. if f : {0,1}^n → K_1 and g : {0,1}^n → K_2 have no common influential variable, then F = T_1(f) : ℝ^n → K_1 and G = T_2(g) : ℝ^n → K_2 satisfy E_{x,y}[⟨f(x), g(y)⟩] ≈ E_{X,Y}[⟨F(X), G(Y)⟩]. Main differences: f, g are vector-valued; the functions are transformed (f ↦ F, g ↦ G); the range is preserved exactly (K_1 = Δ_ℓ, K_2 = [0,1]^ℓ)! So F, G are still communication strategies.

