
Interactive Channel Capacity
Gillat Kol (IAS), joint work with Ran Raz (Weizmann + IAS)

“A Mathematical Theory of Communication”, Claude Shannon, 1948: an exact formula for the channel capacity of any noisy channel.

Shannon: Channel Capacity
ε-noisy channel: Each bit is flipped with prob ε (independently).
Alice wants to send an n-bit message to Bob. How many bits does Alice need to send over the ε-noisy channel, so Bob can retrieve it w.p. 1-o(1)?
– Is the blow-up even constant?
[Diagram: Alice sends n bits to Bob over the noiseless channel vs. ? bits over the ε-noisy channel (each bit correct w.p. 1-ε)]

Shannon: Channel Capacity
ε-noisy channel: Each bit is flipped with prob ε (independently).
Alice wants to send an n-bit message to Bob. How many bits does Alice need to send over the ε-noisy channel, so Bob can retrieve it w.p. 1-o(1)?
[Shannon ’48]: # bits ≈ n / (1-H(ε))
– Entropy function H(ε) = -ε·log(ε) - (1-ε)·log(1-ε)
– Matching upper and lower bounds
[Diagram: the ε-noisy channel; each bit correct w.p. 1-ε]
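As a quick numerical illustration of Shannon's formula (my own addition, not part of the original slides), a few lines of Python compute the binary entropy and the resulting blow-up:

```python
import math

def binary_entropy(eps: float) -> float:
    """H(eps) = -eps*log2(eps) - (1-eps)*log2(1-eps), the binary entropy function."""
    if eps in (0.0, 1.0):
        return 0.0
    return -eps * math.log2(eps) - (1 - eps) * math.log2(1 - eps)

def bits_needed(n: int, eps: float) -> float:
    """Shannon '48: sending n message bits over the eps-noisy channel takes ~ n / (1 - H(eps)) channel bits."""
    return n / (1 - binary_entropy(eps))

print(bits_needed(1000, 0.01))  # ~1088 bits: the blow-up is the constant factor 1/(1-H(0.01)) ~ 1.088
```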

Us: Interactive Channel Capacity
Alice and Bob want to have an n-bit-long conversation. How many bits do they need to send over the ε-noisy channel, so both can retrieve the transcript w.p. 1-o(1)?
[Diagram: n bits between A and B over the noiseless channel vs. ? bits over the ε-noisy channel]

Communication Complexity
Setting: Alice has input x, Bob has input y. They want to compute f(x,y) (f is publicly known).
Communication Complexity of f: the least number of bits they need to communicate
– Deterministic, CC(f): ∀x,y, compute f(x,y) w.p. 1
– Randomized, RCC(f): ∀x,y, compute f(x,y) w.p. 1-o(1); players share a random string
– Noisy, CC_ε(f): ∀x,y, compute f(x,y) w.p. 1-o(1); players communicate over the ε-noisy channel; players share a random string

Def: Interactive Channel Capacity
– RCC(f) = randomized CC (over the noiseless channel)
– CC_ε(f) = noisy CC (over the ε-noisy channel)
* Results hold when we use CC(f) instead of RCC(f)
* Results hold for worst case & average case RCC(f), CC_ε(f)

Def: Interactive Channel Capacity
– RCC(f) = randomized CC (over the noiseless channel)
– CC_ε(f) = noisy CC (over the ε-noisy channel)
For f(x,y) = x (message transmission), we get Channel Capacity
– Interactive Channel Capacity ≤ Channel Capacity
In the interactive case, an error in the first bit may cause the whole conversation to be meaningless. We may need to “encode” every bit separately.
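The capacity formula itself did not survive in this transcript; in the Kol–Raz paper the interactive channel capacity is defined along the following lines (a reconstruction, so treat the exact normalization as an assumption):

```latex
C(\epsilon) \;=\; \liminf_{n \to \infty}\; \min_{f \,:\, \mathrm{RCC}(f) = n} \; \frac{n}{\mathrm{CC}_\epsilon(f)}
```

That is, the worst-case asymptotic ratio between the noiseless cost of a task and its cost over the ε-noisy channel.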

Previous Works
[Schulman ’92]:
– Theorem: If RCC(f) = n then CC_ε(f) ≤ O(n)
– Corollary: C(ε) > 0
– Open Question: Is Interactive Channel Capacity = Channel Capacity?
Many other works [Sch, BR, B, GMS, BK, BN, FGOS, …]:
– Simulation of any communication protocol with adversarial noise
– Large constants, never made explicit

Our Results
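The results slides themselves appear to be missing from this transcript. For orientation, the main theorem of the Kol–Raz paper for the synchronous channel is, with constants suppressed (the lower bound holds for alternating-turn protocols):

```latex
1 - O\!\left(\sqrt{H(\epsilon)}\right) \;\le\; C(\epsilon) \;\le\; 1 - \Omega\!\left(\sqrt{H(\epsilon)}\right)
\qquad (\epsilon \to 0)
```

Since √(H(ε)) ≫ H(ε) for small ε, interactive capacity is strictly below Shannon's one-way capacity 1-H(ε): interaction makes noise genuinely more expensive.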

Channel Types
Synchronous Channel (this work): Exactly one player sends a bit at each time step
– The order of turns in a protocol is pre-determined (independent of the inputs, randomness, noise); otherwise players may send bits at the same time
– Alternating turns is a special case
Asynchronous Channel: If both players send bits at the same time, these bits are lost
Two Channels: Each player can send a bit at any time

Example f with CC_ε > RCC: 2^k-Pointer Jumping Game
Parameters:
– 2^k-ary tree, depth d
– k = O(1), d → ∞
– ε = log(k) / k²
Alice owns odd layers, Bob owns even layers.
Pointer Jumping Game:
– Inputs: Each player gets an edge going out of every node he owns
– Goal: Find the leaf reached
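To make the game concrete, here is a tiny Python sketch (my own illustration, not from the slides) that evaluates a pointer jumping instance by following the owned edges from the root; nodes are represented as paths of child indices:

```python
def pjg_leaf(alice_edges, bob_edges, depth):
    """Follow the pointers from the root to a leaf.

    alice_edges / bob_edges map a node (a tuple of child indices, i.e. a
    path from the root) to the child index (0..2^k-1) of its chosen edge.
    Alice owns the odd layers (root = layer 1), Bob the even layers.
    """
    node = ()  # the root
    for layer in range(1, depth + 1):
        edges = alice_edges if layer % 2 == 1 else bob_edges
        node += (edges[node],)  # take the unique chosen outgoing edge
    return node

# A depth-2 example with 2^k = 2 (so k = 1): Alice picks the root's edge,
# Bob picks an edge out of every depth-1 node.
alice = {(): 1}
bob = {(0,): 0, (1,): 1}
print(pjg_leaf(alice, bob, 2))  # -> (1, 1)
```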

[Figure: the 2^k-ary pointer jumping tree of depth d]

Bounding CC_ε(PJG) - The Idea
“Any good PJG protocol does the following:”
Alice starts by sending the first edge (k bits)
– w.p. ≈ εk a bit was flipped
Case 1: Alice sends additional bits to correct the first edge
– Even if a single error occurred and Alice knows its index, she needs to send the index ⇒ ≈ log(k) bit waste
Case 2: Bob sends the next edge (k bits)
– w.p. ≈ εk these k bits are wasted, as Bob had the wrong first edge ⇒ in expectation, εk² = log(k) bit waste
In both cases, sending the first edge costs k + Ω(log k)!
– ε = log(k) / k² was chosen to balance the 2 losses
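In symbols, the choice of ε equalizes the two losses (a one-line check, added for clarity):

```latex
\underbrace{\log k}_{\text{Case 1: index of the flipped bit}}
\;=\;
\underbrace{\epsilon k \cdot k}_{\text{Case 2: } \Pr[\text{flip}] \,\times\, k \text{ wasted bits}}
\quad\Longleftrightarrow\quad
\epsilon = \frac{\log k}{k^2}
```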

Bounding CC_ε(PJG) - More Formal
Let the players exchange the first 1.25k bits of the protocol.
t₁ = # bits out of the first 1.25k bits sent by Alice (well defined due to the pre-determined order of turns)
Case 1 (Alice sends additional bits to correct the first edge) corresponds to t₁ ≥ k + 0.5·log(k)
Case 2 (Bob sends the next edge) corresponds to t₁ < k + 0.5·log(k)
(Recall ε = log(k) / k².)

Bounding CC_ε(PJG) - Why is the actual proof challenging?
After the exchange of the first 1.25k bits, we “voluntarily” reveal the first edge to Bob. The players now play a new PJG of depth d-1. We need to show that sending the first edge of the new PJG also costs k + Ω(log k).
Challenge: In the new PJG, some info about the players’ inputs may already be known
– How do we measure the players’ progress?

Simulation
Parameters (same):
– k = O(1)
– ε = log(k) / k²
Given a communication protocol P, we simulate P over the ε-noisy channel using a recursive protocol:
– The basic step simulates k steps of P
– The i-th inductive step simulates k^(i+1) steps of P

Simulating Protocol - Basic Step
– Players run k steps of P. Alice observes transcript T_a, and Bob transcript T_b
– Players run an O(log k)-bit consistency check of T_a, T_b using hash functions, each bit sent many times
– A player that finds an inconsistency starts over and removes this step’s bits from his transcript
[Diagram: k bits of protocol P, then an O(log k)-bit consistency check; on inconsistency, rewind]
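Below is a minimal, runnable Python sketch of the basic step (my own toy rendering, not the talk's exact protocol): a one-directional ε-noisy channel, an 8-bit hash standing in for the O(log k)-bit consistency check, and repetition coding for the check bits; on a detected mismatch both sides rewind the block. In the real protocol each player decides from his own view and both players send; this simplification is only meant to show the mechanics.

```python
import random

EPS = 0.02   # flip probability (toy value; the talk sets eps = log k / k^2)
K = 32       # protocol bits per basic step
REP = 5      # repetitions per consistency-check bit (hedged choice)

def noisy_send(bit: int) -> int:
    """Transmit one bit over the eps-noisy (binary symmetric) channel."""
    return bit ^ (random.random() < EPS)

def send_check_bit(bit: int) -> int:
    """Send a consistency-check bit REP times; the receiver majority-votes."""
    votes = sum(noisy_send(bit) for _ in range(REP))
    return int(votes > REP // 2)

def transcript_hash(transcript, salt):
    """Toy 8-bit hash of the whole transcript so far (stand-in for the
    talk's hash family, keyed by the shared random string)."""
    h = hash((tuple(transcript), salt)) & 0xFF
    return [(h >> i) & 1 for i in range(8)]

def basic_step(alice_tr, bob_tr, block):
    """One basic step: K protocol bits, then the hashed consistency check.
    Returns True if the block is kept, False if both sides rewind."""
    for bit in block:                    # K steps of the underlying protocol P
        alice_tr.append(bit)             # what Alice sent
        bob_tr.append(noisy_send(bit))   # what Bob heard
    salt = random.getrandbits(32)        # from the shared random string
    ha = transcript_hash(alice_tr, salt)
    hb = transcript_hash(bob_tr, salt)
    ok = all(send_check_bit(a) == b for a, b in zip(ha, hb))
    if not ok:                           # inconsistency detected: start over
        del alice_tr[-K:]
        del bob_tr[-K:]
    return ok

alice_tr, bob_tr = [], []
block = [random.getrandbits(1) for _ in range(K)]
while not basic_step(alice_tr, bob_tr, block):
    pass                                 # retry until the block goes through
print(alice_tr == bob_tr)  # almost always True; an undetected error needs a hash collision
```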

Simulating Protocol - Inductive Step
(first inductive step):
– Players run the Basic Step k consecutive times. Alice observes transcript T_a, and Bob transcript T_b (players may go out of sync, but due to the alternating turns they know who should speak next)
– Players run an O(log² k)-bit consistency check of T_a, T_b using hash functions, each bit sent many times
– A player that finds an inconsistency starts over and removes this step’s bits from his transcript
[Diagram: k Basic Steps, then an O(log² k)-bit consistency check; on inconsistency, rewind]
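The full recursion can be sketched in a few lines (again my own illustration; the check length per level follows the slides' O(log k), O(log² k) pattern, and extending it to log^(i+1)(k) for deeper levels is my assumption):

```python
import math

def inductive_step(level, k, run_basic_step, run_check):
    """Level-i step: run k level-(i-1) steps (level 0 = Basic Step), then a
    consistency check of ~log^(i+1)(k) bits over the whole transcript so far;
    on inconsistency, rewind and restart this step."""
    if level == 0:
        run_basic_step()                 # simulates k steps of P
    else:
        for _ in range(k):
            inductive_step(level - 1, k, run_basic_step, run_check)
    check_bits = int(math.log2(k) ** (level + 1))  # assumed growth pattern
    run_check(check_bits)

# Toy demo: one first-inductive step with k = 4.
inductive_step(1, 4, lambda: print("basic step"), lambda b: print(f"check: {b} bits"))
```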

Analysis: Correctness
The final protocol simulates P with probability 1-o(1):
– If an error occurred or the players went out of sync, they will eventually fix it, as the consistency check covers the whole transcript so far and is run with larger and larger parameters

Analysis: Waste in Basic Step
(ε = log(k) / k²)
[Diagram: k bits of protocol P, then an O(log k)-bit consistency check; on inconsistency, rewind]
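The slide's computation did not survive the transcript; by analogy with the inductive-step slide that follows, it was presumably along these lines (a reconstruction, so the constants are assumptions):

```latex
\text{check length: } O(\log k)\ \text{bits}, \qquad
\Pr[\text{start over}] \approx \epsilon k = \frac{\log k}{k}
\\[4pt]
\text{expected waste: } O(\log k) + \frac{\log k}{k}\cdot O(k) \;=\; O(\log k)\ \text{bits per } k \text{ bits of } P
\;\Rightarrow\; \text{wasted fraction } O\!\left(\frac{\log k}{k}\right)
```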

Analysis: Waste in First Inductive Step
Length of consistency check: O(log² k) bits
Probability to start over (prob of an undetected error in one of the k Basic Steps): ≪ O(1/k¹⁰)
Total waste (in expectation): O(log² k) + O(1/k¹⁰) · O(k²) = O(log² k) bits
Fraction of bits wasted: O(log² k / k²) ≪ O(log k / k) - negligible compared to the basic step!
– Waste in the next inductive steps is even smaller
(ε = log(k) / k²)

Thank You!