High-entropy random selection protocols. Michal Koucký (Institute of Mathematics, Prague), joint work with Harry Buhrman, Matthias Christandl, Zvi Lotker, Boaz Patt-Shamir, and Nikolai (Kolia) Vereshchagin.


2 Random string selection. Goal: Alice and Bob want to agree on a random string r.

3 Measure of randomness: Shannon entropy
H(R) = - Σ_r Pr[R = r] · log Pr[R = r]
e.g. R uniform on {0,1}^n → H(R) = n
R uniform on 0^(n/2){0,1}^(n/2) → H(R) = n/2
R uniform on 0^n → H(R) = 0
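As a quick sanity check, the definition above can be evaluated directly on an explicit list of equally likely outcomes (a minimal Python sketch; the function name is ours):

```python
import math
from collections import Counter

def shannon_entropy(samples):
    """Empirical Shannon entropy H(R) = -sum_r Pr[R=r] * log2 Pr[R=r]."""
    counts = Counter(samples)
    total = len(samples)
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

# R uniform on {0,1}^3: all eight 3-bit strings, once each -> H(R) = 3.
print(shannon_entropy([f"{v:03b}" for v in range(8)]))        # 3.0
# R uniform on 0{0,1}^2: a forced leading 0 halves the space -> H(R) = 2.
print(shannon_entropy(["0" + f"{v:02b}" for v in range(4)]))  # 2.0
```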

4 Example:
Alice: random r_1 r_2 … r_(n/2)
Bob: random r_(n/2+1) … r_n
→ output r = r_1 r_2 … r_n
H(R) = n if Alice and Bob follow the protocol.
H(R) ≥ n/2 if one of them cheats.
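An honest run of this half-and-half example can be sketched in a few lines (hedged: `secrets` stands in for each party's private coins, and the function name is ours):

```python
import secrets

def naive_selection(n):
    """Each party contributes half of r: Alice the first n/2 bits, Bob the rest.
    Even if one party chooses its half adversarially, the other half is still
    uniform, which is where the H(R) >= n/2 bound comes from."""
    alice_half = ''.join(secrets.choice('01') for _ in range(n // 2))
    bob_half = ''.join(secrets.choice('01') for _ in range(n - n // 2))
    return alice_half + bob_half

r = naive_selection(16)
print(len(r))  # 16
```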

5 Main results:
A random selection protocol that guarantees H(R) ≥ n − O(1) even if one of the parties cheats; it runs in log* n rounds and communicates O(n^2) bits.
A three-round protocol that guarantees H(R) ≥ (3/4)n and communicates O(n) bits.

6 Previous work:
Different variants:
random selection protocols [GGL'95, SV'05, GVZ'06]
collective coin flipping [B'82, Y'86, B-OL'89, AN'90, …]
leader selection [AN'90, …]
fault-tolerant computation [GGL'95]
multiple-party protocols [AN'90, …]
quantum protocols [ABDR'04]
Different measures:
resilience
statistical distance from the uniform distribution
entropy

7 Comparison (with (δ, ε)-resilience defined below):
This work: H(R) ≥ n − O(1), which implies (δ, 1/log(1/δ))-resilience; O(log* n) rounds, O(n^2) communication.
[GGL] (δ, ε)-resilience; O(n^2) rounds, O(n^2) communication.
[SV] (δ, δ + ε)-resilience; O(log* n) rounds, O(n^2) communication.
[GVZ] (δ, ε)-resilience; O(log* n) rounds, O(n) communication.
(δ, ε)-resilience: for every B ⊆ {0,1}^n with |B| ≤ δ·2^n, Pr[r ∈ B] ≤ ε.

8 Our basic protocol:
Alice → Bob: random x_1, …, x_n ∈ {0,1}^n
Bob → Alice: random y ∈ {0,1}^n
Alice → Bob: random i ∈ {1, …, n}
→ output x_i ⊕ y
H(R) = n if Alice and Bob follow the protocol.
H(R) ≥ n − log n if Alice cheats.
H(R) ≥ n − O(1) if Bob cheats.
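An honest execution of this protocol is easy to simulate (a sketch; n-bit strings are modeled as Python integers and ⊕ as `^`):

```python
import secrets

def basic_protocol(n):
    """Honest run of the basic protocol.
    Alice -> Bob: n random n-bit strings x_1..x_n
    Bob -> Alice: a random n-bit string y
    Alice -> Bob: a random index i
    Output: x_i XOR y."""
    xs = [secrets.randbits(n) for _ in range(n)]
    y = secrets.randbits(n)
    i = secrets.randbelow(n)
    return xs[i] ^ y

print(0 <= basic_protocol(8) < 2 ** 8)  # True
```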

9 Alice cheats, Bob plays honestly:
Alice carefully selects x_1, …, x_n; Bob picks a random y.
⇒ for all i and r, Pr_y[r = x_i ⊕ y] = 2^(−n)
⇒ for all r, Pr_y[∃i: r = x_i ⊕ y] ≤ n·2^(−n)
⇒ H(R) ≥ n − log n.

10 Alice plays honestly, Bob cheats:
For any r_1, r_2, …, r_n: Pr_x[r_1 = x_1, …, r_n = x_n] = 2^(−n²)
⇒ Pr[r_1 = x_1 ⊕ y, …, r_n = x_n ⊕ y] ≤ 2^(n − n²), where y is a function of the random x_1, x_2, …, x_n
⇒ H(x_1 ⊕ y, …, x_n ⊕ y) ≥ n² − n
⇒ E[H(x_i ⊕ y)] ≥ n − 1
⇒ H(R) ≥ n − O(1).

11 Our basic protocol (recap):
Alice → Bob: random x_1, …, x_n ∈ {0,1}^n
Bob → Alice: random y ∈ {0,1}^n
Alice → Bob: random i ∈ {1, …, n}
→ output x_i ⊕ y
H(R) = n if Alice and Bob follow the protocol.
H(R) ≥ n − log n if Alice cheats.
H(R) ≥ n − O(1) if Bob cheats.

12 Iterating our protocol:
A → B: x_1, …, x_m; B → A: y_1, …, y_m'; and so on, with the index for each round selected by a smaller run of the protocol:
r = x_i ⊕ r', r' = y_j ⊕ r'', r'' = …
→ log* n iterations
H(R) ≥ n − 3 regardless of who cheats.

13 Protocol P_i(A, B):
A → B: x_1, …, x_{l_i} ∈ {0,1}^n
y ← P_{i−1}(B, A)  (roles swapped)
A → B: j ∈ {1, …, l_i}
→ output r = x_j ⊕ y
where l_0 = n, l_i = log l_{i−1}, k = log* n, l_k = 2.
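An honest run of the recursion can be sketched as follows (assumptions ours: l_i is capped below at 2, the base level P_0 receives y directly, and roles swap at each level as in P_{i−1}(B, A)):

```python
import math
import secrets

def P(i, n):
    """Honest execution of P_i: the sending side transmits l_i random n-bit
    strings, the recursive call P_{i-1} (with roles swapped) supplies y, and a
    random index j selects the output x_j XOR y.  l_0 = n, l_i = log l_{i-1}."""
    l = n
    for _ in range(i):
        l = max(2, math.ceil(math.log2(l)))
    xs = [secrets.randbits(n) for _ in range(l)]
    y = secrets.randbits(n) if i == 0 else P(i - 1, n)
    j = secrets.randbelow(l)
    return xs[j] ^ y

print(0 <= P(3, 16) < 2 ** 16)  # True
```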

14 Claim: For i = 0, …, k, the output R_i of P_i(Alice, Bob) satisfies:
H(R_i) = n if Alice and Bob follow the protocol.
H(R_i) ≥ n − log 4l_i if Alice cheats.
H(R_i) ≥ n − 2 if Bob cheats.
Proof sketch (Alice cheats): Alice carefully selects x_1, …, x_{l_i}. P_{i−1}(Bob, Alice) gives y = R_{i−1} with H(y | x_1, …, x_{l_i}) ≥ n − 2. Alice then carefully selects j and outputs R_i = x_j ⊕ y.

15 Proof (cont.): Given j and x_1, …, x_{l_i}, the value x_j ⊕ y determines y, so
H(x_j ⊕ y, j | x_1, …, x_{l_i}) ≥ H(y | x_1, …, x_{l_i}).
Hence
H(x_j ⊕ y) ≥ H(x_j ⊕ y | x_1, …, x_{l_i})
  ≥ H(x_j ⊕ y, j | x_1, …, x_{l_i}) − H(j | x_1, …, x_{l_i})
  ≥ H(y | x_1, …, x_{l_i}) − H(j)
  ≥ n − 2 − log l_i.

16 Cost of our protocol:
2 log* n rounds
O(n^2) bits communicated
Question: How can the amount of communication be reduced to close to linear?

17 Generic protocol:
Alice → Bob: random x ∈ {0,1}^n
Bob → Alice: random y ∈ {0,1}^n
Alice → Bob: random i ∈ {1, …, n}
→ output f(x, y, i), for some f: {0,1}^n × {0,1}^n × {1, …, n} → {0,1}^n
W.h.p. for a random function f:
H(R) ≥ n − O(log n) regardless of cheating.

18 Explicit candidate functions:
x^(i) ⊕ y, where x^(i) is x rotated cyclically i times.
ix + y, where x, y ∈ F^k, i ∈ F, F = GF(2^(log n)), k = n/log n.
ix + y, where x, y ∈ F, i ∈ H ⊆ F, F = GF(2^n), |H| = n.
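The rotation candidate is straightforward to write down (a sketch on small bit-widths; the helper names are ours):

```python
def rotate(x, i, n):
    """Cyclic left rotation of an n-bit value x by i positions."""
    i %= n
    return ((x << i) | (x >> (n - i))) & ((1 << n) - 1)

def f_rotation(x, y, i, n):
    """Candidate f(x, y, i) = x^(i) XOR y: rotate x by i, then XOR with y."""
    return rotate(x, i, n) ^ y

print(f_rotation(0b0001, 0b0000, 1, 4))  # 2  (0001 rotated once is 0010)
print(f_rotation(0b1000, 0b0011, 1, 4))  # 2  (1000 wraps around to 0001; XOR 0011)
```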

19 Rotations: Fix i ≠ j. For any x and y,
(x^(i) ⊕ y) ⊕ (x^(j) ⊕ y) = x^(i) ⊕ x^(j) = xA_{ij},
where A_{ij} has rank n − 1.
x random ⇒ n − 1 ≤ H(xA_{ij}) ≤ H(x^(i) ⊕ y, x^(j) ⊕ y)
⇒ H(R) ≥ n − log n when Alice cheats,
H(R) ≥ n/2 when Bob cheats.

20 The (3/4)n-protocol:
1. Pick one half of the string by an A-B-A "rotating" protocol and the other half by a B-A-B "rotating" protocol, i.e., exploit the asymmetry in the cheating powers.
2. The "line" protocol ix + y, where x, y ∈ [GF(2^(n/4))]^k and k = 4.
→ The analysis is related to the Kakeya problem.

21 Kakeya problem: Let P ⊆ F^k contain a line in each direction.
Q: How large must P be?

22 L … a collection of lines, one in each direction.
Conjecture: |P_L| must be close to |F|^k, where P_L is the union of the points on the lines in L (for |F| > 2).
X_L … random variable: choose a line from L at random, then pick a random point on it.
Def: H(|F|, k) = min_L H(X_L).
Note H(X_L) ≤ log |P_L|.
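A toy computation over F = GF(3), k = 2 illustrates these definitions (our choice of L takes, for each direction, the line through the origin; this is just a sketch of the definitions, not an extremal configuration):

```python
import math
from collections import Counter
from itertools import product

p, k = 3, 2  # toy setting: F = GF(p) with p = 3, dimension k = 2

def monic(v):
    """Canonical representative of a direction: scale the vector so its first
    nonzero coordinate equals 1."""
    lead = next(c for c in v if c)
    inv = pow(lead, p - 2, p)  # multiplicative inverse in GF(p)
    return tuple((inv * c) % p for c in v)

directions = {monic(v) for v in product(range(p), repeat=k) if any(v)}
# L: one line per direction; here each is the line {t*d : t in F} through 0.
L = {d: [tuple((t * c) % p for c in d) for t in range(p)] for d in directions}

# X_L: choose a line uniformly, then a uniform point on it.
counts = Counter(pt for line in L.values() for pt in line)
total = sum(counts.values())
H = -sum((c / total) * math.log2(c / total) for c in counts.values())

P_L = set(counts)  # union of the points covered by L
print(len(P_L), round(H, 3))  # 9 2.918
```

The origin lies on every line, so X_L is biased toward it, and indeed H(X_L) ≈ 2.918 < log2 |P_L| = log2 9 ≈ 3.170, consistent with the bound above.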

23 Geometric protocol: ix + y, with x, y ∈ F^k, i ∈ F
→ a line given by direction x and point y.
Claim: Let R be the outcome of the geometric protocol. If Alice is honest then H(R) ≥ H(|F|, k). Furthermore, Bob can force H(R) = H(|F|, k).
→ A proof of security of our protocol implies the conjecture for the Kakeya problem.

24 Geometric protocol: ix + y, with x, y ∈ F^k, i ∈ F
→ a line given by direction x and point y.
Claim: Let R be the outcome of the geometric protocol. If Alice is honest then H(R) ≥ (k/2 + 1)·log|F| − O(1).
→ For k = 4 and |F| = 2^(n/4) we get H(R) ≥ 3n/4 − O(1).

25 Open problems:
Better analysis of our candidate functions.
Other candidate functions?
Multiple parties.