1 High-entropy random selection protocols
Michal Koucký (Institute of Mathematics, Prague)
with Harry Buhrman, Matthias Christandl, Zvi Lotker, Boaz Patt-Shamir, Kolia Vereshchagin

2 Random string selection:
Goal: Alice and Bob want to agree on a random string r.

3 → Measure of randomness: Shannon entropy
H(R) = −Σ_r Pr[R = r] · log Pr[R = r]
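As a concrete illustration (not part of the original slides), Shannon entropy can be estimated from an empirical distribution in a few lines of Python; the function name and sample data are illustrative:

```python
import math
from collections import Counter

def shannon_entropy(samples):
    """Empirical H(R) = -sum_r Pr[R = r] * log2 Pr[R = r]."""
    counts = Counter(samples)
    total = len(samples)
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

# A uniform distribution over the four 2-bit strings has entropy 2 bits;
# a constant outcome has entropy 0.
print(shannon_entropy(["00", "01", "10", "11"]))  # 2.0
print(abs(shannon_entropy(["00", "00", "00"])))   # 0.0
```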

4 Example:
Alice: random r_1 r_2 … r_{n/2}
Bob: random r_{n/2+1} … r_n
→ output r = r_1 r_2 … r_n
H(R) = n if Alice and Bob follow the protocol.
H(R) ≥ n/2 if one of them cheats.
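A minimal simulation of this example protocol (the function name and the all-zeros cheating model are my illustrative choices, not from the slides): each party contributes half of the bits, so even if one party's half is adversarial, the honest half still carries n/2 bits of entropy.

```python
import secrets

def half_and_half(n, alice_cheats=False):
    """Alice picks bits r_1..r_{n/2}, Bob picks r_{n/2+1}..r_n;
    the output is the concatenation. A cheating Alice is modeled
    as always sending zeros; Bob's honest half keeps H(R) >= n/2."""
    half = n // 2
    if alice_cheats:
        alice = '0' * half
    else:
        alice = ''.join(secrets.choice('01') for _ in range(half))
    bob = ''.join(secrets.choice('01') for _ in range(n - half))
    return alice + bob

r = half_and_half(8, alice_cheats=True)
print(r)  # first four bits forced to 0, last four still uniform
```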

5 Main results:
- A random selection protocol that guarantees H(R) ≥ n − O(1) even if one of the parties cheats. This protocol runs in log* n rounds and communicates O(n^2) bits.
- A three-round protocol that guarantees H(R) ≥ (3/4)n and communicates O(n) bits.

6 Previous work:
Different variants:
- random selection protocols [GGL'95, SV'05, GVZ'06]
- collective coin flipping [B'82, Y'86, B-OL'89, AN'90, …]
- leader selection [AN'90, …]
- fault-tolerant computation [GGL'95]
- multiple-party protocols [AN'90, …]
- quantum protocols [ABDR'04]
Different measures:
- statistical distance from the uniform distribution
- entropy

7 Comparison:
This work: H(R) ≥ n − O(1) ⇒ (δ, O(log^{-1}(1/δ)))-resilience; O(log* n) rounds, O(n^2) communication.
[GGL] (δ, ε)-resilience; O(n^2) rounds, O(n^2) communication.
[SV] (δ, δ + ε)-resilience; O(log* n) rounds, O(n^2) communication.
[GVZ] (δ, ε)-resilience; O(log* n) rounds, O(n) communication.
(δ, ε)-resilience: for every B ⊆ {0,1}^n with |B| ≤ δ·2^n, Pr[r ∈ B] ≤ ε.

8 Our basic protocol:
Alice: random x_1, …, x_n ∈ {0,1}^n
Bob: random y ∈ {0,1}^n
random i ∈ {1, …, n}
→ output r = x_i ⊕ y
H(R) = n if Alice and Bob follow the protocol.
H(R) ≥ n − log n if Alice cheats.
H(R) ≥ n − O(1) if Bob cheats.
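An honest-execution sketch of this protocol (helper names are mine): Alice sends n random n-bit strings, Bob replies with one random string, an index is drawn, and the output is the XOR.

```python
import secrets

def rand_bits(n):
    return ''.join(secrets.choice('01') for _ in range(n))

def xor(a, b):
    return ''.join('1' if p != q else '0' for p, q in zip(a, b))

def basic_protocol(n):
    """Alice sends n random n-bit strings x_1..x_n, Bob replies with
    a random y, a random index i is drawn, and r = x_i XOR y."""
    xs = [rand_bits(n) for _ in range(n)]   # Alice's message
    y = rand_bits(n)                        # Bob's message
    i = secrets.randbelow(n)                # selected index
    return xor(xs[i], y)

print(basic_protocol(8))
```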

9 Alice cheats, Bob plays honestly:
Alice carefully selects x_1, …, x_n.
Bob picks a random y.
⇒ for all i and r, Pr_y[r = x_i ⊕ y] = 2^{-n}
⇒ for all r, Pr_y[∃i: r = x_i ⊕ y] ≤ n·2^{-n}
⇒ H(R) ≥ n − log n.
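The union-bound step can be checked exhaustively for tiny n (a sanity check I added, not from the slides): for any fixed x_1..x_n and any target r, the fraction of y with r = x_i ⊕ y for some i is at most n·2^(−n).

```python
from itertools import product

def check_union_bound(n, xs):
    """For fixed strings x_1..x_n, compute the worst case over targets r of
    Pr_y[exists i: r = x_i XOR y]; the union bound says it is <= n * 2^-n."""
    strings = [''.join(bits) for bits in product('01', repeat=n)]
    xor = lambda a, b: ''.join('1' if p != q else '0' for p, q in zip(a, b))
    worst = 0.0
    for r in strings:
        hits = sum(1 for y in strings if any(xor(x, y) == r for x in xs))
        worst = max(worst, hits / len(strings))
    return worst

# Three distinct x's: every target r is reachable from exactly 3 of the 8
# possible y's, so the worst case equals n * 2^-n = 3/8.
print(check_union_bound(3, ['000', '001', '010']))  # 0.375
```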

10 Iterating our protocol:
Alice sends x_1, …, x_m; Bob sends y_1, …, y_{m'}; the indices i and j are selected recursively, with the roles of Alice and Bob swapped at each level:
r = x_i ⊕ r',  r' = y_j ⊕ r'',  r'' = …
→ log* n iterations
H(R) ≥ n − 3 regardless of who cheats.

11 Cost of our protocol:
2·log* n rounds
O(n^2) bits communicated
Question: How to reduce the amount of communication to close to linear?

12 Generic protocol:
Alice: random x ∈ {0,1}^n
Bob: random y ∈ {0,1}^n
random i ∈ {1, …, n}
→ output f(x, y, i), for some f: {0,1}^n × {0,1}^n × {1, …, n} → {0,1}^n
W.h.p. for a random function f,
H(R) ≥ n − O(log n) regardless of cheating.

13 Explicit candidate functions:
- x_i ⊕ y, where x_i is the rotation of x by i positions.
- i·x + y, where x, y ∈ F^k, i ∈ F, F = GF(2^{log n}), k = n/log n.
- i·x + y, where x, y ∈ F, i ∈ H ⊆ F, F = GF(2^n), |H| = n.
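The first candidate, f(x, y, i) = (rotation of x by i) ⊕ y, is easy to write down; this sketch and its function names are mine:

```python
def rotate(x, i):
    """Cyclic rotation of the string x by i positions."""
    i %= len(x)
    return x[i:] + x[:i]

def f_rotation(x, y, i):
    """Candidate f(x, y, i) = (x rotated by i) XOR y, on bit-strings."""
    xi = rotate(x, i)
    return ''.join('1' if a != b else '0' for a, b in zip(xi, y))

# rotate('1100', 1) = '1001'; XOR with '1010' gives '0011'.
print(f_rotation('1100', '1010', 1))  # 0011
```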

14 Rotations:
For any x and y, (x_i ⊕ y) ⊕ (x_j ⊕ y) = x_i ⊕ x_j = x·A_{ij}, where A_{ij} has rank n − 1.
x random ⇒ n − 1 ≤ H(x·A_{ij}) ≤ H(x_i ⊕ y, x_j ⊕ y)
⇒ H(R) ≥ n − log n when Alice cheats,
H(R) ≥ n/2 when Bob cheats.
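The rank claim can be verified numerically for small n (my sketch, not from the slides; note that the map x ↦ x_i ⊕ x_j has rank n − gcd(n, j − i) over GF(2), so rank n − 1 corresponds to the case gcd(n, j − i) = 1):

```python
def gf2_rank(rows):
    """Rank of a GF(2) matrix whose rows are given as int bitmasks."""
    rank = 0
    rows = list(rows)
    while rows:
        pivot = rows.pop()
        if pivot == 0:
            continue
        rank += 1
        low = pivot & -pivot  # lowest set bit of the pivot row
        rows = [r ^ pivot if r & low else r for r in rows]
    return rank

def rotate(x, i):
    i %= len(x)
    return x[i:] + x[:i]

def rank_A(n, i, j):
    """Rank over GF(2) of the linear map x -> rot(x, i) XOR rot(x, j),
    computed from the images of the standard basis vectors."""
    rows = []
    for k in range(n):
        e = ''.join('1' if t == k else '0' for t in range(n))
        img = ''.join('1' if a != b else '0'
                      for a, b in zip(rotate(e, i), rotate(e, j)))
        rows.append(int(img, 2))
    return gf2_rank(rows)

# gcd(5, 1) = 1, so the rank is n - 1 = 4, matching the slide's claim;
# for gcd(4, 2) = 2 the rank drops to n - 2 = 2.
print(rank_A(5, 0, 1), rank_A(4, 0, 2))  # 4 2
```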

15 The ¾n-protocol:
1. Pick one half of the string by an A-B-A "rotating" protocol and the other half by a B-A-B "rotating" protocol, i.e., exploit the asymmetry in the cheating powers.
2. The "line" protocol i·x + y, where x, y ∈ [GF(2^{n/4})]^k and k = 4.
→ The analysis is related to the Kakeya problem.

16 Kakeya problem:
Let P ⊆ F^k.
Conjecture: if P contains a line in every direction, then |P| ≥ |F|^k (1 − c/|F|).

17 Open problems:
- Better analysis of our candidate functions.
- Other candidate functions?
- Multiple parties.

18 Alice plays honestly, Bob cheats:
For any r_1, r_2, …, r_n, Pr_x[r_1 = x_1, …, r_n = x_n] = 2^{-n^2}
⇒ Pr[r_1 = x_1 ⊕ y, …, r_n = x_n ⊕ y] ≤ 2^{n − n^2}, where y is a function of the random x_1, x_2, …, x_n
⇒ H(x_1 ⊕ y, …, x_n ⊕ y) ≥ n^2 − n
⇒ E[H(x_i ⊕ y)] ≥ n − 1
⇒ H(R) ≥ n − O(1).
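The key inequality, Pr ≤ 2^(n − n^2) even when y depends on x_1..x_n, can be confirmed exhaustively for n = 2 (an illustrative check I added): Bob can force the outputs onto targets r_1..r_n exactly when the strings x_k ⊕ r_k all coincide, which fixes a single consistent y.

```python
from itertools import product

def best_cheating_prob(n, targets):
    """Bob sees x_1..x_n and may choose y as any function of them.
    He forces (x_1^y, .., x_n^y) = targets iff some single y works,
    i.e. the strings x_k XOR targets_k all coincide. Exhaustive over
    all x-tuples, so only feasible for tiny n."""
    strings = [''.join(b) for b in product('01', repeat=n)]
    xor = lambda a, b: ''.join('1' if p != q else '0' for p, q in zip(a, b))
    good = 0
    for xs in product(strings, repeat=n):
        if len({xor(x, t) for x, t in zip(xs, targets)}) == 1:
            good += 1
    return good / len(strings) ** n

# n = 2: the bound 2^(n - n^2) = 1/4 is matched exactly.
print(best_cheating_prob(2, ('00', '11')))  # 0.25
```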

