
1 Extractors: applications and constructions. Avi Wigderson, IAS, Princeton.

2 Extractors: original motivation. Applications (probabilistic algorithms, cryptography, game theory) are analyzed assuming perfect randomness: unbiased, independent bits. In reality, the available sources of imperfect randomness (stock market fluctuations, sun spots, radioactive decay) are biased and dependent. Extractor theory addresses this gap.

3 Applications of Extractors
–Using weak random sources in probabilistic algorithms [B84, SV84, V85, VV85, CG85, V87, CW89, Z90-91]
–Randomness-efficient error reduction of probabilistic algorithms [Sip88, GZ97, MV99, STV99]
–Derandomization of space-bounded algorithms [NZ93, INW94, RR99, GW02]
–Distributed algorithms [WZ95, Zuc97, RZ98, Ind02]
–Hardness of approximation [Zuc93, Uma99, MU01]
–Cryptography [CDHKS00, MW00, Lu02, Vad03]
–Data structures [Ta02]

4 Unifying Role of Extractors. Extractors are intimately related to:
–Hash functions [ILL89, SZ94, GW94]
–Expander graphs [NZ93, WZ93, GW94, RVW00, TUZ01, CRVW02]
–Samplers [G97, Z97]
–Pseudorandom generators [Trevisan 99, …]
–Error-correcting codes [T99, TZ01, TZS01, SU01, U02]
–Ergodic theory [Lindenstrauss 07]
–Exponential sums
They unify the theory of pseudorandomness.

5 Definitions

6 Weak random sources. Distributions X on {0,1}^n with some entropy:
–[vN] sources: n coins of unknown fixed bias.
–[SV] sources: Pr[X_{i+1} = 1 | X_1 = b_1, …, X_i = b_i] ∈ (δ, 1−δ).
–Bit-fixing sources: n coins, some good, some "sticky".
–…
–[Z] k-sources: H_∞(X) ≥ k, i.e. ∀x Pr[X = x] ≤ 2^{-k} (e.g. X uniform with support ≥ 2^k).
k is the entropy in the weak source.
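To make the k-source definition concrete, here is a minimal Python sketch (mine, not from the talk) computing the min-entropy H_∞(X) = −log2 max_x Pr[X = x] of an explicitly given distribution:

```python
# Sketch: min-entropy of an explicitly given distribution on {0,1}^n.
# A distribution is a k-source exactly when min_entropy(probs) >= k.
import math

def min_entropy(probs):
    """H_inf(X) = -log2( max_x Pr[X = x] ) for a finite distribution."""
    assert abs(sum(probs) - 1.0) < 1e-9, "probabilities must sum to 1"
    return -math.log2(max(probs))

# Example: uniform over 4 of the 8 strings in {0,1}^3 is a 2-source.
print(min_entropy([0.25, 0.25, 0.25, 0.25, 0.0, 0.0, 0.0, 0.0]))  # 2.0
```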

7 Randomness Extractors (1st attempt). We would like a single function Ext : {0,1}^n → {0,1}^m that maps any k-source X of length n (k can be e.g. n/2, √n, log n, …) to m < k almost-uniform bits. This is impossible even for k = n−1 and m = 1: the larger of the sets Ext^{-1}(0), Ext^{-1}(1) has size ≥ 2^{n−1}, and a source X uniform on it is an (n−1)-source on which Ext is constant.

8 Extractors [Nisan & Zuckerman '93]. The fix is a short seed of d truly random bits: a family Ext_i : {0,1}^n → {0,1}^m indexed by seeds i ∈ {0,1}^d = [D], taking a k-source of length n plus the seed to m bits ε-close to uniform. Formally: for every k-source X, for all but an ε-fraction of the seeds i, |Ext_i(X) − U_m|_1 < ε.
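As an illustration of the definition, here is a brute-force sketch (feasible only for toy parameters; the function names and the representation of Ext and of the source are my assumptions) that checks for what fraction of seeds the output is ε-close to uniform:

```python
def l1_from_uniform(output_dist, m):
    """L1 distance between a distribution on {0,1}^m (list of 2^m probs) and U_m."""
    u = 1.0 / (1 << m)
    return sum(abs(p - u) for p in output_dist)

def good_seed_fraction(ext, source, d, m, eps):
    """Fraction of seeds i for which |Ext_i(X) - U_m|_1 < eps.
    ext(x, i) -> m-bit integer; source: dict mapping x -> Pr[X = x]."""
    good = 0
    for i in range(1 << d):
        out = [0.0] * (1 << m)
        for x, p in source.items():
            out[ext(x, i)] += p          # push forward the source through Ext_i
        if l1_from_uniform(out, m) < eps:
            good += 1
    return good / (1 << d)
```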

9 Probabilistic algorithms with weak random bits. The extractor turns a sample x from the k-source of length n into m random bits for the probabilistic algorithm, whose error probability then stays < δ + ε + ε (accounting for the L1 error and the bad seeds). But where do the d truly random seed bits come from? Try all possible D = 2^d seed values and take a majority vote of the outputs (a sketch follows below). Is this efficient? We want: efficient Ext, small d and ε, large m.
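The "try all seeds, take a majority vote" step can be written down directly; a hedged sketch (algorithm, ext and x are placeholder names, not from the talk):

```python
from collections import Counter

def run_with_weak_source(algorithm, inp, ext, x, d):
    """Derandomize the seed: run the probabilistic algorithm with random
    bits Ext_i(x) for every seed i in {0,1}^d and return the majority answer.
    x is a single sample from the weak (k-)source."""
    votes = Counter(algorithm(inp, ext(x, i)) for i in range(1 << d))
    return votes.most_common(1)[0][0]
```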

10 Extractors - Parameters. Ext maps a k-source of length n, with a d-bit truly random seed ("short"), to m bits ε-close to uniform; here ε = 0.01 and k ≤ n/2. Goals: minimize d, maximize m. Non-constructive & optimal [Sip88, NZ93, RT97]: –Seed length d = log n + O(1). –Output length m = k − O(1).

11 Explicit Constructions. Non-constructive & optimal [Sip88, NZ93, RT97]: –Seed length d = log n + O(1). –Output length m = k − O(1). A long line of explicit constructions [B86, SV86, CG87, NZ93, WZ93, GW94, SZ94, SSZ95, Zuc96, Ta96, Ta98, Tre99, RRV99a, RRV99b, ISW00, RSW00, RVW00, TUZ01, TZS01, SU01, LRVW03, …]. New explicit constructions [GUV07, DW08]: –Seed length d = O(log n) [even for ε = 1/n]. –Output length m = 0.99k.

12 Applications

13 Probabilistic algorithms with weak random bits. Sample x from the k-source X of length n and feed the m bits Ext_i(x) (ε-close to uniform for most seeds) to the probabilistic algorithm. Try all D = 2^d = poly(n) seed strings and take a majority vote: this is efficient! The error probability stays < δ + ε + ε, because the error set B ⊆ {0,1}^m of the algorithm is sampled accurately with high probability.

14 Extractors as samplers. View a (k, ε)-extractor Ext_i : {0,1}^n → {0,1}^m, i ∈ {0,1}^d = [D], as a sampler: a single x ∈ {0,1}^n (|X| = 2^k) yields the sample set Ext(x) = {Ext_i(x) : i ∈ [D]} ⊆ {0,1}^m. Discrepancy: for every B ⊆ {0,1}^m, for all but 2^k of the x ∈ {0,1}^n, | |Ext(x) ∩ B| / D − |B| / 2^m | < ε. This single object connects sampling, hashing, amplification, coding, expanders, …
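In sampler form, one weak-source sample x gives D = 2^d sample points Ext_i(x); a sketch of the density estimate that the discrepancy statement is about (names are my assumptions):

```python
def estimate_density(ext, x, d, B):
    """Estimate |B| / 2^m by the fraction of seeds i whose output lands in B.
    By the discrepancy property this is eps-accurate for all but ~2^k of the x."""
    D = 1 << d
    return sum(1 for i in range(D) if ext(x, i) in B) / D
```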

15 Beating eigenvalue expansion [WZ]. Task: construct a graph on [N] of minimal degree DEG such that every two sets of size K are connected by an edge. Any such graph: DEG > N/K. Ramanujan graphs: DEG < (N/K)^2. Random graphs: DEG < (N/K)^{1+o(1)}. Extractors: DEG < (N/K)^{1+o(1)}.

16 Extractors as expanders. A (k, 0.01)-extractor Ext : {0,1}^n × {0,1}^d → {0,1}^m is a bipartite graph from [N] = {0,1}^n to [M] = {0,1}^m with edges (x, Ext_i(x)), where 2^k = K = M^{1+o(1)} and 2^d = D < M^{o(1)}. For every X ⊆ [N] with |X| = K we have |Ext(X)| > 0.99M. Take G = Ext^2 on [N], connecting x and x' whenever they share a neighbour in [M] (a toy sketch follows below): then DEG < (N/K)^{1+o(1)}, and there are many edges between any two K-sets X, X'.
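A small sketch of the squared graph G = Ext^2 (helper names are mine; feasible only for toy n and d):

```python
from collections import defaultdict

def squared_extractor_graph(ext, n, d):
    """Edges of G = Ext^2 on [N] = {0,1}^n: x ~ x' iff Ext_i(x) = Ext_j(x')
    for some seeds i, j, i.e. x and x' share a neighbour in [M]."""
    buckets = defaultdict(set)                 # right vertex -> its left neighbours
    for x in range(1 << n):
        for i in range(1 << d):
            buckets[ext(x, i)].add(x)
    edges = set()
    for left_nbrs in buckets.values():
        for x in left_nbrs:
            for xp in left_nbrs:
                if x < xp:
                    edges.add((x, xp))
    return edges
```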

17 Extractors as list-decodable error-correcting codes [TZ]. Take an extractor with d = c log n, D = 2^d = n^c, k = 2d, m = 1, and encode an n-bit string x as C(x) = (Ext_1(x), Ext_2(x), …, Ext_D(x)) ∈ {0,1}^D. Polynomial rate! Efficient encoding!! For z ∈ {0,1}^D, let B_z ⊆ {0,1}^{d+1} be the set {(i, z_i) : i ∈ [D]}. List decoding: for every z, at most K = D^2 of the x's have C(x) fall in the Hamming ball of radius (1/2 − ε)D around z. Unique decoding: radius < D/4. List decoding: radius < D/2. Efficient decoding? Can one get radius ≈ D/2 and a small list?
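The encoding map itself is one line; a sketch with m = 1, so each seed contributes a single output bit (ext is an assumed one-bit-output extractor, as above):

```python
def encode(ext, x, d):
    """[TZ] codeword: C(x) = (Ext_1(x), ..., Ext_D(x)), a D-bit string."""
    return [ext(x, i) for i in range(1 << d)]
```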

18 Constructions

19 Expanders as extractors. Let G be an explicit expander of constant degree on the vertex set {0,1}^m, let r_1 r_2 … r_D be a random walk on G (using n = m + O(D) random bits), and let B ⊆ {0,1}^m have density |B|/2^m = δ. Thm [AKS, G]: Pr[ | |{r_1, r_2, …, r_D} ∩ B| / D − δ | > ε ] < exp(−ε^2 D). Thm [Z]: For D = cm = 2^d, Ext_i(r_1 r_2 … r_D) = r_i is a (k = 0.9n, ε)-extractor with seed length d = O(log n); the approach breaks down for k < n/2. A sketch follows below.
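A sketch of this [Z]-style walk extractor (the expander is an assumed input given by its neighbour function, and the parsing of the weak-source bits into a start vertex and step labels is simplified):

```python
def walk_extractor(neighbor, start_vertex, step_labels, i):
    """Ext_i(r_1 ... r_D) = r_i, where r_1 = start_vertex and
    r_{t+1} = neighbor(r_t, step_labels[t]) is a walk on a constant-degree
    explicit expander G on {0,1}^m.  The weak source supplies the start
    vertex (m bits) and the step labels (O(D) bits); the seed is i."""
    walk = [start_vertex]
    for label in step_labels:
        walk.append(neighbor(walk[-1], label))
    return walk[i]
```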

20 Condensers [RR99, RSW00, TUZ01]. Con maps a k-source X of length n, using a d-bit random seed, to a 0.9k-source of length k. Thm: It is sufficient to construct such condensers: from here we can use the [Z] extractor, since the condensed output already has high entropy rate.

21 Mergers [T96]. The input is X = X_1 X_2 … X_s, a concatenation of s blocks of k bits each (n = ks). Some block X_i is random; the other X_j are correlated arbitrarily with it. Using a d-bit random seed, Mer outputs a high-entropy distribution (a 0.9k-source). Thm: It is sufficient to construct mergers!

22 Mergers. X = X_1 X_2 … X_s with n = ks; view each block X_i as an element of F_q^k, with q ≈ n^100. Some X_i is random. [LRVW]: Mer = a_1 X_1 + a_2 X_2 + … + a_s X_s with random a_i ∈ F_q (seed d = s log q); Mer is a random element of the subspace spanned by the X_i's. [D]: It works (proof of the Wolff conjecture), but d is large! [DW]: Mer = a_1(y) X_1 + a_2(y) X_2 + … + a_s(y) X_s for a random y ∈ F_q (seed d = log q); Mer is a random point on the curve through the X_i's (a concrete sketch follows below).
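A concrete sketch of the [DW] curve merger over a small prime field (the prime Q = 101 and the interpolation nodes 0, …, s−1 are my illustrative choices; in the talk q ≈ n^100):

```python
Q = 101  # small prime stand-in for q ~ n^100

def lagrange_coeff(i, nodes, y, q=Q):
    """a_i(y) = prod_{j != i} (y - node_j) / (node_i - node_j) over F_q."""
    num, den = 1, 1
    for j, nj in enumerate(nodes):
        if j != i:
            num = num * (y - nj) % q
            den = den * (nodes[i] - nj) % q
    return num * pow(den, q - 2, q) % q          # division via Fermat inverse

def curve_merger(blocks, y, q=Q):
    """Mer(X_1..X_s; y) = sum_i a_i(y) X_i: the point at parameter y on the
    degree-(s-1) curve through the blocks X_i in (F_q)^k (X_i sits at node i)."""
    s, k = len(blocks), len(blocks[0])
    nodes = list(range(s))
    coeffs = [lagrange_coeff(i, nodes, y, q) for i in range(s)]
    return [sum(c * b[t] for c, b in zip(coeffs, blocks)) % q for t in range(k)]

# At y = node i the merger returns X_i exactly, so the curve passes through
# every block; a uniformly random y gives a random point on that curve.
blocks = [[3, 7, 1], [10, 0, 5], [2, 2, 2]]      # s = 3 blocks in (F_101)^3
assert curve_merger(blocks, 1) == blocks[1]
```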

23 The proof. Write x = (x_1, x_2, …, x_s) and let C(x) be the degree-(s−1) curve in (F_q)^k through x_1, …, x_s, so Mer(x) is a random point of C(x). Assume, for contradiction, that the output lands in a small set B too often: E_x[ |C(x) ∩ B| / q ] > 2ε. Since B is small, there is a nonzero polynomial Q : (F_q)^k → F_q with Q(B) ≡ 0 and deg Q < εq/s. Then Pr_x[ |C(x) ∩ B| / q > ε ] > ε, which gives Pr_x[ Q∘C(x) ≡ 0 ] > ε: the restriction Q∘C(x) is a univariate polynomial of degree ≤ (s−1)·deg Q < εq, so having more than εq zeros forces it to vanish identically. Since deg C = s−1 and Q vanishes on the whole curve, in particular Q(x_i) = 0, so Pr[ Q(x_i) = 0 ] > ε; but x_i is uniformly random and deg Q < εq/s, so this forces Q ≡ 0, a contradiction.

24 Open Problems.
–Find explicit extractors with seed length d = 1·log n + O(1) and output length m = 1·k − O(1).
–Expand the connections to Ergodic Theory and Number Theory.
–Find an explicit bipartite graph of constant degree between [N^3] and [N^2] such that every set X of N vertices on the [N^3] side has |Γ(X)| ≥ N.

25 Extractors - Parameters. Ext maps a k-source of length n, with a d-bit truly random seed ("short"), to m bits ε-close to uniform. Goals: minimize d and ε, maximize m. Non-constructive & optimal [Sip88, NZ93, RT97]: –Seed length d = log(n−k) + 2 log(1/ε) + O(1). –Output length m = k + d − 2 log(1/ε) − O(1).
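For reference, a tiny helper (mine, with the O(1) terms dropped) that evaluates these optimal bounds for given n, k, ε:

```python
import math

def optimal_params(n, k, eps):
    """Non-constructive optimal bounds [Sip88, NZ93, RT97], O(1) terms dropped:
    returns (seed length d, output length m)."""
    d = math.log2(n - k) + 2 * math.log2(1 / eps)
    m = k + d - 2 * math.log2(1 / eps)
    return d, m

print(optimal_params(n=1_000_000, k=1000, eps=0.01))  # d ~ 33.2, m ~ 1019.9
```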




