
1 Deterministic Extractors for Small Space Sources Jesse Kamp, Anup Rao, Salil Vadhan, David Zuckerman

2 Randomness Extractors Defn: min-entropy(X) ≥ k if for all x, Pr[X = x] ≤ 2^{-k}. No "deterministic" (seedless) extractor exists for all X with min-entropy k, so either: 1. Add a seed. 2. Restrict the class of sources X. [Figure: Ext maps a weak source X to a nearly uniform output.]
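
As a concrete illustration of the definition (a minimal Python sketch of my own, not from the slides), min-entropy is determined by the single most likely outcome:

```python
import math

def min_entropy(probs):
    """Min-entropy of a finite distribution given as a list of probabilities:
    H_min(X) = -log2(max_x Pr[X = x]), so H_min(X) >= k exactly when every
    outcome has probability at most 2**-k."""
    return -math.log2(max(probs))

# A 3-bit source that puts probability 1/4 on one string and spreads the rest
# uniformly over the other 7 strings: the largest point probability is 1/4,
# so the min-entropy is exactly 2, even though there are 8 possible outcomes.
probs = [0.25] + [0.75 / 7] * 7
print(min_entropy(probs))  # -> 2.0
```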

3 Independent Sources [Figure: Ext takes several independent weak sources and outputs nearly uniform bits.]

4 Bit-Fixing Sources [Figure: a string such as ? 1 ? ? 0 1, where some bits are fixed and the ? bits are uniform and independent, fed into Ext.]

5 Small Space Sources Space s source: a min-entropy k source generated by a width 2^s branching program. [Figure: a branching program with n+1 layers, each of width 2^s; every edge is labeled with a transition probability and an output bit, e.g. "0.8, 1".]
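
A toy Python sampler for this picture (my own illustration; the layer format and the example program are made up), where each layer maps the current state to a distribution over (output bit, next state):

```python
import random

def sample_space_s_source(layers, start_state=0):
    """Sample an n-bit string from a toy space-s source.

    `layers` is a list of n dicts: layers[i][state] is a list of
    (probability, output_bit, next_state) triples describing the edges
    leaving `state` in layer i.  The width, i.e. the number of distinct
    states per layer, is at most 2**s.
    """
    state, bits = start_state, []
    for layer in layers:
        edges = layer[state]
        weights = [p for p, _, _ in edges]
        _, bit, state = random.choices(edges, weights=weights, k=1)[0]
        bits.append(bit)
    return bits

# Width-2 (s = 1) example: state 0 emits a fair coin; after the first 1 the
# program moves to state 1 and outputs 1 forever -- a correlated 4-bit source.
layer = {0: [(0.5, 0, 0), (0.5, 1, 1)], 1: [(1.0, 1, 1)]}
print(sample_space_s_source([layer] * 4))
```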

6 Related Work [Blum]: Markov chains with a constant number of states. [Koenig, Maurer]: a related model. [Trevisan, Vadhan]: sources sampled by small circuits; requires complexity-theoretic assumptions.

7 Small space sources capture: Bit-fixing sources → space 0 sources. General sources with min-entropy k → space k sources. c independent sources → space n/c sources.

8 Bit-Fixing Sources can be modelled by Space 0 sources [Figure: the bit-fixing source ? 1 ? ? 0 1 as a width-1 branching program: each free bit has two edges "0.5, 0" and "0.5, 1", and each fixed bit has a single edge with probability 1.]

9 General Sources are Space n sources [Figure: a branching program with n layers and width 2^n; the edges leaving the start state are labeled Pr[X_1 = 0], 0 and Pr[X_1 = 1], 1, the next layer's edges are labeled Pr[X_2 = 0 | X_1 = 1], 0 and Pr[X_2 = 1 | X_1 = 1], 1, and so on, so that X = X_1 X_2 X_3 X_4 X_5 ...] Min-entropy k sources are convex combinations of space k sources.
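
To see why width 2^n always suffices, here is a small Python sketch (my own illustration, with a made-up toy distribution): sample the bits one at a time, taking the state to be the prefix produced so far.

```python
import random

def sample_bit_by_bit(p):
    """Sample from an arbitrary distribution p over n-bit strings one bit at
    a time; the 'state' is the prefix sampled so far, so this is exactly the
    width-2**n branching program of the slide."""
    def mass(q):                     # total probability of strings extending q
        return sum(pr for x, pr in p.items() if x.startswith(q))

    n = len(next(iter(p)))
    prefix = ""
    for _ in range(n):
        p1 = mass(prefix + "1") / mass(prefix)   # Pr[next bit = 1 | prefix]
        prefix += "1" if random.random() < p1 else "0"
    return prefix

# A toy 3-bit source that is far from any product distribution.
p = {"000": 0.4, "111": 0.4, "101": 0.2}
print(sample_bit_by_bit(p))
```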

10 c Independent Sources: Space n/c sources. [Figure: c independent blocks of n/c bits each, generated by a branching program of width 2^{n/c}.]

11 Our Main Results
Min-Entropy | Space | Error | Output Bits
k = n^{1-c} | n^{1-4c} | 2^{-n^c} | 99% of k
k = δn | γn | 2^{-n/polylog(n)} | 99% of k
(c = sufficiently small constant > 0; γ = γ(δ) > 0)

12 Outline Our Techniques: extractor for linear min-entropy rate; extractor for polynomial min-entropy rate. Future Directions.

13 We reduce to another model: total-entropy k independent sources, i.e. several independent sources whose min-entropies sum to at least k.

14 The Reduction [Figure: the branching program is cut at an intermediate layer; X is the part of the output before the cut, Y the part after, and we condition on State_5 = v.] The two distributions X | State_5 = v and Y | State_5 = v are independent! We expect the total min-entropy of X | State_5 = v, Y | State_5 = v to be about k - s.

15 Can get many independent sources [Figure: cutting the branching program at several intermediate layers splits the output into blocks W, X, Y, Z that are independent once we condition on the states at those layers.] If we condition on t states, we expect to lose ts bits of min-entropy.

16 Entropy Loss Let S_1, …, S_t denote the random variables for the states in the t conditioning layers. Pr[X = x] ≥ Pr[X = x | S_1 = s_1, …, S_t = s_t] · Pr[S_1 = s_1, …, S_t = s_t]. So if X | S_1 = s_1, …, S_t = s_t has min-entropy < k - 2ts, then Pr[S_1 = s_1, …, S_t = s_t] < 2^{-2ts}. Union bound over the at most 2^{ts} possible state sequences: this happens with probability < 2^{-ts}.
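
Written out, the chain of inequalities behind the bound (a sketch using only the facts on this slide: X has min-entropy k and each of the t layers has at most 2^s states):

```latex
% X has min-entropy k, so every point probability is at most 2^{-k}:
2^{-k} \;\ge\; \Pr[X = x]
       \;\ge\; \Pr[X = x \mid S_1 = s_1,\dots,S_t = s_t]\cdot
               \Pr[S_1 = s_1,\dots,S_t = s_t].
% If X \mid S_1 = s_1,\dots,S_t = s_t has min-entropy < k - 2ts, then some x
% has conditional probability > 2^{-(k-2ts)}, and the line above gives
\Pr[S_1 = s_1,\dots,S_t = s_t]
   \;<\; \frac{2^{-k}}{2^{-(k-2ts)}} \;=\; 2^{-2ts}.
% There are at most (2^{s})^{t} = 2^{ts} state sequences, so by a union bound
\Pr[\text{some such bad sequence occurs}] \;<\; 2^{ts}\cdot 2^{-2ts} \;=\; 2^{-ts}.
```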

17 The Reduction Every space s source with min-entropy k is close to a convex combination of sources made up of t independent blocks with total min-entropy k - 2ts. [Figure: independent blocks W, X, Y, Z.]

18 Some Additive Number Theory [Bourgain, Glibichuk, Konyagin] (∀ δ > 0) (∃ integers C = C(δ), c = c(δ)): for every non-trivial additive character ψ of GF(2^p) and all independent sources X_1, …, X_C each with min-entropy δp, |E[ψ(X_1 · X_2 · … · X_C)]| < 2^{-cp}.

19 Vazirani's XOR lemma If Z ∈ GF(2^n) is a random variable with |E[ψ(Z)]| < ε for every nontrivial character ψ, then any m bits of Z are ε·2^{m/2}-close to uniform. So |E[ψ(X_1 · X_2 · … · X_C)]| < 2^{-cp} implies that lsb_m(X_1 · X_2 · … · X_C) is 2^{m/2 - cp}-close to uniform. [Figure: the output is lsb(X_1 X_2 X_3 X_4).]

20 More than an independent-sources extractor Analysis: group the blocks so that (X_1 X_2), (X_3 X_4), (X_5 X_6 X_7), and X_8 are independent sources; the same argument then applies to lsb_m(X_1 X_2 X_3 X_4 X_5 X_6 X_7 X_8), since the product of the groups equals the product of all the blocks. [Figure: eight blocks X_1, …, X_8 feeding into lsb_m of their product.]

21 Small Space Extractor for δn entropy If the source has min-entropy δn, a δ/2 fraction of the blocks must have min-entropy rate δ/2. Take (2/δ)·C(δ/2) blocks ⇒ C(δ/2) blocks have min-entropy rate δ/2. Output lsb of the product of the blocks.
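
A sketch of the averaging argument behind the first claim, assuming (as in the reduction above) that the source is split into B equal-length blocks whose min-entropies sum to at least δn:

```latex
% Suppose fewer than a \delta/2 fraction of the B blocks have
% min-entropy rate at least \delta/2.  Each block has length n/B, so
\sum_{\text{blocks}} H_\infty
   \;<\; \underbrace{\tfrac{\delta}{2}B \cdot \tfrac{n}{B}}_{\text{blocks of rate} \le 1}
   \;+\; \underbrace{B \cdot \tfrac{\delta}{2} \cdot \tfrac{n}{B}}_{\text{blocks of rate} < \delta/2}
   \;=\; \delta n,
% contradicting total min-entropy \ge \delta n.  Hence at least a \delta/2
% fraction of the blocks have rate \ge \delta/2, and among
% (2/\delta)\,C(\delta/2) blocks, at least C(\delta/2) of them do.
```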

22 Result Theorem: (∀ δ > 0) (∃ γ > 0) there is an efficient extractor for min-entropy k ≥ δn and space γn, with output length Ω(n) and error 2^{-Ω(n)}. Can improve to get 99% of the min-entropy out using techniques from [Gabizon, Raz, Shaltiel].

23 For Polynomial Entropy Rate Black boxes: Good condensers: [Barak, Kindler, Shaltiel, Sudakov, Wigderson], [Raz]. Good mergers: [Raz], [Dvir, Raz]. White box: Condensing somewhere-random sources: [Rao].

24 Somewhere Random Source Def [TS96]: has some uniformly random row. [Figure: a matrix with t rows of length r, at least one of which is uniformly random.]

25 Aligned Somewhere High Entropy Sources Def: Two somewhere high-entropy sources are aligned if the same row has high entropy in both sources.

26 Condensers [BKSSW], [Raz], [Z] [Figure: the source is split into three equal blocks A, B, C, viewed as elements of a prime field; the condensed output includes the block AC + B. The entropy rate improves from δ to about 1.1δ while the length drops to 2n/3.]
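
To make the named operation concrete, a tiny Python sketch of the arithmetic only (the toy input and the prime 17 are made up, and this is not the full [BKSSW]/[Raz] condenser, which also specifies which blocks are kept in the output):

```python
def ac_plus_b(x_bits, p):
    """Split an input bit string into three equal blocks A, B, C, interpret
    each block as an element of the prime field GF(p), and return the
    condensed block A*C + B (mod p) named on the slide."""
    m = len(x_bits) // 3
    a, b, c = (int(x_bits[i * m:(i + 1) * m], 2) for i in range(3))
    return (a * c + b) % p

# 12-bit toy input; p = 17 > 2**4, so every 4-bit block is a field element.
print(ac_plus_b("101101001110", p=17))  # -> (11*14 + 4) % 17 = 5
```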

27 Iterating the condenser [Figure: apply the condenser repeatedly; after t iterations the entropy rate is (1.1)^t δ and the length is (2/3)^t n.]
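
A quick back-of-the-envelope check of the trade-off, using only the two factors from the slide (the starting rate 0.01 and the target rate 0.9 are arbitrary choices of mine):

```python
import math

# Each iteration multiplies the entropy rate by ~1.1 and the length by 2/3,
# so roughly log(1/delta)/log(1.1) iterations reach a constant rate, at the
# cost of shrinking the length by (2/3)**t = delta**(log(3/2)/log(1.1)).
delta = 0.01
t = math.ceil(math.log(0.9 / delta) / math.log(1.1))
print(t, (2 / 3) ** t)   # -> 48 iterations, length factor ~3.5e-9
```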

28 Mergers [Raz], [Dvir, Raz] [Figure: a merger takes a somewhere-high-entropy source with C rows, some row of entropy rate δ, and outputs a source in which 99% of the rows have entropy rate 0.9δ.]

29 Condense + Merge [Raz] [Figure: condense, then merge; starting from a row of entropy rate δ, 99% of the rows in the output have entropy rate 1.1δ.]

30 This process maintains alignment [Figure: two aligned somewhere-high-entropy sources are condensed and merged in parallel; the good row goes from rate δ to 1.1δ to (1.1)^2 δ, and it stays aligned across the two sources.]

31 Bottom Line [Figure: after t rounds of condense-and-merge there are about C^t rows and the aligned rows have entropy rate (1.1)^t δ; apply [BGK] to aligned rows X_1, Y_1, Z_1 of the different sources and output lsb(X_1 Y_1 Z_1).]

32 Extracting from SR-sources [Rao] [Figure: Rao's extractor for independent somewhere-random sources, with parameters r and sqrt(r) labeling the matrix dimensions.] We generalize this: arbitrary number of sources.

33 Recap [Figure: the pieces so far: the reduction yields independent blocks W, X, Y, Z; condensing and merging brings aligned rows to entropy rate (1.1)^t; [BGK] applied to aligned rows X_1, Y_1, Z_1 gives lsb(X_1 Y_1 Z_1); and the SR-source extractor is generalized to an arbitrary number of sources.]

34 Solution [Figure: the source has entropy δn, so among the blocks 2 have rate δ/2, or 4 have rate δ/4, and so on; condense and merge for t steps to reach rate (1.1)^t, then apply [BGK] to the aligned rows X_1, Y_1, Z_1 and output lsb(X_1 Y_1 Z_1).]

35 Final [Figure: entropy δn; 2 of the blocks have rate δ/2.] If δ ≥ n^{-0.01}, the number of rows is much smaller than the length of each row.

36 Result Theorem (assuming we can find primes): for a sufficiently small constant c > 0, there is an efficient extractor for min-entropy n^{1-c} and space n^{1-4c}, with output length n^{Ω(1)} and error 2^{-n^{Ω(1)}}. Can improve to get 99% of the min-entropy out using techniques from [Gabizon, Raz, Shaltiel].

37 Future Directions Smaller min-entropy k? Non-explicit: k = O(log n). Our results: k = n^{1-Ω(1)}. Larger space? Non-explicit: Ω(k). Our results: Ω(k) only for k = Ω(n). Other natural models?

38 Questions?

