1
Pseudorandomness from Shrinkage
David Zuckerman, University of Texas at Austin
Joint with Russell Impagliazzo and Raghu Meka
2
Randomness and Computing
Randomness extremely useful in computing:
– Randomized algorithms
– Monte Carlo simulations
– Cryptography
– Distributed computing
Problem: high-quality randomness expensive.
3
What is minimal randomness requirement?
Can we eliminate randomness completely? If not:
– Can we minimize quantity of randomness?
– Can we minimize quality of randomness? What does this mean?
4
What is minimal randomness requirement?
Can we eliminate randomness completely? If not:
– Can we minimize quantity of randomness? → Pseudorandom generator
– Can we minimize quality of randomness? → Randomness extractor
5
Pseudorandom Numbers
Computers rely on pseudorandom generators:
[Diagram: PRG maps a short random string (e.g., 71294) to a long "random-enough" string (e.g., 141592653589793238).]
What does "random enough" mean?
6
Modern Approach to PRGs [Blum-Micali 1982, Yao 1982]
[Diagram: an algorithm Alg behaves ≈ the same on truly random and on pseudorandom input.]
Require PRG to "fool" all efficient algorithms.
7
Which efficient algorithms?
Most functions fool all polynomial-time circuits.
– Construct explicitly?
Poly-time PRG fooling all polynomial-time circuits implies NP ≠ P. So either:
– Make unproven assumption.
– Try to fool interesting subclasses of algorithms.
8
Two Major Challenges
1. Prove circuit lower bounds.
– EXP does not have poly-size circuits.
2. Derandomize algorithms.
Hardness vs. Randomness paradigm:
– (1) implies (2) [Nisan-Wigderson, BFNW, …]
– Almost equivalent [Kabanets-Impagliazzo, …]
9
Pseudorandom Generators
PRG fools class F of functions if |Pr[f(U_n)=1] − Pr[f(PRG(U_d))=1]| ≤ ε.
Cryptography: e.g., F = BPTIME(n^{log n}).
– Equivalent to one-way functions [HILL].
Derandomizing BPP: F = n^c-size circuits.
– Need unproven lower bound assumptions.
What F, d without unproven assumptions?
[Diagram: PRG maps a random seed of length d to a pseudorandom string of length n.]
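To make the fooling condition concrete, here is a minimal Python sketch (not from the talk; the toy generator and the parity test are purely illustrative) that estimates the distinguishing advantage of a test f against a candidate generator G by sampling:

```python
import random

def advantage(f, G, n, d, trials=100_000):
    """Estimate |Pr[f(U_n)=1] - Pr[f(G(U_d))=1]| by sampling.
    f: a {0,1}-valued test on n-bit strings; G: maps d-bit seeds to n-bit strings."""
    hits_true = sum(f([random.randint(0, 1) for _ in range(n)]) for _ in range(trials))
    hits_prg = sum(f(G([random.randint(0, 1) for _ in range(d)])) for _ in range(trials))
    return abs(hits_true - hits_prg) / trials

# Toy illustration: a "generator" that just repeats its seed, caught by a parity test.
G = lambda seed: (seed * (16 // len(seed)))[:16]   # hypothetical, badly non-pseudorandom
f = lambda x: sum(x) % 2                           # parity distinguisher
print(advantage(f, G, n=16, d=4))                  # ≈ 0.5: parity detects the repetition
```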
10
Pseudorandom Generators
PRG fools class F of functions if |Pr[f(U_n)=1] − Pr[f(PRG(U_d))=1]| ≤ ε.
PRG fooling {f | size_M(f) ≤ s} with seed length s^{1/c} implies g in NP with size_M(g) ≳ n^c.
Can we achieve converse: does g in P with size_M(g) ≥ n^c imply PRG with seed of length ≈ s^{1/c}?
Previous work gives nothing in this case.
[Diagram: PRG maps a random seed of length d to a pseudorandom string of length n.]
11
New Results
Construct such near-optimal PRGs if lower bound is proved via "shrinkage."
Obtain following seed lengths to fool size s, error = 1/poly:
– Formulas over {∨,∧,NOT}: s^{1/3+o(1)}
– Formulas over arbitrary basis: s^{1/2+o(1)}
– Read-once formulas over {∨,∧,NOT}: s^{0.234…}
– Branching programs: s^{1/2+o(1)}
12
Previous Work
Seed length (1−α)n fooling read-once formulas and read-once branching programs of width 2^{αn}, for α > 0 a small enough constant [Bogdanov, Papakonstantinou, Wan].
For ROBPs reading bits in known order, seed length O(log² n) [Nisan, …].
13
Random Restrictions
Choose random restriction ρ, fraction p unset.
E[size(f|_ρ)] ≤ p·size(f), where size(formula) = # leaves.
Whp size(f|_ρ) ≤ 2p·size(f).
Holds even if ρ chosen k-wise independently.
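A minimal Python sketch (illustrative only; the function representation is a placeholder, not the paper's formula data structure) of sampling a truly random p-restriction and forming the restricted function f|_ρ:

```python
import random

def random_restriction(n, p):
    """Sample a p-random restriction over x_1..x_n: each coordinate is left
    unset ('*') with probability p, otherwise fixed to a uniform random bit."""
    return ['*' if random.random() < p else random.randint(0, 1) for _ in range(n)]

def restrict(f, rho):
    """Form the restricted function f|_rho, defined on the unset coordinates only."""
    free = [i for i, v in enumerate(rho) if v == '*']
    def g(bits):
        x = list(rho)
        for i, b in zip(free, bits):
            x[i] = b
        return f(x)
    return g, free

# Example: restrict 3-variable majority; in the formula setting the leaf count
# of the restricted formula would shrink accordingly.
maj = lambda x: int(sum(x) >= 2)
g, free = restrict(maj, random_restriction(3, p=0.5))
```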
14
Shrinkage Exponent
Random ρ, fraction p unset.
Shrinkage Γ: E[size(f|_ρ)] = O(p^Γ · s).
Example: Formulas.
– Formulas over arbitrary basis: Γ = 1.
– Formulas over DM = {∨,∧,NOT}: Γ = 2 [Subbotovskaya '61, …, Hastad '93]
– Read-once formulas over DM: Γ = 3.27… [Paterson-Zwick '91, Hastad-Razborov-Yao '95]
General circuits: Γ = 0.
15
Branching Programs
Layered, ordered, read-once BPs are what is needed for PRGs for space-bounded computation.
Size = # edges ≤ 2wn.
Γ = 1: size of shrunken BP proportional to |{unfixed variables}|.
|{layered, ordered ROBPs}| ≤ w^{2wn}.
We consider arbitrary BPs, reading bits in arbitrary order.
[Diagram: a width-w branching program with n+1 layers; edges labeled 0/1 read x_1, x_2, …, ending at acc/rej.]
16
PRGs from Shrinkage
Random ρ, fraction p unset. Shrinkage Γ: E[size(f|_ρ)] = O(p^Γ · s).
Shrinkage Γ ⟹ n^{Γ+1}/polylog(n) lower bounds [Andreev].
Main theorem: high-probability shrinkage Γ w.r.t. pseudorandom restrictions gives PRG with seed length s^{1/(Γ+1)+o(1)}.
Showing shrinkage w.r.t. pseudorandom restrictions is nontrivial when Γ ≠ 1.
17
Outline
– Background on Randomness Extractors
– New Theorem about Old PRG
– New PRG
– Correctness Proof
– Pseudorandom Restrictions
– Conclusions
18
Weak Random Source […, CG '85, Z '90]
Random variable X on {0,1}^r.
General model: min-entropy.
Flat source:
– Uniform on A, |A| ≥ 2^k.
[Diagram: a subset A of size ≥ 2^k inside {0,1}^r.]
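For reference, the standard min-entropy definition the slide alludes to (not verbatim from the deck):

```latex
H_\infty(X) \;=\; \min_{x \in \{0,1\}^r} \log_2 \frac{1}{\Pr[X = x]},
\qquad
H_\infty(X) \ge k \iff \forall x:\ \Pr[X = x] \le 2^{-k}.
```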
19
How Arise in PRGs
Condition on information – e.g., TM configuration.
Uniform X in {0,1}^r, f: {0,1}^r → {0,1}^b.
f regular: H∞(X | f(X) = a) = r − b.
Any f: Pr_{a=f(X')}[H∞(X | f(X) = a) ≥ r − b − Δ] ≥ 1 − 2^{−Δ}.
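A sketch of why the last bound holds (standard argument, not verbatim from the deck):

```latex
% For uniform X on \{0,1\}^r:
\Pr[X = x \mid f(X) = a] \;=\; \frac{2^{-r}}{\Pr[f(X) = a]}.
% Call a "bad" if \Pr[f(X)=a] < 2^{-b-\Delta}; the bad values are hit with total
% probability < 2^b \cdot 2^{-b-\Delta} = 2^{-\Delta}.  For every good a,
H_\infty\bigl(X \mid f(X) = a\bigr) \;\ge\; r - b - \Delta.
```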
20
Goal: Extract Randomness
[Diagram: Ext maps r bits from a weak source to m almost-uniform bits, up to statistical error ε.]
Problem: Impossible, even for k = r−1, m = 1, ε < 1/2.
21
Impossibility Proof
Suppose f: {0,1}^r → {0,1} satisfies: for all sources X with H∞(X) ≥ r−1, f(X) ≈ U.
[Diagram: {0,1}^r partitioned into f^{-1}(0) and f^{-1}(1).]
Take X uniform on f^{-1}(0) (WLOG the larger half, so |f^{-1}(0)| ≥ 2^{r−1}): then H∞(X) ≥ r−1, yet f(X) ≡ 0.
22
Randomness Extractor: short seed [Nisan-Z '93, …, Guruswami-Umans-Vadhan '07]
[Diagram: Ext takes an r-bit weak source X (min-entropy k) plus a d = O(log(r/ε))-bit uniformly random seed Y, and outputs m = .99k bits within statistical error ε.]
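The standard definition behind this diagram, for reference (the parameters m = .99k and d = O(log(r/ε)) are the ones quoted on the slide):

```latex
% Ext : \{0,1\}^r \times \{0,1\}^d \to \{0,1\}^m is a (k,\varepsilon)-extractor if,
% for every X on \{0,1\}^r with H_\infty(X) \ge k and Y uniform on \{0,1\}^d,
\bigl|\Pr[\mathrm{Ext}(X,Y) \in T] - \Pr[U_m \in T]\bigr| \;\le\; \varepsilon
\qquad \text{for every test } T \subseteq \{0,1\}^m.
```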
23
Extractor-Based PRG for Read-Once Branching Programs [Nisan-Z '93]
Basic PRG: G(x, y_1, …, y_t) = Ext(x, y_1) ∘ … ∘ Ext(x, y_t)
Parameters:
– r = |x| = 2√n
– d = |y_i| = O(log n)
– t = m = |Ext(x, y_i)| = √n
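A minimal Python sketch of this data flow. The `ext` stub below is a hypothetical placeholder, not a real extractor; it is only meant to show how one long string x is reused with t short seeds:

```python
import hashlib
import random

def ext(x_bits, y_bits, m):
    """Hypothetical placeholder for a (k, eps)-extractor: hashes (x, y) to m bits.
    Shows the data flow only; it has no extraction guarantee."""
    h = hashlib.sha256(bytes(x_bits) + b'|' + bytes(y_bits)).digest()
    return [(h[i // 8] >> (i % 8)) & 1 for i in range(m)]  # assumes m <= 256

def nz_style_prg(x_bits, y_blocks, m):
    """G(x, y_1, ..., y_t) = Ext(x, y_1) ∘ ... ∘ Ext(x, y_t):
    one long string x is reused with t short fresh seeds y_1..y_t."""
    out = []
    for y in y_blocks:
        out.extend(ext(x_bits, y, m))
    return out

# Toy parameters in the spirit of the slide: t = m blocks, short seeds per block.
n, m = 16, 4
x = [random.randint(0, 1) for _ in range(2 * m)]
ys = [[random.randint(0, 1) for _ in range(8)] for _ in range(n // m)]
print(nz_style_prg(x, ys, m))
```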
24
PRG for Ordered Read-Once BPs
G(x, y_1, …, y_t) = Ext(x, y_1) … Ext(x, y_t)
Condition on node v reached after reading up to Ext(X, Y_{i−1}).
Whp H∞(X | reach v) ≥ |x| − log w − Δ.
Hence (Ext(X, Y_i) | reach v) ≈ uniform.
[Diagram: width-w ROBP with n+1 layers reading z_1, z_2, …; intermediate node v; acc/rej at the end.]
25
New: Same PRG works if bits read in any order
z_1, z_2, …, z_m can appear anywhere.
Still, after fixing all z_i, i > m, the restricted function is a ROBP on z_1, z_2, …, z_m read in the same order as the original ROBP.
[Diagram: width-w BP with n+1 layers reading bits in arbitrary order, e.g. z_41, z_26, …, ending at acc/rej.]
26
New: Same PRG works if bits read in any order
Still, after fixing all z_i, i > m, the restricted function is a ROBP on z_1, z_2, …, z_m read in the same order as the original ROBP.
Information = lg(# restricted functions) = lg(w^{2wm}) = 2wm·lg w.
[Diagram: width-w BP with n+1 layers reading bits in arbitrary order, e.g. z_41, z_26, …, ending at acc/rej.]
27
New: Works if bits read in any order
PRG: G(x, y_1, …, y_t) = Ext(x, y_1) … Ext(x, y_t) = z_1 … z_n.
BP could read in order z_12, z_7, z_8, …
D = distribution of PRG output, U = Unif({0,1}^n).
Suppose |Pr[f(D)=1] − Pr[f(U)=1]| > δ.
Let Z_i = Ext(X, Y_i), U_i = Unif({0,1}^m).
– Z_1 = z_1 z_2 … z_m, Z_2 = z_{m+1} … z_{2m}, …
Bits in Z_i can appear anywhere.
28
New: Works if bits read in any order
PRG: G(x, y_1, …, y_t) = Ext(x, y_1) … Ext(x, y_t).
D = distribution of PRG output, U = Unif({0,1}^n).
Suppose |Pr[f(D)=1] − Pr[f(U)=1]| > δ.
Let Z_i = Ext(X, Y_i), U_i = Unif({0,1}^m).
Hybrid argument. Let D_i = (U_1, …, U_i, Z_{i+1}, …, Z_t). D_0 = D, D_t = U.
Exists i: |Pr[f(D_i)=1] − Pr[f(D_{i−1})=1]| > δ/t.
Changing Z_i = Ext(X, Y_i) to U_i changes Pr[accept].
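The "exists i" step is the standard telescoping bound (a sketch, not verbatim from the deck):

```latex
\delta \;<\; \bigl|\Pr[f(D_0)=1]-\Pr[f(D_t)=1]\bigr|
\;\le\; \sum_{i=1}^{t}\bigl|\Pr[f(D_{i-1})=1]-\Pr[f(D_i)=1]\bigr|,
% so at least one of the t terms exceeds \delta/t.
```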
29
New: Works if bits read in any order
Exists i: |Pr[f(D_i)=1] − Pr[f(D_{i−1})=1]| > δ/t.
Changing Z_i = Ext(X, Y_i) to U_i changes Pr[accept].
Consider ρ = (U_1, …, U_{i−1}, **…*, Z_{i+1}, …, Z_t).
Then g = f|_ρ is a ROBP on m bits.
f(D_{i−1}) = g(Z_i), f(D_i) = g(U_i). Goal: whp g(Z_i) ≈ g(U_i).
Only w^{2wm} possibilities for g.
Whp, H∞(X | G = g) ≥ r − 2mw·log w − Δ.
Whp, conditioned on G = g, Ext(X, Y_i) ≈ U_i.
30
General Branching Programs
Even a PRG for unordered ROBPs is new:
– Our seed length is O(√(wn)·log n)
– Previous was (1−α)n [Bogdanov, Papakonstantinou, Wan]
– Known order: O(log² n) [Nisan, …]
What if not read once?
– Some variables could be read many times.
– Pseudorandomly permute variables before construction.
– Gives seed length size(f)^{1/2+o(1)}.
What about formulas? General reduction?
31
General PRG Construction
Assume we have pseudorandom restrictions which give shrinkage Γ whp.
ρ_1 = 0 1 * 1 1 0 1 1 * 0 0 1 0 * 0 1 0 0 1 1 1
ρ_2 = 0 0 1 0 1 0 * 0 1 1 0 1 * 0 1 1 0 * * 1 0
…
ρ_t = * 0 1 0 1 1 * 1 * 0 0 1 0 0 0 1 * 0 1 1 1
Set t = c(log n)/p so whp all columns have a *.
32
General PRG Construction
ρ_1 = 0 1 * 1 1 0 1 1 * 0 0 1 0 * 0 1 0 0 1 1 1
ρ_2 = 0 0 1 0 1 0 * 0 1 1 0 1 * 0 1 1 0 * * 1 0
…
ρ_t = * 0 1 0 1 1 * 1 * 0 0 1 0 0 0 1 * 0 1 1 1
Choose X, Y_1, …, Y_t randomly.
Replace *'s in the i-th row with Ext(X, Y_i).
PRG output = XOR of resulting strings.
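A minimal Python sketch of this step (the `ext` stub is the same hypothetical placeholder as before, not a real extractor; restrictions are given as rows over {0, 1, '*'}):

```python
import hashlib

def ext(x_bits, y_bits, m):
    """Hypothetical placeholder extractor: hash of (x, y) truncated to m bits."""
    h = hashlib.sha256(bytes(x_bits) + b'|' + bytes(y_bits)).digest()
    return [(h[i // 8] >> (i % 8)) & 1 for i in range(m)]  # assumes m <= 256

def prg_from_restrictions(rhos, x_bits, y_blocks):
    """Fill the *'s of the i-th restriction row with Ext(x, y_i), then XOR the rows.
    rhos: t restrictions, each a length-n list over {0, 1, '*'}."""
    n = len(rhos[0])
    out = [0] * n
    for rho, y in zip(rhos, y_blocks):
        stars = [j for j, v in enumerate(rho) if v == '*']
        fill = ext(x_bits, y, len(stars))
        row = list(rho)
        for j, b in zip(stars, fill):
            row[j] = b
        out = [a ^ b for a, b in zip(out, row)]
    return out
```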
33
Correctness Proof
D = distribution of PRG output, U = uniform.
Suppose |Pr[f(D)=1] − Pr[f(U)=1]| > δ.
Let Z_i = Ext(X, Y_i).
Hybrid argument. Change Z_1, …, Z_i to U_1, …, U_i to get D_i.
D_t ≈ U: whp *'s cover all columns.
Exists i: |Pr[f(D_i)=1] − Pr[f(D_{i−1})=1]| > δ/t.
Changing Z_i to U_i changes Pr[f accepts].
34
Correctness Proof
Exists i: changing Z_i = Ext(X, Y_i) to U_i changes Pr[f accepts].
Fix everything but ρ = ρ_i, Z_i, U_i. Let v = i-th row.
Let f_i(v) = f(v ⊕ w), where w = XOR of all rows except the i-th.
Let g = f_i|_ρ, so g(v|_A) = f_i(v), where A = positions of *'s in ρ.
f(D_{i−1}) = g(Z_i), f(D_i) = g(U_i). Goal: whp g(Z_i) ≈ g(U_i).
E = event that size(g) ≤ s = c·p^Γ·size(f_i). Pr[E] ≥ 1 − ε.
Conditioned on E, g is describable by b ≈ s·log s bits.
Whp, H∞(X | E, G = g) ≥ r − b − Δ.
Whp, conditioned on E and G = g, Ext(X, Y_i) ≈ U_i.
35
Improving the PRG
To get nearly optimal output length for Γ > 1, replace *'s with G_{k-wise}(Ext(X, Y_i)), i.e., expand the extractor output with a k-wise independent generator.
36
Pseudorandom Restrictions
Need pseudorandom restrictions that yield shrinkage.
BPs and formulas over arbitrary basis:
– c·log n-wise independence suffices.
– Deal with heavy variables separately.
Formulas over {∧,∨,NOT}, incl. read-once:
– More work.
– Hastad and Hastad-Razborov-Yao as black boxes.
– They only guarantee shrinkage in expectation for truly random restrictions.
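For illustration, one standard way to sample a k-wise independent restriction (a sketch under my own choices, e.g. a fixed Mersenne prime and two independent polynomials; it is not the paper's construction and ignores the separate treatment of heavy variables):

```python
import random

PRIME = 2_147_483_647  # Mersenne prime 2^31 - 1

def kwise_values(n, k):
    """Evaluations of a random degree-(k-1) polynomial over F_PRIME at 0..n-1:
    any k of the n values are jointly independent and uniform on F_PRIME."""
    coeffs = [random.randrange(PRIME) for _ in range(k)]
    def horner(x):
        acc = 0
        for c in reversed(coeffs):
            acc = (acc * x + c) % PRIME
        return acc
    return [horner(i) for i in range(n)]

def kwise_restriction(n, k, p):
    """k-wise independent restriction: coordinate i is '*' with probability ~p,
    otherwise fixed to a bit, using two independent k-wise families."""
    sel = kwise_values(n, k)  # decides unset vs fixed
    val = kwise_values(n, k)  # supplies the fixed bit
    return ['*' if sel[i] < p * PRIME else (val[i] & 1) for i in range(n)]
```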
37
Proof Idea
Decompose formula: O(n/k) subformulas of size ≤ k = n^{o(1)}.
Use k²-wise independence.
Goal: p ≈ n^{−1/(Γ+1)}. Too small here.
Instead, shrink by q ≈ k^{−0.1} and iterate (rough accounting sketched below).
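The rough accounting behind the iteration, as I read it (constants, polylog factors and failure probabilities omitted; not verbatim from the deck):

```latex
% Compose \ell independent q-restrictions with q \approx k^{-0.1}:
\text{unset fraction after } \ell \text{ rounds: } q^{\ell} \approx p \approx n^{-1/(\Gamma+1)},
\qquad
\text{size after } \ell \text{ rounds: } \bigl(q^{\Gamma}\bigr)^{\ell} s = (q^{\ell})^{\Gamma} s \approx p^{\Gamma} s .
```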
38
Unrestrictable inputs
Many subformulas have inputs that must = *.
Does shrinkage for random restrictions imply shrinkage when some inputs must = *?
Further decomposition: each subformula has ≤ 2 such inputs.
h such inputs increase size by ≤ 2^h:
– For each setting of those variables, have a subformula.
– Combine with selector formula.
39
Read-Once Formulas
Need different trick for read-once formulas.
g small but unlikely to shrink to nothing.
[Diagram: a subformula g with some inputs forced to *.]
40
Dependencies Read-once case: k-wise independence. Read-t case: Consider independent sets in dependency graph on subformulas. General case: tricky dependencies.
41
Conclusions
New, extractor-based PRG based on shrinkage.
Without improving lower bounds, essentially best possible PRGs for:
– Formulas over {∨,∧,NOT}: s^{1/3+o(1)} seed length.
– Formulas over arbitrary basis: s^{1/2+o(1)}
– Read-once formulas over {∨,∧,NOT}: s^{0.234…}
– Branching programs: s^{1/2+o(1)}
42
Open Questions
Better PRGs for unordered ROBPs?
– Can we recurse somehow?
– Subsequent work: Reingold-Steinke-Vadhan give O(log² n) seed for unordered permutation ROBPs.
PRGs from other lower bound techniques?
– Subsequent work: Trevisan-Xue on PRGs for AC0.
Improve lower bounds?
– Our PRG gives an alternate function f with formula-size(f) ≥ n^{3−o(1)}, matching Hastad/Andreev.
– Subsequent: average-case lower bound of n^{3−o(1)} [Komargodski-Raz-Tal] (improving [Komargodski-Raz]).
43
Thank you!