Computational Entropy
Salil Vadhan, Harvard University (on sabbatical at MSR-SVC and Stanford)
Joint works with Iftach Haitner (Tel Aviv), Thomas Holenstein (ETH Zurich), Omer Reingold (MSR-SVC), Hoeteck Wee (George Washington U.), and Colin Jia Zheng (Harvard)
How I came to work on Pseudorandomness
Fall 93: CS226r "Efficient Algorithms," Prof. M. Rabin
– "The R word plays a role"
– I am amazed by the power of randomness.
Spring 95: S. Rudich tells me about evidence that P = BPP.
– I am skeptical.
⇒ I learn about pseudorandom generators & derandomization.
– I need to work on this too!
One-Way Functions [DH76]
Candidate: f(x,y) = x · y
[Diagram: x → f(x) is easy to compute; recovering x from f(x) is hard]
Formally, a OWF is f : {0,1}^n → {0,1}^n s.t.
– f is poly-time computable
– ∀ poly-time A: Pr[A(f(X)) ∈ f⁻¹(f(X))] = 1/n^{ω(1)} for X ← {0,1}^n
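As a toy illustration (not part of the original slides), the sketch below contrasts the cost of computing the multiplication candidate f(x,y) = x·y with the cost of a naive inverter. All helper names are mine; for the candidate to be plausibly one-way, x and y are usually taken to be random primes of equal bit length, and real attacks would use factoring algorithms far smarter than trial division. Parameters are kept tiny so the brute-force step finishes.

```python
# Toy sketch of the "easy forward / hard backward" asymmetry of the
# multiplication candidate f(x, y) = x * y.  Illustrative only.
import random

def is_prime(m: int) -> bool:
    """Naive primality test; fine for the tiny parameters used here."""
    if m < 2:
        return False
    d = 2
    while d * d <= m:
        if m % d == 0:
            return False
        d += 1
    return True

def random_prime(bits: int) -> int:
    while True:
        candidate = random.getrandbits(bits) | (1 << (bits - 1)) | 1
        if is_prime(candidate):
            return candidate

def f(x: int, y: int) -> int:
    return x * y                      # easy direction: one multiplication

def invert_by_trial_division(z: int):
    """Brute-force inverter: time grows exponentially in the bit length."""
    d = 2
    while d * d <= z:
        if z % d == 0:
            return d, z // d
        d += 1
    return 1, z

x, y = random_prime(20), random_prime(20)
z = f(x, y)                           # instantaneous
print("f(x, y) =", z)
print("recovered:", invert_by_trial_division(z))  # already noticeably slower
```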
OWFs & Cryptography
[Diagram: one-way functions at the base, yielding pseudorandom generators, target-collision-resistant hash functions (UOWHFs), and statistically hiding commitments; these in turn yield pseudorandom functions, statistically binding commitments, zero-knowledge proofs, statistical ZK arguments, private-key encryption, MACs, digital signatures, and ultimately secure protocols & applications. Citations on the arrows: [HILL90], [R90], [HNORV07], [GGM86], [N89], [GMW86], [NY89], [BCC86]]
Computational Entropy [Y82,HILL90,BSW03]
Question: How can we use the "raw hardness" of a OWF to build useful crypto primitives?
Answer (today's talk): Every crypto primitive amounts to some form of "computational entropy", and one-way functions already have a little bit of "computational entropy".
Entropy
Def: The Shannon entropy of a r.v. X is H(X) = E_{x←X}[log(1/Pr[X=x])]
H(X) = "bits of randomness in X (on average)"
0 ≤ H(X) ≤ log |Supp(X)|, with H(X) = 0 iff X is concentrated on a single point, and H(X) = log |Supp(X)| iff X is uniform on Supp(X)
Conditional entropy: H(X|Y) = E_{y←Y}[H(X|Y=y)]
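To make the definition concrete, here is a minimal Python sketch (mine, not from the slides) that evaluates H(X) for a few explicit distributions and checks the two endpoints of the bound above.

```python
# Minimal sketch: Shannon entropy H(X) = E_{x<-X}[log2(1/Pr[X=x])] for an
# explicitly given distribution, plus the bounds 0 <= H(X) <= log2|Supp(X)|.
from math import log2

def shannon_entropy(dist: dict) -> float:
    """dist maps outcomes to probabilities (assumed to sum to 1)."""
    return sum(p * log2(1 / p) for p in dist.values() if p > 0)

point_mass = {"a": 1.0}                       # concentrated on a single point
uniform4   = {x: 0.25 for x in "abcd"}        # uniform on a support of size 4
biased     = {"a": 0.5, "b": 0.25, "c": 0.25}

print(shannon_entropy(point_mass))  # 0.0  (left end of the bound)
print(shannon_entropy(uniform4))    # 2.0  = log2(4) (right end of the bound)
print(shannon_entropy(biased))      # 1.5, strictly between 0 and log2(3)
```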
Conditional Entropy
H(X|Y) = E_{y←Y}[H(X|Y=y)]
Chain Rule: H(X,Y) = H(Y) + H(X|Y)
H(X) − H(Y) ≤ H(X|Y) ≤ H(X)
H(X|Y) = 0 iff ∃ f such that X = f(Y).
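A small numeric check of the chain rule, on an arbitrary toy joint distribution of my choosing:

```python
# Sketch: conditional entropy H(X|Y) = E_{y<-Y}[H(X | Y=y)] and a numeric
# check of the chain rule H(X,Y) = H(Y) + H(X|Y) on a small joint distribution.
from math import log2, isclose

# joint: Pr[X=x, Y=y], chosen arbitrarily for illustration
joint = {("0", "0"): 0.25, ("1", "0"): 0.25, ("0", "1"): 0.5}

def H(dist):
    return sum(p * log2(1 / p) for p in dist.values() if p > 0)

def marginal_Y(joint):
    out = {}
    for (x, y), p in joint.items():
        out[y] = out.get(y, 0.0) + p
    return out

def H_X_given_Y(joint):
    pY = marginal_Y(joint)
    total = 0.0
    for y, py in pY.items():
        cond = {x: p / py for (x, yy), p in joint.items() if yy == y}
        total += py * H(cond)
    return total

assert isclose(H(joint), H(marginal_Y(joint)) + H_X_given_Y(joint))
print("H(X,Y) =", H(joint), "= H(Y) + H(X|Y)")
```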
Worst-Case Entropy Measures
Min-Entropy: H_∞(X) = min_x log(1/Pr[X=x])
Max-Entropy: H_0(X) = log |Supp(X)|
H_∞(X) ≤ H(X) ≤ H_0(X)
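The same kind of sketch for the worst-case measures, verifying the sandwich H_∞(X) ≤ H(X) ≤ H_0(X) on a toy distribution:

```python
# Sketch: worst-case entropy measures for an explicit distribution, and the
# sandwich H_inf(X) <= H(X) <= H_0(X).
from math import log2

def shannon_entropy(dist):
    return sum(p * log2(1 / p) for p in dist.values() if p > 0)

def min_entropy(dist):          # H_inf(X) = min_x log2(1/Pr[X=x])
    return min(log2(1 / p) for p in dist.values() if p > 0)

def max_entropy(dist):          # H_0(X) = log2 |Supp(X)|
    return log2(sum(1 for p in dist.values() if p > 0))

X = {"a": 0.5, "b": 0.25, "c": 0.125, "d": 0.125}
print(min_entropy(X), shannon_entropy(X), max_entropy(X))
# 1.0 <= 1.75 <= 2.0, as promised by H_inf(X) <= H(X) <= H_0(X)
```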
Computational Entropy
A poly-time algorithm may "perceive" the entropy of X to be very different from H(X).
Example: a pseudorandom generator G : {0,1}^m → {0,1}^n with m < n
– G(U_m) is computationally indistinguishable from U_n
– But H(G(U_m)) ≤ m.
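The bound H(G(U_m)) ≤ m is just counting: G has at most 2^m possible outputs. The sketch below (mine; the truncated-SHA-256 "generator" is only a stand-in, not a proven PRG) enumerates all seeds at a tiny parameter to make the counting visible.

```python
# Sketch: for ANY function G: {0,1}^m -> {0,1}^n, the output distribution
# G(U_m) has at most 2^m distinct values, so H(G(U_m)) <= H_0(G(U_m)) <= m,
# no matter how large n is.  The toy G below is NOT a proven PRG; it is only
# used to illustrate the counting bound.
import hashlib
from math import log2
from itertools import product

m, n = 8, 32  # tiny seed length so we can enumerate all 2^m seeds

def G(seed_bits: str) -> str:
    digest = hashlib.sha256(seed_bits.encode()).digest()
    return "".join(format(b, "08b") for b in digest)[:n]

outputs = {G("".join(bits)) for bits in product("01", repeat=m)}
print("distinct outputs:", len(outputs), "<= 2^m =", 2 ** m)
print("so H(G(U_m)) <= H_0(G(U_m)) =", log2(len(outputs)), "<= m =", m)
```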
Pseudoentropy
Def [HILL90]: X has pseudoentropy ≥ k iff there exists a random variable Y s.t.
1. Y ≈_c X (computationally indistinguishable)
2. H(Y) ≥ k
Interesting when k > H(X), i.e. pseudoentropy > real entropy.
OWFs & Cryptography
[Same diagram as before, now with "pseudoentropy" highlighted as the notion underlying the one-way functions ⇒ pseudorandom generators step [HILL90]. Other citations: [R90], [HNORV07], [GGM86], [N89], [GMW86], [NY89], [BCC86]]
Application of Pseudoentropy
Thm [HILL90]: ∃ OWF ⇒ ∃ PRG
Proof idea:
OWF
⇒ (to discuss) X with pseudoentropy ≥ H(X) + 1/poly(n)
⇒ (repetitions; see the sketch below) X with pseudo-min-entropy ≥ H_0(X) + poly(n)
⇒ (hashing) PRG
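As a purely information-theoretic illustration of the "repetitions" step, the sketch below shows the standard flattening phenomenon: over t i.i.d. copies of X, the per-sample surprisal concentrates around H(X), which is what lets Shannon-type entropy be treated like min-entropy after taking many copies. The actual HILL argument carries this out for pseudoentropy, not just real entropy; this code is only the information-theoretic analogue.

```python
# Sketch of the flattening idea behind the "repetitions" step: for t i.i.d.
# copies of X, the quantity -log2 Pr[X^t = x^t] is a sum of i.i.d. terms with
# mean H(X), so it concentrates around t*H(X).
import random
from math import log2

X = {"a": 0.5, "b": 0.25, "c": 0.25}          # H(X) = 1.5 bits
H_X = sum(p * log2(1 / p) for p in X.values())

def sample():
    return random.choices(list(X), weights=list(X.values()))[0]

t = 2000
for _ in range(5):
    surprisal = sum(log2(1 / X[sample()]) for _ in range(t))
    print(f"-log2 Pr[X^t = x^t] / t = {surprisal / t:.3f}  (H(X) = {H_X})")
```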
Pseudoentropy from OWF
Intuition: for a 1-1 OWF f and X ← {0,1}^n:
– H(X|f(X)) = 0, but
– ∀ poly-time A: Pr[A(f(X)) = X] = 1/n^{ω(1)} = 1/2^{ω(log n)}
⇒ X has "unpredictability entropy" ω(log n) given f(X) [HLR07]
Challenges:
– How to turn "unpredictability" into "pseudoentropy"?
– When f is not 1-1, unpredictability can be trivial.
Pseudoentropy from OWF
Thm [HILL90]: W = (f(X), H, H(X)_1, …, H(X)_J) has pseudoentropy ≥ H(W) + ω(log n)/n, where H : {0,1}^n → {0,1}^n is a certain kind of hash function, X ← {0,1}^n, J ← {1,…,n}.
Thm [HRV10,VZ11]: (f(X), X_1, …, X_n) has "next-bit pseudoentropy" ≥ n + ω(log n).
– No hashing!
– Total amount of pseudoentropy is known & > n.
– Get the full ω(log n) bits of pseudoentropy.
Next-Bit Pseudoentropy
Thm [HRV10,VZ11]: (f(X), X_1, …, X_n) has "next-bit pseudoentropy" ≥ n + ω(log n).
Note: (f(X), X) is easily distinguishable from every random variable of entropy > n.
Next-bit pseudoentropy: ∃ (Y_1, …, Y_n) s.t.
– (f(X), X_1, …, X_i) ≈_c (f(X), X_1, …, X_{i−1}, Y_i)
– H(f(X)) + Σ_i H(Y_i | f(X), X_1, …, X_{i−1}) = n + ω(log n).
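The "Note" above has a one-line distinguisher behind it; the sketch below (with a hash function standing in for f, an assumption of mine) simply checks consistency of the pair, which is why the pseudoentropy in the theorem must be measured bit-by-bit rather than for the whole tuple at once.

```python
# Sketch: the pair (f(X), X) is easy to distinguish from any random variable
# (W, Z) of higher entropy, because a distinguisher can check W == f(Z).
# f below is a toy stand-in, not an actual one-way function.
import os, hashlib

def f(x: bytes) -> bytes:
    return hashlib.sha256(x).digest()

def consistency_distinguisher(w: bytes, z: bytes) -> bool:
    """Accepts iff the sample could have come from (f(X), X)."""
    return w == f(z)

x = os.urandom(16)
print(consistency_distinguisher(f(x), x))                         # True on (f(X), X)
print(consistency_distinguisher(os.urandom(32), os.urandom(16)))  # almost surely False
```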
Consequences
Simpler and more efficient construction of pseudorandom generators from one-way functions.
[HILL90,H06]: OWF f of input length n ⇒ PRG G of seed length O(n^8).
[HRV10,VZ11]: OWF f of input length n ⇒ PRG G of seed length O(n^3).
Unpredictability ⇒ Pseudoentropy [previous work]
Assume: Z has unpredictability entropy ≥ k given Y (i.e. ∀ poly-time A: Pr[A(Y) = Z] ≤ 2^{−k}).
[Y82]: If k ≈ |Z|, then (Y,Z) ≈_c (Y,U_k).
[GL89,HLR07]: (Y, H, H(Z)) ≈_c (Y, H, U_{k−ω(log n)}).
[I95,H05]: If |Z| = 1, then ∃ Z' of "average min-entropy" [DORS04] k given Y s.t. (Y,Z) ≈_c (Y,Z').
How to cope with the information-theoretic unpredictability of X given f(X)?
[HILL90]: Use the Leftover Hash Lemma to analyze a prefix of (f(X), H, H(X)_1, …, H(X)_J).
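For concreteness, here is a sketch of the kind of hash appearing in the [GL89,HLR07] item: a random linear map over GF(2), whose output bits are Goldreich–Levin style inner products. The code only shows how such a hash is sampled and applied; the indistinguishability of (Y, H, H(Z)) from (Y, H, uniform) when Z is hard to predict is what the cited theorems provide.

```python
# Sketch: a random linear ("inner product over GF(2)") hash H applied to Z.
# Syntactic illustration only; no hardness is demonstrated here.
import random

def sample_linear_hash(in_bits: int, out_bits: int):
    """A random out_bits x in_bits binary matrix, one row per output bit."""
    return [[random.getrandbits(1) for _ in range(in_bits)] for _ in range(out_bits)]

def apply_hash(H, z_bits):
    """Each output bit is an inner product <row, z> mod 2 (a GL-style bit)."""
    return [sum(r * z for r, z in zip(row, z_bits)) % 2 for row in H]

z = [random.getrandbits(1) for _ in range(16)]   # the "hard to predict" value Z
H = sample_linear_hash(in_bits=16, out_bits=4)   # extract roughly k - omega(log n) bits
print(apply_hash(H, z))
```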
Pseudoentropy ⇔ Hardness of Sampling
Thm [VZ11]: Let (Y,Z) ∈ {0,1}^n × {0,1}^{O(log n)}. Then:
Z has pseudoentropy ≥ H(Z|Y) + k given Y
⇔
There is no probabilistic poly-time A s.t. D((Y,Z) ‖ (Y,A(Y))) ≤ k.
[D = Kullback-Leibler divergence]
Proof: variant of the proofs of Impagliazzo's Hardcore Lemma [I95,N95,H05,BHK09].
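To pin down the quantity D in the theorem, here is a minimal sketch of KL divergence (computed in bits, on arbitrary toy distributions of my choosing), viewing the two arguments as the real joint distribution (Y,Z) and the "sampled" distribution (Y, A(Y)).

```python
# Sketch: Kullback-Leibler divergence D(P || Q) = sum_w P(w) * log2(P(w)/Q(w))
# between two explicit distributions over pairs (y, z).
from math import log2

def kl_divergence(P: dict, Q: dict) -> float:
    """Assumes supp(P) is contained in supp(Q); otherwise D is infinite."""
    return sum(p * log2(p / Q[w]) for w, p in P.items() if p > 0)

real    = {("y0", "z0"): 0.4, ("y0", "z1"): 0.1, ("y1", "z0"): 0.1, ("y1", "z1"): 0.4}
sampled = {("y0", "z0"): 0.25, ("y0", "z1"): 0.25, ("y1", "z0"): 0.25, ("y1", "z1"): 0.25}
print("D(real || sampled) =", kl_divergence(real, sampled), "bits")
```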
Pseudoentropy ⇔ Hardness of Sampling
Thm [VZ11]: Let (Y,Z) ∈ {0,1}^n × {0,1}^{O(log n)}. Then:
Z has pseudoentropy ≥ H(Z|Y) + k given Y
⇔
There is no probabilistic poly-time A s.t. D((Y,Z) ‖ (Y,A(Y))) ≤ k.
Cor [HRV10,VZ11]: (f(X), X_1, …, X_n) has next-bit pseudoentropy n + ω(log n).
OWFs & Cryptography
[Same diagram as before, with the one-way functions ⇒ pseudorandom generators step now credited to [HRV10, VZ11] and labeled "next-bit pseudoentropy". Other citations: [R90], [HNORV07], [GGM86], [N89], [GMW86], [NY89], [BCC86]]
OWFs & Cryptography
[Same diagram, now also with "inaccessible entropy" [HRVW09, HHRVW10] highlighted as the notion underlying the constructions of UOWHFs and statistically hiding commitments from one-way functions, alongside "next-bit pseudoentropy" [HRV10, VZ11] for pseudorandom generators. Other citations: [GGM86], [N89], [GMW86], [NY89], [BCC86]]
Inaccessible Entropy [HRVW09,HHRVW10]
Example: if h : {0,1}^n → {0,1}^{n−k} is collision-resistant and X ← {0,1}^n, then
– H(X|h(X)) ≥ k, but
– to an efficient algorithm, once it produces h(X), X is determined ⇒ "accessible entropy" 0.
– Accessible entropy ≪ real entropy!
Thm [HRVW09]: f a OWF ⇒ (f(X)_1, …, f(X)_n, X) has accessible entropy n − ω(log n).
– Cf. (f(X), X_1, …, X_n) has pseudoentropy n + ω(log n).
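The "real entropy" half of the example is elementary counting: a map that compresses by k bits leaves about k bits of entropy in X given h(X), since each hash value has roughly 2^k preimages. The toy sketch below (with a deliberately non-cryptographic h of my choosing) verifies this; collision resistance is what makes that leftover entropy inaccessible to efficient algorithms.

```python
# Sketch of the "real entropy" side: for a compressing map
# h: {0,1}^n -> {0,1}^(n-k), a uniformly random X satisfies H(X | h(X)) >= k.
# The toy h below just drops k bits and is of course not collision-resistant.
from math import log2
from collections import Counter
from itertools import product

n, k = 12, 4

def h(x_bits: str) -> str:
    return x_bits[: n - k]          # toy compressing map: drop the last k bits

counts = Counter(h("".join(b)) for b in product("01", repeat=n))
avg_cond_entropy = sum(c / 2 ** n * log2(c) for c in counts.values())
print("H(X | h(X)) =", avg_cond_entropy, ">= k =", k)
```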
Conclusion
Complexity-based cryptography is possible because of gaps between real & computational entropy:
– "Secrecy": pseudoentropy > real entropy
– "Unforgeability": accessible entropy < real entropy
Research Directions
– Formally unify inaccessible entropy and pseudoentropy.
– OWF f : {0,1}^n → {0,1}^n ⇒ pseudorandom generators of seed length O(n)?
– More applications of inaccessible entropy in crypto or complexity.