1 Inaccessible Entropy. Iftach Haitner (Microsoft Research), Omer Reingold (Weizmann & Microsoft), Hoeteck Wee (Queens College, CUNY), Salil Vadhan (Harvard University).

2 Outline: Entropy; Secrecy & Pseudoentropy; Unforgeability & Inaccessible Entropy; Applications.

3 Entropy. Def: The Shannon entropy of a r.v. X is H(X) = E_{x←X}[log(1/Pr[X=x])]. Intuitively, H(X) = "bits of randomness in X (on average)". 0 ≤ H(X) ≤ log|Supp(X)|, with H(X) = 0 iff X is concentrated on a single point and H(X) = log|Supp(X)| iff X is uniform on Supp(X). Conditional entropy: H(X|Y) = E_{y←Y}[H(X|Y=y)].

4 Conditional Entropy. H(X|Y) = E_{y←Y}[H(X|Y=y)]. Chain rule: H(X,Y) = H(Y) + H(X|Y). H(X) - H(Y) ≤ H(X|Y) ≤ H(X). H(X|Y) = 0 iff there exists f such that X = f(Y).
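
A quick numerical check of these identities (a minimal sketch; the joint distribution below is an arbitrary toy example, not from the talk):

```python
import math
from collections import defaultdict

# Toy joint distribution over (x, y); any distribution works here.
joint = {("a", 0): 0.25, ("a", 1): 0.25, ("b", 0): 0.40, ("b", 1): 0.10}

def H(dist):
    """Shannon entropy, in bits, of a distribution given as {outcome: prob}."""
    return sum(p * math.log2(1 / p) for p in dist.values() if p > 0)

def marginal(joint, idx):
    m = defaultdict(float)
    for outcome, p in joint.items():
        m[outcome[idx]] += p
    return dict(m)

def cond_entropy(joint):
    """H(X|Y) = E_{y<-Y}[H(X | Y=y)] for a joint distribution over (x, y)."""
    py = marginal(joint, 1)
    total = 0.0
    for y, p_y in py.items():
        cond = {x: p / p_y for (x, yy), p in joint.items() if yy == y}
        total += p_y * H(cond)
    return total

hx, hy, hxy = H(marginal(joint, 0)), H(marginal(joint, 1)), H(joint)
hx_given_y = cond_entropy(joint)
assert abs(hxy - (hy + hx_given_y)) < 1e-9          # chain rule
assert hx - hy - 1e-9 <= hx_given_y <= hx + 1e-9    # H(X)-H(Y) <= H(X|Y) <= H(X)
```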

5 Worst-Case Entropy Measures. Min-entropy: H_∞(X) = min_x log(1/Pr[X=x]). Max-entropy: H_0(X) = log|Supp(X)|. H_∞(X) ≤ H(X) ≤ H_0(X).
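
The same kind of sketch for the worst-case measures, checking H_∞(X) ≤ H(X) ≤ H_0(X) on a toy distribution:

```python
import math

def min_entropy(dist):
    """H_inf(X) = min_x log(1/Pr[X=x]) = -log2 of the largest probability."""
    return -math.log2(max(dist.values()))

def max_entropy(dist):
    """H_0(X) = log2 |Supp(X)|."""
    return math.log2(sum(1 for p in dist.values() if p > 0))

dist = {"a": 0.5, "b": 0.25, "c": 0.25}
shannon = sum(p * math.log2(1 / p) for p in dist.values() if p > 0)
assert min_entropy(dist) <= shannon <= max_entropy(dist)
```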

6 Outline: Entropy; Secrecy & Pseudoentropy; Unforgeability & Inaccessible Entropy; Applications.

7 Perfect Secrecy & Entropy. Def [Sh49]: An encryption scheme (Enc,Dec) has perfect secrecy if for all m, m' ∈ {0,1}^n, Enc_K(m) and Enc_K(m') are identically distributed for a random key K. Thm [Sh49]: perfect secrecy ⇒ |K| ≥ n.

8 Perfect Secrecy ⇒ |K| ≥ n. Proof: Perfect secrecy ⇒ (M, Enc_K(M)) ≡ (M, Enc_K(M')) for M, M' ← {0,1}^n ⇒ H(M | Enc_K(M)) = n. Decryptability ⇒ H(M | Enc_K(M), K) = 0 ⇒ H(M | Enc_K(M)) ≤ H(K). Combining, n ≤ H(K), so the key must be at least n bits long.
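
The one-time pad is the classical scheme meeting this bound with equality; an illustrative sketch (the talk does not discuss a concrete scheme here):

```python
import secrets

def enc(key: bytes, m: bytes) -> bytes:
    """One-time pad: ciphertext = m XOR key. Perfectly secret when the key is
    uniform, used only once, and as long as the message (|K| = n)."""
    assert len(key) == len(m)
    return bytes(k ^ b for k, b in zip(key, m))

def dec(key: bytes, c: bytes) -> bytes:
    return enc(key, c)  # XOR is its own inverse

m = b"attack at dawn"
key = secrets.token_bytes(len(m))   # key exactly as long as the message
c = enc(key, m)
assert dec(key, c) == m
```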

9 Computational Secrecy. Def [GM82]: An encryption scheme (Enc,Dec) has computational secrecy if for all m, m' ∈ {0,1}^n, Enc_K(m) and Enc_K(m') are computationally indistinguishable. ⇒ can have |K| ≪ n.

10 Where Shannon's Proof Breaks. Computational secrecy ⇒ (M, Enc_K(M)) ≈_c (M, Enc_K(M')) for M, M' ← {0,1}^n ⇒ "H_pseudo(M | Enc_K(M))" = n. Decryptability ⇒ H(M | Enc_K(M)) ≤ H(K). Key point: we can have H_pseudo(X) ≫ H(X), e.g. X = G(U_k) for a PRG G : {0,1}^k → {0,1}^n.

11 Pseudoentropy. Def [HILL90]: X has pseudoentropy ≥ k iff there exists a random variable Y s.t. (1) Y ≈_c X and (2) H(Y) ≥ k. Pseudoentropy generator: G takes a seed S ← {0,1}^n and outputs X such that X ≈_c Y for some Y with the required entropy.
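
To make the gap H_pseudo(X) ≫ H(X) concrete, here is a toy seed-stretching sketch; SHAKE-128 is used only as a stand-in for a PRG (an assumption of this illustration, not something claimed in the talk):

```python
import hashlib, secrets

SEED_BITS, OUT_BITS = 128, 1024

def G(seed: bytes) -> bytes:
    """Stand-in PRG stretching SEED_BITS to OUT_BITS via SHAKE-128; treated
    as a PRG here purely for illustration."""
    return hashlib.shake_128(seed).digest(OUT_BITS // 8)

x = G(secrets.token_bytes(SEED_BITS // 8))
# Real entropy of X = G(U_k) is at most the seed length (128 bits), yet if G
# is a PRG then no efficient test tells X from 1024 uniform bits, so X has
# pseudoentropy 1024 >> 128.
print(len(x) * 8, "output bits from", SEED_BITS, "seed bits")
```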

12 Application of Pseudoentropy. Thm [HILL90]: ∃ OWF ⇒ ∃ PRG. Proof outline: OWF ⇒ (hardcore bit [GL89] + hashing) X with pseudoentropy ≥ H(X) + 1/poly(n) ⇒ (repetitions) X with pseudo-min-entropy ≥ H_0(X) + poly(n) ⇒ (hashing) PRG.
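
For the first arrow, the [GL89] hardcore bit is the GF(2) inner product ⟨x, r⟩ for a random r; a minimal sketch (the function f below is a toy placeholder, not an actual one-way function):

```python
import secrets

n = 16

def f(x: int) -> int:
    """Toy placeholder for a one-way function (NOT actually one-way)."""
    return (3 * x + 7) % (2 ** n)

def gl_bit(x: int, r: int) -> int:
    """Goldreich-Levin hardcore predicate: the GF(2) inner product <x, r>."""
    return bin(x & r).count("1") & 1

x, r = secrets.randbits(n), secrets.randbits(n)
# For a genuinely one-way f, predicting gl_bit(x, r) from (f(x), r) noticeably
# better than 1/2 would allow inverting f [GL89]; that one unpredictable bit
# is the seed of the pseudoentropy in the outline above.
sample = (f(x), r, gl_bit(x, r))
print(sample)
```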

13 Outline: Entropy; Secrecy & Pseudoentropy; Unforgeability & Inaccessible Entropy; Applications.

14 Unforgeability. Crypto is not just about secrecy. Unforgeability: security properties saying that it is hard for an adversary to generate "valid" messages: unforgeability of MACs and digital signatures, collision resistance of hash functions, binding of commitment schemes. Cf. decision problems vs. search/sampling problems.

15 Ex: Collision-Resistant Hashing. 𝓕 = {f : {0,1}^n → {0,1}^{n-k}}. Shrinking. Collision resistance: given f ← 𝓕, an efficient A cannot output x_1 ≠ x_2 such that f(x_1) = f(x_2).

16 Ex: Collision-Resistant Hashing. 𝓕 = {f : {0,1}^n → {0,1}^{n-k}}. Consider the generator G that, given F ← 𝓕, samples X ← {0,1}^n and outputs Y = F(X) followed by X. Shrinking: H(X | F, Y) ≥ k. Collision resistance: from (even a cheating) G's point of view, X is determined by (F, Y), so X has "accessible" entropy 0.

17 Ex: Collision-Resistant Hashing. 𝓕 = {f : {0,1}^n → {0,1}^{n-k}}. A cheating G*, given F ← 𝓕, uses coins S_1 ← {0,1}^r to produce Y and then coins S_2 ← {0,1}^r to produce X ∈ F^{-1}(Y). Collision resistance: H(X | F, Y, S_1) = neg(n) for every efficient G*.
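
A toy sketch of the bookkeeping on these two slides, using random GF(2) matrices as the shrinking family 𝓕 (purely illustrative: the talk fixes no concrete family, and linear maps are of course not collision resistant). Every image point of a full-rank F has exactly 2^k preimages, which is the shrinking bound H(X | F, Y) ≥ k for the honest generator:

```python
import secrets

n, k = 12, 4          # F : {0,1}^n -> {0,1}^{n-k}; tiny toy parameters

def rand_matrix(rows, cols):
    """Random GF(2) matrix, represented as a list of row bitmasks."""
    return [secrets.randbits(cols) for _ in range(rows)]

def apply(F, x):
    """Compute F(x) over GF(2): output bit i is the parity of row_i AND x."""
    return sum(((bin(row & x).count("1") & 1) << i) for i, row in enumerate(F))

F = rand_matrix(n - k, n)
X = secrets.randbits(n)   # honest generator's input
Y = apply(F, X)

# If F has full rank, every image point has exactly 2^k preimages, so the
# honest X keeps k bits of (real) entropy given (F, Y); producing a second
# preimage, however, is exactly finding a collision.
preimages = [x for x in range(2 ** n) if apply(F, x) == Y]
print(len(preimages), "preimages of Y; 2^k =", 2 ** k)
```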

18 Measuring Accessible Entropy. Goal: a useful entropy measure capturing the possibility that H_acc(X) ≪ H(X). 1st attempt: X has accessible entropy at most k if there is a random variable Y s.t. (1) Y ≈_c X and (2) H(Y) ≤ k. Not useful! Every X is indistinguishable from some Y of entropy polylog(n).

19 Inaccessible Entropy. Idea: a generator G has inaccessible entropy if H(G's outputs from an observer's perspective) [the real entropy] > H(G*'s outputs from G*'s own perspective) [the accessible entropy].

20 Real Entropy. Consider a generator G that, on public input Z and coins R ← {0,1}^n, outputs blocks Y_1, ..., Y_m. Def: The real entropy of G is H(Y_1, ..., Y_m | Z) = Σ_i H(Y_i | Z, Y_1, ..., Y_{i-1}).

21 Accessible Entropy. A cheating G*, given Z, uses coins S_1, S_2, ..., S_m to produce Y_1, ..., Y_m and finally some R s.t. G(Z, R) = (Y_1, ..., Y_m). Def: G has accessible entropy at most k if for every PPT G*: Σ_i H(Y_i | Z, S_1, S_2, ..., S_{i-1}) ≤ k. Inaccessible entropy = real entropy - accessible entropy. An unbounded G* can achieve the real entropy.

22 OWF ⇒ Inaccessible Entropy. Given a one-way function f : {0,1}^n → {0,1}^n, define the generator G that samples X ← {0,1}^n and outputs f(X)_1, f(X)_2, ..., f(X)_n, X (the bits of f(X) one at a time, then X itself). Claim: real entropy = n; accessible entropy < n - log n. [Cf. Omer's talk: G(x) = (f(x), x_1, ..., x_n) has next-bit pseudoentropy n + log n for a OWP f.]
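
A minimal sketch of this block generator (the function f is a toy placeholder standing in for a one-way function):

```python
import secrets

n = 16

def f(x: int) -> int:
    """Toy stand-in for a one-way function on n bits (NOT actually one-way)."""
    return (x * x + 1) % (2 ** n)

def G():
    """Block generator: emit the bits of f(X) one at a time, then X itself."""
    X = secrets.randbits(n)
    y = f(X)
    blocks = [(y >> i) & 1 for i in range(n)]   # f(X)_1, ..., f(X)_n
    blocks.append(X)                            # final block: the preimage X
    return blocks

print(G())
```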

23 OWF ⇒ Inaccessible Entropy. Claim: accessible entropy < n - log n. Suppose ∃ G* s.t. Σ_i H(Y_i | S_1, ..., S_{i-1}) ≥ n - log n. Then we can invert f on input Y' by sequentially finding coins S_1, ..., S_n s.t. Y_i = Y'_i (via sampling) and then reading a preimage off G*'s final block. High accessible entropy ⇒ success on a random Y' = f(X) w.p. 1/poly(n).

24 Real Entropy (for protocols). Consider a protocol (A, B) with messages B_1, A_1, B_2, A_2, ..., B_m, A_m. Def: The real entropy of (A, B) is Σ_i H(A_i | B_1, A_1, ..., B_i).

25 Accessible Entropy (for protocols). A cheating A* interacts with the honest B (messages B_1, A_1, B_2, A_2, ..., B_m, A_m). What A* does at each round: it tosses coins S_i, sends the message A_i, and privately outputs a justification W_i (e.g. consistent coins of the honest A).

26 Accessible Entropy (for protocols). Def: (A, B) has accessible entropy at most k if for every PPT A* (tossing coins S_i and outputting justifications W_i as above): Σ_i H(A_i | B_1, S_1, B_2, S_2, ..., S_{i-1}, B_i) ≤ k. Remarks: 1. Needs adjustment in case A* outputs an invalid justification (assume here that it never does). 2. An unbounded A* can achieve the real entropy.

27 Ex: Collision-Resistant Hashing (as a protocol). 𝓕 = {f : {0,1}^n → {0,1}^{n-k}}. B sends F ← 𝓕; A samples X ← {0,1}^n, sends Y = F(X), and then sends X. Real entropy = H(Y | F) + H(X | Y, F) = H(X | F) = n.

28 Ex: Collision-Resistant Hashing (as a protocol). A cheating A*, after receiving F ← 𝓕, tosses coins S_1 to send Y and then coins S_2 to send X. Accessible entropy = H(Y | F) + H(X | F, S_1) ≤ (n - k) + neg(n).

29 Outline: Entropy; Secrecy & Pseudoentropy; Unforgeability & Inaccessible Entropy; Applications.

30-33 Commitment Schemes. Commit stage: the sender S, holding a message m ∈ {0,1}^n, interacts with the receiver R. Reveal stage: S sends (m, K), the message together with an opening key; R accepts or rejects.
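
To make the commit/reveal syntax concrete, here is a folklore hash-based sketch (my own illustration, not the construction from this talk; it is neither statistically hiding nor statistically binding):

```python
import hashlib, secrets

def commit(m: bytes):
    """Commit stage (sender side): C = SHA-256(K || m) for a random key K."""
    K = secrets.token_bytes(32)
    C = hashlib.sha256(K + m).digest()
    return C, K          # send C now; keep (m, K) for the reveal stage

def verify(C: bytes, m: bytes, K: bytes) -> bool:
    """Reveal stage (receiver side): accept iff the opening matches C."""
    return hashlib.sha256(K + m).digest() == C

C, K = commit(b"my bid: 42")
assert verify(C, b"my bid: 42", K)
assert not verify(C, b"my bid: 43", K)
```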

34 Security of Commitments. Hiding (statistical or computational): COMMIT(m) and COMMIT(m') are indistinguishable even to a cheating R*. Binding (statistical or computational): even a cheating S* cannot reveal (m, K) and (m', K') with m ≠ m'.

35 Statistical Security? Can a commitment be both statistically hiding and statistically binding? Impossible!

36 Statistical Binding. Thm [HILL90, Naor91]: one-way functions ⇒ statistically binding (computationally hiding) commitments.

37 Statistical Hiding. Thm [HNORV07]: one-way functions ⇒ statistically hiding (computationally binding) commitments. Too complicated!

38 Our Results I. A much simpler proof that OWF ⇒ statistically hiding commitments, via accessible entropy. Conceptually parallels the [HILL90, Naor91] construction of PRGs and statistically binding commitments from OWF. A "nonuniform" version achieves the optimal round complexity O(n/log n) [HHRS07].

39 Our Results II. Thm: Assume one-way functions exist. Then NP has constant-round parallelizable ZK proofs with "black-box simulation" ⇔ constant-round statistically hiding commitments exist. (The "⇐" direction is due to [GK96, G01]; the novelty is "⇒".)

40 Statistically Hiding Commitments & Inaccessible Entropy. The honest S commits to M ← {0,1}^n (commit-stage transcript C) and later reveals M together with the key K. Statistical hiding: H(M | C) = n - neg(n).

41 Statistically Hiding Commitments & Inaccessible Entropy. A cheating S* uses coins S_1 in the commit stage (transcript C) and coins S_2 in the reveal stage. Statistical hiding: H(M | C) = n - neg(n). Computational binding: for every PPT S*, H(M | C, S_1) = neg(n), i.e. "inaccessible entropy for protocols".

42 OWF ⇒ Statistically Hiding Commitments: Our Proof. OWF ⇒ [done above] G with real entropy ≥ accessible entropy + log n ⇒ (repetitions) G with real min-entropy ≥ accessible entropy + poly(n) ⇒ (cut & choose & parallel rep.; (interactive) hashing [DHRS07] + UOWHFs [NY89, Rom90]; "m-phase" commitment) statistically hiding commitment.

43 Cf. OWF ⇒ Statistically Binding Commitment [HILL90, Nao91]. OWF ⇒ (hardcore bit [GL89] + hashing) X with pseudoentropy ≥ H(X) + 1/poly(n) ⇒ (repetitions) X with pseudo-min-entropy ≥ H_0(X) + poly(n) ⇒ (hashing) PRG ⇒ (expand output & translate) statistically binding commitment.

44 OWF ⇒ Statistically Hiding Commitments: Our Proof. OWF ⇒ (interactive hashing [NOVY92, HR07]) (A, B) with real entropy ≥ accessible entropy + log n ⇒ (repetitions) (A, B) with real min-entropy ≥ accessible entropy + poly(n) ⇒ (cut & choose; (interactive) hashing [DHRS07] + UOWHFs [NY89, Rom90]; "m-phase" commitment) statistically hiding commitment.

45 OWF ⇒ Inaccessible Entropy (protocol version). Let f : {0,1}^n → {0,1}^m be a OWF. A samples X ← {0,1}^n and sets Y = f(X). B chooses linearly independent B_1, ..., B_m ← {0,1}^m and sends them one at a time; A answers each with ⟨B_i, Y⟩, and finally sends X. Real entropy = n. Can show: accessible entropy ≤ n - log n.
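
A small sketch of the honest A's side of this protocol. The function f and the parameters are placeholders, and the linear independence of the B_i (which the real protocol gets from interactive hashing) is only approximated here by choosing them at random:

```python
import secrets

n = m = 12

def f(x: int) -> int:
    """Toy stand-in for a one-way function into {0,1}^m (NOT actually one-way)."""
    return (x * x + x + 1) % (2 ** m)

def inner(a: int, b: int) -> int:
    """GF(2) inner product <a, b> of two m-bit strings, encoded as ints."""
    return bin(a & b).count("1") & 1

# Honest A: fix Y = f(X), answer <B_i, Y> round by round, reveal X at the end.
X = secrets.randbits(n)
Y = f(X)
B = [secrets.randbits(m) for _ in range(m)]  # should be linearly independent;
                                             # random vectors usually are
answers = [inner(b, Y) for b in B]
transcript = (B, answers, X)
print(answers)
```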

46 Claim: Accessible Entropy ≤ n - log n. Setup: f : {0,1}^n → {0,1}^m a OWF; for simplicity assume |f^{-1}(y)| = 2^k for all y ∈ Im(f), and set t = n - k - 2log n. A cheating A* answers ⟨B_1, Y⟩, ..., ⟨B_t, Y⟩, then ⟨B_{t+1}, Y⟩, ..., ⟨B_m, Y⟩, and finally sends X. Entropy accounting: the first t answers contribute ≤ t = n - k - 2log n; the remaining answers contribute entropy neg(n) (claim, next slides); the final block X contributes ≤ k.

47 Claim: Accessible Entropy ≤ n - log n (cont.). Consider the first t = n - k - 2log n rounds ⟨B_1, Y⟩, ..., ⟨B_t, Y⟩ (still assuming |f^{-1}(y)| = 2^k for all y ∈ Im(f)). Claim: there is at most one consistent Y for which A* can produce a preimage (except with negligible probability).

48 Claim: Accessible Entropy ≤ n - log n (cont.). Why: |Im(f)| ≤ 2^{n-k}, so after t = n - k - 2log n random linear constraints only about poly(n) values in Im(f) remain consistent, and the Interactive Hashing Theorems [NOVY92, HR07] say that A* can "control" at most one consistent value (except with negligible probability).

49 Claim: Accessible Entropy ≤ n - log n (wrap-up). Totaling: the first t answers contribute ≤ t = n - k - 2log n, the remaining answers contribute neg(n), and X contributes ≤ k. The analysis holds whenever |f^{-1}(Y)| ≈ 2^k, and the choice of k contributes entropy ≤ log n, so the accessible entropy is ≤ n - log n overall.

50 Other Applications. Simpler/improved universal one-way hash functions from OWF [HRVW09b]. Inspired simpler/improved pseudorandom generators from OWF [HRV09].

51 Conclusion. Complexity-based cryptography is possible because of gaps between real and computational entropy. Secrecy: pseudoentropy > real entropy. Unforgeability: accessible entropy < real entropy.

52 Research Directions. Formally unify inaccessible entropy and pseudoentropy. Complexity-theoretic applications of inaccessible entropy. Remove the "parallelizable" condition from the ZK result. Use inaccessible entropy for new understanding/constructions of MACs and digital signatures.

53 Benefit of Statistical Hiding. In most protocols that use commitments, binding is only required during protocol execution (it depends on the adversary's current capabilities, so it is safe for it to be computational), whereas hiding may matter long after execution (the adversary may gain computational resources, or the hardness assumption may be broken); statistical hiding gives "everlasting secrecy".

54 Example: Zero Knowledge for NP [Goldreich-Micali-Wigderson86]. (The prover P commits to a coloring of the graph's vertices 1, ..., 6; the verifier V asks to open one edge, e.g. (1,4).) Hiding ⇒ zero knowledge: the verifier learns nothing other than x ∈ L. Binding ⇒ soundness: the prover cannot convince the verifier if x ∉ L. Corollary: one-way functions ⇒ statistical zero-knowledge "arguments" for NP.

