The Many Entropies of One-Way Functions. Thomas Holenstein, Iftach Haitner, Salil Vadhan, Hoeteck Wee. Joint with Omer Reingold.

Cryptography  Rich array of applications and powerful implementations.  In some cases (e.g Zero-Knowledge), more than we would have dared to ask for.

Cryptography  Proofs of security very important  BUT, almost entirely based on computational hardness assumptions (factoring is hard, cannot find collisions in SHA-1, …)

One-Way Functions (OWF)
- Easy to compute: x ↦ f(x).
- Hard to invert (even on the average).
- The most basic, unstructured form of cryptographic hardness [Impagliazzo-Luby '95].
- Major endeavor: base as much of Crypto as possible on the existence of OWFs, via intermediate primitives. Great success (even if incomplete).
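To make the interface concrete, here is a brief Python sketch (my own illustration, not from the talk) of a classical OWF candidate: modular exponentiation, whose inversion is the discrete-log problem. The modulus and base below are illustrative placeholders and far too small for real security.

```python
# A minimal sketch of the OWF interface: easy to compute, (conjecturally) hard
# to invert. Modular exponentiation is a classical candidate; the tiny prime
# below is for illustration only and offers no real security.
import secrets

P = 2 ** 127 - 1          # a Mersenne prime, used here only as a toy modulus
G = 3                     # a fixed base

def f(x: int) -> int:
    """Candidate one-way function: x -> G^x mod P (easy via fast exponentiation)."""
    return pow(G, x, P)

x = secrets.randbelow(P)  # a random input
y = f(x)                  # computing forward is cheap...
# ...while recovering x from y is the discrete-logarithm problem, believed hard
# for well-chosen large groups.
print(y)
```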

Primitives Hierarchy
- [Håstad-Impagliazzo-Levin-Luby '91]: pseudorandom generators from any OWF.
- Pseudorandomness naturally corresponds to secrecy.
- But Crypto is also about unforgeability (preventing cheating).

Primitives Hierarchy
- Earlier constructions [Rompel '90, Haitner-Nguyen-Ong-R-Vadhan '07] are very involved and seem far from optimal.
- Simplifications and improvements in efficiency, based on new notions of computational entropy [Haitner-R-Vadhan-Wee '09, HRV '10, VZ '12, Haitner-Holenstein-R-Vadhan-Wee '10].

Primitives Hierarchy

Building the First Layer

Entropy and Pseudoentropy
- For a random variable X, denote by H(X) the entropy of X. Intuitively: how many bits of randomness are in X.
- Various measures of entropy: Shannon entropy H(X) = E_{x←X}[log(1/Pr[X=x])], min-entropy, max-entropy, ...
- For this talk, it is enough to imagine X that is uniform over 2^k elements. For such X, H(X) = k.
- X has pseudoentropy ≥ k if there exists Y with H(Y) ≥ k such that X and Y are computationally indistinguishable [HILL].
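As a small numeric illustration (my own, not from the slides), the following Python snippet computes the Shannon entropy and min-entropy of a finite distribution, and shows that both equal k for a distribution uniform over 2^k elements.

```python
# Entropy measures for a distribution given as a probability vector.
from math import log2

def shannon_entropy(p):
    """H(X) = E_{x<-X}[log(1/Pr[X=x])] = -sum p_i * log2(p_i)."""
    return -sum(pi * log2(pi) for pi in p if pi > 0)

def min_entropy(p):
    """H_min(X) = log2(1 / max_i p_i)."""
    return -log2(max(p))

# X uniform over 2^k elements has H(X) = H_min(X) = k; here k = 3.
uniform8 = [1 / 8] * 8
print(shannon_entropy(uniform8), min_entropy(uniform8))   # 3.0 3.0

# A skewed distribution: Shannon entropy can be much larger than min-entropy.
skewed = [1 / 2] + [1 / 14] * 7
print(shannon_entropy(skewed), min_entropy(skewed))       # ~2.40 1.0
```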

Pseudorandom Generators [BM, Yao]
- Efficiently computable function G: {0,1}^s → {0,1}^m.
- Stretching (m > s).
- The output G(x), for a random seed x, is computationally indistinguishable from uniform.
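To make the syntax concrete, here is a hedged Python sketch of the PRG interface and of estimating a distinguisher's advantage empirically. The generator itself is a heuristic SHA-256-based placeholder chosen only to illustrate the stretching G: {0,1}^s → {0,1}^m; it is not the construction discussed in the talk.

```python
# A minimal sketch of the PRG interface and the distinguishing experiment.
import hashlib, secrets

def prg(seed: bytes, out_len: int) -> bytes:
    """Expand `seed` to `out_len` bytes (counter mode over SHA-256, heuristic only)."""
    out = b""
    counter = 0
    while len(out) < out_len:
        out += hashlib.sha256(seed + counter.to_bytes(4, "big")).digest()
        counter += 1
    return out[:out_len]

def advantage(distinguisher, seed_len=16, out_len=32, trials=2000):
    """Estimate |Pr[D(G(U_s)) = 1] - Pr[D(U_m) = 1]| empirically."""
    hits_prg = sum(distinguisher(prg(secrets.token_bytes(seed_len), out_len))
                   for _ in range(trials))
    hits_uni = sum(distinguisher(secrets.token_bytes(out_len))
                   for _ in range(trials))
    return abs(hits_prg - hits_uni) / trials

# Example: a naive distinguisher that looks at one bit has advantage close to 0.
print(advantage(lambda s: s[0] & 1))
```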

False Entropy Generator
- Loosely, the most basic object in HILL is: G_fe(x,g,i) = f(x), g, g(x)_{1..i} (think of g as matrix multiplication).
- Lemma: Let k = log|f^{-1}(f(x))|. When i = k + log n, then g, g(x)_{1..i} is pseudorandom (even given f(x)).
- Intuition: the first k - c·log n bits are statistically close to uniform (Leftover Hash Lemma), and the next (c+1)·log n bits are pseudorandom (Goldreich-Levin hard-core function).
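The following Python sketch is only meant to make the shape of G_fe(x,g,i) = f(x), g, g(x)_{1..i} concrete; the function f is a toy stand-in (not one-way), and g is a random 0/1 matrix acting on x over GF(2), matching the "matrix multiplication" view of the hash.

```python
# A minimal sketch of G_fe(x, g, i) = f(x), g, g(x)_{1..i}.
import secrets

n = 12  # toy security parameter

def f(x_bits):
    """Toy placeholder for a one-way function on n bits (NOT actually one-way)."""
    val = int("".join(map(str, x_bits)), 2)
    val = (val * 2654435761 + 12345) % (2 ** n)   # just scrambles the input
    return [(val >> j) & 1 for j in range(n)]

def random_matrix(rows, cols):
    return [[secrets.randbelow(2) for _ in range(cols)] for _ in range(rows)]

def hash_g(g, x_bits):
    """g(x): matrix-vector product over GF(2)."""
    return [sum(row[j] & x_bits[j] for j in range(len(x_bits))) % 2 for row in g]

def G_fe(x_bits, g, i):
    """Output f(x), the description of g, and the first i bits of g(x)."""
    return f(x_bits), g, hash_g(g, x_bits)[:i]

x = [secrets.randbelow(2) for _ in range(n)]
g = random_matrix(n, n)
print(G_fe(x, g, n // 2))
```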

False Entropy Generator (II)
- G_fe(x,g,i) = f(x), g, g(x)_{1..i}.
- Lemma: For the variable G_fe(x,g,i) (with random inputs), Δ = pseudoentropy - real entropy > (log n)/n.
- Reason: with probability 1/n over the choice of i (namely when i = k + log n), the output G_fe(x,g,i) is indistinguishable from a distribution with entropy |x| + |g| + log n (whereas the real entropy is at most |x| + |g|).
- Disadvantages: Δ is rather small, the value of the real entropy is unknown, and the pseudoentropy is smaller than the entropy of the input.

Building Block of [HRV '10]
- Simply do not truncate: G_nb(x,g) = f(x), g, g(x).
- Nonsense: G_nb(x,g) is invertible and therefore has no pseudoentropy!
- Well yes, but: G_nb(x,g) does have pseudoentropy from the point of view of an online distinguisher (getting one bit at a time).

Next-Bit Pseudoentropy
- X has pseudoentropy ≥ k if there exists Y with H(Y) ≥ k such that X and Y are computationally indistinguishable.
- X = (X_1, ..., X_n) has next-bit pseudoentropy ≥ k if there exist {Y_i}_{i∈[n]} with Σ_i H(Y_i | X_1...X_{i-1}) ≥ k such that X_i and Y_i are computationally indistinguishable given X_1, ..., X_{i-1}.
- Remarks:
  - X and {Y_i} are jointly distributed.
  - The two notions are identical for k = n [BM, Yao, GGM].
  - Generalizes to blocks (rather than bits).
  - Next-bit pseudoentropy is weaker than pseudoentropy.
- Example: G_nb(x,g) = f(x), g, g(x)_{1,...,n}.

Our Next-Block Pseudoentropy Generator
- G_nb(x,g) = f(x), g, g(x).
- Next-block pseudoentropy > |x| + |g| + log n.
- X = G_nb(x,g), and {Y_i} is obtained from X by replacing the first k + log n bits of g(x) with uniform bits, where k = log|f^{-1}(f(x))|.
- Advantages:
  - Δ = (next-block pseudoentropy - real entropy) > log n.
  - Entropy bounds are known (on the total entropy).
  - "No bit left behind".

Simple Form of PRGs in OWFs
- In conclusion: for an OWF f: {0,1}^n → {0,1}^n and an (appropriate) pairwise-independent hash function g: {0,1}^n → {0,1}^n, the output f(x), g, g(x) has pseudoentropy in the eyes of an online distinguisher (i.e., next-block pseudoentropy).
- [Vadhan-Zheng '12]: don't need g at all, plus additional efficiency improvements.
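For concreteness, here is a small Python sketch (my own illustration, with hypothetical helper names) of a pairwise-independent hash family g_{a,b}(x) = a·x + b over GF(2^8), together with an exhaustive check of pairwise independence on this toy domain; an actual instantiation would use n-bit inputs matched to the OWF.

```python
# A pairwise-independent family over {0,1}^8: g_{a,b}(x) = a*x + b in GF(2^8).
IRRED = 0x11B  # x^8 + x^4 + x^3 + x + 1, the AES reduction polynomial

def gf_mul(u, v):
    """Carry-less multiplication in GF(2^8), reduced modulo IRRED."""
    r = 0
    while v:
        if v & 1:
            r ^= u
        v >>= 1
        u <<= 1
        if u & 0x100:
            u ^= IRRED
    return r

def g(a, b, x):
    return gf_mul(a, x) ^ b

# Pairwise independence check: for fixed x1 != x2 and any targets y1, y2,
# exactly one key (a, b) out of 2^16 maps (x1, x2) to (y1, y2).
x1, x2, y1, y2 = 3, 200, 17, 91
count = sum(1 for a in range(256) for b in range(256)
            if g(a, b, x1) == y1 and g(a, b, x2) == y2)
print(count)  # prints 1
```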

Pseudoentropy vs. Inaccessible Entropy
- [HILL '91]: A distribution X has pseudoentropy if it is indistinguishable from some X' such that H(X') > H(X). X looks like it has more entropy than it really does. (Corresponds to secrecy.)
- [HRVW '09]: X has inaccessible entropy if for any efficient algorithm A that "samples from the support" of X, H(A(·)) < H(X). X has entropy, but some of it is inaccessible. (Corresponds to unforgeability.)

Universal One-Way Hash Functions [NY]
- G = {g}: a family of efficiently computable hash functions such that (2nd pre-image resistance): given random g and x, it is hard to find x' ≠ x such that g(x) = g(x').
- Compare with collision resistance: given g, it is hard to find x ≠ x' such that g(x) = g(x').

Simple Form of UOWHFs in OWFs
- OWF f: {0,1}^n → {0,1}^n. Define F(x,i) = first i bits of f(x).
- Given random x, i, it may be possible to find x' such that F(x,i) = F(x',i), so F may be broken as a UOWHF.
- But it is infeasible to sample such x' with full entropy, so F is "a bit like" a UOWHF.
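A toy Python sketch of F(x,i); the function f below is a placeholder (not one-way), used only to show that for small i a second pre-image is easy to find by brute force, which is exactly why F alone may be broken as a UOWHF.

```python
# F(x, i) = first i bits of f(x), over a tiny toy domain.
import secrets

n = 12

def f(x):
    return (x * 2654435761 + 12345) % (2 ** n)   # toy stand-in for an OWF

def F(x, i):
    return f(x) >> (n - i)                       # the i most significant bits of f(x)

x, i = secrets.randbelow(2 ** n), 3
# Brute-force a second pre-image for this small example.
x_prime = next(x2 for x2 in range(2 ** n) if x2 != x and F(x2, i) == F(x, i))
print(F(x, i) == F(x_prime, i), x != x_prime)    # True True
```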

Simple Form of UOWHFs in OWFs
- Proof idea: Assume that, given x, i, algorithm A samples x' with full entropy such that F(x,i) = F(x',i). In other words, x' is uniform subject to the first i bits of f(x) equalling the first i bits of f(x').
- Given y, find x ∈ f^{-1}(y) (breaking f) as follows:
  - Let x_i be such that f(x_i) and y agree on the first i bits.
  - To get x_{i+1} from x_i, run A on input (x_i, i) until it samples x' such that f(x') and y agree on the first i+1 bits (set x_{i+1} = x').
  - Output x = x_n.
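The reduction's control flow can be sketched in Python as follows; the toy f is again a placeholder, and the "full entropy" sampler A is idealized by exhaustive search over the tiny domain, purely to make the bit-by-bit inversion loop concrete.

```python
# Inverting the toy f, one agreeing prefix bit of y at a time.
import secrets

n = 12

def f(x):
    return (x * 2654435761 + 12345) % (2 ** n)   # toy stand-in for an OWF

def F(x, i):
    return f(x) >> (n - i)                       # first i bits of f(x)

def A(x_i, i):
    """Idealized sampler: a uniform x' with F(x', i) = F(x_i, i)."""
    matches = [x2 for x2 in range(2 ** n) if F(x2, i) == F(x_i, i)]
    return secrets.choice(matches)

def invert(y):
    """Recover some x with f(x) = y, making one more prefix bit agree each round."""
    x_i = secrets.randbelow(2 ** n)              # agrees with y on the first 0 bits
    for i in range(n):
        while F(x_i, i + 1) != (y >> (n - i - 1)):
            x_i = A(x_i, i)                      # resample until i+1 bits agree
    return x_i

y = f(secrets.randbelow(2 ** n))
print(f(invert(y)) == y)                         # True
```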

Inaccessible Entropy Generator
- OWF f: {0,1}^n → {0,1}^n. Inaccessible entropy generator: define G_ie(x) = f(x)_1, f(x)_2, ..., f(x)_n, x.
- Informal theorem: There is no efficient algorithm that produces each one of the n+1 blocks (in an online fashion) with full entropy (hence an inaccessible entropy generator).
- Useful in the construction of statistically hiding commitment schemes, and inspiring in the construction of UOWHFs (with a slightly different analysis).

Connection to Statistical Commitment
- Inaccessible entropy generator: define G_ie(s) = Z_1, Z_2, ..., Z_n.
- Assume Z_i is a uniform bit (from the point of view of an observer) but is completely fixed conditioned on the internal state of any algorithm generating it.
- Use Z_i to mask a bit σ (output Z_1, Z_2, ..., Z_{i-1}, Z_i ⊕ σ).
- Then σ is statistically hidden (from an outside observer), but the committer can only open a single value.
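A minimal sketch of the masking step (my own illustration; the toy f and the function names commit/reveal are hypothetical): the committer generates the blocks of G_ie honestly, publishes the first i-1 blocks and Z_i ⊕ σ, and later opens by revealing x.

```python
# Masking a committed bit with one block of the inaccessible entropy generator.
import secrets

n = 12

def f(x):
    return (x * 2654435761 + 12345) % (2 ** n)   # toy stand-in for an OWF

def G_ie_blocks(x):
    """Blocks f(x)_1, ..., f(x)_n, followed by x itself as a final block."""
    y = f(x)
    return [(y >> (n - 1 - j)) & 1 for j in range(n)] + [x]

def commit(sigma, i):
    x = secrets.randbelow(2 ** n)
    blocks = G_ie_blocks(x)
    transcript = blocks[:i - 1] + [blocks[i - 1] ^ sigma]   # Z_1..Z_{i-1}, Z_i xor sigma
    return transcript, x                                    # x is kept for the opening

def reveal(transcript, x, i):
    """Recompute Z_i from the opened x and unmask sigma."""
    return transcript[i - 1] ^ G_ie_blocks(x)[i - 1]

transcript, x = commit(sigma=1, i=5)
print(reveal(transcript, x, i=5))   # 1
```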

Two Computational Entropy Generators
- Let f: {0,1}^n → {0,1}^n be an OWF.
- Next-block pseudoentropy generator: G_nb(x) = f(x), x_1, x_2, ..., x_n. Looks (on-line) like it has entropy |x| + log n.
- Inaccessible entropy generator: G_ie(x) = f(x)_1, f(x)_2, ..., f(x)_n, x. One can generate (on-line) at most |x| - log n bits of entropy.

Summary
- When viewed correctly, one-way functions rather directly imply simple forms of the "first layer" of cryptographic primitives.
- This view relies on setting the "right" computational notions of entropy.
- Open problems: beyond the world of OWFs, use for lower bounds, further unifications, better constructions.
