Conditional Computational Entropy

Conditional Computational Entropy
Does Pseudo-Entropy = Incompressibility? How to extract more pseudorandom bits?
Chun-Yuan Hsiao (Boston University, USA)
Joint work with Chi-Jen Lu (Academia Sinica, Taiwan) and Leonid Reyzin (Boston University, USA)

Shannon Entropy: H(X) = E_{x←X}[-log Pr[X = x]]. Slide example: a distribution X with H(X) ≈ 2.58 bits. In crypto we usually use the minimum instead of the average, a.k.a. min-entropy: H_∞(X) = -log max_x Pr[X = x].
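
To make the 2.58-bit figure concrete, here is a minimal sketch (mine, not part of the talk) that computes both quantities; the number on the slide matches, for instance, a uniform distribution over six outcomes, which is an assumption on my part.

```python
# Illustrative helper, not from the paper: Shannon entropy vs. min-entropy
# of a finite distribution given as a list of probabilities.
import math

def shannon_entropy(probs):
    # H(X) = E_{x<-X}[-log2 Pr[X = x]] = -sum_x p(x) * log2 p(x)
    return -sum(p * math.log2(p) for p in probs if p > 0)

def min_entropy(probs):
    # H_inf(X) = -log2 max_x Pr[X = x]
    return -math.log2(max(probs))

die = [1 / 6] * 6                      # uniform over 6 outcomes (assumed example)
print(shannon_entropy(die))            # ~2.585 bits
print(min_entropy(die))                # ~2.585 bits (equal to H(X) for uniform X)

biased = [0.5, 0.1, 0.1, 0.1, 0.1, 0.1]
print(shannon_entropy(biased))         # ~2.16 bits
print(min_entropy(biased))             # 1.0 bit: min-entropy <= Shannon entropy
```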

Computational Entropy (version 1: HILL). Pseudo-Entropy: X has pseudo-entropy k if ∃Y with H(Y) = k and X ≈ Y; then H_HILL(X) = k [Håstad, Impagliazzo, Levin, Luby]. Here ≈ means indistinguishable (in polynomial time). Canonical example: the output of a PRG (Blum-Micali-Yao) is indistinguishable from uniform, so its HILL entropy equals its length.

Entropy vs. Compressibility. Shannon's theorem: a source can be compressed down to (roughly) its entropy, and no further. Slide illustration: |X| = 60 bits, H(X) = 40 bits; a compressor C and decompressor D with D(C(X)) = X achieve compression length |C(X)| ≈ H(X).
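
As a small illustration of Shannon's theorem (my own sketch, not from the slides): an optimal prefix code such as a Huffman code has expected length L with H(X) <= L < H(X) + 1 per symbol, so entropy really is the yardstick for compression.

```python
# Minimal Huffman-coding sketch (illustrative only): compare H(X) with the
# expected codeword length of an optimal prefix code.
import heapq
import math

def huffman_lengths(probs):
    """Return codeword lengths of an optimal prefix code for `probs`."""
    # Heap items: (subtree probability, tie-breaker, symbols in the subtree).
    heap = [(p, i, [i]) for i, p in enumerate(probs)]
    heapq.heapify(heap)
    lengths = [0] * len(probs)
    counter = len(probs)
    while len(heap) > 1:
        p1, _, syms1 = heapq.heappop(heap)
        p2, _, syms2 = heapq.heappop(heap)
        for s in syms1 + syms2:   # each merge adds one bit to every symbol below it
            lengths[s] += 1
        heapq.heappush(heap, (p1 + p2, counter, syms1 + syms2))
        counter += 1
    return lengths

probs = [0.4, 0.2, 0.2, 0.1, 0.1]
H = -sum(p * math.log2(p) for p in probs)
L = sum(p * l for p, l in zip(probs, huffman_lengths(probs)))
print(f"H(X) = {H:.3f} bits, expected code length = {L:.2f} bits")  # ~2.12 vs 2.20
```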

Computational Entropy (version 2: Yao). Compression entropy: X has computational entropy k if we cannot efficiently compress X to fewer than k bits; H_Yao(X) = k [Yao82]. [Barak, Shaltiel, Wigderson 03] gave a min-entropy formulation: no subset of the support of X can be compressed below k bits.

Computational Entropy.
Version 1 (HILL): H_HILL(X) = k if ∃Y with H(Y) = k and X ≈ Y.
Version 2 (Yao): H_Yao(X) = k if we cannot efficiently compress X to fewer than k bits.
Question [Impagliazzo 99]: are these definitions equivalent? (High HILL entropy implies high Yao entropy; whether the converse holds is the question.)

(Pseudo-)Entropy vs. Compressibility. Recall Shannon's theorem: entropy equals the optimal compression length. Is the computational analogue true, i.e., does pseudo-entropy equal the shortest efficient compression length?

Cryptographic Motivation. A source X with computational entropy (e.g., g^ab) is hashed by an extractor to produce pseudorandom bits, which serve as a key. Which computational entropy should we measure? All extractors work for H_HILL(X); some work for H_Yao(X) [BSW03]. If H_Yao(X) > H_HILL(X), we may get a longer key (by using the right extractor).
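
To show what the extractor (hashing) box might look like in code, here is a hedged sketch of extraction via a random Toeplitz hash over GF(2), a standard 2-universal family to which the leftover hash lemma applies; the function name `toeplitz_extract` and the parameters below are illustrative, not the paper's construction.

```python
# Sketch: derive m nearly uniform bits from an n-bit entropic secret using a
# public random seed and a Toeplitz (GF(2)) universal hash. Illustrative only.
import secrets

def toeplitz_extract(x: int, n: int, m: int, seed: int) -> int:
    """Hash the n-bit input x to m bits using the (m x n) Toeplitz matrix over
    GF(2) described by the (n + m - 1)-bit string `seed`."""
    out = 0
    for i in range(m):
        row = (seed >> i) & ((1 << n) - 1)     # row i of the Toeplitz matrix
        bit = bin(row & x).count("1") & 1      # inner product with x, mod 2
        out |= bit << i
    return out

n, m = 128, 64                      # extract 64 bits from a 128-bit source
seed = secrets.randbits(n + m - 1)  # public seed (may be sent in the clear)
x = secrets.randbits(n)             # stand-in for an entropic secret such as g^ab
key = toeplitz_extract(x, n, m, seed)
print(f"derived key: {key:016x}")
```

By the leftover hash lemma, if the source has at least k bits of min-entropy, choosing m somewhat below k makes the output close to uniform even given the seed; the talk's point is that the relevant k may be a computational entropy, and which notion of it matters.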

Our Results.
How? Through a new† notion (result 0 below).
0. Conditional computational entropy (†previously used, but never formalized).
1. ∃ a distribution* X such that H_Yao(X) > H_HILL(X). (*a conditional distribution)
2. More pseudorandom bits can be extracted via H_Yao than via H_HILL.
3. Computational entropy, version 3: a new, unpredictability-based definition.

Our Definition: Conditional Computational Entropy (HILL). H_HILL(X | Z) = k if ∃Y with H(Y | Z) = k and (X, Z) ≈ (Y, Z). (The distinguisher is given Z and must tell X from Y.)

Our Definition: Conditional Computational Entropy (Yao). H_Yao(X | Z) = k if we cannot efficiently compress X to fewer than k bits, where both the compressor and the decompressor get Z: the compressor outputs C(X, Z), and the decompressor must satisfy D(C(X, Z), Z) = X.
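
For reference, here is my hedged formalization of the two conditional notions with explicit quality parameters (ε, s); the paper's actual definitions may differ in details, e.g. in which notion of conditional entropy of Y given Z is used.

```latex
% Conditional HILL entropy (paraphrase):
% there is a Y, jointly distributed with Z, with k bits of entropy given Z,
% such that no size-s circuit distinguishes (X,Z) from (Y,Z) with advantage > eps.
\[
  H^{\mathrm{HILL}}_{\epsilon,s}(X \mid Z) \ge k
  \iff
  \exists\, Y:\;
  H(Y \mid Z) \ge k
  \;\wedge\;
  (X,Z) \approx_{\epsilon,s} (Y,Z).
\]

% Conditional Yao entropy (paraphrase): no size-s compressor/decompressor pair,
% both given Z, beats the trivial bound for a source of entropy k.
\[
  H^{\mathrm{Yao}}_{\epsilon,s}(X \mid Z) \ge k
  \iff
  \forall\, (C,D) \text{ of size } s \text{ with } C(\cdot,\cdot) \in \{0,1\}^{\ell}:\;
  \Pr\bigl[D\bigl(C(X,Z),\,Z\bigr) = X\bigr] \le 2^{\ell-k} + \epsilon.
\]
```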

Conditional is Everywhere in Crypto. In cryptography, adversaries usually have additional information:
entropic secret g^ab, adversary is given g^a, g^b;
entropic secret x, adversary is given f(x);
entropic secret Sign_SK(m), adversary is given PK.
To make extraction precise, we must talk about conditional entropy. Conditional computational entropy has been used implicitly in [Gennaro, Krawczyk, Rabin 04], but never defined explicitly for HILL and Yao.

Our Results.
0. New† notion: conditional computational entropy (†previously used, but never formalized).
1. ∃ a pair (X, Z) such that H_Yao(X | Z) >> H_HILL(X | Z) (where Z is a uniform string).
2. More pseudorandom bits can be extracted from (X, Z) by considering its Yao entropy.
3. Computational entropy, version 3: H_unp(X | Z) = k if for every efficient M, Pr[M(Z) = X] ≤ 2^(-k). This allows us to talk about the entropy of singletons, like x given f(x), and cannot be defined unconditionally.
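
As a worked instance of item 3 (my own hedged reading, though the x given f(x) example is the one named on the slide):

```latex
% Unpredictability entropy, as on the slide, with the exponent restored:
\[
  H^{\mathrm{unp}}(X \mid Z) \ge k
  \iff
  \forall \text{ efficient } M:\;
  \Pr[\,M(Z) = X\,] \le 2^{-k}.
\]
% Example: if f is one-way, so that every efficient M recovers the preimage
% with probability at most \epsilon(n), i.e. \Pr[M(f(U_n)) = U_n] \le \epsilon(n), then
\[
  H^{\mathrm{unp}}\bigl(U_n \mid f(U_n)\bigr) \;\ge\; \log_2 \frac{1}{\epsilon(n)}.
\]
```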

Yao Entropy > HILL Entropy.
[Wee03]: an oracle separation, using a length-increasing random function f: {0,1}^n → {0,1}^{3n} together with a membership oracle (given m, it answers yes/no to whether m is in the image of f).
[This paper]: a separation without such oracles, using a PRG G: {0,1}^n → {0,1}^{3n} and non-interactive zero-knowledge (NIZK): X = (G(U_n), π), where π is a NIZK proof that the first component is in the image of G, and Z is the NIZK reference string. Caveat: the construction needs uniZK [Lepinski, Micali, Shelat 05]. (Roughly, soundness and uniqueness of proofs keep the conditional HILL entropy near n, while pseudorandomness and zero-knowledge keep the pair incompressible, so the conditional Yao entropy stays near 3n.)

Summary. Computational entropy, conditional versions:
Version 1: H_HILL(X | Z);
Version 2: H_Yao(X | Z);
Version 3: H_unp(X | Z).
We can extract more from Yao entropy than from HILL entropy (even unconditionally).

Thank You!