Key Derivation from Noisy Sources with More Errors Than Entropy. Benjamin Fuller; joint work with Ran Canetti, Omer Paneth, and Leonid Reyzin. May 5, 2014.


Slide 1: Key Derivation from Noisy Sources with More Errors Than Entropy. Benjamin Fuller, joint work with Ran Canetti, Omer Paneth, and Leonid Reyzin. May 5, 2014. [BWF 4/2/2014]

Slide 2: Authenticating Users
- Users' private data exists online in a variety of locations.
- We must authenticate users before granting access to private data.
- Passwords are widely used but guessable.
Are there alternatives to passwords with high entropy (uncertainty)?

Slide 3: Key Derivation from Noisy Sources
Entropic sources are noisy:
- The source differs over time: first reading w, later readings x.
- The distance is bounded: d(w, x) ≤ d_max.
Goal: derive a stable and strong key from a noisy source:
- w and x map to the same key.
- Different samples from the source produce independent keys: Gen(w) ≠ Gen(w′).
Examples: Physically Unclonable Functions (PUFs) [PappuRechtTaylorGershenfeld02]; biometric data [Daugman04].

Slide 4: Fuzzy Extractors
Fuzzy extractors derive reliable keys from noisy data [DodisOstrovskyReyzinSmith04,08] (interactive setting in [BennettBrassardRobert88]). Generate(w) outputs a key and a public value p; Reproduce(x, p) recovers the key. Assume the source is strong — traditionally, this means high entropy.
Goals:
- Correctness: Gen and Rep give the same key if d(w, x) ≤ d_max.
- Security: (key, p) ≈ (U, p); the indistinguishability can be statistical or computational [FullerMengReyzin13].

Slide 5: Traditional construction, step 1 — derive the key using a randomness extractor Ext, which converts high-entropy sources to uniform: if H∞(W) ≥ k, then Ext(W) ≈ U.

Slide 6: Traditional construction, step 2 — error-correct back to w with a secure sketch (Sketch during Generate, Rec during Reproduce), then extract.

Slide 7: Error-Correcting Codes
- A code is a subset C of the metric space.
- For distinct ec_1, ec_2 in C, d(ec_1, ec_2) > 2·d_max.
- For any point ec′, decoding finds the closest codeword ec_1 in C.
- Linear codes: C is the image of a generating matrix G (codewords ec = Gc).

Slide 8: Secure Sketches — the code-offset sketch
Let G be the generating matrix of a code that corrects d_max errors, and let ec = Gc for a random message c. Publish p = ec ⊕ w.

Slide 9: Reproduce computes p ⊕ x = ec ⊕ (w ⊕ x) and decodes: ec′ = Dec(p ⊕ x). If w and x are close, then ec′ = ec and w = ec′ ⊕ p.

Slide 10: Given p, w is not fully hidden: the sketch loses (k − k′) bits of entropy, so Ext must be able to extract from distributions with only k′ bits of entropy.
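The code-offset sketch is short enough to run. Below is a minimal Python illustration; a 3-fold repetition code stands in for the code C (a toy choice to keep the sketch self-contained — any code correcting d_max errors works the same way):

```python
import secrets

def rep_encode(msg_bits, r=3):
    # Repetition code: repeat each bit r times; corrects (r - 1) // 2 errors per block.
    return [b for b in msg_bits for _ in range(r)]

def rep_decode(code_bits, r=3):
    # Majority vote within each block of r bits.
    return [int(sum(code_bits[i:i + r]) > r // 2) for i in range(0, len(code_bits), r)]

def xor(a, b):
    return [u ^ v for u, v in zip(a, b)]

def sketch(w, r=3):
    # Code-offset sketch: p = ec XOR w for a random codeword ec = Enc(c).
    c = [secrets.randbelow(2) for _ in range(len(w) // r)]
    return xor(rep_encode(c, r), w)

def recover(p, x, r=3):
    # ec' = Dec(p XOR x); if d(w, x) <= d_max, then w = Enc(Dec(p XOR x)) XOR p.
    c = rep_decode(xor(p, x), r)
    return xor(rep_encode(c, r), p)

w = [1, 0, 1, 1, 0, 1, 0, 0, 1]   # enrollment reading
p = sketch(w)                      # public helper value
x = list(w); x[4] ^= 1             # later noisy reading with one flipped bit
assert recover(p, x) == w          # w is recovered exactly
```

Note how the entropy loss shows up concretely: p reveals w up to a shift by a codeword, so the adversary's uncertainty drops from the length of w to the message length of the code.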

Slide 11: Entropy Loss from Fuzzy Extractors
Entropy is at a premium for physical sources:
- Iris: ≈ 249 bits [Daugman1996]
- Fingerprint: ≈ 82 bits [RathaConnellBolle2001]
- Passwords: ≈ 31 bits [ShayKomanduri+2010]
Fuzzy extractors have two losses:
- Secure sketches lose the error-correcting capability of the code, k − k′ (for the iris error rate, ≈ 200 bits).
- Randomness extractors lose 2·log(1/ε), typically 60-100 bits.
After these losses the key may be too short to be useful (30-60 bits) — or there may be no key left at all!

Slide 12: Can we eliminate either of these entropy losses? [DodisOstrovskyReyzinSmith]: any secure sketch built from a code (even one correcting only random errors) must lose k − k′ ≥ log |B_dmax|, where B_dmax is a ball of radius d_max.
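The lower bound on slide 12 is easy to evaluate numerically. The sketch below computes log2 |B_dmax| for a hypothetical 1024-bit source tolerating up to 100 errors — illustrative parameters only, not actual iris statistics:

```python
from math import comb, log2

def log_ball_volume(n, d):
    # log2 of the volume of a Hamming ball of radius d in {0,1}^n:
    # a secure sketch must lose at least this much entropy (k - k' >= log2 |B_d|).
    return log2(sum(comb(n, i) for i in range(d + 1)))

# Hypothetical parameters (illustrative, not real iris statistics):
n, d, k = 1024, 100, 249      # reading length, max errors tolerated, source entropy
loss = log_ball_volume(n, d)
print(f"mandatory sketch loss: {loss:.0f} bits; source entropy: {k} bits")
```

For these numbers the mandatory loss already exceeds the source entropy, which is exactly the "more errors than entropy" regime the talk targets.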

Slide 13: Error Tolerance and Security at Odds
The adversary shouldn't be able to guess any x* with d(w, x*) ≤ d_max — any input to Rep in this ball produces the key — and this gets easier as d_max increases. Consider a source W whose initial readings w (for different physical devices) are close together: if there is a point x* close to all points in W, no security is possible.

Slide 14: By providing such an x* to Rep, the adversary always learns the key. Let B_dmax denote the set of points within distance d_max: there is a W with entropy up to log |B_dmax| that fits entirely inside one such ball.

Slide 15: This motivates measuring H∞(W) − log |B_dmax|; call this the minimum usable entropy, H_usable(W).

Slide 16: Minimum Usable Entropy
- Standard fuzzy extractors provide worst-case security guarantees, which implies |key| ≤ H_usable(W).
- Many sources have no minimum usable entropy: irises are thought to be the "best" biometric, yet for irises H_usable(W) ≈ −707.
- We need a property other than entropy to secure these sources (e.g., that points are not close together). Can we find reasonable properties and accompanying constructions?

Slide 17: Hamming Metric
- Security parameter n.
- Sources W = W_1,…,W_k of k symbols, each W_i over an alphabet Z (which grows with n).
- d(w, x) = number of symbols in which w and x differ.
[Figure: two binary strings w and x differing in four positions; d(w, x) = 4.]
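The metric takes only a couple of lines; the strings and lists below stand in for symbol sequences over Z:

```python
def hamming_distance(w, x):
    # Number of symbol positions in which w and x differ.
    assert len(w) == len(x), "readings must have the same length"
    return sum(a != b for a, b in zip(w, x))

assert hamming_distance("karolin", "kathrin") == 3
assert hamming_distance([5, 7, 2], [5, 9, 2]) == 1
```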

Slide 18: Results
Security relies on point obfuscation (secure under strong vector DDH [BitanskiCanetti10]).
- Construction 1: requires ω(log n) entropy in most symbols.
- Construction 2: requires Ω(1) entropy in most symbols.
- Errors corrected: Θ(k).

Slides 19-21: Point Obfuscation
An obfuscator transforms a program I into a "black box" [BarakGoldreichImpagliazzoRudichSahaiVadhanYang01]. Obfuscation is possible for point programs I_w (which accept exactly the input w) [Canetti97]. We need a strong version — composable virtual gray-box obfuscation — achievable under the strong vector DDH assumption [BitanskiCanetti10].

Slide 22: Construction Attempt #1
Hide w using obfuscation: Generate publishes p = Obf(I_w), and Reproduce can check whether x = w without revealing w.
Two problems: no key, and no error tolerance.

Slides 23-24: Construction Attempt #2
Obfuscate each symbol separately (recall w = w_1,…,w_k): p = (Obf(I_{w_1}),…,Obf(I_{w_k})). Reproduce can now learn which symbols of x match w. The same two problems remain: no key, and no error tolerance.

Slide 25: But knowing where errors occur is useful in coding theory — leverage a further technique from point obfuscation.

Slide 26: One can specify the output of a point function [CanettiDakdouk08]: Obf(I_{w,c}) returns c on input w (and rejects otherwise). Let's try this on our construction.

Slides 27-32: Construction Attempt #3
For each symbol i, flip a coin c_i and obfuscate I_{w_i, c_i}, which returns c_i on input w_i (and ⊥ otherwise). Reproduce runs the obfuscations on the symbols of x and recovers c_i at every position where x_i = w_i — that is, most bits of c.

Slides 33-35: The Construction
- Sample a codeword c ∈ C from a binary error-correcting code.
- For each symbol i, obfuscate I_{w_i, c_i}.
- Reproduce runs the obfuscations on x, recovers most bits of c, and decodes to obtain c.
- Use c as the output (run c through a computational extractor [Krawczyk10] to create the key).
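A runnable sketch of Generate/Reproduce follows. Salted hash commitments stand in for the point obfuscations and a 3-fold repetition code stands in for C; this is purely illustrative — hash commitments do NOT provide virtual gray-box obfuscation, and all names and parameters here are hypothetical:

```python
import hashlib
import secrets

def H(*parts):
    # Hash helper used both as the stand-in "obfuscation" and as a
    # stand-in computational extractor (illustrative only).
    h = hashlib.sha256()
    for s in parts:
        h.update(str(s).encode()); h.update(b"|")
    return h.hexdigest()

def gen(w, r=3):
    # w: list of k symbols over a large alphabet.
    k = len(w)
    # Sample a codeword c of a repetition code (corrects (r - 1) // 2 errors per block).
    msg = [secrets.randbelow(2) for _ in range(k // r)]
    c = [b for b in msg for _ in range(r)]
    # "Obfuscate" I_{w_i, c_i}: a salted hash commitment standing in for the
    # Bitansky-Canetti obfuscator (heuristic, not a secure instantiation).
    salts = [secrets.token_hex(8) for _ in range(k)]
    p = [(salts[i], H(salts[i], w[i], c[i])) for i in range(k)]
    key = H("key", *msg)   # stand-in for a computational extractor applied to c
    return key, p

def rep(x, p, r=3):
    # Recover c_i wherever x_i = w_i; unmatched positions get a default guess,
    # which the decoder treats as a (correctable) error.
    c = []
    for (salt, tag), xi in zip(p, x):
        for b in (0, 1):
            if H(salt, xi, b) == tag:
                c.append(b); break
        else:
            c.append(0)
    msg = [int(sum(c[i:i + r]) > r // 2) for i in range(0, len(c), r)]
    return H("key", *msg)

w = [17, 42, 5, 99, 3, 8, 64, 7, 23]   # hypothetical enrollment symbols
key, p = gen(w)
x = list(w); x[1] = 41                  # one noisy symbol
assert rep(x, p) == key                 # the key survives the error
```

The structure mirrors the slides exactly: equality with each symbol releases one bit of the codeword, and the code absorbs the positions where the reading changed.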

Slide 36: Correctness and Security
Correctness: Reproduce recovers all but d(w, x) ≤ d_max bits of c, and there exist binary error-correcting codes with error tolerance Θ(k).
Security question: what do the obfuscations reveal about w and c?

Slide 37: What is revealed by the obfuscations?
We need to argue that the adversary learns little through equality-oracle queries to the symbols. It is enough to argue that, with overwhelming probability, the adversary sees ⊥ in response to its queries — that is, it rarely guesses a stored value w_i.

Slide 38: Block Unguessable Distributions
Let A be an algorithm asking polynomially many queries of the form "is w_i = x_i?".
Def: W = W_1,…,W_k is block unguessable if there exists a set J of indices such that no such A guesses w_j for any j ∈ J, except with negligible probability.
Caution: adaptivity is crucial — there are distributions with high overall entropy that can be guessed using equality queries to individual blocks.

Slide 39: [Figure: an adversary guesses "easy" blocks W_1, W_2, … first and uses the gained information to guess each next block.]

Slide 40: Positive examples of block unguessable distributions: block-fixing sources [KampZuckerman07]; sources whose blocks are independent and many are entropic; sources in which all blocks are entropic.
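The caution about adaptivity can be made concrete. In the hypothetical sketch below, each block is a hash of a growing secret prefix: the source has k bits of overall entropy (arbitrarily large as k grows), yet two adaptive equality queries per block recover everything:

```python
import hashlib
import secrets

def H(bits):
    # Map a bit prefix to a symbol over a huge alphabet (hex digests).
    return hashlib.sha256(bytes(bits)).hexdigest()

k = 16
s = [secrets.randbelow(2) for _ in range(k)]   # k bits of entropy overall
w = [H(s[:i + 1]) for i in range(k)]           # block i = H(s_1 ... s_i)

def equality_oracle(i, guess):
    # The only interface the per-block obfuscations expose: is w_i == guess?
    return w[i] == guess

# Adaptive attack: each answered block pins down one more secret bit,
# so 2 queries per block recover the entire source.
recovered = []
for i in range(k):
    for b in (0, 1):
        if equality_oracle(i, H(recovered + [b])):
            recovered.append(b); break

assert recovered == s
```

Each block has only about one bit of entropy conditioned on the previous blocks, which is exactly why the definition must exclude such distributions despite their high total entropy.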

Slides 41-44: Security
Thm: When the source is block unguessable, C has log |C| − (k − |J|) bits of computational entropy — the size of the code minus the number of "guessable" positions. This computational entropy is convertible to a pseudorandom key by a computational extractor.
Note: in the computational setting the size of the key is not as crucial, since it can be expanded by a computational extractor.

Slides 45-46: Error Tolerance and Security Reconciled
The adversary shouldn't be able to guess an x* with d(w, x*) ≤ d_max. A block unguessable distribution has more unguessable symbols than the number of errors corrected, so there is at least one symbol the adversary must guess. Security comes from the adversary's inability to guess this one symbol.

Slide 47: Results so far
Construction 1 requires ω(log n) entropy in most symbols and corrects Θ(k) errors. Notably, it tolerates H_usable ≤ 0 when |Z| = ω(poly(n)) and C corrects Θ(k) errors.

Slide 48: Reducing the Required Entropy
Obfuscating symbols individually leaks equality, so per-symbol entropy is what ensures A can't guess the stored values. Can we reduce the necessary entropy by obfuscating multiple symbols together? Obfuscating all symbols together works, but eliminates error tolerance.

Slides 49-53: Reducing the Required Entropy via Indirection
Instead of putting symbols and obfuscations in 1-1 correspondence, introduce a level of indirection: create a random bipartite graph between symbols and obfuscations (published in p), where each obfuscation has degree α. Obfuscation i then stores the concatenation v_i of its α neighboring symbols, e.g. v_1 = w_1||w_2||w_4||w_10, v_2 = w_2||w_3||w_6||w_8, …, v_k = w_3||w_4||w_7||w_9.
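The indirection step can be sketched as follows. The sizes here are illustrative toys (in the construction α = ω(log k), and the graph itself is published as part of p):

```python
import secrets

def build_graph(k, alpha):
    # Random bipartite graph: obfuscation i is connected to alpha
    # random symbol positions (this graph is public, part of p).
    rng = secrets.SystemRandom()
    return [sorted(rng.sample(range(k), alpha)) for _ in range(k)]

def blocks(w, graph):
    # v_i = concatenation of the alpha symbols that obfuscation i reads.
    return ["||".join(str(w[j]) for j in nbrs) for nbrs in graph]

k, alpha = 10, 4                          # illustrative sizes
w = [secrets.randbelow(100) for _ in range(k)]
graph = build_graph(k, alpha)
v = blocks(w, graph)
# Each v_i aggregates entropy from alpha symbols: even if each symbol has
# only Omega(1) entropy, the entropy of V_i grows with alpha.
```

The obfuscation step is then applied to the v_i exactly as in the previous construction, with the c_i bits attached to these concatenated blocks instead of single symbols.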

Slide 54: Correctness
The graph is an averaging sampler [Lu2002, Vadhan2003]. Obfuscating multiple blocks together degrades error tolerance: if d(w, x) ≤ d_max, each v_i contains an error with probability O(d_max · α / k). If C supports Θ(k) errors and α = ω(log k), the construction is correct w.h.p. whenever d(w, x) ≤ k/ω(log k) (by a Chernoff bound).

Slides 55-56: Security
Assume there exists a set of symbols J with Ω(1) entropy conditioned on the values of all other symbols. Then E[H∞(V_i)] ≥ Ω(E|{indices of J included in V_i}|). The size of this index set is hypergeometrically distributed with expectation α·|J|/k, and the distribution has a small tail [Chvátal79]. If α = ω(log n), then H∞(V_i) ≥ ω(log n) for all i w.h.p., so V = V_1,…,V_k is a block unguessable distribution and security follows as in the previous construction.

Slide 57: Results
- Construction 1: requires ω(log n) entropy in most symbols.
- Construction 2: requires Ω(1) entropy in most symbols.
- Errors corrected: Θ(k).

Slide 58: Noisy Point Obfuscation
A noisy point obfuscator is stronger than a fuzzy extractor: it cannot leak any partial information about w. [DodisSmith05] achieve a weaker, distributional notion of noisy point obfuscation when H_usable >> 0. Our constructions leak some information (values of individual blocks, locations of errors) and are not standard obfuscation. Open questions: can we construct noisy point obfuscation for all distributions? From indistinguishability obfuscation [GargGentryHaleviRaykovaSahaiWaters13]?

Slide 59: Conclusion
- We construct the first (computational) fuzzy extractors for sources with H_usable ≤ 0, using point obfuscation.
- The constructions allow H_usable ≤ 0 when the alphabet is super-polynomial. Is that necessary? Are there constructions for small alphabets?
- We restricted W; one could instead restrict the errors (that is, restrict X).

Slide 60: Questions?

