1
Real-life cryptography: the Linux Pseudo-Random Number Generator (LPRNG). Alain Pfeiffer
2
Types of PRNGs
History
General structure
User space
Entropy types
Initialization process
Building blocks
Security requirements
Conclusion
3
Non-cryptographic deterministic: should not be used for security (e.g. the Mersenne Twister).
Cryptographically secure: an algorithm with properties that make it suitable for use in cryptography (e.g. Fortuna).
With entropy inputs: produces bits non-deterministically, because the internal state is frequently refreshed with unpredictable data from one or several external entropy sources (e.g. the Linux PRNG, LPRNG).
4
Part of the Linux kernel since 1994. Written by Theodore Ts'o and later modified by Matt Mackall. Roughly 1700 lines of C code.
5
Internal state:
Input pool (128 32-bit words = 4096 bits)
Blocking pool (32 32-bit words = 1024 bits)
Non-blocking pool (32 32-bit words = 1024 bits)
Output function: SHA-1
Mixing function: a linear mixing function (not a hash)
Entropy counter: decremented when bits are extracted, incremented when new bits are collected
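As a rough mental model, the internal state can be pictured as three pools plus per-pool entropy counters. The C sketch below is illustrative only; the type and field names are hypothetical, not the kernel's actual identifiers.

    /* Illustrative sketch of the LPRNG internal state; names are hypothetical. */
    #include <stdint.h>

    #define INPUT_POOL_WORDS  128   /* 128 x 32 bits = 4096 bits */
    #define OUTPUT_POOL_WORDS  32   /*  32 x 32 bits = 1024 bits */

    struct pool {
        uint32_t *data;          /* pool contents (128 or 32 words)                       */
        unsigned  words;         /* pool size in 32-bit words                             */
        int       entropy_count; /* decremented on extraction, incremented on collection  */
    };

    struct lprng_state {
        struct pool input;       /* fed by the entropy sources                    */
        struct pool blocking;    /* backs /dev/random                             */
        struct pool nonblocking; /* backs /dev/urandom and get_random_bytes()     */
    };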
6
/dev/random: reads from the blocking pool; limits the number of generated bits; blocks when there is not enough entropy; resumes when new entropy arrives in the input pool.
/dev/urandom: reads from the non-blocking pool; generates random bits WITHOUT blocking.
Writing data to the devices does NOT change the entropy counter!
get_random_bytes(): the kernel-space interface; reads random bytes from the non-blocking pool.
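For example, a user-space program can pull bytes from the non-blocking interface with ordinary POSIX calls (a minimal sketch; error handling kept short):

    /* Read 16 random bytes from /dev/urandom (the non-blocking pool). */
    #include <fcntl.h>
    #include <stdio.h>
    #include <unistd.h>

    int main(void)
    {
        unsigned char buf[16];
        int fd = open("/dev/urandom", O_RDONLY);

        if (fd < 0 || read(fd, buf, sizeof(buf)) != (ssize_t)sizeof(buf))
            return 1;
        close(fd);

        for (int i = 0; i < (int)sizeof(buf); i++)
            printf("%02x", buf[i]);
        printf("\n");
        return 0;
    }

Reading /dev/random the same way could block until the input pool has gathered enough entropy.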
7
Entropy is the backbone of security. It is injected into the generator at initialization and through the updating mechanism, and it can also be used independently. The LPRNG does NOT rely on physical non-deterministic phenomena.
Hardware RNGs: available to user space, but NOT mixed into the LPRNG directly.
Entropy gathering daemon: collects their outputs and feeds them into the LPRNG, as sketched below.
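Such a daemon typically injects the collected bytes through the RNDADDENTROPY ioctl on /dev/random, which, unlike a plain write, also credits the entropy counter. The sketch below assumes the caller trusts its own entropy estimate and has the required privilege (root / CAP_SYS_ADMIN); treat it as illustrative, not as any particular daemon's actual code.

    /* Mix externally gathered bytes into the LPRNG and credit them as entropy. */
    #include <fcntl.h>
    #include <stdlib.h>
    #include <string.h>
    #include <sys/ioctl.h>
    #include <unistd.h>
    #include <linux/random.h>   /* RNDADDENTROPY, struct rand_pool_info */

    int feed_entropy(const unsigned char *bytes, int len, int entropy_bits)
    {
        struct rand_pool_info *info;
        int fd, ret;

        info = malloc(sizeof(*info) + len);
        if (!info)
            return -1;
        info->entropy_count = entropy_bits;   /* bits credited to the entropy counter */
        info->buf_size = len;                 /* payload length in bytes */
        memcpy(info->buf, bytes, len);

        fd = open("/dev/random", O_WRONLY);
        ret = (fd >= 0) ? ioctl(fd, RNDADDENTROPY, info) : -1;

        if (fd >= 0)
            close(fd);
        free(info);
        return ret;
    }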
8
Reliable entropy sources: user input (keyboard, mouse) and disk timings.
Interrupt timings are NOT reliable: regular interrupts and misuse of the IRQF_SAMPLE_RANDOM flag.
9
Each event contributes:
The "num" value (type of event, 32 bits): mouse (12 bits), keyboard (8 bits), interrupts (4 bits), hard drive (3 bits).
The CPU "cycle" count: at most 32 bits, about 15 bits on average.
The "jiffies" count (32 bits): the kernel counter of timer interrupts (about 3-4 bits of entropy on average), running at 100-1000 ticks/sec.
The generator never assumes maximum entropy. A sketch of such an event record follows.
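Conceptually, every noise event hands the generator a small record like the one below (a sketch; the real kernel code uses its own types, e.g. cycles_t and unsigned long, rather than these illustrative ones):

    /* Illustrative shape of one noise event fed into the input pool. */
    #include <stdint.h>

    struct noise_event {
        uint32_t num;     /* event type: mouse ~12 bits, keyboard ~8, interrupt ~4, disk ~3 */
        uint32_t cycles;  /* CPU cycle counter: at most 32 bits, about 15 bits on average   */
        uint32_t jiffies; /* timer-interrupt count: about 3-4 bits of entropy on average    */
    };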
10
1. Unknown distribution: the inputs vary a lot.
2. Unknown correlation: correlations between inputs are likely.
3. Large sample space: it is hard to keep track of 2^32 possible jiffies values.
4. Limited time: estimation happens right after interrupts, so it must be fast.
5. Estimation at runtime: an estimate is needed for every input.
6. Unknown knowledge of the attacker.
11
There is not much entropy in the Linux boot process!
At shutdown: generate data from /dev/urandom and save it into a file.
At startup: write the saved data back to /dev/random; it is mixed into the blocking pool and the non-blocking pool without changing the entropy counter. A sketch of the restore step follows.
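In practice this is done by an init/shutdown script; the C sketch below only illustrates the restore step, under the assumption that the seed was saved to a file such as /var/lib/random-seed (an assumed path). As noted above, the write mixes the data in but credits no entropy.

    /* Restore the saved seed at boot: mix it into the pools without crediting entropy. */
    #include <fcntl.h>
    #include <unistd.h>

    int restore_seed(const char *seed_file)   /* e.g. "/var/lib/random-seed" (assumed path) */
    {
        unsigned char buf[512];
        ssize_t n = 0;
        int in  = open(seed_file, O_RDONLY);
        int out = open("/dev/random", O_WRONLY);  /* data is mixed into both output pools */
        int ok  = (in >= 0 && out >= 0);

        while (ok && (n = read(in, buf, sizeof(buf))) > 0)
            if (write(out, buf, n) != n)
                ok = 0;
        if (n < 0)
            ok = 0;

        if (in  >= 0) close(in);
        if (out >= 0) close(out);
        return ok ? 0 : -1;
    }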
12
1. Mixing function
2. Entropy estimator
3. Output function
4. Entropy extraction
13
…
14
The mixing function:
1. Mixes in one byte after another.
2. Extends each byte to a 32-bit word.
3. Rotates the word by 0-31 bits.
4. Shifts it into the pool through a linear feedback shift register (LFSR).
No entropy gets lost.
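A hedged C sketch of one such mixing step is shown below. It follows the four steps just listed (extend, rotate, LFSR feedback, shift into the pool); the tap positions correspond to the 128-word input pool and the twist table values are illustrative placeholders, so treat the exact constants as assumptions rather than the kernel's verbatim code.

    /* Mix one byte into a pool of n 32-bit words (n must be a power of two).
     * Structure follows the description above; exact constants are illustrative. */
    #include <stdint.h>

    static uint32_t rol32(uint32_t w, unsigned r)
    {
        r &= 31;
        return r ? (w << r) | (w >> (32 - r)) : w;
    }

    static const uint32_t twist_table[8] = {   /* placeholder CRC-style twist values */
        0x00000000, 0x3b6e20c8, 0x76dc4190, 0x4db26158,
        0xedb88320, 0xd6d930ac, 0x9b64c2b0, 0xa00ae278,
    };

    void mix_byte(uint32_t *pool, unsigned n, unsigned *pos, unsigned *rot, uint8_t byte)
    {
        /* steps 1-3: take the next byte, extend it to 32 bits, rotate by 0-31 */
        uint32_t w = rol32((uint32_t)byte, *rot);
        unsigned i = *pos = (*pos - 1) & (n - 1);

        /* step 4: LFSR feedback: XOR the word with several tap positions ... */
        w ^= pool[i];
        w ^= pool[(i + 104) & (n - 1)];   /* taps shown for the 128-word input pool */
        w ^= pool[(i +  76) & (n - 1)];
        w ^= pool[(i +  51) & (n - 1)];
        w ^= pool[(i +  25) & (n - 1)];
        w ^= pool[(i +   1) & (n - 1)];

        /* ... and shift ("twist") the result back into the pool */
        pool[i] = (w >> 3) ^ twist_table[w & 7];

        *rot = (*rot + 7) & 31;   /* advance the rotation for the next byte */
    }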
15
Linear feedback shift register (LFSR) over the Galois field GF(2^32), with feedback polynomial Q(X) = α^3 (P(X) - 1) + 1, where α is a primitive element and P(X) is determined by the size of the pool.
Input pool: P(X) = X^128 + X^103 + X^76 + X^51 + X^25 + X + 1
Output pools: P(X) = X^32 + X^26 + X^20 + X^14 + X^7 + X + 1
Input pool period: 2^(92*32) - 1, not the maximal 2^(128*32) - 1
Output pool period: 2^(26*32) - 1, not the maximal 2^(32*32) - 1
16
These P(X) are NOT irreducible! But by changing one feedback position:
Input pool: P(X) = X^128 + X^104 + X^76 + X^51 + X^25 + X + 1
Output pool: P(X) = X^32 + X^26 + X^19 + X^14 + X^7 + X + 1
the polynomial becomes irreducible, but still NOT primitive! However, by changing α to α^2 (X^32 + X^26 + X^23 + X^14 + X^7 + X + 1), α^4, α^7, ... the polynomial is irreducible AND primitive!
Periods: 2^(128*32) - 1 and 2^(32*32) - 1
17
Function L_1: {0,1}^8 → {0,1}^32 (a rotation and a multiplication in GF(2^32)).
Feedback function L_2: ({0,1}^32)^5 → {0,1}^32.
18
The estimator's model:
Random variables: identically distributed, considered per (single) source.
Sample space: D, where |D| >> 2.
Jiffies count: δ_i[1] at time i.
The estimator takes the event time T_i as input, looks at differences of successive jiffies counts, and credits entropy through a logarithmic function of the outcome (a sketch follows).
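A hedged C sketch of such a jiffies-based estimate, modeled on the kernel's timer-randomness heuristic (first, second and third order differences of the jiffies count, smallest one taken, then a logarithm capped at 11 bits). The names and the exact arithmetic below are illustrative, not the kernel's verbatim code.

    /* Illustrative entropy estimate for one event, based on jiffies differences. */
    #include <stdint.h>

    struct timer_state {
        uint32_t last_jiffies;
        int32_t  last_delta, last_delta2;
    };

    static int ilog2_u32(uint32_t v)          /* floor(log2(v)), with ilog2(0) = 0 */
    {
        int r = 0;
        while (v >>= 1)
            r++;
        return r;
    }

    int estimate_entropy_bits(struct timer_state *s, uint32_t jiffies_now)
    {
        int32_t delta  = (int32_t)(jiffies_now - s->last_jiffies);
        int32_t delta2 = delta  - s->last_delta;
        int32_t delta3 = delta2 - s->last_delta2;

        s->last_jiffies = jiffies_now;
        s->last_delta   = delta;
        s->last_delta2  = delta2;

        if (delta < 0)  delta  = -delta;
        if (delta2 < 0) delta2 = -delta2;
        if (delta3 < 0) delta3 = -delta3;
        if (delta2 < delta) delta = delta2;   /* take the smallest difference:       */
        if (delta3 < delta) delta = delta3;   /* regular timing earns little credit  */

        /* Credit roughly log2 of the smallest difference, capped at 11 bits. */
        int bits = ilog2_u32((uint32_t)delta >> 1);
        return bits > 11 ? 11 : bits;
    }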
19
To compute the estimate we must know:
the time t_{i-1},
the jiffies count δ_{i-1}[1] (where [1] refers to event 1),
the jiffies count δ_{i-1}[2] (where [2] refers to event 2).
Property: entropy is invariant under a permutation of the sample space: if a distribution q is obtained from a distribution p by permuting its elements, then H(p) = H(q). The LPRNG estimator does not have this property: it gives different values for p and q, since it uses the value of a given element and not its probability!
20
The output function:
Transfers data from the input pool to an output pool.
Generates data from the output pool.
Uses the SHA-1 hash.
Two phases: a feedback phase and an extraction phase.
21
Feedback phase (SHA-1):
Hash all the pool bytes (as 32-bit words) and produce a 5-word hash.
Send the hash to the mixing function and to the extraction phase.
The mixing function takes the 5-word hash and mixes it back into the pool, shifting the LFSR 20 times (so 20 pool words = 640 bits are changed).
22
Extraction phase (SHA-1):
The initial value is the hash from the feedback phase.
Hash 16 pool words, overlapping with the last word changed by the feedback function and with the first 3 words of the output pool.
Produce a 5-word hash and fold it in half: w_0 xor w_3, w_1 xor w_4, and the two 16-bit halves of w_2 xored together.
This produces a 10-byte output; a sketch of the fold follows.
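A small C sketch of the folding step, consistent with the description above; the exact pairing of the hash words is an assumption for illustration, not a verbatim copy of the kernel routine.

    /* Fold a 5-word (160-bit) SHA-1 hash in half to produce 10 output bytes. */
    #include <stdint.h>
    #include <string.h>

    void fold_hash(const uint32_t w[5], uint8_t out[10])
    {
        uint32_t folded[3];

        folded[0] = w[0] ^ w[3];                      /* 32 bits                          */
        folded[1] = w[1] ^ w[4];                      /* 32 bits                          */
        folded[2] = (w[2] >> 16) ^ (w[2] & 0xffff);   /* upper half of w2 xor lower half  */

        memcpy(out,     &folded[0], 4);
        memcpy(out + 4, &folded[1], 4);
        memcpy(out + 8, &folded[2], 2);               /* only the low 16 bits are used    */
    }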
23
Take a random variable X with Rényi entropy H_2(X), and a hash function G chosen at random from a suitable family. If H_2(X) >= r, then G(X) is close to uniformly distributed and its entropy is close to r bits.
24
The LPRNG, however, uses a fixed hash function (SHA-1). Assumptions: each element has a bounded size, and the attacker knows all permutations.
With a universal hash function: if the pool contains k bits of Rényi entropy and m <= k bits are extracted, the extracted entropy is close to m bits.
25
Sound entropy estimation: estimate the amount of entropy correctly, and guarantee that an attacker who knows the input can NOT guess the output!
Pseudo-randomness: it must be impossible to compute the internal state or future outputs, and impossible to recover the internal state or future outputs from partial knowledge of the entropy.
26
Samples: N = 7 million. From the empirical frequencies, four estimators are computed and compared: the LPRNG entropy estimate, the Shannon entropy, the min-entropy, and the Rényi entropy. Results: given as a figure that is not reproduced here.
27
SHA-1 is a one-way function: an adversary who only knows the outputs can NOT recover the contents of the output pool or of the input pool.
Folding avoids recognizable patterns: the output of the hash is not directly recognizable.
The generator is secure as long as the internal state is NOT compromised!
28
Backtracking resistance: an attacker with knowledge of the current state should NOT be able to recover previous outputs!
Prediction resistance: even an attacker who learns the current state should NOT be able to predict future outputs, once enough fresh entropy has been mixed in!
29
Forward security: knowledge of the current state does NOT provide information about previous states, even if the state was not refreshed by new entropy inputs. Backtracking resistance is provided by the one-way output function.
Backward security: an adversary who knows the internal state IS able to predict future outputs, because the output function is deterministic (bad!). Prediction resistance is therefore provided by reseeding the internal state between requests!
30
Suppose the attacker knows both the input pool and the output pool. The attacker then knows the previous states EXCEPT for the 160 bits which were fed back. But without additional knowledge, a generic attack would face an overhead of 2^160 and 2^80 candidate solutions.
31
Transferring k bits of entropy means that after generating data from the UNKNOWN pool S1 and mixing it into the KNOWN pool S2, guessing the NEW S2 costs the attacker 2^(k-1) trials on average.
Collecting k bits of entropy means that after processing unknown data into the KNOWN pool S1, guessing the NEW S1 costs the observer 2^(k-1) trials on average.
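As a concrete order of magnitude for the value k = 64 used below, this generic effort amounts to:

    2^{k-1} = 2^{63} \approx 9.2 \times 10^{18} \text{ trials on average.}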
32
Two attack scenarios:
1. The attacker knows the output pool but does NOT know the input pool.
2. The attacker knows both the input pool and the output pool.
33
Is there enough entropy in the input pool (k >= 64 bits)?
Yes: k bits are transferred from the input pool; the attacker loses k bits of knowledge; NO output is produced before the k bits are mixed in. Against a generic attack (2^(k-1)): k bits of resistance!
No: NO bits are transferred; the attacker keeps his knowledge; NO output is produced before k bits have been sent from the input pool. Against a generic attack (2^(k-1)): k bits of resistance!
34
// k = 64 bits
collect k bits of entropy;        // costs an attacker 2^(k-1) guesses
if (counter >= k bits)
    counter--;                    // ... and transfer k bits from the input pool
else
    counter++;                    // keep collecting before any transfer
// => 64 bits of resistance
35
Conclusion: the LPRNG provides a good level of security. The mixing function could be improved, and a newer hash function (e.g. SHA-3) could be used.