A Probabilistic Analysis of Onion Routing in a Black-box Model
Aaron Johnson (Yale), with Joan Feigenbaum (Yale) and Paul Syverson (NRL)
Workshop on Privacy in the Electronic Society, 10/29/2007
Contributions
1. Use a black-box abstraction to create a probabilistic model of onion routing
2. Analyze unlinkability
   a. Provide worst-case bounds
   b. Examine a typical case
Related Work
A Model of Onion Routing with Provable Anonymity. J. Feigenbaum, A. Johnson, and P. Syverson. FC 2007.
Towards an Analysis of Onion Routing Security. P. Syverson, G. Tsudik, M. Reed, and C. Landwehr. PET 2000.
An Analysis of the Degradation of Anonymous Protocols. M. Wright, M. Adler, B. Levine, and C. Shields. NDSS 2002.
Anonymous Communication
Sender anonymity: the adversary can't determine the sender of a given message.
Receiver anonymity: the adversary can't determine the receiver of a given message.
Unlinkability: the adversary can't determine who talks to whom.
How Onion Routing Works
User u runs an onion-routing client; d is an Internet destination; routers 1-5 run onion-routing servers.
1. u creates a 3-hop circuit through the routers.
2. u opens a stream in the circuit to d.
3. Data is exchanged: u sends {{{m}_3}_4}_1; each router in turn removes a layer, leaving {{m}_3}_4, then {m}_3, and finally m, which is delivered to d. Replies are re-wrapped hop by hop on the way back: {m}_3, then {{m}_3}_4, then {{{m}_3}_4}_1 to u.
4. The stream is closed.
5. The circuit is changed every few minutes.
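The layering in step 3 can be sketched in code. This is a toy XOR-keystream "cipher" for illustration only, not the cryptography real onion routing uses (Tor, for instance, uses AES in counter mode inside TLS); the key names k1, k4, k3 are hypothetical hop keys matching the subscripts above.

```python
import hashlib

def xor_layer(data: bytes, key: bytes) -> bytes:
    # Toy symmetric layer: XOR with a SHA-256-derived keystream.
    # Applying the same key twice removes the layer.
    stream = b""
    counter = 0
    while len(stream) < len(data):
        stream += hashlib.sha256(key + counter.to_bytes(4, "big")).digest()
        counter += 1
    return bytes(a ^ b for a, b in zip(data, stream))

def wrap(m: bytes, keys: list) -> bytes:
    # Layer innermost-first: with keys [k1, k4, k3] this
    # produces {{{m}_3}_4}_1, the onion u hands to the first router.
    onion = m
    for key in reversed(keys):
        onion = xor_layer(onion, key)
    return onion

keys = [b"k1", b"k4", b"k3"]          # hop keys along the circuit
onion = wrap(b"hello d", keys)        # u builds {{{m}_3}_4}_1
for key in keys:                      # each router peels one layer
    onion = xor_layer(onion, key)
print(onion)                          # d receives the plaintext m
```

Because no single router holds all the keys, each hop sees only the adjacent layers, which is what separates the sender from the destination.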
Adversary
The adversary is active and local: it controls a subset of the routers.
Anonymity
Cases, by which circuit endpoints the adversary has compromised:
1. First router compromised
2. Last router compromised
3. First and last compromised
4. Neither first nor last compromised
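The four cases partition the probability space. As a sketch, assume each endpoint (first and last router of the circuit) is independently compromised with probability b — the same parameter the model below uses for observation; the independence assumption is ours, for illustration.

```python
# Probabilities of the four endpoint-compromise cases, assuming the
# first and last routers are each independently compromised with
# probability b (an illustrative assumption, not a claim from the talk).
def case_probabilities(b):
    return {
        "first only": b * (1 - b),
        "last only": (1 - b) * b,
        "first and last": b * b,
        "neither": (1 - b) ** 2,
    }

probs = case_probabilities(0.1)
print(probs)  # the four cases partition the space, so values sum to 1
```

Case 3 (probability b^2) is the dangerous one: a first-and-last compromise links u to d directly.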
Black-box Abstraction
1. Users choose a destination.
2. Some inputs are observed.
3. Some outputs are observed.
Black-box Anonymity
The adversary can link observed inputs and outputs of the same user. Any configuration consistent with these observations is indistinguishable to the adversary.
Probabilistic Black-box
Each user v selects a destination from a distribution p_v. Inputs and outputs are observed independently, each with probability b.
Probabilistic Anonymity
Several configurations are indistinguishable to the adversary; anonymity is given by the conditional distribution over them. In the pictured example, every indistinguishable configuration has u talking to d, so Pr[u → d] = 1.
Black-box Model
Let U be the set of users. Let Δ be the set of destinations.
A configuration C consists of:
  User destinations C_D : U → Δ
  Observed inputs C_I : U → {0,1}
  Observed outputs C_O : U → {0,1}
Let X be a random configuration such that:
Pr[X=C] = Π_u p_u^{C_D(u)} · b^{C_I(u)} (1−b)^{1−C_I(u)} · b^{C_O(u)} (1−b)^{1−C_O(u)}
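The product form above says destinations and observations are drawn independently. A minimal sketch checks that it really defines a probability distribution by enumerating every configuration of a tiny instance; the users, destinations, distributions p_v, and b below are made-up example values, not from the talk.

```python
from itertools import product

# Tiny example instance (all values are illustrative assumptions).
U = ["u", "v"]
DESTS = ["d", "e"]
p = {"u": {"d": 0.7, "e": 0.3}, "v": {"d": 0.4, "e": 0.6}}
b = 0.25  # probability an input or output is observed

def pr(C_D, C_I, C_O):
    # Pr[X=C] = prod_u p_u^{C_D(u)} * b^{C_I(u)}(1-b)^{1-C_I(u)}
    #                               * b^{C_O(u)}(1-b)^{1-C_O(u)}
    result = 1.0
    for w in U:
        result *= p[w][C_D[w]]
        result *= b if C_I[w] else (1 - b)
        result *= b if C_O[w] else (1 - b)
    return result

total = 0.0
for dests in product(DESTS, repeat=len(U)):
    for ins in product([0, 1], repeat=len(U)):
        for outs in product([0, 1], repeat=len(U)):
            total += pr(dict(zip(U, dests)), dict(zip(U, ins)),
                        dict(zip(U, outs)))
print(total)  # sums to 1 over all configurations
```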
Probabilistic Anonymity
The metric Y for the unlinkability of u and d in C is:
Y(C) = Pr[X_D(u) = d | X ≈ C],
where X ≈ C means X is indistinguishable from C.
Note: there are several other candidates for a probabilistic anonymity metric, e.g., entropy.
Probabilistic Anonymity
Y(C) = Pr[X_D(u) = d | X ≈ C] corresponds to:
Exact Bayesian inference
The adversary after a long-term intersection attack
A worst-case adversary
Unlinkability given that u visits d: E[Y | X_D(u) = d]
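The metric can be computed by brute force on a tiny instance: group configurations by what the adversary observes (linked pairs for users with both endpoints observed, plus the unlinked observed inputs and outputs), then apply Bayes' rule within the group. The instance below and the exact encoding of "observation" are our illustrative assumptions.

```python
from itertools import product

# Tiny example instance (illustrative values, not from the talk).
U = ["u", "v"]
DESTS = ["d", "e"]
p = {"u": {"d": 0.7, "e": 0.3}, "v": {"d": 0.4, "e": 0.6}}
b = 0.25

def configs():
    for dests in product(DESTS, repeat=len(U)):
        for ins in product([0, 1], repeat=len(U)):
            for outs in product([0, 1], repeat=len(U)):
                yield (dict(zip(U, dests)), dict(zip(U, ins)),
                       dict(zip(U, outs)))

def pr(C_D, C_I, C_O):
    result = 1.0
    for w in U:
        result *= p[w][C_D[w]]
        result *= b if C_I[w] else (1 - b)
        result *= b if C_O[w] else (1 - b)
    return result

def observation(C_D, C_I, C_O):
    # The adversary links an observed input and output of the same user;
    # otherwise it sees only the unlinked pieces.
    linked = frozenset((w, C_D[w]) for w in U if C_I[w] and C_O[w])
    inputs_only = frozenset(w for w in U if C_I[w] and not C_O[w])
    outputs_only = tuple(sorted(C_D[w] for w in U if C_O[w] and not C_I[w]))
    return linked, inputs_only, outputs_only

def Y(C):
    # Y(C) = Pr[X_D(u)=d | X indistinguishable from C], by enumeration.
    target = observation(*C)
    num = den = 0.0
    for C_D, C_I, C_O in configs():
        if observation(C_D, C_I, C_O) == target:
            q = pr(C_D, C_I, C_O)
            den += q
            if C_D["u"] == "d":
                num += q
    return num / den

# Both of u's endpoints observed: u is fully linked to d.
C1 = ({"u": "d", "v": "e"}, {"u": 1, "v": 0}, {"u": 1, "v": 0})
print(Y(C1))  # 1.0
# Nothing observed: the posterior falls back toward u's prior p_u^d.
C2 = ({"u": "e", "v": "d"}, {"u": 0, "v": 0}, {"u": 0, "v": 0})
print(Y(C2))  # approximately 0.7
```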
Worst-case Anonymity
Theorem 1: The maximum of E[Y | X_D(u)=d] over (p_v)_{v≠u} occurs when
1. p_v^1 = 1 for all v ≠ u, OR
2. p_v^d = 1 for all v ≠ u,
where u's probabilities on destinations other than d are indexed in decreasing order: p_u^1 ≥ p_u^2 ≥ … ≥ p_u^{d−1} ≥ p_u^{d+1} ≥ …
Proof outline:
1. Show the maximum occurs when, for all v ≠ u, p_v^{e_v} = 1 for some destination e_v.
2. Show the maximum occurs when e_v = d for all v ≠ u, or when e_v = 1 for all v ≠ u.
Worst-case Estimates
Let n be the number of users.
Theorem 2: When p_v^1 = 1 for all v ≠ u:
E[Y | X_D(u)=d] = b + b(1−b)p_u^d + (1−b)^2 p_u^d [(1−b)/(1−(1−p_u^1)b) + O(log n / n)]
  ≈ b + (1−b)p_u^d
Theorem 3: When p_v^d = 1 for all v ≠ u:
E[Y | X_D(u)=d] = b^2 + b(1−b)p_u^d + (1−b)p_u^d/(1−(1−p_u^d)b) + O(log n / n)
  ≈ b^2 + (1−b^2)p_u^d
Together: b^2 + (1−b^2)p_u^d ⪅ E[Y | X_D(u)=d] ⪅ b + (1−b)p_u^d.
The worst case increases the chance of total compromise from b^2 to b.
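Plugging numbers into the two approximations makes the b^2-to-b gap concrete; the values of b and p_u^d below are made-up examples.

```python
# The two worst-case approximations from Theorems 2 and 3 above.
def lower(b, p_ud):
    # ~ Theorem 3 case (all other users always visit d)
    return b**2 + (1 - b**2) * p_ud

def upper(b, p_ud):
    # ~ Theorem 2 case (all other users always visit u's top other
    # destination), the overall worst case
    return b + (1 - b) * p_ud

p_ud = 0.01  # example: u's own probability of visiting d
for b in (0.05, 0.1, 0.2):
    print(b, lower(b, p_ud), upper(b, p_ud))
# The gap upper - lower = b(1-b)(1-p_ud): roughly, total compromise
# with probability b instead of b^2.
```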
Typical Case
Let each user select destinations from a Zipfian distribution: p^{d_i} ∝ 1/i^s.
Theorem 4: E[Y | X_D(u)=d] = b^2 + (1−b^2)p_u^d + O(1/n)
  ≈ b^2 + (1−b^2)p_u^d
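A quick sketch of the typical case: build the normalized Zipfian distribution and evaluate the approximation for a popular versus an unpopular destination. The number of destinations, the exponent s, and b are example values, not from the talk.

```python
def zipf(n_dests, s):
    # Normalized Zipfian distribution: p^{d_i} proportional to 1/i^s.
    weights = [1.0 / i**s for i in range(1, n_dests + 1)]
    z = sum(weights)
    return [w / z for w in weights]

def approx_unlinkability(b, p_ud):
    # Theorem 4's approximation E[Y | X_D(u)=d] ~ b^2 + (1-b^2) p_u^d.
    return b**2 + (1 - b**2) * p_ud

p = zipf(1000, s=1.0)  # example sizes
b = 0.1
# Anonymity is worse (Y larger) for popular destinations:
print(approx_unlinkability(b, p[0]), approx_unlinkability(b, p[-1]))
```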
Contributions
1. Use a black-box abstraction to create a probabilistic model of onion routing
2. Analyze unlinkability
   a. Provide worst-case bounds
   b. Examine a typical case
Future Work
1. Extend the analysis to other types of anonymity and to other systems.
2. Examine how quickly users' distributions are learned.
3. Analyze timing attacks.