A Probabilistic Analysis of Onion Routing in a Black-box Model
Aaron Johnson (Yale), with Joan Feigenbaum (Yale) and Paul Syverson (NRL)
Workshop on Privacy in the Electronic Society, 10/29/2007
Contributions
1. Use a black-box abstraction to create a probabilistic model of onion routing.
2. Analyze unlinkability:
   a. Provide worst-case bounds.
   b. Examine a typical case.
Related Work
1. "A Model of Onion Routing with Provable Anonymity," J. Feigenbaum, A. Johnson, and P. Syverson, FC 2007.
2. "Towards an Analysis of Onion Routing Security," P. Syverson, G. Tsudik, M. Reed, and C. Landwehr, PET 2000.
3. "An Analysis of the Degradation of Anonymous Protocols," M. Wright, M. Adler, B. Levine, and C. Shields, NDSS 2002.
Anonymous Communication
Sender anonymity: the adversary can't determine the sender of a given message.
Receiver anonymity: the adversary can't determine the receiver of a given message.
Unlinkability: the adversary can't determine who talks to whom.
How Onion Routing Works
[Figure: user u, running a client, connects through onion routers to an Internet destination d; the routers run onion-routing servers.]
1. u creates a 3-hop circuit through the routers.
2. u opens a stream in the circuit to d.
3. Data is exchanged, wrapped in one encryption layer per hop: a message m leaves u as {{{m}_3}_4}, is unwrapped to {{m}_3}, then {m}, and arrives at d as m; replies are wrapped layer by layer on the way back.
4. The stream is closed.
5. The circuit is changed every few minutes.
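The layer notation above can be made concrete with a small symbolic sketch. This is illustrative only: the wrap/peel helpers and hop labels are invented for this example, and real onion routing uses per-hop symmetric keys negotiated during circuit construction, not string wrapping.

```python
# Symbolic sketch of onion layering; no real cryptography involved.

def wrap(message: str, hops: list[str]) -> str:
    """Add one "encryption" layer per hop; the layer added first is the
    innermost one, stripped by the hop listed first."""
    for hop in hops:
        message = "{" + message + "}_" + hop
    return message

def peel(onion: str) -> str:
    """Strip the outermost layer, as a single router would."""
    return onion[1:onion.rindex("}")]

onion = wrap("m", ["3", "4"])
print(onion)         # {{m}_3}_4
onion = peel(onion)  # router 4 strips its layer -> {m}_3
onion = peel(onion)  # router 3 strips its layer -> m
print(onion)         # m
```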
Adversary
The adversary is active and local: it compromises some of the routers and observes the traffic passing through them.
Anonymity
[Figure: users u, v, w connect through the router network to destinations d, e, f.]
Cases:
1. First router compromised.
2. Last router compromised.
3. First and last routers compromised.
4. Neither first nor last router compromised.
Black-box Abstraction
[Figure: users u, v, w connect to destinations d, e, f through a black box.]
1. Users choose a destination.
2. Some inputs are observed.
3. Some outputs are observed.
Black-box Anonymity
The adversary can link the observed inputs and outputs of the same user.
Any configuration consistent with these observations is indistinguishable to the adversary.
Probabilistic Black-box
Each user v selects a destination from a distribution p_v.
Inputs and outputs are observed independently, each with probability b.
Probabilistic Anonymity
[Figure: four indistinguishable configurations.]
Conditional distribution over u's destination, e.g.: Pr[u→d] = 1.
Black-box Model
Let U be the set of users and Δ the set of destinations. A configuration C consists of:
- user destinations C_D : U → Δ
- observed inputs C_I : U → {0,1}
- observed outputs C_O : U → {0,1}
Let X be a random configuration such that
Pr[X = C] = ∏_u p_u^{C_D(u)} · b^{C_I(u)} (1-b)^{1-C_I(u)} · b^{C_O(u)} (1-b)^{1-C_O(u)}
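A runnable sketch of sampling from this model is below. The user names, destination set, distributions p_v, and b = 0.3 are made-up example values, not parameters from the paper.

```python
import random

users = ["u", "v", "w"]
destinations = ["d", "e", "f"]
b = 0.3                                    # probability an input/output is observed
p = {                                      # p[v][d] = Pr[v chooses destination d]
    "u": {"d": 0.6, "e": 0.3, "f": 0.1},
    "v": {"d": 0.2, "e": 0.5, "f": 0.3},
    "w": {"d": 0.1, "e": 0.2, "f": 0.7},
}

def sample_configuration():
    """Draw X according to Pr[X=C] above: destinations from p_v,
    each input and output observed independently with probability b."""
    C_D = {v: random.choices(destinations,
                             weights=[p[v][dest] for dest in destinations])[0]
           for v in users}
    C_I = {v: random.random() < b for v in users}
    C_O = {v: random.random() < b for v in users}
    return C_D, C_I, C_O

print(sample_configuration())
```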
Probabilistic Anonymity
The metric Y for the unlinkability of u and d in C is:
Y(C) = Pr[X_D(u) = d | X ≈ C],
where X ≈ C means X is indistinguishable from C to the adversary.
(Note: there are several other candidates for a probabilistic anonymity metric, e.g., entropy.)
This metric corresponds to exact Bayesian inference: the adversary after a long-term intersection attack, i.e., a worst-case adversary.
Unlinkability given that u visits d: E[Y | X_D(u) = d].
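For tiny instances, Y(C) can be computed by brute force. The sketch below (restating the example parameters from the previous sketch) formalizes the adversary's view one plausible way: linked (user, destination) pairs when both ends of a user are observed, plus the set of observed-but-unlinked inputs and the multiset of observed-but-unlinked outputs. It then applies Bayes' rule over all configurations with the same view. This formalization is a reading of the slides, not code from the paper.

```python
from itertools import product
from collections import Counter

users = ["u", "v", "w"]
destinations = ["d", "e", "f"]
b = 0.3
p = {"u": {"d": 0.6, "e": 0.3, "f": 0.1},
     "v": {"d": 0.2, "e": 0.5, "f": 0.3},
     "w": {"d": 0.1, "e": 0.2, "f": 0.7}}

def prob(C_D, C_I, C_O):
    """Pr[X = C] from the model definition."""
    pr = 1.0
    for v in users:
        pr *= p[v][C_D[v]]
        pr *= b if C_I[v] else 1 - b
        pr *= b if C_O[v] else 1 - b
    return pr

def view(C_D, C_I, C_O):
    """The adversary's observations of configuration C."""
    linked = frozenset((v, C_D[v]) for v in users if C_I[v] and C_O[v])
    lone_inputs = frozenset(v for v in users if C_I[v] and not C_O[v])
    lone_outputs = frozenset(Counter(C_D[v] for v in users
                                     if C_O[v] and not C_I[v]).items())
    return linked, lone_inputs, lone_outputs

def Y(C, u, d):
    """Pr[X_D(u) = d | X ~ C] by enumerating all configurations."""
    target = view(*C)
    num = den = 0.0
    for dests in product(destinations, repeat=len(users)):
        C_D = dict(zip(users, dests))
        for obs_i in product([False, True], repeat=len(users)):
            for obs_o in product([False, True], repeat=len(users)):
                C_I = dict(zip(users, obs_i))
                C_O = dict(zip(users, obs_o))
                if view(C_D, C_I, C_O) == target:
                    pr = prob(C_D, C_I, C_O)
                    den += pr
                    if C_D[u] == d:
                        num += pr
    return num / den

# Example: u's input is observed but not its output; v's output is observed.
C = ({"u": "d", "v": "e", "w": "f"},       # C_D
     {"u": True, "v": False, "w": False},  # C_I
     {"u": False, "v": True, "w": False})  # C_O
print(Y(C, "u", "d"))
```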
Worst-case Anonymity
Theorem 1: The maximum of E[Y | X_D(u)=d] over (p_v)_{v≠u} occurs when
1. p_v^1 = 1 for all v ≠ u, OR
2. p_v^d = 1 for all v ≠ u,
where the destinations other than d are indexed so that p_u^1 ≥ p_u^2 ≥ … ≥ p_u^{d-1} ≥ p_u^{d+1} ≥ …, i.e., destination 1 is u's most likely destination other than d.
Proof outline:
1. Show the maximum occurs when, for all v ≠ u, p_v^{e_v} = 1 for some destination e_v.
2. Show the maximum occurs when, for all v ≠ u, e_v = d or e_v = 1.
3. Show the maximum occurs when e_v = d for all v ≠ u, or e_v = 1 for all v ≠ u.
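For a toy instance, this corner structure can be probed numerically. The sketch below reuses users, destinations, p, b, prob, view, and Y from the previous sketch, shrinks the instance to two users and two destinations for speed, and sweeps v's distribution from "always visit e" to "always visit d". Theorem 1 predicts the largest value of E[Y | X_D(u)=d] in this family appears at an endpoint. This is an illustration under the view formalization above, not the paper's proof.

```python
# Reuses prob, view, Y (and rebinds users/destinations/p) from the sketch above.
users = ["u", "v"]                  # shrink the instance so enumeration is fast
destinations = ["d", "e"]
p["u"] = {"d": 0.6, "e": 0.4}

def expected_Y(u="u", d="d"):
    """E[Y | X_D(u)=d] by exhaustive enumeration."""
    num = den = 0.0
    for dests in product(destinations, repeat=len(users)):
        C_D = dict(zip(users, dests))
        if C_D[u] != d:
            continue
        for obs_i in product([False, True], repeat=len(users)):
            for obs_o in product([False, True], repeat=len(users)):
                C_I = dict(zip(users, obs_i))
                C_O = dict(zip(users, obs_o))
                pr = prob(C_D, C_I, C_O)
                num += pr * Y((C_D, C_I, C_O), u, d)
                den += pr
    return num / den

for q in (0.0, 0.25, 0.5, 0.75, 1.0):   # q = Pr[v visits d]; otherwise v visits e
    p["v"] = {"d": q, "e": 1 - q}
    print(f"Pr[v->d] = {q}: E[Y | u->d] = {expected_Y():.4f}")
```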
Worst-case Estimates
Let n be the number of users.
Theorem 2: When p_v^1 = 1 for all v ≠ u:
E[Y | X_D(u)=d] = b + b(1-b)p_u^d + (1-b)^2 p_u^d [ (1-b)/(1-(1-p_u^1)b) + O(√(log n / n)) ]
  ≈ b + (1-b) p_u^d
Theorem 3: When p_v^d = 1 for all v ≠ u:
E[Y | X_D(u)=d] = b^2 + b(1-b)p_u^d + (1-b) p_u^d / (1-(1-p_u^d)b) + O(√(log n / n))
  ≈ b^2 + (1-b^2) p_u^d
So the worst case increases the chance of total compromise from b^2 to b.
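To see how tight the simple approximations are, one can evaluate the theorem expressions (dropping the O(√(log n / n)) terms) against them numerically. The values of p_u^d and p_u^1 below are arbitrary examples.

```python
p_ud, p_u1 = 0.4, 0.5   # example values of p_u^d and p_u^1

for b in (0.05, 0.1, 0.25, 0.5):
    # Theorem 2 (all other users visit destination 1), minus the O(.) term:
    t2 = b + b*(1 - b)*p_ud + (1 - b)**2 * p_ud * (1 - b)/(1 - (1 - p_u1)*b)
    # Theorem 3 (all other users visit d), minus the O(.) term:
    t3 = b*b + b*(1 - b)*p_ud + (1 - b)*p_ud/(1 - (1 - p_ud)*b)
    print(f"b={b:4}: T2={t2:.3f} vs approx {b + (1-b)*p_ud:.3f}   "
          f"T3={t3:.3f} vs approx {b*b + (1-b*b)*p_ud:.3f}")
```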
Typical Case
Let each user select destinations from a Zipfian distribution: p^{d_i} = 1/(κ i^s), where κ is the normalizing constant.
Theorem 4: E[Y | X_D(u)=d] = b^2 + (1-b^2) p_u^d + O(1/n)
  ≈ b^2 + (1-b^2) p_u^d
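A quick sketch of the typical-case estimate under an assumed Zipf exponent, destination count, and compromise probability (all values illustrative):

```python
s, n_dest, b = 1.0, 1000, 0.1                        # assumed example parameters
kappa = sum(1 / i**s for i in range(1, n_dest + 1))  # normalizing constant
zipf = [1 / (kappa * i**s) for i in range(1, n_dest + 1)]

for rank in (1, 10, 100):                # u's destination d has this popularity rank
    p_ud = zipf[rank - 1]
    print(f"rank {rank:3}: p_u^d = {p_ud:.4f}, "
          f"E[Y | u->d] ~ {b*b + (1 - b*b)*p_ud:.4f}")
```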
Contributions
1. Use a black-box abstraction to create a probabilistic model of onion routing.
2. Analyze unlinkability:
   a. Provide worst-case bounds.
   b. Examine a typical case.
Future Work
1. Extend the analysis to other types of anonymity and to other systems.
2. Examine how quickly users' destination distributions are learned.
3. Analyze timing attacks.