A Probabilistic Analysis of Onion Routing in a Black-box Model. 10/29/2007, Workshop on Privacy in the Electronic Society. Aaron Johnson (Yale) with Joan Feigenbaum (Yale) and Paul Syverson (NRL).

Contributions

1. Use a black-box abstraction to create a probabilistic model of onion routing

Contributions 1. Use a black-box abstraction to create a probabilistic model of onion routing 2. Analyze unlinkability: a. Provide worst-case bounds b. Examine a typical case

Related Work: A Model of Onion Routing with Provable Anonymity, J. Feigenbaum, A. Johnson, and P. Syverson, FC 2007; Towards an Analysis of Onion Routing Security, P. Syverson, G. Tsudik, M. Reed, and C. Landwehr, PET 2000; An Analysis of the Degradation of Anonymous Protocols, M. Wright, M. Adler, B. Levine, and C. Shields, NDSS 2002.

Anonymous Communication. Sender anonymity: the adversary can't determine the sender of a given message. Receiver anonymity: the adversary can't determine the receiver of a given message. Unlinkability: the adversary can't determine who talks to whom.

How Onion Routing Works: user u running a client, Internet destination d, routers running servers.

How Onion Routing Works: 1. u creates a 3-hop circuit through the routers.

How Onion Routing Works: 1. u creates a 3-hop circuit through the routers. 2. u opens a stream in the circuit to d.

How Onion Routing Works: 1. u creates a 3-hop circuit through the routers. 2. u opens a stream in the circuit to d. 3. Data is exchanged: {{{m}_3}_4}

How Onion Routing Works: 1. u creates a 3-hop circuit through the routers. 2. u opens a stream in the circuit to d. 3. Data is exchanged: {{m}_3}

How Onion Routing Works: 1. u creates a 3-hop circuit through the routers. 2. u opens a stream in the circuit to d. 3. Data is exchanged: {m}

How Onion Routing Works: 1. u creates a 3-hop circuit through the routers. 2. u opens a stream in the circuit to d. 3. Data is exchanged: m

How Onion Routing Works: 1. u creates a 3-hop circuit through the routers. 2. u opens a stream in the circuit to d. 3. Data is exchanged: {m}

How Onion Routing Works: 1. u creates a 3-hop circuit through the routers. 2. u opens a stream in the circuit to d. 3. Data is exchanged: {{m}_3}

How Onion Routing Works: 1. u creates a 3-hop circuit through the routers. 2. u opens a stream in the circuit to d. 3. Data is exchanged: {{{m}_3}_4}

How Onion Routing Works: 1. u creates a 3-hop circuit through the routers. 2. u opens a stream in the circuit to d. 3. Data is exchanged. 4. Stream is closed.

How Onion Routing Works: 1. u creates a 3-hop circuit through the routers. 2. u opens a stream in the circuit to d. 3. Data is exchanged. 4. Stream is closed. 5. Circuit is changed every few minutes.
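
To make step 3 concrete, here is a minimal sketch of the layering idea, not the authors' implementation: the client wraps the message once per hop, and each hop peels exactly one layer. The toy XOR cipher and all names here are placeholders standing in for the real key exchange and encryption an onion router uses.

```python
# Minimal sketch of onion layering for a 3-hop circuit (illustration only).
# A real onion router uses authenticated, negotiated ciphers; a toy XOR
# stream stands in here so the example stays self-contained.

from itertools import cycle

def toy_encrypt(key: bytes, data: bytes) -> bytes:
    """Placeholder cipher: XOR with a repeating key (NOT secure)."""
    return bytes(b ^ k for b, k in zip(data, cycle(key)))

toy_decrypt = toy_encrypt  # XOR is its own inverse

def wrap_onion(message: bytes, hop_keys: list[bytes]) -> bytes:
    """Client side: encrypt for the last hop first, so the first hop's
    layer ends up outermost, as in the slides' nested {{...}} notation."""
    cell = message
    for key in reversed(hop_keys):
        cell = toy_encrypt(key, cell)
    return cell

def unwrap_one_layer(key: bytes, cell: bytes) -> bytes:
    """Router side: each hop removes exactly one layer and forwards."""
    return toy_decrypt(key, cell)

if __name__ == "__main__":
    keys = [b"key-hop1", b"key-hop2", b"key-hop3"]   # first, middle, last hop
    onion = wrap_onion(b"GET d", keys)
    for k in keys:                                    # each hop peels its layer
        onion = unwrap_one_layer(k, onion)
    assert onion == b"GET d"                          # last hop recovers m for d
```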

Adversary: Active & Local.

Anonymity: users u, v, w and destinations d, e, f.

Anonymity: 1. First router compromised.

Anonymity: 1. First router compromised. 2. Last router compromised.

Anonymity: 1. First router compromised. 2. Last router compromised. 3. First and last compromised.

Anonymity: 1. First router compromised. 2. Last router compromised. 3. First and last compromised. 4. Neither first nor last compromised.
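
Reading "first router compromised" as "input observed" and "last router compromised" as "output observed", as the black-box model on the following slides does, and assuming each end is observed independently with probability b, the four cases occur with the probabilities tabulated in this illustrative sketch (the function name and the value of b are mine):

```python
# Probability of each compromise case for one circuit, assuming the first
# and last routers are compromised independently with probability b, as in
# the black-box model introduced on the following slides.

def compromise_cases(b: float) -> dict[str, float]:
    return {
        "first only (input observed)":          b * (1 - b),
        "last only (output observed)":          (1 - b) * b,
        "first and last (input/output linked)": b * b,
        "neither (nothing observed)":           (1 - b) * (1 - b),
    }

if __name__ == "__main__":
    for case, pr in compromise_cases(b=0.1).items():
        print(f"{case:40s} {pr:.3f}")
```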

Black-box Abstraction

Black-box Abstraction: 1. Users choose a destination.

Black-box Abstraction: 1. Users choose a destination. 2. Some inputs are observed.

Black-box Abstraction: 1. Users choose a destination. 2. Some inputs are observed. 3. Some outputs are observed.

Black-box Anonymity: The adversary can link observed inputs and outputs of the same user.

Black-box Anonymity: The adversary can link observed inputs and outputs of the same user. Any configuration consistent with these observations is indistinguishable to the adversary.

Probabilistic Black-box

Probabilistic Black-box: Each user v selects a destination from distribution p_v.

Probabilistic Black-box: Each user v selects a destination from distribution p_v. Inputs and outputs are observed independently with probability b.

Probabilistic Anonymity: Indistinguishable configurations.

Probabilistic Anonymity: Indistinguishable configurations. Conditional distribution: Pr[u → d] = 1.

Black Box Model. Let U be the set of users and let Δ be the set of destinations. A configuration C consists of: user destinations C_D : U → Δ, observed inputs C_I : U → {0,1}, and observed outputs C_O : U → {0,1}. Let X be a random configuration such that: Pr[X=C] = ∏_u p_u^{C_D(u)} b^{C_I(u)} (1-b)^{1-C_I(u)} b^{C_O(u)} (1-b)^{1-C_O(u)}
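
The product form of Pr[X=C] says a configuration can be sampled directly: draw each user's destination from p_u, then flip a b-biased coin for whether its input and its output are observed. The sketch below is an illustrative sampler under that reading; the function and variable names are mine, not the paper's.

```python
import random

# Sample a configuration X of the black-box model:
#   X_D(u) ~ p_u                 (destination of user u)
#   X_I(u), X_O(u) ~ Bernoulli(b), independently (input/output observed)
# so that Pr[X = C] matches the product formula above.

def sample_configuration(p, b, rng=random):
    """p: dict user -> dict destination -> probability; b: observation prob."""
    config = {}
    for user, dist in p.items():
        dests, weights = zip(*dist.items())
        config[user] = {
            "dest": rng.choices(dests, weights=weights, k=1)[0],
            "input_observed": rng.random() < b,
            "output_observed": rng.random() < b,
        }
    return config

if __name__ == "__main__":
    p = {"u": {"d": 0.6, "e": 0.4}, "v": {"d": 0.5, "f": 0.5}}
    print(sample_configuration(p, b=0.25))
```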

Probabilistic Anonymity. The metric Y for the unlinkability of u and d in C is: Y(C) = Pr[X_D(u)=d | X ≈ C], where X ≈ C means that X is indistinguishable from C to the adversary.

Probabilistic Anonymity. The metric Y for the unlinkability of u and d in C is: Y(C) = Pr[X_D(u)=d | X ≈ C]. Note: there are several other candidates for a probabilistic anonymity metric, e.g., entropy.

Probabilistic Anonymity. The metric Y for the unlinkability of u and d in C is: Y(C) = Pr[X_D(u)=d | X ≈ C] (exact Bayesian inference; the adversary after a long-term intersection attack; a worst-case adversary).

Probabilistic Anonymity. The metric Y for the unlinkability of u and d in C is: Y(C) = Pr[X_D(u)=d | X ≈ C] (exact Bayesian inference; the adversary after a long-term intersection attack; a worst-case adversary). Unlinkability given that u visits d: E[Y | X_D(u)=d].
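
Since Y is an exact Bayesian posterior, it can be computed in tiny instances by brute force: enumerate all configurations, keep those whose adversary view matches the observed configuration, and weight them by Pr[X=C]. The sketch below does exactly that, using the simple notion of indistinguishability from the earlier slides (observed senders, linked input-output pairs, and unlinked observed destinations); it is an illustration of the definition, not the paper's analytical method, and all names are mine.

```python
from itertools import product

# Brute-force computation of Y(C) = Pr[X_D(u)=d | X ≈ C] for a tiny
# instance of the black-box model.

def view(config):
    """What the adversary sees: who it observed sending, which observed
    senders it could link to a destination, and the multiset of observed
    but unlinked destinations."""
    senders = frozenset(u for u, c in config.items() if c["in"])
    linked = frozenset((u, c["dest"]) for u, c in config.items()
                       if c["in"] and c["out"])
    unlinked = tuple(sorted(c["dest"] for c in config.values()
                            if c["out"] and not c["in"]))
    return senders, linked, unlinked

def prob(config, p, b):
    """Pr[X = C] from the Black Box Model slide's product formula."""
    pr = 1.0
    for u, c in config.items():
        pr *= p[u][c["dest"]]
        pr *= b if c["in"] else (1 - b)
        pr *= b if c["out"] else (1 - b)
    return pr

def posterior(observed_config, user, dest, p, b):
    """Pr[X_D(user)=dest | X indistinguishable from observed_config]."""
    target = view(observed_config)
    users = list(p)
    total = total_with_dest = 0.0
    for dests in product(*[list(p[u]) for u in users]):
        for bits in product([False, True], repeat=2 * len(users)):
            config = {u: {"dest": dv, "in": bits[2 * i], "out": bits[2 * i + 1]}
                      for i, (u, dv) in enumerate(zip(users, dests))}
            if view(config) != target:
                continue
            pr = prob(config, p, b)
            total += pr
            if config[user]["dest"] == dest:
                total_with_dest += pr
    return total_with_dest / total

if __name__ == "__main__":
    p = {"u": {"d": 0.5, "e": 0.5}, "v": {"d": 0.5, "e": 0.5}}
    # Observed configuration: u's input is seen but not its output; v unseen.
    C = {"u": {"dest": "d", "in": True, "out": False},
         "v": {"dest": "e", "in": False, "out": False}}
    print(posterior(C, "u", "d", p, b=0.3))   # here nothing links u to d
```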

Worst-case Anonymity

Theorem 1: The maximum of E[Y | X_D(u)=d] over (p_v)_{v≠u} occurs when 1. p_v^1 = 1 for all v ≠ u, OR 2. p_v^d = 1 for all v ≠ u. Let p_u^1 ≥ p_u^2 ≥ … ≥ p_u^{d-1} ≥ p_u^{d+1} ≥ …

Worst-case Anonymity. Theorem 1: The maximum of E[Y | X_D(u)=d] over (p_v)_{v≠u} occurs when 1. p_v^1 = 1 for all v ≠ u, OR 2. p_v^d = 1 for all v ≠ u. Let p_u^1 ≥ p_u^2 ≥ … ≥ p_u^{d-1} ≥ p_u^{d+1} ≥ … Show the maximum occurs when, for all v ≠ u, p_v^{e_v} = 1 for some destination e_v. Then show the maximum occurs when e_v = d for all v ≠ u, or when e_v = 1 for all v ≠ u.

Worst-case Estimates Let n be the number of users.

Worst-case Estimates. Let n be the number of users. Theorem 2: When p_v^1 = 1 for all v ≠ u: E[Y | X_D(u)=d] = b + b(1-b)p_u^d + (1-b)^2 p_u^d [ (1-b)/(1-(1-p_u^1)b) + O(√(log n / n)) ]

Worst-case Estimates. Let n be the number of users. Theorem 2: When p_v^1 = 1 for all v ≠ u: E[Y | X_D(u)=d] = b + b(1-b)p_u^d + (1-b)^2 p_u^d [ (1-b)/(1-(1-p_u^1)b) + O(√(log n / n)) ] Theorem 3: When p_v^d = 1 for all v ≠ u: E[Y | X_D(u)=d] = b^2 + b(1-b)p_u^d + (1-b)p_u^d [ 1/(1-(1-p_u^d)b) + O(√(log n / n)) ]

Worst-case Estimates. Let n be the number of users. Theorem 2: When p_v^1 = 1 for all v ≠ u: E[Y | X_D(u)=d] = b + b(1-b)p_u^d + (1-b)^2 p_u^d [ (1-b)/(1-(1-p_u^1)b) + O(√(log n / n)) ] ≈ b + (1-b)p_u^d

Worst-case Estimates. Let n be the number of users. Theorem 2: When p_v^1 = 1 for all v ≠ u: E[Y | X_D(u)=d] = b + b(1-b)p_u^d + (1-b)^2 p_u^d [ (1-b)/(1-(1-p_u^1)b) + O(√(log n / n)) ] ≈ b + (1-b)p_u^d. E[Y | X_D(u)=d] ≥ b^2 + (1-b^2)p_u^d.

Worst-case Estimates. Let n be the number of users. Theorem 2: When p_v^1 = 1 for all v ≠ u: E[Y | X_D(u)=d] = b + b(1-b)p_u^d + (1-b)^2 p_u^d [ (1-b)/(1-(1-p_u^1)b) + O(√(log n / n)) ] ≈ b + (1-b)p_u^d. E[Y | X_D(u)=d] ≥ b^2 + (1-b^2)p_u^d. Increased chance of total compromise from b^2 to b.
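
To see what the shift from b^2 to b means numerically, the following short calculation (my own illustration) compares the simplified worst-case value b + (1-b)p_u^d with the baseline b^2 + (1-b^2)p_u^d for a few example parameters.

```python
# Compare the simplified worst-case estimate with the b^2 baseline
# from this slide, for a few illustrative parameter choices.

def worst_case(b, pud):   # ≈ E[Y | X_D(u)=d] when other users give no cover
    return b + (1 - b) * pud

def baseline(b, pud):     # b^2 + (1 - b^2) p_u^d
    return b * b + (1 - b * b) * pud

if __name__ == "__main__":
    for b in (0.05, 0.1, 0.2):
        for pud in (0.01, 0.1):
            print(f"b={b:.2f}  p_u^d={pud:.2f}  "
                  f"worst={worst_case(b, pud):.3f}  "
                  f"baseline={baseline(b, pud):.3f}")
```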

Typical Case. Let each user select destinations from a Zipfian distribution: p^{d_i} ∝ 1/i^s. Theorem 4: E[Y | X_D(u)=d] = b^2 + (1-b^2)p_u^d + O(1/n)

Typical Case. Let each user select destinations from a Zipfian distribution: p^{d_i} ∝ 1/i^s. Theorem 4: E[Y | X_D(u)=d] = b^2 + (1-b^2)p_u^d + O(1/n), so E[Y | X_D(u)=d] ≈ b^2 + (1-b^2)p_u^d.
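
The typical-case estimate is easy to evaluate for a Zipfian prior. The sketch below builds the normalized distribution p^{d_i} ∝ 1/i^s and plugs p_u^d into b^2 + (1-b^2)p_u^d; the exponent s, the number of destinations, and b are illustrative choices, not values from the paper.

```python
# Zipfian destination distribution and the typical-case estimate
# E[Y | X_D(u)=d] ≈ b^2 + (1 - b^2) p_u^d from Theorem 4.

def zipf_dist(num_dests: int, s: float) -> list[float]:
    """p^{d_i} proportional to 1/i^s, normalized over i = 1..num_dests."""
    weights = [1.0 / (i ** s) for i in range(1, num_dests + 1)]
    total = sum(weights)
    return [w / total for w in weights]

def typical_case_estimate(b: float, p_ud: float) -> float:
    return b * b + (1 - b * b) * p_ud

if __name__ == "__main__":
    p = zipf_dist(num_dests=1000, s=1.0)
    for rank in (1, 10, 100):          # a popular, mid, and rare destination
        p_ud = p[rank - 1]
        print(f"rank {rank:4d}: p_u^d={p_ud:.4f}  "
              f"estimate={typical_case_estimate(0.1, p_ud):.4f}")
```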

Contributions 1. Use a black-box abstraction to create a probabilistic model of onion routing 2. Analyze unlinkability: a. Provide worst-case bounds b. Examine a typical case

Future Work 1. Extend the analysis to other types of anonymity and to other systems. 2. Examine how quickly users' distributions are learned. 3. Analyze timing attacks.