1 Modeling and Analysis of Anonymous-Communication Systems Joan Feigenbaum WITS’08; Princeton NJ; June 18, 2008 Acknowledgement: Aaron Johnson

2 Outline Anonymity: What and why Examples of anonymity systems Theory: Definition and proof Practice: Onion Routing Theory meets practice

3 Anonymity: What and Why The adversary cannot tell who is communicating with whom. Not the same as confidentiality (and hence not solved by encryption). Pro: Facilitates communication by whistle blowers, political dissidents, members of 12-step programs, etc. Con: Inhibits accountability

4 Outline Anonymity: What and why Examples of anonymity systems Theory: Definition and proof Practice: Onion Routing Theory meets practice

5 Anonymity Systems Remailers / Mix Networks –anon.penet.fi –MixMaster –Mixminion Low-latency communication –Anonymous proxies, anonymizer.net –Freedom –Tor –JAP Data Publishing –FreeNet

6 Mix Networks First outlined by Chaum in 1981 Provide anonymous communication –High latency –Message-based (“message-oriented”) –One-way or two-way

7 Mix Networks [Figure: users send messages through mixes to destinations]

8 Mix Networks [Figure: users, mixes, destinations; an adversary observes part of the network]

9 Mix Networks Protocol: [Figure: users, mixes, destinations; adversary]

10 Mix Networks Protocol: 1. User selects a sequence of mixes and a destination. [Figure: u → M1 → M2 → M3 → d; adversary observes]

11 Mix Networks Protocol: 1. User selects a sequence of mixes and a destination. 2. Onion-encrypt the message. [Figure: u → M1 → M2 → M3 → d]

12 Mix Networks Protocol: 1. User selects a sequence of mixes and a destination. 2. Onion-encrypt the message. Onion Encrypt: 1. Proceed in reverse order of the user’s path. 2. Encrypt (message, next hop) with the public key of the mix. [Figure: u → M1 → M2 → M3 → d]

13 Mix Networks Protocol: 1. User selects a sequence of mixes and a destination. 2. Onion-encrypt the message. Onion Encrypt: 1. Proceed in reverse order of the user’s path. 2. Encrypt (message, next hop) with the public key of the mix. [Figure: u holds {{{m,d}_M3, M3}_M2, M2}_M1]

14 Mix Networks Protocol: 1. User selects a sequence of mixes and a destination. 2. Onion-encrypt the message. 3. Send the message, removing a layer of encryption at each mix. Onion Encrypt: 1. Proceed in reverse order of the user’s path. 2. Encrypt (message, next hop) with the public key of the mix. [Figure: u sends {{{m,d}_M3, M3}_M2, M2}_M1 to M1]

15 Mix Networks Protocol: 1. User selects a sequence of mixes and a destination. 2. Onion-encrypt the message. 3. Send the message, removing a layer of encryption at each mix. Onion Encrypt: 1. Proceed in reverse order of the user’s path. 2. Encrypt (message, next hop) with the public key of the mix. [Figure: M1 forwards {{m,d}_M3, M3}_M2 to M2]

16 Mix Networks Protocol: 1. User selects a sequence of mixes and a destination. 2. Onion-encrypt the message. 3. Send the message, removing a layer of encryption at each mix. Onion Encrypt: 1. Proceed in reverse order of the user’s path. 2. Encrypt (message, next hop) with the public key of the mix. [Figure: M2 forwards {m,d}_M3 to M3]

17 Mix Networks Protocol: 1. User selects a sequence of mixes and a destination. 2. Onion-encrypt the message. 3. Send the message, removing a layer of encryption at each mix. Onion Encrypt: 1. Proceed in reverse order of the user’s path. 2. Encrypt (message, next hop) with the public key of the mix. [Figure: M3 delivers m to d]
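The onion-encryption recipe on slides 10–17 can be sketched in a few lines. This is a toy symbolic model, not real cryptography: "encryption under a mix's public key" is stood in for by a tagged tuple, and the mix names M1, M2, M3 are the illustrative ones from the slides.

```python
def onion_encrypt(message, destination, path):
    """Wrap (message, next hop) in layers, innermost layer first (slide 12)."""
    onion = (message, destination)
    for mix in reversed(path):          # proceed in reverse order of the path
        onion = ("enc", mix, onion)     # stands in for public-key encryption
    return onion

def mix_process(mix, onion):
    """A mix strips the layer addressed to it; only the next hop becomes visible."""
    tag, addressee, inner = onion
    assert tag == "enc" and addressee == mix, "layer not addressed to this mix"
    return inner

path = ["M1", "M2", "M3"]
onion = onion_encrypt("hello", "d", path)
for mix in path:                        # each mix removes one layer (slide 14)
    onion = mix_process(mix, onion)
print(onion)  # ('hello', 'd')
```

Note how each mix, when it unwraps its layer, sees only the next hop's name in the inner tuple, mirroring the property that no single mix knows both source and destination.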

18 Mix Networks Anonymity? 1. No one mix knows both source and destination. [Figure: u → mixes → d; adversary]

19 Mix Networks Anonymity? 1. No one mix knows both source and destination. 2. Adversary cannot follow multiple messages through the same mix. [Figure: u → d, v → f]

20 Mix Networks Anonymity? 1. No one mix knows both source and destination. 2. Adversary cannot follow multiple messages through the same mix. 3. More users provide more anonymity. [Figure: u → d, v → e, w → f]

21 Outline Anonymity: What and why Examples of anonymity systems Theory: Definition and proof Practice: Onion Routing Theory meets practice

22 Provable Anonymity in Mix Networks Setting: N users. Passive, local adversary: the adversary observes some of the mixes and the links; a fraction f of the links are not observed. Users and mixes are roughly synchronized. Users choose mixes uniformly at random.

23 Provable Anonymity in Mix Networks Definition: Users should be unlinkable to their destinations. Let π be a random permutation that maps users to destinations. Let C be the traffic matrix observed by the adversary during the protocol: C_{e,i} = number of messages on link e in round i. [Figure: links e1, e2]

24 Provable Anonymity in Mix Networks Information-theory background: Use information theory to quantify the information gained from observing C. H(X) = −Σ_x Pr[X=x] log Pr[X=x] is the entropy of the random variable X. I(X : Y) is the mutual information between X and Y: I(X : Y) = H(X) − H(X | Y) = Σ_{x,y} Pr[X=x ∧ Y=y] log( Pr[X=x ∧ Y=y] / (Pr[X=x] Pr[Y=y]) ).
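The two quantities on slide 24 are easy to compute exactly for small distributions. A minimal sketch, with joint distributions represented as dicts keyed by (x, y) pairs (the example distributions are made up for illustration):

```python
import math

def entropy(dist):
    """H(X) = -sum_x Pr[X=x] * log2 Pr[X=x], in bits."""
    return -sum(p * math.log2(p) for p in dist.values() if p > 0)

def mutual_information(joint):
    """I(X:Y) = H(X) - H(X|Y), from a joint distribution {(x, y): Pr[X=x, Y=y]}."""
    px, py = {}, {}
    for (x, y), p in joint.items():     # marginalize
        px[x] = px.get(x, 0.0) + p
        py[y] = py.get(y, 0.0) + p
    # H(X|Y) = H(X,Y) - H(Y), so I(X:Y) = H(X) + H(Y) - H(X,Y)
    return entropy(px) + entropy(py) - entropy(joint)

# Independent X and Y carry zero mutual information ...
indep = {(x, y): 0.25 for x in "ab" for y in "cd"}
print(round(mutual_information(indep), 6))  # 0.0
# ... while a fully correlated pair reveals H(X) = 1 bit.
corr = {("a", "c"): 0.5, ("b", "d"): 0.5}
print(round(mutual_information(corr), 6))  # 1.0
```

In the unlinkability definition that follows, X plays the role of the traffic matrix C and Y the role of the permutation π: a small I(C : π) means the observed traffic says almost nothing about who talks to whom.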

25 Provable Anonymity in Synchronous Protocols Definition: The protocol is ε(N)-unlinkable if I(C : π) ≤ ε(N). Definition: An ε(N)-unlinkable protocol is efficient if: 1. It takes T(N) = O(polylog(N/ε(N))) rounds. 2. It uses O(N·T(N)) messages. Theorem (Berman, Fiat, and Ta-Shma, 2004): The basic mixnet protocol is ε(N)-unlinkable and efficient when T(N) = Θ(log(N) log²(N/ε(N))).

26 Outline Anonymity: What and why Examples of anonymity systems Theory: Definition and proof Practice: Onion Routing Theory meets practice

27 Onion Routing [GRS’96] Practical design with low latency and overhead Connection-oriented, two-way communication Open-source implementation Over 1000 volunteer routers Estimated 200,000 users

28 How Onion Routing Works User u running a client; Internet destination d; routers running servers. [Figure: u → routers → d]

29 How Onion Routing Works u creates 3-hop circuit through routers (u.a.r.). [Figure: u extends the circuit toward d]

30 How Onion Routing Works u creates 3-hop circuit through routers (u.a.r.). [Figure: u extends the circuit toward d]

31 How Onion Routing Works u creates 3-hop circuit through routers (u.a.r.). [Figure: u extends the circuit toward d]

32 How Onion Routing Works 1. u creates 3-hop circuit through routers (u.a.r.). 2. u opens a stream in the circuit to d. [Figure: u → d]

33 How Onion Routing Works 1. u creates 3-hop circuit through routers (u.a.r.). 2. u opens a stream in the circuit to d. 3. Data are exchanged. [Figure: u sends {{{m}_3}_4}_1]

34 How Onion Routing Works 1. u creates 3-hop circuit through routers (u.a.r.). 2. u opens a stream in the circuit to d. 3. Data are exchanged. [Figure: {{m}_3}_4 in transit]

35 How Onion Routing Works 1. u creates 3-hop circuit through routers (u.a.r.). 2. u opens a stream in the circuit to d. 3. Data are exchanged. [Figure: {m}_3 in transit]

36 How Onion Routing Works 1. u creates 3-hop circuit through routers (u.a.r.). 2. u opens a stream in the circuit to d. 3. Data are exchanged. [Figure: m delivered to d]

37 How Onion Routing Works 1. u creates 3-hop circuit through routers (u.a.r.). 2. u opens a stream in the circuit to d. 3. Data are exchanged. [Figure: d replies with m′]

38 How Onion Routing Works 1. u creates 3-hop circuit through routers (u.a.r.). 2. u opens a stream in the circuit to d. 3. Data are exchanged. [Figure: {m′}_3 in transit]

39 How Onion Routing Works 1. u creates 3-hop circuit through routers (u.a.r.). 2. u opens a stream in the circuit to d. 3. Data are exchanged. [Figure: {{m′}_3}_4 in transit]

40 How Onion Routing Works 1. u creates 3-hop circuit through routers (u.a.r.). 2. u opens a stream in the circuit to d. 3. Data are exchanged. [Figure: u receives {{{m′}_3}_4}_1]

41 How Onion Routing Works 1. u creates 3-hop circuit through routers (u.a.r.). 2. u opens a stream in the circuit to d. 3. Data are exchanged. 4. Stream is closed. [Figure: u → d]

42 How Onion Routing Works 1. u creates 3-hop circuit through routers (u.a.r.). 2. u opens a stream in the circuit to d. 3. Data are exchanged. 4. Stream is closed. 5. Circuit is changed every few minutes. [Figure: u → d]
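The circuit walkthrough on slides 28–42 differs from the mix-net example in two ways: the client shares a symmetric key with each router on the circuit, and the same circuit carries traffic in both directions. A minimal sketch of the layered symmetric encryption, where XOR with a hash-derived keystream stands in for a real stream cipher (the router ids 1, 4, 3 and the key values are illustrative; note that XOR layers commute, which keeps the sketch short but is not how real onion crypto behaves):

```python
import hashlib

def pad(key: bytes, n: int) -> bytes:
    """Deterministic keystream of length n (toy PRG, not secure)."""
    out, counter = b"", 0
    while len(out) < n:
        out += hashlib.sha256(key + counter.to_bytes(4, "big")).digest()
        counter += 1
    return out[:n]

def xor(data: bytes, key: bytes) -> bytes:
    """Apply (or remove) one encryption layer."""
    return bytes(a ^ b for a, b in zip(data, pad(key, len(data))))

circuit = [b"k1", b"k4", b"k3"]   # keys shared with routers 1, 4, 3 in path order

# Outbound: client adds all layers ({{{m}_3}_4}_1); each router removes one.
cell = b"GET /"
for k in reversed(circuit):
    cell = xor(cell, k)
for k in circuit:                 # routers peel layers in path order: 1, then 4, then 3
    cell = xor(cell, k)
assert cell == b"GET /"

# Return traffic: each router *adds* its layer (3, then 4, then 1);
# the client, knowing all three keys, removes them all.
reply = b"200 OK"
for k in reversed(circuit):
    reply = xor(reply, k)
for k in circuit:
    reply = xor(reply, k)
print(reply)  # b'200 OK'
```

The design point this illustrates: because the per-hop keys are symmetric and set up once at circuit creation, the per-cell work is cheap, which is what makes the low-latency, connection-oriented operation of slide 27 feasible.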

43 Adversary Active & local. [Figure: u → d]

44 Outline Anonymity: What and why Examples of anonymity systems Theory: Definition and proof Practice: Onion Routing Theory meets practice

45 Formal Analysis (F., Johnson, and Syverson, 2007) Timing attacks result in four cases: [Figure: users u, v, w; destinations d, e, f]

46 Formal Analysis (F., Johnson, and Syverson, 2007) Timing attacks result in four cases: 1. First router compromised. [Figure: users u, v, w; destinations d, e, f]

47 Formal Analysis (F., Johnson, and Syverson, 2007) Timing attacks result in four cases: 1. First router compromised. 2. Last router compromised. [Figure: users u, v, w; destinations d, e, f]

48 Formal Analysis (F., Johnson, and Syverson, 2007) Timing attacks result in four cases: 1. First router compromised. 2. Last router compromised. 3. First and last compromised. [Figure: users u, v, w; destinations d, e, f]

49 Formal Analysis (F., Johnson, and Syverson, 2007) Timing attacks result in four cases: 1. First router compromised. 2. Last router compromised. 3. First and last compromised. 4. Neither first nor last compromised. [Figure: users u, v, w; destinations d, e, f]

50 Black-Box, Onion-Routing Model Let U be the set of users. Let Δ be the set of destinations. Let the adversary control a fraction b of the routers. Configuration C: user destinations C_D : U → Δ; observed inputs C_I : U → {0,1}; observed outputs C_O : U → {0,1}. Let X be a random configuration such that: Pr[X=C] = Π_u [ p_u^{C_D(u)} ][ b^{C_I(u)} (1−b)^{1−C_I(u)} ][ b^{C_O(u)} (1−b)^{1−C_O(u)} ]
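The product formula on slide 50 transcribes directly into code: each user independently picks a destination by preference, and each circuit's entry and exit routers are independently compromised with probability b. The users, destinations, and preference probabilities p_u^d below are made up for illustration.

```python
b = 0.2                                   # fraction of compromised routers
p = {                                     # p[u][d] = p_u^d = Pr[user u visits d]
    "u": {"d": 0.7, "e": 0.3},
    "v": {"d": 0.5, "e": 0.5},
}

def pr_config(C_D, C_I, C_O):
    """Pr[X=C] = prod_u p_u^{C_D(u)} * b^{C_I(u)}(1-b)^{1-C_I(u)} * b^{C_O(u)}(1-b)^{1-C_O(u)}."""
    prob = 1.0
    for u in p:
        prob *= p[u][C_D[u]]
        prob *= b if C_I[u] else (1 - b)  # entry (first) router compromised?
        prob *= b if C_O[u] else (1 - b)  # exit (last) router compromised?
    return prob

C_D = {"u": "d", "v": "e"}                # who talks to whom
C_I = {"u": 1, "v": 0}                    # u's entry router is compromised
C_O = {"u": 0, "v": 0}
print(round(pr_config(C_D, C_I, C_O), 5))  # 0.03584
```

This factorization is exactly what makes the model "black-box": the adversary's observations depend only on these per-user coin flips, not on the internals of the routers.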

51 Indistinguishability Indistinguishable configurations. [Figure: four configurations over users u, v, w and destinations d, e, f that look identical to the adversary]

52 Indistinguishability Indistinguishable configurations. Note: Indistinguishable configurations form an equivalence relation. [Figure: four configurations over users u, v, w and destinations d, e, f that look identical to the adversary]

53 Probabilistic Anonymity The metric Y for the linkability of u and d in C is: Y(C) = Pr[X_D(u)=d | X ∈ C]

54 Probabilistic Anonymity The metric Y for the linkability of u and d in C is: Y(C) = Pr[X_D(u)=d | X ∈ C] Note: This is different from the metric of mutual information used to analyze mix nets.

55 Probabilistic Anonymity The metric Y for the linkability of u and d in C is: Y(C) = Pr[X_D(u)=d | X ∈ C] Exact Bayesian inference Adversary after long-term intersection attack Worst-case adversary

56 Probabilistic Anonymity The metric Y for the linkability of u and d in C is: Y(C) = Pr[X_D(u)=d | X ∈ C] Exact Bayesian inference Adversary after long-term intersection attack Worst-case adversary Linkability given that u visits d: E[Y | X_D(u)=d]
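The metric Y(C) can be computed by brute force in a tiny instance of the black-box model: enumerate every configuration, group those that produce the same adversary observation, and take the exact posterior. Everything below is an illustrative assumption: the two-user population, the probabilities, and in particular the `observation` function, which models the slides' timing attack (both ends compromised links a user to a destination; a compromised entry alone reveals only the user; a compromised exit alone reveals only the destination).

```python
import itertools

users, dests, b = ["u", "v"], ["d", "e"], 0.25
p = {"u": {"d": 0.7, "e": 0.3}, "v": {"d": 0.5, "e": 0.5}}

def prob(C_D, C_I, C_O):
    pr = 1.0
    for w in users:
        pr *= p[w][C_D[w]]
        pr *= b if C_I[w] else 1 - b
        pr *= b if C_O[w] else 1 - b
    return pr

def observation(C_D, C_I, C_O):
    """What the adversary sees (an assumed model of the timing attack)."""
    linked = tuple(sorted((w, C_D[w]) for w in users if C_I[w] and C_O[w]))
    seen_in = tuple(sorted(w for w in users if C_I[w] and not C_O[w]))
    seen_out = tuple(sorted(C_D[w] for w in users if C_O[w] and not C_I[w]))
    return (linked, seen_in, seen_out)

def all_configs():
    for ds in itertools.product(dests, repeat=len(users)):
        for ins in itertools.product([0, 1], repeat=len(users)):
            for outs in itertools.product([0, 1], repeat=len(users)):
                yield (dict(zip(users, ds)), dict(zip(users, ins)), dict(zip(users, outs)))

def Y(C):
    """Y(C) = Pr[X_D(u)=d | X in C's equivalence class], by exact Bayesian inference."""
    o = observation(*C)
    num = sum(prob(*C2) for C2 in all_configs()
              if observation(*C2) == o and C2[0]["u"] == "d")
    den = sum(prob(*C2) for C2 in all_configs() if observation(*C2) == o)
    return num / den

# Nothing on u's circuit is observed: the posterior equals u's prior p_u^d.
C = ({"u": "d", "v": "e"}, {"u": 0, "v": 0}, {"u": 0, "v": 0})
print(round(Y(C), 4))   # 0.7
# Both ends of u's circuit are compromised: u is fully linked to d.
C2 = ({"u": "d", "v": "e"}, {"u": 1, "v": 0}, {"u": 1, "v": 0})
print(round(Y(C2), 4))  # 1.0
```

The two printed cases are exactly the extremes that drive the bounds on the next slides: total compromise pushes Y to 1, while an unobserved circuit leaves Y at the prior p_u^d.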

57 Anonymity Bounds 1. Lower bound: E[Y | X_D(u)=d] ≥ b² + (1−b²) p_u^d

58 Anonymity Bounds 1. Lower bound: E[Y | X_D(u)=d] ≥ b² + (1−b²) p_u^d 2. Upper bounds: a. p_v^{d⋆} = 1 for all v ≠ u, where p_v^{d⋆} ≥ p_v^e for e ≠ d. b. p_v^d = 1 for all v ≠ u.

59 Anonymity Bounds 1. Lower bound: E[Y | X_D(u)=d] ≥ b² + (1−b²) p_u^d 2. Upper bounds: a. p_v^{d⋆} = 1 for all v ≠ u, where p_v^{d⋆} ≥ p_v^e for e ≠ d: E[Y | X_D(u)=d] ≤ b + (1−b) p_u^d + O(√(log n / n)) b. p_v^d = 1 for all v ≠ u: E[Y | X_D(u)=d] ≤ b² + (1−b²) p_u^d + O(√(log n / n))

60 Lower Bound Theorem 2: E[Y | X_D(u)=d] ≥ b² + (1−b²) p_u^d

61 Lower Bound Theorem 2: E[Y | X_D(u)=d] ≥ b² + (1−b²) p_u^d Proof:

62 Lower Bound Theorem 2: E[Y | X_D(u)=d] ≥ b² + (1−b²) p_u^d Proof: E[Y | X_D(u)=d] = b² + b(1−b) p_u^d + (1−b) E[Y | X_D(u)=d ∧ X_I(u)=0]

63 Lower Bound Theorem 2: E[Y | X_D(u)=d] ≥ b² + (1−b²) p_u^d Proof: E[Y | X_D(u)=d] = b² + b(1−b) p_u^d + (1−b) E[Y | X_D(u)=d ∧ X_I(u)=0]

64 Lower Bound Theorem 2: E[Y | X_D(u)=d] ≥ b² + (1−b²) p_u^d Let {C_i} be the configuration equivalence classes. Let D_i be the event C_i ∧ X_D(u)=d.

65 Lower Bound Theorem 2: E[Y | X_D(u)=d] ≥ b² + (1−b²) p_u^d Let {C_i} be the configuration equivalence classes. Let D_i be the event C_i ∧ X_D(u)=d. E[Y | X_D(u)=d ∧ X_I(u)=0] = Σ_i (Pr[D_i])² / ( Pr[C_i] Pr[X_D(u)=d] )

66 Lower Bound Theorem 2: E[Y | X_D(u)=d] ≥ b² + (1−b²) p_u^d Let {C_i} be the configuration equivalence classes. Let D_i be the event C_i ∧ X_D(u)=d. E[Y | X_D(u)=d ∧ X_I(u)=0] = Σ_i (Pr[D_i])² / ( Pr[C_i] Pr[X_D(u)=d] ) ≥ ( Σ_i Pr[D_i] )² / ( Σ_i Pr[C_i] · Pr[X_D(u)=d] ) by Cauchy-Schwarz

67 Lower Bound Theorem 2: E[Y | X_D(u)=d] ≥ b² + (1−b²) p_u^d Let {C_i} be the configuration equivalence classes. Let D_i be the event C_i ∧ X_D(u)=d. E[Y | X_D(u)=d ∧ X_I(u)=0] = Σ_i (Pr[D_i])² / ( Pr[C_i] Pr[X_D(u)=d] ) ≥ ( Σ_i Pr[D_i] )² / ( Σ_i Pr[C_i] · Pr[X_D(u)=d] ) by Cauchy-Schwarz = p_u^d

68 Lower Bound Theorem 2: E[Y | X_D(u)=d] ≥ b² + (1−b²) p_u^d Proof: E[Y | X_D(u)=d] = b² + b(1−b) p_u^d + (1−b) E[Y | X_D(u)=d ∧ X_I(u)=0]

69 Lower Bound Theorem 2: E[Y | X_D(u)=d] ≥ b² + (1−b²) p_u^d Proof: E[Y | X_D(u)=d] = b² + b(1−b) p_u^d + (1−b) E[Y | X_D(u)=d ∧ X_I(u)=0] ≥ b² + b(1−b) p_u^d + (1−b) p_u^d

70 Lower Bound Theorem 2: E[Y | X_D(u)=d] ≥ b² + (1−b²) p_u^d Proof: E[Y | X_D(u)=d] = b² + b(1−b) p_u^d + (1−b) E[Y | X_D(u)=d ∧ X_I(u)=0] ≥ b² + b(1−b) p_u^d + (1−b) p_u^d = b² + (1−b²) p_u^d
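The Cauchy-Schwarz step in the proof (slides 64–67) says that, for events D_i ⊆ C_i with Σ_i Pr[C_i] = 1 and Σ_i Pr[D_i] = Pr[X_D(u)=d] = p_u^d, we have Σ_i Pr[D_i]²/(Pr[C_i] p_u^d) ≥ p_u^d, since Σ a_i²/b_i ≥ (Σ a_i)²/(Σ b_i). A numeric sanity check on random instances (the class probabilities are randomly generated, not taken from any particular protocol):

```python
import random

random.seed(1)
for _ in range(1000):
    k = random.randint(1, 8)
    pr_C = [random.random() for _ in range(k)]
    s = sum(pr_C)
    pr_C = [x / s for x in pr_C]                 # classes partition the space
    pr_D = [random.uniform(0, c) for c in pr_C]  # D_i is a sub-event of C_i
    p_ud = sum(pr_D)                             # Pr[X_D(u)=d]
    lhs = sum(d * d / (c * p_ud) for d, c in zip(pr_D, pr_C))
    # sum_i Pr[D_i]^2 / (Pr[C_i] * p_u^d)  >=  p_u^d
    assert lhs >= p_ud - 1e-12
print("Cauchy-Schwarz bound holds on 1000 random instances")
```

Intuitively, the bound is tight exactly when Pr[D_i]/Pr[C_i] is the same in every class, i.e. when the adversary's observation carries no information about u's destination.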

71 Upper Bound

72 Upper Bound Theorem 3: The maximum of E[Y | X_D(u)=d] over (p_v)_{v≠u} occurs when 1. p_v^{d⋆} = 1 for all v ≠ u, OR 2. p_v^d = 1 for all v ≠ u. Let p_u^1 ≥ p_u^2 ≥ … ≥ p_u^{d−1} ≥ p_u^{d+1} ≥ … ≥ p_u^{d⋆}.

73 Upper Bound Theorem 3: The maximum of E[Y | X_D(u)=d] over (p_v)_{v≠u} occurs when 1. p_v^{d⋆} = 1 for all v ≠ u, OR 2. p_v^d = 1 for all v ≠ u. Let p_u^1 ≥ p_u^2 ≥ … ≥ p_u^{d−1} ≥ p_u^{d+1} ≥ … ≥ p_u^{d⋆}. Show max. occurs when, for all v ≠ u, p_v^{e_v} = 1 for some e_v.

74 Upper Bound Theorem 3: The maximum of E[Y | X_D(u)=d] over (p_v)_{v≠u} occurs when 1. p_v^{d⋆} = 1 for all v ≠ u, OR 2. p_v^d = 1 for all v ≠ u. Let p_u^1 ≥ p_u^2 ≥ … ≥ p_u^{d−1} ≥ p_u^{d+1} ≥ … ≥ p_u^{d⋆}. Show max. occurs when, for all v ≠ u, p_v^{e_v} = 1 for some e_v. Show max. occurs when, for all v ≠ u, e_v = d or e_v = d⋆.

75 Upper Bound Theorem 3: The maximum of E[Y | X_D(u)=d] over (p_v)_{v≠u} occurs when 1. p_v^{d⋆} = 1 for all v ≠ u, OR 2. p_v^d = 1 for all v ≠ u. Let p_u^1 ≥ p_u^2 ≥ … ≥ p_u^{d−1} ≥ p_u^{d+1} ≥ … ≥ p_u^{d⋆}. Show max. occurs when, for all v ≠ u, p_v^{e_v} = 1 for some e_v. Show max. occurs when, for all v ≠ u, e_v = d or e_v = d⋆. Show max. occurs when e_v = d for all v ≠ u, or when e_v = d⋆ for all v ≠ u.

76 Upper-bound Estimates Let n be the number of users.

77 Upper-bound Estimates Let n be the number of users. Theorem 4: When p_v^{d⋆} = 1 for all v ≠ u: E[Y | X_D(u)=d] = b + b(1−b) p_u^d + (1−b)² p_u^d [ (1−b) / (1−(1−p_u^{d⋆}) b) ] + O(√(log n / n))

78 Upper-bound Estimates Let n be the number of users. Theorem 4: When p_v^{d⋆} = 1 for all v ≠ u: E[Y | X_D(u)=d] = b + b(1−b) p_u^d + (1−b)² p_u^d [ (1−b) / (1−(1−p_u^{d⋆}) b) ] + O(√(log n / n)) Theorem 5: When p_v^d = 1 for all v ≠ u: E[Y | X_D(u)=d] = b² + b(1−b) p_u^d + (1−b) p_u^d / (1−(1−p_u^d) b) + O(√(log n / n))

79 Upper-bound Estimates Let n be the number of users. Theorem 4: When p_v^{d⋆} = 1 for all v ≠ u: E[Y | X_D(u)=d] = b + b(1−b) p_u^d + (1−b)² p_u^d [ (1−b) / (1−(1−p_u^{d⋆}) b) ] + O(√(log n / n))

80 Upper-bound Estimates Let n be the number of users. Theorem 4: When p_v^{d⋆} = 1 for all v ≠ u: E[Y | X_D(u)=d] = b + b(1−b) p_u^d + (1−b)² p_u^d [ (1−b) / (1−(1−p_u^{d⋆}) b) ] + O(√(log n / n)) For p_u^{d⋆} small: ≈ b + (1−b) p_u^d

81 Upper-bound Estimates Let n be the number of users. Theorem 4: When p_v^{d⋆} = 1 for all v ≠ u: E[Y | X_D(u)=d] = b + b(1−b) p_u^d + (1−b)² p_u^d [ (1−b) / (1−(1−p_u^{d⋆}) b) ] + O(√(log n / n)) For p_u^{d⋆} small: ≈ b + (1−b) p_u^d Recall the lower bound: E[Y | X_D(u)=d] ≥ b² + (1−b²) p_u^d

82 Upper-bound Estimates Let n be the number of users. Theorem 4: When p_v^{d⋆} = 1 for all v ≠ u: E[Y | X_D(u)=d] = b + b(1−b) p_u^d + (1−b)² p_u^d [ (1−b) / (1−(1−p_u^{d⋆}) b) ] + O(√(log n / n)) For p_u^{d⋆} small: ≈ b + (1−b) p_u^d Recall the lower bound: E[Y | X_D(u)=d] ≥ b² + (1−b²) p_u^d Increased chance of total compromise from b² to b.
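The punchline of slide 82 is the gap between the two regimes: in the best case u's linkability is about b² + (1−b²)p_u^d, but when the other users provide no cover it rises to about b + (1−b)p_u^d, i.e. the chance of effective total compromise grows from b² to b. A small comparison (the values of b and p_u^d are arbitrary examples):

```python
def best_case(b, p):
    """Lower bound: b^2 + (1 - b^2) * p_u^d."""
    return b * b + (1 - b * b) * p

def worst_case(b, p):
    """Upper-bound estimate for p_u^{d*} small: b + (1 - b) * p_u^d."""
    return b + (1 - b) * p

p = 0.01                    # u visits d rarely, so the additive term dominates
for b in (0.05, 0.1, 0.2):  # fraction of compromised routers
    print(f"b={b}: best-case={best_case(b, p):.4f}  worst-case={worst_case(b, p):.4f}")
```

For small b the ratio between the two is roughly 1/b, which is why even a modestly resourced adversary gains so much when user behavior is predictable.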

83 Conclusions Many challenges remain in the design, implementation, and analysis of anonymous-communication systems. It is hard to prove theorems about real systems – or even to figure out what to prove. “Nothing is more practical than a good theory!” (Tanya Berger-Wolfe, UIC)