Towards a Theory of Onion Routing Aaron Johnson Yale University 5/27/2008.


Overview
1. Anonymous communication and onion routing
2. A formal model and analysis of onion routing (Financial Cryptography 2007)
3. A probabilistic analysis of onion routing (Workshop on Privacy in the Electronic Society 2007)

Anonymous Communication: What?
Setting
– Communication network
– Adversary
Anonymity
– Sender anonymity (w.r.t. a message)
– Receiver anonymity (w.r.t. a message)
– Unlinkability (w.r.t. all communication)

Anonymous Communication: Why?
Useful
– Individual privacy online
– Corporate privacy
– Government and foreign intelligence
– Whistleblowers
Interesting
– How to define it?
– Is it possible in communication networks?
– Cryptography from anonymity

Anonymous Communication Protocols
– Mix networks (1981)
– Dining cryptographers (1988)
– Crowds (1998)
– PipeNet (1998)
– Onion routing (1999)
– Xor-trees (2000)
– Anonymous buses (2002)
– Tarzan (2002)
– Hordes (2002)
– Salsa (2006)
– ISDN, pool, stop-and-go, timed, and cascade mixes, etc.

Deployed Anonymity Systems
– anon.penet.fi
– Freedom
– Mixminion
– Mixmaster
– Tor
– JAP
– FreeNet
– anonymizer.com and other single-hop proxies
– I2P
– MUTE
– Nodezilla
– etc.

Onion Routing
– Practical design with low latency and overhead
– Open-source implementation
– Over 1,000 volunteer routers
– Estimated 200,000 users
– Sophisticated design

Anonymous Communication
(Table comparing mix networks, dining cryptographers, onion routing, and anonymous buses on two axes: Deployed and Analyzed.)

A Model of Onion Routing with Provable Anonymity
Johnson, Feigenbaum, and Syverson
Financial Cryptography 2007
– Formally model onion routing using input/output automata
– Characterize the situations that provide possibilistic anonymity

How Onion Routing Works
User u runs the client; d is an Internet destination; the routers run servers.
1. u creates an l-hop circuit through the routers.
2. u opens a stream in the circuit to d.
3. Data are exchanged, with one layer of encryption added or removed per hop, e.g. {{{m}_3}_4} → {{m}_3} → {m} → m on the forward path and back again on the return path.
4. The stream is closed.
5. The circuit is changed every few minutes.
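The per-hop layering in step 3 can be sketched as follows. This is a toy illustration only: `enc`/`dec` are symbolic stand-ins for symmetric encryption, and the three hop keys are assumed for the example, not taken from the talk.

```python
# Toy sketch of onion layering (NOT real cryptography): each router shares a
# symmetric key with the user; {m}_k is modeled as a tagged wrapper.

def enc(key, msg):
    """Stand-in for symmetric encryption {msg}_key."""
    return ("enc", key, msg)

def dec(key, ct):
    tag, k, msg = ct
    assert tag == "enc" and k == key, "wrong key"
    return msg

keys = {1: "k1", 2: "k2", 3: "k3"}   # hypothetical per-hop session keys
m = "hello"

# u wraps m once per hop: innermost layer for the last router, outermost first.
onion = m
for hop in (3, 2, 1):
    onion = enc(keys[hop], onion)

# Each router peels one layer in order 1, 2, 3; the last recovers m for d.
for hop in (1, 2, 3):
    onion = dec(keys[hop], onion)
print(onion)  # -> hello
```

On the return path the same keys are applied in the opposite direction: each router adds a layer, and the user, who knows all the keys, removes them all.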

How Onion Routing Works
Theorem 1: The adversary can only determine the parts of a circuit it controls or is adjacent to.

Model
Constructed with I/O automata (Lynch & Tuttle, 1989)
– Models asynchrony
– Relies on abstract properties of the cryptosystem
Simplified onion-routing protocol
– Each user constructs a circuit to one destination
– No separate destinations
– No circuit teardowns
– Circuit identifiers

Automata Protocol
(Animation: a circuit is built and used among automata u, v, and w.)

Creating a Circuit
1. CREATE/CREATED:
   u → 1: [0, {CREATE}_1]
   1 → u: [0, CREATED]
2. EXTEND/EXTENDED:
   u → 1: [0, {[EXTEND, 2, {CREATE}_2]}_1]
   1 → 2: [l_1, {CREATE}_2]
   2 → 1: [l_1, CREATED]
   1 → u: [0, {EXTENDED}_1]
3. Repeat with an additional layer of encryption:
   u → 1: [0, {{[EXTEND, 3, {CREATE}_3]}_2}_1]
   1 → 2: [l_1, {[EXTEND, 3, {CREATE}_3]}_2]
   2 → 3: [l_2, {CREATE}_3]
   3 → 2: [l_2, CREATED]
   2 → 1: [l_1, {EXTENDED}_2]
   1 → u: [0, {{EXTENDED}_2}_1]
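The telescoping extension above can be traced symbolically. This is a sketch with stand-in encryption and hypothetical session keys, mirroring the slides' message format rather than any real wire protocol:

```python
# Symbolic sketch of telescoping circuit extension. Message format follows the
# slides: [link_id, {payload}_{k1}...]; keys k1..k3 are assumed for illustration.

def wrap(payload, keys):
    """Wrap payload in layers, innermost key last: {{...{p}_kn...}_k2}_k1."""
    for key in reversed(keys):
        payload = ("enc", key, payload)
    return payload

def peel(payload, key):
    tag, k, inner = payload
    assert tag == "enc" and k == key, "wrong key"
    return inner

k = {1: "k1", 2: "k2", 3: "k3"}      # hypothetical session keys with routers 1..3

# Step 3 of the slides: extend the circuit from router 2 to router 3.
# u sends on circuit id 0: [0, {{[EXTEND, 3, {CREATE}_k3]}_k2}_k1]
create3 = ("enc", k[3], "CREATE")
msg = (0, wrap(("EXTEND", 3, create3), [k[1], k[2]]))

# Router 1 peels one layer and forwards on link l1:
msg = ("l1", peel(msg[1], k[1]))
# Router 2 peels the next layer, sees the EXTEND, and sends CREATE to router 3:
cmd, nxt, payload = peel(msg[1], k[2])
assert (cmd, nxt) == ("EXTEND", 3)
msg = ("l2", payload)                 # [l2, {CREATE}_k3]
assert peel(msg[1], k[3]) == "CREATE"
```

Because each EXTEND is wrapped once per existing hop, router i learns only its predecessor and successor on the circuit, which is what the possibilistic-anonymity theorems later formalize.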

Input/Output Automata
– States
– Actions transition between states
– An alternating state/action sequence is an execution
– In fair executions, actions enabled infinitely often occur infinitely often
– In cryptographic executions, no encrypted protocol message is sent before it is received unless the sender possesses the key

I/O Automata Model
Automata
– User
– Server
– Complete network of FIFO channels
– Adversary replaces some servers with arbitrary automata
Notation
– U is the set of users
– R is the set of routers
– N = U ∪ R is the set of all agents
– A ⊆ N is the adversary
– K is the keyspace
– l is the (fixed) circuit length
– k(u,c,i) denotes the ith key used by user u on circuit c

User automaton

Server automaton

Anonymity

Definition (configuration): A configuration is a function U → R^l mapping each user to his circuit.

Definition (indistinguishable executions): Executions α and β are indistinguishable to adversary A when his actions in α are the same as in β after possibly applying the following:
– φ: a permutation on the keys not held by A.
– ψ: a permutation on the messages encrypted by a key not held by A.

Definition (indistinguishable configurations): Configurations C and D are indistinguishable to adversary A when, for every fair, cryptographic execution of C, there exists a fair, cryptographic execution of D that is indistinguishable to A.

Definition (unlinkability): User u is unlinkable to d in configuration C with respect to adversary A if there exists an indistinguishable configuration D in which u does not talk to d.

Main Theorems
(Diagrams: example configuration pairs C and D, with the circuits of users u and v exchanged or rerouted.)

Theorem 1: Let C and D be configurations for which there exists a permutation π: U → U such that C_i(u) = D_i(π(u)) whenever C_i(u) or D_i(π(u)) is compromised or is adjacent to a compromised router. Then C and D are indistinguishable.

Theorem 2: Given configuration C, let (r_{i-1}, r_i, r_{i+1}) be three consecutive routers in a circuit such that {r_{i-1}, r_i, r_{i+1}} ∩ A = ∅. Let D be identical to C except that r_i has been replaced with some r_i' ∉ A. Then C and D are indistinguishable.

Theorem 3: If configurations C and D are indistinguishable, then D can be reached from C by applying a sequence of transformations of the types described in Theorems 1 and 2.

Lemma: Let u, v be two distinct users such that neither they nor the first routers in their circuits are compromised in configuration C. Let D be identical to C except that the circuits of users u and v are switched. Then C and D are indistinguishable to A.

Proof sketch: Given an execution α of C, construct β:
1. Replace any message sent or received between u (v) and C_1(u) (C_1(v)) in α with a message sent or received between v (u) and C_1(u) (C_1(v)).
2. Let the permutation π send u to v, v to u, and every other user to itself. Apply π to the encryption keys.
Then show:
i. β is an execution of D.
ii. β is fair.
iii. β is cryptographic.
iv. β is indistinguishable from α to A.

Unlinkability
Corollary: A user is unlinkable to its destination when:
1. The last router is unknown, OR
2. The user is unknown and another unknown user has an unknown destination, OR
3. The user is unknown and another unknown user has a different destination.

Model Robustness
– Still works with only a single layer of encryption
– Can include data transfer
– Can allow users to create multiple circuits

A Probabilistic Analysis of Onion Routing in a Black-box Model
Johnson, Feigenbaum, and Syverson
Workshop on Privacy in the Electronic Society 2007
– Use a black-box abstraction to create a probabilistic model of onion routing
– Analyze unlinkability
– Provide upper and lower bounds on anonymity
– Examine a typical case

Anonymity
(Example: users u, v, w communicate with destinations d, e, f.)
Cases:
1. First router compromised
2. Last router compromised
3. First and last routers compromised
4. Neither first nor last router compromised

Black-box Abstraction
1. Users choose a destination.
2. Some inputs are observed.
3. Some outputs are observed.

Black-box Anonymity
– The adversary can link observed inputs and outputs of the same user.
– Any configuration consistent with these observations is indistinguishable to the adversary.

Probabilistic Black-box
– Each user v selects a destination from distribution p_v.
– Inputs and outputs are observed independently with probability b.

Black-box Model
Let U be the set of users and Δ the set of destinations.
A configuration C consists of:
– User destinations C_D: U → Δ
– Observed inputs C_I: U → {0,1}
– Observed outputs C_O: U → {0,1}
Let X be a random configuration such that:
Pr[X=C] = Π_u [p_u^{C_D(u)}] [b^{C_I(u)} (1-b)^{1-C_I(u)}] [b^{C_O(u)} (1-b)^{1-C_O(u)}]
where p_u^d denotes the probability that user u chooses destination d.
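The product formula for Pr[X=C] can be evaluated directly. A minimal sketch with two users and two destinations; the distributions `p` and observation probability `b` are illustrative assumptions, not values from the talk:

```python
from fractions import Fraction

b = Fraction(1, 3)                    # observation probability (assumed)
p = {"u": {"d": Fraction(1, 2), "e": Fraction(1, 2)},   # assumed p_v
     "v": {"d": Fraction(1, 4), "e": Fraction(3, 4)}}

def pr_config(C_D, C_I, C_O):
    """Pr[X=C] = prod_v p_v^{C_D(v)} * b^{C_I(v)}(1-b)^{1-C_I(v)} * b^{C_O(v)}(1-b)^{1-C_O(v)}."""
    pr = Fraction(1)
    for v in C_D:
        pr *= p[v][C_D[v]]                  # destination choice
        pr *= b if C_I[v] else 1 - b        # input observed?
        pr *= b if C_O[v] else 1 - b        # output observed?
    return pr

# Example: u -> d with only its input observed, v -> e fully unobserved.
print(pr_config({"u": "d", "v": "e"},
                {"u": 1, "v": 0},
                {"u": 0, "v": 0}))   # -> 1/27
```

Because destination choices and the two observation events are independent per user, the probability factors user by user, which is exactly what the loop computes.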

Probabilistic Anonymity
(Diagram: four indistinguishable configurations over users u, v, w and destinations d, e, f.)
Conditional distribution: Pr[u → d] = 1.

Probabilistic Anonymity
The metric Y for the unlinkability of u and d in C is:
Y(C) = Pr[X_D(u)=d | X ≈ C]
– Exact Bayesian inference
– Adversary after a long-term intersection attack
– Worst-case adversary
Unlinkability given that u visits d: E[Y | X_D(u)=d]

Anonymity Bounds
1. Lower bound:
E[Y | X_D(u)=d] ≥ b² + (1-b²) p_u^d
2. Upper bounds:
a. When p_v^e = 1 for all v ≠ u (for some e ≠ d):
E[Y | X_D(u)=d] ≤ b + (1-b) p_u^d + O(√(log n / n))
b. When p_v^d = 1 for all v ≠ u:
E[Y | X_D(u)=d] ≤ b² + (1-b²) p_u^d + O(√(log n / n))

Lower Bound
Theorem 2: E[Y | X_D(u)=d] ≥ b² + (1-b²) p_u^d

Proof: Conditioning on which of u's input and output are observed,
E[Y | X_D(u)=d] = b² + b(1-b) p_u^d + (1-b) E[Y | X_D(u)=d ∧ X_I(u)=0]

Let C_i be the configuration equivalence classes, and let D_i be the event C_i ∧ X_D(u)=d. Then:
E[Y | X_D(u)=d ∧ X_I(u)=0]
 = Σ_i (Pr[D_i])² / (Pr[C_i] Pr[X_D(u)=d])
 ≥ (Σ_i Pr[D_i])² / (Pr[X_D(u)=d] Σ_i Pr[C_i])   (by Cauchy-Schwarz)
 = p_u^d   (since Σ_i Pr[C_i] = 1 and Σ_i Pr[D_i] = Pr[X_D(u)=d])

Therefore:
E[Y | X_D(u)=d] ≥ b² + b(1-b) p_u^d + (1-b) p_u^d = b² + (1-b²) p_u^d. ∎
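The lower bound can be checked exhaustively on a tiny instance of the black-box model. This sketch enumerates all configurations for three users and two destinations, computes Y by exact Bayesian inference over observation-equivalence classes, and verifies E[Y | X_D(u)=d] ≥ b² + (1-b²)p_u^d. The distributions `p`, observation probability `b`, and the rendering of the adversary's view in `observation` are illustrative assumptions:

```python
from itertools import product
from collections import Counter
from fractions import Fraction

users = ["u", "v", "w"]
dests = ["d", "e"]
b = Fraction(1, 2)                        # observation probability (assumed)
p = {"u": {"d": Fraction(1, 4), "e": Fraction(3, 4)},   # assumed p_v
     "v": {"d": Fraction(1, 2), "e": Fraction(1, 2)},
     "w": {"d": Fraction(2, 3), "e": Fraction(1, 3)}}

def prob(c):
    """Pr[X=C] for c = (C_D, C_I, C_O) as tuples indexed like `users`."""
    cd, ci, co = c
    pr = Fraction(1)
    for i, v in enumerate(users):
        pr *= p[v][cd[i]]
        pr *= b if ci[i] else 1 - b
        pr *= b if co[i] else 1 - b
    return pr

def observation(c):
    """Adversary's view: linked (user, dest) pairs, unlinked inputs, unlinked outputs."""
    cd, ci, co = c
    linked = frozenset((users[i], cd[i]) for i in range(3) if ci[i] and co[i])
    ins = frozenset(users[i] for i in range(3) if ci[i] and not co[i])
    outs = frozenset(Counter(cd[i] for i in range(3) if co[i] and not ci[i]).items())
    return (linked, ins, outs)

configs = list(product(product(dests, repeat=3),
                       product([0, 1], repeat=3),
                       product([0, 1], repeat=3)))

classes = {}                              # observation-equivalence classes
for c in configs:
    classes.setdefault(observation(c), []).append(c)

def y(c):
    """Y(C) = Pr[X_D(u)=d | X in C's equivalence class]."""
    cls = classes[observation(c)]
    return (sum(prob(x) for x in cls if x[0][0] == "d")
            / sum(prob(x) for x in cls))

pud = p["u"]["d"]
ey = sum(prob(c) * y(c) for c in configs if c[0][0] == "d") / pud
lower = b**2 + (1 - b**2) * pud
assert ey >= lower                        # Theorem 2 holds on this instance
```

Exact rational arithmetic (`Fraction`) avoids any floating-point slack in the comparison, so the final assertion is a genuine check of the inequality on this instance.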

Upper Bound
Theorem 3: The maximum of E[Y | X_D(u)=d] over (p_v)_{v≠u} occurs when
1. p_v^e = 1 for all v ≠ u (for some e ≠ d), OR
2. p_v^d = 1 for all v ≠ u.

Proof outline (order the other destinations so that p_u^1 ≥ p_u^2 ≥ … ≥ p_u^{d-1} ≥ p_u^{d+1} ≥ …):
– Show the maximum occurs when, for all v ≠ u, p_v^{e_v} = 1 for some destination e_v.
– Show the maximum occurs when, for all v ≠ u, either e_v = d or e_v = e for a fixed e ≠ d.
– Show the maximum occurs when e_v = d for all v ≠ u, or when e_v = e for all v ≠ u.

Upper-bound Estimates
Let n be the number of users.

Theorem 4: When p_v^e = 1 for all v ≠ u:
E[Y | X_D(u)=d] = b + b(1-b) p_u^d + (1-b)² p_u^d [(1-b) / (1-(1-p_u)b)] + O(√(log n / n))
 ≈ b + (1-b) p_u^d   for p_u small

Theorem 5: When p_v^d = 1 for all v ≠ u:
E[Y | X_D(u)=d] = b² + b(1-b) p_u^d + (1-b) p_u^d / (1-(1-p_u^d)b) + O(√(log n / n))

Compared with the lower bound E[Y | X_D(u)=d] ≥ b² + (1-b²) p_u^d, case (a) shows an increased chance of total compromise: from b² to b.

Typical Case
Let each user select destinations from a Zipfian distribution: p^{d_i} ∝ 1/i^s.
Theorem 6: E[Y | X_D(u)=d] = b² + (1-b²) p_u^d + O(1/n)
 ≈ b² + (1-b²) p_u^d

Future Work
– Investigate improved protocols to defeat timing attacks
– Examine how quickly users' distributions are learned
– Formally analyze scalable, P2P designs

Related Work
– A Formal Treatment of Onion Routing. Jan Camenisch and Anna Lysyanskaya. CRYPTO 2005.
– A formalization of anonymity and onion routing. S. Mauw, J. Verschuren, and E.P. de Vink. ESORICS 2004.
– I/O Automaton Models and Proofs for Shared-Key Communication Systems. Nancy Lynch. CSFW.

Overview
Formally model onion routing using input/output automata
– Simplified onion-routing protocol
– Non-cryptographic analysis
Characterize the situations that provide anonymity
– Send a message, receive a message, communicate with a destination
– Possibilistic anonymity

Future Work
– Construct better models of time
– Exhibit a cryptosystem with the desired properties
– Incorporate probabilistic behavior by users

Related Work
– A Model of Onion Routing with Provable Anonymity. J. Feigenbaum, A. Johnson, and P. Syverson. FC 2007.
– Towards an Analysis of Onion Routing Security. P. Syverson, G. Tsudik, M. Reed, and C. Landwehr. PET 2000.
– An Analysis of the Degradation of Anonymous Protocols. M. Wright, M. Adler, B. Levine, and C. Shields. NDSS 2002.