Published by Katrine Sivertsen; modified over 5 years ago
1
Anonymity Modified from Christo Wilson, Levente Buttyan, Michael K. Reiter and Aviel D. Rubin
2
What do we mean by privacy?
Louis Brandeis (1890): the "right to be let alone"; protection from institutional threats (government, the press). Alan Westin (1967): the right "to control, edit, manage, and delete information about themselves and decide when, how, and to what extent information is communicated to others".
3
Privacy vs Security
Privacy: what information goes where?
Security: protection against unauthorized access.
Security helps enforce privacy policies, but the two can be at odds with each other, e.g., screening to make us more "secure".
4
Anonymity vs Security
Anonymity: being unidentifiable in interactions.
Security: confidentiality of communication.
You may need one or both:
- The Pope-mobile is completely protected, but everyone knows who is inside and can see him (security, no anonymity).
- Riding the metro during rush hour provides little security or privacy, but some anonymity.
- Riding an Uber provides privacy and some security, but not anonymity.
5
Trust vs Security
Trust: assurance that the promised security services are properly implemented.
Security: the use of those security services.
6
Privacy-sensitive information
- Personally Identifiable Information (PII): name, address, phone number, etc.
- Location
- Activity: web history, contact history, online purchases
- Health records, health-related searches
- ... and more
7
Tracking on the web
- IP address: a number identifying your device on the Internet. Visible to the site you are visiting; not always permanent.
- Cookies: text stored on your computer by a site and sent back to that site by your browser. Used to save preferences, shopping carts, etc. Can track you even if your IP changes.
8
User privacy – approaches
- Data avoidance: "I don't tell you, so you can't abuse it." Effective but not always applicable; often requires anonymity. Examples: cash transactions, public phones.
- Data protection: "If you ever abuse it, you will be punished." A well-established approach, but difficult to define, enforce, and control; requires legislation or voluntary restrictions.
- Multilateral security: cooperation of more than two parties, with shared responsibilities and partial knowledge.
- Combinations of the above.
9
Anonymous Communication
Hiding the identities of the parties involved in digital communications from each other, or from third parties:
- "Who you are" from the communicating party.
- "Who you are talking to" from everyone else.
10
But, It’s Hard to be Anonymous!
- Your network location (IP address) can be linked directly to you: ISPs store communications records, usually for several years (data retention), and law enforcement can subpoena them.
- Your applications are being tracked: cookies, Flash, E-Tags, HTML5 storage; centralized services like Skype and Google Voice; browser fingerprinting; ad networks.
- Your activities can be used to identify you: the unique websites and apps that you use, the types of links that you click.
11
Anonymous Communication
From whom do we want to hide this information?
- The communication partner (sender anonymity)
- External attackers: a local eavesdropper sniffing a particular link (e.g., a LAN), or a global eavesdropper observing traffic in the whole network
- Internal (possibly colluding) attackers: compromised system elements, e.g., routers
12
Who uses Anonymity Systems
"If you're not doing anything wrong, you shouldn't have anything to hide." This implies that anonymous communication is only for criminals: DRM infringers, hackers, spammers, terrorists, etc. But anonymity systems are also used by journalists, whistleblowers, human rights activists, business executives, military and intelligence personnel, and abuse victims. Tor itself grew out of a Navy-sponsored project.
13
Who uses Anonymity Systems
How about normal people?
- Avoid tracking by advertising companies
- Protect sensitive personal information from businesses, like insurance companies and banks
- View sensitive content: information on medical conditions, advice on bankruptcy
- Express unpopular or controversial opinions (not every country guarantees free speech)
- Lead a dual life (a professor who is also a pro in World of Warcraft!)
- Try uncommon things, contact consulates, ...
- It simply feels good to have some privacy!
14
Types of Attackers
- Collaborating crowd members: crowd members that can pool their information and deviate from the protocol
- Local eavesdropper: can observe communication to and from the user's computer
- End server: the web server to which the transaction is directed
15
Definitions
Unlinkability: from the adversary's perspective, the inability to link two or more items of interest (packets, events, people, actions, etc.). Three parts:
- Sender anonymity (who sent this?)
- Receiver anonymity (who is the destination?)
- Relationship anonymity (are sender A and receiver B linked?)
Unobservability: from the adversary's perspective, items of interest are indistinguishable from all other items.
16
Quantifying Anonymity
How can we calculate how anonymous we are? Using anonymity sets: the set of suspects who could have sent a given message. A larger anonymity set means stronger anonymity.
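Set size is the simplest metric. A common refinement (not from the slides; the function name is hypothetical) also accounts for how evenly suspicion is spread over the set, using Shannon entropy:

```python
import math

def anonymity_bits(probabilities):
    """Shannon entropy of the suspect distribution, in bits.

    A uniform set of N equally likely suspects gives log2(N) bits;
    a skewed distribution gives less, reflecting weaker anonymity.
    """
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

# Uniform anonymity set of 8 suspects: 3 bits of anonymity
print(anonymity_bits([1/8] * 8))
# Skewed set: one suspect is far more likely, so anonymity is weaker
print(anonymity_bits([0.9] + [0.1/7] * 7))
```

With 8 equally likely suspects this yields 3.0 bits; the skewed distribution yields well under 1 bit even though the set still has 8 members.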
17
Crypto (SSL)
Content is unobservable, due to encryption, but the source and destination of each flow are trivially linkable: no anonymity!
18
Anonymizing Proxies (e.g., an HTTPS proxy)
- On the client side, the destination is hidden (destination anonymity), but the source is known.
- On the server side, the source is hidden (source anonymity), but the destination is known.
- The proxy itself sees both ends: no anonymity against the proxy.
19
Anonymizing VPNs (VPN gateway)
- On the client side, the destination is hidden (destination anonymity), but the source is known.
- On the server side, the source is hidden (source anonymity), but the destination is known.
- The gateway itself sees both ends: no anonymity against the VPN provider.
20
Using Content to Deanonymize
Even through an HTTPS proxy, content betrays you: reading Gmail, looking up directions to home, updating your G+ profile, etc. leave no anonymity. Fact: the NSA leverages common cookies from ad networks, social networks, etc. to track users.
21
Dining Cryptographers
Cryptographers are having dinner. Either the NSA is paying, or one of them is paying but wishes to remain anonymous.
- Each diner flips a coin and shows it to his left neighbor, so every diner sees two coins: his own and his right neighbor's.
- Each diner announces whether the two coins are the same. If he is the payer, he lies (says the opposite).
- An odd number of "same" announcements means the NSA is paying (with three diners, an odd number of "same" arises only if everyone tells the truth); an even number means one of the diners is paying.
- But a non-payer cannot tell which of the other two is paying!
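The rounds above can be simulated directly. This sketch (names are illustrative) flips the coins, applies the payer's lie, and checks the parity of the announcements:

```python
import random

def dining_cryptographers(payer=None, n=3):
    """One round of the dining cryptographers protocol, n diners in a ring.

    payer: index of the paying diner, or None if the NSA pays.
    Returns True if the announcements reveal that a diner paid.
    """
    coins = [random.randint(0, 1) for _ in range(n)]
    announcements = []
    for i in range(n):
        same = coins[i] == coins[(i + 1) % n]  # own coin vs right neighbor's
        if i == payer:
            same = not same                    # the payer lies
        announcements.append(same)
    # Truth-tellers always produce an even number of "different"
    # announcements (each coin is compared twice around the ring);
    # the payer's single lie makes that count odd.
    return announcements.count(False) % 2 == 1

assert not dining_cryptographers(payer=None)  # NSA pays: even number of diffs
assert dining_cryptographers(payer=1)         # a diner pays: odd number of diffs
```

The result is independent of the actual coin values, which is exactly why the announcements reveal nothing about who paid.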
22
Dining Cryptographers
Act I: three of a kind. All coins land the same, so everyone truthfully announces "same". Result: 0 differences (even), so the NSA paid.
23
Dining Cryptographers
Act II: two of a kind. Two announcements are "different" and one is "same". Result: 2 differences (even), so the NSA paid.
24
Dining Cryptographers
Act III: two of a kind, plus an inversion. The payer inverts his announcement ("I'm paying, but no one knows it's me!"). Result: 3 differences (odd), so a cryptographer paid.
25
Non-Payer’s View: Same Coins
A non-payer who sees two identical coins knows from the even number of "same" that someone is paying, but cannot see the coin shared between the other two. If that hidden coin were heads, the diner on the left must be lying; if tails, the one on the right. Since heads and tails are equally likely, the non-payer cannot tell which of them is lying.
26
Non-Payer’s View: Different Coins
The same holds when the non-payer's own two coins differ: without knowing the coin toss between the other two, the non-payer cannot tell which of them is lying.
27
Proof Sketch (By Induction)
By induction on the number of tails:
- All heads or all tails: 0 diffs.
- One tail, rest heads: 2 diffs, one on each side of the tail.
- Two tails, rest heads, two cases: adjacent tails give 2 diffs; non-adjacent tails give 4 diffs.
- N+1 tails, rest heads, three cases: a new tail adjacent to one run of tails changes nothing; a new tail adjacent to no run adds two diffs; a new tail connecting two runs removes two diffs.
Result: if everyone tells the truth, there is always an even number of differences.
28
Superposed Sending
This idea generalizes to any group of size N. For each bit of the message:
- Every user generates one random bit and sends it to one neighbor, so every user knows two bits (his own and his neighbor's).
- Each user announces his own bit XOR his neighbor's bit; the sender additionally XORs in the message bit.
- The XOR of all announcements equals the message bit: every randomly generated bit occurs in the sum twice (and cancels under XOR), while the message bit occurs once.
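The bitwise scheme above can be sketched as follows (a minimal DC-net round; names and the ring topology are illustrative):

```python
import random

def superposed_round(message_bit, sender, n):
    """One bit of superposed sending among n users in a ring.

    shared[i] is the random bit shared by users i and (i+1) % n.
    Each user announces the XOR of the two shared bits it knows;
    the sender also XORs in the message bit. The XOR of all
    announcements is the message bit, yet no single announcement
    identifies the sender.
    """
    shared = [random.randint(0, 1) for _ in range(n)]
    announcements = []
    for i in range(n):
        a = shared[i] ^ shared[(i - 1) % n]  # the two bits user i knows
        if i == sender:
            a ^= message_bit                 # sender folds in the message
        announcements.append(a)
    result = 0
    for a in announcements:
        result ^= a                          # shared bits cancel pairwise
    return result

assert superposed_round(1, sender=2, n=5) == 1
assert superposed_round(0, sender=0, n=5) == 0
```

Every shared bit appears in exactly two announcements, so it cancels; this also makes the N-random-bits-per-message-bit cost mentioned later in the deck concrete.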
29
Graph Theory Interpretation
Persons are nodes; shared keys are edges. The anonymity set is the set of nodes whose transmissions are indistinguishable. Collusion means sharing keys to expose another person's transmissions; under partial collusion (not all keys shared), the graph splits into smaller anonymity sets.
30
Keys and Compromises
A "key" is really just a history of all the coins that will ever be flipped between two participants, i.e., a string of bits. Key compromise means that a third party also knows the result of each flip.
31
Dining Cryptographers
A clever idea for making a message public in a perfectly untraceable manner: "The dining cryptographers problem: unconditional sender and recipient untraceability," Journal of Cryptology, 1988. It guarantees information-theoretic anonymity for message senders, an unusually strong form of security that defeats an adversary with unlimited computational power. But it is impractical, requiring a huge amount of randomness: in a group of size N, N random bits are needed to send 1 bit.
32
Crypto Algorithms
- Symmetric (conventional) algorithms: encryption and decryption keys are the same. Examples: DES, 3DES, AES, Blowfish.
- Asymmetric algorithms, commonly known as public-key algorithms: encryption key and decryption key are different. Examples: Diffie-Hellman, RSA, elliptic curve.
33
Symmetric Key Crypto
Let's say we have a plaintext message M, a symmetric encryption algorithm E, and a key K: E(K, M) = C yields ciphertext C, and E(K, C) = M recovers the plaintext (symmetric encryption is reversible).
Advantages: fast and easy to use.
Disadvantage: how to securely communicate the key?
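The E(K, C) = M property can be demonstrated with a toy cipher, a sketch only: real systems use AES, while this stand-in XORs the data with a SHA-256-derived keystream (function name is hypothetical, and the construction is not secure for real use):

```python
import hashlib

def xor_stream(key: bytes, data: bytes) -> bytes:
    """Toy symmetric cipher: XOR the data with a SHA-256-based keystream.

    Because XOR is its own inverse, the exact same operation both
    encrypts and decrypts, mirroring E(K, M) = C and E(K, C) = M.
    """
    stream = b""
    counter = 0
    while len(stream) < len(data):  # expand the keystream as needed
        stream += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return bytes(d ^ k for d, k in zip(data, stream))

key = b"shared secret"
msg = b"attack at dawn"
ct = xor_stream(key, msg)          # encrypt
assert xor_stream(key, ct) == msg  # decrypt with the same key
```

The "disadvantage" bullet is visible here too: both parties must somehow already share `key`.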
34
Public Key Crypto
Let's say we have a plaintext message M, a public-key algorithm F, and two keys: KP (public) and KS (private).
Encrypt with the public key, decrypt with the private key: F(KP, M) = C and F(KS, C) = M.
Or the reverse: F(KS, M) = C and F(KP, C) = M.
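Both directions can be shown with textbook RSA on deliberately tiny, insecure parameters (the numbers are illustrative; real keys are thousands of bits):

```python
# Toy RSA: p = 61, q = 53, so n = 3233, phi = 3120,
# public exponent e = 17, private exponent d = 2753 (17 * 2753 % 3120 == 1).
n, e, d = 3233, 17, 2753

def rsa(exponent, m):
    """F(key, m): modular exponentiation works for either key."""
    return pow(m, exponent, n)

M = 65
C = rsa(e, M)           # encrypt with the public exponent
assert rsa(d, C) == M   # decrypt with the private exponent

S = rsa(d, M)           # "sign" with the private exponent
assert rsa(e, S) == M   # anyone can verify with the public exponent
```

The second pair is the basis of digital signatures: only the private-key holder can produce S, but anyone with KP can check it.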
35
Public Key Crypto in Action
It is safe to distribute the public key KP: ciphertexts can only be decrypted with the private key KS, and it is computationally infeasible to derive KS from KP.
36
Anonymity loves company
The sole mechanism of anonymity is blending and obfuscation:
- The Crowds approach: data may be in clear text; hide in a group and make everyone in the group equally responsible for an act.
- The Mix approach: obfuscate the data and blend it with cover traffic.
- The Onion Routing approach: obfuscate the data and use cell padding to make all data look similar.
38
Crowds
39
Crowds Example
Links between users are protected with public key crypto, and users may appear on the path multiple times before the request reaches its final destination.
40
Anonymity in Crowds: no absolute source anonymity. If the target receives m incoming messages (m may be 0), it sends m + 1 outgoing messages, so an observer can tell the target is sending something. Destination anonymity is maintained as long as the source isn't sending directly to the receiver.
41
Against outside observers, source and destination are anonymous: both are jondo proxies, and the destination is hidden by encryption.
42
From the destination's point of view, the destination is obviously known, but the source is anonymous: there are O(n) possible sources, where n is the number of jondos.
43
Against an evil jondo, which can decrypt the message and see the destination, the source is only somewhat anonymous. Suppose there are c evil jondos in the system: if pf > 0.5 and n > 3(c + 1), then the true source cannot be inferred with probability > 0.5 (probable innocence).
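The forwarding rule and the resulting "predecessor" uncertainty can be simulated; a sketch under simplified assumptions (uniform jondo choice, the initiator is jondo 0, function names are hypothetical):

```python
import random

def crowds_path(n, pf, initiator=0):
    """Return the sequence of jondos a Crowds request traverses.

    The initiator forwards to a random jondo; each jondo then
    forwards to another random jondo with probability pf, or
    delivers to the destination with probability 1 - pf.
    """
    path = [initiator]
    current = random.randrange(n)
    path.append(current)
    while random.random() < pf:
        current = random.randrange(n)
        path.append(current)
    return path

def predecessor_hits(n, c, pf, trials=20000):
    """Estimate how often the predecessor of the first evil jondo
    on the path is the true initiator (jondos n-c .. n-1 collude)."""
    hits = observed = 0
    corrupt = set(range(n - c, n))
    for _ in range(trials):
        path = crowds_path(n, pf)
        for i in range(1, len(path)):
            if path[i] in corrupt:
                observed += 1
                hits += path[i - 1] == 0
                break
    return hits / observed if observed else 0.0
```

With n comfortably above 3(c + 1) and pf = 0.75, the estimate stays below 0.5, matching the probable-innocence bound; shrinking n or growing c pushes it up.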
44
Other Implementation Details
Crowds requires a central server called a Blender, a bit like a BitTorrent tracker: it keeps track of who is running jondos, broadcasts new jondos to existing jondos, and facilitates exchanges of public keys.
45
Summary of Crowds
The good: excellent scalability (each user helps forward messages and handle load, so more users mean better anonymity for everyone) and strong source anonymity guarantees.
The bad: very weak destination anonymity (evil jondos can always see the destination) and weak unlinkability guarantees.
46
Mix Networks
A different approach to anonymity than Crowds, originally designed for anonymous email (David Chaum, 1981); the concept has since been generalized for TCP traffic. Hugely influential ideas: onion routing, traffic mixing, and dummy traffic (a.k.a. cover traffic).
47
MIX Network
Goals: sender anonymity (against the communication partner) and unlinkability (against a global eavesdropper).
Implementation: the sender submits { r, m }KMIX to the MIX, which outputs m, where m is the message and r is a random number. The MIX batches messages, discards repeats, changes their order, and changes their encoding.
48
Mix Proxies and Onion Routing
Mixes form a cascade of anonymous proxies connected by encrypted tunnels, and all traffic is protected with layers of encryption: the sender computes E(KP1, E(KP2, E(KP3, M))) = C using the public keys of the mixes on the path, and only the final hop to the destination carries non-encrypted data.
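The layered construction E(KP1, E(KP2, E(KP3, M))) can be sketched with a toy reversible cipher standing in for each mix's public-key encryption (key names are illustrative; this is not a real mix implementation):

```python
import hashlib

def E(key: bytes, data: bytes) -> bytes:
    """Toy reversible cipher (XOR keystream) used as a stand-in for
    per-mix encryption; applying it twice with the same key undoes it."""
    stream = b""
    counter = 0
    while len(stream) < len(data):
        stream += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return bytes(d ^ k for d, k in zip(data, stream))

keys = [b"mix1-key", b"mix2-key", b"mix3-key"]  # hypothetical mix keys
M = b"hello"

# The sender wraps the message innermost-first: C = E(K1, E(K2, E(K3, M)))
C = E(keys[0], E(keys[1], E(keys[2], M)))

# Each mix on the path strips exactly one layer, in order:
for k in keys:
    C = E(k, C)
assert C == M  # the last mix recovers the plaintext for delivery
```

No single mix sees both the sender's ciphertext and the final plaintext together with the full route, which is the point of the cascade.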
49
Another View of Encrypted Paths
50
MIX chaining: a defense against colluding compromised MIXes. If even a single MIX on the path behaves correctly, unlinkability is still achieved.
51
Return Traffic
In a mix network, how can the destination respond to the sender? During path establishment, the sender places keys at each mix along the path; the response data is then re-encrypted with those keys as it travels the reverse path.
52
Traffic Mixing
The mix collects messages for t seconds, then sends them out randomly shuffled, in a different order than they arrived. This hinders timing attacks: messages may be artificially delayed, and temporal correlation is warped.
Problems: it requires lots of traffic and adds latency to network flows.
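The collect-shuffle-flush behavior can be sketched as a threshold variant of the timed mix described above (flushing on a message count rather than a timer; class and method names are hypothetical):

```python
import random

class Mix:
    """Threshold mix: buffer messages, then flush a shuffled batch."""

    def __init__(self, threshold):
        self.threshold = threshold
        self.pool = []

    def submit(self, msg):
        """Add a message; return the shuffled batch when full, else None."""
        self.pool.append(msg)
        if len(self.pool) >= self.threshold:
            batch, self.pool = self.pool, []
            random.shuffle(batch)  # break the arrival/departure ordering
            return batch
        return None

mix = Mix(threshold=4)
out = None
for m in ["m1", "m2", "m3", "m4"]:
    out = mix.submit(m)
# Same messages leave, but in a shuffled order.
assert sorted(out) == ["m1", "m2", "m3", "m4"]
```

Both problems from the slide are visible here: the first three messages sit buffered (latency), and a batch only forms if enough traffic arrives.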
53
Dummy / Cover Traffic Simple idea:
Send useless traffic to help obfuscate real traffic
54
Anonymizer (www.anonymizer.com)
Special protection for HTTP traffic: it acts as a proxy for browser requests and rewrites links in web pages, adding a form where URLs can be entered for a quick jump.
Disadvantages: it must be trusted, and it is a single point of failure/attack.
55
Onion routing
Hides a user's communication through a set of relay nodes, even from the destination; messages are encrypted. Components: a relay directory server, the source, the destination, an entrance relay, and an exit relay.
56
Onion routing is a general-purpose infrastructure for anonymous communication:
- supports several types of applications through application-specific proxies
- operates over a (logical) network of onion routers
- onion routers are real-time Chaum MIXes: messages are passed on nearly in real time, which may limit mixing and weaken the protection!
- onion routers are under the control of different administrative domains, which makes collusion less probable
- anonymous connections through onion routers are built dynamically to carry application data
- distributed, fault tolerant, and secure
57
Overview of architecture
Components communicate over long-term socket connections:
- Application proxy (at the initiator): prepares the data stream for transfer, sanitizes application data, and processes status messages sent by the exit funnel.
- Onion proxy: opens the anonymous connection via the onion-router network and encrypts/decrypts data.
- Entry funnel: multiplexes connections from onion proxies.
- Exit funnel: demultiplexes connections from the onion-router network, opens the connection to the responder application, and reports a one-byte status message back to the application proxy.
58
Onions
An onion is a multi-layered data structure that encapsulates the route of the anonymous connection within the onion-router network. Each layer contains:
- a backward crypto function (DES-OFB, RC4)
- a forward crypto function (DES-OFB, RC4)
- the IP address and port number of the next onion router
- an expiration time
- key seed material used to generate the keys for the backward and forward crypto functions
Each layer is encrypted with the public key of the onion router for which the data in that layer is intended:
bwd fn | fwd fn | next = blue | keys
  bwd fn | fwd fn | next = green | keys
    bwd fn | fwd fn | next = 0 | keys
59
Anonymous connection setup
The onion proxy builds the anonymous connection hop by hop:
1. The onion proxy sends the onion to the first onion router, which peels its layer and stores: bwd: entry funnel, crypto fns and keys; fwd: blue, ACI = 12, crypto fns and keys. It forwards the remaining onion on ACI 12.
2. The second onion router peels the next layer and stores: bwd: magenta, ACI = 12, crypto fns and keys; fwd: green, ACI = 8, crypto fns and keys. It forwards the remaining onion on ACI 8.
3. The last onion router peels the final layer and stores: bwd: blue, ACI = 8, crypto fns and keys; fwd: exit funnel. It opens a socket to the responder application, and a status message travels back along the connection.
Each router thus holds only its standard per-hop structure: backward and forward neighbors, anonymous connection identifiers (ACIs), and crypto functions and keys.
66
Data movement
Forward direction: the onion proxy adds all layers of encryption as defined by the anonymous connection; each onion router on the route removes one layer; the responder application receives plaintext data.
Backward direction: the responder application sends plaintext data to the last onion router of the connection (due to sender anonymity, it doesn't even know who the real initiator application is); each onion router adds one layer of encryption; the onion proxy removes all layers.
67
Tor: The 2nd Generation Onion Router
Basic design: a mix network with improvements. Tor adds perfect forward secrecy, introduces guards to improve source anonymity, and takes bandwidth into account when selecting relays (mixes in Tor are called relays). It also introduces hidden services: servers that are only accessible via the Tor overlay.
68
Deployment and Statistics
Tor is the largest, most widely deployed anonymity-preserving service on the Internet. Publicly available since 2002, it continues to be developed and improved. Currently there are ~7.5K Tor relays and ~3K bridges, all run by volunteers (it is suspected that some are controlled by intelligence agencies), serving over 2M daily users, ~100 Gbit/s of consumed bandwidth, and ~0.8 Gbit/s of hidden-service traffic.
69
How Do You Use Tor?
Download, install, and execute the Tor client. The client acts as a SOCKS proxy and builds and maintains circuits of relays. Configure your browser to use the Tor client as a proxy; any app that supports SOCKS proxies will work with Tor. All traffic from the browser will then be routed through the Tor overlay.
70
Design
Tor is an overlay network: onion routers route traffic, while the onion proxy fetches directories and creates circuits on the network. It runs over TCP, and all data is sent in fixed-size cells: a control cell carries CircID | CMD | Data; a relay cell carries CircID | Relay CMD | StreamID | Digest | Len | Data.
71
Tor Components
- Entrance node: the first node in a circuit; knows the user.
- Exit node: the final node in the circuit; knows the destination and may see the actual message.
- Directory servers: keep the list of which onion routers are up, their locations, current keys, exit policies, etc., and control which nodes can join the network.
72
Selecting Relays
How do clients locate the Tor relays? Through the Tor consensus file, hosted by trusted directory servers, which lists all known relays (IP address, uptime, measured bandwidth, etc.).
Not all relays are created equal: entry/guard and exit relays are specially labelled. Why?
Tor does not select relays uniformly at random: the chance of selection is proportional to bandwidth. Why? Is this a good idea?
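Bandwidth-proportional selection can be sketched as weighted random choice (the relay names and bandwidth figures are made up; real Tor applies additional weights and flags from the consensus):

```python
import random

# Hypothetical consensus entries: (nickname, measured bandwidth in KB/s)
relays = [("relayA", 500), ("relayB", 9000), ("relayC", 1500)]

def pick_relay(relays):
    """Select a relay with probability proportional to its bandwidth."""
    names = [name for name, _ in relays]
    weights = [bw for _, bw in relays]
    return random.choices(names, weights=weights, k=1)[0]

counts = {name: 0 for name, _ in relays}
for _ in range(10000):
    counts[pick_relay(relays)] += 1
# relayB carries ~82% of the total bandwidth, so it dominates the picks.
assert max(counts, key=counts.get) == "relayB"
```

This illustrates the trade-off the slide asks about: weighting by bandwidth balances load, but an attacker who contributes a few very fast relays gets picked disproportionately often.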
73
How Does Tor Work?
A circuit is built incrementally, one hop at a time, with onion-like encryption:
- Alice negotiates an AES key with each router.
- Messages are divided into equal-sized cells.
- Each router knows only its predecessor and successor.
- Only the exit router (OR3) can see the message, but it does not know where the message is from.
74
Attacks Against Tor Circuits
Each relay position sees a different view: the entry/guard relay knows the source but not the destination; the middle relay knows neither; the exit relay knows the destination but not the source. Tor users can choose any number of relays; the default configuration is 3. Why would a higher or lower number be better or worse?
75
Flow Multiplication Attack
Assumptions: N total relays, M of which are controlled by an attacker whose goal is to control both the first and last relay of a circuit. There is an M/N chance for the first relay and an (M-1)/(N-1) chance for the last, so roughly an (M/N)^2 chance overall for a single circuit. However, the client periodically builds new circuits, so over time the attacker's chances of occupying the correct positions improve!
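The compounding effect of circuit rebuilds can be worked out directly; a sketch that ignores guard pinning and bandwidth weighting (function name is hypothetical):

```python
def compromise_probability(n, m, circuits):
    """Chance that an attacker controlling m of n relays owns both the
    first and last relay of at least one of `circuits` independently
    built circuits."""
    per_circuit = (m / n) * ((m - 1) / (n - 1))
    return 1 - (1 - per_circuit) ** circuits

# A single circuit is rarely compromised...
p1 = compromise_probability(n=1000, m=100, circuits=1)
# ...but rebuilding circuits over time erodes the protection.
p100 = compromise_probability(n=1000, m=100, circuits=100)
assert p1 < 0.01 and p100 > 0.5
```

With 10% of relays malicious, one circuit is compromised with probability just under 1%, but after 100 fresh circuits the odds exceed 60%, which is exactly the motivation for the guard relays introduced later in the deck.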
76
Timing Attack
77
Intersection Attack
It can take many observation periods before usable results appear: Tor is a large network, so the suspect set is very large.
78
Fingerprinting Attack
Dynamic websites are difficult to fingerprint
79
Congestion Attacks
80
Application Level Attacks
Plugins (Flash, Java, QuickTime); DNS leaks; BitTorrent; URI handlers; iTunes; code injection; active documents (PDF, Word, Excel); clickjacking; software vulnerabilities.
81
Guard Relays
Guard relays help prevent attackers from becoming the first relay: Tor selects 3 guard relays and uses them for 3 months, after which 3 new guards are selected. Only relays that have long and consistent uptimes, have high bandwidth, and are manually vetted may become guards. Problem: what happens if you choose an evil guard? There is an M/N chance of full compromise.
82
Hidden Services
Tor is very good at hiding the source of traffic, but the destination is often an exposed website. What if we want to run an anonymous service, i.e., a website where nobody knows the IP address? Tor supports hidden services: you can run a server and have people connect without disclosing its IP or DNS name.
83
Hidden Service Example
The hidden service publishes introduction points, and the client and service meet at a rendezvous point; the onion URL is a hash that allows any Tor user to find the introduction points.
84
Perfect Forward Secrecy
In traditional mix networks, all traffic is encrypted using public/private keypairs. Problem: what happens if a private key is stolen? All future traffic can be observed and decrypted, and if past traffic has been logged, it can also be decrypted.
Tor implements perfect forward secrecy (PFS): the client negotiates a new, ephemeral key pair with each relay, and the original keypairs are only used for signatures, i.e., to verify the authenticity of messages. An attacker who compromises a private key can still eavesdrop on future traffic, but past traffic is encrypted with ephemeral keypairs that are not stored.
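The ephemeral-key idea can be sketched with toy finite-field Diffie-Hellman (the parameters are tiny and insecure, chosen only for illustration; Tor's real ntor handshake uses curve25519):

```python
import random

# Toy DH parameters: a small prime modulus and a generator (illustrative).
p = 4294967291
g = 5

def ephemeral_keypair():
    """Generate a throwaway (private, public) pair for one circuit."""
    priv = random.randrange(2, p - 1)
    return priv, pow(g, priv, p)

# Client and relay each generate fresh keys for this circuit only:
a, A = ephemeral_keypair()   # client: private a, public A
b, B = ephemeral_keypair()   # relay:  private b, public B

# Both sides derive the same session secret; it never crosses the wire.
assert pow(B, a, p) == pow(A, b, p)

# The long-term identity keys only *sign* A and B. Once a and b are
# discarded, the session secret is unrecoverable, even to an attacker
# who later steals the long-term private keys.
```

This is what makes logged ciphertext worthless after the session: there is no stored key that can ever reproduce the shared secret.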
85
Tor Bridges
Anyone can look up the IP addresses of Tor relays; they are public information in the consensus file. Many countries block traffic to these IPs, essentially a denial-of-service against Tor. Solution: Tor bridges, essentially Tor proxies that are not publicly known, used to connect clients in censored areas to the rest of the Tor network. Tor maintains bridges in many countries.
86
Obfuscating Tor Traffic
Bridges alone may be insufficient to get around all types of censorship: DPI can be used to locate and drop Tor frames, and Iran at one point blocked all encrypted packets. Tor therefore adopts a pluggable-transport design: Tor traffic is forwarded to an obfuscation program that transforms it to look like some other protocol (BitTorrent, HTTP, streaming audio, etc.), and a deobfuscator on the receiver side extracts the Tor data from the encoding.
87
Invisible Internet Project (I2P)
An anonymizing peer-to-peer network providing end-to-end protection. It uses a decentralized structure to protect the identity of both the sender and the receiver, and supports torrents, web browsing, IM, and more. It is UDP-based, unlike Tor's TCP streams.
88
I2P Terminology
- Router: the software which participates in the network.
- Tunnel: a unidirectional path through several routers. Every router has several incoming connections (inbound tunnels) and outgoing connections (outbound tunnels); tunnels use layered encryption.
- Gateway: the first router in a tunnel. For an inbound tunnel this is the first router of the tunnel; for an outbound tunnel it is the creator of the tunnel.
89
I2P Tunnels
90
I2P Encryption I2P works by routing traffic through other peers
All traffic is encrypted end-to-end
91
Joining the Network
92
Establishing a Tunnel
93
Establishing a Connection