Anonymity and Covert Channels in Simple, Timed Mix-firewalls


Anonymity and Covert Channels in Simple, Timed Mix-firewalls Richard E. Newman --- UF Vipan R. Nalla -- UF Ira S. Moskowitz --- NRL {nemo,vreddy}@cise.ufl.edu, moskowitz@itd.nrl.navy.mil http://chacs.nrl.navy.mil

Motivation
- Anonymity: linkages among sender/message/recipient -- an optional desire or a mandated necessity?
- Goal: hide who is sending what to whom.
- What: covered by crypto. Who/whom: covered by Mix networks.
- Even if one cannot associate a particular message with a sender, it is still possible to leak information from sender to observer: a covert channel.

Mixes
- A Mix is a device intended to hide source/message/destination associations.
- A Mix can use crypto, delay, shuffling, padding, etc. to accomplish this.
- Others have studied ways to "beat the Mix":
  - active attacks to flush the Mix
  - passive attacks that study probabilities

Prior measures of anonymity
- AT&T Crowds: degree of anonymity; p_f = probability of forwarding a message. Not Mix-based.
- Dresden: anonymity set (set of senders) of size N, measured as log(N). Does not include observations by Eve.
- Cambridge: effective size; assign probabilities to senders; measure lies between 0 and log(N).
- We show (later): maximal entropy (most noise) does not assure anonymity.
- K.U. Leuven: normalize the above.
We want something that measures before & after observation: that is Shannon's information theory.

Aim of this Work
- We wish to provide another tool to better understand and measure anonymity
- Limits of anonymity
- Application of classical techniques
- Follows WPES, CNIS work

Covert Channels
- A communication channel that exists, contrary to system design, in a computer system or network
- Typically in the realm of MLS systems: non-interference
- Classically, the threat is measured by capacity

Quasi-Anonymous Channels
- Less than perfect anonymity = quasi-anonymity
- Quasi-anonymity allows a covert channel = a quasi-anonymous channel
- A quasi-anonymous channel is:
  - an illegal communication channel in its own right
  - a way of measuring anonymity

NRL Covert Channel Analysis Lab (John McDermott & Bruce Montrose)
- Actual network set up to exploit these quasi-anonymous channels
- First attempt: detect gross changes in traffic volume
- Future work may be a more fine-tuned detection of the mathematical channels discussed here

Our Earlier Scenario (WPES 2003)
Mix-firewalls separating 2 enclaves; Alice and the Clueless_i share the anonymous overt channel between them, and Alice signals Eve over a covert channel.
- Timed Mix, total flush per tick
- Eve counts # messages per tick; perfect sync; knows # of Clueless_i
- Clueless_i are IID; p = probability that a Clueless_i does not send a message
- Alice is clueless w.r.t. the Clueless_i

This System Model
- Alice (malicious insider) and N other senders (Clueless_i, i = 1,…,N)
- M observable destinations (R_j, j = 1,…,M)
- "Nobody" destination R_0
- Each tick, each sender can send a message (to a destination R_j) or not ("send" to R_0)
- Clueless_i are i.i.d.
- Eve sees the message counts to the R_j's each tick

Multiple Receiver Model
[Diagram: Clueless_1, …, Clueless_N and Alice send through the Mix-firewall to receivers R_1, …, R_M, with R_0 = Nobody; Eve observes the Mix-firewall's output.]

Toy Scenario: N=1, M=1
- Alice can: not send a message (0), or send one (1)
- Only two input symbols to the (covert) channel
- What does Eve see? 0, 1, or 2 messages
[Channel diagram: if Alice sends 0, Eve sees 0 w.p. p and 1 w.p. q; if Alice sends 1, Eve sees 1 w.p. p and 2 w.p. q, where q = 1 - p.]

Discrete Memoryless Channel
- X is the random variable representing Alice, the transmitter to the covert channel
- X has a probability distribution: P(X=0) = x, P(X=1) = 1 - x
- Y represents Eve; its distribution over {0, 1, 2} is derived from X and the channel matrix
[Diagram: X enters the anonymizing network; Y comes out to Eve.]

Channel Capacity
- In general P(X = x_i) = p(x_i), and similarly p(y_k)
- Entropy of X: H(X) = -Σ_i p(x_i) log p(x_i)
- Conditional entropy: H(X|Y) = -Σ_k p(y_k) Σ_i p(x_i|y_k) log p(x_i|y_k)
- Mutual information: I(X;Y) = H(X) - H(X|Y) = H(Y) - H(Y|X)
- Capacity is the maximum of I(X;Y) over distributions on X
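
These quantities are straightforward to compute numerically. A minimal sketch (the function name and array layout are mine, not from the paper): the channel matrix W has W[i, k] = P(Y = y_k | X = x_i), and I(X;Y) is computed as H(Y) - H(Y|X).

```python
import numpy as np

def mutual_information(x, W):
    """I(X;Y) in bits, for input distribution x and channel matrix W,
    where W[i, k] = P(Y = y_k | X = x_i)."""
    x = np.asarray(x, dtype=float)
    W = np.asarray(W, dtype=float)

    def H(dist):
        dist = dist[dist > 0]          # 0 log 0 is taken as 0
        return -np.sum(dist * np.log2(dist))

    y = x @ W                          # output distribution p(y_k)
    # I(X;Y) = H(Y) - H(Y|X), with H(Y|X) = sum_i x_i * H(row_i)
    return H(y) - np.sum(x * np.array([H(row) for row in W]))

# Sanity check: a noiseless binary channel carries H(X) bits.
I = mutual_information([0.5, 0.5], [[1, 0], [0, 1]])  # → 1.0
```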

Capacity for Toy Scenario
C = max_x { -( px log(px) + [qx + p(1-x)] log[qx + p(1-x)] + q(1-x) log[q(1-x)] ) - h(p) }
where h(p) = -[ p log p + (1-p) log(1-p) ]
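
Since the channel rows each have noise entropy h(p), the maximization reduces to maximizing H(Y) over Alice's bias x. A sketch via grid search (my own code, not from the paper; `toy_capacity` is a name I chose):

```python
import numpy as np

def toy_capacity(p, grid=10001):
    """Capacity (bits) of the toy channel (N=1, M=1): Alice silent ->
    Eve sees 0 w.p. p, 1 w.p. q; Alice sends -> 1 w.p. p, 2 w.p. q."""
    q = 1.0 - p

    def H(dist):
        dist = dist[dist > 0]
        return -np.sum(dist * np.log2(dist))

    hp = H(np.array([p, q]))                 # h(p), per-row noise entropy
    best = 0.0
    for x in np.linspace(0.0, 1.0, grid):    # x = P(Alice silent)
        y = np.array([p * x, q * x + p * (1 - x), q * (1 - x)])
        best = max(best, H(y) - hp)
    return best
```

At p = 1 (clueless sender always silent) the channel is noiseless and the capacity is exactly 1 bit; for intermediate p it drops below 1, consistent with the "Capacity and optimal x vs. p" plot.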

Capacity and optimal x vs. p

Earlier Scenario: 1 Receiver, N Clueless_i
Eve sees a message count in {0, …, N+1}. When Alice is silent the count is Binomial(N, q); when Alice sends, that count is shifted up by one.
[Channel matrix over outputs 0 … N+1:
row "Alice silent": p^N, N p^(N-1) q, …, q^N, 0
row "Alice sends": 0, p^N, N p^(N-1) q, …, q^N ]
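
The binomial structure of this matrix is easy to build in code (a sketch under the slides' assumption that each Clueless_i is independently silent with probability p; the function name is mine):

```python
import numpy as np
from math import comb

def mix_channel_matrix(N, p):
    """Channel matrix for 1 receiver and N clueless senders.
    Row 0: Alice silent; row 1: Alice sends.
    Column k: Eve counts k messages this tick."""
    q = 1.0 - p
    # Binomial(N, q) distribution over how many clueless senders send
    binom = np.array([comb(N, k) * q**k * p**(N - k) for k in range(N + 1)])
    W = np.zeros((2, N + 2))
    W[0, :N + 1] = binom     # counts 0..N when Alice is silent
    W[1, 1:] = binom         # counts 1..N+1 when Alice sends
    return W
```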

Capacity vs. N (M=1)

Observations
- Highest capacity when clueless traffic is very low or very high
- Capacity, as a function of p, is bounded below by its value at p = 0.5 (with x = 0.5); thus even at maximal entropy, the system is not anonymous
- Capacity monotonically decreases to 0 as N grows
- C(p) is a continuous function of p
- Alice's optimal bias is a function of p, and is always near 0.5

Comments
- Lack of anonymity leads to a communication channel
- We use this quasi-anonymous channel to measure the anonymity
- Capacity is not always the correct measure: one might want just mutual information, or the number of bits passed

New Results
- Analysis for M > 1 receivers
- Numerical (but not theoretical) results show it is best for the Clueless to be uniform
- Numerical results for the Clueless uniform over actual receivers (not R_0)
- Numerical results for Alice uniform over actual receivers (not R_0)
- Best for Alice to be uniform

Earlier Scenario Revisited: 1 Receiver, N Clueless_i
Same channel as before, with outputs relabeled as pairs <# messages to R_1, # messages to R_0>, which sum to N+1.
[Row "Alice silent": <0,N+1> w.p. p^N, …, <N,1> w.p. q^N; row "Alice sends" shifts the R_1 count up by one.]

M=2 Receivers, N=1 Clueless
Outputs are triples <# to R_0, # to R_1, # to R_2>:
- Alice sends nothing (input 0): Eve sees <2,0,0> w.p. p, <1,1,0> w.p. q/2, <1,0,1> w.p. q/2
- Alice sends to R_1 (input 1): <1,1,0> w.p. p, <0,2,0> w.p. q/2, <0,1,1> w.p. q/2
- Alice sends to R_2 (input 2): <1,0,1> w.p. p, <0,1,1> w.p. q/2, <0,0,2> w.p. q/2

Channel Matrix for N=1, M=2
Outputs: <2,0,0> <1,1,0> <1,0,1> <0,2,0> <0,1,1> <0,0,2>

M_{1,2} = ( p  q/2  q/2   0    0    0  )
          ( 0   p    0   q/2  q/2   0  )
          ( 0   0    p    0   q/2  q/2 )

(Note: typo in pre-proceedings section 3.2: M_{1,2}[i,j] = Pr(e_j | A = i), not A = a_i.)

Capacity for N=1, M=2
C = max_A I(A;E) = max_{x0,x1,x2}
 - { p x0 log(p x0)
   + [q x0/2 + p x1] log[q x0/2 + p x1]
   + [q x0/2 + p x2] log[q x0/2 + p x2]
   + [q x1/2] log[q x1/2]
   + [q x1/2 + q x2/2] log[q x1/2 + q x2/2]
   + [q x2/2] log[q x2/2] }
 - h2(p)
where h2(p) = -(1-p) log[(1-p)/2] - p log p
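
Rather than maximizing this expression symbolically, the capacity can be recovered numerically with the standard Blahut-Arimoto algorithm applied to the channel matrix M_{1,2} above. A sketch (my own code, not from the paper):

```python
import numpy as np

def blahut_arimoto(W, iters=2000):
    """Blahut-Arimoto iteration for channel capacity (bits);
    W[i, k] = P(output k | input i). Returns (capacity, input dist)."""
    n = W.shape[0]
    x = np.full(n, 1.0 / n)

    def divergences(x):
        # D(W_i || y) for each input row i, skipping zero entries
        y = x @ W
        return np.array([np.sum(W[i][W[i] > 0] *
                                np.log2(W[i][W[i] > 0] / y[W[i] > 0]))
                         for i in range(n)])

    for _ in range(iters):
        x = x * np.exp2(divergences(x))   # multiplicative BA update
        x /= x.sum()
    return float(np.sum(x * divergences(x))), x

# Channel matrix for N=1, M=2, at the slides' worst-case p = 1/3
p = 1 / 3; q = 1 - p
M12 = np.array([[p, q/2, q/2, 0,   0,   0],
                [0, p,   0,   q/2, q/2, 0],
                [0, 0,   p,   0,   q/2, q/2]])
C, x_opt = blahut_arimoto(M12)
```

By the symmetry between R_1 and R_2, the optimizing distribution satisfies x1 = x2, so only Alice's bias x0 really matters, which is why the later slides plot mutual information against x0 alone.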

Capacity LB vs. p (N=1-4, M=2)
- p = 1 is like having no other senders: capacity = log(M+1)
- p = 0 still has noise if M > 1, since there are multiple receivers: capacity < log(M+1)
[Figure: Cap_1-4_2]

Mutual Info vs. x0, N=1, M=2
- Worst case is p = 0.33 (lowest max)
- Alice's best in that case is also x0 = 0.33, which is not bad for other p's either
[Figure: Cap_1_2_four-cases]

Mutual Info vs. p, N=2, M=2
- Highest minimum is at p = 0.33 (= 1/(M+1)), at x0 = 0.33 also
- Note that x0 = 0.33 is never far from best (of those shown)
[Figure: Cap_2_2_x0_cases]

Best x0 vs. p for M=3, N=1-4
- Best x0 is 0.25 = 1/(M+1), regardless of N, for p = 0.25
[Figure: Cap_3_x0_vs_p_combined]

Effect of Suboptimal x0 (M=3)
- If x0 = 0.25, then capacity is never far from best; how far depends on N
[Figure: Cap_3_normalizedMI_vs_p_combined]

Capacity LB vs. p (N=1, M=1-5)
- Note the minima are at p = 1/(M+1)
[Figure: Cap_1_1-5]

Capacity (N,M)
[Figure: Cap_0-9_1-10]

Equivalent Sender Group Size
[Figure: Cap_1-4_2_last]

Conclusions
- Highest capacity when clueless traffic is very low or very high
- Multiple receivers induce an asymmetry between a clueless sender sending and not sending
- Capacity monotonically decreases to 0 as N grows
- Capacity monotonically increases with M, bounded by log(M+1)
- Alice's optimal bias is a function of p, and is always near 1/(M+1)

Future Work
- Relax the IID assumption on the Clueless_i
- More realistic distributions for the Clueless_i
- If Alice has knowledge of Clueless_i behavior…
- More general timed Mixes: threshold Mixes, pool Mixes, Mix networks
- Effective sender set size
- Relationship of covert channel capacity to anonymity