Multiple Access Covert Channels


Multiple Access Covert Channels
Ira Moskowitz, Naval Research Lab, moskowitz@nrl.itd.navy.mil
Richard Newman, Univ. of Florida, nemo@cise.ufl.edu

Focus
- Review covert channels from high-assurance computing and anonymity
- Define the quasi-anonymous channel
- Review the analysis of the single-sender DMC
- Analyze the 2-sender DMC

Covert Channels
- CC = communication contrary to design
- Two kinds: storage channels and timing channels
- Storage channel capacity is given by mutual information, in bits per symbol
- Timing channel capacity analysis requires optimizing the ratio of mutual information to expected time cost (see the sketch after this list)
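To make the storage/timing distinction concrete, here is a minimal Python sketch (an illustration, not from the slides): for a noiseless binary timing channel, capacity is the maximum over input distributions of mutual information per expected transmission time, and when both symbols cost the same time it collapses to the familiar 1 bit/symbol storage figure. The symbol costs t0 and t1 are illustrative values.

```python
import math

def binary_entropy(q: float) -> float:
    """H(q) in bits for a Bernoulli(q) input distribution."""
    if q in (0.0, 1.0):
        return 0.0
    return -q * math.log2(q) - (1 - q) * math.log2(1 - q)

def timing_capacity(t0: float, t1: float, steps: int = 10_000) -> float:
    """Grid-search max over q of H(q) / (q*t1 + (1-q)*t0), bits per time unit."""
    best = 0.0
    for i in range(1, steps):
        q = i / steps
        best = max(best, binary_entropy(q) / (q * t1 + (1 - q) * t0))
    return best

# Equal symbol costs reduce to the storage-channel view: 1 bit per symbol.
print(timing_capacity(1.0, 1.0))  # ~1.0 bit per time unit
# Unequal costs shift the optimal input distribution toward the cheaper symbol.
print(timing_capacity(1.0, 2.0))  # strictly less than 1.0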

Storage Channel Example
- File system full/not full
- High fills/leaves space in the file system to signal 1 or 0
- Low tries to obtain space and fails or succeeds to "read" 1 or 0
- Low returns the system to its previous state

Timing Channel Example
- High uses its full time quantum on a time-sharing host to send 1, gives up the CPU early to send 0
- Low measures the time gaps between its own accesses to "read" 1 or 0

Anonymity Systems
- Started with Chaum mixes
- A mix receives an encrypted, padded message
- Decrypts/re-encrypts the padded message
- Delays forwarding the message
- Scrambles the order of message forwarding

Mixes
- A mix may be timed (the observable is the number of messages forwarded each time it fires)
- A mix may fire when a threshold is reached (the observable is the time between firings)
- Mixes may be chained
- We previously studied timed Mix-firewalls and covert channels; now for threshold Mix-firewalls (a toy sketch of a threshold mix follows)
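As an illustration (a toy model, not the authors' code), a threshold mix can be sketched in a few lines of Python: it buffers already padded/re-encrypted messages and flushes a shuffled batch once the threshold is reached, so an observer sees batch boundaries and firing times rather than per-message linkage.

```python
import random

class ThresholdMix:
    def __init__(self, threshold: int):
        self.threshold = threshold
        self.pool = []

    def receive(self, msg):
        """Buffer an (already padded/re-encrypted) message; fire when full."""
        self.pool.append(msg)
        if len(self.pool) >= self.threshold:
            return self.fire()
        return None

    def fire(self):
        """Flush the pool in scrambled order, hiding arrival order."""
        random.shuffle(self.pool)
        batch, self.pool = self.pool, []
        return batch

mix = ThresholdMix(threshold=3)
for m in ["a", "b", "c", "d", "e", "f"]:
    out = mix.receive(m)
    if out is not None:
        print(out)  # e.g. ['c', 'a', 'b'] -- output order decoupled from arrival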

Mix-firewall CC Model
- Alice is behind the Mix-firewall (M-F)
- Eve listens to the output of the M-F
- Clueless senders are also behind the M-F
- Each sender (Alice or a Clueless) may either send or not send a message each tick
- Alice modulates her behavior to try to communicate with Eve
- We show a covert channel from Alice to Eve

Channel Model
- Discrete storage channel
- Each clueless sender sends 0 or 1 messages per tick
- The clueless are i.i.d. Bernoulli random variables
- Alice sends 0 or 1 messages per tick
- Eve counts messages per Mix firing
- The clueless act as noise; the rate decreases to zero as N increases (for fixed p), as the sketch below illustrates
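Under the model as stated on this slide, Eve's count is Alice's bit plus a Binomial(N, p) contribution from the clueless senders. Here is a Python sketch (a reconstruction under that assumption, not the paper's code) that builds the resulting transition matrix and computes capacity with the Blahut-Arimoto algorithm; the computed capacity falls toward zero as N grows.

```python
import math

def binom_pmf(k: int, n: int, p: float) -> float:
    return math.comb(n, k) * p**k * (1 - p) ** (n - k)

def transition_matrix(n_clueless: int, p: float):
    """W[x][y] = P(Eve counts y messages | Alice sent x), x in {0, 1}."""
    W = [[0.0] * (n_clueless + 2) for _ in range(2)]
    for x in (0, 1):
        for k in range(n_clueless + 1):
            W[x][x + k] = binom_pmf(k, n_clueless, p)
    return W

def blahut_arimoto(W, iters: int = 500) -> float:
    """Capacity (bits/tick) of a discrete memoryless channel."""
    nx, ny = len(W), len(W[0])
    q = [1.0 / nx] * nx  # input distribution, refined each iteration
    for _ in range(iters):
        r = [sum(q[x] * W[x][y] for x in range(nx)) for y in range(ny)]
        d = [sum(W[x][y] * math.log(W[x][y] / r[y])
                 for y in range(ny) if W[x][y] > 0) for x in range(nx)]
        z = [q[x] * math.exp(d[x]) for x in range(nx)]
        q = [v / sum(z) for v in z]
    # mutual information at the converged input distribution
    r = [sum(q[x] * W[x][y] for x in range(nx)) for y in range(ny)]
    return sum(q[x] * W[x][y] * math.log2(W[x][y] / r[y])
               for x in range(nx) for y in range(ny) if W[x][y] > 0)

# Capacity shrinks toward 0 as the number of clueless senders grows.
for n in (0, 1, 5, 20):
    print(n, round(blahut_arimoto(transition_matrix(n, 0.5)), 4))
```

With no clueless senders the channel is noiseless and the sketch reports 1 bit/tick; with one clueless sender at p = 0.5 it reports 0.5 bits/tick, matching the qualitative claim on the slide.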

Two Transmitter Model
- Now two Alices, Alice1 and Alice2
- Each Alice has a quasi-anonymous channel to Eve
- The Alices act as noise with respect to each other
- For θ = 1, C = 1; for θ = 2, C = 0.6942; for θ = 10, C = 0.26; for θ = 50, C = 0.0832

NRL Pump
- The NRL Network Pump considered multiple senders before
- Lows send to Highs, with the timing of ACKs forming a CC from Highs to Lows
- The Pump modulates ACK timing to reduce the CC rate (but not eliminate it)
- Highs interfere with each other's timing
- The Pump uses timing channels, so the storage-channel analysis here does not apply directly

Degree of Collusion
- If the Alices work perfectly together, they can achieve a data rate of C = log2 3 ≈ 1.58 bits/tick (assuming no clueless senders)
- "Existence assumption": assume the Alices know of each other (stationary) and pre-arrange coding, but do not collude once transmission begins

Shannon Channel
- Distributions X, Y
- Mutual information: I(X;Y) = I(Y;X)
- I(X;Y) = H(X) - H(X|Y)
- Entropy H(X); conditional entropy H(X|Y)
- Capacity: C = max over input distributions p(x) of I(X;Y)
- A numerical check of these identities follows
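A quick numerical check of the identities on this slide, using an arbitrary illustrative joint distribution (the numbers are mine, chosen only for the demonstration):

```python
import math

joint = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.1, (1, 1): 0.4}  # P(X=x, Y=y)

def H(dist):
    """Entropy in bits of a distribution given as {outcome: probability}."""
    return -sum(p * math.log2(p) for p in dist.values() if p > 0)

px = {x: sum(p for (a, _), p in joint.items() if a == x) for x in (0, 1)}
py = {y: sum(p for (_, b), p in joint.items() if b == y) for y in (0, 1)}

# Direct definition of mutual information from the joint distribution.
I_xy = sum(p * math.log2(p / (px[x] * py[y])) for (x, y), p in joint.items())
# H(X|Y) via the chain rule: H(X|Y) = H(X,Y) - H(Y).
H_x_given_y = H(joint) - H(py)

print(round(I_xy, 4), round(H(px) - H_x_given_y, 4))  # both ~0.278, as expected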

Multiple Access Channels
- Now have two inputs, X1 and X2
- Existence assumption, with a priori knowledge
- Achievable error-free rates are joint: rate pairs (R1, R2)
- Capacity was estimated (incorrectly) as C = log n / [(T_M + T_R)/2]
- Incorrect numerator (should be n+1); the denominator assumes a uniform distribution of symbols

Multiple Access Channels
- Mutual information for A, B, C:
  I(A;B|C) = H(A|C) - H(A|B,C)
  I(A,B;C) = H(A,B) - H(A,B|C)
- A rate pair (R1, R2) must satisfy:
  0 <= R1 <= I(X1;Y|X2),
  0 <= R2 <= I(X2;Y|X1), and
  0 <= R1 + R2 <= I(X1,X2;Y)

Channel Transitions
- (0,0) -> 0
- (0,1) -> 1
- (1,0) -> 1
- (1,1) -> 2
[Figure: capacities of the correct formula and the M&M estimate vs. log(n)]
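The table above is the noiseless adder channel Y = X1 + X2. A short sketch (illustrative; independent uniform inputs are an assumption, not stated on the slide) evaluates the rate bounds from the previous slide for this channel, recovering the 3/2 sum rate quoted on the next slide:

```python
import math
from itertools import product

p_in = 0.5  # P(Xi = 1); independent uniform inputs assumed
joint = {}  # P(x1, x2, y) for the deterministic adder channel y = x1 + x2
for x1, x2 in product((0, 1), repeat=2):
    joint[(x1, x2, x1 + x2)] = ((p_in if x1 else 1 - p_in)
                                * (p_in if x2 else 1 - p_in))

def H(dist):
    return -sum(p * math.log2(p) for p in dist.values() if p > 0)

def marginal(dist, keep):
    """Marginalize a distribution over tuples down to the indices in `keep`."""
    out = {}
    for k, p in dist.items():
        kk = tuple(k[i] for i in keep)
        out[kk] = out.get(kk, 0.0) + p
    return out

# I(X1,X2;Y) = H(Y), since Y is deterministic given the inputs.
sum_rate = H(marginal(joint, (2,)))
# I(X1;Y|X2) = H(Y|X2) - H(Y|X1,X2) = H(Y|X2), again by determinism.
r1_bound = H(marginal(joint, (1, 2))) - H(marginal(joint, (1,)))

print(sum_rate, r1_bound)  # 1.5 and 1.0: R1 + R2 <= 3/2, each Ri <= 1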

Collaborating Alices
- Can conspire to send data at rate 3/2
- Max possible is log2 3 ≈ 1.58
- With feedback, they can do better than 3/2: each at rate 0.76! (Gaarder & Wolf)

Conclusions
- Introduced multiple access channels into the analysis of covert channels
- Analyzed a simple (noiseless) channel with two Alices
- Noted the effects of varying levels of collusion
- Noted difficulties with timing channels
- Covert channels can't be studied in isolation!