Rate-distortion Theory for Secrecy Systems

Presentation transcript:

Rate-distortion Theory for Secrecy Systems
Paul Cuff, Electrical Engineering, Princeton University

Information Theory. (Diagram: channel coding, source coding, secrecy; secrecy has both a channel side and a source side.)

Source Coding: Describe an information signal (source) with a message. (Diagram: information → encoder → message → decoder → reconstruction.)

Entropy: If $X^n$ is i.i.d. according to $p_X$, then rate $R > H(X)$ is necessary and sufficient for lossless reconstruction. (Diagram: the space of $X^n$ sequences; enumerate the typical set.)
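
As a minimal illustration (my addition, not from the slides), the following Python sketch computes $H(X)$ and the approximate typical-set size that underlies the enumeration argument:

```python
import numpy as np

def entropy(p):
    """Shannon entropy H(X) in bits of a probability vector p."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]                     # skip zero-probability symbols
    return -np.sum(p * np.log2(p))

p = [0.5, 0.25, 0.25]
H = entropy(p)
n = 100                              # block length
print(f"H(X) = {H:.2f} bits")        # 1.50
# The typical set has ~2^{nH} sequences, so ~nH bits suffice to
# enumerate it, versus n*log2(3) bits for all 3^n sequences.
print(f"typical set: ~2^{n*H:.0f} of ~2^{n*np.log2(3):.0f} sequences")
```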

Many Methods: For lossless source coding, the particular encoding method is not so important; it should simply use the full entropy of the bits.

Single-Letter Encoding (method 1): Encode each $X_i$ separately. Under the constraint of decodability, Huffman codes are optimal; the expected length is within one bit of the entropy. Encode tuples of symbols to get closer to the entropy limit.
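
As a minimal sketch of the method (my addition; `huffman_code` is an illustrative helper name), this Python builds an optimal prefix code with a heap:

```python
import heapq

def huffman_code(probs):
    """Build an optimal prefix code for a symbol -> probability dict."""
    # Heap entry: (probability, unique tiebreak, {symbol: partial codeword})
    heap = [(p, i, {s: ""}) for i, (s, p) in enumerate(probs.items())]
    heapq.heapify(heap)
    count = len(heap)
    while len(heap) > 1:
        p1, _, c1 = heapq.heappop(heap)
        p2, _, c2 = heapq.heappop(heap)
        # Prepend a bit distinguishing the two merged subtrees.
        merged = {s: "0" + w for s, w in c1.items()}
        merged.update({s: "1" + w for s, w in c2.items()})
        heapq.heappush(heap, (p1 + p2, count, merged))
        count += 1
    return heap[0][2]

probs = {"a": 0.5, "b": 0.25, "c": 0.125, "d": 0.125}
code = huffman_code(probs)
avg_len = sum(p * len(code[s]) for s, p in probs.items())
print(code)                                 # e.g. {'a': '0', 'b': '10', ...}
print(f"expected length = {avg_len} bits")  # equals H(X) = 1.75 here
```

For this dyadic distribution the expected length exactly equals the entropy; in general it is within one bit, and encoding tuples closes the gap.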

Random Binning (method 2): Assign to each $X^n$ sequence a random bit sequence (a hash function). (Diagram: the space of $X^n$ sequences mapped to random tags such as 0100110101011, 0110100010010, 1101011000100.)
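
A toy illustration of the binning idea (my addition; a cryptographic hash stands in for the random assignment):

```python
import hashlib

def random_bin(x_seq, rate_bits):
    """Hash a source sequence into one of 2^rate_bits bins."""
    digest = hashlib.sha256(bytes(x_seq)).digest()
    return int.from_bytes(digest, "big") % (1 << rate_bits)

x = [0, 1, 1, 0, 1, 0, 0, 1]                 # an X^n sequence
print(f"bin tag: {random_bin(x, 13):013b}")
# Lossless decoding succeeds w.h.p. once the number of bins exceeds the
# ~2^{nH(X)} typical sequences (rate R > H): the decoder outputs the
# unique typical sequence that hashes to the received tag.
```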

Linear Transformation (method 3). (Diagram: the source $X^n$ is multiplied by a random matrix to produce the message $J$.)
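
A minimal sketch of this method (my addition, assuming a binary source and arithmetic over GF(2)):

```python
import numpy as np

rng = np.random.default_rng(0)
n, m = 20, 12                        # source length, message length (m ≈ nH)
A = rng.integers(0, 2, size=(m, n))  # random binary matrix
x = rng.integers(0, 2, size=n)       # source sequence X^n

j = (A @ x) % 2                      # message J = A x over GF(2)
print("message bits:", j)
# The message is a linear hash of x; as with random binning, a decoder
# searches the typical set for the unique sequence consistent with J.
```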

Summary: For lossless source coding, the structure of the communication doesn't matter much. (Plot: information gathered vs. message bits received, rising to $H(X^n)$.)

Lossy Source Coding: What if the decoder must reconstruct with less than complete information? The error probability will be close to one, so use distortion as a performance metric: $\frac{1}{n}\sum_{i=1}^{n} d(X_i, Y_i)$.

Poor Performance: Random binning and random linear transformations are useless for lossy compression! (Plot: distortion vs. message bits received; the distortion stays at $\min_y \mathrm{E}\,[d(X,y)]$ until nearly $H(X^n)$ bits are received, in contrast to the time-sharing line. Massey conjecture: time sharing is optimal for linear codes.)

Puzzle: Describe an n-bit random sequence. Allow 1 bit of distortion. Send only 1 bit.

Rate-Distortion Theorem [Shannon]: Choose $p(y|x)$ such that $R > I(X;Y)$ and $D > \mathrm{E}\,[d(X,Y)]$.
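
For a concrete instance (my addition, a standard consequence of the theorem): a Bernoulli(p) source with Hamming distortion has $R(D) = h(p) - h(D)$ for $0 \le D \le \min(p, 1-p)$, where $h$ is the binary entropy function:

```python
import numpy as np

def h2(q):
    """Binary entropy in bits."""
    if q in (0.0, 1.0):
        return 0.0
    return -q * np.log2(q) - (1 - q) * np.log2(1 - q)

def rate_distortion_bernoulli(p, D):
    """R(D) for a Bernoulli(p) source under Hamming distortion."""
    if D >= min(p, 1 - p):
        return 0.0               # this distortion is achievable at zero rate
    return h2(p) - h2(D)

print(rate_distortion_bernoulli(0.5, 0.11))  # ≈ 0.5 bit per symbol
```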

Structure of Useful Partial Information: Coordination (given source $P_X$, construct $Y^n \sim P_{Y|X}$).
Empirical: $\frac{1}{n}\sum_{i=1}^{n} \mathbf{1}\{(X_i, Y_i) = (a,b)\} \approx P_{X,Y}(a,b)$.
Strong: $P_{X^n Y^n} \approx \prod_{i=1}^{n} P_{X,Y}$.
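
A small sketch of what the empirical condition measures (my addition; `joint_type` is an illustrative helper name):

```python
import numpy as np
from collections import Counter

def joint_type(x, y):
    """Empirical joint distribution (first-order statistics) of (x_i, y_i)."""
    n = len(x)
    return {pair: c / n for pair, c in Counter(zip(x, y)).items()}

rng = np.random.default_rng(1)
x = rng.integers(0, 2, size=10_000)
y = np.where(rng.random(10_000) < 0.1, 1 - x, x)   # Y: X through a BSC(0.1)
print(joint_type(x.tolist(), y.tolist()))
# ≈ {(0,0): 0.45, (1,1): 0.45, (0,1): 0.05, (1,0): 0.05}, i.e. close to
# the target P_{X,Y} -- this closeness is empirical coordination.
```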

Empirical Coordination Codes: The codebook is a random subset of $Y^n$ sequences; the encoder finds the codeword that has the right joint first-order statistics with the source.

Strong Coordination: The black box acts like a memoryless channel $P_{Y|X}$; $X$ and $Y$ are an i.i.d. multisource. (Diagram: source and communication resources in, output out.)

Strong Coordination: a synthetic channel $P_{Y|X}$. (Diagram: Node A sends a message to Node B; both share common randomness; source in, output out.) Related to: the Reverse Shannon Theorem [Bennett et al.], quantum measurements [Winter], communication complexity [Harsha et al.], strong coordination [Cuff-Permuter-Cover], and generating correlated random variables [Anantharam, Gohari, et al.].

Structure of Strong Coordination. (Diagram, involving a key K.)

Information Theoretic Security

Wiretap Channel [Wyner 75]. (Diagram; shown over three slides.)

Confidential Messages [Csiszár-Körner 78]. (Diagram; shown over three slides.)

Merhav 2008

Villard-Piantanida 2010

Other examples of “rate-equivocation” theory: Gündüz-Erkip-Poor 2008; Lai-El Gamal 2008; Tandon-Ulukus-Ramchandran 2009; …

Rate-distortion theory (secrecy)

Achievable Rates and Payoff [Schieler, Cuff 2012 (ISIT)]. (Achievable region and payoff expressions given on slide.)

How to Force High Distortion: Randomly assign bins (bin sizes given on slide). The adversary only knows the bin; it has no knowledge of the sequence within the bin.

Causal Disclosure

Causal Disclosure (case 1)

Causal Disclosure (case 2)

Example: The source distribution is Bernoulli(1/2). Payoff: one point if Y = X but Z ≠ X.
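
A toy Monte Carlo sketch of this game (my construction, not from the talk; it assumes one secret-key bit per source symbol, used as a one-time pad):

```python
import numpy as np

rng = np.random.default_rng(2)
n = 100_000
x = rng.integers(0, 2, size=n)       # source: Bernoulli(1/2)
key = rng.integers(0, 2, size=n)     # one-time pad (1 key bit per symbol)
msg = x ^ key                        # transmitted message

y = msg ^ key                        # receiver recovers X exactly
z = msg                              # one optimal adversary guess: given msg,
                                     # X is uniform, so any guess wins 1/2
payoff = np.mean((y == x) & (z != x))
print(f"average payoff ≈ {payoff:.3f}")   # ≈ 0.5
```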

Rate-payoff Regions

General Disclosure: causal or non-causal.

Strong Coordination for Secrecy. (Diagram: Node A communicates to Node B; information → action; the adversary attacks; channel synthesis.) Not an optimal use of resources!

Strong Coordination for Secrecy. (Diagram as above, with an auxiliary sequence $U^n$.) Reveal the auxiliary $U^n$ “in the clear.”

Payoff-Rate Function. Theorems: the maximum achievable average payoff and the accompanying Markov relationship (expressions on slide).

Structure of the Secrecy Code. (Diagram, involving a key K.)

Intermission. (Equivocation is next.)

Log-Loss Distortion: The reconstruction space of Z is the set of probability distributions.

Best Reconstruction Yields Entropy
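
To make the slide's claim concrete (a standard log-loss identity, my addition): with log-loss $d(x, z) = \log \frac{1}{z(x)}$ for a reconstruction distribution $z$,

$\mathrm{E}\,[d(X, z)] = \sum_x p(x) \log \frac{1}{z(x)} = H(X) + D(p \,\|\, z) \ge H(X),$

with equality iff $z = p$. Hence the best reconstruction is the posterior distribution itself, and the minimal expected log-loss equals the (conditional) entropy.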

Log-loss $\pi_1$ (disclose X causally).

Log-loss $\pi_2$ (disclose Y causally).

Log-loss $\pi_3$ (disclose X and Y).

Result 1 from Secrecy R-D Theory

Result 2 from Secrecy R-D Theory

Result 3 from Secrecy R-D Theory

Some Difficulties: In point-to-point settings, optimal communication produces stationary performance. The following scenarios lend themselves to time-varying performance.

Secure Channel: The adversary does not observe the message and has access only to the causal disclosure. Problem: we are not able to isolate strong and empirical coordination, since empirical coordination provides short-duration strong coordination; this makes optimality hard to prove.

Side Information at the Intended Receiver: Again, even a communication scheme built only on empirical coordination (covering) provides a short duration of strong coordination; performance degrades in stages throughout the block.

Cascade Network

Inner and Outer Bounds

Summary: To assist an intended receiver with partial information while hindering an adversary through partial secrecy, a new encoding method is needed. Equivocation is characterized by this rate-distortion theory. The main new encoding feature is strong coordination superpositioned over revealed information (a.k.a. the Reverse Shannon Theorem, or distributed channel synthesis). In many cases (e.g., side information, a secure communication channel, a cascade network), this distinct layering may not be possible.

Restate Problem: Example 1 (RD Theory). Existence of distributions. Standard form: can we design encoder f and decoder g such that the constraints hold (expressions on slide)? Restated: does there exist a distribution with the structure shown on the slide?

Restate Problem: Example 2 (Secrecy) [Cuff 10]. Existence of distributions. Standard form: can we design f and g such that the score against the eavesdropper Eve satisfies the constraints (expressions on slide)? Restated: does there exist a distribution with the structure shown on the slide?

Tricks with Total Variation. Technique: find a distribution $p_1$ that is easy to analyze and satisfies the relaxed constraints; then construct $p_2$ to satisfy the hard constraints while maintaining a small total-variation distance to $p_1$. How? Property 1 (stated on slide).

Tricks with Total Variation (continued). Why does closeness suffice? Property 2 (bounded functions): if $|f| \le b$, then $|\mathrm{E}_{p_1} f - \mathrm{E}_{p_2} f| \le 2b \cdot \mathrm{TV}(p_1, p_2)$.
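
A small numerical check of the bounded-function property (my addition): expectations of any bounded function under two distributions differ by at most $2\,\|f\|_\infty$ times their total-variation distance.

```python
import numpy as np

def total_variation(p, q):
    """Total-variation distance between two pmfs on the same alphabet."""
    return 0.5 * np.sum(np.abs(np.asarray(p) - np.asarray(q)))

p1 = np.array([0.50, 0.25, 0.25])
p2 = np.array([0.48, 0.27, 0.25])
f  = np.array([1.0, 0.2, -0.5])            # any bounded function f(x)

tv = total_variation(p1, p2)
gap = abs(p1 @ f - p2 @ f)
print(f"TV = {tv:.3f}, |E_p1 f - E_p2 f| = {gap:.3f}")
assert gap <= 2 * tv * np.max(np.abs(f))   # the bounded-function property
```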

Summary of Achievability Proof Techniques: pose problems in terms of the existence of joint distributions; relax requirements to “close in total variation”; main tool: the reverse channel encoder; easy analysis of the optimal adversary. Secrecy example: for arbitrary ε, does there exist a distribution satisfying the constraints on the slide?

Cloud Overlap Lemma. Previous encounters: Wyner 1975 (used divergence); Han-Verdú 1993 (general channels, used total variation); Cuff 2008, 2009, 2010, 2011 (simple proof, applied to secrecy encoding). (Diagram: memoryless channel $P_{X|U}(x|u)$.)

Reverse Channel Encoder: For simplicity, ignore the key K and let $J_a$ be the part of the message that the adversary obtains (i.e., $J = (J_a, J_s)$; ignore $J_s$ for now). Construct a joint distribution between the source $X^n$ and the information $J_a$ (revealed to the adversary) using a memoryless channel $P_{X|U}(x|u)$.

Simple Analysis: This encoder yields a very simple analysis and convenient properties. If $|J_a|$ is large enough, then $X^n$ will be nearly i.i.d. in total variation. (Performance expression on slide; notice that it simplifies to a single-letter expression.)
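
A toy soft-covering simulation (my construction, not from the talk) illustrating the cloud-overlap phenomenon: when the codebook rate exceeds $I(U;X)$, the output distribution induced by a uniformly chosen codeword approaches i.i.d. in total variation.

```python
import numpy as np
from itertools import product

rng = np.random.default_rng(3)
n = 8                                   # block length
R = 0.8                                 # codebook rate (bits/symbol)
size = int(2 ** (n * R))                # 2^{nR} codewords
codebook = rng.integers(0, 2, size=(size, n))   # i.i.d. Bern(1/2) U^n

def p_x_given_u(x, u, eps=0.2):
    """Memoryless channel P_{X|U}: each bit of U flipped with prob eps."""
    flips = int(np.sum(x != u))
    return eps**flips * (1 - eps)**(n - flips)

# Distribution of X^n induced by a uniform codeword index + the channel,
# compared in total variation to the target i.i.d. output Bern(1/2)^n.
tv = 0.0
for xs in product([0, 1], repeat=n):
    x = np.array(xs)
    p_induced = np.mean([p_x_given_u(x, u) for u in codebook])
    tv += 0.5 * abs(p_induced - 0.5**n)
print(f"TV to i.i.d. ≈ {tv:.3f}")  # shrinks with n when R > I(U;X) ≈ 0.28
```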

Summary of Achievability Proof Techniques: pose problems in terms of the existence of joint distributions; relax requirements to “close in total variation”; main tool: the reverse channel encoder; easy analysis of the optimal adversary. I've outlined tools and techniques for designing optimal encoders for source coding for game-theoretic secrecy. The main idea was to pose the operational question in terms of the existence of a joint distribution. We then showed that most hard constraints can be relaxed, which was important for removing the causal nature of the problem statement. Finally, we constructed a joint distribution using a memoryless channel and the cloud overlap lemma, which is very easy to analyze for the worst-case adversary. The resulting “reverse channel encoder” behaves somewhat like a rate-distortion encoder, but is random.