Reliable Deniable Communication: Hiding Messages in Noise


Reliable Deniable Communication: Hiding Messages in Noise
Pak Hou Che, Mayank Bakshi, Sidharth Jaggi
The Chinese University of Hong Kong, The Institute of Network Coding

Alice → Bob: Reliability

Alice → Bob: Reliability
Willie (the Warden): Deniability

Alice's Encoder
If T = 0, X = 0 (the all-zeros word); if T = 1, X = Enc(M).
Message M ∈ {1, …, N}; transmission status T ∈ {0, 1}; N = 2^{Θ(√n)}.
Throughput τ = log N; relative throughput r = (log N)/√n.
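The encoder above fits in a few lines. The random low-weight codebook and all sizes below are illustrative assumptions for a sketch, not the paper's exact construction:

```python
import numpy as np

rng = np.random.default_rng(0)

n = 10_000                    # blocklength
w = int(np.sqrt(n))           # codeword weight O(sqrt(n)), per the low-weight intuition
N = 256                       # illustrative message count; the theorems concern N = 2^{Theta(sqrt n)}

# Hypothetical codebook: each codeword has exactly w ones in random positions.
codebook = np.zeros((N, n), dtype=np.uint8)
for m in range(N):
    codebook[m, rng.choice(n, size=w, replace=False)] = 1

def encode(T, M=None):
    """Alice's encoder: all-zeros if silent (T = 0), Enc(M) if transmitting (T = 1)."""
    return np.zeros(n, dtype=np.uint8) if T == 0 else codebook[M]
```

The point of the construction: when T = 0 Alice sends nothing at all, so deniability hinges entirely on how little a weight-w codeword perturbs Willie's noise statistics.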

Alice's Encoder → BSC(p_b) → Bob's Decoder
If T = 0, X = 0; if T = 1, X = Enc(M). Bob observes Y_b and decodes M̂ = Dec(Y_b).
Reliability: Pr(M̂ = M) > 1 − ε.
Message M ∈ {1, …, N}; transmission status T ∈ {0, 1}; N = 2^{Θ(√n)}.

Alice's Encoder → BSC(p_b) → Bob's Decoder
If T = 0, X = 0; if T = 1, X = Enc(M). Bob observes Y_b and decodes M̂ = Dec(Y_b).
Reliability: Pr(M̂ = M) > 1 − ε.
Message M ∈ {1, …, N}; transmission status T ∈ {0, 1}; N = 2^{Θ(√n)}.
Meanwhile, Willie observes Y_w through BSC(p_w) and forms his (best) estimate T̂ = Dec(Y_w).

Bash, Goeckel & Towsley [1]:
Shared secret of O(√n log n) bits; AWGN channels; but capacity only O(√n) bits.
[1] B. A. Bash, D. Goeckel, and D. Towsley, “Square root law for communication with low probability of detection on AWGN channels,” in Proc. IEEE Int. Symp. on Information Theory (ISIT), 2012, pp. 448–452.

This work: no shared secret. Alice→Bob channel BSC(p_b), Alice→Willie channel BSC(p_w), with p_b < p_w.


Hypothesis Testing
Willie estimates Alice's transmission status:
α = Pr(T̂ = 1 | T = 0) (false alarm), β = Pr(T̂ = 0 | T = 1) (missed detection).
Deniability: α + β ≥ 1 − ε, so Willie's best test does little better than blind guessing.
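A toy experiment makes the test concrete. The weight-threshold detector and every constant below are assumptions for illustration (the threshold rule is natural here but not claimed optimal); with codeword weight well below √n, the empirical α + β stays close to 1:

```python
import numpy as np

rng = np.random.default_rng(1)
n, pw = 10_000, 0.2
w = 20                          # codeword weight well below sqrt(n) = 100 (illustrative)
trials = 2_000

# Willie thresholds the Hamming weight of y_w halfway between the two mean weights.
thresh = n * pw + w * (1 - 2 * pw) / 2

def willie_estimate(y):
    return 1 if y.sum() > thresh else 0

# T = 0: pure noise; false alarms contribute to alpha.
alpha = np.mean([willie_estimate(rng.random(n) < pw) for _ in range(trials)])

# T = 1: a fixed weight-w codeword plus noise; misses contribute to beta.
x = np.zeros(n, dtype=np.uint8)
x[:w] = 1
beta = np.mean([1 - willie_estimate(x ^ (rng.random(n) < pw).astype(np.uint8))
                for _ in range(trials)])

print(alpha + beta)             # close to 1: Willie barely beats a blind guess
```

Raising w toward and beyond √n makes α + β drop sharply, which is exactly the square-root boundary the talk is about.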


Intuition
When T = 0, y_w = z_w, so wt_H(y_w) ~ Binomial(n, p_w).

Intuition
When T = 0, y_w = z_w, so wt_H(y_w) ~ Binomial(n, p_w), concentrated at p_w·n ± O(√n).
When T = 1, y_w = Enc(M) ⊕ z_w, so the expected weight rises by wt(Enc(M))·(1 − 2p_w); if codewords have weight O(√n), this shift hides within the Θ(√n) standard deviation of the pure-noise weight.
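This intuition is easy to check numerically: transmitting a weight-w codeword raises the expected weight of y_w by w(1 − 2p_w), while the pure-noise weight already fluctuates by about √(n·p_w(1 − p_w)). The constants below are illustrative:

```python
import numpy as np

n, pw = 100_000, 0.2
sigma = np.sqrt(n * pw * (1 - pw))      # std of wt_H(y_w) under pure noise
for w in (int(0.1 * np.sqrt(n)), int(np.sqrt(n)), int(10 * np.sqrt(n))):
    shift = w * (1 - 2 * pw)            # mean-weight increase when Alice transmits
    print(w, round(shift / sigma, 2))   # Theta(sqrt(n))-weight codewords give a Theta(1) ratio
```

Only at weight Θ(√n) is the ratio a constant: smaller weights are invisible to Willie, larger ones stick out, matching Theorem 1 below.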

Theorem 1 (codeword weights): high deniability ⇒ low-weight codewords.
If too many codewords have weight "much" greater than c√n, then the system is "not very" deniable.

Theorems 2 & 3 (converse and achievability for reliable, deniable communication)

[Figure, built up across several slides: the (p_b, p_w) square, with p_b ∈ [0, 1/2] on the horizontal axis (Bob's channel BSC(p_b)) and p_w ∈ [0, 1/2] on the vertical axis (Willie's channel BSC(p_w)).]

In the region p_b > p_w (Willie's channel less noisy than Bob's): N = 0.
On the boundary p_w = 1/2, Willie's observation is pure noise; on the boundary p_b = 0, Bob's channel is noiseless.
Counting low-weight codewords gives a first converse, N = 2^{O(√n log n)} (since n^{√n} = 2^{O(√n log n)}).
In the region p_w > p_b, "standard" information-theoretic inequalities plus Theorem 1 (wt("most codewords") < √n) tighten the converse to N = 2^{O(√n)}.
Main theorem: the region p_w > p_b is achievable with N = 2^{Ω(√n)}, matching the converse.

[Figure: logarithm of # codewords vs. wt_H(y_w), from 0 to n; the curve peaks near weight n/2 at log C(n, n/2) ≈ n.]

[Figure: for x = 0, the distribution Pr_{Z_w}(wt_H(y_w)) has peak height O(1/√n) and is concentrated in [p_w·n − O(√n), p_w·n + O(√n)]; the log-# codewords envelope reaches nH(p_w).]

[Figure: for codewords of weight c√n (fractional weight ρ), Pr_{M,Z_w}(wt_H(y_w)) has peak height O(1/√n) and is concentrated in [(p_w ∗ ρ)n − O(√n), (p_w ∗ ρ)n + O(√n)]; the envelope reaches nH(p_w ∗ ρ).]
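Here p ∗ ρ denotes binary convolution, the effective crossover probability of two cascaded BSCs. A one-line helper, sanity-checked against its edge cases:

```python
def bconv(p, q):
    """Binary convolution p * q: flip probability of BSC(p) followed by BSC(q)."""
    return p * (1 - q) + q * (1 - p)

assert bconv(0.2, 0.0) == 0.2    # a noiseless second stage changes nothing
assert bconv(0.3, 0.5) == 0.5    # a fair-coin stage erases everything
```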

Theorem 3 – Reliability proof sketch
Random code: 2^{O(√n)} codewords, each of weight O(√n), e.g.
1000001000000000100100000010000000100
0001000000100000010000000010000000001
0010000100000001010010000000100010011
⋮
0000100000010000000000010000000010000

Theorem 3 – Reliability proof sketch
Codewords of weight O(√n): E(intersection of 2 codewords) = O(1), so "most" codewords are "well-isolated".
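The O(1) expected intersection is quick to verify by simulation; with w = √n the expectation is exactly w²/n = 1 (sizes here are illustrative):

```python
import numpy as np

rng = np.random.default_rng(2)
n = 10_000
w = int(np.sqrt(n))              # w = 100, so the expected overlap is w*w/n = 1

# Overlap of the supports of two independent random weight-w codewords.
overlaps = [
    len(set(rng.choice(n, size=w, replace=False)) &
        set(rng.choice(n, size=w, replace=False)))
    for _ in range(2_000)
]
print(sum(overlaps) / len(overlaps))   # concentrates near w*w/n = 1
```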

Theorem 3 – Minimum-distance decoding
Distinct codewords x, x′ differ in O(√n) positions, so Pr(x decoded as x′) < 2^{−O(√n)}.
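A minimal end-to-end check of minimum-distance decoding with a random low-weight codebook (all parameters are illustrative assumptions): even though the BSC flips far more bits than any codeword contains, the distance gap between the true and any wrong codeword is large relative to its fluctuation, so decoding essentially never errs at these sizes.

```python
import numpy as np

rng = np.random.default_rng(3)
n, pb = 10_000, 0.1
w, N = 100, 64                       # weight-100 codewords, 64 messages (illustrative)

codebook = np.zeros((N, n), dtype=np.uint8)
for m in range(N):
    codebook[m, rng.choice(n, size=w, replace=False)] = 1

def bsc(x, p):
    """Flip each bit independently with probability p."""
    return x ^ (rng.random(x.shape) < p).astype(np.uint8)

def dmin_decode(y):
    """Minimum-Hamming-distance decoding over the codebook."""
    return int(np.argmin((codebook ^ y).sum(axis=1)))

sent = rng.integers(0, N, size=200)
errors = sum(dmin_decode(bsc(codebook[m], pb)) != m for m in sent)
print(errors / len(sent))            # empirical error rate, essentially 0 here
```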

Theorem 3 – Deniability proof sketch
Recall: want to show V(P_0, P_1) < ε, where P_0 and P_1 are the distributions of Willie's observation y_w under T = 0 and T = 1, and V(·,·) is variational distance.

Theorem 3 – Deniability proof sketch
Want V(P_0, P_1) < ε. [Figure: the distributions P_0 and P_1.]

Theorem 3 – Deniability proof sketch
[Figure: log(# codewords) vs. wt_H(y_w); Pr_{C,Z_w}(wt_H(y_w)) has peak height O(1/√n), concentrated in [(p_w ∗ ρ)n − O(√n), (p_w ∗ ρ)n + O(√n)].]

Theorem 3 – Deniability proof sketch
Key object: the codebook-averaged distribution E_C(P_1). [Figure: P_0, P_1, and E_C(P_1).]

Theorem 3 – Deniability proof sketch
Triangle inequality: V(P_0, P_1) ≤ V(P_0, E_C(P_1)) + V(E_C(P_1), P_1). [Figure: P_0, P_1, and E_C(P_1).]
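The variational distance and this triangle-inequality step can be sketched on toy distributions; the three pmfs below are made-up stand-ins, not the actual P_0, P_1, E_C(P_1):

```python
import numpy as np

def tv(p, q):
    """Variational (total-variation) distance between pmfs on a common support."""
    return 0.5 * float(np.abs(np.asarray(p) - np.asarray(q)).sum())

P0  = np.array([0.70, 0.20, 0.10])   # stand-in: Willie's output distribution when T = 0
P1  = np.array([0.50, 0.30, 0.20])   # stand-in: output distribution when T = 1
EP1 = np.array([0.60, 0.25, 0.15])   # stand-in: the codebook average E_C(P1)

# V(P0, P1) <= V(P0, E_C(P1)) + V(E_C(P1), P1)
assert tv(P0, P1) <= tv(P0, EP1) + tv(EP1, P1) + 1e-12
```

The proof bounds each term on the right separately: the first by concentration of the codebook-averaged output, the second by showing a random codebook's output distribution is close to its average.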

Theorem 3 – Deniability proof sketch
[Figure: comparing E_C(P_1) with P_1.]

Theorem 3 – Deniability proof sketch
[Figure: log(# codewords) vs. wt_H(y_w), concentrated in [p_w·n − O(√n), p_w·n + O(√n)].]

Theorem 4: too few codewords ⇒ not deniable.
[Figure: logarithm of # codewords vs. wt_H(y_w).]

Summary