Information-Theoretic Secrecy


Information-Theoretic Secrecy
CSCI381 Fall 2005 GWU
Reference: Stinson

Probability theory: Bayes' theorem
Perfect secrecy: definition, some proofs, examples: one-time pad; simple secret sharing
Entropy: definition, Huffman coding property, unicity distance

CS284/Spring04/GWU/Vora/Shannon Secrecy

Bayes' Theorem
If Pr[y] > 0, then Pr[x|y] = Pr[x] Pr[y|x] / Σ_{x'∈X} Pr[x'] Pr[y|x'].
What is the probability that the first dice throw is 2, given that the sum of the two throws is 5?
What is the probability that the second dice throw is 3, given that the product of the two throws is 6? Given that it is 5?
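The dice questions can be answered by direct enumeration over the 36 equally likely outcomes; a minimal sketch (the helper `cond_prob` is an illustrative name, not from the slides):

```python
from fractions import Fraction

def cond_prob(event, given):
    """Pr[event | given] over two fair six-sided dice, by enumeration."""
    space = [(d1, d2) for d1 in range(1, 7) for d2 in range(1, 7)]
    cond = [w for w in space if given(w)]
    hits = [w for w in cond if event(w)]
    return Fraction(len(hits), len(cond))

# Pr[first = 2 | sum = 5]: the sum-5 outcomes are (1,4),(2,3),(3,2),(4,1)
p1 = cond_prob(lambda w: w[0] == 2, lambda w: w[0] + w[1] == 5)

# Pr[second = 3 | product = 6] and Pr[second = 3 | product = 5]
p2 = cond_prob(lambda w: w[1] == 3, lambda w: w[0] * w[1] == 6)
p3 = cond_prob(lambda w: w[1] == 3, lambda w: w[0] * w[1] == 5)

print(p1, p2, p3)  # 1/4 1/4 0
```

Product 5 forces the outcome into {(1,5), (5,1)}, neither of which has a 3, hence the zero.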

(Im)Perfect Secrecy: Example
P = {1, 2, 3}, K = {K1, K2, K3}, C = {2, 3, 4, 5, 6}
Encryption table (rows are keys, columns are plaintexts; e_Ki(x) = x + i):

        1   2   3
  K1    2   3   4
  K2    3   4   5
  K3    4   5   6

Keys chosen equiprobably; Pr[1] = Pr[2] = Pr[3] = 1/3
Pr[c=3] = ?  Pr[m|c=3] = ?  Pr[k|c=3] = ?
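The three questions can be answered by enumerating all (key, message) pairs. This sketch assumes the shift-style table e_Ki(x) = x + i, which is consistent with C = {2, 3, 4, 5, 6}:

```python
from fractions import Fraction
from collections import defaultdict

# Assumed table: key Ki maps plaintext x to ciphertext x + i.
P_set = [1, 2, 3]
K_set = [1, 2, 3]            # K1, K2, K3

pr_c = defaultdict(Fraction)   # Pr[c]
pr_mc = defaultdict(Fraction)  # Pr[m, c]
for k in K_set:
    for m in P_set:
        c = m + k
        p = Fraction(1, 9)     # keys and messages equiprobable
        pr_c[c] += p
        pr_mc[(m, c)] += p

print(pr_c[3])                  # Pr[c = 3] = 2/9
print(pr_mc[(1, 3)] / pr_c[3])  # Pr[m = 1 | c = 3] = 1/2, but Pr[m = 1] = 1/3
print(pr_mc[(3, 3)] / pr_c[3])  # Pr[m = 3 | c = 3] = 0
```

Seeing c = 3 rules out m = 3 entirely and skews the posterior away from the uniform prior, so this cryptosystem does not have perfect secrecy.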

(Im)Perfect Secrecy: Example
The cipher from the previous slide:

        1   2   3
  K1    2   3   4
  K2    3   4   5
  K3    4   5   6

A second cipher over P = {a, b, c} with K = {K1, K2, K3} and ciphertexts drawn from {1, 2, 3, 4}, where row K1 maps a, b, c to 1, 2, 3.
How should the above ciphers be changed to improve the cryptosystem? What defines a good cryptosystem?

(Im)Perfect Secrecy: Example, Latin Square

        1   2   3
  K1    1   2   3
  K2    2   3   1
  K3    3   1   2

(each ciphertext appears exactly once in every row and every column)
Assume all keys and messages equiprobable.
What's good about this? P(k|c)? P(m|c)?
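With a Latin square every posterior equals the prior, which can be checked by enumeration. The sketch below uses one standard 3x3 Latin square (the cyclic one); any Latin square gives the same result:

```python
from fractions import Fraction
from collections import defaultdict

# One standard 3x3 Latin square as the encryption table:
# row Ki lists the ciphertexts for plaintexts 1, 2, 3.
table = {1: [1, 2, 3], 2: [2, 3, 1], 3: [3, 1, 2]}

pr_c = defaultdict(Fraction)
pr_mc = defaultdict(Fraction)
for k, row in table.items():
    for m in (1, 2, 3):
        c = row[m - 1]
        pr_c[c] += Fraction(1, 9)   # keys and messages equiprobable
        pr_mc[(m, c)] += Fraction(1, 9)

# Every posterior Pr[m | c] equals the prior 1/3: perfect secrecy.
for m in (1, 2, 3):
    for c in (1, 2, 3):
        print(m, c, pr_mc[(m, c)] / pr_c[c])  # always 1/3
```

The same holds for Pr[k | c]: each ciphertext is explained by exactly one key per plaintext, so the ciphertext reveals nothing.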

Perfect Secrecy: Definition
A cryptosystem has perfect secrecy if Pr[x|y] = Pr[x] for all x ∈ P and y ∈ C.
(a posteriori probability = a priori probability; posterior = prior)

Example: one-time pad
P = C = K = (Z_2)^n
e_K(x_1, x_2, ..., x_n) = (x_1 + K_1, x_2 + K_2, ..., x_n + K_n) mod 2, and d_K = e_K
Show that it provides perfect secrecy.
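A quick empirical check of the exercise: for any fixed plaintext, a uniformly chosen key makes the one-time-pad ciphertext uniform over (Z_2)^n, so the ciphertext carries no information about the plaintext. A small sketch for n = 3:

```python
from itertools import product
from fractions import Fraction
from collections import Counter

n = 3
keys = list(product([0, 1], repeat=n))  # uniform keys in (Z_2)^n

def otp(key, x):
    """Bitwise addition mod 2; encryption and decryption are the same map."""
    return tuple((xi + ki) % 2 for xi, ki in zip(x, key))

# For any fixed plaintext, the ciphertext is uniform over (Z_2)^n,
# hence Pr[x | y] = Pr[x]: perfect secrecy.
for x in [(0, 0, 0), (1, 0, 1), (1, 1, 1)]:
    dist = Counter(otp(k, x) for k in keys)
    assert len(dist) == 2 ** n
    assert all(Fraction(v, len(keys)) == Fraction(1, 2 ** n) for v in dist.values())

# d_K = e_K over Z_2: applying the pad twice recovers the plaintext.
assert otp((1, 0, 1), otp((1, 0, 1), (1, 1, 0))) == (1, 1, 0)
print("one-time pad: uniform ciphertexts for every plaintext")
```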

Some proofs: Thm. 2.4
Thm 2.4: Suppose (P, C, K, E, D) is a cryptosystem where |K| = |P| = |C|. Then the cryptosystem provides perfect secrecy if and only if every key is used with equal probability 1/|K| and, for every x ∈ P and y ∈ C, there is a unique key K such that e_K(x) = y (e.g., a Latin square).

Entropy
H(X) = - Σ_i p_i log2 p_i
Example: p_i = 1/n gives H(X) = log2 n.
Examples: ciphertext and plaintext entropies for the earlier examples.
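The definition translates directly into code; the uniform case p_i = 1/n recovers log2 n, the maximum possible entropy for n outcomes:

```python
from math import log2

def H(ps):
    """Shannon entropy in bits; 0 * log 0 is taken as 0 by convention."""
    return -sum(p * log2(p) for p in ps if p > 0)

print(H([1 / 4] * 4))    # uniform on 4 outcomes: log2 4 = 2.0 bits
print(H([1 / 2, 1 / 2]))  # a fair coin: 1.0 bit
print(H([1.0]))           # a certain outcome: 0.0 bits
```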

Huffman encoding
f: X* → {0, 1}*, a string of random variables to a string of bits
e.g. X = {a, b, c, d}; f(a) = 1, f(b) = 10, f(c) = 100, f(d) = 1000

Huffman encoding algorithm
X = {a, b, c, d, e}
p(a) = 0.05, p(b) = 0.1, p(c) = 0.12, p(d) = 0.13, p(e) = 0.6
Repeatedly merge the two least-probable nodes: 0.05 + 0.1 = 0.15; 0.12 + 0.13 = 0.25; 0.15 + 0.25 = 0.4; 0.4 + 0.6 = 1
Resulting code: a: 000, b: 001, c: 010, d: 011, e: 1
Average length = ?  Entropy = ?
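The two questions can be answered mechanically. This sketch builds the Huffman codeword lengths with a heap (a standard implementation, not the slide's own code) and compares average length to entropy:

```python
import heapq
from math import log2

def huffman_lengths(probs):
    """Return {symbol: codeword length} of a Huffman code for probs."""
    heap = [(p, i, {sym: 0}) for i, (sym, p) in enumerate(probs.items())]
    heapq.heapify(heap)
    tie = len(heap)  # unique tie-breaker so dicts are never compared
    while len(heap) > 1:
        p1, _, a = heapq.heappop(heap)
        p2, _, b = heapq.heappop(heap)
        merged = {s: l + 1 for s, l in {**a, **b}.items()}  # one level deeper
        heapq.heappush(heap, (p1 + p2, tie, merged))
        tie += 1
    return heap[0][2]

probs = {'a': 0.05, 'b': 0.1, 'c': 0.12, 'd': 0.13, 'e': 0.6}
lengths = huffman_lengths(probs)
avg = sum(probs[s] * lengths[s] for s in probs)
ent = -sum(p * log2(p) for p in probs.values())
print(lengths)   # a, b, c, d get length 3; e gets length 1
print(avg, ent)  # average length 1.8, entropy about 1.74
```

So the average length is 3(0.05 + 0.1 + 0.12 + 0.13) + 0.6 = 1.8 bits, the entropy is about 1.74 bits, and H(X) ≤ 1.8 ≤ H(X) + 1 as the next theorem promises.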

Theorem
H(X) ≤ average length of Huffman encoding ≤ H(X) + 1
(stated without proof)

Properties of Entropy
H(X) ≤ log2 n
H(X, Y) ≤ H(X) + H(Y)
H(X, Y) = H(X) + H(Y|X) = H(Y) + H(X|Y),
  where H(X|Y) = - Σ_x Σ_y p(y) p(x|y) log2 p(x|y)
H(X|Y) ≤ H(X)
With proofs and examples.
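These identities and inequalities can be sanity-checked numerically on a small joint distribution (the distribution below is an illustrative choice, not from the slides):

```python
from math import log2

def H(ps):
    return -sum(p * log2(p) for p in ps if p > 0)

# A small joint distribution Pr[X = x, Y = y]: rows are x, columns are y.
joint = [[0.2, 0.1],
         [0.1, 0.6]]
px = [sum(row) for row in joint]        # marginal of X
py = [sum(col) for col in zip(*joint)]  # marginal of Y

HX, HY = H(px), H(py)
HXY = H([p for row in joint for p in row])
# H(Y|X) = sum_x p(x) H(Y | X = x)
HY_given_X = sum(px[i] * H([p / px[i] for p in joint[i]]) for i in range(2))

assert abs(HXY - (HX + HY_given_X)) < 1e-9  # chain rule
assert HXY <= HX + HY + 1e-9                # subadditivity
assert HY_given_X <= HY + 1e-9              # conditioning cannot increase entropy
print(HX, HY, HXY, HY_given_X)
```

Equality in the subadditivity and conditioning inequalities holds exactly when X and Y are independent; the correlated table above makes both strict.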

Theorem
H(K|C) = H(K) + H(P) - H(C)
Examples: the previous imperfect squares.
Proof idea: C is determined by (K, P) and P is determined by (K, C), so H(K, P, C) = H(K, C) = H(K, P); expanding both sides with the chain rule gives the result.
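The identity can be verified by enumeration on one of the imperfect examples. This sketch assumes the shift-style table e_Ki(m) = m + i with keys and plaintexts uniform on {1, 2, 3}:

```python
from fractions import Fraction
from collections import defaultdict
from math import log2

def H(ps):
    return -sum(float(p) * log2(float(p)) for p in ps if p > 0)

pr_c = defaultdict(Fraction)
pr_kc = defaultdict(Fraction)
for k in (1, 2, 3):
    for m in (1, 2, 3):
        c = m + k  # assumed shift-style encryption
        pr_c[c] += Fraction(1, 9)
        pr_kc[(k, c)] += Fraction(1, 9)

HK = HP = log2(3)
HC = H(pr_c.values())
# H(K|C) computed directly from the conditional distributions:
HK_given_C = sum(float(pr_c[c]) * H([pr_kc[(k, c)] / pr_c[c] for k in (1, 2, 3)])
                 for c in list(pr_c))

assert abs(HK_given_C - (HK + HP - HC)) < 1e-9
print(HK_given_C)  # about 0.97 bits of key equivocation remain
```

Since H(K|C) < H(K) ≈ 1.58, a single ciphertext already leaks over half a bit about the key.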

Language Entropy and Redundancy
H_L = lim_{n→∞} H(P^n)/n  (lies between 1 and 1.5 bits per letter for English)
R_L = 1 - H_L / log2 |P|  (the amount of "space" in a letter of English for other information)
On average, about n ciphertext characters are needed to break a substitution cipher, where
  n = H(K) / (R_L log2 |P|)
n is the "unicity distance" of the cryptosystem.
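Plugging in concrete numbers for a simple substitution cipher over the 26-letter alphabet makes the formula tangible. The value H_L = 1.25 below is an assumed midpoint of the 1 to 1.5 range quoted above:

```python
from math import log2, factorial

HL = 1.25                        # assumed bits per letter of English
alphabet = 26
HK = log2(factorial(alphabet))   # key entropy: log2(26!) is about 88.4 bits
RL = 1 - HL / log2(alphabet)     # redundancy of English: about 0.73
n = HK / (RL * log2(alphabet))   # unicity distance

print(round(HK, 1), round(RL, 2), round(n))  # roughly 88.4, 0.73, 26
```

So on the order of 25 to 30 ciphertext characters suffice, in principle, to pin down the key of a substitution cipher, matching the classic estimate.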

Proof
Apply the previous theorem to n-character strings:
H(K|C^n) = H(K) + H(P^n) - H(C^n)