Private codes, or: Succinct random codes that are (almost) perfect. Michael Langberg, California Institute of Technology.


1 Private codes, or: succinct random codes that are (almost) perfect. Michael Langberg, California Institute of Technology.

2 Coding theory. Error-correcting codes: A wants to send a message w ∈ {0,1}^k to B over a noisy channel. A encodes w with C: {0,1}^k → {0,1}^n and transmits C(w) ∈ {0,1}^n; the noise corrupts it, and B receives a word c, which B decodes back to w.

3 Consider two types of channels. The design of C depends on the properties of the channel. BSC_p: Binary Symmetric Channel, in which each bit is flipped independently with probability p. ADVC_p: Adversarial Channel, in which a p-fraction of the bits are flipped maliciously.

4 BSC_p: what's known? Thm. [Shannon]: One can construct codes C: {0,1}^k → {0,1}^n that allow communication over BSC_p for any p < 1/2 with rate k/n ~ 1-H(p). In particular, there exist codes for BSC_{1/2-ε}. (A transmits C(w); the channel adds an error vector e; B receives C(w)+e.)
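As a quick numeric illustration of the rate in Shannon's theorem (a sketch added here, not part of the slides; the helper names are ours), the achievable rate 1-H(p) can be computed directly from the binary entropy function:

```python
import math

def binary_entropy(p: float) -> float:
    """H(p) = -p*log2(p) - (1-p)*log2(1-p), with H(0) = H(1) = 0."""
    if p == 0.0 or p == 1.0:
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def shannon_rate(p: float) -> float:
    """Best achievable rate k/n over BSC_p: 1 - H(p)."""
    return 1.0 - binary_entropy(p)

# As p -> 1/2 the rate tends to 0, but it stays positive for any p < 1/2.
```

For example, at p = 1/4 the rate is roughly 0.189, and even at p = 0.49 it remains (barely) positive.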

5 ADVC_p: Can we match these results in the presence of ADVC_p? Consider for example p = 1/2-ε: we would need codes of minimum distance 2pn ~ n, and such codes do not exist (with constant rate)! In general, for p < 1/2 we need codes of minimum distance 2pn and rate k/n ~ 1-H(p). Such codes are close to being perfect and are known not to exist (asymptotically). (As before, the channel adds e to C(w) and B receives C(w)+e.) So the answer is: No!
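The nonexistence claim can be made concrete via the Plotkin bound (anticipating the "Lower bounds" slide): for even minimum distance d with 2d > n, a binary code of length n has at most 2⌊d/(2d-n)⌋ codewords, a constant independent of n, so the rate vanishes. A small sketch, with a function name of our choosing:

```python
def plotkin_max_codewords(n: int, d: int) -> int:
    """Plotkin bound (d even, 2d > n): a binary code of length n and
    minimum distance d has at most 2*floor(d / (2*d - n)) codewords."""
    assert d % 2 == 0 and 2 * d > n
    return 2 * (d // (2 * d - n))

# With d = 2pn for p close to 1/2, the bound is a small constant,
# so the rate k/n tends to 0: no constant-rate code survives ADVC_{1/2-eps}.
```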

6 This talk. Seen: the BSC is strictly weaker than the ADVC. Goal: relax the framework so as to allow communication over the ADVC with the parameters of the BSC. Relaxation: introduce "private randomness", i.e., assume that the sender and receiver have a shared random string, hidden from the channel. Q: Can we match the parameters of the BSC (e.g., ADVC_{1/2-ε})?

7 The model: private codes. A and B share m random bits r (hidden from the channel). A encodes w ∈ {0,1}^k with C: {0,1}^k × {0,1}^m → {0,1}^n and transmits C(w,r) ∈ {0,1}^n. The adversary corrupts the transmission; B receives c ∈ {0,1}^n and decodes D(c,r) = w.

8 Private codes. Roughly speaking: a private code (C,D) is said to allow communication over ADVC_p if, for every message w and any adversary, the communication of w succeeds with high probability over the shared random string r: ∃D ∀w ∀ADV: Pr_r[D(C(w,r)+e, r) = w] is large.

9 Private codes: related work. Private codes have been studied in the past [Shannon; Blackwell, Breiman, Thomasian; Ahlswede]. Private codes in the presence of adversarial channels have also been studied: [Lipton]: "code scrambling".
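A minimal sketch of the "code scrambling" idea, under the assumption that the shared randomness consists of a random permutation of the n positions plus a one-time pad (the function names here are ours): masking and permuting the codeword hides its structure from the adversary, so a weight-pn adversarial error behaves like a random error of the same weight.

```python
import random

def scramble_encode(codeword, perm, pad):
    """Permute the bit positions, then XOR a one-time pad.
    The channel then sees a string independent of the codeword's structure."""
    permuted = [codeword[perm[i]] for i in range(len(codeword))]
    return [b ^ p for b, p in zip(permuted, pad)]

def scramble_decode(received, perm, pad):
    """Undo the pad, then the permutation; a channel error e is mapped to
    a permuted error, which looks random from the adversary's viewpoint."""
    unmasked = [b ^ p for b, p in zip(received, pad)]
    out = [0] * len(received)
    for i in range(len(received)):
        out[perm[i]] = unmasked[i]
    return out

# Shared randomness: a permutation (~ n log n bits) plus a pad (n bits),
# consistent with the m ~ n log(n) shared bits attributed to [Lipton].
n = 8
rng = random.Random(0)
perm = list(range(n)); rng.shuffle(perm)
pad = [rng.randrange(2) for _ in range(n)]
```

Without channel error, decoding exactly inverts encoding; with error e, the inner code only needs to handle the descrambled (random-looking) error.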

10 Private codes: properties. Do private codes enable communication over ADVC_{1/2-ε}? Yes!! There exist private codes that allow communication over ADVC_p with rate k/n ~ 1-H(p), matching the parameters of the BSC_p model.

11 Our results. We study the framework of private codes and match the parameters obtainable in the BSC model. [Lipton]: many shared random bits, m ~ n log(n). We analyze the amount of shared randomness needed to obtain private codes that match the BSC parameters, and show that a shared random string of size ~ log(n) is necessary and sufficient. We also present a connection between list-decodable codes and private codes.

12 List decoding vs. private decoding. Thm: List decoding implies (unique) private codes. Using shared randomness, any list-decodable code can be used to construct a uniquely decodable private code. The reduction is efficient and needs only log(n) shared random bits.

13 Proof technique. Let C be a standard code. Use C to construct a private code C*: {0,1}^k × {0,1}^m → {0,1}^n by defining, for each value r of the shared randomness, a standard code C*|_r: {0,1}^k → {0,1}^n as a subcode of C. Desired properties of C*|_r: ideally, unique decoding: for every r and every ball B of radius pn, only one codeword of C*|_r lies in B. A sufficient condition: "hide" r, plus unique decoding on average: for every ball B and most r, only one codeword lies in B. If C is list decodable with list size ≤ L at radius pn, this sufficient condition can be obtained efficiently with polynomially many subcodes!
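The disambiguation step can be sketched as follows (a toy illustration with hypothetical names, not the paper's exact construction): the decoder first list-decodes the received word to at most L candidate messages, and the shared randomness r selects a hash whose zero set defines the subcode C*|_r; the sender only transmits messages in that subcode, so with high probability over r the transmitted message is the unique surviving candidate.

```python
def h(r, w, range_size):
    """Pairwise-independent-style hash of a message w; r = (a, b, prime)."""
    a, b, prime = r
    return ((a * w + b) % prime) % range_size

def decode_uniquely(list_candidates, r, range_size):
    """Keep only list candidates inside the subcode {w : h_r(w) = 0}.
    Over a random r, a fixed spurious candidate survives with probability
    about 1/range_size, so a list of size L keeps any spurious candidate
    with probability about L/range_size."""
    survivors = [w for w in list_candidates if h(r, w, range_size) == 0]
    return survivors[0] if len(survivors) == 1 else None
```

Describing r takes O(log n) bits when the hash parameters fit in O(log n) bits, matching the slide's shared-randomness bound.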

14 Concluding remarks. We studied private codes and matched the parameters of the BSC model with log(n) shared bits: shared randomness enables unique decoding whenever list decoding was possible. Multiple messages: fresh randomness is needed for each message; alternatively, one may assume a cryptographic private-key setting. For the public-key setting, see [Micali, Peikert, Sudan, Wilson]. Thanks.

15 Lower bounds: Elias-Bassalygo; Plotkin.