Communication & Computation: A need for a new unifying theory
Madhu Sudan, MIT CSAIL
11/3/2008


Slide 1: A need for a new unifying theory
Madhu Sudan, MIT CSAIL

Slide 2: Theory of Computing
- Turing architecture: finite-state control + read/write tape.
- Universal machine: takes encodings of other machines. One machine to rule them all!
- Turing architecture → von Neumann architecture: CPU + RAM.

Slide 3: Theory of Communication
- Shannon's architecture for communication over a noisy channel.
- Yields reliable communication (and storage, i.e. communication across time).
- Encoder sends Y = E(m) over the noisy channel; decoder receives Ŷ and outputs D(Ŷ) = m?
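Shannon's encoder/channel/decoder pipeline can be rendered as a toy simulation. This is my illustration, not the talk's: a 3-fold repetition code stands in for the encoder E and decoder D.

```python
import random

# A toy instantiation of Shannon's pipeline, with a 3-fold repetition code
# standing in for the encoder E and decoder D (my choice of code, for illustration).

def E(m):
    """Encoder: repeat each bit three times."""
    return [b for bit in m for b in (bit, bit, bit)]

def channel(y, p=0.1, rng=random):
    """Noisy channel: flip each bit independently with probability p."""
    return [b ^ (rng.random() < p) for b in y]

def D(y_hat):
    """Decoder: majority vote within each block of three."""
    return [int(sum(y_hat[i:i + 3]) >= 2) for i in range(0, len(y_hat), 3)]

m = [1, 0, 1, 1, 0]
y_hat = channel(E(m))   # Ŷ: a noisy version of Y = E(m)
decoded = D(y_hat)      # equals m unless some 3-block saw two or more flips
```

The repetition code is the simplest possible E; the whole subject is about doing this with far better rate.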

Slide 4: Turing ↔ Shannon
- Turing: assumes perfect storage and perfect communication, to get computation.
- Shannon: assumes computation, to get reliable storage + communication.
- Chicken vs. egg? Fortunately both realized!

Slide 5: Modern Theory (of Comm. & Comp.)
- Network (society?) of communicating computers (Alice, Bob, Charlie, Dick, Eve, Fred).
- Diversity of: capability, protocols, objectives, concerns.

Slide 6: Modern Challenges (to communication)
- Nature of communication is more complex.
  - Channels are more complex (composed of many smaller, potentially clever sub-channels).
  - This alters the nature of errors.
- Scale of information being stored/communicated is much larger.
  - Does scaling enhance reliability or decrease it?
- The meaning of information.
  - Entities are constantly evolving: can they preserve the meaning of information?

Slide 7: Part I: Modeling errors

Slide 8: Shannon (1948) vs. Hamming (1950)
- q-ary channel:
  - Input: n-element string Y over Σ = {1, …, q}.
  - Output: n-element string Ŷ over Σ = {1, …, q}.
- Shannon: Errors = Random. Ŷ_i = Y_i w.p. 1 - p; uniform in Σ - {Y_i} w.p. p.
  - Channel can be used reliably whenever p < 1 - 1/q, so q → ∞ allows p → 1.
- Hamming: Errors = Adversarial. A p-fraction of indices i satisfy Ŷ_i ≠ Y_i.
  - p can never exceed 1/2!
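The two error models can be made concrete with a small simulation. Function names and the sample adversary are mine, for illustration only.

```python
import random

# Two error models on a q-ary channel (my rendering of the slide's definitions).

def shannon_channel(Y, p, q, rng=random):
    # Each symbol survives w.p. 1-p; otherwise it is replaced by a uniformly
    # random *different* symbol (here symbols are 0..q-1).
    return [y if rng.random() >= p else rng.choice([s for s in range(q) if s != y])
            for y in Y]

def hamming_channel(Y, p, adversary):
    # An adversary corrupts up to a p-fraction of positions, chosen however
    # it likes; `adversary` returns (index, new_value) pairs within budget.
    budget = int(p * len(Y))
    out = list(Y)
    for i, v in adversary(Y, budget):
        out[i] = v
    return out

# Example worst-case adversary over q = 4 symbols: corrupt the first `budget`
# positions, always changing the symbol.
worst = lambda Y, budget: [(i, (Y[i] + 1) % 4) for i in range(budget)]
```

The point of the slide is that the same fraction p of errors is far more damaging in the adversarial model.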

Slide 10: Which is the right model?
- 60 years of wisdom: the error model can be fine-tuned, and fresh combinatorics, algorithms, and probabilistic models can be built ... to fit the Shannon model.
- An alternative: List-Decoding [Elias '56]!
  - The decoder is allowed to produce a list {m_1, …, m_l}.
  - "Successful" if {m_1, …, m_l} contains m.
  - "60 years of wisdom" ⇒ this is good enough!
  - [70s]: corrects as many adversarial errors as random ones!

Slide 11: Challenges in List-decoding!
- Algorithms? Correcting even a few errors is already challenging!
- Can we really correct 70% errors? 80% errors? When an adversary injects them? (Note: more errors than data!)
- Till 1988 ... no list-decoding algorithms.
- [Goldreich-Levin '88]: raised the question; gave a non-trivial algorithm (for a weak code); gave cryptographic applications.

Slide 12: Algorithms for List-decoding
- [S. '96], [Guruswami + S. '98]: list-decoding of Reed-Solomon codes; corrected a p-fraction of errors with linear "rate".
- ['98-'06]: many algorithmic innovations [Guruswami, Shokrollahi, Koetter-Vardy, Indyk].
- [Parvaresh-Vardy '05 + Guruswami-Rudra '06]: list-decoding of a new variant of Reed-Solomon codes; corrects a p-fraction of errors with optimal "rate".

Slide 13: Reed-Solomon List-Decoding Problem
- Given:
  - Parameters: n, k, t.
  - Points: (x_1, y_1), …, (x_n, y_n) in the plane (over finite fields, actually).
- Find: all degree-k polynomials that pass through t of the n points, i.e., all p such that deg(p) ≤ k and |{i s.t. p(x_i) = y_i}| ≥ t.
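As a reference point, the problem statement can be solved by brute force over a small prime field: try every polynomial of degree at most k and keep those agreeing with at least t points. This sketch is mine; it is exponential in k, which is exactly what the algorithms on the surrounding slides avoid.

```python
from itertools import product

# Brute-force reference for the Reed-Solomon list-decoding problem over GF(prime).
# Exponential in k; for illustration of the problem only.
def list_decode_brute(points, k, t, prime):
    def ev(coeffs, x):  # Horner evaluation of sum_j coeffs[j] * x^j mod prime
        r = 0
        for c in reversed(coeffs):
            r = (r * x + c) % prime
        return r
    return [coeffs for coeffs in product(range(prime), repeat=k + 1)
            if sum(1 for x, y in points if ev(coeffs, x) == y) >= t]
```

For example, over GF(7) with five points on the line y = x and one corrupted point, the line (coefficients (0, 1)) is in the output list.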

Slide 14: Decoding by Example + Picture [S. '96]
- n = 14; k = 1; t = 5.
- Algorithm idea: find an algebraic explanation of all the points. Stare at it!
- Here: x^4 - y^4 - x^2 + y^2 = 0.
- Factor the polynomial! (x^2 + y^2 - 1)(x + y)(x - y).
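A quick numeric sanity check (my addition) confirms the factorization; expanding the product gives x^4 - y^4 - x^2 + y^2, which pins down the signs. Two degree-4 polynomials agreeing on an 11x11 grid must be identical.

```python
# Check that (x^2 + y^2 - 1)(x + y)(x - y) expands to x^4 - y^4 - x^2 + y^2,
# by comparing the two sides on a grid of integer points.
def lhs(x, y):
    return x**4 - y**4 - x**2 + y**2

def rhs(x, y):
    return (x * x + y * y - 1) * (x + y) * (x - y)
```

The factors are the "explanations": a circle and two lines, which together pass through all 14 points.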

Slide 15: Decoding Algorithm
- Fact: there is always a degree-2√n polynomial through n points.
  - It can be found in polynomial time (by solving a linear system).
- [80s]: polynomials can be factored in polynomial time [Grigoriev, Kaltofen, Lenstra].
- Leads to (simple, efficient) list-decoding correcting a p-fraction of errors, for p → 1.
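The "solve a linear system" step can be sketched concretely. This is my construction, not the talk's code: when the number of monomials of total degree at most D exceeds n, the homogeneous system "vanish at every point" has a nontrivial solution, found here by Gaussian elimination mod a prime.

```python
# Find a nonzero bivariate polynomial Q(x,y) of total degree <= D vanishing on
# all given points over GF(prime). Nontrivial solutions exist whenever
# (D+1)(D+2)/2 > n (more unknown coefficients than constraints).
def vanishing_poly(points, D, prime):
    monos = [(i, j) for i in range(D + 1) for j in range(D + 1 - i)]
    # One linear constraint per point: sum_{(i,j)} c_ij * x^i * y^j = 0 (mod prime).
    A = [[pow(x, i, prime) * pow(y, j, prime) % prime for (i, j) in monos]
         for (x, y) in points]
    m = len(monos)
    pivots = {}  # column -> row holding its pivot (reduced row-echelon form)
    r = 0
    for c in range(m):
        pr = next((i for i in range(r, len(A)) if A[i][c]), None)
        if pr is None:
            continue
        A[r], A[pr] = A[pr], A[r]
        inv = pow(A[r][c], prime - 2, prime)  # Fermat inverse
        A[r] = [v * inv % prime for v in A[r]]
        for i in range(len(A)):
            if i != r and A[i][c]:
                f = A[i][c]
                A[i] = [(v - f * w) % prime for v, w in zip(A[i], A[r])]
        pivots[c] = r
        r += 1
        if r == len(A):
            break
    free = next(c for c in range(m) if c not in pivots)  # exists since m > rank
    coeff = [0] * m
    coeff[free] = 1
    for c, row in pivots.items():
        coeff[c] = -A[row][free] % prime
    return dict(zip(monos, coeff))  # {(i, j): c_ij}

def ev(Q, x, y, prime):
    return sum(c * pow(x, i, prime) * pow(y, j, prime)
               for (i, j), c in Q.items()) % prime
```

Factoring the resulting Q (the [80s] results) then reveals the candidate polynomials, as on the previous slide.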

Slide 16: Conclusion
- More errors (than data!) can be dealt with ...
- More computational power leads to better error-correction.
- Theoretical challenge: list-decoding on the binary channel (with optimal (Shannon) rates).
- Important to clarify the right model.

Slide 17: Part II: Massive Data; Local Algorithms

Slide 18: Reliability vs. Size of Data
- Q: How reliably can one store data as the amount of data increases?
- [Shannon]: can store information at close to the "optimal" rate, with probability of decoding error dropping exponentially with the length of the data.
  - Surprising at the time?
- But: decoding time grows with the length of the data (exponential-time in Shannon's scheme; subsequently polynomial, even linear).
- Is the bad news necessary?

Slide 19: Sublinear-time algorithmics
- Algorithms don't always need to run in linear time (!), provided ...
  - they have random access to the input,
  - the output is short (relative to the input),
  - answers don't carry the usual, exact, guarantee!
- Applies, in particular, to:
  - Given a CD, "test" whether it has (too many) errors [Locally Testable Codes].
  - Given a CD, recover a particular block [Locally Decodable Codes].
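The locally-decodable idea can be illustrated with the classical Hadamard code (my choice of example): the codeword lists the inner product <m, x> mod 2 for every x in {0,1}^k, and any single message bit can be recovered from just two positions of the word, with no need to read the whole (huge) codeword.

```python
import random

# Local decoding sketch for the Hadamard code. A pair of queries at r and
# r xor e_i yields <m, r> + <m, r xor e_i> = m_i (mod 2); with few corruptions,
# a majority over random pairs recovers m_i with high probability.

def hadamard_encode(m):
    """Codeword entry at position x is <m, x> mod 2, for all x in {0,1}^k."""
    k = len(m)
    return [sum(b * ((x >> i) & 1) for i, b in enumerate(m)) % 2
            for x in range(2 ** k)]

def local_decode_bit(word, k, i, trials=31, rng=random):
    """Recover bit m_i by reading only 2*trials positions of the codeword."""
    votes = 0
    for _ in range(trials):
        r = rng.randrange(2 ** k)              # random position ...
        votes += word[r] ^ word[r ^ (1 << i)]  # ... paired with its shift by e_i
    return int(votes * 2 > trials)             # majority vote
```

Each trial touches only 2 of the 2^k codeword positions, which is the "local" part; the slide's point is that modern constructions get such locality with far better rate than Hadamard's exponential blowup.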

Slide 20: Progress
- Question raised in the context of results in complexity and privacy: probabilistically checkable proofs; private information retrieval.
- Summary: many non-trivial tradeoffs are possible.
  - Locality can be reduced to n^ε at O(1) penalty to rate, fairly easily.
  - Much better effects are possible with more intricate constructions.
  - [Ben-Sasson + S. '05, Dinur '06]: O(1)-local testing with poly(log n) penalty in rate.
  - [Yekhanin '07, Raghavendra '07, Efremenko '08]: 3-local decoding with subexponential penalty in rate.

Slide 21: Challenges ahead
- Technical challenges: linear-rate testability? Polynomial-rate decodability?
- Bigger challenge: What is the model for the future storage of information? How are we going to cope with the increasing drive to digital information?

Slide 22: Part III: The Meaning of Information

Slide 23: The Meaning of Bits
- (Figure: Alice sends the message "Freeze!" to Bob over a channel.)
- Is this perfect communication?
- What if Alice is trying to send instructions, in other words an algorithm? Does Bob understand the correct algorithm?
- What if Alice and Bob speak in different (programming) languages?

Slide 24: Motivation: Better Computing
- Networked computers use common languages:
  - interaction between computers (getting your computer onto the internet);
  - interaction between pieces of software;
  - interaction between software, data, and devices.
- Getting two computing environments to "talk" to each other is getting problematic: time-consuming, unreliable, insecure.
- Can we communicate more like humans do?

Slide 25: Some modelling
- Say Alice and Bob know different programming languages, and Alice wishes to send an algorithm A to Bob.
- Bad news: it can't be done. For every Bob, there exist algorithms A and A', and Alices, Alice and Alice', such that Alice sending A is indistinguishable (to Bob) from Alice' sending A'.
- Good news: it need not be done. From Bob's perspective, if A and A' are indistinguishable, then they are equally useful to him.
- Question: What should be communicated? Why?

Slide 26: Ongoing Work [Juba & S.]
- Assertion/assumption: communication happens when communicators have (explicit) goals.
- Goals:
  - (Remote) control: actuating some change in the environment. Examples: printing on a printer; buying from Amazon.
  - Intellectual: learn something from (about?) the environment. Example: this lecture (what's in it for you? for me?).

Slide 27: Example: Computational Goal
- Bob (a weak computer) communicates with Alice (a strong computer) to solve a hard problem.
- Alice is "helpful" if she can help some (weak) Bob' solve the problem.
- Theorem [Juba & S.]: Bob can use Alice's help to solve his problem iff the problem is verifiable (for every helpful Alice).
- "Misunderstanding" = "Mistrust".
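The role of verifiability can be illustrated with a toy scenario of my own making: Bob wants a nontrivial factorization of N, and he faces several Alices, some of whom misunderstand his request. Because the answer is cheap to verify (one multiplication), misunderstanding does him no harm.

```python
# Toy illustration of "misunderstanding = mistrust": Bob can safely use
# untrusted (or misunderstood) helpers only because his problem is verifiable.
# The helper functions below are hypothetical stand-ins for different Alices.

def bob_solve(N, alices):
    """Ask each Alice for a factorization of N; accept only verified answers."""
    for alice in alices:
        a, b = alice(N)
        if 1 < a and 1 < b and a * b == N:  # verification: one multiplication
            return a, b
    return None  # no Alice produced a verifiable answer

helpful = lambda N: next((d, N // d) for d in range(2, N) if N % d == 0)
confused = lambda N: (1, N)  # misunderstands the request; fails verification
```

For instance, `bob_solve(91, [confused, helpful])` returns a correct factorization even though the first Alice answered uselessly. For an unverifiable goal (e.g. "is this program a virus?"), Bob has no such filter, which is the slide's point.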

Slide 28: Example Problems
- Bob wishes to ...
  - ... solve an undecidable problem (virus detection): not verifiable, so he solves the problem incorrectly for some Alices; hence he does not learn her language.
  - ... break a cryptosystem: verifiable, so Bob can use her help; he must be learning her language!
  - ... sort integers: verifiable, so Bob does solve his problem; but the problem is trivial, so he might still not be learning her language.

Slide 29: Generalizing: Generic Goals
- Typical goals are wishful: Is Alice a human or a computer? Does she understand me? Will she listen to me (and do what I say)?
- Achievable goals are verifiable: Bob should be able to test achievement by looking at his input/output exchanges with Alice.
- Question: Which wishful goals are verifiable?

Slide 30: Concluding
- More, and more complex, errors can be dealt with, thanks to improved computational abilities.
- Need to build/study tradeoffs between global reliability and local computation.
- The meaning of information needs to be preserved!
- Need to merge computation and communication more tightly!

Slide 31: Thank You!