What should I talk about? Aspects of Human Communication


What should I talk about? Aspects of Human Communication
Madhu Sudan, Harvard University. Based on many joint works.
February 13, 2019, CINCS.

Ingredients in Human Communication
- Ability to start with (nearly) zero context and to "learn to communicate by communicating". Children learn to speak (not by taking courses). Focus of works with Juba and Goldreich: "What is possible?" (a la computability).
- Ability to leverage a large, uncertain context. E.g., this talk today assumes English, math, TCS, social information, geography. (Aside: what is "self-contained"?) "How to make communication efficient (using context)?" (a la complexity).

Context in Communication
Empirically and informally, context is:
- a huge piece of information (much larger than the "content" of the communication);
- not strictly needed for communication ...
- ... but it makes communication efficient when shared by the communicating players;
- helpful even if it is not shared perfectly.
Challenge: formalize this. Work so far covers some (toy?) settings.

Underlying Model: Communication Complexity
The model (with shared randomness $R$): Alice holds $x$, Bob holds $y$, and they want to compute $f(x,y)$ for a given $f : (x,y) \mapsto \Sigma$. $CC(f)$ = number of bits exchanged by the best protocol that outputs $f(x,y)$ with probability 2/3. Communication complexity is usually studied for lower bounds; this talk uses it as a positive model.

Aside: Easy CC Problems [Ghazi, Kamath, S.'15]
Do there exist problems with large inputs and small communication?
- Equality testing: $EQ(x,y)=1 \Leftrightarrow x=y$; $CC(EQ)=O(1)$. Protocol: fix an error-correcting code $E:\{0,1\}^n \to \{0,1\}^N$; use shared randomness to pick $i \leftarrow [N]$; exchange $E(x)_i, E(y)_i$; accept iff $E(x)_i = E(y)_i$.
- Hamming distance: $H_k(x,y)=1 \Leftrightarrow \Delta(x,y)\le k$; $CC(H_k)=O(k\log k)$ [Huang et al.].
- Small set intersection: $\cap_k(x,y)=1 \Leftrightarrow \mathrm{wt}(x),\mathrm{wt}(y)\le k$ and $\exists i$ s.t. $x_i=y_i=1$; $CC(\cap_k)=O(k)$ [Håstad, Wigderson]. Protocol: use common randomness to hash $[n]\to[k^2]$; cost $\mathrm{poly}(k)$.
- Gap (real) inner product: $x,y\in\mathbb{R}^n$ with $\|x\|_2=\|y\|_2=1$, where $\langle x,y\rangle \triangleq \sum_i x_i y_i$; $GIP_{c,s}(x,y)=1$ if $\langle x,y\rangle\ge c$, $=0$ if $\langle x,y\rangle\le s$; $CC(GIP_{c,s})=O(1/(c-s)^2)$ [Alon, Matias, Szegedy]. Main insight: if $G\leftarrow N(0,1)^n$, then $\mathbb{E}[\langle G,x\rangle\cdot\langle G,y\rangle]=\langle x,y\rangle$.
Unstated philosophical contribution of CC a la Yao: communication with a focus ("only need to determine $f(x,y)$") can be more effective, i.e., shorter than $|x|$, $H(x)$, $H(y)$, $I(x;y)$, ...
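A minimal Python sketch of the equality protocol above, with the error-correcting code instantiated as the Hadamard code, so that sampling a coordinate of $E(x)$ is just a random parity of $x$. The function name, the seed, and the choice of 20 repetitions are illustrative assumptions, not from the slides:

    import random

    def equality_protocol(x: str, y: str, reps: int = 20) -> bool:
        """Shared-randomness equality test on binary strings. Each round,
        both parties compute the same random parity of their input (one
        coordinate of the Hadamard encoding) and exchange one bit each.
        If x != y, a uniformly random parity distinguishes them with
        probability 1/2, so the error probability is at most 2**-reps."""
        assert len(x) == len(y)
        shared = random.Random(2019)   # stands in for the shared randomness R
        for _ in range(reps):
            r = [shared.randint(0, 1) for _ in range(len(x))]
            alice_bit = sum(int(c) & b for c, b in zip(x, r)) % 2
            bob_bit = sum(int(c) & b for c, b in zip(y, r)) % 2
            if alice_bit != bob_bit:
                return False   # parities differ, so certainly x != y
        return True            # equal, except with probability 2**-reps

For example, equality_protocol("0110", "0111") returns False except with probability $2^{-20}$, after exchanging only $O(1)$ bits per round regardless of $n$.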

Communication & (Uncertain) Context
As before, Alice holds $x$, Bob holds $y$, and Bob should output $f(x,y)$ with probability 2/3 for $f:(x,y)\mapsto\Sigma$. But now the context is held separately: Alice's context is $(f_A, R_A, P^A_{XY})$ and Bob's is $(f_B, R_B, P^B_{XY})$, each a perturbation of an ideal shared context $(f, R, P_{XY})$.

1. Imperfectly Shared Randomness
The special case where only the randomness is perturbed: Alice's context is $R_A$, Bob's is $R_B$. Alice holds $x$, Bob holds $y$, and Bob should output $f(x,y)$ with probability 2/3.

Imperfectly Shared Randomness (ISR)
Model: $R_A \sim N_\rho(R_B)$ ($\rho$-correlated, iid on each coordinate).
- Thm [Bavarian, Gavinsky, Ito '15]: equality testing has $O(1)$ communication complexity with ISR.
- Thm [Canonne, Guruswami, Meka, S. '16]: if $f$ has CC $k$ with perfect randomness, then it has ISR-CC $O_\rho(2^k)$.
- Thm [CGMS]: this is tight (for promise problems).
- Complete problem: estimate $\langle x,y\rangle$ for $x,y\in\mathbb{R}^n$; an $\epsilon$-approximation needs $\Theta(\epsilon^{-2})$ communication.
- Hard problem: sparse inner product, where $x$ is non-zero on only an $\epsilon$-fraction of the coordinates.
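To make the model concrete, here is a Python sketch (all names and parameters are illustrative) of sampling a $\rho$-correlated pair $(R_A, R_B)$, and of the Gaussian-sketch estimator for $\langle x,y\rangle$ whose variance is what drives the $\Theta(\epsilon^{-2})$ bound for the complete problem:

    import numpy as np

    def rho_correlated_gaussians(n, rho, rng):
        """Coordinate-wise rho-correlated pair:
        R_B = rho * R_A + sqrt(1 - rho^2) * (independent fresh noise)."""
        r_a = rng.standard_normal(n)
        r_b = rho * r_a + np.sqrt(1.0 - rho**2) * rng.standard_normal(n)
        return r_a, r_b

    def gip_estimate(x, y, reps=10000, seed=0):
        """Perfect-shared-randomness estimator for <x,y> on unit vectors,
        using the slide's insight E[<G,x> * <G,y>] = <x,y> for
        G ~ N(0,1)^n. Standard error is O(1/sqrt(reps)), so an
        eps-approximation needs Theta(eps^-2) repetitions; each repetition
        can be quantized down to O(1) communicated bits."""
        rng = np.random.default_rng(seed)
        g = rng.standard_normal((reps, len(x)))
        return float(np.mean((g @ np.asarray(x)) * (g @ np.asarray(y))))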

2. Uncertain Functionality
Now only the functionality is perturbed: Alice's context is $(f_A, R, P_{XY})$, Bob's is $(f_B, R, P_{XY})$. Bob should output $f(x,y)$ with probability 2/3 over $(x,y)\sim P_{XY}$.

Definitions and results
Defining the problem is non-trivial: Alice and Bob may not "know" $f$, but the protocol might! Prevent this by considering an entire class $\mathcal{G} = \{(f_A, f_B)\}$ of admissible function pairs; complexity = complexity of $\mathcal{G}$!
- Theorem [Ghazi, Komargodski, Kothari, S. '16]: if $P_{XY}$ is arbitrary, then there exists $\mathcal{G}$ such that every $f\in\mathcal{G}$ has cc $=1$, but uncertain-cc$(\mathcal{G}) = \Omega(\sqrt{n})$. If $P_{XY}$ is uniform and every $f\in\mathcal{G}$ has one-way cc $k$, then uncertain-cc$(\mathcal{G}) = O(k)$.
- Theorem [Ghazi, S. '18]: the above needs perfectly shared randomness.

3. Compression & (Uncertain) Context
Here the context is a prior on the input: Alice's context is $P_X^A$, Bob's is $P_X^B$, both perturbations of a shared prior $P_X$. Alice holds $x$, and Bob should output $x$ itself (with probability 2/3).

3. (Uncertain) Compression
- Without context: CC $= \log|\Omega|$ (where $x\in\Omega$).
- With shared context: expected-CC $= H(P_X)$.
- With imperfectly shared context, but with shared randomness: expected-CC $= H(P_X^A) + \Theta(\Delta)$, where $\Delta = \max_x \max\{\log\frac{P_A(x)}{P_B(x)}, \log\frac{P_B(x)}{P_A(x)}\}$ [JKKS'11].
- Without shared randomness: exact status unknown! Best upper bound [Haramaty, S.'14]: expected-CC $= O(H(P_X^A) + \Delta + \log\log|\Omega|)$.
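A toy Python sketch of the [JKKS'11]-style idea behind the $H(P_X^A)+\Theta(\Delta)$ bound: hash $x$ using shared randomness, send a prefix whose length depends on the sender's prior plus a $\Delta$-dependent slack, and let the receiver output the most likely consistent message under its own prior. SHA-256 stands in for shared randomness, and the length constants are assumptions for illustration, not the paper's:

    import hashlib
    import math

    def shared_hash(x, length):
        """Prefix of a hash of x; stands in for shared random bits indexed by x."""
        digest = hashlib.sha256(x.encode()).digest()
        bits = ''.join(f'{byte:08b}' for byte in digest)
        return bits[:length]

    def encode(x, p_a, delta):
        """Sender: message length grows with log(1/P_A(x)) plus slack in Delta
        (constants here are illustrative, not the exact [JKKS'11] ones)."""
        length = math.ceil(math.log2(1.0 / p_a[x]) + 2 * delta) + 2
        return shared_hash(x, length)

    def decode(msg, p_b):
        """Receiver: most likely candidate under P_B whose hash prefix matches."""
        matches = [x for x in p_b if shared_hash(x, len(msg)) == msg]
        return max(matches, key=lambda x: p_b[x]) if matches else None

The point of the $\Delta$ slack: a wrong candidate $x'$ must both collide under the hash and beat $x$ under $P_B$, and the extra bits make the collision event unlikely even though sender and receiver rank candidates by different priors.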

Compression as a proxy for language
An information-theoretic study of language? The goal of language: an effective means of expressing information/action. An implicit objective of language: make frequent messages short. Compression! Is frequency known globally, or learned locally? If the latter, everyone cannot possibly agree on it; yet they need to agree on the language (mostly)! This is similar to the problem of uncertain compression. Studied formally in [Ghazi, Haramaty, Kamath, S. ITCS '17].

Part II: Proofs

Well understood (Goldwasser-Micali-Rackoff, ...):
- Interactive Proofs
- Zero Knowledge Proofs
- Multi-Prover Interactive Proofs
- PCPs
- Interactive Proofs for Muggles
- Pseudodeterministic Proofs
... nevertheless, some challenges remain in understanding the communication of proofs.

Standard Assumption
The prover sends a theorem $T$ and a proof $\Pi$ over a small (constant) number of axioms $A$: e.g., transitivity ($X\to Y,\ Y\to Z \Rightarrow X\to Z$), Peano, etc. A medium-sized theorem: $\forall x,y,z,n\in\mathbb{N}:\ x^n + y^n = z^n \Rightarrow n\le 2$. And a big proof: blah blah blah blah blah ... (pages and pages).

The truth
Mathematical proofs assume a large context. "By some estimates a proof that 2+2=4 in ZFC would require about 20000 steps ... so we will use a huge set of axioms to shorten our proofs – namely, everything from high-school mathematics." [Lehman, Leighton, Meyer – Notes for MIT 6.042]
Context (= a huge set of axioms) shortens proofs. But context is uncertain! What is "high school mathematics"?

Communicating ("speaking of") Proofs
Ingredients:
- Prover: axioms $A$, theorem $T$, proof $\Pi$. Communicates $(T,\Pi)$ (claim: $\Pi$ proves $A\to T$).
- Verifier: complete + sound: $\exists \Pi,\ V^A(T,\Pi)=1$ iff $A\to T$.
- Verifier efficient = $\mathrm{poly}(|T|,|\Pi|)$ time, with oracle access to $A$.
Axioms = context.
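The verifier definition above can be made concrete with a toy Python sketch. This is a hypothetical propositional system with only modus ponens, chosen for illustration; the slide's formal model is more general:

    def verify(theorem, proof, is_axiom):
        """Checks a proof line by line in time poly(|T|, |Pi|), with oracle
        access to the axiom set A via is_axiom. Each step is either
        ('axiom', s), checked by one oracle call, or ('mp', s, i, j),
        deriving s by modus ponens from earlier lines i ("P") and j ("P->Q")."""
        derived = []
        for step in proof:
            if step[0] == 'axiom':
                if not is_axiom(step[1]):
                    return False
                derived.append(step[1])
            elif step[0] == 'mp':
                s, i, j = step[1:]
                if derived[i] + '->' + s != derived[j]:
                    return False
                derived.append(s)
            else:
                return False
        return bool(derived) and derived[-1] == theorem

    # Example: axioms A = {X, X->Y, Y->Z}; prove Z.
    axioms = {'X', 'X->Y', 'Y->Z'}
    proof = [('axiom', 'X'), ('axiom', 'X->Y'), ('mp', 'Y', 0, 1),
             ('axiom', 'Y->Z'), ('mp', 'Z', 2, 3)]
    assert verify('Z', proof, axioms.__contains__)

Note that the verifier never reads all of $A$; it only queries the axioms the proof names, which is exactly the oracle-access efficiency the slide asks for.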

Uncertainty of Context?
- Prover: works with axioms $A_P$, generates $\Pi$ s.t. $V^{A_P}(T,\Pi)=1$.
- Verifier: works with axioms $A_V$, checks $V^{A_V}(T,\Pi)=1$.
Robust prover/verifier? We need a measure $\delta(A_P, A_V)$ (not symmetric). Given $\delta$, the prover should be able to generate $\Pi_\delta$ such that $V^{A_V}(T,\Pi_\delta)=1$ for every $A_V$ with $\delta(A_P, A_V)\le\delta$, with $\Pi_\delta$ not much larger than $\Pi = \Pi_0$. The measure $\delta(\cdot,\cdot)$ should be "reasonable": e.g., if $X\to Y, Y\to Z \in A$ and $A' = A\cup\{X\to Z\}$, then $\delta(A',A)$ is tiny.
Open: does such an efficient verifier exist?

Beyond oracle access: Modelling the mind
The brain (of the verifier) does not just store and retrieve axioms; it can make logical deductions too! But it should do so feasibly. Semi-formally: let $A_t$ denote the set of axioms "known" to the verifier given $t$ units of query-processing time. We want $|A_{2t}| \gg |A_t|$, but storage space $\sim |A_0|$. What is a computational model of the brain that allows this? Cell-probe: no*. ConsciousTM: maybe. Etc.

Conclusions
- We have a very poor understanding of the "communication of proofs" as we practice it.
- We have to rely on the verifier's "knowledge", but cannot expect to know it exactly.
- This exposes holes in our computational understanding of knowledge-processing: can we "verify" more given more time, or are we just memorizing?

Thank You!