
1

2 Introduction
In this lecture we'll cover:
– Definition of strings as functions and vice versa
– Error correcting codes
– Low degree polynomials
– Low degree extension
– Consistent readers
– Consistency tests

3 Strings & Functions
Let $\sigma = \sigma_1 \sigma_2 \ldots \sigma_n$, where $\sigma_i \in \Sigma$. We can describe the string $\sigma$ as a function $\sigma : [1 \ldots n] \to \Sigma$, such that $\forall i$, $\sigma(i) = \sigma_i$.
Let $f$ be a function $f : D \to R$. Then $f$ can be described as a string in $R^{|D|}$, i.e. a string of length $|D|$ over the alphabet $\Sigma = R$, spelling out $f$'s value on each point of $D$.

4 Strings & Functions - Example
For example, let $f$ be the function $f : \mathbb{Z}_5 \to \mathbb{Z}_5$, $f(x) = x^2$, and let $\Sigma = \mathbb{Z}_5$. Then $\sigma = 0, 1, 4, 4, 1$.
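As a quick illustration of this correspondence, here is a minimal Python sketch; the helper names func_to_string and string_to_func are ours, not from the lecture:

```python
# View a function as a string (its table of values) and a string as a function.
def func_to_string(f, domain):
    """Spell out f's value on each point of its domain, in order."""
    return [f(x) for x in domain]

def string_to_func(sigma):
    """View the string sigma as the function i -> sigma[i] (0-indexed here)."""
    return lambda i: sigma[i]

p = 5
f = lambda x: (x * x) % p              # f(x) = x^2 over Z_5

sigma = func_to_string(f, range(p))
print(sigma)                           # [0, 1, 4, 4, 1], as on the slide

g = string_to_func(sigma)
assert all(g(i) == f(i) for i in range(p))
```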

5 Error Correcting Codes
Definition (encoding): An encoding $E$ is a function $E : \Sigma^n \to \Sigma^m$, where $m \gg n$.
Definition ($\varepsilon$-code): An encoding $E$ is an $\varepsilon$-code if for every $\sigma \neq \tau \in \Sigma^n$, $\Delta(E(\sigma), E(\tau)) \geq 1 - \varepsilon$, where $\Delta(x, y)$ denotes the fraction of entries on which $x$ and $y$ differ.
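The distance measure $\Delta$ is just the fractional Hamming distance; a small sketch (the function name frac_distance is our own):

```python
def frac_distance(x, y):
    """Delta(x, y): the fraction of entries on which x and y differ."""
    assert len(x) == len(y)
    return sum(a != b for a, b in zip(x, y)) / len(x)

# The two codewords of the example a few slides ahead differ on 4 of 5 entries:
print(frac_distance([1, 2, 3, 4, 0], [3, 1, 4, 2, 0]))   # 0.8
```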

6 A Generic $\varepsilon$-code
Set $F$ to be the finite field $\mathbb{Z}_p$ for some prime $p$, and assume for simplicity that $\Sigma = F$ and $m = p$. Given $\sigma \in \Sigma^n$, let $E(\sigma)$ be the string of the function $f_\sigma : F \to F$ that satisfies: $f_\sigma$ is the unique degree-$(n-1)$ polynomial such that $f_\sigma(i - 1) = \sigma_i$ for all $1 \leq i \leq n$.

7 A Generic $\varepsilon$-code (2)
$E(\sigma)$ can be interpolated from any $n$ of its points. Hence, for any $\sigma \neq \tau$, $E(\sigma)$ and $E(\tau)$ may agree on at most $n - 1$ points. Therefore, $E$ is an $((n-1)/m)$-code.

8 A Generic $\varepsilon$-code - Example
$p = m = 5$, $n = 2$.
$\sigma = 1, 2$ and $\tau = 3, 1$.
$f_\sigma(x) = x + 1$ and $f_\tau(x) = 3x + 3$.
$E(\sigma) = 1, 2, 3, 4, 0$ and $E(\tau) = 3, 1, 4, 2, 0$.
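A hedged sketch reproducing this example end to end: Lagrange interpolation over $\mathbb{Z}_p$ gives $f_\sigma$, and evaluating it on all of $F$ gives the codeword. The names interpolate and encode are ours; the modular inverse via pow needs Python 3.8+.

```python
def interpolate(points, p):
    """The unique polynomial of degree < len(points) through the given
    (x, y) pairs, returned as an evaluation function; arithmetic is mod p."""
    def f(x):
        total = 0
        for i, (xi, yi) in enumerate(points):
            term = yi
            for j, (xj, _) in enumerate(points):
                if i != j:
                    term = term * (x - xj) * pow(xi - xj, -1, p) % p
            total = (total + term) % p
        return total
    return f

def encode(sigma, p):
    """E(sigma): evaluate f_sigma (with f_sigma(i-1) = sigma_i) on all of Z_p."""
    f_sigma = interpolate(list(enumerate(sigma)), p)
    return [f_sigma(x) for x in range(p)]

p = 5
E_sigma = encode([1, 2], p)    # f_sigma(x) = x + 1
E_tau   = encode([3, 1], p)    # f_tau(x)   = 3x + 3
print(E_sigma, E_tau)          # [1, 2, 3, 4, 0] [3, 1, 4, 2, 0]

# Distinct codewords agree on at most n - 1 = 1 point:
assert sum(a == b for a, b in zip(E_sigma, E_tau)) <= 1
```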

9 Polynomials Over Finite Fields
Let $V = F^d$ be a geometric space of dimension $d$. Let $p$ be a polynomial $p : V \to F$ of degree $h$ in each variable. The total degree of $p$ is at most $h \cdot d$.

10 Properties of Low Degree Polynomials
The value of $p$ on any point of $V$ can be interpolated from many (almost all) sets of $(h+1)^d$ points of $V$ (this holds whenever the corresponding evaluation constraints are linearly independent, which is the case for almost all such sets). If $p$ is of degree greater than $h \cdot d$, then interpolation of a single point $x \in V$ from different sets of $(h+1)^d$ points may give different values (be inconsistent).

11 Properties of Low Degree Polynomials (2)
Two distinct low degree polynomials, $p_1$ and $p_2$, can agree on at most a very small ($\varepsilon$) fraction of $V$, i.e. $\Delta(p_1, p_2) \geq 1 - \varepsilon$.
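A small numerical check of this property, under our own toy parameters (Schwartz–Zippel bounds the agreement of two distinct polynomials by total degree / $|F|$, here $4/11$):

```python
import itertools, math, random

p, d, h = 11, 2, 2    # F = Z_11, d = 2 variables, degree <= h = 2 in each variable

def random_poly():
    """A random polynomial of degree <= h in each of the d variables, mod p."""
    coeffs = {e: random.randrange(p)
              for e in itertools.product(range(h + 1), repeat=d)}
    return lambda x: sum(c * math.prod(xi ** ei for xi, ei in zip(x, e))
                         for e, c in coeffs.items()) % p

# Two independent random polynomials are distinct with overwhelming probability.
p1, p2 = random_poly(), random_poly()
points = list(itertools.product(range(p), repeat=d))
agreement = sum(p1(x) == p2(x) for x in points) / len(points)
print(agreement)      # almost surely well below h*d/p = 4/11
```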

12 Properties of Low Degree Polynomials (3)
The restriction of a polynomial $p$ of degree $h$ in each variable to an affine sub-space of dimension $d'$ is a polynomial in $d'$ variables of total degree at most that of $p$ (and of degree at most $h$ in each variable when the sub-space is axis-parallel).
The restriction of $p$ to a curve $\gamma : F \to V$ of degree $k$, namely $p(\gamma(t))$, is a univariate polynomial of degree at most $\deg(p) \cdot k \leq h \cdot d \cdot k$.

13 Low Degree Extension (LDE)
Definition (low degree extension): Assume $H \subseteq F$, such that $|H| \ll |F|$, and let $d = \log_{|H|} n$. A string $\sigma \in \Sigma^n$ (with $\Sigma \subseteq F$) is describable as a function $\sigma : H^d \to \Sigma$, by identifying $[1 \ldots n]$ with $H^d$ (note $|H|^d = n$). $LDE(\sigma)$ is a function $LDE(\sigma) : F^d \to F$ such that:
– $LDE(\sigma)$ agrees with $\sigma$ on $H^d$ (extension).
– $LDE(\sigma)$ is of degree $|H| - 1$ in each variable (low degree).
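A minimal LDE sketch for tiny, assumed parameters ($H = \{0, 1\} \subset F = \mathbb{Z}_5$, $d = 2$, so $n = |H|^d = 4$); it uses the standard product-of-Lagrange-basis formula, and the name lde is our own:

```python
import itertools, math

p = 5               # F = Z_5
H = [0, 1]          # H, a small subset of F
d = 2               # dimension, so strings have length |H|^d = 4

def lde(sigma):
    """The extension of sigma : H^d -> F to F^d, of degree |H|-1 in each variable."""
    table = dict(zip(itertools.product(H, repeat=d), sigma))

    def basis(a, x):
        # Univariate Lagrange basis over H: 1 at a, 0 elsewhere on H, degree |H|-1.
        out = 1
        for b in H:
            if b != a:
                out = out * (x - b) * pow(a - b, -1, p) % p
        return out

    return lambda x: sum(v * math.prod(basis(ai, xi) for ai, xi in zip(a, x))
                         for a, v in table.items()) % p

sigma = [3, 1, 4, 2]                       # a string of length 4, indexed by H^d
f = lde(sigma)
# Extension: f agrees with sigma on H^d ...
assert [f(a) for a in itertools.product(H, repeat=d)] == sigma
# ... and is defined (with low degree) on all of F^d:
print(f((2, 4)))
```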

14 Consistent Reading
We need to be able to read a value of an LDE in a globally consistent manner. That is, have a representation scheme (a set of variables) for $LDE(\sigma)$, and a reading procedure that, for any $x \in V$, accesses a very small number of representation variables and:
– Rejects if an inconsistency is detected.
– Otherwise, returns with high probability a value for $LDE(\sigma)(x)$, consistent among all points $x$.

15 Consistency
A simple procedure: interpolate the value from a set of $(h+1)^d$ points. This, however, requires accessing $(h+1)^d$ points, while we would like a consistent reader that accesses only a very small (preferably poly-log $n$, ultimately constant) number of variables.

16 Consistency Test
A consistency test requires only that the value for a random $x \in V$ correspond to a global degree-$h$ polynomial. Notice that this is a weaker requirement than the one stated in the consistent-reader definition, since it allows some $x$'s to be accepted with high probability even though they do not agree with the global consensus, as long as the probability of this over a random $x$ remains low.

17 Consistency Test (2)
Fix a representation: a set of variables. A test is a family of Boolean functions over the representation variables, each depending on only a small set of variables (hence referred to as a local test), which may:
– Reject if an inconsistency is detected.
– Accept with high probability values that conform to global consistency.
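For concreteness, here is one toy family of local tests one could define over the point-table representation of a function $F^d \to F$ (our own illustration, not necessarily the test used later in the course): each local test reads $h + 2$ points on an axis-parallel line, interpolates a degree-$h$ univariate polynomial from $h + 1$ of them, and checks it against the remaining point, so it depends on only $h + 2$ variables.

```python
import random

def line_test(table, p, d, h):
    """One local test over the point table `table` (mapping each point of
    (Z_p)^d to its assigned value): read h+2 points on a random axis-parallel
    line and check that they fit a single degree-h univariate polynomial."""
    axis = random.randrange(d)
    base = [random.randrange(p) for _ in range(d)]
    ts = random.sample(range(p), h + 2)          # h+2 distinct positions on the line

    def point(t):
        q = list(base)
        q[axis] = t
        return tuple(q)

    readings = [(t, table[point(t)]) for t in ts]

    def eval_interp(x):
        # Lagrange-interpolate from the first h+1 readings, mod p.
        total = 0
        for i, (xi, yi) in enumerate(readings[:h + 1]):
            term = yi
            for j, (xj, _) in enumerate(readings[:h + 1]):
                if i != j:
                    term = term * (x - xj) * pow(xi - xj, -1, p) % p
            total = (total + term) % p
        return total

    t_last, y_last = readings[-1]
    return eval_interp(t_last) == y_last         # True = accept, False = reject
```

An assignment that really is a single degree-$h$ polynomial passes every such test; the interesting question, taken up in the next slides, is what acceptance with high probability tells us.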

18 Consistency Test (3)
[Illustration: the point-table representation — one variable $P(x, y, z)$ for every point of $F^3$ (here $|F| = 7$) — shown alongside a copy of the table in which a few entries have been replaced by arbitrary values.]

19 Global Consistency
Definition (pure global consistency): Pure global consistency would be for all $x \in V$ to be assigned values consistent with a single low degree polynomial. This cannot be detected by a local test, since changing the values at a small fraction of the points will be detected only with low probability.

20 Corresponding Game
The prover assigns values to all variables in the representation. The verifier picks a single local test at random and accepts or rejects according to its output. The error probability of a test is the fraction of local tests that may accept even though the assigned values do not conform to global consistency.
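Continuing the toy sketch given after slide 17 (same hypothetical line_test and point table), the game and the error probability can be phrased roughly as:

```python
def verifier(table, p, d, h):
    """The verifier's single move: run one uniformly random local test."""
    return line_test(table, p, d, h)

def estimate_acceptance(table, p, d, h, trials=10_000):
    """Monte Carlo estimate of the fraction of local tests that accept the
    prover's assignment `table`; for an assignment that does not conform to
    global consistency, this fraction is at most the test's error probability."""
    return sum(verifier(table, p, d, h) for _ in range(trials)) / trials
```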