Lecture 3. Symmetry of Information

Lecture 3. Symmetry of Information In Shannon information theory, the symmetry of information is well known. The same phenomenon also holds in Kolmogorov complexity, although both the proof and the meaning are quite different. The quantity C(x) − C(x|y) is known as "the information y knows about x". That this information is symmetric was first proved by Kolmogorov and Levin (in Zvonkin and Levin, Russian Math. Surveys, 1970). Today we prove the important Symmetry of Information Theorem: Theorem. C(x) − C(x|y) = C(y) − C(y|x), up to an additive logarithmic term.

Symmetry in our lives: biological symmetry, crystals, Greek architecture, M.C. Escher's art.

Symmetry of Information Theorem Theorem. C(x) − C(x|y) = C(y) − C(y|x), up to an additive logarithmic term. Proof. Essentially we will prove: C(x,y) = C(x) + C(y|x) + O(log C(x,y)). Since C(x,y) = C(y,x) up to O(1), the same equality with x and y exchanged gives C(x,y) = C(y) + C(x|y) + O(log C(x,y)), and the theorem follows by subtracting. (≤) It is easy to see that C(x,y) ≤ C(x) + C(y|x) + O(log C(x,y)) is true. (≥) We now need to prove C(x,y) ≥ C(x) + C(y|x) − O(log C(x,y)).
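The easy (≤) direction can be spelled out as follows (a sketch added for completeness; ℓ(p) denotes the length of program p):

```latex
\[
C(x,y) \;\le\; C(x) + C(y\mid x) + 2\log C(x) + O(1).
\]
% Sketch: let $p$ be a shortest program printing $x$, so $\ell(p)=C(x)$,
% and let $q$ be a shortest program printing $y$ when given $x$, so
% $\ell(q)=C(y\mid x)$. From the string $\bar{p}\,q$, where $\bar{p}$ is a
% self-delimiting encoding of $p$ costing $\ell(p)+2\log\ell(p)+O(1)$ bits,
% a universal machine can split off $p$, run it to obtain $x$, and then run
% $q$ on input $x$ to obtain $y$. Hence
% $C(x,y) \le \ell(p) + 2\log\ell(p) + \ell(q) + O(1)$,
% and $\log C(x) \le \log C(x,y) + O(1)$.
```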

The idea. To show: C(x,y) ≥ C(x) + C(y|x) − O(log C(x,y)). Consider the set A = {(u,z) : C(u,z) ≤ C(x,y)}; it has at most 2^{C(x,y)+O(1)} elements. For each string u, let A_u = {z : C(u,z) ≤ C(x,y)}; the slices A_x, A_x′, A_x″, … partition A. The slice A_x must have more than 2^e elements, where e ≈ C(x,y) − C(x) + c log C(x,y), since otherwise C(y|x) would be too small. But there cannot be more than |A| / 2^e such large slices, so x sits in a short enumerable list and C(x) would be too small.

Proving: C(x,y) ≥ C(x) + C(y|x) − O(log C(x,y)).

Assume to the contrary: for each constant c, there are x and y such that

C(x,y) < C(x) + C(y|x) − c log C(x,y).  (1)

Let A = {(u,z) : C(u,z) ≤ C(x,y)}. Given C(x,y), the set A can be recursively enumerated. Let A_x = {z : C(x,z) ≤ C(x,y)}. Given C(x,y) and x, we have a simple algorithm to recursively enumerate A_x. One can describe y, given x, by its index in the enumeration of A_x together with C(x,y). Hence

C(y|x) ≤ log|A_x| + 2 log C(x,y) + O(1).  (2)

By (1) and (2), for each c there are x, y such that |A_x| > 2^e, where e = C(x,y) − C(x) + (c−2) log C(x,y) − O(1).

But now we obtain a too-short description of x, as follows. Given C(x,y) and e, we can recursively enumerate the strings u which are candidates for x, namely those satisfying

|A_u| > 2^e, where A_u = {z : C(u,z) ≤ C(x,y)}.  (3)

Denote the set of such u by U. Clearly x ∈ U. Also

{(u,z) : u ∈ U and z ∈ A_u} ⊆ A.  (4)

The number of elements in A cannot exceed the available number of programs that are short enough to satisfy its definition:

|A| ≤ 2^{C(x,y)+O(1)}.  (5)

Since the slices A_u with u ∈ U are disjoint and each contains more than 2^e elements, (3), (4) and (5) give |U| < |A| / 2^e ≤ 2^{C(x,y)+O(1)−e}. Hence we can reconstruct x from C(x,y), e, and the index of x in the enumeration of U. Therefore

C(x) ≤ 2 log C(x,y) + 2 log e + C(x,y) − e + O(1).

Substituting e = C(x,y) − C(x) + (c−2) log C(x,y) − O(1), and using log e ≤ log C(x,y) + O(1), yields C(x) ≤ C(x) − (c−6) log C(x,y) + O(1), i.e. C(x) < C(x) for large enough c: a contradiction. QED
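Neither C nor the slices A_u are computable, so the theorem cannot be checked directly. Still, a rough empirical analogue can be sketched by using a real compressor as a stand-in for C (a toy illustration only; `zlib`, the string lengths, and the 500-byte overlap below are our choices, not part of the lecture):

```python
import random
import zlib

def c(s: bytes) -> int:
    """Crude stand-in for Kolmogorov complexity C(s): zlib-compressed length."""
    return len(zlib.compress(s, 9))

rng = random.Random(0)
r = bytes(rng.getrandbits(8) for _ in range(1500))

x = r[:1000]                  # 1000 incompressible random bytes
y = r[:500] + r[1000:1500]    # shares its first 500 bytes with x

# Approximate the conditional complexity C(y|x) by c(x + y) - c(x):
# the extra cost of describing y once x is already written down.
info_y_in_x = c(y) - (c(x + y) - c(x))   # ~ C(y) - C(y|x)
info_x_in_y = c(x) - (c(y + x) - c(y))   # ~ C(x) - C(x|y)

# Both quantities should be close to the ~500 shared bytes,
# and close to each other, as symmetry of information predicts.
print(info_y_in_x, info_x_in_y)
```

The compressor finds the 500-byte overlap inside its 32 KB window, so both "mutual information" estimates come out near 500 bytes and nearly equal, mirroring the theorem up to compressor overhead.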

Kamae's Theorem. For each natural number m, there is a string x such that for all but finitely many strings y, C(x) − C(x|y) ≥ m. That is: there exist finite objects such that almost all finite objects contain a large amount of information about them.

Relation to Shannon information theory Shannon entropy: H(X) = − ∑ P(X=x) log P(X=x). Interpretation: H(X) bits are sufficient, on average, to describe the outcome of X. In Kolmogorov complexity, C(x) is the length of a minimum description (shortest program) for one fixed x. It is a beautiful fact that these two notions can be shown to converge: for x drawn from a computable distribution P, the expected value of C(x) equals H(X) up to an additive constant depending only on P. The symmetry of information is another example where the two concepts coincide.
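The convergence can be illustrated empirically, again with a real compressor standing in for C (a toy sketch; the Bernoulli parameter 0.1 and the sample size are our choices):

```python
import math
import random
import zlib

def entropy(probs):
    """Shannon entropy H(X) = -sum p * log2(p), in bits."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A biased coin: P(1) = 0.1, P(0) = 0.9.
h = entropy([0.9, 0.1])   # about 0.469 bits per symbol

# Compress a long sample; the per-symbol compressed size must lie between
# H(X) (the information-theoretic lower bound) and 8 bits (one raw byte
# per symbol), and a good compressor approaches H(X).
rng = random.Random(0)
sample = bytes(int(rng.random() < 0.1) for _ in range(100_000))
bits_per_symbol = 8 * len(zlib.compress(sample, 9)) / len(sample)

print(f"H(X) = {h:.3f} bits/symbol, zlib uses {bits_per_symbol:.3f}")
```

zlib is far from an optimal coder for this source, so it lands above H(X), but well below the 8 raw bits per symbol; a better model-based coder would get closer still.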