Lecture 3. Symmetry of Information

In Shannon information theory, the symmetry of information is well known. The same phenomenon holds in Kolmogorov complexity, although the proof, as well as the meaning, is totally different. The quantity C(x) − C(x|y) is known as "the information y knows about x". That this information is symmetric was first proved by Kolmogorov and Levin (Zvonkin and Levin, Russian Math. Surveys, 1970).

Theorem. C(x) − C(x|y) = C(y) − C(y|x), up to an additive logarithmic term.

Proof. Essentially we will prove: C(x,y) = C(y|x) + C(x) + O(log C(x,y)). (By the same argument with x and y interchanged, C(x,y) = C(x|y) + C(y) + O(log C(x,y)); subtracting the two equalities gives the theorem.)

(≤) It is easy to see that C(x,y) ≤ C(y|x) + C(x) + O(log C(x,y)): describe x by a shortest program, then describe y by a shortest program given x; the O(log C(x,y)) extra bits delimit the two parts.

(≥) We now need to prove C(x,y) ≥ C(y|x) + C(x) + O(log C(x,y)).
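Kolmogorov complexity is uncomputable, but the symmetry of information can be illustrated empirically by using a real compressor as a crude upper bound on C. This is a sketch, not part of the lecture: the names C and info are made up here, zlib stands in for a shortest program, and C(x|y) is approximated by C(yx) − C(y) (the extra cost of x once y is already described).

```python
import random
import zlib

def C(s: bytes) -> int:
    """Crude stand-in for Kolmogorov complexity: compressed length in bytes."""
    return len(zlib.compress(s, 9))

def info(x: bytes, y: bytes) -> int:
    """Approximate C(x) - C(x|y), estimating C(x|y) as C(yx) - C(y)."""
    return C(x) - (C(y + x) - C(y))

x = b"the quick brown fox jumps over the lazy dog " * 20
random.seed(1)
r = bytes(random.randrange(256) for _ in range(len(x)))  # incompressible noise

print("info(x, x) =", info(x, x))  # a string knows a lot about itself
print("info(x, r) =", info(x, r))  # random noise knows little about x
print("info(r, x) =", info(r, x))  # and, symmetrically, x knows little about r
```

With these approximations, info(x, x) comes out large while info(x, r) and info(r, x) are both near zero, in line with the theorem's prediction that the information is (approximately) symmetric.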

Proving: C(x,y) ≥ C(y|x) + C(x) + O(log C(x,y)).

Assume, to the contrary, that for each constant c there are x and y such that
C(x,y) < C(y|x) + C(x) − c log C(x,y). (1)

Let A = {(u,z) : C(u,z) ≤ C(x,y)}. Given C(x,y), the set A can be recursively enumerated. Let A_x = {z : C(x,z) ≤ C(x,y)}. Given C(x,y) and x, we have a simple algorithm to recursively enumerate A_x. Since y ∈ A_x, one can describe y, given x, by its index in the enumeration of A_x together with C(x,y). Hence
C(y|x) ≤ log |A_x| + 2 log C(x,y) + O(1). (2)

Combining (1) and (2): for each c, there are x, y such that |A_x| > 2^e, where e = C(x,y) − C(x) + (c−2) log C(x,y).

But now we obtain a too-short description of x, as follows. Given C(x,y) and e, we can recursively enumerate the strings u that are candidates for x, namely those satisfying
A_u = {z : C(u,z) ≤ C(x,y)} and |A_u| > 2^e. (3)
Denote the set of such u by U. Clearly, x ∈ U. Also
{(u,z) : u ∈ U and z ∈ A_u} ⊆ A. (4)
The number of elements of A cannot exceed the available number of programs that are short enough to satisfy its definition:
|A| ≤ 2^{C(x,y)+O(1)}. (5)
Using (3), (4), and (5),
|U| < |A| / 2^e ≤ 2^{C(x,y)+O(1)} / 2^e.
Hence we can reconstruct x from C(x,y), e, and the index of x in the enumeration of U. Therefore
C(x) ≤ 2 log C(x,y) + 2 log e + C(x,y) − e + O(1).
Substituting e = C(x,y) − C(x) + (c−2) log C(x,y) yields C(x) < C(x) for sufficiently large c, a contradiction. QED
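As a recap (not on the original slide), the bookkeeping of the counting argument can be laid out in one chain of inequalities. Write n = C(x,y) and e = n − C(x) + (c−2) log n; the exact constants in front of the log terms are illustrative, since they are absorbed into the O(log n) of the theorem.

```latex
\begin{align*}
C(y \mid x) &\le \log|A_x| + 2\log n + O(1)
  && \text{index of $y$ in the enumeration of $A_x$}\\
\log|A_x| &> e
  && \text{from the assumed violation of the theorem}\\
|U| &< |A| / 2^{e} \le 2^{\,n - e + O(1)}
  && \text{since $\{(u,z) : u \in U,\ z \in A_u\} \subseteq A$}\\
C(x) &\le (n - e) + 2\log n + 2\log e + O(1)
  && \text{index of $x$ in the enumeration of $U$}\\
  &\le C(x) + (6 - c)\log n + O(1)
  && \text{substituting $e$ and using $\log e \le \log n + O(1)$}
\end{align*}
```

For c > 6 the last line is contradictory once log n is large, which completes the (≥) direction.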

Kamae's Theorem. For each natural number m, there is a string x such that for all but finitely many strings y,
C(x) − C(x|y) ≥ m.
That is, there exist finite objects such that almost all finite objects contain a large amount of information about them.

Relations to Shannon information theory

Shannon entropy: H(X) = − ∑ P(X=x) log P(X=x). Interpretation: H(X) bits are sufficient, on average, to describe the outcome of X. C(x), in Kolmogorov complexity, is the length of a minimum description (shortest program) for one fixed x. It is a beautiful fact that these two notions can be shown to converge to the same thing: for a computable distribution, the expected value of C(x) equals H(X) up to an additive term that depends only on the distribution. The symmetry of information is another example where the two concepts coincide: in Shannon's theory, I(X;Y) = H(X) − H(X|Y) = H(Y) − H(Y|X) holds exactly, while in Kolmogorov complexity, C(x) − C(x|y) = C(y) − C(y|x) holds up to a logarithmic term.
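The convergence of the two notions can be glimpsed numerically. The sketch below (not from the lecture; shannon_entropy and the Bernoulli setup are my own illustration) computes H(X) for a biased coin and compares it with the bits per symbol that a generic compressor needs on a long sample, again using compressed length as a rough proxy for C. By the source coding theorem, no lossless compressor can beat H(X) on typical data, so the measured rate should sit somewhat above the entropy.

```python
import math
import random
import zlib

def shannon_entropy(p: float) -> float:
    """Entropy in bits of a Bernoulli(p) source."""
    if p in (0.0, 1.0):
        return 0.0
    return -(p * math.log2(p) + (1 - p) * math.log2(1 - p))

# Draw a long sample from a biased coin (one byte per symbol) and measure
# how many bits per symbol a generic compressor needs on it.
random.seed(0)
n, p = 100_000, 0.1
sample = bytes(1 if random.random() < p else 0 for _ in range(n))
bits_per_symbol = 8 * len(zlib.compress(sample, 9)) / n

print(f"H(X)          = {shannon_entropy(p):.3f} bits/symbol")
print(f"zlib achieves = {bits_per_symbol:.3f} bits/symbol")
```

The compressed rate exceeds H(X) ≈ 0.469, as it must; a compressor closer to optimal for this source (e.g. an arithmetic coder) would approach the entropy more tightly.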