Algorithmic Information Theory and the Emergence of Order: Entropy and Replication. Sean Devine, Victoria Management School.



The Universe and Order
This talk makes two points:
1. Replication is a major ordering process, like crystallisation; e.g. where dn/dt ~ n, the number of replicates grows.
2. Algorithmic entropy can be used to quantify order, including in systems with noise and variation.
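The first point can be illustrated with a minimal numerical sketch (my illustration, not from the talk): when the growth rate of replicates is proportional to their number, a simple Euler integration shows the population growing exponentially.

```python
# Minimal sketch (an illustration, not from the talk): when the growth
# rate of replicates is proportional to their number, dn/dt = r*n,
# the population grows exponentially -- the ordering process the slide
# compares to crystallisation.
def replicate_growth(n0: float, rate: float, dt: float, steps: int) -> float:
    n = n0
    for _ in range(steps):
        n += rate * n * dt  # Euler step for dn/dt = rate * n
    return n

# 1000 steps of size 0.001 with rate 1 approximates n0 * e ~ 2.718
print(replicate_growth(1.0, 1.0, 0.001, 1000))
```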

Algorithmic entropy (algorithmic complexity)
- Algorithmic entropy = the length of the shortest algorithm that generates the string defining a structure or configuration.
- Using a simple binary UTM, denoted by U: H_U(s) = minimum |p| such that U(p) = s.
- H_algo(s) ≤ H_U(s) + O(1).
- Algorithms are self-delimiting, so the Kraft inequality holds.

Relationship to other entropies
- For all strings in an equilibrium configuration, H_algo(s) = Shannon entropy (ignoring overheads).
- The algorithmic entropy of a string captures uncertainty: k_B ln 2 × H_algo(s) = the Boltzmann-Gibbs entropy.
- The measure is meaningful and consistent for off-equilibrium configurations.

Algorithmic Information Theory (AIT) and Entropy
- AIT was developed by Kolmogorov and Levin, and independently by Chaitin.
- These developments were not readily accessible to scientists.
- Zurek was the first to seriously apply AIT to physics.

An ordered string can be compressed
- s = "111…111", i.e. N 1's, can be generated by: p = PRINT "1" N times.
- H_algo(s) ~ log2 N + log2 log2 N (ignoring the PRINT statement for large N; the second term is the cost of self-delimiting algorithms).
- A disordered or random string is incompressible: for a random s = "…11" of length N, H_algo > the length of the string.
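The compressibility contrast on this slide can be demonstrated with an ordinary compressor. A real compressor only gives an upper bound on the algorithmic entropy (the decompressor plus its compressed input is one program that outputs s), but the ordered/random gap is already stark. This sketch, using Python's zlib as the stand-in compressor, is my illustration rather than anything from the talk.

```python
import os
import zlib

# The compressed length is an upper bound on algorithmic entropy:
# the decompressor plus its input is one program that prints s.
def compressed_len(s: bytes) -> int:
    return len(zlib.compress(s, level=9))

N = 100_000
ordered = b"1" * N        # "111...111": highly compressible
random_s = os.urandom(N)  # incompressible with overwhelming probability

print(compressed_len(ordered))   # a few hundred bytes for 100,000 input bytes
print(compressed_len(random_s))  # close to N: no real compression
```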

Order = low algorithmic entropy
- Order is rare: most strings are random.
- In general we cannot determine whether s is compressible (a consequence of Gödel and Turing).
- But if we perceive order, the string is compressible.

Common algorithmic instructions taken as given
- Entropy is a state function: only differences have meaning.
- Physical laws, machine dependence, and phase-space graining can be absorbed into the common instructions.
- p = xxxxxxxxxxxx:yyyyyyyyy…, i.e. the string p* plus physical laws etc.
- H_algo(s) can be taken to be |p*|.

Provisional Entropy
- Provisional entropy makes the measure meaningful for noisy descriptions.
- H_algo(Set) specifies the set of all possible noisy strings consistent with a pattern or model.
- Given the set, H_algo(string in set) specifies the particular string in the set.
- H_prov = H_algo(Set) + H_algo(string in set).
- "Provisional" because a hidden pattern might still exist.
- Cf. Kolmogorov's Algorithmic Minimum Sufficient Statistic.

Shannon and Provisional Entropy
- Shannon entropy: H_s = log2 N specifies which string occurred, but carries no information on context.
- Provisional entropy: an algorithm to define the context (i.e. the model or pattern), plus log2 N to specify which string in the set.
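As a concrete check on the Shannon side, the entropy of N equally likely outcomes reduces to exactly log2 N bits. The sketch below (mine, not the slide's) computes it directly from the definition H = Σ p log2(1/p).

```python
import math

# Shannon entropy H(X) = sum over outcomes of p * log2(1/p).
def shannon_entropy(probs):
    return sum(p * math.log2(1 / p) for p in probs if p > 0)

# For N equally likely strings the entropy is exactly log2 N bits:
# enough to say *which* string occurred, nothing about the pattern.
N = 1024
print(shannon_entropy([1 / N] * N))  # 10.0 == log2(1024)
```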

Algorithmic entropy and physical laws
- A real-world computation defines the trajectory of a system, e.g. a chemical reaction or DNA replication; such systems function like a UTM.
- H_algo ≤ |the system's internal algorithm|.
- Discarded information makes the process irreversible: the cost is k_B ln 2 per bit discarded, i.e. k_B T ln 2 joules (Landauer, Bennett).
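The k_B T ln 2 cost per discarded bit is easy to put numbers on. A small sketch of my own, using the SI value of Boltzmann's constant:

```python
import math

K_B = 1.380649e-23  # Boltzmann constant in J/K (exact SI value)

# Landauer's bound: erasing one bit at temperature T dissipates
# at least k_B * T * ln(2) joules of heat.
def landauer_cost_joules(bits: float, temperature_k: float) -> float:
    return bits * K_B * temperature_k * math.log(2)

# One bit at room temperature (300 K): about 2.87e-21 J.
print(landauer_cost_joules(1, 300.0))
# Even erasing a gigabyte (8e9 bits) at 300 K costs only ~2.3e-11 J.
print(landauer_cost_joules(8e9, 300.0))
```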

Algorithmic Entropy
- Defines the entropy of the actual configuration.
- Applies to non-equilibrium situations.
- Provides the thermodynamic cost of discarding entropy, e.g. the cost of recycling, or the cost of non-equilibrium existence.

A set of replicates has low algorithmic entropy
- p = "Repeat replicate N times".
- Example: a two-state atomic laser.
- i = 11111…111, representing excited atomic states; no photon states; momentum states are ignored as constant.
- H_algo(i) is low = ordered.

Free expansion trajectory
- f = atomic states + 11xy1… photon states (1 = coherent, x = incoherent).
- At equilibrium, photons are absorbed and emitted.

The computation
- H_algo(f) is more disordered.
- The atomic states need a longer description.
- The coherent photon states have a short description: "Repeat photon N times".
- The incoherent photon states are random: xyzxx…
- The process is like a free expansion, but replication compensates for the disordering.
- This would seem to be an underlying principle.

Generalisation, e.g. N spins
- i = xxxxx… (spins) (= low-temperature sink).
- H_prov(i) ~ N + |N|.
- f = (spins) xxxxx..xxx (the sink temperature rises).
- xxxxx..xxx is discarded as latent heat.
- H_prov(f) ~ |N|: the disorder of the sink states is no longer in the description.
- Irreversibility = ejecting disorder.

Provisional entropy measures variation in replicates
- s = "1111…1111": H_prov ~ log2(N/2) + |11| (i.e. = log2 N).
- S = "1y1y1y…1y1y1y", where 1y is a variation of 11: H_prov ~ N/2 + log2(N/2) + |0| + |1|, since there are 2^(N/2) members in the set.
- Entropy change = N/2 = the increase in uncertainty.
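The 2^(N/2) count behind the N/2-bit entropy change can be verified by brute force for a small N. In this sketch (my illustration; "y" simply marks the varied pair) the variant strings are enumerated directly:

```python
import math
from itertools import product

# Strings built from N/2 pairs, each pair either "11" or its
# variation "1y": the set has 2**(N//2) members, so picking out one
# member costs N/2 extra bits -- the entropy change on the slide.
N = 8
variants = {"".join(p) for p in product(["11", "1y"], repeat=N // 2)}

print(len(variants))             # 16 == 2**(N//2)
print(math.log2(len(variants)))  # 4.0 == N/2 extra bits of uncertainty
```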

System state-space trajectory
1. Initial growth of replicates.
2. At saturation, replicates die and are born; births = deaths of replicates.
3. When entropy is ejected, the system locks into an attractor-like region.
- If the system is not isolated, homeostasis requires regeneration of replicates.

Attractor-like behaviour off equilibrium
- Resource flows are needed to regenerate replicates, e.g. pumping the laser to replenish photons.
- Variation in replicates stabilises the system.
- External impacts restrict the size of the attractor-like region; its shape changes, and it may merge with another replicate set.

Coupling of replicators
- Replicators that pass resources (entropy) to each other are more likely, as they are more resource-efficient.
- They cost less to maintain off equilibrium.
- E.g. one laser system pumping another.

Nesting systems reduces algorithmic entropy
- A nested system orders at different scales; as described by nested algorithms, H_algo is low.
- But if the large-scale ordering is lost, the algorithmic entropy increases; at the smallest scale no order is observed.
- Cf. the algorithm that defines me with algorithms that see me as a pile of atoms.

Diameter complexity
- Reducing the scale suppresses order, i.e. requires a longer description.
- Variation increases entropy (the dotted line in the original figure), but nesting decreases entropy to compensate.
- D_org = H_max(x) − H_d0(x).
- Software variation is algorithmically more efficient when the scale is low.

Universe evolution and the 2nd law
- The universe starts in an initial state; its trajectory is determined by an algorithm:
  p = For step 0 to t: compute next state; next step.
- If the physical laws are simple, |p| ~ log2 t.
- Equilibrium is reached when log2 t' >> log2 t.
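The slide's program p can be sketched as an ordinary loop. With a simple fixed update rule standing in for the physical laws (my stand-in, not the talk's), the program text is constant apart from the step count t, which takes about log2 t bits to write down:

```python
# Sketch of the slide's program p: "for step 0 to t, compute next state".
# next_state is a stand-in for simple physical laws; the program length
# is then dominated by the ~log2(t) bits needed to encode t itself.
def next_state(state: int) -> int:
    return (3 * state + 1) % 2**16  # any simple fixed rule will do

def run_trajectory(initial: int, t: int) -> int:
    state = initial
    for _ in range(t):
        state = next_state(state)
    return state

print(run_trajectory(1, 1000))
print((1000).bit_length())  # encoding t = 1000 needs about 10 bits
```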

What does it all mean?
- We have a practical entropy measure: a tool for measuring change.
- It shows how replication counters entropy increase.
- Nested structures are highly ordered; nesting counters the entropy increase from variations and minimises the entropy cost of adaptation.
- Maybe replication maintains order even as the universe's trajectory is towards disorder.

References
- Kolmogorov, A. N. Three approaches to the quantitative definition of information. Problems of Information Transmission 1965, 1, 1-7.
- Zvonkin, A. K.; Levin, L. A. The complexity of finite objects and the development of the concepts of information and randomness by means of the theory of algorithms. Russian Mathematical Surveys 1970, 25.
- Chaitin, G. On the length of programs for computing finite binary sequences. Journal of the ACM 1966, 13.
- Zurek, W. H. Algorithmic randomness and physical entropy. Physical Review A 1989, 40.
- Bennett, C. H. Thermodynamics of computation: a review. International Journal of Theoretical Physics 1982, 21.
- Landauer, R. Irreversibility and heat generation in the computing process. IBM Journal of Research and Development 1961, 5.