
Slide 1: Algorithmic Information Theory and the Emergence of Order: Entropy and Replication
Sean Devine, Victoria Management School

Slide 2: The Universe and Order
This talk makes two points:
1. Replication is a major ordering process, like crystallisation. E.g. where dn/dt ~ n^x, replicates will grow.
2. Algorithmic entropy can be used to quantify order, including in systems with noise and variation.

Slide 3: Algorithmic entropy (algorithmic complexity)
Algorithmic entropy = the length of the shortest algorithm that generates the string defining a structure or configuration.
Using a simple binary universal Turing machine (UTM), denoted by U:
H_U(s) = minimum |p| such that U(p) = s
H_algo(s) ≤ H_U(s) + O(1)
Algorithms are self-delimiting, so the Kraft inequality holds.
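H_U(s) is uncomputable in general, but any concrete compressor yields an upper bound on it, since a compressed encoding is one particular program that regenerates s. A minimal Python sketch of this idea (the function name and the choice of zlib are illustrative, not part of the talk):

    import os
    import zlib

    def entropy_upper_bound_bits(s: bytes) -> int:
        # The length of any compressed encoding of s upper-bounds
        # H_U(s) + O(1): it is one program that regenerates s.
        return 8 * len(zlib.compress(s, 9))

    ordered = b"1" * 1000     # highly compressible
    noise = os.urandom(1000)  # incompressible with high probability
    print(entropy_upper_bound_bits(ordered))  # small: tens to hundreds of bits
    print(entropy_upper_bound_bits(noise))    # close to the raw 8000 bits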

Slide 4: Relationship to other entropies
For all strings in an equilibrium configuration, H_algo(s) = Shannon entropy (ignoring overheads).
The algorithmic entropy of a string captures uncertainty: k_B ln2 × H_algo(s) = the Boltzmann-Gibbs entropy.
It remains meaningful and consistent for off-equilibrium configurations.

Slide 5: Algorithmic Information Theory (AIT) and entropy
AIT was developed by Kolmogorov and Levin, and independently by Chaitin.
These developments are not readily accessible to scientists.
Zurek was the first to seriously apply AIT to physics.

Slide 6: An ordered string can be compressed
s = "111...111", i.e. N 1's, can be generated by p = PRINT "1" N times.
H_algo(s) ~ log2 N + log2 log2 N
- Ignoring the PRINT statement for large N.
- The second term is the cost of self-delimiting algorithms.
A disordered (random) string is incompressible:
s = "110111..1100...11" of length N
H_algo(s) > length of string
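For illustration, the description length of the ordered string grows only logarithmically with N. A small sketch of the arithmetic (the values of N are assumed; the constant PRINT cost is ignored as on the slide):

    from math import log2

    def ordered_bits(N: int) -> float:
        # log2 N bits to encode the repeat count N, plus ~log2 log2 N
        # bits to make the encoding of N self-delimiting.
        return log2(N) + log2(log2(N))

    for N in (10**3, 10**6, 10**9):
        print(N, ordered_bits(N))
    # Roughly 13.3, 24.2, and 34.8 bits respectively.
    # A typical random string of length N needs ~N bits instead.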

Slide 7: Order = low algorithmic entropy
Order is rare: most strings are random.
One cannot determine in general whether s is compressible, a consequence of Gödel and Turing.
But if we perceive order, the string is compressible.

Slide 8: Common algorithmic instructions taken as given
Entropy is a state function; only differences have meaning.
Physical laws, machine dependence, and phase-space graining can be absorbed into the common instructions:
p = xxxxxxxxxxxx : yyyyyyyyy...
i.e. the specific string p* plus the physical laws etc.
H_algo(s) can then be taken to be |p*|.

Slide 9: Provisional entropy
Provisional entropy makes algorithmic entropy meaningful for noisy descriptions.
H_algo(Set) specifies the set of all possible noisy strings consistent with a pattern or model.
Given the set, H_algo(string in set) specifies the particular string within the set.
H_prov = H_algo(Set) + H_algo(string in set)
"Provisional" because a hidden pattern might still exist.
Cf. Kolmogorov's Algorithmic Minimum Sufficient Statistic.
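As a worked toy example of this two-part code (the numbers are hypothetical, chosen only to show the arithmetic):

    from math import log2

    def provisional_entropy(model_bits: float, set_size: int) -> float:
        # Bits to specify the set (the pattern or model), plus
        # log2 |set| bits to pick out one string within it.
        return model_bits + log2(set_size)

    # Suppose the pattern costs 20 bits to describe, and 2**10 noisy
    # strings are consistent with it:
    print(provisional_entropy(20, 2**10))  # 30.0 bits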

Slide 10: Shannon and provisional entropy compared
- Shannon entropy: H_s = log2 N, with no information on context.
- Provisional entropy: an algorithm to define the context (i.e. the model or pattern), plus H(specifies which string) = log2 N.

Slide 11: Algorithmic entropy and physical laws
Real-world computations define the trajectory of a system, e.g. a chemical reaction or DNA replication. Such systems function like a UTM.
H_algo ≤ |system's internal algorithm|
Discarded information makes the computation irreversible:
- Cost k_B log_e 2 per bit discarded, i.e. k_B T log_e 2 joules (Landauer, Bennett).
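The Landauer cost quoted above is easy to evaluate numerically; a quick check at an assumed room temperature of 300 K:

    from math import log

    k_B = 1.380649e-23               # Boltzmann constant, J/K
    T = 300.0                        # assumed temperature, K
    cost_per_bit = k_B * T * log(2)  # k_B T ln 2 joules per discarded bit
    print(f"{cost_per_bit:.3e} J")   # about 2.871e-21 J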

Slide 12: Algorithmic entropy
- Defines the entropy of an actual configuration.
- Applies to non-equilibrium situations.
- Provides the thermodynamic cost of discarding entropy, e.g. the cost of recycling, or the cost of non-equilibrium existence.

Slide 13: A set of replicates has low algorithmic entropy
p = "Repeat replicate N times"
Example: a two-state atomic laser.
i = 11111...111, representing excited atomic states; there are no photon states.
Momentum states are ignored as constant.
H_algo(i) is low: the state is ordered.
(Figure: atomic and photon states of the laser.)

Slide 14: Free expansion trajectory
f = 1100110.. (atomic states) + 11xy1 (photon states), where 1 = coherent and x = incoherent.
At equilibrium, photons are absorbed and emitted.

Slide 15: The computation
H_algo(f) is higher: the final state is more disordered.
- The atomic states need a longer description.
- The coherent photon states have a short description: "Repeat photon N times".
- The incoherent photon states are random: xyzxx...
This is like a free expansion, but replication compensates for the disordering. This would seem to be an underlying principle.

Slide 16: Generalisation, e.g. N spins
i = xxxxx (random spins) + 0000000 (a low-temperature sink).
H_prov(i) ~ N + |N|
f = 111111 (ordered spins) + xxxxx..xxx (the sink, whose temperature rises).
The sink bits xxxxx..xxx are discarded as latent heat, so H_prov(f) ~ |N|.
The disorder of the sink states is no longer in the description: irreversibility = ejecting disorder.
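A sketch of the bookkeeping for this spin example (N is an assumed value, and |N| is approximated by log2 N, ignoring self-delimiting overhead):

    from math import log2

    N = 1024
    bits_N = log2(N)          # ~|N|, the cost of encoding the count N

    H_initial = N + bits_N    # N random spin bits + short "all zeros" sink
    H_final = bits_N          # ordered spins; the randomised sink is discarded
    print(H_initial, H_final) # 1034.0 10.0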

Slide 17: Provisional entropy measures variation in replicates
s = "1111...11111"
H_prov ~ log2(N/2) + |11| (i.e. ~ log2 N)
S = "1y1y1y...1y1y1y", where 1y is a variation of 11.
H_prov ~ N/2 + log2(N/2) + |0| + |1|
- There are 2^(N/2) members in the set.
Entropy change = N/2 = the increase in uncertainty.
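The same bookkeeping for the varied replicates, again with an assumed N (the |11|, |0| and |1| constants are dropped as O(1)):

    from math import log2

    N = 1000                        # total length; each replicate is 2 symbols
    H_exact = log2(N / 2)           # only the repeat count is needed
    H_varied = N / 2 + log2(N / 2)  # plus one bit per replicate: 11 or 1y
    print(H_varied - H_exact)       # 500.0 = N/2 extra bits of uncertainty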

Slide 18: System state-space trajectory
1. Initial growth of replicates.
2. At saturation, replicates die and are born; births = deaths of replicates.
3. When entropy is ejected, the system locks into an attractor-like region.
If the system is not isolated, homeostasis requires the regeneration of replicates.

Slide 19: Attractor-like behaviour off equilibrium
- Resource flows are needed to regenerate replicates, e.g. pumping a laser to replenish photons.
- Variation in replicates stabilises the system.
- External impacts restrict the size of the attractor-like region.
- Its shape changes; it may merge with another replicate set.

Slide 20: Coupling of replicators
Replicators that pass resources (entropy) to each other are:
- More likely, as they are more resource efficient.
- Less costly to maintain off equilibrium.
E.g. one laser system pumping another.

Slide 21: Nesting systems reduces algorithmic entropy
A nested system orders at different scales; as described by nested algorithms, its H_algo is low.
But if the large-scale ordering is lost, the algorithmic entropy increases, and at the smallest scale no order is observed.
Cf. the algorithm that defines me versus algorithms that see me as a pile of atoms.

Slide 22: Diameter complexity
- Reducing the observation scale d suppresses order, i.e. requires a longer description.
- Variation increases entropy (dotted line in the figure), but nesting decreases entropy to compensate.
- D_org = H_max(x) - H_d0(x)
- Variation is algorithmically more efficient at low scales.

Slide 23: Universe evolution and the second law
The universe is in an initial state, and its trajectory is determined by an algorithm:
p = For step = 0 to t:
      compute next state
    next step
If the physical laws are simple, |p| ~ log2 t.
Equilibrium is reached when log2 t' >> log2 t.
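A minimal sketch of why |p| ~ log2 t: the stepping rule below is a toy stand-in for the physical laws, fixed in size, so the only part of the program that grows with t is the encoding of t itself:

    from math import log2

    def trajectory(state: int, t: int) -> int:
        # Fixed, simple "law": rotate a 16-bit state left by one.
        for _ in range(t):
            state = ((state << 1) | (state >> 15)) & 0xFFFF
        return state

    t = 10**6
    print(trajectory(0b1011, t))
    print("bits needed to encode t:", round(log2(t)))  # ~20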

Slide 24: What does it all mean?
- We have a practical entropy measure: a tool for measuring change.
- It shows how replication counters entropy increase.
- Nested structures are highly ordered; nesting counters the entropy increase from variations and minimises the entropy cost of adaptation.
- Maybe replication maintains order while the universe's trajectory is towards disorder.

Slide 25: References
- Kolmogorov, A. N. Three approaches to the quantitative definition of information. Problems of Information Transmission 1965, 1, 1-7.
- Zvonkin, A. K.; Levin, L. A. The complexity of finite objects and the development of the concepts of information and randomness by means of the theory of algorithms. Russian Mathematical Surveys 1970, 25, 83-124.
- Chaitin, G. On the length of programs for computing finite binary sequences. Journal of the ACM 1966, 13, 547-569.
- Zurek, W. H. Algorithmic randomness and physical entropy. Physical Review A 1989, 40, 4731-4751.
- Bennett, C. H. The thermodynamics of computation: a review. International Journal of Theoretical Physics 1982, 21, 905-940.
- Landauer, R. Irreversibility and heat generation in the computing process. IBM Journal of Research and Development 1961, 5, 183-191.

