
1 More on complexity measures Statistical complexity J. P. Crutchfield. The calculi of emergence. Physica D. 1994

2 Entropy and algorithmic complexity associate maximum complexity with randomness
 pure order and pure noise are not "complex"
 complex systems have
   intricate structure on multiple scales
   repeating patterns
   continual variation, …
 complexity lies between order and chaos
   Wolfram's class 4 CAs
   Langton's "edge of chaos"
Mutual information shows complexity
 RBN transition example (also k-SAT)
 are there other measures like this?
[figures: complexity C, and mutual information I, each plotted against randomness H]

3 HIDDEN: statistical complexity

4 when randomness = noise
the measures so far assume that randomness is information
 even logical depth: randomness is not very "deep" information
Sometimes, the "randomness" actually is information
 the output of good compression algorithms is highly "random"
   else the remaining structure could be used to compress it more
   "any sufficiently advanced communication is indistinguishable from noise"
 crypto functions output "random" strings
   else the remaining structure could be used to break the code

5 randomness and noise
in the real world, (some) randomness is just "noise"
 of no interest, carrying no "information"
these pictures are all different microscopically, but all just "white noise" macroscopically
 the differences are not important
 information measures "overfit" noise as data
 this kind of noisy randomness is intuitively simple
 a small change to the noise is just the same noise

6 to model a coin toss …
how would you create an ensemble of random bit strings?
 … just toss a coin!
 in other words, use a stochastic automaton
that's quite a short description
 conforming to our intuition that random strings are not very complex
[figure: one-state stochastic automaton with two self-loop transitions, H | ½ and T | ½]
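A minimal sketch of this in Python (illustrative only, not code from any of the cited papers): the whole ensemble can be generated by a one-state machine that emits H or T with probability ½.

import random

def coin_toss_machine(length):
    # single-state stochastic automaton: each step emits H or T with probability 1/2
    return "".join(random.choice("HT") for _ in range(length))

# an "ensemble" of random strings is just many runs of this tiny machine
ensemble = [coin_toss_machine(10) for _ in range(5)]
print(ensemble)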

7 Statistical complexity
In certain circumstances, we can use the theory of discrete computation and statistics to create equivalent models
 needs a discrete stochastic process that is conditionally stationary
   future states do not depend on time, but only on previous states
Complexity C is the size of a minimal model that yields a finite description and sits at the least computationally powerful level
 infer the machine from a data ensemble
   the collection of observed strings generated by the process of interest
Statistical complexity ignores the "computational resource"
 so randomness and simple periodicity have low complexity
J. P. Crutchfield. The calculi of emergence. Physica D. 1994

8 the inferred minimal model is called an ε-machine
 minimal model: the size of the minimal stochastic machine
 finite description: the size of the machine does not grow unboundedly with the length of the observed strings
 least computationally powerful level: e.g. finite state automaton, stack machine, UTM
Intuition:
 each observation represents a state, which incorporates an indirect indication of the hidden environment
 states that lead to the same next state help to predict the environment
   causal states
 an ε-machine captures a minimal sequence of causal states
J. P. Crutchfield. The calculi of emergence. Physica D. 1994

9 Consider a simple process
The process is a simple automaton
 a system with a two-symbol alphabet, α = {0,1}
 two recurrent states, A and B
 state A can, with equal probability,
   emit a 0 and return to itself
   emit a 1 and go to state B
 state B always emits 1 and goes to A
But all we have is a black-box process
This is Weiss's "even process": blocks of 1s bounded by 0s always have even length, so an isolated 1 cannot occur
C. R. Shalizi, K. L. Shalizi, J. P. Crutchfield. An algorithm for pattern discovery in time series. 2002. http://arxiv.org/pdf/cs/0210025v3.pdf
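A sketch of the even process as Python code, following the two-state description above (function and state names are mine):

import random

def even_process(length):
    # state A: emit 0 and stay, or emit 1 and go to B, with equal probability
    # state B: always emit 1 and return to A
    state, out = "A", []
    for _ in range(length):
        if state == "A":
            if random.random() < 0.5:
                out.append("0")              # stay in A
            else:
                out.append("1"); state = "B"
        else:
            out.append("1"); state = "A"     # B always emits 1 and returns to A
    return "".join(out)

print(even_process(20))   # 1s occur in even-length blocks (except possibly at the cut-off)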

10 Record the process output
We need to deduce the automaton from data observations
Run the process many times
 to get statistically useful data
 e.g. 10^4 runs to word length = 4
C. R. Shalizi, K. L. Shalizi, J. P. Crutchfield. An algorithm for pattern discovery in time series. 2002. http://arxiv.org/pdf/cs/0210025v3.pdf
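A sketch of the data-gathering step, assuming the even_process sampler sketched above: run the black box many times and tally the empirical distribution of length-4 words.

from collections import Counter

runs = 10_000                        # e.g. 10^4 runs
words = Counter(even_process(4) for _ in range(runs))
for w, n in sorted(words.items()):
    print(w, n / runs)               # empirical word probabilities; forbidden words never appear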

11 example: "even" process (1)
Work out probabilities and infer a machine
 "homogenisation", because homogeneous states are merged
 merging is the main source of error – need a lot of observations
For the full calculation, see: C. R. Shalizi, K. L. Shalizi, J. P. Crutchfield. An algorithm for pattern discovery in time series. 2002. http://arxiv.org/pdf/cs/0210025v3.pdf
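A rough illustration of the merging idea in Python, not the actual algorithm from the Shalizi paper (which uses proper statistical tests rather than a crude tolerance): estimate the next-symbol distribution after each observed history, then group histories whose distributions are approximately equal into candidate causal states. It assumes the even_process sampler above.

from collections import defaultdict

def next_symbol_counts(strings, max_hist=2):
    # count next-symbol frequencies conditioned on each suffix (history) up to max_hist long
    counts = defaultdict(lambda: defaultdict(int))
    for s in strings:
        for i in range(len(s) - 1):
            for h in range(1, max_hist + 1):
                if i + 1 - h >= 0:
                    counts[s[i + 1 - h : i + 1]][s[i + 1]] += 1
    return counts

def merge_histories(counts, tol=0.05):
    # histories with (approximately) equal next-symbol distributions become one candidate state
    states = []
    for hist, c in counts.items():
        total = sum(c.values())
        dist = {sym: n / total for sym, n in c.items()}
        for st in states:
            if all(abs(dist.get(sym, 0) - st["dist"].get(sym, 0)) < tol
                   for sym in set(dist) | set(st["dist"])):
                st["histories"].append(hist)
                break
        else:
            states.append({"dist": dist, "histories": [hist]})
    return states

data = [even_process(20) for _ in range(10_000)]
for st in merge_histories(next_symbol_counts(data)):
    print(st["histories"], st["dist"])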

12 example: "even" process (2)
Check all states have incoming transitions
 reachability
Remove transient states
 A and B form a transient cycle
 the only exit is to produce a 0 and go to C
 every C state goes to C (adding 0) or D (adding 1)
 every state in D goes to C (adding 1)
 "determinisation"
Final ε-machine has states C and D only
C. R. Shalizi, K. L. Shalizi, J. P. Crutchfield. An algorithm for pattern discovery in time series. 2002. http://arxiv.org/pdf/cs/0210025v3.pdf

13 ε-machines and stability
Replicating a process in an ε-machine requires stability
 previous states aren't always "causal" in unstable systems
Stability is related to temporal scale
 recall flocking
   at the level of birds, apparently arbitrary motion, few patterns
   at the level of the flock, coherent, apparently co-ordinated motion
So, we can change level (scale) to one where there is stability
 a bit like choosing the level to represent in differential equations
We can tell a system is not suitably stable if the inferred ε-machine changes with the word length
 that is, as the process runs over time, the ε-machine has to change to express its statistical behaviour

14 HIDDEN inferring ε-machines
start at the lowest level of the computational hierarchy, and infer a model (a stochastic finite automaton: an ε-machine) from an ensemble
 there are efficient algorithms to do this
investigate how the machine size varies with the length L of the strings in the ensemble
if the machines continue increasing in size as L increases, then increase the computational level of the machines
why might the size increase…?

15 ε-machines and continuous systems
Most natural systems are continuous
Symbolic dynamics is used to extract discrete time systems
 partition the state space and label each partition with a symbol
 over time, each point in the state space has a sequence of symbols
   its symbol at each observation point in its past and future
 loses information
 often a deterministic continuous system gives a stochastic discrete system
http://vserver1.cscs.lsa.umich.edu/~crshalizi/notabene/symbolic-dynamics.html (and citations)
[figure: state space partitioned into regions labelled Ґ, Ж, Ц, Ђ, Ϡ; point a is in region Ж at time t; over a series of discrete time observations, a moves through different regions: … Ж Ж Ж Ϡ Ϡ Ђ Ђ Ђ Ђ Ж Ж …]

16 Symbolic dynamics (1)
recast a continuous (space / time) dynamical system into a discrete one
 partition the continuous phase space U into a finite number of sets U_i, each labelled with a unique symbol from a finite alphabet
 observe the system at discretised time intervals, and note the label of the set U_i it occupies, to give a sequence of symbols: d c a a b d d a a …
 rationale: the sequences represent "results" of "measurements" of the underlying system
[figure: a phase-space trajectory crossing partition regions labelled a, b, c, d]
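A minimal sketch of the discretisation step in Python, assuming an arbitrary one-dimensional trajectory and a partition given as labelled intervals (all names and values are illustrative):

import math

def symbolise(trajectory, partition):
    # partition: list of (lower_bound, label) pairs in ascending order;
    # each sample gets the label of the highest interval it falls into
    symbols = []
    for x in trajectory:
        for lower, label in reversed(partition):
            if x >= lower:
                symbols.append(label)
                break
    return "".join(symbols)

# observe a continuous signal at discrete time intervals, then label each sample
trajectory = [math.sin(0.3 * t) for t in range(30)]
partition = [(-1.0, "a"), (-0.5, "b"), (0.0, "c"), (0.5, "d")]
print(symbolise(trajectory, partition))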

17 Symbolic dynamics (2)
the symbolic dynamics of the system is the set of all sequences that can be produced (different initial conditions, etc.)
 defines a language
analyse the dynamics of these sequences
 using entropy, mutual information, ε-machines, etc.
e.g. Crutchfield's analysis of the complexity and entropy of the logistic map: see J. P. Crutchfield. The calculi of emergence. Physica D. 1994
[figures: partition labelled a, b, c, d; logistic map behaviour for 3.5 < λ < 4 and at λ_c = 3.5699…]

18 HIDDEN example: logistic map
simple iterated equation: x_{n+1} = λ x_n (1 − x_n)
Bifurcation diagram of the logistic map:
 plot, as a function of λ, a series of values for x_n, obtained by starting with a random value x_0, iterating many times, and discarding points before the iterates converge to the attractor
 i.e. the set of attractor points of x_n corresponding to a value of λ, plotted for increasing values of λ
[figures: bifurcation diagrams for 1 < λ < 4 and 3.5 < λ < 4, with λ_c = 3.5699…]
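A sketch of the iteration and attractor sampling behind such a diagram (parameter values chosen purely for illustration):

def logistic_attractor(lam, x0=0.4, transient=1000, keep=100):
    # iterate x_{n+1} = lam * x_n * (1 - x_n), discard the transient,
    # and keep the points that approximate the attractor for this lam
    x = x0
    for _ in range(transient):
        x = lam * x * (1 - x)
    points = []
    for _ in range(keep):
        x = lam * x * (1 - x)
        points.append(x)
    return points

for lam in (2.9, 3.2, 3.5, 3.5699, 3.9):
    print(lam, sorted(set(round(x, 4) for x in logistic_attractor(lam)))[:8])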

19 HIDDEN Symbolic dynamics to analyse the logistic map
discretise the continuous logistic trajectory x_0 x_1 x_2 x_3 x_4 … into a bit string b_0 b_1 b_2 b_3 b_4 …
 partition x space [0,1] into [0, ½), labelled 0, and [½, 1], labelled 1
 so: b_n = if x_n ∈ [0, ½) then 0 else 1

20 HIDDEN logistic map (3)
for each λ:
 for each L:
   produce an ensemble of bit strings of length L from the discretised logistic process
   infer the model (finite state automaton, ε-machine) that describes this ensemble
   calculate the statistical complexity C (size of the ε-machine) and the entropy H (of the ensemble)
[Crutchfield 1994]
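A sketch of the ensemble and entropy parts of this loop in Python; the ε-machine inference step is omitted, and computing H as the Shannon entropy of the empirical word distribution is one reasonable reading of "entropy of the ensemble", not necessarily Crutchfield's exact quantity:

import random
from collections import Counter
from math import log2

def logistic_bits(lam, length, transient=1000):
    # discretised logistic process: b_n = 0 if x_n < 1/2 else 1
    x = random.random()
    for _ in range(transient):
        x = lam * x * (1 - x)
    bits = []
    for _ in range(length):
        x = lam * x * (1 - x)
        bits.append("0" if x < 0.5 else "1")
    return "".join(bits)

def ensemble_entropy(lam, L=16, runs=10_000):
    # Shannon entropy H of the empirical distribution over length-L words
    words = Counter(logistic_bits(lam, L) for _ in range(runs))
    return -sum((n / runs) * log2(n / runs) for n in words.values())

print(ensemble_entropy(3.9))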

21 HIDDEN logistic map (4)
example: a 47-state machine constructed from an L = 16 ensemble with λ = 3.5699… (the period-doubling onset of chaos)
[Crutchfield 1994, fig 7a]

22 HIDDEN logistic map (5)
results for L = 16; 193 different values of λ, both periodic and chaotic
C grows without bound at λ_c = 3.5699…: need to move to a higher-level computational machine (stack machine)
[Crutchfield 1994, fig 6]

23 Analysis results for the logistic map
periodic behaviour: small H, small C
 automaton size = the period
chaotic behaviour: large H, small C
 a small automaton captures the random behaviour ("coin toss")
complex behaviour: mid H, large C
 near the transition from periodic to chaotic behaviour ("edge of chaos") there is structure "on all scales"
[figure: complexity C plotted against randomness H]
J. P. Crutchfield. The calculi of emergence. Physica D. 1994

24 HIDDEN: multi-information: hierarchical complexity

25 Another complexity measure: multi-information
recall mutual information between two systems: I(X;Y) = H(X) + H(Y) − H(X,Y)
 where H(X) is the entropy of system X
 H(X,Y) is the joint entropy of the systems X and Y
 I = 0 if X and Y are independent
For subsystems X_1, X_2, and the overall system X_{1,2}, this gives: I(X_1 ; X_2) = H(X_1) + H(X_2) − H(X_{1,2})
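A minimal sketch of these quantities for two discrete subsystems, computed from a joint probability table (the numbers are made up for illustration):

from math import log2

def entropy(p):
    return -sum(x * log2(x) for x in p if x > 0)

def mutual_information(joint):
    # joint[i][j] = P(X1 = i, X2 = j); I = H(X1) + H(X2) - H(X1, X2)
    p1 = [sum(row) for row in joint]
    p2 = [sum(col) for col in zip(*joint)]
    p12 = [x for row in joint for x in row]
    return entropy(p1) + entropy(p2) - entropy(p12)

print(mutual_information([[0.4, 0.1], [0.1, 0.4]]))        # dependent subsystems: > 0
print(mutual_information([[0.25, 0.25], [0.25, 0.25]]))    # independent subsystems: 0.0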

26 multi-information (1)
multi-information generalises this to n subsystems of an overall system
 system X = X_{1,2,…,n}
 subsystems X_1, X_2, …, X_n
 MI(X) = Σ_{i=1}^{n} H(X_i) − H(X)
 where MI = 0 if all the subsystems are independent
M. Studeny, J. Vejnarova. The multiinformation function as a tool for measuring stochastic dependence. In Learning in Graphical Models. Kluwer, 1998
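The same idea for n subsystems, sketched over a joint distribution given as a mapping from outcome tuples to probabilities (illustrative data):

from collections import defaultdict
from math import log2

def multi_information(joint):
    # joint: {(x1, ..., xn): probability}; MI = sum_i H(X_i) - H(X_1,...,X_n)
    n = len(next(iter(joint)))
    marginals = [defaultdict(float) for _ in range(n)]
    for outcome, p in joint.items():
        for i, xi in enumerate(outcome):
            marginals[i][xi] += p
    h_joint = -sum(p * log2(p) for p in joint.values() if p > 0)
    h_marg = sum(-sum(p * log2(p) for p in m.values() if p > 0) for m in marginals)
    return h_marg - h_joint

# three perfectly correlated bits: MI = 3*1 - 1 = 2 bits
print(multi_information({(0, 0, 0): 0.5, (1, 1, 1): 0.5}))
# three independent fair bits: MI = 0
print(multi_information({(a, b, c): 0.125 for a in (0, 1) for b in (0, 1) for c in (0, 1)}))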

27 multi-information (2)
now consider partitioning the top-level system X into two subcomponents: X_a, comprising subsystems X_1, …, X_k, and X_b, comprising subsystems X_{k+1}, …, X_n
the relationship between the multi-information of the whole system and that of its two big components follows from the definitions (reconstructed below)
so (unless the subsystems X_a and X_b are independent): the MI of the whole is bigger than that of the parts
[figure: system X partitioned into X_a = (X_1, …, X_k) and X_b = (X_{k+1}, …, X_n)]
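The missing equations can be reconstructed from the definitions above; this is a standard identity rather than a copy of the original slide. Split the sum of marginal entropies at position k, then add and subtract H(X_a) and H(X_b):

MI(X) = \sum_{i=1}^{n} H(X_i) - H(X)
      = \Bigl[\sum_{i=1}^{k} H(X_i) - H(X_a)\Bigr]
      + \Bigl[\sum_{i=k+1}^{n} H(X_i) - H(X_b)\Bigr]
      + \bigl[H(X_a) + H(X_b) - H(X)\bigr]
      = MI(X_a) + MI(X_b) + I(X_a ; X_b)
      \ge MI(X_a) + MI(X_b)

with equality only when X_a and X_b are independent, i.e. when I(X_a ; X_b) = 0.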

28 multi-information (3)
instead of considering one big subcomponent comprising k subsystems, now consider all possible such big subcomponents of k subsystems, each comprising subsystems X_{i_1}, …, X_{i_k}
consider the average multi-information ⟨MI_k⟩ of these, taken over all size-k subcomponents
 note that ⟨MI_n⟩ is the MI of the whole system
 given the MI of the whole is bigger than that of the parts, the average grows with k
so the MI increases with the size of the subsystems considered
[figure: a system X with subsystems X_1, X_2, X_3 and its size-2 subcomponents X_1^2, X_2^2, X_3^2, i.e. (X_1,X_2), (X_1,X_3), (X_2,X_3)]

29 multi-information = complexity
complexity is the difference between the actual increase of this average and a linear increase: C ≥ 0
 C is low if the system is random
   all subsystems are independent, and so MI = 0
 C is low if the system is homogeneously structured
   average MI increases linearly
 C is high in the intermediate case, with inhomogeneous groupings and clumpings
   high, non-linearly increasing, average MIs
G. Tononi, et al. A measure for brain complexity: relating functional segregation and integration in the nervous system. PNAS 91:5033-37, 1994

30 HIDDEN which complexity?

31 which complexity measure?
unconditional entropy is probably not appropriate
 counts randomness as maximally "complex"
 entropy variance readily calculated between different space / time parts of self
algorithmic complexity K
 useful for theoretical analyses, but not for analysing practical results
conditional entropy / mutual information / multi-information
 between two systems, which can be different space / time parts of self, or hierarchical levels of a system
 appears to be maximised around interesting transitions
statistical complexity C
 of a single system; appears to be maximised at the "edge of chaos"

32 Some general sources
http://www.scholarpedia.org/article/Complexity
R. Badii, A. Politi. Complexity. Cambridge University Press. 1997
J. P. Sethna. Statistical Mechanics. Oxford University Press. 2006

