Computational Mechanics of ECAs, and Machine Metrics

Elementary Cellular Automata
- 1d lattice with N cells (periodic boundary conditions).
- Cells are binary valued {0,1} -- B or W.
- Deterministic update rule, φ, applied to all cells simultaneously to determine the cell values at the next time step.
- Nearest-neighbor interactions only.
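
As an illustration (not part of the slides), a minimal Python sketch of one synchronous update on a ring; the helper name eca_step and the rule_table mapping are assumptions for this example:

```python
# Minimal sketch (assumed names): one synchronous ECA update on a ring of N cells.
def eca_step(cells, rule_table):
    """Apply the update rule phi to every cell at once; nearest neighbors only."""
    n = len(cells)
    return [
        rule_table[(cells[(i - 1) % n], cells[i], cells[(i + 1) % n])]
        for i in range(n)
    ]
```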

Example - Rule 54

Neighborhood:  000  001  010  011  100  101  110  111
Next value:      0    1    1    0    1    1    0    0
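
A hedged sketch of how this table follows from the Wolfram rule number: the output for neighborhood (l, c, r) is bit 4l + 2c + r of the number. The function name rule_table is an illustrative choice:

```python
def rule_table(rule_number):
    """Wolfram rule number -> {(left, center, right): next cell value}."""
    return {
        (l, c, r): (rule_number >> (4 * l + 2 * c + r)) & 1
        for l in (0, 1) for c in (0, 1) for r in (0, 1)
    }

# Rule 54 reproduces the table above: 000 -> 0, 001 -> 1, ..., 111 -> 0.
table = rule_table(54)
print([table[(k >> 2 & 1, k >> 1 & 1, k & 1)] for k in range(8)])
# [0, 1, 1, 0, 1, 1, 0, 0]
```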

Typical Behavior of ECAs
- Emergence of "domains" -- spatially homogeneous regions that spread through the lattice as time progresses.
- Largely independent of lattice size N, for large N.
- Depends (sensitively) on the update rule φ.

Characterizing ECA Behavior
Domains can be characterized by their state transition machines (DFAs).
[Figure: DFAs for the Rule 18 domain (0W)* (states A, B) and the Rule 54 domain (1110)* (states A-D).]
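
As a sketch of how such a domain DFA is used in practice (the transitions below are read off the (0W)* regular expression, not the garbled figure), one can test whether a spatial word could have been generated by the domain from some starting phase:

```python
# Two-state machine consistent with the Rule 18 domain (0W)*:
# from A a 0 leads to B; from B either symbol returns to A.  State names are illustrative.
RULE18_DOMAIN = {("A", 0): "B", ("B", 0): "A", ("B", 1): "A"}

def in_domain(word, transitions, states=("A", "B")):
    """True if some starting phase of the domain generates the word."""
    for start in states:                      # spatial words may begin mid-cycle
        state, ok = start, True
        for symbol in word:
            state = transitions.get((state, symbol))
            if state is None:
                ok = False
                break
        if ok:
            return True
    return False

print(in_domain([0, 1, 0, 0, 0, 1], RULE18_DOMAIN))   # True: a (0W)* word
print(in_domain([1, 1], RULE18_DOMAIN))               # False: 11 never occurs in the domain
```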

Formally Defining Domains
Since each ECA domain can be characterized by a DFA, domains are regular languages.
Def: a (spatial) domain, or (spatial) domain language, Λ is a regular language s.t.
(1) Φ(Λ) = Λ, or Φ^p(Λ) = Λ for some p (temporal invariance).
(2) The process graph of Λ is strongly connected (spatial homogeneity).

Temporal Invariance?
Question: Given a potential domain, Λ, with corresponding DFA, M, how do we determine temporal invariance? Can this even be done in general?
Answer: Yes, but it is somewhat involved. The steps are:
(1) Encode the CA update rule as a transducer, T.
(2) Take the composition T(M) = T'.
(3) Use T' to construct M' = [T']out.
(4) Check whether M' = M.
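
The transducer composition above is the rigorous test; what follows is only a quick empirical sanity check, a sketch with assumed helper names: evolve a (1110)* configuration of Rule 54 for p = 2 steps and confirm the image is again a (1110)* word up to a spatial phase shift.

```python
def step(cells, rule):
    """One synchronous update of an ECA with periodic boundaries (Wolfram numbering)."""
    n = len(cells)
    return [(rule >> (4 * cells[(i - 1) % n] + 2 * cells[i] + cells[(i + 1) % n])) & 1
            for i in range(n)]

config = [1, 1, 1, 0] * 16                 # a (1110)* configuration on a ring
image = step(step(config, 54), 54)         # two steps, since the domain has period p = 2

# The image should again read 1110... up to a spatial phase shift.
image_str = "".join(map(str, image))
reference = "1110" * 17                    # long enough to contain any phase of length 64
print(image_str in reference)              # expect True
```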

How to Determine Domains
- Visual inspection in simple cases (#54)
- ε-machine reconstruction
- Fixed-point equation

ε-Machine Reconstruction
Several difficulties:
- 'Experimental' spatial data does not consist entirely of domain regions. Must sort out true transitions from anomalies.
- There may be multiple domains.
- The pattern may be spatio-temporal, not simply spatial.

Results
- Good for entirely periodic spatial patterns, which are temporally fixed.
- Can reconstruct some spatial domains with indeterminacy, e.g. Rule 18 = (0W)*, Rule 80.
- Can reconstruct some period-2 domains, e.g. Rule 54.
- In general, difficulties for domains with lots of 'noise', non-block processes, low transition probabilities, and spatio-temporal processes.

Questions from Demos
- How to analyze patterns in space-time?
- Minimal invariant sets - domains within domains, e.g. 000… in Rule 18.
- What does it mean for a domain to be stable or attracting?
- Particles and transient dynamics?

Unit Perturbation DFAs
The unit perturbation language L' of L is
L' = { w' s.t. ∃ w in L with d(w', w) ≤ 1 }
Note: L regular ⇒ L' regular, and L a process language ⇒ L' a process language.
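
A hedged sketch of the construction on finite words (names like length_n_words and unit_perturbation are illustrative): enumerate the length-n words of L from its DFA, then add every word within Hamming distance 1 of some word of L.

```python
def length_n_words(transitions, states, n):
    """Enumerate the length-n words of a domain language from its DFA edges."""
    words = set()
    for start in states:                        # any starting phase is allowed
        frontier = [(start, ())]
        for _ in range(n):
            frontier = [(transitions[(s, a)], w + (a,))
                        for (s, w) in frontier
                        for a in (0, 1) if (s, a) in transitions]
        words.update(w for (_, w) in frontier)
    return words

def unit_perturbation(words):
    """L': every word within Hamming distance <= 1 of some word of L."""
    out = set(words)
    for w in words:
        for i in range(len(w)):
            flipped = list(w)
            flipped[i] ^= 1                     # flip one cell
            out.add(tuple(flipped))
    return out

RULE18_DOMAIN = {("A", 0): "B", ("B", 0): "A", ("B", 1): "A"}
L4 = length_n_words(RULE18_DOMAIN, ("A", "B"), 4)
print(sorted(L4))                 # the (0W)* words of length 4
print(len(unit_perturbation(L4)))
```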

Attractors
A regular language L is a fixed point attractor for a CA, Φ, if
(1) Φ(L) = L
(2) Φ^n(L') ⊆ L', for all n
(3) For 'almost every' w in L', Φ^n(w) is in L for some n.
Note: If Φ^p(L) = L but Φ(L) ≠ L, then Φ(L_p) = L_p, where L_p = L ∪ Φ(L) ∪ Φ^2(L) ∪ … ∪ Φ^(p-1)(L), and L_p is regular. Hence we may assume L is a fixed point rather than p-periodic. The attractor is then not necessarily spatially homogeneous at each time step: it is NOT a single spatial domain, but rather a union of spatial domains, each of which is periodic in time.

Comments on Attractor Definition
- This definition ensures that the domain grows, rather than shrinks, in time at the interface between the domain and everything else.
- Finite-time collapse onto (NOT merely close to) the attractor is different for CAs than in spatially continuous systems such as DEQs or 1d maps, because discreteness means you cannot 'get within ε' without being equal.

Comparison of ECAs
- We can (sort of) characterize the behavior of an individual ECA.
- Can we compare the behavior of two different ECAs and measure how similar their dynamics are? If so, how?
- For example, in what sense is ECA 9 similar to ECA 25, and how similar?

Basic Strategy
- Consider only asymptotic spatial patterns. Ignore particles, transient dynamics, and even temporal patterns.
- Compare ECAs based only upon their domain machines M1, M2. Create 'machine metrics'.
Note: This is now a somewhat more general question, because such metrics could be used to compare ε-machines for other types of processes as well.

Distinguishing Between Sources I
[Figure: the sources #54 (1110)*, #54 (0001)*, and #160 (0)*.]

Distinguishing Between Sources II
[Figure: the sources #80 (1,0,W)* and RR-XOR.]

Machine Metrics I
Let M1, M2 be two machines with corresponding languages L1, L2, and let p_1,n(w), p_2,n(w) be the probability mass functions over words of length n for L1 and L2. We define:
d_n,1(M1,M2) = Σ_w |p_1,n(w) - p_2,n(w)|
d_n,2(M1,M2) = ( Σ_w |p_1,n(w) - p_2,n(w)|^2 )^(1/2)
d_n,∞(M1,M2) = max_w |p_1,n(w) - p_2,n(w)|
Consider weighted averages: D(M1,M2) = Σ_n d_n(M1,M2) · λ^n, with 0 < λ < 1.
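
A small sketch of these distances computed from empirical length-n word distributions; the function names and the toy sequences are illustrative, not slide data:

```python
from math import sqrt

def word_distribution(sequence, n):
    """Empirical probability mass function over the length-n words of a sequence."""
    counts = {}
    for i in range(len(sequence) - n + 1):
        w = tuple(sequence[i:i + n])
        counts[w] = counts.get(w, 0) + 1
    total = sum(counts.values())
    return {w: c / total for w, c in counts.items()}

def lp_distances(p1, p2):
    """d_{n,1}, d_{n,2}, d_{n,inf} between two length-n word distributions."""
    words = set(p1) | set(p2)
    diffs = [abs(p1.get(w, 0.0) - p2.get(w, 0.0)) for w in words]
    return sum(diffs), sqrt(sum(d * d for d in diffs)), max(diffs)

# Example: a (1110)* source versus the all-zeros source (0)*.
d1, d2, dinf = lp_distances(word_distribution([1, 1, 1, 0] * 50, 4),
                            word_distribution([0] * 200, 4))
print(d1, d2, dinf)
```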

Problems with Lp Metrics
- d_n,∞(M1,M2) → 0 and d_n,2(M1,M2) → 0 as n → ∞, for any M1, M2 with entropy rates h(M1) > 0, h(M2) > 0.
- d_n,1(M1,M2) → 2 as n → ∞ for any M1, M2 with h(M1) ≠ h(M2).
Note: 2 is the maximum value any of these metrics can attain.

The Hausdorff Metric
Let (X, d) be a metric space. Define the Hausdorff metric between compact subsets of X by
δ(A,B) = max_{a∈A} min_{b∈B} d(a,b)
δ(B,A) = max_{b∈B} min_{a∈A} d(a,b)
d_H(A,B) = max{ δ(A,B), δ(B,A) }
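
A direct sketch of this definition for finite sets (the directed distance written δ above is just a helper expression in code):

```python
def hausdorff(A, B, d):
    """d_H(A, B) = max{ delta(A, B), delta(B, A) } for finite sets A, B."""
    delta_ab = max(min(d(a, b) for b in B) for a in A)
    delta_ba = max(min(d(a, b) for a in A) for b in B)
    return max(delta_ab, delta_ba)

# e.g. on points of the real line with d(x, y) = |x - y|:
print(hausdorff({0.0, 1.0}, {0.0, 5.0}, lambda x, y: abs(x - y)))  # 4.0
```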

Examples
[Figure: two example pairs of sets A and B, illustrating δ(A,B), δ(B,A), and d_H(A,B) = max{ δ(A,B), δ(B,A) }.]

Machine Metrics II
Let M1, M2 be two machines with corresponding languages L1, L2.
Hausdorff 'metric' on length-n words:
d_n,H(M1,M2) = d_H(L_1,n, L_2,n)
Averaged minimum distance 'metric' on length-n words:
ρ(M1,M2) = ( Σ_{w1 ∈ L_1,n} min_{w2 ∈ L_2,n} d(w1,w2) ) / |L_1,n|
ρ(M2,M1) = ( Σ_{w2 ∈ L_2,n} min_{w1 ∈ L_1,n} d(w1,w2) ) / |L_2,n|
d_n,A(M1,M2) = max{ ρ(M1,M2), ρ(M2,M1) }
One can take weighted averages over n, or lim n→∞ of d_n,H(M1,M2).
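
A sketch of both quantities on finite sets of length-n words. The per-symbol (normalized) Hamming distance used for d(w1, w2) is an assumption, since the slides do not state the underlying word metric:

```python
def hamming(w1, w2):
    """Per-symbol (normalized) Hamming distance -- an assumed choice of word metric."""
    return sum(a != b for a, b in zip(w1, w2)) / len(w1)

def machine_metrics(L1n, L2n):
    """Hausdorff and averaged-min-distance 'metrics' on two sets of length-n words."""
    def directed_max(A, B):
        return max(min(hamming(a, b) for b in B) for a in A)
    def directed_avg(A, B):
        return sum(min(hamming(a, b) for b in B) for a in A) / len(A)
    d_H = max(directed_max(L1n, L2n), directed_max(L2n, L1n))
    d_A = max(directed_avg(L1n, L2n), directed_avg(L2n, L1n))
    return d_H, d_A

# e.g. the length-4 words of (1110)* (all phases) versus those of (0)*:
L1 = {(1, 1, 1, 0), (1, 1, 0, 1), (1, 0, 1, 1), (0, 1, 1, 1)}
L2 = {(0, 0, 0, 0)}
print(machine_metrics(L1, L2))   # (0.75, 0.75)
```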

Example - ECA 18 and 3 Periodic Domains
[Figure: #18 (0W)*, #54 (1110)*, #54 (0001)*, #160 (0)*.]

Distance to ECA domain (0W)* (vs. periodic domains):

Domain      d-1     d-H     d-AMD
0*          0.966   0.5     0.254
(0001)*     0.952   0.2     0.156
(1110)*     1.0     0.7     0.446

Example - ECA 18 and 3 Non-Periodic Domains
[Figure: #18 (0W)*, #80 (1,0,W)*, RR-XOR.]

Distance to ECA domain (0W)* (vs. non-periodic domains):

Domain      d-1     d-H     d-AMD
Rule 80     0.909   0.3     0.115
(10W)*      1.0             0.229
RR-XOR      0.926   0.4     0.187

ECA 25 vs. ECA 9
d-1 = 0.6, d-H = 0.2, d-AMD = 0.08
[Figure: ECA 25 and ECA 9.]

RR0 vs. RR-XOR
d-1 = 0.883, d-H = 0.2, d-AMD = 0.092
[Figure: RR0 and RR-XOR.]

Metric Correlations