Cellular Automata and Amorphous Computing Melanie Mitchell Portland State University and Santa Fe Institute Complex Systems Summer School Friday June 20, 2008 Copyright © 2008 by Melanie Mitchell

What are cellular automata? Game of Life
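Before looking at the demos, it may help to see how small a Life implementation is. A minimal sketch (my addition, not from the slides), using the standard rule: a dead cell with exactly 3 live neighbors is born, and a live cell with 2 or 3 live neighbors survives. It assumes NumPy and a wraparound grid.

    import numpy as np

    def life_step(grid):
        # Count the eight neighbors of every cell (toroidal boundaries).
        n = sum(np.roll(np.roll(grid, dy, axis=0), dx, axis=1)
                for dy in (-1, 0, 1) for dx in (-1, 0, 1)
                if (dy, dx) != (0, 0))
        # Birth on 3 neighbors; survival on 2 or 3.
        return ((n == 3) | ((grid == 1) & (n == 2))).astype(int)

    # A glider: it reproduces itself one cell down and to the right every 4 steps.
    g = np.zeros((8, 8), dtype=int)
    for r, c in [(1, 2), (2, 3), (3, 1), (3, 2), (3, 3)]:
        g[r, c] = 1
    for _ in range(4):
        g = life_step(g)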

Review (?) of Computation Theory Hilbert’s problems Turing machines Universal Turing Machines Uncomputability of the halting problem

Turing machine

Example of Turing machine rule set

Two fundamental theorems of computation theory:
1. There exists a universal Turing machine.
2. There is no Turing machine that can solve the halting problem.
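The impossibility proof for theorem 2 can be sketched in a few lines of code (my illustration of the standard diagonalization argument, not part of the slides): assume a decider halts(p, x) exists and derive a contradiction.

    def paradox_maker(halts):
        # Assume halts(p, x) correctly decides whether program p halts on input x.
        def paradox(p):
            if halts(p, p):      # if p would halt when run on itself...
                while True:      # ...then loop forever
                    pass
            # ...otherwise halt immediately
        return paradox

    # paradox(paradox) halts if and only if it does not halt,
    # so no such function halts can exist.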

Interesting problem: Given an initial configuration, can you calculate analytically how many steps Life will run for before it reaches a fixed configuration?

Universal Computation in the Game of Life

What is the feasibility of using this kind of universal computation in practice?

Von Neumann’s self-reproducing automaton

After his key role in designing the first electronic computers, von Neumann became interested in the links between computers and biology. [photo: John von Neumann, 1903–1957]

In the last years of his life, von Neumann worked on the “logic” of self-reproduction and devised the first instance of a self-reproducing “machine” (in software; finally implemented in the 1990s).

Von Neumann’s design is complicated, but some of its key ideas can be captured by a simpler problem: design a computer program that will print out a copy of itself.

A candidate self-copying program

program copy
print(“program copy”);
print(“ print(“program copy”);”);
print(“ print(“ print(“program copy”);”);”);

“A machine can’t reproduce itself; to do so it would have to contain a description of itself, and that description would have to contain a description of itself, and so on ad infinitum.”

Some commands we will need in our programming language

mem – the memory location of the instruction currently being executed

Computer memory:
program test
print(“Hello, world”);
print(“Goodbye”);
end

mem = 2 (the first print statement occupies memory location 2)

Some commands we will need in our programming language

line(n) – the string of characters in memory location n

program test
print(“Hello, world”);
print(“Goodbye”);
end

print(line(2)); will print: print(“Hello, world”);

Some commands we will need in our programming language

loop until condition – loops until the condition is true

x = 0;
loop until x = 4
{
print(“Hello, world”);
x = x+1;
}

Output:
Hello, world
Hello, world
Hello, world
Hello, world

A self-copying program

1 program copy
2 L = mem + 1;
3 print(“program copy”);
4 print(“ L = mem + 1;”);
5 loop until line[L] = “end”
6 {
7 print(line[L]);
8 L = L+1;
9 }
10 print(“end”);
11 end

Tracing the program: mem is 2 when line 2 executes, so L starts at 3. Lines 3 and 4 print the first two lines of the copy; each pass through the loop then prints line[L] and increments L, reproducing lines 3 through 10 verbatim (L = 3, 4, …, 10). When L reaches 11, line[L] is “end”, so the loop exits and line 10 prints the final “end”.

Output:

program copy
L = mem + 1;
print(“program copy”);
print(“ L = mem + 1;”);
loop until line[L] = “end”
{
print(line[L]);
L = L+1;
}
print(“end”);
end

This is an exact copy of the program.
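The same two-way use of stored text works in a real language. A minimal sketch in Python (my illustration, not from the lecture): the string s is used once as data to fill the template, and once as the template itself.

    s = 's = %r\nprint(s %% s)'
    print(s % s)

Running this two-line program prints exactly those two lines, for the same reason the pseudocode works: the program’s description is stored once and interpreted twice.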

Significance of the self-copying program

The essence of self-copying in this program is to use the same information stored in memory in two ways:
– interpret it as instructions in a computer program
– interpret it as “data” to be used by the instructions in the computer program

This is also a key mechanism in biological self-reproduction: DNA = program and data.

This principle was formulated by von Neumann in the 1940s, before the details of biological self-reproduction were well understood.

Programs and interpreters

Notice that the self-copying program needs an external interpreter: the part of the computer that carries out the instructions.

Rough analogy to biology:
– DNA = program and data
– RNA, ribosomes, and enzymes = interpreter

DNA contains instructions for copying itself, and also for building its own interpreter. Von Neumann’s self-reproducing automaton also did this.

What are cellular automata actually used for? Different perspectives:
– CAs are models of physical (or biological or social) systems
– CAs are alternative methods for approximating differential equations
– CAs are devices that can simulate standard computers
– CAs are parallel computers that can perform image processing, random-number generation, cryptography, etc.
– CAs are a framework for implementing molecular-scale computation
– CAs are a framework for exploring how “collective computation” might take place in natural systems (and might be imitated in novel human-made computational systems)

Dynamics and Computation in Cellular Automata

Wolfram’s classes for elementary CAs:
1. Fixed point
2. Periodic
3. Chaotic
4. “Complex”: long transients; universal computation?

ECA 110 is a universal computer (Matthew Cook, 2002). Wolfram’s numbering of ECAs: the rule’s output bits for the eight neighborhoods 111, 110, 101, 100, 011, 010, 001, 000, read as a binary number, give the rule number; for rule 110 the output bits are 01101110, which is 110 in binary.
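A minimal simulator (my sketch, not from the slides) makes the numbering concrete: bit k of the rule number is the new state for the neighborhood whose three cells, read as a binary number, equal k.

    def eca_step(cells, rule):
        # One synchronous update of an elementary CA with wraparound boundaries.
        n = len(cells)
        out = []
        for i in range(n):
            k = 4 * cells[(i - 1) % n] + 2 * cells[i] + cells[(i + 1) % n]
            out.append((rule >> k) & 1)   # bit k of the Wolfram rule number
        return out

    # Rule 110 from a single on cell:
    row = [0] * 60
    row[30] = 1
    for _ in range(30):
        print("".join(".#"[c] for c in row))
        row = eca_step(row, 110)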

Outline of A New Kind of Science (Wolfram, 2002) (from M. Mitchell’s review in Science, 2002):
– Simple programs can produce complex and random-looking behavior
  – Complex and random-looking behavior in nature comes from simple programs
– Natural systems can be modeled using cellular-automata-like architectures
– Cellular automata are a framework for understanding nature
– Principle of computational equivalence

Principle of Computational Equivalence:
1. The ability to support universal computation is very common in nature.
2. Universal computation is an upper limit on the sophistication of computations in nature.
3. Computing processes in nature are almost always equivalent in sophistication.

How can we describe information processing in complex systems?

A cellular automaton evolved by the genetic algorithm. [figures: space-time diagrams for initial configurations with majority on and majority off]

Stephen Wolfram’s last problem from “Twenty Problems in the Theory of Cellular Automata” (Wolfram, 1985):

20. What higher-level descriptions of information processing in cellular automata can be given? “It seems likely that a radically new approach is needed.”

What is needed?
1. How can we characterize patterns as computations?
2. How can we design computations in the language of patterns?

1. How can we characterize patterns as computations?

Components of computation in spatially extended systems:
– Storage of information
– Transfer of information
– Integration of information from different spatial locations

First step: What structures in the observed patterns implement these components?
Second step: How do these structures implement the computation?

– Transfer of information: moving particles
– Integration of information from different spatial locations: particle collisions
[space-time diagrams illustrating particles and collisions]

How to automatically identify information-carrying structures in spatio-temporal patterns? Three proposals:
– Filter by regular languages (Crutchfield and Hanson, 1993; Crutchfield et al., 2002)
– Filter by local statistical complexity (Shalizi et al., 2006)
– Filter by local information measures (Lizier et al., 2007)

Filter by regular languages (Crutchfield and Hanson, 1993; Crutchfield et al., 2002) Regular language: Simple-to-describe periodic pattern

Examples (a CA for performing density classification):
Regular domains: (0)*, (1)*, (01)*
Particles: spatially localized, temporally periodic boundaries or “defects” between regular domains
[figures: space-time diagram, and the same diagram with the regular domains filtered out]
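A crude version of this filtering (my sketch; the real method builds a regular-language transducer) simply marks a site as “domain” when the window around it is consistent with one of the domain patterns; the unmarked sites trace out the particles.

    def in_domain(row, i, w=4):
        # True if the window of width 2w+1 around site i matches one of the
        # regular domains (0)*, (1)*, or (01)* (wraparound boundaries).
        n = len(row)
        win = [row[(i + d) % n] for d in range(-w, w + 1)]
        uniform = all(c == win[0] for c in win)                        # (0)* or (1)*
        alternating = all(win[k] != win[k + 1] for k in range(2 * w))  # (01)*
        return uniform or alternating

    # To filter a space-time diagram, blank out every site where in_domain is True.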

Rule 54. Regular domains: (0001)*, (1110)*. [figures: space-time diagram, and with the regular domains filtered out]

Filter by local statistical complexity (Shalizi et al., 2006)

Local statistical complexity of site i: the amount of information from the past needed for optimal prediction of the future in the vicinity of site i.

[diagram: site i at time t, its light cone of past influence, and its light cone of future influence]

How well does the past light cone predict the future light cone?

Example: Rule 110, filtered by local statistical complexity. [figures: original and filtered space-time diagrams]

Note: this filter requires no prior determination of “regular domains”, but it is more computationally expensive than filtering by regular domains.

Filter by local information measures (Lizier et al., 2007)

Degree of information storage at site i: the mutual information between the state of site i at time t and the states of site i at the k previous time steps. How predictable is the current state from past states?

Degree of information transfer from site i − j to site i: the information in the state of the source (site i − j at time t − 1, with j within the neighborhood radius) about the next state of the destination (site i at time t). How predictable is the state at site i from the previous state at site i − j?

Degree of information modification at site i: the degree to which the sum of information storage and information transfer fails to predict the state at site i. The “local separable information” is positive where there is no information modification and negative where information is modified. How unpredictable is the state at site i from storage and transfer?

Rule 54: degree of information storage at each site; positive values mark sites where information has been stored. (From Lizier et al., 2007)

Rule 54: positive values of the local transfer t(i, j, n+1), the degree of information transfer at each site (for j = −1); positive values mark sites where information has been transferred from site i − j to site i.

Local information modification. Rule 54: negative values of the local separable information s(i, n), plotted on top of the information transfer; negative values mark sites whose state is not well predicted by information storage or transfer.
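As a concrete example, the storage measure can be estimated directly from a space-time diagram. A simplified sketch (my illustration; Lizier et al. define these quantities more carefully): estimate the joint distribution of (k-step past, present) over all sites and times, then score each site by the local log-ratio, whose average is the mutual information.

    from collections import Counter
    from math import log2

    def local_storage(spacetime, k=4):
        # spacetime: list of rows (lists of 0/1). Returns a dict mapping
        # (t, i) -> local active information storage, for t >= k.
        T, N = len(spacetime), len(spacetime[0])
        joint, past, pres = Counter(), Counter(), Counter()
        samples = []
        for t in range(k, T):
            for i in range(N):
                h = tuple(spacetime[t - d][i] for d in range(1, k + 1))
                x = spacetime[t][i]
                joint[(h, x)] += 1
                past[h] += 1
                pres[x] += 1
                samples.append((t, i, h, x))
        M = len(samples)
        # Local value log2 p(h, x) / (p(h) p(x)); positive where the past
        # strongly predicts the present, i.e., where information is stored.
        return {(t, i): log2(joint[(h, x)] * M / (past[h] * pres[x]))
                for t, i, h, x in samples}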

First step: What structures in the observed patterns implement these components?

Second step: How do these structures implement the computation?

A cellular automaton evolved by the genetic algorithm (performance ≈ 80%). [figures: space-time diagrams for initial configurations with majority on and majority off]

Particles. Regular domains: (0)*, (1)*, (01)*. (From Crutchfield et al., 2001)

laws of “particle physics”

Hordijk, Crutchfield, and Mitchell, 1996: Models of CA computation in terms of particle kinematics (Note “condensation phase”)

Particle model of CA computation

[figures: CAs evolved for density classification; CAs evolved for synchronization]

[figures: space-time diagrams at generation 17 and generation 18]

First step: What structures in the observed patterns implement these components? Second step: How do these structures implement the computation?

2. How can we design computations in the language of patterns? “Programming” the CA in the language of the high-level structures, and compiling the program down to the level of the CA rule. Open problem!

Amorphous Computing Abelson, Sussman et al., MIT

Amorphous Computing (Abelson et al., 2000; 2007)

Main ideas:
– Produce vast quantities of tiny, unreliable computing elements (“particles”) at very low cost
– Give them limited wireless communication abilities, so each particle can communicate with nearby particles
– Spray-paint them onto a surface; the spatial arrangement, and thus the communication pattern, will be irregular, and processing and communication will be asynchronous
– Have them self-organize into a reliable network that does something useful

Some possible applications:
– Smart buildings that sense usage and adjust to save energy
– Smart bridges that monitor traffic load and structural integrity
– Smart arrays of microphones for optimizing acoustics
– Self-assembly of nano-machines

One example: origami-based self-assembly (R. Nagpal et al.)

Set-up:
– “Spray paint” thousands of MEMS “agents” onto a 2D square of foldable material
– Agents have no knowledge of global position or interconnect topology
– All agents run an identical program
– Agents communicate locally (out to distance r)
– Agents run asynchronously
– Agents collectively “fold” the material into the desired shape

High-level language: Origami Shape Language (OSL)
– Six paper-folding axioms that can be used to construct a large class of flat folded shapes

Low-level language primitives:
– gradients
– neighborhood query
– cell-to-cell contact
– polarity inversion
– flexible folding
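Of these primitives, gradients are the workhorse. A minimal sketch (my illustration, not Nagpal’s code) of how a hop-count gradient can self-organize under purely local, asynchronous messaging among randomly scattered particles:

    import random

    random.seed(0)
    N, r = 300, 0.12                       # particles and communication radius
    pts = [(random.random(), random.random()) for _ in range(N)]
    nbrs = [[j for j in range(N) if j != i and
             (pts[i][0]-pts[j][0])**2 + (pts[i][1]-pts[j][1])**2 < r*r]
            for i in range(N)]

    grad = [float("inf")] * N
    grad[0] = 0                            # particle 0 emits the gradient

    # Asynchronous relaxation: a random particle wakes up and lowers its
    # value to one more than the smallest value among its neighbors.
    for _ in range(20 * N):
        i = random.randrange(N)
        if nbrs[i]:
            best = min(grad[j] for j in nbrs[i]) + 1
            if best < grad[i]:
                grad[i] = best

    print("hop counts near the source:", sorted(grad)[:10])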

Folding a cup

Origami Shape Language. Primitives: points pi, lines Li, regions Ri.

Axioms (Huzita):
1. Given two points p1 and p2, fold a line through them.
2. Given two points p1 and p2, fold p1 onto p2 (creates the crease that bisects the segment p1p2 at right angles).
3. Given two lines L1 and L2, fold L1 onto L2 (constructs the crease that bisects the angle between L1 and L2).

4. Given p1 and L1, fold L1 onto itself through p1 (constructs a crease through p1 perpendicular to L1).
5. Given p1, p2, and L1, make a fold that places p1 on L1 and passes through p2 (constructs the tangent to the parabola (p1, L1) through p2).
6. Given p1 and p2 and lines L1 and L2, make a fold that places p1 on L1 and p2 on L2 (constructs the common tangent to the two parabolas).
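Each axiom is, computationally, a small piece of analytic geometry that the agents can evaluate. A sketch (my illustration) of axiom 2: the crease that folds p1 onto p2 is the perpendicular bisector of the segment p1p2.

    def fold_point_onto_point(p1, p2):
        # Axiom 2: return the crease, as (point-on-line, direction vector),
        # that maps p1 onto p2: the perpendicular bisector of segment p1-p2.
        (x1, y1), (x2, y2) = p1, p2
        midpoint = ((x1 + x2) / 2, (y1 + y2) / 2)
        direction = (-(y2 - y1), x2 - x1)     # rotate p1->p2 by 90 degrees
        return midpoint, direction

    # Folding (0,0) onto (2,0) creases along the vertical line x = 1:
    print(fold_point_onto_point((0, 0), (2, 0)))   # ((1.0, 0.0), (0, 2))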

Primitive operations we’ll need:
(create-region p1 L1)
(within-region r1 op1 ...) – restricts operations to the given region
(execute-fold L1 type landmark-point)
  Fold types: Valley (apical), Mountain (basal); apical puts the apical surface on the inside, basal puts the basal surface on the inside
  landmark-point: defines the new “apical” and “basal” surfaces after the fold, by specifying the side of the fold whose polarity reverses
(intersect L1 L2) – returns the point at which the two lines intersect

Corner points: c1-c4 Edge lines: e12-e41

Low-level agent operations (Nagpal, 2001)

How to implement OSL in low-level cell programs: example

I will put a link to all my slides on the CSSS wiki (under “Readings” → “Melanie Mitchell”). Thanks for listening!