R C Ball, Physics Theory Group and Centre for Complexity Science, University of Warwick; R S MacKay, Maths; M Diakonova, Physics & Complexity. Emergence in Quantitative Systems – towards a measurable definition.

Presentation transcript:

R C Ball, Physics Theory Group and Centre for Complexity Science, University of Warwick; R S MacKay, Maths; M Diakonova, Physics & Complexity. Emergence in Quantitative Systems – towards a measurable definition

Input ideas:
Shannon: information -> entropy; transmission -> mutual information.
Crutchfield: complexity from information.
MacKay: emergence = system evolves to a non-unique state.
Proposed emergence measure: Persistent Mutual Information across time. Work in progress … still mostly ideas.

Emergent Behaviour?
System + dynamics: many internal degrees of freedom, and/or observation over long times.
Properties: averages, correlation functions; multiple realisations (conceptually).
Emergent properties: behaviour which is predictable (from prior observations) but not foreseeable (from previous realisations).
[Figure: statistical properties, plotted across time and across realisations.]

Strong emergence: different realisations (can) differ for ever.
MacKay: non-unique Gibbs phase (distribution over configurations for a dynamical system).
Physics example: spontaneous symmetry breaking. The system makes/inherits one of many equivalent choices of how to order; this is fine after you have achieved the insight that there is ordering (maybe a heat-capacity anomaly?) and what ordering to look for (no general technique).

Entropy & Mutual Information (Shannon 1948). [Figure: Venn diagrams of two variables A and B, illustrating their entropies and mutual information.]
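As a concrete sketch of the slide's quantities (ours, not the authors'; the function names are illustrative), entropy and mutual information can be estimated directly from samples of discrete variables:

```python
# Sketch: Shannon entropy and mutual information from symbol samples.
import math
from collections import Counter

def entropy(samples):
    """H[X] = -sum_x p(x) log2 p(x), estimated from a list of symbols."""
    n = len(samples)
    return -sum((c / n) * math.log2(c / n)
                for c in Counter(samples).values())

def mutual_information(a, b):
    """I[A;B] = H[A] + H[B] - H[A,B]."""
    return entropy(a) + entropy(b) - entropy(list(zip(a, b)))

# Two perfectly correlated fair bits share exactly 1 bit:
a = [0, 1, 0, 1] * 100
b = [1, 0, 1, 0] * 100   # b is determined by a
print(mutual_information(a, b))   # → 1.0
```

The identity I[A;B] = H[A] + H[B] − H[A,B] is exactly the overlap in the slide's Venn picture.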

MI-based measures of complexity:
Entropy density (rate) – Shannon.
Excess entropy – Crutchfield & Packard 1982: mutual information between blocks A and B adjoining in time.
Statistical complexity – Shalizi et al., PRL 2004 (blocks separated in space).
Persistent Mutual Information – candidate measure of emergence.

Measurement of Persistent MI
[Equation: PMI defined as a limit of the mutual information between discretised past and future records.]
Measurement of I itself requires converting the data to a string of discrete symbols (e.g. bits). The order of limits above seems the safer one, and is computationally practical; the outer limit may need more careful definition.
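A plausible reconstruction of the quantity this slide measures, inferred from the surrounding slides (the precise placement of limits on the original slide is an assumption):

```latex
% Persistent mutual information across a time gap \tau: the mutual
% information between the remote past and the far future, with the
% intervening window of length \tau traced out.
\[
  \mathrm{PMI}(\tau)
  \;=\; I\bigl[x_{(-\infty,\,0]}\,;\,x_{[\tau,\,\infty)}\bigr]
  \;=\; H\bigl[x_{(-\infty,0]}\bigr] + H\bigl[x_{[\tau,\infty)}\bigr]
        - H\bigl[x_{(-\infty,0]},\,x_{[\tau,\infty)}\bigr].
\]
% Setting \tau = 0 recovers the excess entropy of Crutchfield & Packard;
% a nonzero gap \tau discards transient short-time correlations and keeps
% only information that persists.
```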

Examples with PMI:
Oscillation (persistent phase).
Spontaneous ordering (magnets).
Ergodicity breaking (spin glasses): the pattern is random, but aspects become frozen in over time.
Cases without PMI:
Reproducible steady state.
Chaotic dynamics.

Logistic map. [Figure: bifurcation diagram annotated with PMI values: PMI = 0 where the long-time state is unique, and log 2, log 4, log 8, log 3, … on the periodic windows, i.e. PMI = log p for a period-p orbit, whose phase is the persistent choice.]
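An illustrative estimate of this (our sketch, not the authors' code; all parameter values and names are choices made here): on a period-2 window the surviving information is the phase of the oscillation, so the mutual information between a sample and its remote future should approach log 2, i.e. 1 bit, while in the chaotic regime it should vanish.

```python
# Sketch: persistent mutual information of the logistic map x -> r*x*(1-x),
# estimated over an ensemble of realisations with random initial conditions.
import math, random
from collections import Counter

def mi_bits(pairs):
    """Mutual information (bits) of a list of (a, b) symbol pairs."""
    n = len(pairs)
    pab = Counter(pairs)
    pa = Counter(a for a, _ in pairs)
    pb = Counter(b for _, b in pairs)
    return sum((c / n) * math.log2(c * n / (pa[a] * pb[b]))
               for (a, b), c in pab.items())

def logistic_pmi(r=3.2, gap=100, n_real=4000, transient=500, seed=0):
    rng = random.Random(seed)
    samples = []
    for _ in range(n_real):
        x = rng.uniform(0.01, 0.99)
        for _ in range(transient):       # relax onto the attractor
            x = r * x * (1 - x)
        if rng.random() < 0.5:           # randomise the oscillation phase
            x = r * x * (1 - x)          # so both phases occur equally
        past = x
        for _ in range(gap):             # the "present" window traced out
            x = r * x * (1 - x)
        samples.append((past, x))
    # Symbolise against the ensemble mean, which separates the two
    # branches of the period-2 attractor (~0.51 and ~0.80 at r = 3.2).
    thr = sum(p for p, _ in samples) / len(samples)
    return mi_bits([(p > thr, f > thr) for p, f in samples])

print(logistic_pmi())        # close to 1 bit: the period-2 phase persists
print(logistic_pmi(r=3.99))  # near 0: chaos, no persistent choice
```

With finer symbol alphabets and longer blocks the same estimator recovers log p on deeper periodic windows, at the cost of the sampling problems discussed on the next slide.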

Issue of time windows and limits. [Figure: PMI/log 2 versus the lengths of the past, future, and "present" windows, at r = 3.58, where PMI/log 2 = 2. Short windows are contaminated by short-time correlations; long symbol strings are under-sampled.]

First direct measurements. [Figure: PMI/ln 2 measured as a function of the logistic-map parameter r.]

Discrete vs continuous emergent order parameters This suggests some need to anticipate “information dimensionalities”
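One way to read this slide (our gloss, not stated on it): a discrete broken symmetry contributes a finite PMI, whereas a continuous emergent order parameter contributes a resolution-dependent term whose coefficient is the dimensionality one must anticipate:

```latex
% Discrete order parameter with k equivalent ordered states:
\[ \mathrm{PMI} = \log k, \]
% whereas a continuous order parameter (e.g. an oscillation phase)
% observed at resolution \epsilon contributes
\[ \mathrm{PMI}(\epsilon) \sim d \,\log(1/\epsilon) + \text{const}, \]
% where d is the number of continuous persistent degrees of freedom,
% i.e. the "information dimensionality" of the emergent choice.
```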

A definition of Emergence. The system self-organises into a non-trivial behaviour; there are different possible instances of that behaviour; the choice among them is unpredictable, but it persists over time (or another extensive coordinate). Quantified by PMI = entropy of the choice.
Shortcomings:
Assumes the system/experiment is conceptually repeatable.
Measuring MI requires deep sampling.
The appropriate mathematical limits need careful construction.
Generalisations:
Admit PMI as a function of the timescale probed.
Other extensive coordinates could play the role of time.