Information Reading: Complexity: A Guided Tour, Chapter 3

Recap: Core disciplines of the science of complexity
– Dynamics: The study of continually changing structure and behavior of systems
– Information: The study of representation, symbols, and communication
– Computation: The study of how systems process information and act on the results
– Evolution: The study of how systems adapt to constantly changing environments

Information
Motivating questions:
– What are “order” and “disorder”?
– What are the laws governing these quantities?
– How do we define “information”?
– What is the “ontological status” of information?
– How is information signaled between two entities?
– How is information processed to produce “meaning”?

Energy, Work, and Entropy
– What is energy?
– What is entropy?
– What are the laws of thermodynamics?
– What is “the arrow of time”?

Maxwell’s Demon
James Clerk Maxwell, 1831–1879
See NetLogo simulation
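
The NetLogo model isn't reproduced here; below is a minimal Python sketch of the same idea (particle counts, the speed cutoff, and the door policy are illustrative assumptions, not taken from the NetLogo model). A door that selectively admits fast molecules to one side separates "hot" from "cold" without apparent work:

```python
import random
from statistics import mean

# Toy Maxwell's demon: molecules are just speeds drawn uniformly from [0, 1).
# The demon opens the door only to let fast molecules pass left -> right and
# slow molecules pass right -> left, sorting "hot" from "cold" for free.
random.seed(42)
left = [random.random() for _ in range(500)]
right = [random.random() for _ in range(500)]
FAST = 0.5  # cutoff between "slow" and "fast" (illustrative choice)

for _ in range(20_000):  # molecules arriving at the door at random
    if random.random() < 0.5 and left:
        i = random.randrange(len(left))
        if left[i] > FAST:           # fast molecule: admit it to the right
            right.append(left.pop(i))
    elif right:
        i = random.randrange(len(right))
        if right[i] <= FAST:         # slow molecule: admit it to the left
            left.append(right.pop(i))

print(f"mean speed, left  (cold side): {mean(left):.3f}")   # ~0.25
print(f"mean speed, right (hot side):  {mean(right):.3f}")  # ~0.75
```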

Szilard’s solution
Leo Szilard, 1898–1964

Bennett and Landauer’s solution
Charles Bennett, b. 1943
Rolf Landauer, 1927–1999
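
Their resolution rests on Landauer’s principle: erasing one bit of information dissipates at least k_B·T·ln(2) of energy, which charges the demon for resetting its memory. A quick back-of-the-envelope check (room temperature is an assumed value):

```python
import math

# Landauer's principle: erasing one bit dissipates at least k_B * T * ln(2).
K_B = 1.380649e-23   # Boltzmann constant, J/K
T = 300.0            # assumed room temperature, K

e_per_bit = K_B * T * math.log(2)
print(f"Landauer limit at {T:.0f} K: {e_per_bit:.2e} J per erased bit")
# ~2.87e-21 J: tiny but nonzero -- the erasure cost is what saves the 2nd law.
```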

Entropy/Information in Statistical Mechanics
Ludwig Boltzmann, 1844–1906
– What is “statistical mechanics”?
– Describe the concepts of “macrostate” and “microstate”.
Combinatorics of a slot machine
Possible fruits: apple, orange, cherry, pear, lemon
– Microstates
– Macrostates
– Macrostate: “Three identical fruits” – How many microstates?
– Macrostate: “Exactly one lemon” – How many microstates?
– Macrostate: “At least 2 cherries” – How many microstates?
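
A brute-force count of these microstates (a minimal sketch; the five fruits and three wheels follow the slide):

```python
from itertools import product

# Enumerate all microstates of a 3-wheel slot machine with 5 possible fruits.
FRUITS = ["apple", "orange", "cherry", "pear", "lemon"]
microstates = list(product(FRUITS, repeat=3))        # 5**3 = 125 microstates

# Count the microstates belonging to each macrostate from the slide.
three_identical = [m for m in microstates if len(set(m)) == 1]
one_lemon = [m for m in microstates if m.count("lemon") == 1]
two_plus_cherries = [m for m in microstates if m.count("cherry") >= 2]

print(len(microstates))        # 125
print(len(three_identical))    # 5
print(len(one_lemon))          # 48  (3 positions x 4x4 non-lemon fillers)
print(len(two_plus_cherries))  # 13  (3x4 with exactly two, + 1 with three)
```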

Boltzmann’s entropy, S
Boltzmann entropy of a macrostate: S = k ln(W), the natural logarithm of the number of microstates (W) corresponding to that macrostate (you can ignore the “k”).
– Aside: What is a “natural logarithm”?
– Examples from the slot machine.
Boltzmann entropy: the more microstates that give rise to a macrostate, the more probable that macrostate is. Thus high entropy = more probable macrostate.
Second Law of Thermodynamics (à la Boltzmann): Nature tends towards more probable macrostates.
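
Putting this together with the slot machine, a worked example of S = ln(W) (dropping k; the W values are the microstate counts from the sketch above):

```python
import math

# Boltzmann entropy with k dropped: S = ln(W), W = number of microstates.
for macrostate, W in [("three identical fruits", 5),
                      ("exactly one lemon", 48),
                      ("at least 2 cherries", 13)]:
    print(f"{macrostate:24s} W = {W:3d}  S = ln(W) = {math.log(W):.3f}")
# Larger W -> larger S -> more probable macrostate.
```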

Boltzmann’s entropy, S
What does this have to do with the “arrow of time”?

Quick review of logarithms
– log_10 (base 10), ln (natural log, base e), log_2 (base 2)
– Change of base: log_a(b) = log_10(b) / log_10(a) = log_n(b) / log_n(a) for any base n
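
A quick numerical check of the change-of-base rule using Python's standard library:

```python
import math

# log_2(8) computed three ways via the change-of-base rule.
x = 8
print(math.log2(x))                   # 3.0 (direct)
print(math.log(x) / math.log(2))      # 3.0 (via the natural log)
print(math.log10(x) / math.log10(2))  # 3.0 (via base-10, up to float rounding)
```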

Shannon Information / Entropy
Claude Shannon, 1916–2001
– What were his motivations for defining/studying information?
– What is a “message source”?

Shannon Information vs. Boltzmann Entropy
– Shannon information: H = -Σ_i p_i log_2(p_i), measured in “bits”. A message source has N “microstates” (or “messages”, e.g., words); p_i is the probability of message i.
– Boltzmann entropy: S = k ln(W), measured in units defined by k (often Joules per Kelvin).

Example message source 1 – Messages: {Da} (a single possible message)

Example message source 2 – Messages: {300 words}
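
A minimal sketch applying Shannon’s formula to the two example sources above (assuming the 300-word source is uniform, which the slide doesn't specify):

```python
import math

def shannon_entropy(probs):
    """Average information content in bits: H = sum of -p_i * log2(p_i)."""
    return sum(-p * math.log2(p) for p in probs if p > 0)

# Source 1: a single possible message ("Da") -- no surprise, no information.
print(shannon_entropy([1.0]))            # 0.0 bits

# Source 2: 300 equally likely words (uniform distribution assumed here).
print(shannon_entropy([1 / 300] * 300))  # log2(300) ~ 8.23 bits
```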

NetLogo Information-Content Lab

Go over Week 2 homework

Projects Schedule
– By Tuesday, October 18: Project “abstract” due
– By Thursday, October 20: Feedback from me
– Week of October 24: Present project abstract to class
– Month of November: Time in class for help on projects
– December 9: Final paper due

Brainstorming on projects