Information Reading: Complexity: A Guided Tour, Chapter 3.


Information Reading: Complexity: A Guided Tour, Chapter 3

Recap: Core disciplines of the science of complexity
– Dynamics: the study of continually changing structure and behavior of systems
– Information: the study of representation, symbols, and communication
– Computation: the study of how systems process information and act on the results
– Evolution: the study of how systems adapt to constantly changing environments

Information
Motivating questions:
– What are “order” and “disorder”?
– What are the laws governing these quantities?
– How do we define “information”?
– What is the “ontological status” of information?
– How is information signaled between two entities?
– How is information processed to produce “meaning”?

Energy, Work, and Entropy
– What is energy?
– What is entropy?
– What are the laws of thermodynamics?
– What is “the arrow of time”?

Maxwell’s Demon
James Clerk Maxwell, 1831−1879
See NetLogo simulation

Szilard’s solution
Leo Szilard, 1898−1964

Bennett and Landauer’s solution
Charles Bennett, b. 1943
Rolf Landauer, 1927–1999

Entropy/Information in Statistical Mechanics
What is “statistical mechanics”? Describe the concepts of “macrostate” and “microstate”.
Ludwig Boltzmann, 1844−1906

Entropy/Information in Statistical Mechanics
What is “statistical mechanics”? Describe the concepts of “macrostate” and “microstate”.
Combinatorics of a slot machine
Possible fruits: apple, orange, cherry, pear, lemon
– Microstates
– Macrostates
Macrostate: “Three identical fruits”. How many microstates?
Macrostate: “Exactly one lemon”. How many microstates?
Macrostate: “At least 2 cherries”. How many microstates?
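The three counting questions can be checked by brute force. A minimal sketch in Python, assuming a 3-window machine and the five fruits from the slide (the script itself is illustrative, not from the book):

```python
from itertools import product

# Each microstate is one assignment of a fruit to each of the 3 windows.
fruits = ["apple", "orange", "cherry", "pear", "lemon"]
microstates = list(product(fruits, repeat=3))  # 5**3 = 125 microstates in total

# Count the microstates belonging to each macrostate from the slide.
three_identical = sum(1 for m in microstates if len(set(m)) == 1)
exactly_one_lemon = sum(1 for m in microstates if m.count("lemon") == 1)
at_least_two_cherries = sum(1 for m in microstates if m.count("cherry") >= 2)

print(three_identical)        # 5
print(exactly_one_lemon)      # 48 (3 positions for the lemon x 4**2 other pairs)
print(at_least_two_cherries)  # 13 (3 x 4 ways with exactly two, plus 1 with three)
```

The counts agree with the closed-form combinatorics shown in the comments, which is the point of enumerating microstates explicitly.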

Boltzmann’s entropy, S
Boltzmann entropy of a macrostate: natural logarithm of the number of microstates (W) corresponding to that macrostate (you can ignore the “k”).
Aside: What is a “natural logarithm”?

Natural Log: the logarithm with base e (e ≈ 2.718)

Intuition about log_b x: “How many places (diversity) do I need, when working with b symbols, to represent the number x?”
log₁₀ 100 = 2
log₁₀ 1000 = 3
log₂ 16 = 4
log₂ 0.5 = −1 ?

Quick review of logarithms
log₁₀ (common log), ln (natural log, base e), log₂ (binary log)
Change of base: logₐ b = log₁₀ b / log₁₀ a = logₙ b / logₙ a, for any n
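The change-of-base identity is easy to check numerically; a quick illustrative sketch in Python (the helper name is ours, not from the slides):

```python
import math

def log_base(a, x):
    """log_a(x) via change of base: ln(x) / ln(a)."""
    return math.log(x) / math.log(a)

# The identity holds whichever intermediate base n is used:
print(log_base(2, 16))                 # ≈ 4.0 (via natural logs, n = e)
print(math.log10(16) / math.log10(2))  # ≈ 4.0 as well (n = 10)
print(log_base(2, 0.5))                # ≈ -1.0
print(log_base(10, 1000))              # ≈ 3.0
```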

Boltzmann’s entropy, S
Boltzmann entropy of a macrostate: natural logarithm of the number of microstates (W) corresponding to that macrostate (you can ignore the “k”).
Boltzmann entropy: the more microstates that give rise to a macrostate, the more probable that macrostate is. Thus high entropy = more probable macrostate.
Second Law of Thermodynamics (à la Boltzmann): nature tends towards more probable macrostates.

Boltzmann’s entropy, S
What does this have to do with the “arrow of time”?

Lord Kelvin: Heat Death of the Universe
“The result would inevitably be a state of universal rest and death, if the universe were finite and left to obey existing laws. But it is impossible to conceive a limit to the extent of matter in the universe; and therefore science points rather to an endless progress, through an endless space, of action involving the transformation of potential energy into palpable motion and hence into heat, than to a single finite mechanism, running down like a clock, and stopping for ever.”
From “On the Age of the Sun’s Heat” (1862)

Shannon Information / Entropy
Claude Shannon, 1916−2001
What were his motivations for defining/studying information?
What is a “message source”?

Shannon Information Vocab

Shannon Information Definitions

Shannon Information
Measured in “bits”. The message source has N “microstates” (or “messages”, e.g., words); pᵢ is the probability of message i:
H = −Σᵢ pᵢ log₂ pᵢ
Boltzmann Entropy
Measured in units defined by k (often Joules per Kelvin):
S = k ln W
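The two quantities can be computed side by side. A sketch, reusing the slot machine’s 125 microstates for W and setting k = 1 as the earlier slide suggests (illustrative, not from the book):

```python
import math

W = 125          # number of microstates, e.g. the 3-fruit slot machine
S = math.log(W)  # Boltzmann entropy with k = 1: S = ln W, about 4.83

# Shannon entropy of a source with W equally likely messages:
p = [1 / W] * W
H = -sum(p_i * math.log2(p_i) for p_i in p)  # = log2(W), about 6.97 bits

print(S, H)
```

With equally likely messages the two formulas differ only in the base of the logarithm (and the constant k), which is why Shannon’s H is also called an entropy.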

Messages: {Da}

Messages: {300 words}

Calculating Shannon Entropy
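A minimal sketch of the calculation in Python, applied to the two message sources above; the 300 words are assumed equally likely for illustration:

```python
import math

def shannon_entropy(probs):
    """H = -sum(p_i * log2(p_i)), in bits; zero-probability messages contribute nothing."""
    return sum(-p * math.log2(p) for p in probs if p > 0)

# A source that only ever says "Da": no uncertainty, zero information.
print(shannon_entropy([1.0]))  # 0.0

# 300 equally likely words: H = log2(300), about 8.23 bits per word.
print(shannon_entropy([1 / 300] * 300))
```

Unequal word probabilities would lower H below log₂(300), since predictable messages carry less information.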