Boltzmann, Shannon, and (Missing) Information

Second Law of Thermodynamics. Entropy of a gas. Entropy of a message. Information?

B.B. (Before Boltzmann): Carnot, Kelvin, Clausius (19th c.). Second Law of Thermodynamics: the entropy of an isolated system never decreases. Entropy defined in terms of heat exchange: change in entropy = (heat absorbed)/(absolute temperature), positive if heat is absorbed, negative if emitted. (Molecules unnecessary.)

Hot reservoir (Th), cold reservoir (Tc), heat Q flowing between them. Isolated system; has some structure (ordered). Heat Q extracted from hot, same amount absorbed by cold – energy conserved, 1st Law. Entropy of hot decreases by Q/Th; entropy of cold increases by Q/Tc > Q/Th, 2nd Law. In the fullness of time … lukewarm, no structure (no order).
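The entropy bookkeeping on this slide can be checked numerically. A minimal sketch; the heat Q and the two reservoir temperatures are invented illustrative values, not from the slide:

```python
# Entropy bookkeeping for heat Q flowing from a hot to a cold reservoir.
# Values are illustrative, not from the slides.

def entropy_changes(Q, T_hot, T_cold):
    """Return (dS_hot, dS_cold, dS_total) for heat Q moving hot -> cold."""
    dS_hot = -Q / T_hot      # hot reservoir emits Q: its entropy decreases
    dS_cold = Q / T_cold     # cold reservoir absorbs Q: its entropy increases
    return dS_hot, dS_cold, dS_hot + dS_cold

dS_hot, dS_cold, dS_total = entropy_changes(Q=100.0, T_hot=400.0, T_cold=200.0)
print(dS_hot, dS_cold, dS_total)  # -0.25 0.5 0.25 -- net entropy increases
```

Because Tc < Th, the cold side's gain Q/Tc always exceeds the hot side's loss Q/Th, so the total never comes out negative.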

Paul's entropy picture: the Sun releases heat Q at high temperature → its entropy decreases. Living stuff absorbs heat Q at lower temperature → a larger entropy increase. Stuff releases heat q, gets more organized → entropy decreases. Surroundings absorb q, get more disorganized → entropy increases. Overall, entropy increases.

The 2nd Law of Thermodynamics does not forbid the emergence of local complexity (e.g., life, brain, …). The 2nd Law of Thermodynamics does not require the emergence of local complexity (e.g., life, brain, …).

Boltzmann (1872): entropy of a dilute gas. N molecules obeying Newtonian physics (time reversible). The state of each molecule is given by its position and momentum. Molecules may collide – i.e., transfer energy and momentum among each other.

Represent the system in a space whose coordinates are positions and momenta p = mv (phase space). Subdivide this space into B bins. pk = fraction of particles whose positions and momenta are in bin k.

Build a histogram of the pk's. The pk's change because of: motion, collisions, external forces.
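One way to picture building the pk histogram. A sketch with made-up data: a one-dimensional gas, with particle states and bin edges invented purely for illustration:

```python
# Build the p_k histogram for a 1-D gas: each particle is (position, momentum).
# Phase space is cut into B bins along each axis; p_k = fraction in cell k.
import random

random.seed(0)
# Invented sample: positions in [0, 1), momenta in [-1, 1).
particles = [(random.uniform(0, 1), random.uniform(-1, 1)) for _ in range(1000)]

B = 4  # bins per axis, giving B*B phase-space cells
counts = {}
for x, p in particles:
    i = min(int(x * B), B - 1)               # position bin index
    j = min(int((p + 1) / 2 * B), B - 1)     # momentum bin index
    counts[(i, j)] = counts.get((i, j), 0) + 1

pk = {cell: n / len(particles) for cell, n in counts.items()}
print(sum(pk.values()))  # fractions over all occupied cells sum to 1
```

Motion, collisions, and external forces would move particles between cells, changing the pk's from one snapshot to the next.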

Given the pk’s, how much information do you need to locate a molecule in phase space? All in 1 bin – highly structured, highly ordered no missing information, no uncertainty. Uniformly distributed – unstructured, disordered, random. maximum uncertainty, maximum missing information. In-between case  intermediate amount of missing information (uncertainty). Any flattening of histogram (phase space landscape) increases uncertainty.

Boltzmann: the amount of uncertainty, or missing information, or randomness of the distribution of the pk's can be measured by HB = Σk pk log(pk).

The pk histogram revisited. All in 1 bin – highly structured, highly ordered: HB = 0 (maximum HB). Uniformly distributed – unstructured, disordered, random: HB = −log B (minimum HB). In-between case – an intermediate amount of missing information (uncertainty): an in-between value of HB.
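The three cases can be computed directly from the definition HB = Σk pk log(pk). A minimal sketch; the bin count B = 8 and the in-between distribution are invented for illustration:

```python
# Boltzmann's H for three p_k landscapes over B bins (natural log, as on the slide).
import math

def H_B(p):
    """H_B = sum_k p_k log(p_k); empty bins (p_k = 0) contribute nothing."""
    return sum(pk * math.log(pk) for pk in p if pk > 0)

B = 8
all_in_one = [1.0] + [0.0] * (B - 1)                    # fully ordered
uniform = [1.0 / B] * B                                  # fully random
in_between = [0.5, 0.25, 0.125, 0.125] + [0.0] * (B - 4) # something in between

print(H_B(all_in_one))   # 0.0            (maximum of H_B)
print(H_B(uniform))      # -log(8) ~ -2.08 (minimum of H_B)
print(H_B(in_between))   # an intermediate value
```

Flattening the histogram always pushes HB down toward −log B, which is the content of the next slide's H theorem.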

Boltzmann’s famous H theorem. Define: HB = Σk pk log(pk). Assume: molecules obey Newton’s laws of motion. Show: HB never increases. AHA! −HB never decreases: it behaves like entropy!! If it looks like a duck … Identify entropy with −HB: S = −kB HB, where kB is Boltzmann’s constant.

New version of the Second Law: the phase-space landscape either does not change or becomes flatter. Life? It may peak locally, provided it flattens overall.

Two “paradoxes.” 1. Reversal (Loschmidt). Irreversible phenomena (the 2nd Law, the arrow of time) emerge from reversible molecular dynamics. (How can this be? – cf. Tony Rothman.)

2. Recurrence (Zermelo, Poincaré). Sooner or later, you are back where you started. (So, what does approach to equilibrium mean?) Graphic from: J. P. Crutchfield et al., “Chaos,” Sci. Am., Dec. 1986.

Well … Interpret the H theorem probabilistically. Boltzmann’s treatment of collisions is really probabilistic … molecular chaos, coarse-graining, indeterminacy – anticipating quantum mechanics? Entropy is the probability of a macrostate – is it something that emerges in the transition from the micro to the macro? The Poincaré recurrence time is really very, very long for real systems – longer than the age of the universe, even. Anyhow, entropy does not decrease! … on to Shannon.

A.B. (After Boltzmann): Shannon (1949). Entropy of a message. A message is encoded in an alphabet of B symbols, e.g., an English sentence (26 letters + space + punctuation), Morse code (dot, dash, space), DNA (A, T, G, C). pk = fraction of the time that symbol k occurs (~ probability that symbol k occurs).

Pick a symbol – any symbol … Shannon’s problem: want a quantity that measures missing information – how much information is needed to establish what the symbol is, or the uncertainty about what the symbol is, or how many yes–no questions need to be asked to establish what the symbol is. Shannon’s answer: HS = −k Σk pk log(pk), a positive number.

Morse code example: All dots: p1 = 1, p2 = p3 = 0. Take any symbol – it’s a dot; no uncertainty, no question needed, no missing information: HS = 0. 50–50 chance that it’s a dot or a dash: p1 = p2 = ½, p3 = 0. Given the p’s, need to ask one question (what question?) – one piece of missing information: HS = log(2) ≈ 0.69. Random: all symbols equally likely, p1 = p2 = p3 = 1/3. Given the p’s, need to ask as many as 2 questions – 2 pieces of missing information: HS = log(3) ≈ 1.1.
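The three Morse numbers above follow directly from the formula. A minimal sketch, taking k = 1 and the natural log, matching the slide's values:

```python
# Shannon's H_S = -k * sum_k p_k log(p_k) for the slide's three Morse-code cases.
import math

def H_S(p, k=1.0):
    """Missing information for symbol probabilities p; zero-probability symbols drop out."""
    return -k * sum(pk * math.log(pk) for pk in p if pk > 0)

print(H_S([1.0, 0.0, 0.0]))      # 0.0 -- all dots: no uncertainty
print(H_S([0.5, 0.5, 0.0]))      # log 2 ~ 0.69 -- dot or dash: one question
print(H_S([1/3, 1/3, 1/3]))      # log 3 ~ 1.1 -- fully random: up to two questions
```

With log base 2 instead of the natural log, the answers come out directly in yes–no questions (bits): 0, 1, and log2(3) ≈ 1.58.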

Two comments: 1. It looks like a duck … but does it quack? There is no H theorem for Shannon’s HS. 2. HS is insensitive to meaning. Shannon: “[The] semantic aspects of communication are irrelevant to the engineering problem.”

On H theorems. Q: What did Boltzmann have that Shannon didn’t? A: Newton (or equivalent dynamical rules for the evolution of the pk’s). Does Shannon have rules for how the pk’s evolve? In a communications system, the pk’s may change because of transmission errors. In genetics, is it mutation? Is the result always a flattening of the pk landscape, i.e., an increase in missing information? Is Shannon’s HS just a metaphor? What about Maxwell’s demon?

On dynamical rules. Is a neuron like a refrigerator? The entropy of the fridge decreases. The entropy of the signal decreases.

The entropy of a refrigerator may decrease, but it needs electricity. The entropy of the message passing through a neuron may decrease, but it needs nutrients. General Electric designs refrigerators. Who designs neurons?

Insensitive to meaning: Morse revisited. X = {.... . .-.. .-.. --- .-- --- .-. .-.. -..} = H E L L O W O R L D. Y = {.- -... -.-. -.. . ..-. --. .... .. .--- -.-} = A B C D E F G H I J K. Same pk’s, same entropies – same “missing information.”

If X and Y are separately scrambled – still the same pk’s, the same “missing information” – the same entropy. So the message is in the sequence? What do geneticists say? Is information-as-entropy not a very useful way to characterize the genetic code?
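The scrambling point is easy to demonstrate: an entropy computed from symbol frequencies cannot change under any reordering. A minimal sketch; the Morse string and the three-symbol alphabet (dot, dash, space) mirror the slide's example:

```python
# Frequency-based entropy of a Morse message is unchanged by scrambling:
# H_S depends only on the p_k's, not on symbol order -- so it misses "meaning".
import math
import random
from collections import Counter

def message_entropy(msg, alphabet=".- "):
    """H_S (k = 1, natural log) from the symbol frequencies of msg."""
    counts = Counter(ch for ch in msg if ch in alphabet)
    n = sum(counts.values())
    return -sum(c / n * math.log(c / n) for c in counts.values())

hello = ".... . .-.. .-.. ---   .-- --- .-. .-.. -.."  # HELLO WORLD

random.seed(1)
symbols = list(hello)
random.shuffle(symbols)            # destroy the sequence, keep the frequencies
scrambled = "".join(symbols)

print(math.isclose(message_entropy(hello), message_entropy(scrambled)))  # True
```

The scrambled string carries no recoverable words, yet its "missing information" is identical, which is exactly the slide's complaint about entropy as a measure of the genetic code.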

Do Boltzmann and Shannon mix? Boltzmann’s entropy of a gas: SB = −kB Σk pk log pk. kB relates temperature to energy (E ~ kBT) and relates the temperature of a gas to PV (PV = NkBT). Shannon’s entropy of a message: SS = −k Σk pk log pk, where k is some positive constant – no reason for it to be kB. Does SB + SS mean anything? Does the sum never decrease? Can an increase in one make up for a decrease in the other?

Maxwell’s demon yet once more. The demon measures the velocity of a molecule by bouncing light off it and absorbing the reflected light; the process transfers energy to the demon and increases the demon’s entropy – making up for the entropy decrease of the gas.