PHYS 172: Modern Mechanics Lecture 22 – Entropy and Temperature Read 12.3 - 12.5 Summer 2012.


Clicker Question: Consider the energy principle, ΔE_system = W + Q. Choose the correct statement: A. If Q is added to the system, the system temperature must increase. B. If Q is added, the temperature must increase if W = 0. C. If Q is added, the temperature might not change at all. D. None of the above.

First, an important reminder about multiplying probabilities: A deck of 52 cards has 4 aces. Starting from a randomly- shuffled deck, what is the probability that an ace is the first card dealt? Once this happens, what is the probability that the second card is an ace? So, what is the probability that the first two cards are aces?
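The deck calculation above can be checked with a short script. This is an illustrative sketch (the function and variable names are my own, not from the lecture); it uses exact fractions so no rounding hides the answer.

```python
# Probability that the first two cards dealt from a shuffled 52-card deck
# are both aces: multiply the probability for each successive card.
from fractions import Fraction

p_first = Fraction(4, 52)   # 4 aces among 52 cards
p_second = Fraction(3, 51)  # 3 aces left among the remaining 51 cards
p_both = p_first * p_second

print(p_first)   # 1/13
print(p_second)  # 1/17
print(p_both)    # 1/221
```

The key point is the multiplication: the second probability is conditional on the first ace already having been dealt, which is why it is 3/51 rather than 4/52.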

Last Time: Einstein Model of Solids. The number of microstates for N oscillators and q quanta is Ω = (q + N − 1)! / (q! (N − 1)!). Fundamental assumption of statistical mechanics: over time, an isolated system in a given macrostate (total energy) is equally likely to be found in any of its microstates (microscopic distributions of energy).

These show the 15 different microstates where 4 quanta of energy are distributed among 3 oscillators (1 atom). The Fundamental Assumption of Statistical Mechanics A fundamental assumption of statistical mechanics is that in our state of macroscopic ignorance, each microstate (microscopic distribution of energy) corresponding to a given macrostate (total energy) is equally probable.

Macrostate = 4 quanta of energy in 2 atoms (3 oscillators each). "Microstates" = q1 quanta in atom 1, q2 quanta in atom 2:

q1 = 0, q2 = 4: Ω1 = 1,  Ω2 = 15, Ω1Ω2 = 15
q1 = 1, q2 = 3: Ω1 = 3,  Ω2 = 10, Ω1Ω2 = 30
q1 = 2, q2 = 2: Ω1 = 6,  Ω2 = 6,  Ω1Ω2 = 36
q1 = 3, q2 = 1: Ω1 = 10, Ω2 = 3,  Ω1Ω2 = 30
q1 = 4, q2 = 0: Ω1 = 15, Ω2 = 1,  Ω1Ω2 = 15
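These microstate counts can be reproduced with a short script, using the Einstein-solid formula Ω(N, q) = (q + N − 1)! / (q! (N − 1)!) from earlier in the lecture. The function names are my own choice for illustration.

```python
# Count microstates of an Einstein solid: Omega(N, q) = C(q + N - 1, q).
# Tabulate how 4 quanta split between two atoms of 3 oscillators each.
from math import comb

def omega(N, q):
    """Number of ways to distribute q quanta among N oscillators."""
    return comb(q + N - 1, q)

total = 0
for q1 in range(5):
    q2 = 4 - q1
    w = omega(3, q1) * omega(3, q2)
    total += w
    print(q1, q2, omega(3, q1), omega(3, q2), w)

print(total)        # 126 microstates in all
print(omega(6, 4))  # 126 again: same count treating all 6 oscillators at once
```

The final check is a useful consistency test: summing Ω1Ω2 over all ways of splitting the 4 quanta must equal the multiplicity of the combined 6-oscillator system.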

Fundamental Assumption of Statistical Mechanics: over time, an isolated system in a given macrostate is equally likely to be found in any of its microstates. Thus, our system of 2 atoms is most likely to be in a microstate where the energy is split 50/50: the probabilities of the five macrostates, from q1 = 0 to q1 = 4, are 12%, 24%, 29%, 24%, and 12%.

Interacting Blocks. Now let two blocks share 100 quanta, so q2 = 100 − q1. As quanta are transferred from one block to the other, the system transitions from one macrostate to another. Each of these macrostates has a different number of microstates Ω1Ω2, and for blocks of many oscillators these counts are astronomically large (the table entries are of order 10^88).

Two Important Observations. 1. The most likely circumstance is that the energy is divided in proportion to the number of oscillators in each object: the most likely state has 60% of the quanta in the block with 60% of the oscillators. 2. As the number of oscillators gets large, the curve gets narrower, until essentially only that one circumstance is ever observed.

Thermal equilibrium of two blocks in contact: the probability distribution over macrostates becomes extremely narrow as the numbers increase!
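Both observations can be checked numerically for the two blocks used later in the lecture (300 and 200 oscillators sharing 100 quanta). This is a sketch with my own function names; exact integer arithmetic keeps the enormous counts honest.

```python
# Two Einstein solids in contact: block 1 has 300 oscillators, block 2 has
# 200, and together they share 100 quanta. Find the most probable macrostate.
from math import comb

def omega(N, q):
    return comb(q + N - 1, q)

counts = [omega(300, q1) * omega(200, 100 - q1) for q1 in range(101)]
q1_most_likely = counts.index(max(counts))
print(q1_most_likely)  # 60: the block with 60% of the oscillators holds 60% of the quanta

# The peak is sharp: a 30/70 split is over a million times less likely
# than the 60/40 split.
print(counts[60] / counts[30])
```

Already with only 500 oscillators, macrostates even modestly away from the proportional split are vanishingly improbable; for macroscopic blocks (10^23 oscillators) the narrowing is incomparably more extreme.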

Definition of Entropy. It is convenient to deal with lnΩ instead of Ω for several reasons: 1. Ω is typically huge, and lnΩ is a much smaller number to deal with. 2. dΩ/dE is usually huge, whereas d(lnΩ)/dE is manageable. 3. Ω12 = Ω1Ω2 (multiplies), whereas lnΩ12 = lnΩ1 + lnΩ2 (adds). This will make the connection between entropy and temperature easier to see later. We therefore define the entropy S ≡ k lnΩ, where Boltzmann's constant is k = 1.38 × 10⁻²³ J/K.
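As a concrete check of the definition S = k lnΩ, here is the entropy of the 15-microstate example from earlier (3 oscillators, 4 quanta). Function and variable names are mine, not the lecture's.

```python
# Entropy of an Einstein solid, S = k * ln(Omega), for the earlier example
# of N = 3 oscillators sharing q = 4 quanta (Omega = 15 microstates).
from math import comb, log

k_B = 1.38e-23  # Boltzmann's constant, J/K

def entropy(N, q):
    omega = comb(q + N - 1, q)
    return k_B * log(omega)

S = entropy(3, 4)
print(S)  # k * ln(15), about 3.7e-23 J/K
```

Note how the logarithm tames the numbers: even when Ω is of order 10^88, lnΩ is only about 200, so S stays a manageable multiple of k.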

We will see that in equilibrium, the most probable distribution of energy quanta is the one that maximizes the total entropy (the Second Law of Thermodynamics). Note the much smaller numbers involved once we take the logarithm.

Why entropy? Because of a property of logarithms: ln(Ω1Ω2) = lnΩ1 + lnΩ2. When there is more than one "object" in a system, we can therefore add entropies just like we add energies: S_total = S1 + S2, E_total = E1 + E2.

Approaching Equilibrium. Assume the system starts in a macrostate A far away from equilibrium (the most likely state). Ω_A is "small," and hence S_A = k lnΩ_A is small. What happens as the system evolves?

If the initial state is not the most probable, energy is exchanged until the most probable distribution is reached. NOTE: This is a random, statistical process, consistent with the fundamental assumption of statistical mechanics that all microstates are equally likely.

Approaching Equilibrium. Assume the system starts in a macrostate A far away from equilibrium. Ω_A is "small," and hence S_A = k lnΩ_A is small. As the system evolves (moves from one microstate to the next), it is MUCH more likely to end up in macrostate B, where Ω_B is "large" and hence S_B = k lnΩ_B is large. From a purely statistical view, we expect that the entropy of the system will increase. Note: the number of microstates for macrostate B is MUCH larger than the number of microstates for macrostate A.
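Just how lopsided the odds are can be seen by comparing multiplicities directly. As an illustrative example (my own choice of states, using the 300/200-oscillator blocks with 100 shared quanta): take macrostate A to be all quanta in the smaller block, and macrostate B the most likely 60/40 split.

```python
# Compare a far-from-equilibrium macrostate A (all 100 quanta in the
# 200-oscillator block) with the most likely macrostate B (60 quanta in
# the 300-oscillator block, 40 in the other).
from math import comb

def omega(N, q):
    return comb(q + N - 1, q)

omega_A = omega(300, 0) * omega(200, 100)   # q1 = 0
omega_B = omega(300, 60) * omega(200, 40)   # q1 = 60

print(omega_B / omega_A)  # an enormous ratio: B is overwhelmingly more likely
```

Since every microstate is equally likely, the system wandering at random among microstates is this many times more likely to be found in B than in A; that is all "approaching equilibrium" means.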

Second Law of Thermodynamics. A mathematically sophisticated treatment of these ideas was developed by Boltzmann in the late 1800s. The result of such considerations, confirmed by all experiments, is THE SECOND LAW OF THERMODYNAMICS: if a closed system is not in equilibrium, the most probable consequence is that the entropy of the system will increase. By "most probable" we don't mean just "slightly more probable," but rather "so ridiculously probable that essentially no other possibility is ever observed." Hence, even though it is a statistical statement, we call it the "2nd Law" and not the "2nd Likelihood."

Lecture question: This logarithmic plot shows the number of ways 100 quanta can be distributed between box 1 (300 oscillators) and box 2 (200 oscillators). Look at the end where q1 = 0. If you had to guess, which box is hotter? A. Box 1 (300 oscillators, q1 = 0). B. Box 2 (200 oscillators, q2 = 100).

Note: the macrostate of highest entropy S is the most likely macrostate (equilibrium). Remember the 2nd Law! At equilibrium, the two slopes dS1/dE1 and dS2/dE2 are equal. What else is equal here?

The temperature of the two objects that are in contact and have come to equilibrium. This leads to a natural guess for a useful definition of temperature: 1/T ≡ dS/dE_int. Remember: the box is hotter when the slope is smaller. We use E_int instead of q1 to be more general.
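This definition can be tested numerically on the clicker question above. The sketch below (my own function names; energy measured in units of one quantum and entropy in units of k, so "temperature" comes out in units of one quantum per k) estimates 1/T = dS/dE by a forward difference: add one quantum and see how much lnΩ changes.

```python
# Numerical 1/T = dS/dE for the two boxes of the lecture question
# (box 1: 300 oscillators, box 2: 200 oscillators, 100 quanta total),
# evaluated at the end of the plot where q1 = 0.
from math import comb, log

def ln_omega(N, q):
    return log(comb(q + N - 1, q))

def temperature(N, q):
    # Forward difference: slope of S (in units of k) per quantum of energy.
    dS_dE = ln_omega(N, q + 1) - ln_omega(N, q)
    return 1.0 / dS_dE

T1 = temperature(300, 0)    # box 1 holds no quanta
T2 = temperature(200, 100)  # box 2 holds all 100 quanta
print(T1, T2)  # T2 > T1: box 2 has the smaller slope dS/dE, so it is hotter
```

Box 2 comes out hotter, as expected: it holds all the energy, and adding one more quantum barely increases its (already large) lnΩ, giving a small slope and hence a high temperature.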

SUMMARY: 1. We learned to count the number of microstates in a certain macrostate, using the Einstein model of a solid. 2. We defined ENTROPY. 3. We observed that the 2nd Law of Thermodynamics requires interacting systems to move toward their most likely macrostate (equilibrium reflects statistics!); this explains irreversibility. 4. We defined TEMPERATURE to correspond to the observed behavior of systems moving into equilibrium, using entropy as a guide (plus the 2nd Law and statistics).