

Lecture 8, p 1

Lecture 8: The Second Law of Thermodynamics; Energy Exchange
- The second law of thermodynamics
- Statistics of energy exchange
- General definition of temperature
- Why heat flows from hot to cold

Reading for this Lecture: Elements Ch 7
Reading for Lecture 10: Elements Ch 8

Lecture 8, p 2

Last Time
- Counting microstates of combined systems
- Volume exchange between systems
- Definition of entropy and its role in equilibrium

Lecture 8, p 3

Quick Review: The Scope of Thermodynamics
- When an isolated system can explore some number, Ω, of microstates, the microstates are equally likely.
- The probability that you observe macrostate A is: P_A = Ω_A / Ω_All, i.e., the fraction of all the microstates that look like A.

Entropy:
- The entropy of a system is σ ≡ ln(Ω), where Ω counts all possible states. Therefore, P_A is proportional to e^(σ_A), since σ_A = ln(Ω_A).
- For a big system, the entropy of the most likely macrostate, σ_A, is not much less than the total entropy, σ_All.
  Example: 10^6 spins:
      σ_All = ln(2^(10^6)) = 10^6 ln(2) ≈ 6.93 × 10^5
      σ_(5×10^5 up) = ln(10^6! / [(5×10^5)! (5×10^5)!]) ≈ 6.93 × 10^5

Thermodynamics applies to systems that are big enough so that this is true.

Question: What is the probability that in a system of 10^6 spins, exactly 5×10^5 will be pointing up?
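The question above can be answered numerically. Here is a minimal Python sketch (an added check, not part of the original slides; `prob_half_up` is an illustrative name) that computes P = C(N, N/2) / 2^N in log space, since the factorials overflow a float for N = 10^6:

```python
# Probability that exactly half of N spins point up: P = C(N, N/2) / 2^N.
# For N = 10^6 the factorials are astronomically large, so work with
# log-factorials via math.lgamma instead of computing them directly.
import math

def prob_half_up(N):
    """P(exactly N/2 spins up) out of N equally likely spins, in log space."""
    log_binom = math.lgamma(N + 1) - 2 * math.lgamma(N // 2 + 1)
    return math.exp(log_binom - N * math.log(2))

N = 10**6
print(prob_half_up(N))               # ~8.0e-4
print(math.sqrt(2 / (math.pi * N)))  # Stirling estimate, also ~8.0e-4
```

So the single most likely macrostate is itself quite improbable (about 0.08%), even though the macrostates very near it carry essentially all of the probability.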

Lecture 8, p 4

The Second Law of Thermodynamics
The entropy of an isolated system can only increase (or remain constant) as a function of time.

This is a consequence of probability.
- Macroscopic systems have very sharply peaked probability distributions (much sharper than shown here).
- If x is initially far from its most likely position, x_e, then it will evolve toward x_e (where most of the microstates are).
- If x is initially near its most likely position, x_e, then it will remain there. Ω has its maximum value.
- All available microstates are equally likely, but in big systems the vast majority of them correspond to very similar macrostates (e.g., about 50% spin up).

[Figure: Ω (or σ) vs. the macroscopic quantity x (e.g., position of partition), sharply peaked at x_e]

Lecture 8, p 5

Lessons from Volume Exchange
You learned last lecture:
- When the system consists of independent parts, the number of states of the whole is the product of the number of states of the parts.
- We define entropy to be σ = ln(# microstates).
- For a big system in equilibrium, we almost certainly see the macrostate that maximizes Ω_TOT (or σ_TOT).
- To determine equilibrium, maximize the total entropy by maximizing the sum of the entropies of the parts.
- If the parts can exchange volume: in equilibrium, each part must have the same derivative of its entropy with respect to its volume.
- This argument doesn't rely on the parts being the same (or even similar).

Now use the same principle for systems that exchange ENERGY.

Lecture 8, p 6

Model System for Energy Exchange: Simple Harmonic Oscillator (SHO)
To make the mathematics simple we use a system with discrete, equally spaced energy levels,
    E_n = n·ε, where n = 0, 1, 2, … (quantum #)
These are the energy levels for a mass on a spring. This system was studied in P214. All you need to know is the energy-level formula (E_n = n·ε).

The SHO is an exact description of photons, and a very good description of vibrations of atoms in solids (= "phonons").

Our notation convention:
    E = energy of a single oscillator
    U = internal energy of a multi-oscillator system

Note to 214 folks: We drop the ε/2 zero-point energy for convenience, since only energy differences matter.

[Figure: mass m on a spring; ladder of equally spaced levels n = 0, 1, 2, 3, 4]

Lecture 8, p 7

Terminology for HW, Lab, etc.
1-D Einstein Solid: A collection of N oscillators in 1 dimension.
3-D Einstein Solid: A collection of N atoms, each oscillating in 3 dimensions: 3N oscillators.
We'll assume that each oscillator has the same frequency (i.e., we ignore the frequency dependence of the "normal modes").

Lecture 8, p 8

ACT 1: Energy Exchange
Two oscillators exchange energy. The total energy is U = 4ε. That is: E_1 = n_1·ε, E_2 = n_2·ε, where n_1 + n_2 = 4.
In general we say U = qε, where q is the number of energy quanta. Here, q = 4.

What is the total number of microstates?
A) 4   B) 5   C) 6   D) 7   E) 8

[Figure: energy ladders for Osc 1 and Osc 2; here's one microstate.]

Lecture 8, p 9

Solution
Two oscillators exchange energy. The total energy is U = 4ε. That is: E_1 = n_1·ε, E_2 = n_2·ε, where n_1 + n_2 = 4. In general we say U = qε, where q is the number of energy quanta. Here, q = 4.

What is the total number of microstates?
A) 4   B) 5   C) 6   D) 7   E) 8

The answer is B) 5: n_1 can be 0, 1, 2, 3, or 4, with n_2 = 4 - n_1. (For two oscillators, Ω = q + 1.)
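This count is small enough to check by brute force. A short Python sketch (an added check, not from the slides):

```python
# ACT 1 check: two oscillators sharing q = 4 quanta.
# A microstate is the pair (n1, n2) with n1 + n2 = 4.
q = 4
microstates = [(n1, q - n1) for n1 in range(q + 1)]
print(microstates)       # [(0, 4), (1, 3), (2, 2), (3, 1), (4, 0)]
print(len(microstates))  # 5, i.e. q + 1
```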

Lecture 8, p 10

Exercise: Energy Exchange Between 3 Oscillators
Three oscillators have total energy U = 2ε (q = 2).
Find Ω. The oscillator quantum numbers n, l, m are integers, with n + l + m = 2.

[Figure: energy ladders for oscillators #1, #2, #3]

Lecture 8, p 11

Solution
Three oscillators have total energy U = 2ε (q = 2). Find Ω. The quantum numbers n, l, m are integers, with n + l + m = 2.

Grouped by the energy of oscillator #1:
    E_1 = 0:  (0,2,0), (0,0,2), (0,1,1)   -- 3 microstates
    E_1 = 1ε: (1,1,0), (1,0,1)            -- 2 microstates
    E_1 = 2ε: (2,0,0)                     -- 1 microstate

Ω = 6

Lecture 8, p 12

Probability of Observing Energy E_n
What is the probability that oscillator #1 has energy 0, 1ε, or 2ε? Look at the solution to the exercise.
    Probability that oscillator #1 has energy 0:   P_0 = 3/6 = 1/2
    Probability that oscillator #1 has energy 1ε:  P_1 = 2/6 = 1/3
    Probability that oscillator #1 has energy 2ε:  P_2 = 1/6

Why does P_n decrease as E_n increases? This is an important general feature of energy distributions in thermal equilibrium. We'll discuss it more next week.

[Figure: bar chart of P_n vs. E_n = 0, 1ε, 2ε]
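These probabilities follow purely from counting, so they can be verified by enumerating the microstates. A small Python sketch (an added check, standard library only):

```python
# Enumerate all microstates of N = 3 oscillators sharing q = 2 quanta,
# then read off the probability that oscillator #1 holds n quanta.
from fractions import Fraction
from itertools import product

q, N = 2, 3
states = [s for s in product(range(q + 1), repeat=N) if sum(s) == q]
omega = len(states)
probs = [Fraction(sum(1 for s in states if s[0] == n), omega)
         for n in range(q + 1)]
print(omega)  # 6
print(probs)  # [1/2, 1/3, 1/6] (as Fraction objects)
```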

Lecture 8, p 13

ACT 2: Average Energies
In an example with 2ε in 3 SHOs, what would be the average thermal energy U_1 in the first SHO?
A) 0   B) ε/3   C) 2ε/3   D) ε   E) 2ε

Lecture 8, p 14

Solution
In an example with 2ε in 3 SHOs, what would be the average thermal energy U_1 in the first SHO?
A) 0   B) ε/3   C) 2ε/3   D) ε   E) 2ε

We can calculate it using the probabilities:
    ⟨U_1⟩ = P(E_0)·E_0 + P(E_1)·E_1 + P(E_2)·E_2 = (1/2)·0 + (1/3)·ε + (1/6)·2ε = (2/3)ε

Of course, there's an easier way: all three oscillators are the same, so each must have, on average, 1/3 of the total energy. As in the previous example, the most likely energy (0) is not the average energy.

Lecture 8, p 15

Home Exercise
For a system of 3 oscillators with U = 3ε, plot P_n, the probability that oscillator #1 has energy E_n = n·ε.
Can you state in words why P_n decreases with increasing E_n? It is very important that you understand this.

[Figure: empty axes for P_n (0 to 1) vs. E_n = 0, 1ε, 2ε, 3ε]

Lecture 8, p 16

Relief from Counting
The general formula for the number of microstates in a system of N oscillators sharing q energy quanta:
    Ω = (q + N - 1)! / [q! (N - 1)!]
We'll call it the "q-formula".

This is the same as the formula for N identical particles in M bins:*
    Particles        Energy
    N particles      q energy quanta
    M bins           N oscillators
The q energy quanta are identical (indistinguishable) "particles". The N oscillators are multiple-occupancy "bins" for those quanta.

Beware!!! N means different things in the two situations. (Sorry for the notation.)
*See the derivation in last lecture's Appendix.
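The q-formula is just a binomial coefficient, C(q + N - 1, q), so it is one line of Python. A sketch (an added check; `omega` is an illustrative name) that reproduces the counts from the earlier examples:

```python
# q-formula: Omega(N, q) = (q + N - 1)! / (q! (N - 1)!) = C(q + N - 1, q)
from math import comb

def omega(N, q):
    """Number of microstates of N oscillators sharing q quanta."""
    return comb(q + N - 1, q)

print(omega(2, 4))  # 5  -- matches ACT 1 (two oscillators, q = 4)
print(omega(3, 2))  # 6  -- matches the three-oscillator exercise
print(omega(3, 3))  # 10
```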

Lecture 8, p 17

How Ω Depends on U
Use the q-formula: Ω = (q + N - 1)! / [q! (N - 1)!]
- For a 2-SHO system, N = 2: Ω = q + 1.
- For N = 3: Ω = (q + 2)(q + 1)/2. Here's Ω for q = 0, 1, 2, 3, 4:
      q:  0  1  2   3   4
      Ω:  1  3  6  10  15
  When q is large, Ω ≈ q²/2, i.e., Ω ∝ q².
- In general, Ω ∝ U^(N-1) for N oscillators when q >> N (or, U >> Nε).

Note: This happens when the average energy per oscillator, U/N, is much larger than the energy spacing, ε. Remember: U = qε.

Lecture 8, p 18

Thermal Equilibrium with Energy Exchange
Assume the systems are large (N >> 1), so that we can take derivatives. Calculate U_eq by maximizing σ_tot as we vary U_1, keeping U_1 + U_2 = U_tot:
    dσ_tot/dU_1 = ∂σ_1/∂U_1 - ∂σ_2/∂U_2 = 0
    ⟹ ∂σ_1/∂U_1 = ∂σ_2/∂U_2 at U_1 = U_eq

This expression for thermal equilibrium was obtained by maximizing the entropy subject to a constraint. The result is very similar to the one for volume exchange (see last lecture's summary slide). Its form results from the sharing of some quantity among systems. We'll see other examples.

Note: We are holding V constant. That's the reason for the partial derivative. σ is a function of both U and V.

[Figure: σ_1, σ_2, and σ_tot vs. U_1 (from 0 to U_tot); σ_tot peaks at U_1 = U_eq]
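The same maximization can be done numerically with the exact counting formula, no derivatives needed. A sketch (the system sizes are made-up illustrations, not from the lecture): scan every split of the quanta between two Einstein solids and pick the split with the largest total entropy. Since σ ≈ N ln U here, the equal-slope condition predicts U_1eq/U_tot = N_1/(N_1 + N_2).

```python
# Two Einstein solids with N1 and N2 oscillators share q_tot quanta.
# Find the split q1 that maximizes sigma_tot = ln(Omega1) + ln(Omega2).
from math import comb, log

def sigma(N, q):
    """Dimensionless entropy ln(Omega) of N oscillators with q quanta."""
    return log(comb(q + N - 1, q))

N1, N2, q_tot = 300, 200, 1000
q1_eq = max(range(q_tot + 1),
            key=lambda q1: sigma(N1, q1) + sigma(N2, q_tot - q1))
print(q1_eq)                    # within one quantum of the prediction below
print(q_tot * N1 // (N1 + N2))  # 600, the N1/(N1 + N2) prediction
```

The tiny offset between the discrete maximum and the continuum prediction shrinks as the systems grow, which is why the slide assumes N >> 1 before taking derivatives.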

Lecture 8, p 19

General Definition of Temperature
Let's use the energy-sharing result to write a more general definition of temperature that does not assume equipartition, which (as we'll see soon) is not always valid. Recall that equipartition is needed for the ideal gas law, pV = NkT, so the new definition will free us from relying on ideal gases to measure temperature.

Start with the energy-sharing result. In thermal equilibrium:
    ∂σ_1/∂U_1 = ∂σ_2/∂U_2

Define the absolute temperature T:
    1/(kT) ≡ (∂σ/∂U)_V, i.e., 1/T ≡ (∂S/∂U)_V with S = kσ

By definition, T_1 = T_2 in equilibrium. That's nice, but to be useful it must give other results that we all know and love. For example:
- Out of equilibrium, heat flows from hot to cold.
- Under everyday conditions (large objects, not near absolute zero) we have equipartition of energy: U = ½kT for each energy mode.

Lecture 8, p 20

Act 2: Heat Flow
Consider the entropies of two systems in thermal contact. System 1 starts with internal energy U_1i > U_eq.

1. Which direction will energy flow?
   A) From 1 to 2   B) From 2 to 1   C) Neither (no flow)
2. Which system has a higher initial temperature?
   A) System 1   B) System 2   C) Neither (equal temperatures)

[Figure: σ_1, σ_2, and σ_tot vs. U_1, with U_1i to the right of U_eq]

Lecture 8, p 21

Solution
Consider the entropies of two systems in thermal contact. System 1 starts with internal energy U_1i > U_eq.

1. Which direction will energy flow?
   A) From 1 to 2   B) From 2 to 1   C) Neither (no flow)

The answer is A. The two systems will evolve toward the white dots, as shown: U_1 decreases. Heat flows until σ_1 + σ_2 is a maximum.

[Figure: σ_1, σ_2, and σ_tot vs. U_1; the system moves from U_1i toward U_eq]

Lecture 8, p 22

Solution
Consider the entropies of two systems in thermal contact. System 1 starts with internal energy U_1i > U_eq.

1. Which direction will energy flow?
   A) From 1 to 2   B) From 2 to 1   C) Neither (no flow)
2. Which system has a higher initial temperature?
   A) System 1   B) System 2   C) Neither (equal temperatures)

The answer to 2 is A. The temperature is inversely proportional to the slope (dσ/dU). Therefore (look at the graph), T_1i > T_2i.

[Figure: σ_1, σ_2, and σ_tot vs. U_1, with U_1i > U_eq]

Lecture 8, p 23

Why Heat Flow from Hot to Cold Is Irreversible
Isolated systems evolve toward the macrostate that has maximum total entropy (maximum probability). Almost all the probability is very near the equilibrium macrostate, so once equilibrium is reached, it is very unlikely that the system will go anywhere else. The age of the universe is short compared to how long you'd have to wait for the energy to re-separate!

If we prepare an isolated system out of equilibrium, the system will irreversibly adjust itself to equilibrium:
    T_1 > T_2  →  T_1 = T_2

Lecture 8, p 24

What About Equipartition?
Equipartition holds for everyday phenomena. In this situation, U >> Nε. That is, each oscillator (atom, or something similar) has many energy quanta, because q = U/ε >> N. Large objects also have N >> 1, so N - 1 ≈ N.

We want to verify that our new definition of temperature agrees with the usual one (which is based on the ideal gas law) in the everyday regime. In particular, we want to verify that equipartition holds, in which case the ideal gas formulas for specific heat, etc. will work out. Let's consider harmonic oscillators, because we know how to do the calculation.

Recall: Ω ∝ U^(N-1) when q >> N.
Therefore: σ = ln Ω = (N - 1) ln U + const ≈ N ln U + const,
using the above approximations (q >> N and N - 1 ≈ N).
The constant contains (among other things) the volume dependence.

[Figure: σ vs. U]

Lecture 8, p 25

Equipartition (2)
Now calculate the temperature of N oscillators that have internal energy U:
    1/(kT) = ∂σ/∂U = ∂(N ln U + const)/∂U = N/U
The constant doesn't depend on U, so its partial derivative is zero.
Thus, U = NkT. Equipartition holds if q >> N.

Here's a more practical way to write the q >> N condition: U = qε >> Nε, or U/N >> ε. Therefore, kT = U/N >> ε. Equipartition holds if the thermal energy, kT, is much larger than the energy spacing, ε.

Equipartition fails at low temperatures. We'll study this phenomenon later.
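The relation U = NkT can be checked against the exact counting formula. A sketch (an added check, in units where ε = 1 and k = 1, so U = q): take a finite-difference derivative of σ = ln Ω and compare kT = (∂σ/∂U)⁻¹ with U/N.

```python
# Verify U ≈ N k T for an Einstein solid deep in the q >> N regime.
# Units: eps = 1 and k = 1, so U = q and kT = 1 / (dsigma/dU).
from math import comb, log

def sigma(N, q):
    """Dimensionless entropy ln(Omega) of N oscillators with q quanta."""
    return log(comb(q + N - 1, q))

N, q = 200, 200_000                        # q >> N >> 1: classical regime
dsigma_dU = sigma(N, q + 1) - sigma(N, q)  # finite difference, dU = 1 quantum
kT = 1 / dsigma_dU
print(kT, q / N)  # ~1005 vs 1000: kT ≈ U/N, i.e. U ≈ N k T
```

Shrinking q toward N makes the discrepancy grow: that is the onset of the equipartition failure mentioned on this slide.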

Lecture 8, p 26

Next Week
- The Boltzmann factor (and applications)
- What happens when equipartition fails?