Lecture 9 Pg 1
Midterm coming up… Monday April 7 pm (conflict 5:15 pm)
Covers: Lectures 1-12 (not including thermal radiation), HW 1-4, Discussion 1-4, Labs 1-2
Review Session: Sunday April 8, 3-5 PM, 141 Loomis

Lecture 9 Pg 2
Lecture 9: Examples and Problems
- Counting microstates of combined systems
- Volume exchange between systems
- Definition of entropy and its role in equilibrium
- The second law of thermodynamics
- Statistics of energy exchange
- General definition of temperature
- Why heat flows from hot to cold

Lecture 9 Pg 3
Exercise: Microstates and Entropy
Pretend we are playing a coin-tossing game. We will toss 100 pennies (all distinguishable) into the air, and there is an equal chance they will land on the table or on the floor.
What is the probability that all 100 pennies will land on the table?
What is the dimensionless entropy associated with the situation in which 70 pennies land on the table and 30 land on the floor (accounting for the fact that each penny can land heads or tails, but ignoring the various places on the table or floor where each penny can land)?

Lecture 9 Pg 4
Exercise: Microstates and Entropy (solution)
We toss 100 distinguishable pennies; each is equally likely to land on the table or on the floor.

P(100 on table) = (1/2)^100 ≈ 7.9 × 10^-31

The 70/30 split is one macrostate, and we want to know how many microstates correspond to it. There are three parts:
1. how many ways to have a 70/30 split: 100!/(70!30!)
2. how many ways for the 70 on the table to be configured (heads or tails): 2^70
3. how many ways for the 30 on the floor to be configured: 2^30
Therefore: Ω(70 on table) = [100!/(70!30!)] × 2^70 × 2^30 = [100!/(70!30!)] × 2^100
σ = ln Ω = 100 ln 2 + ln(100!) − ln(70!) − ln(30!) ≈ 130
{We'll see later that ln x! ≈ x ln x − x}
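A minimal Python sketch (not part of the original slides) reproduces these numbers with the standard library. The exact σ is ≈ 128; the slide's ≈ 130 comes from the Stirling approximation ln x! ≈ x ln x − x.

```python
import math

N, n_table = 100, 70

# Probability that all 100 distinguishable pennies land on the table.
p_all = 0.5 ** N
print(f"P(100 on table) = {p_all:.1e}")            # ~7.9e-31

# Microstates of the 70/30 macrostate:
# (which 70 are on the table) x (heads/tails of each of the 100 pennies)
omega = math.comb(N, n_table) * 2 ** N
sigma_exact = math.log(omega)

# The slide's estimate uses the Stirling approximation ln x! ~ x ln x - x.
stirling = lambda x: x * math.log(x) - x
sigma_stirling = (N * math.log(2) + stirling(N)
                  - stirling(n_table) - stirling(N - n_table))

print(f"sigma (exact)    = {sigma_exact:.0f}")      # ~128
print(f"sigma (Stirling) = {sigma_stirling:.0f}")   # ~130, as on the slide
```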

Lecture 9 Pg 5
Summary of Bin Counting
Number of microstates for N objects in M bins:

                    Unlimited occupancy       Single occupancy        N << M (dilute gas)
Distinguishable     M^N                       M!/(M-N)!               ≈ M^N
Identical           (N+M-1)!/[N!(M-1)!]       M!/[N!(M-N)!]           ≈ M^N/N!

The single-occupancy counting is needed at high densities (liquids and solids) and is OK for gases, too. The dilute-gas approximation is OK only at low densities (gases).
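The entries in the table are standard counting results. The sketch below (mine, not from the slides) writes them as small Python functions and checks that the dilute-gas approximation agrees with the exact identical-particle counts when N << M.

```python
import math

def distinguishable_unlimited(N, M):
    return M ** N

def distinguishable_single(N, M):
    return math.factorial(M) // math.factorial(M - N)

def identical_unlimited(N, M):
    return math.comb(N + M - 1, N)

def identical_single(N, M):
    return math.comb(M, N)

def dilute_identical(N, M):
    # N << M approximation: M^N / N!
    return M ** N / math.factorial(N)

N, M = 3, 1000   # dilute regime: N << M
print(identical_single(N, M), identical_unlimited(N, M), round(dilute_identical(N, M)))
# 166167000  167167000  166666667  -- all agree to better than 1%
```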

Lecture 9 Pg 6
Exercise: Microstates and Entropy
When the numbers are small and the whole distribution matters, the average and the most likely value may not be very close. Consider 3 particles (A, B, C) in a system with 6 multiple-occupancy cells partitioned by a movable barrier. Take N_L = 1 (particle A) and N_R = 2 (particles B and C).
1) Calculate the entropy for each partition position.
2) What are the most likely and average partition positions?
Note: We are interested in the volumes on the left and right, V_L and V_R, so we'll express our answers in terms of them. V_L + V_R = 6.
[Figure: some possible microstates, with particle A to the left of the barrier and particles B, C to the right]

Lecture 9 Pg 7
Worksheet for this problem
(1 particle in the multiple-occupancy bins on the left, 2 distinguishable particles on the right)

V_L    Ω_L    Ω_R    Ω = Ω_L Ω_R    σ = ln(Ω)
1
2
3
4
5

Lecture 9 Pg 8
Solution
(1 particle in the multiple-occupancy bins on the left, 2 distinguishable particles on the right)

V_L    Ω_L    Ω_R    Ω = Ω_L Ω_R    σ = ln(Ω)
1      1      25     25             3.22
2      2      16     32             3.47
3      3      9      27             3.30
4      4      4      16             2.77
5      5      1      5              1.61

Ω_tot = 105
The most likely V_L = 2.
The average <V_L> = (1·25 + 2·32 + 3·27 + 4·16 + 5·5)/105 = 259/105 ≈ 2.47
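For readers who want to reproduce the table, here is a minimal Python sketch (not from the slides): one particle on the left gives Ω_L = V_L, and two distinguishable particles on the right give Ω_R = V_R².

```python
import math

total_cells = 6
table = []
for V_L in range(1, total_cells):           # V_L = 1..5
    V_R = total_cells - V_L
    omega_L = V_L                            # 1 particle, V_L multiple-occupancy cells
    omega_R = V_R ** 2                       # 2 distinguishable particles, V_R cells
    omega = omega_L * omega_R
    table.append((V_L, omega_L, omega_R, omega, math.log(omega)))

omega_tot = sum(row[3] for row in table)
most_likely = max(table, key=lambda row: row[3])[0]
average = sum(row[0] * row[3] for row in table) / omega_tot

print(omega_tot, most_likely, round(average, 2))   # 105  2  2.47
```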

Lecture 9 Pg 9
Graph the Results
[Plots of Ω and of σ_L, σ_R versus V_L]
Notice that the most likely position occurs where dσ_L/dV_L = −dσ_R/dV_L (the slopes cancel).
What is the probability, P(2), that V_L = 2?

Lecture 9 Pg 10
Graph the Results
[Plots of Ω and of σ_L, σ_R versus V_L]
Most probable V_L = 2: σ = ln(32) = 3.47
P(2) = Ω(2)/Ω_tot = 32/105 ≈ 0.30
Again, the most likely position occurs where dσ_L/dV_L = −dσ_R/dV_L (the slopes cancel).

Lecture 9 Pg 11
Maximizing the Total Entropy
If the partition is allowed to move freely, the most likely macrostate occurs at maximum total entropy, σ_tot = σ_L + σ_R. This is equivalent to maximizing Ω.
This corresponds to dσ_L/dV_L = −dσ_R/dV_L (the slopes cancel), not to σ_L = σ_R.
Entropy will be more convenient to calculate than Ω.
[Graph: σ_L, σ_R, and σ_L + σ_R versus V_L; the maximum of σ_L + σ_R marks the most likely value of V_L]

Lecture 9 Pg 12
Summary
- The total entropy of an isolated system is maximum in equilibrium.
- So if two parts (1 and 2) can exchange V, equilibrium requires dσ_1/dV_1 = dσ_2/dV_2. This is a general equilibrium condition. A similar relation holds for any exchanged quantity.

Entropy of an ideal gas:
For N distinguishable particles in volume V: Ω ∝ V^N, so σ = N ln V + const.
You can't calculate the constant (that requires quantum mechanics), but it drops out of problems where one only needs the entropy change. For example, if the temperature is constant:
σ_f − σ_i = N ln V_f − N ln V_i = N ln(V_f/V_i)
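To make the Δσ = N ln(V_f/V_i) bookkeeping concrete, here is a minimal Python sketch (not from the slides; the function name delta_sigma is just illustrative) that evaluates it for the two Acts that follow.

```python
import math

N_A = 6.022e23                      # particles in 1 mole

def delta_sigma(N, V_i, V_f):
    """Dimensionless entropy change of N ideal-gas particles at constant T."""
    return N * math.log(V_f / V_i)

print(delta_sigma(N_A, 2.0, 1.0))   # isothermal compression, 2 L -> 1 L:  ~ -4.2e23
print(delta_sigma(N_A, 1.0, 2.0))   # free expansion, 1 L -> 2 L:          ~ +4.2e23
```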

Lecture 9 Pg 13
Act 1: Isothermal Compression
We isothermally compress 1 mole of O2 from 2 liters to 1 liter.
1. How much does the (dimensionless) entropy of the gas change?
   A) 0    B) 6×10^23 ln(2)    C) −6×10^23 ln(2)
2. How much does the (dimensionless) entropy of the environment change?
   A) 0    B) 6×10^23 ln(2)    C) −6×10^23 ln(2)

Lecture 9 Pg 14
Solution
We isothermally compress 1 mole of O2 from 2 liters to 1 liter.
1. How much does the (dimensionless) entropy of the gas change?
   There are half as many places for each gas particle: Ω_i ∝ V^N → Ω_f ∝ (V/2)^N
   σ_f − σ_i = (N ln V_f + const) − (N ln V_i + const) = N ln(V_f/V_i) = N_A ln(1/2) = −6×10^23 ln(2).  Answer: C
2. How much does the (dimensionless) entropy of the environment change?
   According to the 2nd Law, the entropy of an isolated system must stay the same or increase. Considering the (gas + environment) as the 'system', the reduction in σ_gas must be matched by an increase in σ_env: +6×10^23 ln(2).  Answer: B
We will see later that for quasi-static isothermal processes, Δσ_total = 0 (i.e., they are reversible!).

Lecture 9 Pg 15
Act 2: Free Expansion
Now we let the gas freely expand back to 2 liters.
How much does the total entropy change?
A) 0    B) 6×10^23 ln(2)    C) −6×10^23 ln(2)

Lecture 9 Pg 16
Solution
Now we let the gas freely expand back to 2 liters.
There are twice as many places for each gas particle: Ω_i ∝ V^N → Ω_f ∝ (2V)^N
σ_f − σ_i = (N ln V_f + const) − (N ln V_i + const) = N ln(V_f/V_i) = N_A ln(2) = +6×10^23 ln(2).  Answer: B
This is an isothermal process (why?), but not quasi-static, i.e., the gas is not in equilibrium throughout the expansion. In fact, the gas does not interact with the environment at all, so σ_env doesn't change. Because this is an irreversible expansion (the particles will never all randomly go back into the 1-liter box!), σ_tot increases.

Lecture 9 Pg 17 Act 3

Lecture 9 Pg 18
Solution (Gibbs Paradox)
[The slide compares the initial entropy and the final entropy.]
What did we do wrong?

Lecture 9 Pg 19
Gibbs Paradox (2)
So far we have dealt with situations where the number of particles in each subsystem is fixed. Here, after the partition is removed, particles can move freely between the two halves. To calculate the change in entropy, we must account for the fact that the particles are identical.
[The slide compares the initial entropy and the final entropy.]

Lecture 9 Pg 20
Exercise
Consider a ~1 liter container of N2 at STP. What fractional increase in the volume of the box V will double the number of microstates?
1) What is N?
2) What is Ω?
3) What is Ω_f/Ω_i?

Lecture 9 Pg 21
Exercise (solution)
Consider a ~1 liter container of N2 at STP. What fractional increase in the volume of the box V will double the number of microstates?
1) What is N?
   At STP, 1 mole of a gas occupies 22.4 liters. Therefore, in 1 liter there are N = 1 liter × (6×10^23 / 22.4 liter) = 2.7×10^22 molecules.
2) What is Ω?
   Ω = (n_T V)^N
3) What is Ω_f/Ω_i?
   Ω_f/Ω_i = 2 = (n_T V_f)^N / (n_T V_i)^N = (V_f/V_i)^N
   V_f/V_i = 2^(1/N) = 2^(3.7×10^-23) = 1.0 according to my calculator, ≈ 1 + 2.6×10^-23 according to Mathematica.
The lesson: the number of microstates increases mind-bogglingly fast – that's why the gas fills the space.
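A quick numerical sketch (mine, not from the slides) of part 3, showing why a calculator returns 1.0 while the honest answer is 1 + ln 2/N:

```python
import math

N = 2.7e22                              # N2 molecules in ~1 liter at STP
naive = 2.0 ** (1.0 / N)                # too close to 1 for double precision
fractional_increase = math.log(2) / N   # (V_f - V_i)/V_i from expanding 2**(1/N)

print(naive)                  # 1.0  (the "calculator" answer)
print(fractional_increase)    # ~2.6e-23 (the part the calculator drops)
```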

Lecture 9 Pg 22
Exercise: Probability P(E_n)
For a system of 3 oscillators with U = 3ε, plot P(E_n), the probability that oscillator #1 has energy E_n = nε.
[Axes: P_n versus E_n = 0, 1ε, 2ε, 3ε]

Lecture 9 Pg 23
Solution
For a system of 3 oscillators with U = 3ε, plot P(E_n), the probability that oscillator #1 has energy E_n = nε.

E_1    E_2 + E_3    # of microstates of #2 and #3    P_n
0      3ε           4                                0.4
1ε     2ε           3                                0.3
2ε     1ε           2                                0.2
3ε     0            1                                0.1

The probability decreases as the energy in oscillator #1 increases, because the more energy it has, the fewer microstates are available to oscillators #2 and #3. Oscillator #1 has only one microstate no matter what its energy is. That's a feature of very simple systems (no internal structure).
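A brute-force check (my own sketch, not from the slides) that enumerates the 10 microstates and reproduces the table:

```python
from itertools import product
from collections import Counter

N_osc, q_total = 3, 3
# All ways to distribute 3 quanta among 3 oscillators (10 microstates).
microstates = [s for s in product(range(q_total + 1), repeat=N_osc)
               if sum(s) == q_total]

counts = Counter(state[0] for state in microstates)    # quanta in oscillator #1
total = len(microstates)                               # 10

for n in range(q_total + 1):
    print(f"P(E_1 = {n} eps) = {counts[n]}/{total} = {counts[n] / total:.1f}")
# 0.4, 0.3, 0.2, 0.1
```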

Lecture 9 Pg 24
Example: Exchanging Energy
Two oscillators, A and B ("solid 1"), with energy level spacing ε share a total of 6 quanta. One possible microstate is shown in the figure below. They are brought into contact with four oscillators, C, D, E, and F ("solid 2"), which initially share no quanta.
1) Initially, what is the entropy of solid 1? Of solid 2?
2) If solid 2 is brought into contact with solid 1, what is the total entropy after equilibrium is reached?
3) What is the average energy in oscillator A, before and after?
[Figure: energy-level diagrams of oscillators A-B (solid 1) and C-F (solid 2)]

Lecture 9 Pg 25
Solution
Two oscillators, A and B ("solid 1"), with energy level spacing ε share a total of 6 quanta. They are brought into contact with four oscillators, C, D, E, and F ("solid 2"), which initially share no quanta.
3) Before: 2 oscillators share 6ε, so <E_A> = 3ε. After: 6 oscillators share 6ε, so <E_A> = 1ε.
The entropies in parts 1) and 2) follow from counting microstates of q quanta shared among N oscillators, Ω(N, q) = (q + N − 1)!/[q!(N − 1)!] (the "identical, unlimited occupancy" entry in the bin-counting table); a sketch of the numbers is below.
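A minimal Python sketch (not from the slides) that evaluates parts 1)-3) with that counting formula:

```python
import math

def omega(N, q):
    """Microstates of q quanta shared among N oscillators."""
    return math.comb(q + N - 1, q)

# Before contact: solid 1 (A, B) has all 6 quanta; solid 2 (C..F) has none.
sigma1_before = math.log(omega(2, 6))        # ln 7   ~ 1.9
sigma2_before = math.log(omega(4, 0))        # ln 1   = 0

# After contact: 6 quanta shared among all 6 oscillators.
sigma_after = math.log(omega(6, 6))          # ln 462 ~ 6.1

print(sigma1_before, sigma2_before, sigma_after)

# Average energy of oscillator A, in units of epsilon:
print(6 / 2, 6 / 6)                          # 3.0 before, 1.0 after
```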

Lecture 9 Pg 26
Entropy, T, and C_V
A conventional entropy, S ≡ kσ, is often used. Our basic definition of T is then 1/T = (dS/dU) at fixed V.
S has dimensions of energy/temperature (J/K).
For fixed-V processes: dU = C_V dT
So: dS = C_V dT / T
Or: S(T_f) − S(T_i) = ∫ C_V dT / T (integrated from T_i to T_f)
One can use this to calculate absolute entropy (no arbitrary constants).
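As a rough illustration of dS = C_V dT/T (my own sketch, not from the slides), the following numerically integrates C_V/T for a hypothetical constant C_V, with 20.8 J/K chosen as roughly the value for one mole of a diatomic gas, and compares with the closed form C_V ln(T_f/T_i):

```python
import math

C_V = 20.8            # J/K, assumed constant over this range (hypothetical)
T_i, T_f = 300.0, 600.0

# Midpoint-rule integration of dS = C_V dT / T
steps = 100_000
dT = (T_f - T_i) / steps
dS_numeric = sum(C_V / (T_i + (i + 0.5) * dT) * dT for i in range(steps))

dS_exact = C_V * math.log(T_f / T_i)
print(dS_numeric, dS_exact)    # both ~14.4 J/K
```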

Lecture 9 Pg 27
FYI: Absolute Entropy
There is a natural method to determine absolute entropy (no arbitrary constants).
- At T = 0, an object in equilibrium has only one accessible state (maybe two, if the lowest energy level is degenerate), so the entropy is ~0 at T = 0.
- As T → 0, the entropy falls smoothly to that limit. This is sometimes called the "Third Law of Thermodynamics".
- So the absolute S(T) can be calculated by setting S(0) = 0 and integrating C_V dT/T from 0 to T. This integral does not diverge, because C_V → 0 as T → 0.
- Absolute entropies have been computed for many substances. For example, S_O2(T = 25 °C, p = 1 atm) ≈ 205 J/mol·K.

Lecture 9 Pg 28
Exercise: A Large Collection of Oscillators
Consider 10^24 oscillators at 300 K with ε << kT.
1) What's C_V?
2) If 1 J of U is added, how much does T go up?
3) How much does that 1 J make Ω go up?

Lecture 9 Pg 29
Solution
Consider 10^24 oscillators at 300 K with ε << kT.
1) Equipartition holds, so C_V = Nk = 13.8 J/K.
2) ΔT = ΔU / C_V = 0.07 K
3) dσ/dU = 1/kT, and T is almost constant here, so:
   Δσ ≈ ΔU/kT = 1 J / (4.2×10^-21 J) = 2.4×10^20
   Ω goes up by the exponential of this, a gigantic number.
   Note: ΔS = kΔσ = ΔU/T ≈ 3.3×10^-3 J/K
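A back-of-the-envelope check of the numbers above (a sketch, assuming N = 10^24 oscillators, consistent with C_V = Nk = 13.8 J/K):

```python
k_B = 1.38e-23       # J/K
N = 1e24             # oscillators (assumed, from C_V = Nk = 13.8 J/K)
T = 300.0            # K
dU = 1.0             # J added

C_V = N * k_B                    # equipartition: k per oscillator
dT = dU / C_V                    # temperature rise
d_sigma = dU / (k_B * T)         # d(sigma)/dU = 1/kT, T nearly constant
dS = k_B * d_sigma               # conventional entropy change = dU/T

print(C_V, dT, d_sigma, dS)      # 13.8 J/K, ~0.07 K, ~2.4e20, ~3.3e-3 J/K
```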

Lecture 9 Pg 30
Example: Energy Exchange
Consider this situation: N_1 = 3, N_2 = 4, q = q_1 + q_2 = 8 (i.e., U = 8ε).
What is the most likely energy for each of the two systems?
[Figure: two collections of oscillators with energies U_1 and U_2 in thermal contact]

Lecture 9 Pg 31
Solution
Consider this situation: N_1 = 3, N_2 = 4, q = q_1 + q_2 = 8 (i.e., U = 8ε).
What is the most likely energy for each of the two systems?
I am not going to work it out in detail; here's a graph of the solution.
Sample calculation for one macrostate: q_1 = 1, q_2 = 7, using Ω(N, q) = (q + N − 1)!/[q!(N − 1)!] and q_1 = U_1/ε.
[Graph: number of states Ω_1(U_1), Ω_2(U − U_1), and Ω = Ω_1 Ω_2 versus U_1, showing the most likely U_1]
Most likely macrostate: U_1/N_1 = 3ε/3 = 1ε, while U_2/N_2 = 5ε/4 = 1.25ε. The energy per oscillator is not equal, i.e., not equipartition, because N is not large.
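Since the slide only graphs the result, here is a short sketch (mine, not from the slides) that tabulates Ω_1(q_1)·Ω_2(q − q_1) and confirms the peak at q_1 = 3:

```python
import math

def omega(N, q):
    """Microstates of q quanta shared among N oscillators."""
    return math.comb(q + N - 1, q)

N1, N2, q = 3, 4, 8
for q1 in range(q + 1):
    q2 = q - q1
    print(q1, omega(N1, q1), omega(N2, q2), omega(N1, q1) * omega(N2, q2))
# The product peaks at q1 = 3 (Omega = 560), i.e. U1 = 3*eps and U2 = 5*eps.
```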