1
Chapter 2: The Second Law. We start with combinatorics, probability, and multiplicity:
- Combinatorics and probability
- The two-state paramagnet and the Einstein solid
- Multiplicity of a macrostate; the concept of entropy
- Directionality of thermal processes (irreversibility): the equilibrium macrostate is overwhelmingly probable
2
Combinatorics and probability. Combinatorics is the branch of mathematics studying the enumeration, combination, and permutation of sets of elements and the mathematical relations that characterize their properties. Probability is the branch of mathematics that studies the possible outcomes of given events together with the outcomes' relative likelihoods and distributions; in common usage, the word "probability" means the chance that a particular event (or set of events) will occur. Examples: random walk, two-state systems, ...
3
Probability. An event (very loosely defined) is any possible outcome of some measurement. An event is a statistical (random) quantity if the probability of its occurrence, P, in the process of measurement is < 1. The "sum" of two events: in the process of measurement, we observe either one of the events. Addition rule for mutually exclusive events: P(i or j) = P(i) + P(j). The "product" of two events: in the process of measurement, we observe both events. Multiplication rule for independent events (one event does not change the probability of occurrence of the other): P(i and j) = P(i) × P(j). Example: what is the probability of the same face appearing on two successive throws of a die? The probability of any specific combination, e.g. (1,1), is (1/6)×(1/6) = 1/36 (multiplication rule); hence, by the addition rule, P(same face) = P(1,1) + P(2,2) + ... + P(6,6) = 6 × 1/36 = 1/6. The expectation value of a macroscopic observable A is the average over all accessible microstates: ⟨A⟩ = Σᵢ Aᵢ Pᵢ.
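The two-dice example above can be checked exactly with a short enumeration (a quick sketch using only the Python standard library; the variable names are mine):

```python
from fractions import Fraction
from itertools import product

# Exact probability of the same face on two successive die throws,
# built from the multiplication rule (independent throws) and the
# addition rule (mutually exclusive specific outcomes).
p_same = sum(Fraction(1, 6) * Fraction(1, 6)
             for a, b in product(range(1, 7), repeat=2) if a == b)
print(p_same)  # 1/6
```

Enumerating all 36 equally likely pairs and keeping the 6 matching ones reproduces the addition-over-multiplication argument in the text.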
4
Two model systems with fixed positions of particles and discrete energy levels. The models are attractive because they can be described in terms of discrete microstates that can be easily counted (for a continuum of microstates, as in the example of a freely moving particle, we still need to learn how to do this). This simplifies the calculation of the multiplicity and of averages. On the other hand, the results will be applicable to many other, more complicated models. Despite the simplicity of the models, they describe a number of experimental systems in a surprisingly precise manner.
- the two-state paramagnet ("limited" energy spectrum)
- the Einstein model of a solid ("unlimited" energy spectrum)
5
The Two-State Paramagnet — a system of N non-interacting magnetic dipoles in an external magnetic field B; each dipole can have only two possible orientations along the field, either parallel or anti-parallel to this axis (e.g., a particle with spin ½). There are no "quadratic" degrees of freedom (unlike in an ideal gas, where the kinetic energies of molecules are unlimited): the energy spectrum of each particle is confined within a finite interval of E (just two allowed energy levels). Let μ be the magnetic moment of an individual dipole (spin). The energy of a single dipole in the external magnetic field (with an arbitrary choice of zero energy): E₁ = −μB for μ parallel to B, E₂ = +μB for μ anti-parallel to B. A particular microstate (↑↓↓↑...) is specified if the directions of all spins are specified. A macrostate is specified by the total number of dipoles that point "up", N↑ (the number of dipoles that point "down" is N↓ = N − N↑). The total magnetic moment (a macroscopic observable): M = μ(N↑ − N↓). The energy of a macrostate: U = μB(N↓ − N↑) = μB(N − 2N↑).
6
Example. Consider two spins. There are four possible microstates: ↑↑, ↑↓, ↓↑, ↓↓, with total moments M = 2μ, 0, 0, −2μ. In zero field, all these microstates have the same energy (degeneracy). Note that the two microstates with M = 0 have the same energy even when B ≠ 0: they belong to the same macrostate, which has multiplicity Ω = 2. The macrostates can be classified by their moment M and multiplicity Ω: M = 2μ, 0, −2μ with Ω = 1, 2, 1. For three spins, the microstates run from ↑↑↑ (M = 3μ) down to ↓↓↓ (M = −3μ); the macrostates are M = 3μ, μ, −μ, −3μ with Ω = 1, 3, 3, 1.
7
The Multiplicity of the Two-State Paramagnet. Each microstate is characterized by N numbers; the number of equally probable microstates is 2^N, so the probability of being in a particular microstate is 1/2^N. For a two-state paramagnet in zero field, the energy of all macrostates is the same (0). A macrostate is specified by (N, N↑). Its multiplicity is the number of ways of choosing N↑ objects out of N:
Ω(N, N↑) = N! / (N↑! N↓!) = N! / (N↑! (N − N↑)!)
(Here n! = 1·2·...·n, and 0! ≡ 1 — there is exactly one way to arrange zero objects.)
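The binomial-coefficient multiplicity is easy to verify numerically (a minimal sketch; the function name is mine):

```python
from math import comb

def paramagnet_multiplicity(N, n_up):
    """Number of microstates of an N-spin two-state paramagnet
    with n_up spins pointing up: Omega = C(N, n_up)."""
    return comb(N, n_up)

# Two and three spins reproduce the macrostate tables above:
print([paramagnet_multiplicity(2, k) for k in range(3)])  # [1, 2, 1]
print([paramagnet_multiplicity(3, k) for k in range(4)])  # [1, 3, 3, 1]
# Summing over all macrostates recovers the 2^N microstates:
print(sum(paramagnet_multiplicity(10, k) for k in range(11)))  # 1024
```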
8
Stirling's Approximation for N! (N ≫ 1). The multiplicity depends on N!, and we need an approximation for ln(N!):
ln N! ≈ N ln N − N, i.e. N! ≈ N^N e^(−N).
More accurately: N! ≈ √(2πN) · N^N e^(−N), so ln N! ≈ N ln N − N + ½ ln(2πN) ≈ N ln N − N, because ln N ≪ N for large N.
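A quick numerical check of both forms of the approximation against the exact ln(N!) (computed via the log-gamma function, so no huge integers are needed; function names are mine):

```python
from math import lgamma, log, pi

def ln_factorial(N):
    """Exact ln(N!) = lgamma(N + 1)."""
    return lgamma(N + 1)

def stirling(N):
    """Crude form: ln N! ~ N ln N - N."""
    return N * log(N) - N

def stirling_improved(N):
    """With the sqrt(2*pi*N) correction term."""
    return N * log(N) - N + 0.5 * log(2 * pi * N)

for N in (10, 100, 10_000):
    print(N, ln_factorial(N), stirling(N), stirling_improved(N))
# The relative error of N ln N - N vanishes as N grows, and the
# correction term is already tiny compared with N ln N.
```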
9
The Probability of Macrostates of a Two-State Paramagnet (B = 0): P(N, N↑) = Ω(N, N↑)/2^N. [Figure: binomial histograms of P(N, N↑) for N = 1, 15, and 10²³ — see http://stat-www.berkeley.edu/~stark/Java/Html/BinHist.htm.] As the system becomes larger, the P(N, N↑) graph becomes more sharply peaked: for N = 1, Ω(1, N↑) = 1 and P(1, N↑) = 0.5, while for N ~ 10²³ the peak at N↑ = N/2 is extremely narrow. Random orientation of the spins in B = 0 is overwhelmingly more probable — the 2nd law!
10
Multiplicity (Entropy) and Disorder. In general, we can say that small multiplicity implies "order", while large multiplicity implies "disorder". An arrangement with large Ω can be achieved by a random process with much greater probability than an arrangement with small Ω.
11
The Einstein Model of a Solid. In 1907, Einstein proposed a model that reasonably predicted the thermal behavior of crystalline solids (a 3D bed-spring model): a crystalline solid containing N atoms behaves as if it contained 3N identical independent quantum harmonic oscillators, each of which can store an integer number nᵢ of energy units ε = ħω. (We can treat a 3D harmonic oscillator as if it were oscillating independently in 1D along each of the three axes.) All oscillators are identical, so the energy quanta are the same. The solid's internal energy is the zero-point energy plus the quanta: U = (3N/2)ħω + ε Σᵢ nᵢ; since the zero-point term does not depend on the microstate, the effective internal energy is U = ε Σᵢ nᵢ.
12
The Einstein Model of a Solid (cont.). At high temperature, k_B T ≫ ħω (the classical limit of large nᵢ), the heat capacity approaches the Dulong–Petit rule, dU/dT = 3Nk_B ≈ 25 J/(K·mole):

solid      dU/dT, J/(K·mole)
Lead       26.4
Gold       25.4
Silver     25.4
Copper     24.5
Iron       25.0
Aluminum   26.4

To describe a macrostate of an Einstein solid, we have to specify N and U; a microstate requires nᵢ for all 3N oscillators. Example: the "macrostates" of an Einstein solid with only one atom (three oscillators): Ω(1 atom, q=0) = 1, Ω(1,1) = 3, Ω(1,2) = 6, Ω(1,3) = 10.
13
The Multiplicity of an Einstein Solid. The multiplicity of a state of N oscillators (N/3 atoms) with q energy quanta distributed among these oscillators:
Ω(N, q) = (q + N − 1)! / (q! (N − 1)!) = C(q + N − 1, q)
In terms of the total internal energy U = qε: Ω(N, U) = (U/ε + N − 1)! / ((U/ε)! (N − 1)!).
Proof: represent the N oscillators schematically as q dots (quanta) separated by N − 1 lines (partitions between oscillators) — q + N − 1 symbols in total. For given q and N, the multiplicity is the number of ways of choosing q of the symbols to be dots, q.e.d.
Example: the multiplicity of an Einstein solid with three atoms (nine oscillators) and eight units of energy shared among them is Ω(9, 8) = 16!/(8! 8!) = 12,870.
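The stars-and-bars count is a one-liner to check (a sketch; the function name is mine):

```python
from math import comb

def einstein_multiplicity(n_oscillators, q):
    """Stars-and-bars: ways to distribute q quanta among N oscillators,
    Omega = C(q + N - 1, q)."""
    return comb(q + n_oscillators - 1, q)

# Three atoms = nine oscillators, eight quanta:
print(einstein_multiplicity(9, 8))  # 12870
# One atom = three oscillators, q = 0..3 quanta:
print([einstein_multiplicity(3, q) for q in range(4)])  # [1, 3, 6, 10]
```

Both numbers match the examples quoted in the slides.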
14
Multiplicity of a Large Einstein Solid (k_B T ≫ ε). q = U/ε = N⟨n⟩ — the total number of energy quanta in the solid; ⟨n⟩ = U/(εN) — the average number of quanta available for each oscillator. In this limit the heat capacity obeys the Dulong–Petit rule, dU/dT = 3N_atoms·k_B.
15
Multiplicity of a Large Einstein Solid (k_B T ≫ ε), continued. At high temperatures (k_B T ≫ ε, ⟨n⟩ ≫ 1, q ≫ N), Stirling's approximation gives Ω(N, q) ≈ (eq/N)^N, i.e. Ω ∝ U^N. General statement: for any system with N "quadratic" degrees of freedom ("unlimited" spectrum), the multiplicity is proportional to U^(N/2). The Einstein solid has 2N degrees of freedom (kinetic + potential for each oscillator), hence Ω ∝ U^(2N/2) = U^N — consistent.
16
Multiplicity of a Large Einstein Solid (k_B T ≪ ε). At low temperatures (k_B T ≪ ε, ⟨n⟩ ≪ 1, q ≪ N), the same Stirling analysis gives Ω(N, q) ≈ (eN/q)^q (Pr. 2.17).
17
Microstates of a system (e.g. an ideal gas). Microstate: the state of a system specified by describing the quantum state of each molecule in the system. For a classical particle — 6 parameters (xᵢ, yᵢ, zᵢ, p_xi, p_yi, p_zi); for a macro system — 6N parameters. The evolution of a system can be represented by a trajectory in the multidimensional (configuration, phase) space of micro-parameters. Each point in this space represents a microstate. During its evolution, the system will only pass through accessible microstates — the ones that do not violate the conservation laws: e.g., for an isolated system, the total internal energy must be conserved.
18
Statistics → Probabilities of Macrostates. Macrostate: the state of a macro system specified by its macroscopic parameters. Two systems with the same values of the macroscopic parameters are thermodynamically indistinguishable. A macrostate tells us nothing about the state of an individual particle. For a given set of constraints (conservation laws), a system can be in many macrostates. The statistical approach: connect the macroscopic observables (averages) to the probability for a certain microstate to appear along the system's trajectory in configuration space.
19
The Phase Space vs. the Space of Macroparameters. [Figure: a point in (P, V, T) space lying on the surface defined by an equation of state represents some macrostate; numerous microstates in the multi-dimensional configuration (phase) space correspond to that same macrostate.]
20
Examples: Two-Dimensional Configuration Space. Motion of a particle in a one-dimensional box of length 2L: the configuration space is the rectangle |x| ≤ L, |p_x| ≤ p. "Macrostates" are characterized by a single parameter, the kinetic energy K₀. Each "macrostate" corresponds to a continuum of microstates, which are characterized by specifying the position and momentum; the microstates with K = K₀ lie on the two lines p_x = ±√(2mK₀). Another example: a one-dimensional harmonic oscillator with potential U(x), where the microstates with K + U = const lie on a closed curve (an ellipse) in the (x, p_x) plane.
21
The Fundamental Assumption of Statistical Mechanics. The ergodic hypothesis: an isolated system in an equilibrium state, evolving in time, will pass through all the accessible microstates at the same recurrence rate, i.e. all accessible microstates are equally probable. The average over long times will equal the average over the ensemble of all equi-energetic microstates: if we take a snapshot of a system with N microstates, we will find the system in any of these microstates with the same probability. Probability for a stationary system: many identical measurements on a single system are equivalent to a single measurement on many copies of the system. The ensemble of all equi-energetic states (microstates which correspond to the same energy) is called a microcanonical ensemble.
22
Probability of a Macrostate; Multiplicity. The probability of a certain macrostate is determined by how many microstates correspond to this macrostate — the multiplicity Ω of the given macrostate:
P(macrostate) = Ω(macrostate) / (total number of accessible microstates)
This approach will help us understand why some macrostates are more probable than others and, eventually, by considering interacting systems, the irreversibility of processes in macroscopic systems.
23
Concepts of Statistical Mechanics. 1. The macrostate is specified by a sufficient number of macroscopically measurable parameters (for an Einstein solid — N and U). 2. The microstate is specified by the quantum state of each particle in the system (for an Einstein solid — the number of quanta of energy for each of the N oscillators). 3. The multiplicity is the number of microstates in a macrostate. For each macrostate, there is an extremely large number of possible microstates that are macroscopically indistinguishable. 4. The Fundamental Assumption: for an isolated system, all accessible microstates are equally likely. 5. The probability of a macrostate is proportional to its multiplicity. This will be sufficient to explain irreversibility.
24
Entropy and Temperature (Ch. 2 and a bit of 3) Ideas: Each accessible microstate of an isolated system is equally probable (the fundamental assumption). Every macrostate has a countable number of microstates (follows from Q.M.). The probability of a macrostate is proportional to its multiplicity. When systems get large, multiplicities get outrageously large. On this basis, we will introduce the concept of entropy and discuss the Second Law of Thermodynamics.
25
Our plan: as our point of departure, we'll use the model of the Einstein solid. We have already discussed one advantage of this model — "discrete" degrees of freedom. Another advantage: by considering two interacting Einstein solids, we can learn about the energy exchange between two systems, i.e. how thermal equilibrium is reached. Using our statistical approach, we'll identify the most probable macrostate of a combined system of two interacting Einstein solids after it reaches equilibrium; we'll introduce entropy as a measure of the multiplicity of a given macrostate; and we'll arrive at the Second Law of Thermodynamics.
26
Two Interacting Einstein Solids: Macropartitions. Suppose that we bring two Einstein solids A and B (two sub-systems with N_A, U_A and N_B, U_B) into thermal contact, to form a larger isolated system. What happens to these solids (macroscopically) after they have been brought into contact? The combined system: N = N_A + N_B, U = U_A + U_B. As time passes, the system of two solids will randomly shift between different microstates consistent with the constraint U = const. Macropartition: a given pair of macrostates for sub-systems A and B that is consistent with conservation of the total energy U = U_A + U_B; it is specified by the macroparameter U_A. Example: the pair of macrostates with U_A = 2ε and U_B = 4ε is one possible macropartition of a combined system with U = 6ε. Different macropartitions amount to different ways that the energy can be macroscopically divided between the sub-systems. Question: what is the most probable macrostate for given N_A, N_B, and U?
27
The Multiplicity of Two Sub-Systems Combined. Example: two one-atom "solids" (N_A = N_B = 3 oscillators each) in thermal contact, with total U = 6ε. Possible macropartitions for q_A + q_B = 6:

Macropartition   q_A  q_B   Ω_A   Ω_B   Ω_AB = Ω_A·Ω_B
0 : 6             0    6     1    28     28
1 : 5             1    5     3    21     63
2 : 4             2    4     6    15     90
3 : 3             3    3    10    10    100
4 : 2             4    2    15     6     90
5 : 1             5    1    21     3     63
6 : 0             6    0    28     1     28

Grand total number of microstates: 462. The probability of a macropartition is proportional to its multiplicity: P(macropartition) = Ω_AB / Ω_total. Exercise: check the multiplicities of the macrostates for N_A = N_B = 100, U = q_A + q_B = 200.
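The macropartition table can be generated directly from the stars-and-bars formula (a sketch; the function names are mine):

```python
from math import comb

def omega(n_osc, q):
    """Einstein-solid multiplicity: C(q + N - 1, q)."""
    return comb(q + n_osc - 1, q)

def macropartition_table(nA, nB, q_total):
    """(q_A, q_B, Omega_A * Omega_B) for every way of splitting the quanta."""
    return [(qA, q_total - qA, omega(nA, qA) * omega(nB, q_total - qA))
            for qA in range(q_total + 1)]

table = macropartition_table(3, 3, 6)
print(table)
total = sum(w for _, _, w in table)
print(total)                 # 462
# Sanity check: the combined system is just one 6-oscillator solid:
print(omega(6, 6) == total)  # True
```

The most probable macropartition is the even split q_A = q_B = 3, exactly as in the table.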
28
Recall: the probability of macrostates of a two-state paramagnet (B = 0). As the system becomes larger, the P(N, N↑) graph becomes more sharply peaked; random orientation of the spins in B = 0 is overwhelmingly more probable — the 2nd law!
29
The Multiplicity of Two Sub-Systems Combined. In real systems, N ~ 10²³ and q ~ 10²³. How do we count the multiplicity? (A spreadsheet fails.) How do we find the maximum multiplicity? Answer: an analytic approximation. The probability of a macropartition is proportional to its multiplicity: P(macropartition) = Ω_A·Ω_B / Ω_total.
30
Where is the Maximum? The Average Energy per Atom. Let's explore how the macropartition multiplicity for two sub-systems A and B (N_A, N_B oscillators, ε_A = ε_B = ε) in thermal contact depends on the energy of one of the sub-systems. In the high-T limit (q ≫ N):
Ω_AB = Ω_A·Ω_B ≈ (e q_A/N_A)^(N_A) (e q_B/N_B)^(N_B) ∝ U_A^(N_A) (U − U_A)^(N_B)
Maximizing ln Ω_AB with respect to U_A gives N_A/U_A = N_B/U_B, i.e. U_A/N_A = U_B/N_B. In general, for two systems in thermal contact, the equilibrium (most probable) macropartition of the combined system is the one where the average energy per atom in each system is the same (the basis for introducing the temperature).
31
A simpler argument for a special case: for two identical sub-systems (N_A = N_B), Ω_A(U_A) is an increasing function of U_A and Ω_B(U_A) a decreasing one, so their product Ω_AB(U_A) is peaked at U_A = U_B = ½U. Take-home exercise: find the position of the maximum of Ω_AB(U_A) for N_A = 200, N_B = 100, U = 180ε.
32
Sharpness of the Multiplicity Function. How sharp is the peak? Consider small deviations from the maximum for two identical sub-systems: U_A = (U/2)(1 + x), U_B = (U/2)(1 − x), with x ≪ 1. Then Ω_AB ∝ (U_A U_B)^N = (U/2)^(2N) (1 − x²)^N. Example: N = 100,000, x = 0.01 gives (1 − x²)^N = (0.9999)^100,000 ≈ 4.5·10⁻⁵ ≪ 1. More rigorously (p. 65), (1 − x²)^N ≈ e^(−Nx²) — a Gaussian function whose width corresponds to x ~ 1/√N, i.e. a peak width in U_A of order U/√N. When the system becomes large, the probability as a function of U_A (macropartition) becomes very sharply peaked, i.e. the "fluctuation" is very small.
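The numbers in the example, and the Gaussian approximation, are easy to confirm:

```python
from math import exp, sqrt

N = 100_000
x = 0.01
ratio = (1 - x**2) ** N      # Omega(x) / Omega_max, exact
print(ratio)                 # ~4.5e-5, as quoted on the slide
print(exp(-N * x**2))        # Gaussian approximation e^{-N x^2}
# The characteristic fractional width of the peak is 1/sqrt(N):
print(1 / sqrt(N))           # ~0.003 -- x = 0.01 is already far out on the tail
```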
33
Implications? Irreversibility! When two macroscopic solids are in thermal contact with each other, completely random and reversible microscopic processes (leading to random shuffling between microstates) tend, at the macroscopic level, to push the solids inevitably toward the equilibrium macropartition (irreversible macro behavior). Any random fluctuations away from the most likely macropartition are extremely small! The vast majority of microstates lie in macropartitions close to the most probable one (in other words, the macropartition probability graph is very narrow). Thus: (a) If the system is not in the most probable macropartition, it will rapidly and inevitably move toward that macropartition. The reason for this "directionality" (irreversibility): there are far more microstates in that direction than away. This is why energy flows from "hot" to "cold" and not vice versa. (b) It will subsequently stay at that macropartition (or very near it), in spite of the random shuffling of energy back and forth between the two solids.
34
Problem: Consider a system consisting of two Einstein solids in thermal equilibrium. Assume that we know the number of atoms in each solid and ε. What do we know if we also know (a) the quantum state of each atom in each solid? (b) the total energy of each of the two solids? (c) the total energy of the combined system? Answers: (a) the system's microstate; (b) the system's macropartition; (c) the system's macrostate — the equilibrium macropartition plus-or-minus a fluctuation.
35
Problem: Imagine that you discover a strange substance whose multiplicity is always 1, no matter how much energy you put into it. If you put an object made of this substance (sub-system A) into thermal contact with an Einstein solid having the same number of atoms but much more energy (sub-system B), what will happen to the energies of these sub-systems? A. Energy flows from B to A until they have the same energy. B. Energy flows from A to B until A has no energy. C. No energy will flow from B to A at all.
36
Entropy of a system in a given macrostate (N, U, V, ...):
S ≡ k_B ln Ω(N, U, V, ...)
Entropy is just another (more convenient) way of talking about multiplicity. Convenience: it reduces ridiculously large numbers to manageable ones. Example: for N ~ 10²³, ln Ω ~ 10²³; multiplied by k_B ~ 10⁻²³ J/K, this gives S ~ 1 J/K. The "inverse" procedure: if the entropy of a certain macrostate is 4600 k_B, the multiplicity of the macrostate is Ω = e^(S/k_B) = e^4600. If a system contains two or more interacting sub-systems, each having its own distinct macrostate, the total entropy of the combined system in a given macropartition is the sum of the entropies of the sub-systems in that macropartition: Ω_AB = Ω_A × Ω_B × Ω_C × ..., hence S_AB = S_A + S_B + S_C + ... The entropy is a state function, i.e. it depends on the macrostate alone and not on the path of the system to this macrostate. Units: J/K.
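The "manageable numbers" point can be made concrete: Ω itself overflows any float for a macroscopic system, so one always works with ln Ω (a sketch; function name is mine, k_B is the CODATA value):

```python
K_B = 1.380649e-23  # J/K, Boltzmann constant

def entropy_from_ln_multiplicity(ln_omega):
    """S = k_B * ln(Omega), taking ln(Omega) directly, since Omega
    itself (e.g. e^{1e23}) cannot be represented as a float."""
    return K_B * ln_omega

# ln(Omega) ~ 1e23 gives an entropy of order 1 J/K:
print(entropy_from_ln_multiplicity(1e23))  # ~1.38 J/K
# Inverse procedure: S = 4600 k_B corresponds to ln(Omega) = 4600,
# i.e. Omega = e^4600 -- far beyond float range, so we keep the log:
S = 4600 * K_B
print(S / K_B)  # 4600.0
```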
37
Problem: Imagine that one macropartition of a combined system of two Einstein solids has an entropy of 1 J/K, while another (where the energy is more evenly divided) has an entropy of 1.001 J/K. How many times more likely are you to find the system in the second macropartition compared to the first?
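The answer follows from P₂/P₁ = Ω₂/Ω₁ = e^((S₂−S₁)/k_B). Even a 0.001 J/K entropy difference makes the ratio astronomically large — so large that only its logarithm can be computed (a quick sketch):

```python
from math import log

K_B = 1.380649e-23   # J/K
dS = 0.001           # J/K, entropy difference between the two macropartitions

ln_ratio = dS / K_B              # ln(P2 / P1) = (S2 - S1) / k_B
log10_ratio = ln_ratio / log(10)
print(f"P2/P1 = 10^({log10_ratio:.2e})")  # ~10^(3.1e19): beyond astronomical
```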
38
The Second Law of Thermodynamics. An isolated system, being initially in a non-equilibrium state, will evolve from macropartitions with lower multiplicity (lower probability, lower entropy) to macropartitions with higher multiplicity (higher probability, higher entropy). Once the system reaches the macropartition with the highest multiplicity (highest entropy), it will stay there. Thus: the entropy of an isolated system never decreases (one of the formulations of the second law of thermodynamics). Whatever increases the number of microstates will happen if it is allowed by the fundamental laws of physics and whatever constraints we place on the system — "whatever" meaning energy exchange, particle exchange, expansion of the system, etc. (Is it really true that the entropy of an isolated system never decreases? Consider a pair of very small Einstein solids. Why is this statement more accurate for large systems than for small ones?)
39
Entropy and Temperature. To establish the relationship between S and T, let's consider two sub-systems, A and B, isolated from the environment and separated by a rigid membrane with finite thermal conductivity (Nᵢ and Vᵢ are fixed; thermal energy can flow between the sub-systems). Let the sub-systems have "quadratic" degrees of freedom (Ω ∝ U^(fN/2)) — for example, two identical Einstein solids (N_A = N_B = N) near the equilibrium macropartition (U_A = U_B = U/2). Equilibrium is the maximum of S_AB(U_A): ∂S_AB/∂U_A = 0, and since dU_B = −dU_A, this means ∂S_A/∂U_A = ∂S_B/∂U_B. Thus, when two solids are in equilibrium, the slope ∂S/∂U is the same for both of them. On the other hand, when two solids are in equilibrium, they have the same temperature. This motivates the statistical-mechanics definition of temperature:
1/T ≡ (∂S/∂U)_(V,N)
Units: T — K, S — J/K, U — J.
40
Two remarks. 1. The partial derivative in the definition of T is calculated at V = const and N = const. We have been considering entropy changes in processes where two interacting systems exchanged thermal energy while the volume and the number of particles were fixed; in general, however, we need more than one parameter to specify a macrostate. The physical meaning of the other two partial derivatives of S will be considered in L.7. 2. The slope ∂S/∂U is inversely proportional to T: the energy should flow from higher T to lower T, and in thermal equilibrium T_A and T_B should be the same. The sub-system with the larger ∂S/∂U (lower T) receives energy from the sub-system with the smaller ∂S/∂U (higher T), and this process continues until ∂S_A/∂U_A and ∂S_B/∂U_B become the same.
41
Problems. Problem: Imagine that you discover a strange substance whose multiplicity is always 1, no matter how much energy you put into it. If you put an object made of this substance (sub-system A) into thermal contact with an Einstein solid having the same number of atoms but much more energy (sub-system B), what will happen to the energies of these sub-systems? Problem: An object whose multiplicity is always 1, no matter what its thermal energy is, has a temperature that: (a) is always 0; (b) is always fixed; (c) is always infinite. Problem: If an object has a multiplicity that decreases as its thermal energy increases (e.g., a two-state paramagnet over a certain U range), its temperature would: (a) be always 0; (b) be always fixed; (c) be negative; (d) be positive.
42
From S(N,U,V) to U(N,T,V). The recipe: 1. Find Ω(U, V, N, ...) — the most challenging step. 2. Compute S(U, V, N, ...) = k_B ln Ω(U, V, N, ...). 3. Use 1/T = (∂S/∂U)_(V,N) and solve for U = f(T, V, N, ...). Now we can get an (energy) equation of state U = f(T, V, N, ...) for any system for which we have an explicit formula for the multiplicity (entropy)! Thus, we've bridged the gap between statistical mechanics and thermodynamics!
43
Measuring Entropy. Even if we cannot calculate S, we can still measure it. For V = const and N = const:
dS = dU/T = δQ/T, so ΔS = ∫ C_V(T) dT / T.
This is the "thermodynamic" definition of entropy, which Clausius introduced in 1854, long before Boltzmann gave his "statistical" definition S ≡ k_B ln Ω. (In L.6, we'll see that dS = δQ/T holds for all reversible, i.e. quasi-static, processes — even if V is changed in the process.) Example: by heating a cup of water (200 g, C_V = 840 J/K) from 20°C to 100°C, we increase its entropy by ΔS = C_V ln(373 K / 293 K) ≈ 200 J/K. At the same time, the multiplicity of the system is increased by a factor of e^(ΔS/k_B) ~ e^(1.5·10²⁵).
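The water-heating example, worked numerically:

```python
from math import log

K_B = 1.380649e-23      # J/K
C_V = 840.0             # J/K, heat capacity of 200 g of water
T1, T2 = 293.0, 373.0   # K: 20 C to 100 C

# Constant-volume entropy change: integral of C_V dT / T
dS = C_V * log(T2 / T1)
print(dS)         # ~203 J/K
print(dS / K_B)   # ln of the multiplicity increase, ~1.5e25
```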
44
An Einstein Solid: from S(N,U) to U(N,T) at high T. High temperatures (k_B T ≫ ε, q ≫ N, with N the number of oscillators):
Ω ≈ (eq/N)^N ⇒ S = Nk_B [ln(q/N) + 1] = Nk_B ln U − Nk_B ln(εN) + Nk_B
1/T = ∂S/∂U = Nk_B/U ⇒ U = Nk_B T
— in agreement with the equipartition theorem: the total energy should be ½k_B T times the number of degrees of freedom, and each oscillator has two (kinetic + potential). To compare with experiment, we can measure the heat capacity at constant volume: C_V = (∂U/∂T)_V = Nk_B = 3N_atoms·k_B — in nice agreement with experiment (Dulong–Petit).
45
An Einstein Solid: from S(N,U) to U(N,T) at low T. Low temperatures (k_B T ≪ ε, q ≪ N):
Ω ≈ (eN/q)^q ⇒ S = k_B q [ln(N/q) + 1], and 1/T = ∂S/∂U = (k_B/ε) ln(N/q) ⇒ U = Nε e^(−ε/k_B T)
— as T → 0, the energy goes to zero as expected (Pr. 3.5). The low-T heat capacity: C_V = dU/dT = Nk_B (ε/k_B T)² e^(−ε/k_B T). (A more accurate result is obtained from the Debye model of solids.)
46
Example (Pr. 3.14, page 97). For a mole of aluminum, C_V = aT + bT³ at T < 50 K (a = 1.35·10⁻³ J/K², b = 2.48·10⁻⁵ J/K⁴). The linear term is due to the mobile electrons, the cubic term to the crystal-lattice vibrations. Find S(T) and evaluate the entropy at T = 1 K and 10 K.
S(T) = ∫₀ᵀ C_V(T′) dT′/T′ = aT + bT³/3.
At T = 1 K: S ≈ 1.36·10⁻³ J/K ≈ 10²⁰ k_B — at low T, nearly all the entropy comes from the mobile electrons. At T = 10 K: S ≈ 2.2·10⁻² J/K ≈ 1.6·10²¹ k_B — most of the entropy now comes from lattice vibrations. In both cases S/k_B is much less than the number of particles: most degrees of freedom are still frozen out.
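The same numbers in code, including the electron/lattice split:

```python
K_B = 1.380649e-23  # J/K
a = 1.35e-3         # J/K^2, electronic term
b = 2.48e-5         # J/K^4, lattice term

def entropy(T):
    """S(T) = integral of (a*T' + b*T'^3)/T' dT' from 0 to T = a*T + b*T^3/3."""
    return a * T + b * T**3 / 3

for T in (1.0, 10.0):
    S = entropy(T)
    print(T, S, S / K_B, a * T / S)  # last column: electronic fraction
# At 1 K the electrons carry ~99% of the entropy; at 10 K the lattice
# vibrations already dominate.
```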
47
Residual Entropy. Glasses aren't really in equilibrium — their relaxation time is huge — and they do not have a well-defined T or C_V. Glasses have a particularly large entropy at T = 0. [Figure: S(T) curves for the liquid, supercooled liquid, glass, and crystal; the glass curve extrapolates to a nonzero residual entropy at T = 0. Debenedetti & Stillinger, Nature (2001).]
48
Entropy of an Ideal Gas. Now we will derive the equation(s) of state for an ideal gas from the principles of statistical mechanics, following the "microcanonical"-ensemble recipe: find Ω(U, V, N, ...) — the most challenging step; compute S(U, V, N, ...) = k_B ln Ω(U, V, N, ...); solve for U = f(T, V, N, ...). So far we have treated quantum systems whose states in the configuration (phase) space may be enumerated. When dealing with classical systems with translational degrees of freedom, we need to learn how to calculate the multiplicity.
49
Multiplicity for a Single Particle — more complicated than for an Einstein solid, because it depends on three rather than two macroparameters (e.g., N, U, V). Example: a particle in a one-dimensional "box" of length L with momentum |p_x| ≤ p. Quantum mechanics (the uncertainty principle, δx·δp_x ≈ h) helps us enumerate the distinct states in the configuration (phase) space: the total number of ways of filling up the cells in phase space is the product of the number of ways the "space" cells can be filled and the number of ways the "momentum" cells can be filled — here Ω₁ ≈ (L/δx)(2p/δp_x) ≈ 2Lp/h.
50
Multiplicity of a Monatomic Ideal Gas (simplified). For a molecule in a three-dimensional box, the state of the molecule is a point in the 6D phase space: its position (x, y, z) and its momentum (p_x, p_y, p_z). There is some momentum distribution of the molecules in an ideal gas (Maxwell), with a long "tail" that goes all the way up to p = (2mU)^(1/2), where U is the total energy of the gas; however, the momentum vector of an "average" molecule is confined within a sphere of radius p ~ (2mU/N)^(1/2) (U/N is the average energy per molecule). Thus, for a single "average" molecule, Ω₁ ~ V·(4/3)πp³/h³, and for N molecules Ω_N ~ Ω₁^N. However, we have over-counted the multiplicity, because we have assumed that the atoms are distinguishable. For indistinguishable quantum particles, the result should be divided by N! (the number of ways of arranging N identical atoms among the "boxes"): Ω_N ~ (1/N!)(V·(4/3)πp³/h³)^N.
51
More Accurate Calculation of Ω_N (I). Momentum constraints: for 1 particle, p_x² + p_y² + p_z² = 2mU; for N particles, the sum of all 3N squared momentum components equals 2mU. The accessible momentum volume for N particles is therefore the "area" of a 3N-dimensional hypersphere of radius √(2mU). A monatomic ideal gas has f = 3N "quadratic" degrees of freedom. The reason why m matters: for a given U, a molecule with a larger mass has a larger momentum, and thus a larger "volume" accessible in momentum space.
52
More Accurate Calculation of Ω_N (II). For a particle in a box (L)³, the momentum states form a grid with spacing h/L (Appendix A). Plugging in the "area" of the hypersphere: if δp ≪ p, the total degeneracy (multiplicity) of N indistinguishable particles with energy U is
Ω_N ≈ (1/N!) · (V^N / h^(3N)) · (π^(3N/2) / (3N/2)!) · (2mU)^(3N/2).
53
Entropy of an Ideal Gas. Applying S = k_B ln Ω with Stirling's approximation gives, for a monatomic ideal gas, the Sackur–Tetrode equation:
S = Nk_B [ ln( (V/N) · (4πmU/(3Nh²))^(3/2) ) + 5/2 ]
where V/N is the average volume per molecule and U/N the average energy per molecule. In general, for a gas of polyatomic molecules: S = Nk_B [ln(V/N) + (f/2) ln(U/N) + const], with f = 3 (monatomic), 5 (diatomic), 6 (polyatomic).
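A sanity check of the Sackur–Tetrode equation for one mole of helium at roughly room conditions (a sketch; the function name and the chosen V, T values are mine — the standard tabulated result for He near STP is ~126 J/(mol·K)):

```python
from math import log, pi

K_B = 1.380649e-23   # J/K
H = 6.62607015e-34   # J*s, Planck constant

def sackur_tetrode(N, V, U, m):
    """S = N k_B [ ln( (V/N) (4 pi m U / (3 N h^2))^{3/2} ) + 5/2 ]."""
    return N * K_B * (log((V / N) * (4 * pi * m * U / (3 * N * H**2))**1.5) + 2.5)

N = 6.02214076e23            # one mole of atoms
m_he = 6.6464731e-27         # kg, mass of a helium-4 atom
T, V = 300.0, 0.025          # K, m^3 (about one molar volume)
U = 1.5 * N * K_B * T        # equipartition: U = (3/2) N k_B T
S = sackur_tetrode(N, V, U, m_he)
print(S)   # ~126 J/K, close to the tabulated value for helium
```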
54
Problem. Two cylinders (V = 1 liter each) are connected by a valve. One contains hydrogen (H₂) at P = 10⁵ Pa, T = 20°C; the other, helium (He) at P = 3·10⁵ Pa, T = 100°C. Find the entropy change after mixing and equilibrating. The temperature after mixing follows from energy conservation between the two gases; then, for each gas, the entropy change has a volume term N k_B ln(V_f/V_i) and a temperature term (f/2) N k_B ln(T_f/T_i).
55
Entropy of Mixing. Consider two different ideal gases (N₁, N₂ molecules) kept in two separate volumes (V₁, V₂) at the same temperature. To calculate the increase of entropy in the mixing process, we can treat each gas as a separate system. In the mixing process, U/N remains the same (T will be the same after mixing); the parameter that changes is V/N:
ΔS_total = ΔS₁ + ΔS₂ = N₁k_B ln(V/V₁) + N₂k_B ln(V/V₂), where V = V₁ + V₂.
If N₁ = N₂ = ½N and V₁ = V₂ = ½V, then ΔS_total = Nk_B ln 2. The total entropy of the system is greater after mixing — thus, mixing is irreversible.
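The symmetric case ΔS = N k_B ln 2 for one mole is a small, concrete number (a sketch; function name is mine):

```python
from math import log

K_B = 1.380649e-23  # J/K

def mixing_entropy(N1, V1, N2, V2):
    """Entropy increase when two *different* ideal gases at equal T mix:
    dS = N1 k_B ln(V/V1) + N2 k_B ln(V/V2), V = V1 + V2."""
    V = V1 + V2
    return N1 * K_B * log(V / V1) + N2 * K_B * log(V / V2)

N = 6.02214076e23   # one mole total, split evenly between equal volumes
dS = mixing_entropy(N / 2, 0.5, N / 2, 0.5)
print(dS)                   # ~5.76 J/K
print(N * K_B * log(2))     # same: N k_B ln 2 = R ln 2
```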
56
Gibbs "Paradox". The mixing-entropy formula applies only if the two gases are different! If the two mixing gases are of the same kind (indistinguishable molecules), ΔS_total = 0, because U/N and the V/N available to each molecule remain the same after mixing. Quantum-mechanical indistinguishability is important (even though the Sackur–Tetrode equation applies only in the low-density limit, which is "classical" in the sense that the distinction between fermions and bosons disappears).
57
Problem. Two identical perfect gases with the same pressure P and the same number of particles N, but with different temperatures T₁ and T₂, are confined in two vessels of volume V₁ and V₂, which are then connected. Find the change in entropy after the system has reached equilibrium. Check: at T₁ = T₂, ΔS = 0, as it should be (Gibbs paradox) — prove it!
58
An Ideal Gas: from S(N,V,U) to U(N,V,T). For an ideal gas with f N degrees of freedom, S = Nk_B ln V + (f/2)Nk_B ln U + const, so
1/T = (∂S/∂U)_(V,N) = (f/2)Nk_B/U ⇒ U = (f/2)Nk_B T — the "energy" equation of state,
in agreement with the equipartition theorem: the total energy is ½k_B T times the number of degrees of freedom. The heat capacity for a monatomic ideal gas: C_V = (∂U/∂T)_V = (3/2)Nk_B.
59
Partial Derivatives of the Entropy. We have been considering entropy changes in processes where two interacting systems exchanged thermal energy while the volume and the number of particles were fixed. In general, however, we need more than one parameter to specify a macrostate, e.g. S(U, V, N) for an ideal gas. We are familiar with the physical meaning of only one partial derivative of the entropy: 1/T = (∂S/∂U)_(V,N). Today we will explore what happens if we let V vary, and analyze the physical meaning of the other two partial derivatives. When all macroscopic quantities S, V, N, U are allowed to vary:
dS = (∂S/∂U)_(V,N) dU + (∂S/∂V)_(U,N) dV + (∂S/∂N)_(U,V) dN.
60
Mechanical Equilibrium and Pressure. Let's fix U_A, N_A and U_B, N_B, but allow V to vary (the membrane is insulating and impermeable to gas molecules, but its position is not fixed). Following the same logic as before, spontaneous "exchange of volume" between the sub-systems will drive the system toward mechanical equilibrium (the membrane at rest). The equilibrium macropartition has (by far) the largest multiplicity Ω(U, V) and entropy S(U, V); in mechanical equilibrium, ∂S_A/∂V_A = ∂S_B/∂V_B. The statistical-physics definition of pressure:
P ≡ T (∂S/∂V)_(U,N)
e.g., for an ideal gas, S = Nk_B ln V + ..., so P = T·Nk_B/V: the volume-per-molecule should be the same for both sub-systems, or, if T is the same, P must be the same on both sides of the membrane.
61
The "Pressure" Equation of State for an Ideal Gas. For an ideal gas (f N degrees of freedom): the "energy" equation of state (U ↔ T) is U = (f/2)Nk_B T; the "pressure" equation of state (P ↔ T) follows from P = T(∂S/∂V)_(U,N) = T·Nk_B/V, i.e. PV = Nk_B T — we have finally derived the equation of state of an ideal gas from first principles!
62
Thermodynamic Identity (I). Let's assume N is fixed. Thermal equilibrium gives 1/T = (∂S/∂U)_V and mechanical equilibrium gives P/T = (∂S/∂V)_U, so
dS = (1/T) dU + (P/T) dV, i.e. dU = T dS − P dV.
63
Quasi-Static Processes. For quasi-static processes with fixed N, comparing dU = T dS − P dV with the 1st law gives δQ = T dS, i.e. dS = δQ/T. Quasi-static adiabatic (δQ = 0) processes are therefore isentropic processes. Comment on state functions: dS = δQ/T is an exact differential (S is a state function, valid for all processes), while δQ is not; the factor 1/T converts δQ into an exact differential for quasi-static processes. The quasi-static adiabatic process with an ideal gas: from the Sackur–Tetrode equation, S = const implies VU^(f/2) = const, hence VT^(f/2) = const and PV^γ = const with γ = (f+2)/f — the same equations we derived earlier from the 1st law and PV = Nk_BT.
64
Problem: (a) Calculate the entropy increase of an ideal gas in an isothermal process. (b) Calculate the entropy increase of an ideal gas in an isochoric process. You should be able to do this both with the Sackur–Tetrode equation and with dS = δQ/T (all processes are quasi-static). Results: (a) isothermal: ΔS = Nk_B ln(V₂/V₁); (b) isochoric: ΔS = (f/2)Nk_B ln(T₂/T₁) = C_V ln(T₂/T₁). Let's verify that approaches (a) and (b) give the same result for T = const: since ΔU = 0, δQ = P dV, and ΔS = ∫P dV/T = Nk_B ∫dV/V = Nk_B ln(V₂/V₁). (Pr. 2.34)
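Both results in numbers, for one mole (a sketch; function names are mine):

```python
from math import log

K_B = 1.380649e-23  # J/K
N = 6.02214076e23   # one mole

def dS_isothermal(N, V1, V2):
    """Isothermal ideal gas: dS = N k_B ln(V2/V1)."""
    return N * K_B * log(V2 / V1)

def dS_isochoric(N, f, T1, T2):
    """Isochoric ideal gas: dS = (f/2) N k_B ln(T2/T1)."""
    return (f / 2) * N * K_B * log(T2 / T1)

# Doubling the volume at constant T:
print(dS_isothermal(N, 1.0, 2.0))        # N k_B ln 2 ~ 5.76 J/K
# Doubling the temperature at constant V (monatomic, f = 3):
print(dS_isochoric(N, 3, 300.0, 600.0))  # (3/2) N k_B ln 2 ~ 8.64 J/K
```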
65
Problem: A small body of mass M with heat capacity (per unit mass) C, initially at temperature T₀ + ΔT, is brought into thermal contact with a heat bath at temperature T₀. (a) Show that if ΔT ≪ T₀, the increase ΔS in the entropy of the entire system (body + heat bath) when equilibrium is reached is proportional to (ΔT)². (b) Find ΔS if the body is a bacterium of mass 10⁻¹⁵ kg with C = 4 kJ/(kg·K), T₀ = 300 K, ΔT = 0.03 K. (c) What is the probability of finding the bacterium at its initial temperature T₀ + ΔT for t = 10⁻¹² s over the lifetime of the Universe (~10¹⁸ s)?
(a) ΔS = ΔS_body + ΔS_bath = MC ln(T₀/(T₀+ΔT)) + MCΔT/T₀ ≈ MC(ΔT)²/(2T₀²).
(b) ΔS = 10⁻¹⁵ · 4·10³ · (0.03)²/(2·300²) ≈ 2·10⁻²⁰ J/K.
66
Problem (cont.) (c) The multiplicity of the equilibrium state (T_bacterium = 300 K) is greater than that of the non-equilibrium state (T_bacterium = 300.03 K) by a factor of e^(ΔS/k_B) ≈ e^(2·10⁻²⁰/1.38·10⁻²³) ≈ e^1450. The number of 1-ps "trials" over the lifetime of the Universe: 10¹⁸ s / 10⁻¹² s = 10³⁰. Thus, the probability of the event happening in 10³⁰ trials is ~10³⁰ · e^(−1450) ≈ 10^(−600) — effectively zero.