Classical and Quantum Statistics


Chapter 13: Classical and Quantum Statistics. (The introduction to Lagrange multipliers can be omitted.)

So far, with the exception of the previous chapter, we have dealt with the first and second laws of thermodynamics. In using these laws to make numerical calculations it is usually necessary to appeal to experimental measurements. We would like instead to calculate all thermodynamic properties of a system from a microscopic model of that system. We made a start in the last chapter, and we will make significant progress in the present chapter.

Lagrange Undetermined Multipliers. We begin with a trivial example of the use of Lagrange undetermined multipliers. The term is somewhat misleading because the multipliers can, in fact, be determined. In the following example it is not necessary to use this sophisticated method, and you should also solve the problem in a simple fashion. Consider the equation

ax + by = 0 ………(1)

If y = y(x), then y = −(a/b)x. However, this is not true if x and y are both independent; the only solution is then a = b = 0. Suppose, however, that x and y are not completely independent but satisfy a constraint condition such as

x + 2y = 0 ………(2)

What can we now say about the coefficients a and b? The procedure, using Lagrange multipliers, is to multiply each constraint condition by a multiplier, which is initially unknown. In the present case we have one constraint condition; let its multiplier be λ. Multiplying equation (2) by λ and adding to equation (1) gives

(a + λ)x + (b + 2λ)y = 0

Now that the constraint has been implicitly taken into consideration, we treat x and y as two independent variables. We must then have

a + λ = 0 and b + 2λ = 0

This yields a = −λ and b = −2λ, so that b = 2a. Placing this in equation (1) gives a(x + 2y) = 0, which shows that the constraint condition is satisfied.
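The outcome of the multiplier method here is the relation b = 2a, under which equation (1) reduces to a multiple of the constraint. A quick numerical check (the values of a and y below are arbitrary illustrative choices):

```python
# Check that with b = 2a the equation a*x + b*y = 0 holds for every
# point (x, y) lying on the constraint line x + 2y = 0.
a = 3.0            # arbitrary illustrative value
b = 2 * a          # relation obtained from the multiplier method
for y in (-2.0, 0.0, 1.5, 7.0):
    x = -2.0 * y   # constraint: x + 2y = 0
    assert abs(a * x + b * y) < 1e-12
print("a*x + b*y vanishes on the whole constraint line")
```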

EXAMPLE: Lagrange undetermined multiplier. A cylindrical nuclear reactor has radius R and height H. We wish to minimize the volume of the reactor,

V = πR²H

There is a constraint supplied by neutron diffusion theory:

(2.405/R)² + (π/H)² = B (a constant)

For an extremum, dV = 0:

2RH dR + R² dH = 0

Differentiating the constraint:

(2.405)²/R³ dR + (π²/H³) dH = 0

Multiplying the differentiated constraint by λ and adding, we now consider R and H to be independent (the multiplier is thereby determined). Setting each coefficient to zero:

2RH + λ(2.405)²/R³ = 0 ………(3)
R² + λπ²/H³ = 0 ………(4)

From equation (4), λ = −R²H³/π². Substituting into equation (3):

2RH = (2.405)² H³/(π²R), so H² = 2π²R²/(2.405)²

and hence H = √2 πR/2.405 ≈ 1.85 R.
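The result H ≈ 1.85 R can be checked by brute force: fix the constraint constant, solve the constraint for H at each R, and scan for the smallest volume. A minimal numeric sketch (the constraint constant B = 1 is an arbitrary illustrative value):

```python
import math

A = 2.405          # constant appearing in the diffusion constraint
B = 1.0            # constraint constant; arbitrary illustrative value
R_lo = A / math.sqrt(B)   # constraint requires (A/R)^2 < B, i.e. R > A/sqrt(B)

best_V, best_R, best_H = float("inf"), None, None
for k in range(1, 5000):
    R = R_lo * (1 + 0.001 * k)
    H = math.pi / math.sqrt(B - (A / R) ** 2)   # solve the constraint for H
    V = math.pi * R * R * H
    if V < best_V:
        best_V, best_R, best_H = V, R, H

print(best_H / best_R)   # close to sqrt(2)*pi/2.405 ≈ 1.85
```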

Boltzmann Statistics. We consider N distinguishable particles, and we can place any number of them into a particular state. We wish to determine the thermodynamic probability w for a particular macrostate. To guide us we consider a simple example: N = 3 particles (A, B, C), and take the macrostate in which N_i = 2 particles occupy the level ε_i, which has degeneracy g_i = 3. We begin with a box labeled ε_i, and we wish to throw 2 particles into the box.

1st particle: 3 choices (N)
2nd particle: 2 choices (N − 1)

The total number of choices is (3)(2), i.e. (N)(N − 1). These choices are shown on the next slide.

Choices:

1st  2nd
A    B
A    C
B    A
B    C
C    A
C    B

However, we can permute the particles in the box without changing the contents of the box. The number of permutations is 2! (in general, N_i!). Therefore the number of distinct choices is (3)(2)/2! = 3. They are AB, AC, BC. Now we arrange these particles among the 3 degenerate states. The 1st particle has 3 possibilities and the 2nd particle has 3 possibilities, so the total number of possibilities is 3² = 9. We will list these possibilities for the case where the box holds AB.

STATES

state 1   state 2   state 3
AB        –         –
–         AB        –
–         –         AB
A         B         –
B         A         –
A         –         B
B         –         A
–         A         B
–         B         A

We see that there are 9 possibilities. We also have the AC and BC pairs, each with 9 possible distributions among the states. The total number of possibilities is therefore 3 × 9 = 27. Now we turn to the general case of N particles and consider placing N_1 particles in the energy level ε_1, which has degeneracy g_1. As above, selecting the particles for the box, we have

N(N − 1)(N − 2)···(N − N_1 + 1) possibilities

We can write this as N!/(N − N_1)!
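The 3 × 9 = 27 total for this example can be reproduced by direct enumeration; a small sketch:

```python
from itertools import combinations, product

particles, g, n = ("A", "B", "C"), 3, 2   # N = 3, level degeneracy g_i = 3, N_i = 2
arrangements = [
    (pair, states)
    for pair in combinations(particles, n)           # which pair: AB, AC, BC (3 ways)
    for states in product(range(1, g + 1), repeat=n) # a state for each particle (3*3)
]
print(len(arrangements))   # 27
```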

and this is evidently N!/(N − N_1)!. Dividing by the N_1! permutations, the number of distinct selections is

N!/[N_1!(N − N_1)!]

Now we consider the possibilities for distributing these particles among the g_1 degenerate states of ε_1: each particle has g_1 choices, giving g_1^{N_1} arrangements. The total number of possibilities for this level is then

[N!/(N_1!(N − N_1)!)] g_1^{N_1}

Now we go to the ε_2 level. The procedure is obviously the same, except that we no longer have N particles: the number of available particles is N − N_1. As above we have, for this level,

[(N − N_1)!/(N_2!(N − N_1 − N_2)!)] g_2^{N_2} possibilities

and similarly for the ε_3 level, and so on.

Hence the thermodynamic probability is the product over all levels; the factors (N − N_1)!, (N − N_1 − N_2)!, … cancel, leaving

w = N! ∏_i g_i^{N_i}/N_i!   (Boltzmann statistics, distinguishable particles)

This is the total number of accessible microstates for a particular macrostate. The value of w will be different for each macrostate: the greater the value of w, the greater the probability of occurrence. Remember: the equilibrium macrostate is the one for which w is a maximum.

Example: Before continuing with our discussion of Boltzmann statistics, let us consider a simple situation. Consider 6 distinguishable objects (A, B, C, D, E, F) which can be placed in 6 boxes (1, 2, 3, 4, 5, 6). The degeneracy of each box is unity, so w = N!/∏_i N_i!. We calculate w for several macrostates, for example:

(6, 0, 0, 0, 0, 0): w = 6!/6! = 1
(5, 1, 0, 0, 0, 0): w = 6!/5! = 6
(1, 1, 1, 1, 1, 1): w = 6!/(1!)⁶ = 720 (most disordered)
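These counts follow directly from w = N!/∏N_i! (unit degeneracies); a minimal sketch:

```python
from math import factorial, prod

def w_boltzmann(occupations):
    """Thermodynamic probability w = N!/prod(N_i!) for boxes of unit degeneracy."""
    N = sum(occupations)
    return factorial(N) // prod(factorial(n) for n in occupations)

print(w_boltzmann([6, 0, 0, 0, 0, 0]))   # all six objects in one box: 1
print(w_boltzmann([1, 1, 1, 1, 1, 1]))   # one object per box: 720 (most disordered)
```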

Now, continuing with Boltzmann statistics: we vary the N_i to find the maximum value of w. This will give the equilibrium N_i in terms of the g_i and ε_i. For convenience we will work with ln w instead of w. (The range of values is much smaller.) This will also permit us to use Stirling's formula, valid for large x:

ln(x!) = x ln x − x

From w = N! ∏_i g_i^{N_i}/N_i!,

ln w = ln N! + Σ_i N_i ln g_i − Σ_i ln(N_i!)

Applying Stirling's formula to the last term:

ln w = ln N! + Σ_i N_i ln g_i − Σ_i (N_i ln N_i − N_i)

The first term is a constant. Now we maximize ln w to obtain the equilibrium distribution.
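Stirling's formula is only an approximation, but its relative error becomes negligible for large arguments, which is why it is safe for the enormous N of statistical mechanics. A quick check, using math.lgamma(x + 1) for the exact ln(x!):

```python
import math

# Relative error of Stirling's approximation ln(x!) ≈ x ln x - x
for x in (10, 100, 1000):
    exact = math.lgamma(x + 1)        # exact ln(x!)
    approx = x * math.log(x) - x      # Stirling
    print(x, (exact - approx) / exact)
```

The printed relative error shrinks steadily as x grows, dropping below 0.1% already at x = 1000.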

We have the constraints

Σ_i N_i = N and Σ_i N_i ε_i = U

Differentiating:

Σ_i dN_i = 0 and Σ_i ε_i dN_i = 0

We multiply the constraint conditions by the Lagrange multipliers α and −β, respectively. (This form of the multipliers is chosen for convenience.) We also have the condition for a maximum, d(ln w) = 0, which from the expression for ln w gives

Σ_i [ln g_i − ln N_i] dN_i = 0

Adding the multiplied constraints:

Σ_i [ln(g_i/N_i) + α − βε_i] dN_i = 0

The dN_i are now taken as independent, so the coefficient of each must vanish:

ln(g_i/N_i) + α − βε_i = 0

Solving for N_i yields

N_i = g_i e^{α} e^{−βε_i}

Hence N_i varies with the degeneracy and with the energy of the level. Now we need to determine the Lagrange multipliers. Summing over i and using Σ_i N_i = N gives e^{α} = N/Z, so we write the result in the form

N_i = (N/Z) g_i e^{−βε_i}   with   Z = Σ_i g_i e^{−βε_i}

Z comes from Zustandssumme, which means a sum over states. It is usually referred to as the partition function. As we shall see, partition functions are very useful. It turns out that Z depends only on T and the parameters that determine the energy levels.

Once Z is known it is straightforward to calculate many thermodynamic properties, such as U, S and P. Of course there are other ways of calculating these quantities, but the simplest way is to first calculate the appropriate ln(Z). We still need to determine β.

Now we bring T into statistical mechanics. For a process taking place between two equilibrium states,

đQ = dU + PdV and dS = đQ/T, so dS = (1/T)dU + (P/T)dV

Considering S as a function of U and V,

dS = (∂S/∂U)_V dU + (∂S/∂V)_U dV

and comparing, we see that

(∂S/∂U)_V = 1/T   (reciprocity relation)

The state variables S and U may be calculated by statistical methods, and so this equation brings the macroscopic concept of T into statistical thermodynamics.

Now we are in a position to discuss the Lagrange multiplier β. We have the following two equations from above:

ln(g_i/N_i) + α − βε_i = 0

d(ln w) = Σ_i ln(g_i/N_i) dN_i

Substituting the first equation into the second:

d(ln w) = Σ_i (βε_i − α) dN_i

We are dealing with a closed system at constant V, so Σ_i dN_i = 0 and dU = Σ_i ε_i dN_i. This gives

d(ln w) = β dU

Using the Boltzmann relationship S = k ln w,

dS = k d(ln w) = kβ dU

From the reciprocity relationship, (∂S/∂U)_V = 1/T, while the statistical result gives (∂S/∂U)_V = kβ. Hence

β = 1/kT

The temperature turns out to be (essentially) a Lagrange multiplier! Now that we have determined the Lagrange multipliers, we write down the following two fundamental equations:

N_i = (N/Z) g_i e^{−ε_i/kT}   (Boltzmann distribution; see note on next slide)

Z = Σ_i g_i e^{−ε_i/kT}

The Boltzmann distribution gives the probability of occupation of a single state belonging to the ith energy level: N_i/(N g_i) = e^{−ε_i/kT}/Z. The partition function, an explicit function of T and an implicit function of V (through the energy levels), contains the statistical information about the system.

NOTE: There is some confusion regarding terminology. The term "Boltzmann distribution" for this equation is unfortunate because, among other things, it is easily confused with the Maxwell-Boltzmann distribution. The Boltzmann distribution applies to systems of distinguishable particles with N, V and U fixed. The Maxwell-Boltzmann distribution is applicable only to dilute gases.

Consider the case in which the energy levels are very closely spaced. Then, instead of considering discrete levels with degeneracies g_i, we consider a continuum and replace g_i by g(ε)dε, the number of states in the energy range between ε and ε + dε. The Boltzmann distribution is then written in the form

N(ε)dε = (N/Z) g(ε) e^{−ε/kT} dε   (distinguishable particles)

Example of the Boltzmann distribution. Suppose that we have 1000 particles and T = 10000 K. The available states and their degeneracies are shown below. The partition function and the distribution of particles among the states are calculated using N_i = (N/Z) g_i e^{−ε_i/kT}:

ε_i (eV):  0   1   2   3
g_i:       1   2   3   4

Z = Σ_i g_i e^{−ε_i/kT} = 2.045

ε_i (eV)   N_i
0          489
1          307
2          144
3           60
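These numbers are straightforward to reproduce; a minimal sketch (k = 8.617×10⁻⁵ eV/K; the last digit of Z can differ slightly depending on how k is rounded):

```python
import math

k = 8.617e-5                     # Boltzmann constant, eV/K
T, N = 10_000, 1000
levels = [(0.0, 1), (1.0, 2), (2.0, 3), (3.0, 4)]   # (energy in eV, degeneracy)

Z = sum(g * math.exp(-e / (k * T)) for e, g in levels)
populations = [round(N * g * math.exp(-e / (k * T)) / Z) for e, g in levels]
print(Z, populations)            # Z ≈ 2.04, populations [489, 307, 144, 60]
```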

Canonical ensemble. This consists of a set of systems in contact with a large thermal reservoir. For the Boltzmann distribution V, N and U are fixed; we now consider a system in which V, N and T are fixed. The energy of such a system is not fixed, but fluctuates about some average value due to the continuous interchange of energy between the system and the reservoir. For large N these fluctuations are small. The formula giving the probability that the system is in a particular state r with energy E_r is one of the most important in statistical mechanics:

p_r = e^{−E_r/kT} / Σ_s e^{−E_s/kT}

It applies to any system with a fixed number of particles in thermal equilibrium (V, N, T fixed). We will not provide a derivation of this probability formula, which is essentially the same as that of the Boltzmann formula. {See section 13.9 of the textbook.}

The factor e^{−E/kT} is called the Boltzmann factor. We see that at low temperatures only low-energy states are populated. As the temperature increases, the population shifts to higher-energy states.

Example: An atmosphere at a temperature of 6000 K has, as one constituent, neutral atoms with the first excited state 1.20 eV above the ground state. The ground state (taken as the reference energy level of 0 eV) is doubly degenerate, while the 1st excited state is 6-fold degenerate. Assuming that there is negligible population of higher excited states, what fraction of these atoms are in the 1st excited state? (Use the canonical distribution.)

ε₁ = 1.2 eV, g₁ = 6;   ε₀ = 0 eV, g₀ = 2

N₁/N₀ = (g₁/g₀) e^{−1.2 eV/kT} = 3 e^{−1.2/kT}

At T = 6000 K, kT = 0.517 eV, so N₁/N₀ = 3 e^{−2.32} = 0.295 and the fraction in the excited state is

N₁/(N₀ + N₁) = 0.295/1.295 ≈ 0.23
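The arithmetic is a one-liner to verify; a minimal sketch:

```python
import math

k = 8.617e-5                  # Boltzmann constant, eV/K
T, dE = 6000, 1.20            # temperature (K), excitation energy (eV)
g0, g1 = 2, 6                 # degeneracies of ground and excited levels

ratio = (g1 / g0) * math.exp(-dE / (k * T))   # N1/N0 = 3 e^{-1.2/kT}
frac = ratio / (1 + ratio)                    # N1/(N0 + N1)
print(frac)                   # ≈ 0.23 (about 23% in the excited state)
```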

Now we are going to discuss two other important distributions. In preparation we review some aspects of fundamental particles. These particles are divided into two classes, depending on the quantum number associated with their intrinsic angular momentum, or "spin":

Spin quantum number an integer: bosons (examples are photons, gravitons, pi mesons, …)
Spin quantum number an odd multiple of 1/2: fermions (examples are electrons, quarks, muons, …)

Fermions obey the Pauli exclusion principle: in an isolated system, no two fermions can occupy the same state.

Fermi-Dirac Distribution. This distribution is for indistinguishable fermions. There can be no more than one particle in any state, which places the following restriction on any macrostate: N_i ≤ g_i. Consider the ith energy level. We wish to place N_i particles into the g_i states.

For the 1st particle there are g_i possibilities.
For the 2nd particle there are g_i − 1 possibilities.
…
For the N_i-th particle there are g_i − N_i + 1 possibilities.

The total number of possibilities is then

g_i(g_i − 1)···(g_i − N_i + 1) = g_i!/(g_i − N_i)!

Since the particles are indistinguishable, we can permute them in a particular distribution without obtaining a different distribution.

The total number of distinct possibilities for the level is then

g_i!/[N_i!(g_i − N_i)!]

and the total thermodynamic probability for a particular macrostate is

w = ∏_i g_i!/[N_i!(g_i − N_i)!]

{What are the N_i at equilibrium?} Again we work with ln w:

ln w = Σ_i [ln(g_i!) − ln(N_i!) − ln((g_i − N_i)!)]

The first term in each bracket is constant (the g_i are fixed!). We use Stirling's approximation for the other two terms:

ln w = Σ_i [ln(g_i!) − N_i ln N_i + N_i − (g_i − N_i) ln(g_i − N_i) + (g_i − N_i)]
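The binomial count can be checked by listing the microstates explicitly; a small sketch for N_i = 2 fermions in a level with g_i = 4 states:

```python
from itertools import combinations
from math import factorial

g, n = 4, 2   # g_i = 4 states, N_i = 2 indistinguishable fermions
# Each microstate is just the set of occupied states (at most one particle each).
microstates = list(combinations(range(g), n))
print(len(microstates))   # 6
assert len(microstates) == factorial(g) // (factorial(n) * factorial(g - n))
```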

As before, we have the constraints

Σ_i dN_i = 0 and Σ_i ε_i dN_i = 0

and we introduce the Lagrange multipliers α and −β. Since d(ln w) = Σ_i ln((g_i − N_i)/N_i) dN_i, this gives

Σ_i [ln((g_i − N_i)/N_i) + α − βε_i] dN_i = 0

With the constraints included in the equation, the dN_i are independent, so the coefficient of each must vanish:

ln((g_i − N_i)/N_i) + α − βε_i = 0

It is not trivial to determine the Lagrange multipliers. It turns out, as before, that β = 1/kT. The other multiplier is related to the chemical potential: α = μ/kT. With these assignments we have

(g_i − N_i)/N_i = e^{(ε_i − μ)/kT}

Solving for N_i gives the Fermi-Dirac distribution:

N_i = g_i / (e^{(ε_i − μ)/kT} + 1)

For a continuous energy distribution this becomes

N(ε)dε = g(ε)dε / (e^{(ε − μ)/kT} + 1)

Later in the course we will use the Fermi-Dirac distribution.

Bose-Einstein Statistics. The particles are again indistinguishable, but now any number of particles can be in a particular energy state. (The Pauli exclusion principle does not apply.) Example: 13 particles in the ith energy level, which has a degeneracy of 8. The 8 states can be pictured as cells separated by 7 partitions, for example:

xxx | x | xxx | x | xx | xx | x |
xx | xx | x | xx | x | x | x | xxx

We can obtain different distributions by moving particles and partitions around. How many ways can we arrange N_i particles and g_i − 1 partitions to form different distributions, given that the particles are identical and the partitions are identical? The answer is (students: convince yourselves)

Number of ways = (N_i + g_i − 1)!/[N_i!(g_i − 1)!]

Example: 3 particles in 4 degenerate states: (3 + 4 − 1)!/(3!·3!) = 20. Students: show these 20 distributions. (BEEx.ppt)

The thermodynamic probability for a macrostate is then

w = ∏_i (N_i + g_i − 1)!/[N_i!(g_i − 1)!]

We proceed as before. Using Stirling's formula,

ln w = Σ_i [(N_i + g_i − 1) ln(N_i + g_i − 1) − N_i ln N_i − (g_i − 1) ln(g_i − 1)]
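The 20 distributions can be generated rather than drawn; a small sketch:

```python
from itertools import combinations_with_replacement
from math import factorial

g, n = 4, 3   # 3 bosons among 4 degenerate states, multiple occupancy allowed
# Each distribution is a multiset of occupied states, e.g. (0, 0, 2).
distributions = list(combinations_with_replacement(range(g), n))
print(len(distributions))   # 20

expected = factorial(n + g - 1) // (factorial(n) * factorial(g - 1))
assert len(distributions) == expected   # (N+g-1)!/(N!(g-1)!)
```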

Introducing the Lagrange multipliers α and −β with the constraint conditions Σ_i dN_i = 0 and Σ_i ε_i dN_i = 0, and proceeding as before while neglecting the 1 (g_i large!):

d(ln w) = Σ_i ln((N_i + g_i)/N_i) dN_i

Σ_i [ln((N_i + g_i)/N_i) + α − βε_i] dN_i = 0

Setting each coefficient to zero:

ln((N_i + g_i)/N_i) + α − βε_i = 0

Again, a more detailed analysis gives the same values as before for the Lagrange multipliers: β = 1/kT and α = μ/kT. Solving for N_i gives the Bose-Einstein distribution:

N_i = g_i / (e^{(ε_i − μ)/kT} − 1)

This distribution will be used later in the course.

Maxwell-Boltzmann Statistics and Review. Let us consider an assembly (system) of N indistinguishable particles. A macrostate is a given distribution of particles in the various energy levels. A microstate is a given distribution of particles in the energy states. Basic postulate of statistical mechanics: all accessible microstates of an isolated system are equally probable. We have considered some general assembly with:

N_1 particles in any of the g_1 states of ε_1
N_2 particles in any of the g_2 states of ε_2
N_3 particles in any of the g_3 states of ε_3
…

We now impose the restriction g_i >> N_i for all i. This gives Maxwell-Boltzmann statistics. This condition holds for all real gases except at very low temperatures. At low temperatures one must use either Bose-Einstein or Fermi-Dirac statistics, depending on the

spin of the molecules. This restriction means that it is very unlikely that more than one particle will be in a given state. Subject to this restriction we first consider the number of ways that N_i distinguishable particles can be distributed among the g_i states. The first particle can be placed in any one of the g_i states. The second particle can then be placed in any of the remaining g_i − 1 states, and so forth. The total number of different ways is then

g_i(g_i − 1)(g_i − 2)···(g_i − N_i + 1)

Since g_i >> N_i, this is approximately g_i^{N_i}. At present we are interested in indistinguishable particles, and so many of these distributions will be the same when the condition of distinguishability is removed. We can start with one particular distribution and then obtain identical distributions by permuting the indistinguishable particles among themselves. The number of such permutations is N_i!. Hence the number of ways, subject to the restriction, that the indistinguishable particles can be distributed among the g_i states is

g_i^{N_i}/N_i! ………(1)

A fundamental problem in statistical mechanics is to determine the particular macrostate of a system when at equilibrium. The U of the isolated system is fixed, and so each microstate has this value. The laws of mechanics do not lead one to expect that the system will be found in one microstate rather than in any other. This is consistent with our postulate that all microstates are equally probable. Of course, all such postulates must be verified by comparing calculations based on them with experimental results. Now let us return to a consideration of a particular macrostate, that is, a particular set {N_1, N_2, N_3, …}. As we stated earlier, the number of ways this particular macrostate can be achieved is called the thermodynamic probability w. We calculated this probability (unnormalized) above for one energy level. The total thermodynamic probability is the product of the individual probabilities for all the accessible levels. From equation (1):

w = ∏_i g_i^{N_i}/N_i!

As shown in the textbook, the Fermi-Dirac and Bose-Einstein probabilities both reduce to this equation when one makes the approximation g_i >> N_i. The thermodynamic probabilities for Boltzmann statistics and Maxwell-Boltzmann statistics are

w_B = N! ∏_i g_i^{N_i}/N_i!   (distinguishable)
w_MB = ∏_i g_i^{N_i}/N_i!   (indistinguishable)

These two probabilities differ by a constant factor (N!), and hence their differentials d(ln w) are the same. In addition, the constraint conditions (constant N and U) are the same. Hence, in using the method of Lagrange undetermined multipliers, we obtain the same result:

N_i = (N/Z) g_i e^{−ε_i/kT}   (Maxwell-Boltzmann, so f_i = N_i/g_i << 1)

Statistical Interpretation of Heat and Work. For concreteness we return to the quantum mechanical formula for the energy levels available to molecules of a gas in a container of volume V (particle in a box):

ε_n = n²h²/(8mV^{2/3})

For a given quantum level (given n) the energy depends only upon the volume. The internal energy of the gas is

U = Σ_i N_i ε_i, so dU = Σ_i ε_i dN_i + Σ_i N_i dε_i

Macroscopically, dU = đQ − PdV.

Now suppose that the volume of the gas does not change. (No work is done on the gas.) Then the dε_i vanish, and comparison of the two equations yields

đQ = Σ_i ε_i dN_i

When energy is added by the heating process, the level energies do not change, but the distribution of particles among the energy levels changes. (See Fig. 13.4.) In heating the gas you are promoting molecules from lower levels to higher levels. Now suppose that, instead of heating the gas, one does work on the gas, so that its volume decreases. In this case the above formulae show that

−PdV = Σ_i N_i dε_i

Now the distribution of particles among the energy levels does not change, but the energies of the levels change. Doing work on the gas raises the energy levels. (See Fig. 13.5.)


The entropy, Helmholtz function and chemical potential in terms of the partition function. We obtain these formulae using the Maxwell-Boltzmann distribution. Using the Stirling approximation,

ln w = Σ_i [N_i ln(g_i/N_i) + N_i]

But from the Maxwell-Boltzmann distribution, ln(g_i/N_i) = ln(Z/N) + ε_i/kT, so

S = k ln w = Nk ln(Z/N) + U/T + Nk

Since we can write the Helmholtz function in terms of U and S, we can easily obtain an expression for F in terms of Z:

F = U − TS = U − T[Nk ln(Z/N) + U/T + Nk] = −NkT[ln(Z/N) + 1]   (M-B dist.)

{Note: Z = Z(T, V), so F = F(T, V, N).} We have already determined the chemical potential in terms of F: μ = (∂F/∂N)_{T,V}. Differentiating F with respect to N,

μ = −kT ln(Z/N)   (M-B dist.)

Finally we write the Maxwell-Boltzmann distribution in terms of the chemical potential. This will permit a comparison of the three distributions. From the above expression for the chemical potential,

e^{μ/kT} = N/Z

Substituting into the distribution N_i = (N/Z) g_i e^{−ε_i/kT}:

N_i/g_i = e^{−(ε_i − μ)/kT}

Comparison of the distributions. The chemical potential, which enters these distributions, will be discussed when we use them. At this stage we simply plot the distributions, which can all be represented by

f = N_i/g_i = 1/(e^{(ε_i − μ)/kT} + a)

a = +1  Fermi-Dirac
a = −1  Bose-Einstein
a = 0   Maxwell-Boltzmann

We plot these distributions as a function of x = (ε − μ)/kT. (MAPLE plot distrib.mws)

(Plot of the three distribution functions versus x; the Maxwell-Boltzmann curve marks its limit of validity.)

Consider the BE distribution: for x = (ε − μ)/kT = 0 the distribution is infinite, and for x < 0 it is negative and hence meaningless. The particles tend to condense into the lower energy states.

Consider the FD distribution: for x = 0, f = 1/2, and for x << 0, f → 1. The low-energy levels are very nearly uniformly populated, with one particle per state.

Consider the MB distribution: this distribution is valid only for f << 1, i.e. for e^x >> 1. In this limit it is an approximation to both the BE and FD distributions.
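A quick numerical comparison of f = 1/(e^x + a) illustrates both limits; a minimal sketch:

```python
import math

def f(x, a):
    """Mean occupation per state; x = (eps - mu)/kT, a = +1 (FD), -1 (BE), 0 (MB)."""
    return 1.0 / (math.exp(x) + a)

for x in (0.5, 2.0, 5.0):
    fd, be, mb = f(x, +1), f(x, -1), f(x, 0)
    print(f"x = {x}: FD = {fd:.4f}  BE = {be:.4f}  MB = {mb:.4f}")
# By x = 5 the three agree to about 1% (the M-B limit, f << 1);
# near x = 0 the BE occupation blows up while FD stays below 1.
```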

We have introduced four distributions. These give the distribution of particles among the accessible states.

Boltzmann: distinguishable particles; any number can go into an energy state.
Fermi-Dirac: indistinguishable particles which obey the Pauli exclusion principle.
Bose-Einstein: indistinguishable particles; any number can go into an energy state.
Maxwell-Boltzmann: indistinguishable particles; any number can go into an energy state. Valid when g_i >> N_i.

An extremely useful function was introduced: the partition function

Z = Σ_i g_i e^{−ε_i/kT}   {a sum of Boltzmann factors}

Many thermodynamic variables and functions can be written in terms of this function, which contains all the statistical information about the system.

We also introduced the canonical distribution, a probability which has wide applicability.