Chapter 13 Classical & Quantum Statistics


Chapter 13 Classical & Quantum Statistics

13.1 Boltzmann Statistics
Boltzmann statistics deals with distinguishable, non-interacting particles. There are two constraints:
Σ_J N_J = N
Σ_J N_J·ε_J = U
where N_J is the number of particles with single-particle energy ε_J.

The goal: find the occupation number of each energy level when the thermodynamic probability is a maximum (i.e., the configuration of an equilibrium state). The number of ways of selecting N_J particles from a total of N to be placed in the J-th level is a binomial coefficient.
For level 1: N!/(N_1!(N − N_1)!)
For level 2, there are only (N − N_1) particles left, thus: (N − N_1)!/(N_2!(N − N_1 − N_2)!)
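The two binomial counts above can be checked directly; a minimal sketch using Python's math.comb, with hypothetical particle numbers chosen only for illustration:

```python
from math import comb

# Ways to choose N1 particles for level 1 out of N, then N2 of the
# remaining N - N1 for level 2 (toy numbers, not from the text).
N, N1, N2 = 6, 2, 3
ways_level_1 = comb(N, N1)        # N! / (N1! (N - N1)!)
ways_level_2 = comb(N - N1, N2)   # (N - N1)! / (N2! (N - N1 - N2)!)
print(ways_level_1, ways_level_2)  # 15 4
```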

We consider that each energy level may contain more than one quantum state (degeneracy, g ≥ 1). Using g_J to represent the number of quantum states on energy level J, there are g_1 quantum states on level 1, and each of the N_1 particles has g_1 choices, giving g_1^{N_1} possibilities in total. Combining this with the arrangement of the N_1 particles, W_1 becomes
W_1 = g_1^{N_1}·N!/(N_1!(N − N_1)!)
For the second energy level, W_2 becomes
W_2 = g_2^{N_2}·(N − N_1)!/(N_2!(N − N_1 − N_2)!)

The thermodynamic probability for a system with n energy levels is the product W = W_1·W_2 ··· W_n, in which the factors (N − N_1)!, (N − N_1 − N_2)!, … cancel, leaving
W = N!·Π_{J=1}^{n} (g_J^{N_J}/N_J!)
The above equation is subject to the two constraints listed at the beginning of this section.
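The product formula for W can be evaluated for a toy system; the function name and the sample occupations below are illustrative, not from the text:

```python
from math import factorial, prod

def thermo_probability(N, occupations, degeneracies):
    """W = N! * prod_J g_J**N_J / N_J!  (distinguishable particles)."""
    assert sum(occupations) == N
    return factorial(N) * prod(g**n / factorial(n)
                               for n, g in zip(occupations, degeneracies))

# Toy system: 4 particles over two levels with degeneracies 1 and 2.
print(thermo_probability(4, [3, 1], [1, 2]))  # 4!/(3!·1!) · 1^3 · 2^1 = 8.0
```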

13.2 Lagrange Multipliers
Suppose there are only two energy levels in a system. The thermodynamic probability W can then be expressed in terms of the number of particles on each level: W = W(N_1, N_2). The arrangement of N_1 and N_2 that gives the largest value of W is found by differentiating with respect to N_1 and N_2; at the maximum, dW = 0:
dW = (∂W/∂N_1)·dN_1 + (∂W/∂N_2)·dN_2 = 0

If N_1 and N_2 were independent variables, one would have ∂W/∂N_1 = 0 and ∂W/∂N_2 = 0. However, N_1 and N_2 are connected through N = N_1 + N_2. For N = N(N_1, N_2), one has
dN = (∂N/∂N_1)·dN_1 + (∂N/∂N_2)·dN_2 = dN_1 + dN_2 = 0

Thus dN_2 = −dN_1, and substituting into dW = 0 one gets
∂W/∂N_1 = ∂W/∂N_2
Let this common ratio (∂W/∂N_J)/(∂N/∂N_J) be a constant, α; since ∂N/∂N_J = 1, this gives
∂W/∂N_J − α = 0
For a system with n energy levels, there will be n such differential equations. Note that when taking the derivative with respect to N_1, the other occupation numbers N_2, N_3, …, N_n are kept constant!

α and β are the Lagrange multipliers: when there are two constraints, two parameters are needed. For example, with
N = Σ_J N_J and U = Σ_J N_J·ε_J
the condition for a maximum becomes, for each J,
∂W/∂N_J − α − β·ε_J = 0
where α and β are the Lagrange multipliers. (The multipliers are arbitrary constants, so the signs attached to them are a matter of convention.)
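The constrained maximum that the Lagrange-multiplier method locates analytically can be verified by brute force for a tiny system; the particle number, energy, and equally spaced levels below are hypothetical choices for illustration:

```python
from math import factorial
from itertools import product

def multiplicity(occ):
    """W = N! / prod_J N_J!  (non-degenerate levels, g_J = 1)."""
    W = factorial(sum(occ))
    for n in occ:
        W //= factorial(n)
    return W

# Fix N and U, then scan every occupation set {N_J} satisfying both
# constraints and keep the one with the largest W.
# Toy numbers: levels eps_J = J for J = 0..6.
N, U, n_levels = 6, 6, 7
candidates = (occ for occ in product(range(N + 1), repeat=n_levels)
              if sum(occ) == N
              and sum(j * n for j, n in enumerate(occ)) == U)
best = max(candidates, key=multiplicity)
print(best, multiplicity(best))  # (3, 1, 1, 1, 0, 0, 0) 120
```

The winning occupation set already falls off with energy, anticipating the exponential Boltzmann form derived in the next section.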

13.3 Boltzmann Distribution
Since ln W is a monotonically increasing function of W, maximizing ln W is equivalent to maximizing W, and the logarithm is much easier to work with. Starting from
W = N!·Π_J (g_J^{N_J}/N_J!)
ln W = ln N! + Σ_J ln(g_J^{N_J}/N_J!)  --- using ln(xy) = ln x + ln y
ln W = ln N! + Σ_J N_J·ln g_J − Σ_J ln N_J!  --- using ln(x/y) = ln x − ln y
Applying Stirling's approximation (ln x! ≈ x·ln x − x) to the last term:
ln W = ln N! + Σ_J (N_J·ln g_J) − Σ_J (N_J·ln N_J − N_J)
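Stirling's approximation is what makes this maximization tractable; a quick numerical check of its accuracy (the function name is ours) shows the relative error shrinking as n grows, which justifies its use for N on the order of 10^23:

```python
from math import log, factorial

def stirling_ln_factorial(n):
    # Stirling's approximation: ln n! ≈ n ln n - n
    return n * log(n) - n

# Relative error of the approximation at increasing n.
for n in (10, 100, 1000):
    exact = log(factorial(n))
    approx = stirling_ln_factorial(n)
    print(n, abs(exact - approx) / exact)
```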

Using the method of Lagrange multipliers, for each J,
∂(ln W)/∂N_J + α + β·ε_J = 0
From the Stirling expression above, ∂(ln W)/∂N_J = ln g_J − ln N_J (the derivative of −N_J·ln N_J is −ln N_J − 1, and the +1 from the +N_J term cancels it). Therefore
ln g_J − ln N_J + α + β·ε_J = 0
ln(N_J/g_J) = α + β·ε_J
N_J/g_J = e^{α + β·ε_J}

The ratio N_J/g_J, which gives the number of particles per quantum state, is called the Boltzmann distribution. Now we relate α and β to physical properties. Since ln g_J − ln N_J + α + β·ε_J = 0, multiplying through by N_J gives
N_J·ln g_J − N_J·ln N_J + α·N_J + β·N_J·ε_J = 0

Summing the above equation over all energy levels,
Σ_J (N_J·ln g_J) − Σ_J (N_J·ln N_J) + α·Σ_J N_J + β·Σ_J N_J·ε_J = 0
Σ_J (N_J·ln g_J) − Σ_J (N_J·ln N_J) + α·N + β·U = 0
From the Stirling expression for ln W, the first two terms equal ln W − ln N! − N, so
ln W − ln N! − N + α·N + β·U = 0
ln W = ln N! + N − α·N − β·U

Since S = k·ln W,
S = k(ln N! + N − α·N − β·U)
Using Stirling's approximation again, ln N! ≈ N·ln N − N, so
S = k·N·ln N − α·k·N − β·k·U
The first two terms, k·N·ln N − α·k·N, form a constant S_0, so
S = S_0 − β·k·U
Because T·dS = dU + P·dV,
dS = (1/T)·dU + (P/T)·dV
therefore (∂S/∂U)_V = 1/T

Since S = S_0 − k·β·U, we also have (∂S/∂U)_V = −k·β. Thus
−k·β = 1/T, i.e. β = −1/(kT)
Substituting into the Boltzmann distribution,
N_J/g_J = e^{α}·e^{−ε_J/kT}
N_J = g_J·e^{α}·e^{−ε_J/kT}
Summing over all levels and using Σ_J N_J = N:
N = e^{α}·Σ_J g_J·e^{−ε_J/kT}, so e^{α} = N/Z
where
Z = Σ_J g_J·e^{−ε_J/kT}
is called the partition function. The Boltzmann distribution then takes its final form:
N_J = (N/Z)·g_J·e^{−ε_J/kT}
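The final distribution is easy to evaluate numerically; a small sketch, assuming a hypothetical two-level system (the energies, degeneracies, and function name are made up for illustration):

```python
from math import exp

k = 1.380649e-23  # Boltzmann constant, J/K

def boltzmann_occupations(N, levels, T):
    """N_J = (N/Z) * g_J * exp(-eps_J / kT); levels = [(eps_J, g_J), ...]."""
    Z = sum(g * exp(-eps / (k * T)) for eps, g in levels)
    return [N * g * exp(-eps / (k * T)) / Z for eps, g in levels]

# Hypothetical two-level system with an energy gap of roughly kT at 300 K.
levels = [(0.0, 1), (4.2e-21, 2)]  # (energy in J, degeneracy)
occ = boltzmann_occupations(1000, levels, 300.0)
print(occ)        # occupation numbers per level
print(sum(occ))   # constraint: populations sum to N
```

Note that the Σ N_J = N constraint is satisfied automatically because Z normalizes the exponential weights.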