The Statistical Interpretation of Entropy


The aim of this lecture is to show that entropy can be interpreted in terms of the degree of randomness, as originally shown by Boltzmann. Boltzmann's definition of entropy is

$$S = k \ln \Omega$$

where $\Omega$ is the thermodynamic probability, i.e. the number of microstates (complexions) corresponding to a given macrostate. As an example, consider a system of 3 particles with energy levels $\varepsilon_0 = 0$, $\varepsilon_1 = u$, $\varepsilon_2 = 2u$ and $\varepsilon_3 = 3u$. Let the total energy of the system be $U = 3u$.

The total energy of $3u$ can be realized in various configurations, or microstate complexions. The distinguishable distributions for $U = 3u$ are:

- distribution a: all three particles in level $\varepsilon_1$; $\Omega_a = 1$; probability of occurrence $1/10$
- distribution b: one particle in level $\varepsilon_3$, two particles in level $\varepsilon_0$; $\Omega_b = 3$; probability $3/10$
- distribution c: one particle in level $\varepsilon_2$, one in level $\varepsilon_1$, one in level $\varepsilon_0$; $\Omega_c = 6$; probability $6/10$

All three distributions ($1 + 3 + 6 = 10$ complexions in total) correspond to a single "observable" macrostate.
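This counting can be checked by brute force. The following is a minimal sketch (the names `microstates` and `distributions` are just illustrative), representing the levels as integers 0–3, i.e. energies in units of $u$:

```python
from itertools import product
from collections import Counter

# Each of 3 distinguishable particles occupies one of the levels
# with energies 0, u, 2u, 3u (integers 0..3 in units of u).
# Keep only microstates whose total energy is 3u.
microstates = [s for s in product(range(4), repeat=3) if sum(s) == 3]

# Group microstates by occupation numbers (n0, n1, n2, n3):
# microstates sharing the same occupation numbers form one distribution.
distributions = Counter(
    tuple(s.count(level) for level in range(4)) for s in microstates
)

for occupation, omega in sorted(distributions.items(), key=lambda kv: -kv[1]):
    print(f"(n0,n1,n2,n3) = {occupation}: omega = {omega} of {len(microstates)}")
# (1, 1, 1, 0): 6 of 10   <- distribution c
# (2, 0, 0, 1): 3 of 10   <- distribution b
# (0, 3, 0, 0): 1 of 10   <- distribution a
```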

In general, the number of arrangements, or complexions, within a single distribution is given by

$$\Omega = \frac{n!}{n_0!\, n_1!\, n_2! \cdots}$$

where $n$ particles are distributed among the energy levels such that $n_0$ are in level $\varepsilon_0$, $n_1$ are in level $\varepsilon_1$, and so on. For the example above:

- distribution a: $\Omega_a = \dfrac{3!}{0!\,3!\,0!\,0!} = 1$
- distribution b: $\Omega_b = \dfrac{3!}{2!\,0!\,0!\,1!} = 3$
- distribution c: $\Omega_c = \dfrac{3!}{1!\,1!\,1!\,0!} = 6$
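The same numbers follow directly from the multinomial formula; a small helper (the name `omega` is hypothetical) reproduces the counts obtained by enumeration:

```python
from math import factorial

def omega(occupation):
    """Complexion count n! / (n0! * n1! * n2! * ...)."""
    n = sum(occupation)
    denom = 1
    for n_i in occupation:
        denom *= factorial(n_i)
    return factorial(n) // denom

print(omega((0, 3, 0, 0)))  # distribution a -> 1
print(omega((2, 0, 0, 1)))  # distribution b -> 3
print(omega((1, 1, 1, 0)))  # distribution c -> 6
```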

The most probable distribution is determined by the set of numbers $n_i$ that maximizes $\Omega$. Since for real systems these numbers are very large (consider the number of particles in 1 mole of gas), Stirling's approximation will be useful:

$$\ln n! \approx n \ln n - n$$

The observable macrostate is determined by the constraints of constant energy and constant number of particles in the system:

$$U = \sum_i n_i \varepsilon_i = \text{const.}, \qquad N = \sum_i n_i = \text{const.}$$
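A quick numerical look at how good Stirling's approximation becomes as $n$ grows, using `lgamma(n + 1)` for the exact $\ln n!$:

```python
from math import lgamma, log

# Compare the exact ln(n!) (via lgamma) with Stirling's n ln n - n.
for n in (10, 100, 10_000, 1_000_000):
    exact = lgamma(n + 1)
    stirling = n * log(n) - n
    print(f"n = {n:>9}: ln n! = {exact:14.2f}, "
          f"n ln n - n = {stirling:14.2f}, "
          f"relative error = {(exact - stirling) / exact:.1e}")
```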

Any interchange of particles among the energy levels is constrained by the conditions

$$\delta U = \sum_i \varepsilon_i\, \delta n_i = 0 \qquad \text{(A)}$$

$$\delta N = \sum_i \delta n_i = 0 \qquad \text{(B)}$$

Also, using the definition $\Omega = n!/(n_0!\,n_1!\,n_2!\cdots)$ and Stirling's approximation,

$$\ln \Omega = \ln n! - \sum_i \ln n_i! \approx n \ln n - \sum_i n_i \ln n_i$$

The constraints on the particle numbers impose a condition on $\Omega$. What we need is to find the most likely distribution, which is given by the maximum value of $\ln \Omega$:

$$\delta \ln \Omega = -\sum_i \ln n_i\, \delta n_i = 0 \qquad \text{(C)}$$

This occurs when equations A, B and C are simultaneously satisfied, which calls for the technique of Lagrange multipliers: a method for finding the extrema of a function of several variables subject to one or more constraints. We multiply equation A by a quantity $\beta$, which has units of reciprocal energy:

$$\beta \sum_i \varepsilon_i\, \delta n_i = 0 \qquad \text{(D)}$$

Equation B is multiplied by a dimensionless constant $\alpha$:

$$\alpha \sum_i \delta n_i = 0 \qquad \text{(E)}$$

Equations C, D and E are added to give

$$\sum_i \left( \ln n_i + \alpha + \beta \varepsilon_i \right) \delta n_i = 0$$

Since the variations $\delta n_i$ are now independent, this can only occur if each of the bracketed quantities is identically zero:

$$\ln n_i + \alpha + \beta \varepsilon_i = 0$$

Rearranging for the $n_i$,

$$n_i = e^{-\alpha}\, e^{-\beta \varepsilon_i}$$

and summing over all $r$ energy levels,

$$N = \sum_{i=0}^{r} n_i = e^{-\alpha} \sum_{i=0}^{r} e^{-\beta \varepsilon_i}$$

The quantity $\sum_i e^{-\beta \varepsilon_i}$ is very important and occurs very often in the study of statistical mechanics. It is called the partition function,

$$P = \sum_{i=0}^{r} e^{-\beta \varepsilon_i}$$

Then $e^{-\alpha} = N/P$, which allows us to write the expression for $n_i$ in convenient form:

$$n_i = \frac{N}{P}\, e^{-\beta \varepsilon_i}$$
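A minimal sketch of these two formulas in code (the helper name `populations` and the parameter values are illustrative choices, not from the lecture):

```python
from math import exp

def populations(N, energies, beta):
    """n_i = (N / P) * exp(-beta * e_i), with P = sum_i exp(-beta * e_i)."""
    boltzmann_factors = [exp(-beta * e) for e in energies]
    P = sum(boltzmann_factors)
    return [N * w / P for w in boltzmann_factors]

levels = [0.0, 1.0, 2.0, 3.0]          # energies 0, u, 2u, 3u with u = 1
n_i = populations(1000, levels, beta=1.0)
print([round(x, 1) for x in n_i])      # populations fall off exponentially
print(round(sum(n_i), 6))              # -> 1000.0, particle number conserved
```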

So the distribution of particles that maximizes $\Omega$ is one in which the occupancy, or population, of the energy levels decreases exponentially with increasing energy. We can identify the undetermined multiplier $\beta$ using the following argument connecting $\Omega$ with the entropy $S$. Consider two similar systems a and b in thermal contact, with entropies $S_a$ and $S_b$ and associated thermodynamic probabilities $\Omega_a$ and $\Omega_b$. Since entropy is an extensive variable, the total entropy of the composite system is

$$S = S_a + S_b$$

The thermodynamic probability of the composite system is the product of the individual probabilities,

$$\Omega = \Omega_a \Omega_b$$

Since our aim is to connect $\Omega$ with the entropy $S$, we seek a function $f$ such that $S = f(\Omega)$. Then we must have

$$f(\Omega_a \Omega_b) = f(\Omega_a) + f(\Omega_b)$$
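One standard way to see why only the logarithm works (a short argument, not spelled out in the original slides): differentiate the functional equation with respect to $\Omega_a$ at fixed $\Omega_b$,

$$\Omega_b\, f'(\Omega_a \Omega_b) = f'(\Omega_a)$$

then set $\Omega_a = 1$ and call the constant $f'(1) = k$:

$$f'(\Omega) = \frac{k}{\Omega} \quad\Longrightarrow\quad f(\Omega) = k \ln \Omega + c$$

Setting $\Omega_a = \Omega_b = 1$ in the functional equation gives $f(1) = 2f(1)$, so $f(1) = 0$ and the constant $c$ vanishes.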

The only function satisfying this is the logarithm, so we must have

$$S = k \ln \Omega$$

where $k$ is a constant. Now we can identify the quantity $\beta$. We start with condition C,

$$\delta \ln \Omega = -\sum_i \ln n_i\, \delta n_i = 0$$

and substitute for $\ln n_i$ from

$$\ln n_i = \ln \frac{N}{P} - \beta \varepsilon_i$$

Expanding and rearranging, the first term vanishes by equation B ($\sum_i \delta n_i = 0$), leaving

$$\delta \ln \Omega = \beta \sum_i \varepsilon_i\, \delta n_i = \beta\, \delta U$$

and solving for $\beta$,

$$\beta = \frac{\delta \ln \Omega}{\delta U}$$

But since $S = k \ln \Omega$, we can see that

$$\delta S = k\, \delta \ln \Omega = k \beta\, \delta U, \qquad \text{i.e.} \qquad \left( \frac{\partial S}{\partial U} \right)_V = k\beta$$

The constant-volume condition results from the fixed set of energy levels. From the combined 1st and 2nd Law, $dU = T\,dS - P\,dV$, we have

$$\left( \frac{\partial S}{\partial U} \right)_V = \frac{1}{T}$$

and finally

$$\beta = \frac{1}{kT}$$
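With $\beta = 1/kT$ in hand, the population ratio between two levels separated by one quantum $u$ follows directly from $n_i = (N/P)\,e^{-\varepsilon_i/kT}$. A minimal numerical illustration (the level spacing $u$ below is an arbitrary value chosen for demonstration, not from the lecture):

```python
from math import exp

k_B = 1.380649e-23   # Boltzmann constant, J/K
T = 300.0            # temperature, K
beta = 1.0 / (k_B * T)

u = 4.0e-21          # illustrative level spacing in J, comparable to k_B * T

# Ratio of populations of adjacent levels: n_{i+1} / n_i = exp(-u / (k_B T))
print(exp(-beta * u))   # ~0.38: higher levels are exponentially less populated
```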

Configurational and Thermal Entropy

For the mixing of red and blue spheres, the number of spatial configurations is

$$\Omega_{\text{conf}} = 1$$

for the unmixed state, and

$$\Omega_{\text{conf}} = \frac{(n_r + n_b)!}{n_r!\, n_b!}$$

for the mixed state of $n_r$ red and $n_b$ blue spheres. Then

$$\Delta S_{\text{conf}} = k \ln \frac{(n_r + n_b)!}{n_r!\, n_b!}$$
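As a sketch (the helper name `delta_S_conf` and the particle numbers are illustrative), this can be evaluated with `lgamma` to sidestep overflow in the factorials, and compared against the closed form obtained from Stirling's approximation, $\Delta S = -Nk\,[x\ln x + (1-x)\ln(1-x)]$, which reduces to $Nk\ln 2$ at $x = 1/2$:

```python
from math import lgamma, log

k_B = 1.380649e-23   # Boltzmann constant, J/K

def delta_S_conf(n_red, n_blue):
    """Delta S_conf = k * ln[(nr + nb)! / (nr! nb!)], via lgamma to avoid overflow."""
    ln_omega = (lgamma(n_red + n_blue + 1)
                - lgamma(n_red + 1) - lgamma(n_blue + 1))
    return k_B * ln_omega

n = 6.022e23 / 2                      # half a mole of red, half a mole of blue
print(delta_S_conf(n, n))             # ~5.76 J/K
N = 2 * n
print(N * k_B * log(2))               # ~5.76 J/K, matching the direct count
```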

The total entropy is given by

$$S = S_{\text{th}} + S_{\text{conf}} = k \ln \left( \Omega_{\text{th}}\, \Omega_{\text{conf}} \right)$$

When two closed systems are placed in thermal contact, the number of spatial configurations available is unity, so for heat flow down a temperature gradient only $\Omega_{\text{th}}$ changes. Similarly, for mixing of particles A and B, the only contribution to the entropy change will be $\Delta S_{\text{conf}}$ if the redistribution of particles does not cause any shift in the energy levels, i.e. if $\Delta U_{\text{mix}} = 0$. This is the case of ideal mixing, in which the total energy of the mixed system is identical to the sum of the energies of the individual systems; it occurs in nature only rarely.