Chapter 2 Statistical Thermodynamics

1- Introduction
- The object of statistical thermodynamics is to present a particle theory leading to an interpretation of the equilibrium properties of macroscopic systems.
- The foundation upon which the theory rests is quantum mechanics. A satisfactory theory can be developed using only the quantum-mechanical concepts of quantum states and energy levels.
- A thermodynamic system is regarded as an assembly of submicroscopic entities in an enormous number of ever-changing quantum states. We use the term assembly or system to denote a number N of identical entities, such as molecules, atoms, electrons, photons, or oscillators.

1- Introduction
- The macrostate of a system, or configuration, is specified by the number of particles in each of the energy levels of the system. N_j is the number of particles that occupy the j-th energy level. If there are n energy levels, then
$$N = \sum_{j=1}^{n} N_j$$
- A microstate is specified by the number of particles in each quantum state. In general, there will be more than one quantum state for each energy level, a situation called degeneracy.

1- Introduction
- In general, there are many different microstates corresponding to a given macrostate.
- The number of microstates leading to a given macrostate is called the thermodynamic probability. It is the number of ways in which a given macrostate can be achieved.
- The thermodynamic probability is an "unnormalized" probability: an integer between one and infinity, rather than a number between zero and one.
- For the k-th macrostate, the thermodynamic probability is denoted ω_k.

1- Introduction
- A true probability p_k can be obtained as
$$p_k = \frac{\omega_k}{\Omega}$$
where Ω is the total number of microstates available to the system, Ω = Σ_k ω_k.

2- Coin model example and the most probable distribution
- We assume that we have N = 4 coins that we toss on the floor and then examine to determine the number of heads N_1 and the number of tails N_2 = N − N_1.
- Each macrostate is defined by the number of heads and the number of tails.
- A microstate is specified by the state, heads or tails, of each coin.
- We are interested in the number of microstates for each macrostate (i.e., the thermodynamic probability).
- The coin-tossing model assumes that the coins are distinguishable.

Macrostate k   N_1  N_2   Microstates (Coin 1, 2, 3, 4)           ω_k   p_k
1              4    0     HHHH                                     1    1/16
2              3    1     HHHT, HHTH, HTHH, THHH                   4    4/16
3              2    2     HHTT, HTHT, HTTH, TTHH, THTH, THHT       6    6/16
4              1    3     HTTT, THTT, TTHT, TTTH                   4    4/16
5              0    4     TTTT                                     1    1/16
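These counts are easy to verify by brute force. The following Python sketch (not part of the original slides; all names are illustrative) enumerates the 2^4 = 16 microstates of four distinguishable coins and tallies ω_k and p_k for each macrostate:

```python
# Enumerate all 2^4 = 16 microstates of four distinguishable coins and
# tally how many fall into each macrostate (number of heads N1).
from itertools import product
from collections import Counter

N = 4
microstates = list(product("HT", repeat=N))          # all 16 ordered outcomes
omega = Counter(s.count("H") for s in microstates)   # N1 -> number of microstates

total = sum(omega.values())                          # Omega = 16
for n1 in sorted(omega, reverse=True):
    print(f"N1={n1}, N2={N - n1}: omega={omega[n1]}, p={omega[n1]}/{total}")
```

Running it reproduces the table: ω_k = 1, 4, 6, 4, 1 with Ω = 16.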

2- Coin model example and the most probable distribution
The average occupation number is
$$\bar{N}_j = \sum_k N_{jk}\, p_k$$
where N_jk is the occupation number of level j in the k-th macrostate. For our coin-tossing experiment, the average number of heads is therefore
$$\bar{N}_1 = \frac{(4)(1) + (3)(4) + (2)(6) + (1)(4) + (0)(1)}{16} = \frac{32}{16} = 2$$
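As a quick check, the same weighted average computed from the ω_k values of the table (again an illustrative sketch, not from the slides):

```python
# Average number of heads as a probability-weighted sum over macrostates.
omega = {4: 1, 3: 4, 2: 6, 1: 4, 0: 1}   # N1 -> omega_k from the table
Omega = sum(omega.values())               # total microstates = 16
avg_N1 = sum(n1 * w for n1, w in omega.items()) / Omega
print(avg_N1)                             # 2.0, i.e. N/2 as expected
```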

2- Coin model example and the most probable distribution
Suppose we want to perform the coin-tossing experiment with a larger number of coins. We assume that we have N distinguishable coins. Question: how many ways are there to select, from the N candidates, N_1 heads and N − N_1 tails?
Figure. Thermodynamic probability ω_k versus the number of heads N_1 for a coin-tossing experiment with 4 coins.

2- Coin model example and the most probable distribution
The answer is given by the binomial coefficient
$$\omega(N_1) = \binom{N}{N_1} = \frac{N!}{N_1!\,(N - N_1)!} \qquad (1)$$
Figure. Thermodynamic probability ω_k versus the number of heads N_1 for a coin-tossing experiment with 8 coins. The peak has become considerably sharper.

2- Coin model example and the most probable distribution
What is the maximum value ω_max of the thermodynamic probability for N = 8 and for N = 1000? The peak occurs at N_1 = N/2. Thus, Equation (1) gives
$$\omega_{\max} = \frac{N!}{\left[(N/2)!\right]^2}$$
For N = 8 this gives ω_max = 8!/(4!)^2 = 70. For such large numbers as N = 1000 we can use Stirling's approximation:
$$\ln n! \approx n \ln n - n$$

2- Coin model example and the most probable distribution
For N = 1000, Stirling's approximation gives
$$\ln \omega_{\max} = \ln N! - 2 \ln\!\left(\tfrac{N}{2}\right)! \approx N \ln N - N - 2\left[\tfrac{N}{2}\ln\tfrac{N}{2} - \tfrac{N}{2}\right] = N \ln 2 \approx 693$$
so that ω_max ≈ 2^1000 ≈ 10^301: an astronomically large number.
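The following sketch compares the exact value of ln ω_max with the Stirling estimate N ln 2 for both values of N. It assumes only the Python standard library, where math.lgamma(n + 1) returns ln(n!) without overflow:

```python
# Compare exact ln(omega_max) with the Stirling estimate N ln 2.
import math

for N in (8, 1000):
    exact = math.lgamma(N + 1) - 2 * math.lgamma(N // 2 + 1)  # ln[N!/((N/2)!)^2]
    stirling = N * math.log(2)
    print(N, exact, stirling)
# N = 8:    exact ~ 4.25 (omega_max = 70), N ln 2 ~ 5.55 (crude for small N)
# N = 1000: exact ~ 689.5,                 N ln 2 ~ 693.1 (within about 0.5%)
```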

2- Coin model example and the most probable distribution
Figure. Thermodynamic probability versus the number of heads for a coin-tossing experiment with 1000 coins.
The most probable distribution is that of total randomness: the most probable distribution is the macrostate with the maximum number of microstates. The "ordered" regions (nearly all heads or nearly all tails) almost never occur; their ω is extremely small compared with ω_max.

2- Coin model example and the most probable distribution
For N very large, the thermodynamic probability is sharply peaked about N_1 = N/2.
Generalization of Equation (1). Question: how many ways can N distinguishable objects be arranged if they are divided into n groups, with N_1 objects in the first group, N_2 in the second, etc.? Answer:
$$\omega = \frac{N!}{N_1!\,N_2!\cdots N_n!} = \frac{N!}{\prod_{j=1}^{n} N_j!} \qquad (2)$$
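As a sketch of Equation (2), here is a small multinomial-coefficient helper (hypothetical code, not from the slides), checked against the binomial cases above:

```python
# Multinomial coefficient: the number of ways to split N distinguishable
# objects into groups of sizes N1, N2, ..., Nn.
from math import factorial

def multinomial(*groups):
    """N! / (N1! * N2! * ... * Nn!) for group sizes that sum to N."""
    N = sum(groups)
    result = factorial(N)
    for n in groups:
        result //= factorial(n)
    return result

print(multinomial(4, 0))        # 1 -> the all-heads macrostate (binomial case)
print(multinomial(2, 2))        # 6 -> the N1 = N2 = 2 macrostate
print(multinomial(2, 0, 0, 1))  # 3 -> used in the 3-particle example below
```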

3- System of distinguishable particles
The constituents of the system under study (a gas, liquid, or solid) are considered to be:
- a fixed number N of distinguishable particles
- occupying a fixed volume V.
We seek the distribution (N_1, N_2, …, N_j, …, N_n) among the energy levels (ε_1, ε_2, …, ε_j, …, ε_n) for an equilibrium state of the system.

3- System of distinguishable particles
We limit ourselves to isolated systems that do not exchange energy in any form with the surroundings. This implies that the internal energy U is also fixed.
Example: consider three particles, labeled A, B, and C, distributed among four energy levels, 0, ε, 2ε, and 3ε, such that the total energy is U = 3ε.
a) Tabulate the 3 possible macrostates of the system.
b) Calculate ω_k (the number of microstates) and p_k (the true probability) for each of the 3 macrostates.
c) What is the total number of available microstates, Ω, for the system?

3- System of distinguishable particles

Macrostate k   (N_0, N_1, N_2, N_3)   Microstates (energies of A, B, C)             ω_k   p_k
1              (2, 0, 0, 1)           (3ε,0,0), (0,3ε,0), (0,0,3ε)                   3    0.3
2              (1, 1, 1, 0)           (0,ε,2ε), (0,2ε,ε), (ε,0,2ε),                  6    0.6
                                      (ε,2ε,0), (2ε,0,ε), (2ε,ε,0)
3              (0, 3, 0, 0)           (ε,ε,ε)                                        1    0.1

The total number of available microstates is Ω = 3 + 6 + 1 = 10.
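The table can be confirmed by enumerating microstates directly. A minimal sketch (illustrative, with energies measured in units of ε):

```python
# Brute-force check of the 3-particle example: each distinguishable particle
# (A, B, C) occupies one of the levels 0, 1, 2, 3 (in units of epsilon);
# keep only microstates with total energy U = 3.
from itertools import product
from collections import Counter

levels = (0, 1, 2, 3)                 # energies in units of epsilon
states = [s for s in product(levels, repeat=3) if sum(s) == 3]

Omega = len(states)                   # total microstates = 10
occupation = Counter(tuple(s.count(e) for e in levels) for s in states)
for macro, omega_k in occupation.items():
    print(f"(N0,N1,N2,N3)={macro}: omega_k={omega_k}, p_k={omega_k / Omega}")
```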

3- System of distinguishable particles
- The most "disordered" macrostate is the state of highest probability.
- This state is sharply defined and is the observed equilibrium state of the system (for a very large number of particles).

4- Thermodynamic probability and Entropy
In classical thermodynamics: as a system proceeds toward a state of equilibrium, the entropy increases, and at equilibrium the entropy attains its maximum value.
In statistical thermodynamics (our statistical model): the system tends to change spontaneously from states with low thermodynamic probability to states with high thermodynamic probability (a large number of microstates).
It was Boltzmann who made the connection between the classical concept of entropy and the thermodynamic probability. Since S and Ω are both properties of the state of the system (state variables), we assume they are related by some function:
$$S = f(\Omega)$$

4- Thermodynamic probability and Entropy
Consider two subsystems, A and B.
Entropy is an extensive property: it is doubled when the mass or number of particles is doubled. Consequently, the combined entropy of the two subsystems is simply the sum of the entropies of each subsystem:
$$S_{total} = S_A + S_B \qquad (3)$$

4- Thermodynamic probability and Entropy
Any one subsystem configuration can be combined with any configuration of the other to give a configuration of the total system. That is,
$$\Omega_{total} = \Omega_A\, \Omega_B \qquad (4)$$
Example of the coin-tossing experiment: suppose that the two subsystems each consist of two distinguishable coins.

Macrostate (N_1, N_2)   Subsystem A (Coin 1, 2)   Subsystem B (Coin 1, 2)   ω_kA = ω_kB   p_kA = p_kB
(2, 0)                  HH                        HH                        1             1/4
(1, 1)                  HT, TH                    HT, TH                    2             2/4
(0, 2)                  TT                        TT                        1             1/4
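A quick enumeration check of Equation (4) for this example (an illustrative sketch, not from the slides):

```python
# Verify Omega_total = Omega_A * Omega_B for two subsystems of two
# distinguishable coins each, by direct enumeration of the 4-coin system.
from itertools import product

coins_A = list(product("HT", repeat=2))    # 4 microstates of subsystem A
coins_B = list(product("HT", repeat=2))    # 4 microstates of subsystem B

Omega_A, Omega_B = len(coins_A), len(coins_B)
Omega_total = len(list(product("HT", repeat=4)))
print(Omega_total == Omega_A * Omega_B)    # True: 16 = 4 * 4
```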

4- Thermodynamic probability and Entropy
Each of the Ω_A = 4 microstates of subsystem A can be combined with each of the Ω_B = 4 microstates of subsystem B, giving all Ω_total = 16 microstates of the four-coin system. Thus Equation (4) holds, and therefore
$$S_{total} = f(\Omega_{total}) = f(\Omega_A\, \Omega_B) \qquad (5)$$
Combining Equations (3) and (5), we obtain
$$f(\Omega_A\, \Omega_B) = f(\Omega_A) + f(\Omega_B)$$
The only function for which this statement is true is the logarithm. Therefore
$$S = k \ln \Omega$$
where k is a constant with the units of entropy. It is, in fact, Boltzmann's constant:
$$k = 1.38 \times 10^{-23}\ \mathrm{J\,K^{-1}}$$
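A minimal sketch of the Boltzmann relation, using the SI value of k; the check at the end confirms that the logarithm converts the product of thermodynamic probabilities into a sum of entropies, as the additivity of entropy requires:

```python
# S = k ln(Omega), and a numerical check that
# f(Omega_A * Omega_B) = f(Omega_A) + f(Omega_B) for f = k ln.
import math

k_B = 1.380649e-23          # Boltzmann's constant, J/K

def entropy(Omega):
    return k_B * math.log(Omega)

Omega_A, Omega_B = 4, 4     # the two 2-coin subsystems above
S_total = entropy(Omega_A * Omega_B)
print(math.isclose(S_total, entropy(Omega_A) + entropy(Omega_B)))  # True
```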

5- Quantum states and energy levels