The microcanonical ensemble: finding the probability distribution

We consider an isolated system, in the sense that the energy is a constant of motion. The probability distribution over microstates is not something we are able to derive from first principles. There are two typical alternative approaches:

1. Postulate of equal a priori probability: construct an entropy expression from it and show that the result is consistent with thermodynamics.
2. Use (information) entropy as the starting concept: derive the distribution from the maximum entropy principle.

Information entropy*

(*Introduced by Shannon; see the textbook and E. T. Jaynes, Phys. Rev. 106, 620 (1957).)

Idea: find a constructive, least-biased criterion for setting up probability distributions on the basis of only partial knowledge (missing information). What do we mean by that? Let's start with an old mathematical problem.

Consider a quantity $x$ with discrete random values $x_1, x_2, \dots, x_n$ and assume we do not know the probabilities $p_1, p_2, \dots, p_n$. We have some information, however, namely the normalization

$$\sum_{i=1}^{n} p_i = 1,$$

and we know the average of a function $f(x)$,

$$\langle f(x) \rangle = \sum_{i=1}^{n} p_i\, f(x_i)$$

(we will also consider cases where we know more than one such average). With this information, can we calculate the average of another function $g(x)$, $\langle g(x)\rangle = \sum_i p_i\, g(x_i)$? To do so we would need all the $p_i$, but we have only the 2 equations above and are lacking $(n-2)$ additional equations.
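To make the underdetermination concrete, here is a minimal numerical sketch (my own illustration, not from the lecture): for $n = 3$ values $x = (1, 2, 3)$ with the assumed known average $\langle x \rangle = 2$, the two constraints leave a one-parameter family of distributions $\rho = (t,\, 1-2t,\, t)$, each predicting a different $\langle g(x) \rangle = \langle x^2 \rangle$.

```python
import numpy as np

x = np.array([1.0, 2.0, 3.0])     # hypothetical values of the random quantity x
# Normalization and <x> = 2 are two equations for three unknowns; solving them
# leaves the one-parameter family rho = (t, 1 - 2t, t) with 0 < t < 1/2.
for t in [0.10, 0.25, 0.40]:
    rho = np.array([t, 1.0 - 2.0 * t, t])
    assert np.isclose(rho.sum(), 1.0) and np.isclose(rho @ x, 2.0)
    print(f"t = {t:.2f}:  <x^2> = {rho @ x**2:.2f}")  # 4.20, 4.50, 4.80 -- not determined
```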

Shannon defined the information entropy $S_i$ as

$$S_i = -k \sum_n \rho_n \ln \rho_n,$$

or, for a continuous distribution $\rho(x)$ with normalization $\int \rho(x)\,dx = 1$,

$$S_i = -k \int \rho(x) \ln \rho(x)\, dx$$

(we call the probabilities now $\rho_n$ rather than $p_i$). We make plausible that:

- $S_i$ is a measure of our ignorance of the microstate of the system; more quantitatively,
- $S_i$ is a measure of the width of the distribution of the $\rho_n$.

What can we do with our underdetermined problem? There may be more than one probability distribution reproducing the known average $\langle f(x) \rangle$. We want the one which requires no further assumptions: we do not want to prefer any $\rho_n$ if there is no reason to do so. Information theory tells us how to find this unbiased distribution.
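As a small executable sketch of the definition just given (the function name and the choice $k = 1$ are mine):

```python
import numpy as np

def info_entropy(rho, k=1.0):
    """Information entropy S_i = -k * sum_n rho_n ln rho_n.
    Outcomes with rho_n = 0 contribute nothing (x ln x -> 0 as x -> 0)."""
    rho = np.asarray(rho, dtype=float)
    nz = rho[rho > 0.0]
    return -k * np.sum(nz * np.log(nz))
```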

Let's consider an extreme case: an experiment with $N$ potential outcomes (such as rolling dice), where outcome 1 has probability $\rho_1 = 1$ and outcomes $2, 3, \dots, N$ have $\rho_n = 0$. Our ignorance regarding the outcome of the experiment is zero: we know precisely what will happen, and the probability distribution is sharp (a delta peak). Indeed $S_i = -k \cdot 1 \ln 1 = 0$.

Let's make the distribution broader: outcome 1 has probability $\rho_1 = 1/2$, outcome 2 has probability $\rho_2 = 1/2$, and outcomes $3, 4, \dots, N$ have $\rho_n = 0$. Now $S_i = k \ln 2 > 0$.

Let's consider the more general case: outcomes $1, 2, \dots, M$ each have probability $\rho_n = 1/M$, and outcomes $M+1, \dots, N$ have $\rho_n = 0$. Then $S_i = k \ln M$, which grows with the width $M$ of the distribution.
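Using the info_entropy sketch from above, these three cases reproduce exactly the values quoted:

```python
print(info_entropy([1.0, 0.0, 0.0, 0.0]))   # sharp distribution: 0.0
print(info_entropy([0.5, 0.5, 0.0, 0.0]))   # two equal outcomes: ln 2 = 0.693...
for M in (2, 4, 8):                         # M equally likely outcomes: ln M
    print(M, info_entropy([1.0 / M] * M))
```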

So far our considerations suggest: ignorance/information entropy increases with increasing width of the distribution. Which distribution brings the information entropy to a maximum? For simplicity, let's start with an experiment with 2 outcomes: a binary distribution with probabilities $\rho_1, \rho_2$ and $\rho_1 + \rho_2 = 1$. The maximum lies at $\rho_1 = \rho_2 = 1/2$: the uniform distribution.
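Spelling out the one-variable maximization (a standard calculation, using $\rho_2 = 1 - \rho_1$ in the definition of $S_i$):

```latex
S_i(\rho_1) = -k\left[\rho_1 \ln \rho_1 + (1-\rho_1)\ln(1-\rho_1)\right],
\qquad
\frac{dS_i}{d\rho_1} = -k \ln\frac{\rho_1}{1-\rho_1} = 0
\;\Longrightarrow\; \rho_1 = \rho_2 = \tfrac{1}{2},
\quad S_i^{\max} = k \ln 2 .
```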

Lagrange multipliers

The Lagrange multiplier technique finds an extremum of $f(x,y)$ under the constraint $g(x,y) = c$: at the extremum the gradients are parallel,

$$\nabla f = \lambda\, \nabla g,$$

with the multiplier $\lambda$ fixed afterwards from the constraint. Let's use the Lagrange multiplier technique to find the distribution that maximizes

$$S_i = -k \sum_n \rho_n \ln \rho_n$$

under the constraint $\sum_n \rho_n = 1$. From

$$\frac{\partial}{\partial \rho_n}\left[-k \sum_m \rho_m \ln \rho_m - \lambda \left(\sum_m \rho_m - 1\right)\right] = 0$$

we get $-k(\ln \rho_n + 1) - \lambda = 0$, so every $\rho_n$ equals the same constant $e^{-1 - \lambda/k}$. The constraint then fixes $\rho_n = 1/N$: once again, the maximum is the uniform distribution.
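A numerical cross-check of this result (my own sketch; it assumes SciPy is available): maximize $S_i$ directly under the normalization constraint and confirm that the optimizer lands on the uniform distribution.

```python
import numpy as np
from scipy.optimize import minimize

n = 5
neg_S = lambda rho: float(np.sum(rho * np.log(rho)))        # minimize -S_i (k = 1)
norm = {"type": "eq", "fun": lambda rho: rho.sum() - 1.0}   # constraint: sum(rho) = 1
res = minimize(neg_S, x0=np.random.dirichlet(np.ones(n)),
               bounds=[(1e-9, 1.0)] * n, constraints=[norm])
print(res.x)   # every component approaches 1/n = 0.2
```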

So the uniform distribution maximizes the entropy.

Distribution function: in a microcanonical ensemble, where each system has $N$ particles, volume $V$, and energy fixed between $E$ and $E + \delta E$, the entropy is at a maximum in equilibrium, so

$$\rho_n = \begin{cases} 1/Z(E) & \text{for } E \le E_n \le E + \delta E \\ 0 & \text{otherwise,} \end{cases}$$

where $Z(E)$ = # of microstates with energy in $[E, E + \delta E]$, called the partition function of the microcanonical ensemble. When identifying information entropy with thermodynamic entropy,

$$S = -k_B \sum_{n=1}^{Z(E)} \frac{1}{Z(E)} \ln \frac{1}{Z(E)} = k_B \ln Z(E).$$
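A toy microcanonical count (a hypothetical model of my own, not from the lecture): $N$ two-level particles with level spacing $\varepsilon$; fixing the total energy $E = n\varepsilon$ leaves $Z(E) = \binom{N}{n}$ equally likely microstates.

```python
from math import comb, log

N, n = 100, 30              # particles, number of excited particles
Z = comb(N, n)              # number of microstates in the "energy shell"
print(f"Z(E) = {Z:.3e},  S/k_B = ln Z = {log(Z):.2f}")
```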

Information entropy and thermodynamic entropy: $S_i$ has all the properties we expect from the thermodynamic entropy when identifying $k = k_B$ (for details see the textbook). We show here that $S$ is additive. Let $\rho_n^{(1)}$ be the probability distribution for system 1 and $\rho_m^{(2)}$ the probability distribution for system 2. Statistical independence of systems 1 and 2 means that the probability of finding system 1 in state $n$ and system 2 in state $m$ factorizes, $\rho_{nm} = \rho_n^{(1)} \rho_m^{(2)}$. Then, using the normalization of each distribution,

$$S = -k_B \sum_{n,m} \rho_{nm} \ln \rho_{nm} = -k_B \sum_{n,m} \rho_n^{(1)} \rho_m^{(2)} \left[\ln \rho_n^{(1)} + \ln \rho_m^{(2)}\right] = S_1 + S_2.$$
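A quick numerical check of the additivity (my own sketch, with two arbitrary made-up distributions):

```python
import numpy as np

rho1 = np.array([0.5, 0.3, 0.2])             # distribution for system 1 (made up)
rho2 = np.array([0.7, 0.2, 0.1])             # distribution for system 2 (made up)
rho = np.outer(rho1, rho2)                   # rho_nm = rho1_n * rho2_m (independence)
S = lambda p: -np.sum(p * np.log(p))
print(S(rho), S(rho1) + S(rho2))             # identical up to rounding
```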

Relation between entropy and the partition function: $S = k_B \ln Z(E)$.

Derivation of thermodynamics in the microcanonical ensemble: where is the temperature? In the microcanonical ensemble the energy $E$ is fixed, and $S = S(E, V, N)$. Comparing $dS$ with the thermodynamic relation $dE = T\,dS - P\,dV + \mu\,dN$ gives

$$\frac{1}{T} = \left(\frac{\partial S}{\partial E}\right)_{V,N} \quad\text{and}\quad \frac{P}{T} = \left(\frac{\partial S}{\partial V}\right)_{E,N}.$$
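As an illustration of where the temperature comes from (my own sketch, reusing the hypothetical two-level model from above): estimate $1/(k_B T) = d(\ln Z)/dE$ by a finite difference and compare with the analytic Stirling result.

```python
from math import comb, log

N, eps = 100, 1.0                                 # particles, level spacing
S_over_kB = lambda n: log(comb(N, n))             # S/k_B at energy E = n*eps
n = 30
beta_numeric = (S_over_kB(n + 1) - S_over_kB(n - 1)) / (2 * eps)  # dS/dE
beta_exact = log((N - n) / n) / eps               # from Stirling's approximation
print(beta_numeric, beta_exact)                   # 0.838... vs 0.847...
```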

Let's derive the ideal gas equation of state from the microcanonical ensemble, despite the fact that there are easier ways to do so. The major task: find $Z(U)$ := # of states in the energy shell $[U, U + \delta U]$ of phase space $(q, p)$. For a gas of $N$ non-interacting particles we have

$$Z(U) = \frac{1}{N!}\,\frac{1}{h^{3N}} \int_{U \le H(q,p) \le U + \delta U} d^{3N}q\; d^{3N}p, \qquad H = \sum_{i=1}^{N} \frac{\mathbf{p}_i^2}{2m}.$$

The factor $1/N!$ is the correct Boltzmann counting; it requires the quantum-mechanical origin of the indistinguishability of atoms (we derive it when discussing the classical limit of the quantum gas), and it solves the Gibbs paradox. The factor $1/h^{3N}$ is another leftover from quantum mechanics, phase-space quantization, which makes $Z$ a dimensionless number.

Remember: the momentum integral runs over a shell of a $3N$-dimensional sphere in momentum space, since $\sum_i \mathbf{p}_i^2 \le 2mU$ defines a sphere of radius $R = \sqrt{2mU}$. The volume of a $d$-dimensional sphere of radius $R$ is

$$V_d(R) = \frac{\pi^{d/2}}{\Gamma(d/2 + 1)}\, R^d,$$

while the position integral simply gives $V^N$.
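Putting the pieces together (a standard evaluation consistent with the formulas above): in high dimension essentially all of the sphere volume sits in the thin outer shell, so up to terms that are negligible in $\ln Z$ per particle,

```latex
Z(U) \;\approx\; \frac{V^N}{N!\, h^{3N}}\;
\frac{\pi^{3N/2}}{\Gamma\!\left(\frac{3N}{2}+1\right)}\,(2mU)^{3N/2}.
```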

In the thermodynamic limit ($N \to \infty$ at fixed $U/N$ and $V/N$), the logarithm of the shell volume equals the logarithm of the full sphere volume up to corrections that vanish per particle (the ratio of the two volumes contributes only $\ln 1 = 0$ per particle). With Stirling's formula, $\ln N! \approx N \ln N - N$, we obtain the Sackur-Tetrode result

$$S(U, V, N) = N k_B \left[\ln\!\left(\frac{V}{N}\left(\frac{4\pi m U}{3 N h^2}\right)^{3/2}\right) + \frac{5}{2}\right],$$

from which

$$\frac{1}{T} = \left(\frac{\partial S}{\partial U}\right)_{V,N} = \frac{3 N k_B}{2U} \;\Rightarrow\; U = \frac{3}{2} N k_B T, \qquad \frac{P}{T} = \left(\frac{\partial S}{\partial V}\right)_{U,N} = \frac{N k_B}{V} \;\Rightarrow\; PV = N k_B T.$$
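A closing numerical sanity check (my own sketch): evaluate the Sackur-Tetrode entropy for one mole of argon at 1 atm and 273.15 K, and verify by finite differences that $(\partial S/\partial U)_{V,N} = 1/T$.

```python
import numpy as np

kB, h, NA = 1.380649e-23, 6.62607015e-34, 6.02214076e23
m = 39.948e-3 / NA                     # mass of an argon atom [kg]
N, T = NA, 273.15                      # one mole at 0 degrees C
V = N * kB * T / 101325.0              # molar volume at 1 atm [m^3]

def S(U):                              # Sackur-Tetrode entropy [J/K]
    return N * kB * (np.log(V / N * (4 * np.pi * m * U / (3 * N * h**2))**1.5) + 2.5)

U = 1.5 * N * kB * T                   # equipartition energy at temperature T
dU = 1e-6 * U
print(S(U) / (N * kB))                             # ~18.4 k_B per atom
print((S(U + dU) - S(U - dU)) / (2 * dU), 1 / T)   # both ~3.66e-3 K^-1
```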