Lecture 3. Combinatorics, Probability and Multiplicity (Ch. 2)

Lecture 3. Combinatorics, Probability and Multiplicity (Ch. 2)

Outline:
- Combinatorics and probability
- The two-state paramagnet and the Einstein solid
- Multiplicity of a macrostate
- Concept of entropy (next lecture)
- Directionality of thermal processes (irreversibility): "overwhelmingly probable"

The past three lectures: we have learned about thermal energy, how it is stored at the microscopic level, and how it can be transferred from one system to another. However, the energy conservation law (the first law of thermodynamics) tells us nothing about the directionality of processes and cannot explain why so many macroscopic processes are irreversible. Indeed, according to the first law, all processes that conserve energy are legitimate, and the time-reversed process would also conserve energy. Thus, we still cannot answer the basic question of thermodynamics: why does energy spontaneously flow from a hot object to a cold object and never the other way around? In other words, why does a time arrow exist for macroscopic processes? For the next three lectures, we will address this central problem using the ideas of statistical mechanics. Statistical mechanics is a bridge from microscopic states to averages. In brief, the answer will be: irreversible processes are not inevitable, they are just overwhelmingly probable. This path will bring us to the concept of entropy and the second law of thermodynamics.

Combinatorics and probability

Combinatorics is the branch of mathematics studying the enumeration, combination, and permutation of sets of elements and the mathematical relations that characterize their properties. Examples: random walks, two-state systems, ...

Probability is the branch of mathematics that studies the possible outcomes of given events together with the outcomes' relative likelihoods and distributions. In common usage, the word "probability" means the chance that a particular event (or set of events) will occur. (See also Math 104 - Elementary Combinatorics and Probability.)

Probability

An event (very loosely defined) is any possible outcome of some measurement. An event is a statistical (random) quantity if the probability of its occurrence, P, in the process of measurement is < 1.

The "sum" of two events: in the process of measurement, we observe either one of the events. Addition rule for mutually exclusive events: P(i or j) = P(i) + P(j) (mutually exclusive events: the two events cannot occur in the same measurement).

The "product" of two events: in the process of measurement, we observe both events. Multiplication rule for independent events: P(i and j) = P(i) × P(j) (independent events: the occurrence of one event does not change the probability of the other).

Example: What is the probability of the same face appearing on two successive throws of a die? The probability of any specific combination, e.g. (1,1), is 1/6 × 1/6 = 1/36 (multiplication rule). Hence, by the addition rule, P(same face) = P(1,1) + P(2,2) + ... + P(6,6) = 6 × 1/36 = 1/6.

Expectation value of a macroscopic observable A, averaged over all accessible microstates i: ⟨A⟩ = Σᵢ Pᵢ Aᵢ, where Pᵢ is the probability of microstate i and Aᵢ is the value of A in that microstate.
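
As a quick sanity check of these rules, here is a short illustrative Python sketch (not part of the original lecture) that enumerates all 36 equally likely outcomes of two throws of a fair die:

```python
# Brute-force check of the multiplication and addition rules for the dice example.
from itertools import product

outcomes = list(product(range(1, 7), repeat=2))    # all (first, second) pairs: 36 of them
same_face = [o for o in outcomes if o[0] == o[1]]  # (1,1), (2,2), ..., (6,6)

p_specific = 1 / len(outcomes)            # P(1,1) = 1/36 (multiplication rule)
p_same = len(same_face) / len(outcomes)   # 6 * 1/36 = 1/6 (addition rule)

print(p_specific)   # 0.02777... = 1/36
print(p_same)       # 0.16666... = 1/6
```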

Two model systems with fixed positions of particles and discrete energy levels. The models are attractive because they can be described in terms of discrete microstates which can be easily counted (for a continuum of microstates, as in the example of a freely moving particle, we still need to learn how to do this). This simplifies the calculation of the multiplicity Ω. On the other hand, the results will be applicable to many other, more complicated models. Despite the simplicity of the models, they describe a number of experimental systems in a surprisingly precise manner.

- the two-state paramagnet ("limited" energy spectrum)
- the Einstein model of a solid ("unlimited" energy spectrum)

The Two-State Paramagnet

A system of non-interacting magnetic dipoles in an external magnetic field B; each dipole can have only two possible orientations along the field, either parallel or anti-parallel to this axis (e.g., a particle with spin ½). There are no "quadratic" degrees of freedom (unlike in an ideal gas, where the kinetic energies of molecules are unlimited); the energy spectrum of the particles is confined within a finite interval of E (just two allowed energy levels).

A particular microstate (↑↓↓↑↑...) is specified if the directions of all spins are specified. A macrostate is specified by the total number of dipoles that point "up", N↑ (the number of dipoles that point "down" is then N↓ = N − N↑).

Let μ be the magnetic moment of an individual dipole (spin). The energy of a single dipole in the external magnetic field is −μB for μ parallel to B ("up", E₁ = −μB) and +μB for μ anti-parallel to B ("down", E₂ = +μB); placing zero energy midway between the two levels is an arbitrary choice.

The total magnetic moment (a macroscopic observable): M = μ(N↑ − N↓) = μ(2N↑ − N).

The energy of a macrostate: U = μB(N↓ − N↑) = −MB.
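
For concreteness, here is a minimal Python sketch (hypothetical helper functions, not from the slides) that evaluates these two macroscopic observables for a macrostate (N, N↑):

```python
# Macroscopic observables of a two-state paramagnet macrostate, following the
# sign convention above: a dipole parallel to B ("up") has energy -mu*B,
# an anti-parallel one ("down") has energy +mu*B.

def magnetization(N, N_up, mu=1.0):
    """Total magnetic moment M = mu * (N_up - N_down)."""
    N_down = N - N_up
    return mu * (N_up - N_down)

def energy(N, N_up, mu=1.0, B=1.0):
    """Macrostate energy U = mu*B*(N_down - N_up) = -M*B."""
    N_down = N - N_up
    return mu * B * (N_down - N_up)

# Example: 10 dipoles, 7 of them "up" (mu = B = 1, i.e. M in units of mu, U in units of mu*B)
print(magnetization(10, 7))   # 4.0
print(energy(10, 7))          # -4.0
```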

Example

Consider two spins. There are four possible microstates: ↑↑, ↑↓, ↓↑, ↓↓, with total moments M = 2μ, 0, 0, −2μ. In zero field, all these microstates have the same energy (degeneracy). Note that the two microstates with M = 0 have the same energy even when B ≠ 0: they belong to the same macrostate, which has multiplicity Ω = 2. The macrostates can be classified by their moment M and multiplicity Ω:

M = 2μ, 0, −2μ with Ω = 1, 2, 1

For three spins, the eight microstates are ↑↑↑; ↑↑↓, ↑↓↑, ↓↑↑; ↓↓↑, ↓↑↓, ↑↓↓; ↓↓↓, and the macrostates are:

M = 3μ, μ, −μ, −3μ with Ω = 1, 3, 3, 1
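
The same bookkeeping can be automated. The sketch below (illustrative only, not from the lecture) enumerates all 2^N microstates of N spins and groups them into macrostates labelled by the total moment M (in units of μ), reproducing the multiplicities above:

```python
# Enumerate microstates of N two-state spins and count how many fall into each
# macrostate, i.e. each value of the total moment M (in units of mu).
from itertools import product
from collections import Counter

def macrostate_multiplicities(N):
    counts = Counter()
    for spins in product((+1, -1), repeat=N):   # +1 = "up", -1 = "down"
        M = sum(spins)
        counts[M] += 1
    return dict(sorted(counts.items(), reverse=True))

print(macrostate_multiplicities(2))   # {2: 1, 0: 2, -2: 1}
print(macrostate_multiplicities(3))   # {3: 1, 1: 3, -1: 3, -3: 1}
```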

The Multiplicity of the Two-State Paramagnet

Each of the microstates is characterized by N numbers; the number of equally probable microstates is 2^N, and the probability to be in a particular microstate is 1/2^N. For a two-state paramagnet in zero field, the energy of all macrostates is the same (0). A macrostate is specified by (N, N↑). Its multiplicity is the number of ways of choosing N↑ objects out of N:

Ω(N, N↑) = N! / (N↑! (N − N↑)!)

where n! ("n factorial") = 1·2·...·n and 0! ≡ 1 (there is exactly one way to arrange zero objects).
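
This binomial coefficient is available directly in Python's standard library; a quick check of the formula against the spin examples above (a sketch, not part of the lecture):

```python
# Multiplicity of the macrostate (N, N_up) of a two-state paramagnet.
from math import comb

def omega(N, N_up):
    return comb(N, N_up)    # N! / (N_up! * (N - N_up)!)

print(omega(2, 1))      # 2  -> the two-spin macrostate with M = 0
print(omega(3, 2))      # 3  -> the three-spin macrostate with M = mu
print(omega(100, 50))   # ~1.0e29, the largest multiplicity for N = 100
```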

Stirling's Approximation for N! (N >> 1)

The multiplicity depends on N!, and we need an approximation for ln(N!):

ln N! ≈ N ln N − N,   i.e.   N! ≈ N^N e^(−N)

More accurately:

N! ≈ N^N e^(−N) √(2πN),   i.e.   ln N! ≈ N ln N − N + ½ ln(2πN)

Check: the extra term ½ ln(2πN) can be dropped from ln N! because ln N << N for large N.
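
A quick numerical check of these two forms (assuming the standard Stirling expressions written above); lgamma(N + 1) provides the exact ln(N!):

```python
# Compare exact ln(N!) with the two Stirling approximations.
from math import lgamma, log, pi

def ln_factorial(N):
    return lgamma(N + 1)    # exact ln(N!)

for N in (10, 100, 1000):
    exact = ln_factorial(N)
    crude = N * log(N) - N
    improved = N * log(N) - N + 0.5 * log(2 * pi * N)
    print(N, exact, crude, improved)

# For N = 1000: exact ~ 5912.128, crude ~ 5907.755, improved ~ 5912.128
```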

The Probability of Macrostates of a Two-State Paramagnet (B = 0)

P(N↑) = Ω(N, N↑) / (total number of microstates) = Ω(N, N↑) / 2^N

As the system becomes larger, the P(N↑) graph becomes more sharply peaked around N↑ = N/2. For N = 1, Ω = 1 for each macrostate, 2^N = 2, and P = 0.5; already for N = 15 the peak is visible, and for N ~ 10^23 the distribution is extremely narrow around N↑ = 0.5·10^23. [Plots of P(N↑) for N = 1, 15, and 10^23 illustrate the sharpening peak.] A nearly random orientation of the spins in B = 0 is overwhelmingly more probable: this is the seed of the 2nd law! (Binomial-histogram applet: http://stat-www.berkeley.edu/~stark/Java/Html/BinHist.htm)
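
The sharpening of the peak can be quantified directly. The sketch below (illustrative; the ±1% window is an arbitrary choice) computes the probability that N↑ lies within 1% of N/2:

```python
# P(|N_up - N/2| <= width * N/2) for a two-state paramagnet in zero field,
# where every one of the 2**N microstates is equally likely.
from math import comb

def prob_near_half(N, width=0.01):
    lo = int(N / 2 * (1 - width))
    hi = int(N / 2 * (1 + width))
    return sum(comb(N, k) for k in range(lo, hi + 1)) / 2**N

for N in (100, 1_000, 10_000):
    print(N, prob_near_half(N))
# ~0.16, ~0.27, ~0.69: the probability of an almost exactly "random" macrostate
# grows toward 1 as N increases.
```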

Multiplicity (Entropy) and Disorder

In general, we can say that small multiplicity implies "order", while large multiplicity implies "disorder". An arrangement with large Ω can be achieved by a random process with much greater probability than an arrangement with small Ω. [Illustration: an ordered arrangement (small Ω) next to a disordered one (large Ω).]

The Einstein Model of a Solid

In 1907, Einstein proposed a model that reasonably predicted the thermal behavior of crystalline solids (a 3D bed-spring model): a crystalline solid containing N atoms behaves as if it contained 3N identical independent quantum harmonic oscillators, each of which can store an integer number nᵢ of energy units ε = ħω. We can treat a 3D harmonic oscillator as if it were oscillating independently in 1D along each of the three axes.

Classical oscillator: any energy is allowed. Quantum oscillator: the energy levels are equidistant, Eₙ = (n + ½)ħω with n = 0, 1, 2, ...

The solid's internal energy: U = Σ (nᵢ + ½)ħω (sum over all 3N oscillators) = (3N/2)ħω + ε·Σ nᵢ, where the first term is the zero-point energy. The effective internal energy, measured from the zero-point energy, is U = ε·Σ nᵢ = εq. All oscillators are identical, and the energy quanta are the same.

The Einstein Model of a Solid (cont.)

Measured heat capacities dU/dT near room temperature, in J/(K·mol): Lead 26.4, Gold 25.4, Copper 24.5, Iron 25.0, with Silver and Aluminum likewise close to 3R ≈ 25 J/(K·mol).

At high temperatures, kB·T >> ħω (the classical limit of large nᵢ), each oscillator holds on average kB·T of energy, so U ≈ 3N·kB·T and dU/dT ≈ 3N·kB = 3R per mole ≈ 25 J/(K·mol): the Dulong-Petit rule, in agreement with the values above.

To describe a macrostate of an Einstein solid, we have to specify N and U; a microstate requires the nᵢ for all 3N oscillators.

Example: the "macrostates" of an Einstein model with only one atom (three oscillators): Ω(1,0) = 1, Ω(1,1) = 3, Ω(1,2) = 6, Ω(1,3) = 10.
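
The one-atom multiplicities quoted above can be verified by brute force; the short sketch below (not from the lecture) counts the ways of distributing q quanta over the atom's three oscillators:

```python
# Count triples (n1, n2, n3) of non-negative integers with n1 + n2 + n3 = q,
# i.e. the microstates of a one-atom (3-oscillator) Einstein solid with q quanta.
from itertools import product

def omega_one_atom(q):
    return sum(1 for n in product(range(q + 1), repeat=3) if sum(n) == q)

print([omega_one_atom(q) for q in range(4)])   # [1, 3, 6, 10]
```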

The Multiplicity of an Einstein Solid

The multiplicity of a state of N oscillators (N/3 atoms) with q energy quanta distributed among these oscillators:

Ω(N, q) = (q + N − 1)! / (q! (N − 1)!)

Proof: represent a microstate of the N oscillators schematically as a row of q dots (the quanta) separated by N − 1 vertical lines (the partitions between oscillators), a total of q + N − 1 symbols. For given q and N, the multiplicity is the number of ways of choosing q of the symbols to be dots, q.e.d.

In terms of the total internal energy U = εq: Ω(N, U) = (U/ε + N − 1)! / ((U/ε)! (N − 1)!).

Example: the multiplicity of an Einstein solid with three atoms (N = 9 oscillators) and eight units of energy shared among them is Ω(9, 8) = 16! / (8!·8!) = 12,870.
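
The counting formula can be checked against both examples with math.comb (a sketch; the function name is ours):

```python
# Omega(N, q) = (q + N - 1)! / (q! (N - 1)!) = C(q + N - 1, q)
# for N oscillators sharing q energy quanta.
from math import comb

def omega_einstein(N_osc, q):
    return comb(q + N_osc - 1, q)

print(omega_einstein(3, 3))   # 10    -> one atom (3 oscillators), 3 quanta
print(omega_einstein(9, 8))   # 12870 -> three atoms (9 oscillators), 8 quanta
```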

Multiplicity of a Large Einstein Solid (kB·T >> ε)

q = U/ε = N·⟨n⟩ is the total number of energy quanta in the solid; ⟨n⟩ = U/(εN) is the average number of quanta (excitation level) per oscillator.

High temperatures (kB·T >> ε, ⟨n⟩ >> 1, q >> N):

Ω(N, q) = (q + N − 1)! / (q! (N − 1)!) ≈ (e·q/N)^N ∝ U^N

(the approximate form follows from Stirling's formula). In this limit each oscillator carries on average kB·T of energy, so U ≈ N·kB·T and, per mole of atoms (N = 3NA oscillators), dU/dT ≈ 3R: the Dulong-Petit rule again.

An Einstein solid has 2N quadratic degrees of freedom (kinetic plus potential for each oscillator), so Ω ∝ U^N is consistent with the general statement: for any system with N "quadratic" degrees of freedom (an "unlimited" spectrum), the multiplicity is proportional to U^(N/2).
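
A numerical comparison of the exact multiplicity with this high-temperature estimate (a sketch under the stated assumption Ω ≈ (e·q/N)^N; the comparison is done on ln Ω, the quantity that will enter the entropy):

```python
# Exact ln(Omega) vs the high-temperature approximation N*ln(e*q/N) for q >> N.
from math import comb, log, e

def ln_omega_exact(N, q):
    return log(comb(q + N - 1, q))

def ln_omega_high_T(N, q):
    return N * log(e * q / N)     # = N * (ln(q/N) + 1)

N = 100
for q in (1_000, 10_000, 100_000):
    print(q, ln_omega_exact(N, q), ln_omega_high_T(N, q))
# The fractional difference in ln(Omega) is about 1% or less for these values;
# the neglected terms are of order ln(q) and ln(N), small compared with N*ln(q/N).
```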

Multiplicity of a Large Einstein Solid (kB·T << ε)

Low temperatures (kB·T << ε, ⟨n⟩ << 1, q << N):

Ω(N, q) ≈ (e·N/q)^q

(Pr. 2.17).
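
The same kind of check in the opposite limit (a sketch assuming the low-temperature form Ω ≈ (e·N/q)^q quoted above):

```python
# Exact ln(Omega) vs the low-temperature approximation q*ln(e*N/q) for q << N.
from math import comb, log, e

def ln_omega_exact(N, q):
    return log(comb(q + N - 1, q))

def ln_omega_low_T(N, q):
    return q * log(e * N / q)     # = q * (ln(N/q) + 1)

N = 100_000
for q in (10, 100, 1_000):
    print(q, ln_omega_exact(N, q), ln_omega_low_T(N, q))
# The two estimates of ln(Omega) agree to a few percent or better here.
```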

Microstates of a System (e.g., an Ideal Gas)

The evolution of a system can be represented by a trajectory in the multidimensional (configuration, phase) space of micro-parameters. Each point in this space represents a microstate. During its evolution, the system will only pass through accessible microstates, i.e. the ones that do not violate the conservation laws: e.g., for an isolated system, the total internal energy must be conserved.

Microstate: the state of a system specified by describing the quantum state of each molecule in the system. For a classical particle this means 6 parameters (x_i, y_i, z_i, p_xi, p_yi, p_zi); for a macro system, 6N parameters.

Statistics → Probabilities of Macrostates

The statistical approach: to connect the macroscopic observables (averages) to the probability for a certain microstate to appear along the system's trajectory in configuration space.

Macrostate: the state of a macro system specified by its macroscopic parameters. Two systems with the same values of the macroscopic parameters are thermodynamically indistinguishable. A macrostate tells us nothing about the state of an individual particle. For a given set of constraints (conservation laws), a system can be in many macrostates.

The Phase Space vs. the Space of Macroparameters

[Figure: a single macrostate, a point (P, V, T) on the surface defined by the equation of state in the space of macroparameters, corresponds to numerous microstates in the multi-dimensional configuration (phase) space.]

Examples: Two-Dimensional Configuration Space

Motion of a particle in a one-dimensional box (−L ≤ x ≤ L): the "macrostates" are characterized by a single parameter, the kinetic energy K = K₀. In the two-dimensional configuration (phase) space (x, p_x), each such macrostate corresponds to a continuum of microstates, characterized by specifying the position and momentum: the two segments p_x = ±√(2mK₀) with −L < x < L.

Another example: the one-dimensional harmonic oscillator with potential U(x), where the "macrostate" K + U = const corresponds to a closed curve (an ellipse) in the (x, p_x) plane.

The Fundamental Assumption of Statistical Mechanics

The ergodic hypothesis: an isolated system in an equilibrium state, evolving in time, will pass through all the accessible microstates (the microstates which correspond to the same energy) at the same recurrence rate, i.e. all accessible microstates are equally probable. The ensemble of all equi-energetic states is called a microcanonical ensemble.

The average over long times will equal the average over the ensemble of all equi-energetic microstates: if we take a snapshot of a system with N microstates, we will find the system in any of these microstates with the same probability. For a stationary system, the probability obtained from many identical measurements on a single system equals the probability obtained from a single measurement on many copies of the system.

Probability of a Macrostate, Multiplicity

The probability of a certain macrostate is determined by how many microstates correspond to this macrostate, i.e. by the multiplicity Ω of the given macrostate:

P(macrostate) = Ω(macrostate) / (total number of accessible microstates)

This approach will help us to understand why some macrostates are more probable than others, and, eventually, by considering interacting systems, we will understand the irreversibility of processes in macroscopic systems.
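
This proportionality can be illustrated by random sampling. The sketch below (illustrative, B = 0) draws random microstates of a small two-state system and compares the observed macrostate frequencies with Ω/2^N:

```python
# Sample random microstates of N two-state spins and compare the frequency of
# each macrostate (labelled by N_up) with its predicted probability Omega / 2**N.
import random
from math import comb
from collections import Counter

N, samples = 10, 200_000
counts = Counter(sum(random.randint(0, 1) for _ in range(N))   # N_up of a random microstate
                 for _ in range(samples))

for n_up in range(N + 1):
    predicted = comb(N, n_up) / 2**N
    observed = counts[n_up] / samples
    print(n_up, round(predicted, 4), round(observed, 4))
```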

Concepts of Statistical Mechanics

- The macrostate is specified by a sufficient number of macroscopically measurable parameters (for an Einstein solid: N and U).
- The microstate is specified by the quantum state of each particle in the system (for an Einstein solid: the number of energy quanta for each of the N oscillators).
- The multiplicity is the number of microstates in a macrostate. For each macrostate, there is an extremely large number of possible microstates that are macroscopically indistinguishable.
- The fundamental assumption: for an isolated system, all accessible microstates are equally likely. The probability of a macrostate is proportional to its multiplicity. This will be sufficient to explain irreversibility.