Chapter 2. Thermodynamics, Statistical Mechanics, and Metropolis Algorithm (2.1~2.5)
Minsu Lee
Adaptive Cooperative Systems, Martin Beckerman

Contents
2.1 Introduction
2.2 Key concepts
2.3 The measure of uncertainty
2.4 The most probable distribution
– 2.4.1 Configurations and weight factors
– 2.4.2 Entropic forms
2.5 The method of Lagrange multipliers

Introduction
In this chapter, we will
– Establish the formal structure and accompanying technical vocabulary for studying adaptive cooperative systems
– Use probabilistic (or stochastic) language
– Examine the concepts underlying the study of cooperative systems in the thermodynamic setting that gave them meaning
– Delineate the correspondence between the probabilistic, universal formalism and thermodynamics

Introduction
Thermodynamics (therme: heat, dynamis: power)
– The study of the conversion of energy into work and heat and its relation to macroscopic variables such as temperature and pressure
– Statistical thermodynamics: statistical predictions of the collective motion of particles from their microscopic behavior
– Caloric theory (heat regarded as a fluid substance) → mechanistic theory (heat correctly identified with molecular activity)
History of thermodynamics
– Daniel Bernoulli (1738): the modern billiard-ball model in his kinetic theory of gases (mechanistic view)
– Sadi Carnot (1824): mechanical model
1st law of thermodynamics – conservation of energy
– The thermodynamic setting in terms of internal energy, work done by the system, and heat absorbed by it
2nd law of thermodynamics – entropy
– Existence of a non-decreasing parameter of state

Introduction
History of thermodynamics (continued)
– Max Planck: quantum theory, answering problems posed by radiant heat phenomena
– Boltzmann and Gibbs: formulation of statistical mechanics, relating macroscopic thermodynamic observations to microscopic, stochastic processes
– Clausius: the increase in entropy of a system that is not thermally isolated is associated with the differential amount of heat absorbed
– Shannon (1948): generalization of the thermodynamic concept of entropy to a universal measure of uncertainty
– Jaynes (1956): entropy can be used to generate probability distributions, including, in particular, those of the statistical mechanics of Boltzmann and Gibbs

Key Concepts
Entropy (2.3)
– Information-theoretic measure of missing information and uncertainty
Maximum entropy distribution (2.4)
– In the special case where the probabilities are known from measurements of the relative frequencies of a set of events, the maximum entropy distribution is the most probable one
Method of Lagrange's undetermined multipliers (2.5)
– Deduces the maximally noncommittal form of the probability distribution in the vast majority of experimental instances, where the available data are limited

Key Concepts
Statistical mechanics (2.6)
– Formal structure of an inferencing theory
Concepts and formalism of statistical mechanics and thermodynamics (2.7, 2.8)
– The uniform probability distribution, and the broad class of exponential forms that includes the Gibbs distributions, emerge as maximum entropy distributions subject to simple moment constraints
– Microcanonical and canonical (Gibbs) distributions
Monte Carlo method (2.9)
Markov chains (2.10)
– Convergence theorem for Markov chains
Metropolis sampling algorithm (2.11)
– Designed to generate a Markov chain in a manner that satisfies the requisite conditions for the convergence theorem to hold
– Used to simulate a Gibbs distribution; a minimal sketch of the acceptance rule follows below
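Since these slides only summarize the Metropolis algorithm (it is developed in Sections 2.9–2.11), here is a minimal illustrative sketch of the acceptance rule for sampling a Gibbs distribution over a few discrete states. The energy levels, temperature, and proposal scheme are arbitrary choices for the example, not taken from the text.

```python
import math
import random

def metropolis_sample(energies, temperature, n_steps=50_000, seed=0):
    """Minimal Metropolis sketch: visit discrete states i with frequencies
    approaching the Gibbs distribution p_i proportional to exp(-E_i / T)."""
    rng = random.Random(seed)
    n = len(energies)
    state = rng.randrange(n)
    counts = [0] * n
    for _ in range(n_steps):
        proposal = rng.randrange(n)                 # symmetric proposal
        d_e = energies[proposal] - energies[state]
        # Accept downhill moves always; uphill moves with probability exp(-dE/T)
        if d_e <= 0 or rng.random() < math.exp(-d_e / temperature):
            state = proposal
        counts[state] += 1
    return [c / n_steps for c in counts]

if __name__ == "__main__":
    energies = [0.0, 1.0, 2.0]                      # illustrative energy levels
    T = 1.0
    empirical = metropolis_sample(energies, T)
    z = sum(math.exp(-e / T) for e in energies)
    gibbs = [math.exp(-e / T) / z for e in energies]
    print("empirical:", [round(p, 3) for p in empirical])
    print("Gibbs    :", [round(p, 3) for p in gibbs])
```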

The Measure of Uncertainty
Suppose
– X: a variable taking the values x_1, x_2, ..., x_r
– p_i: the probabilities of these values, i = 1, ..., r
– S: a function called the entropy of the probability distribution
  – A measure of the uncertainty associated with the probability distribution
  – The amount of missing information which, if provided, would allow us to infer with complete certainty the value of the variable X
Goal: determine the form of the entropy function based solely upon consistency conditions with respect to continuity, monotonicity, and composition

The Measure of Uncertainty
Continuity
– The uncertainty measure S is a continuous function of the probabilities associated with each outcome
Monotonicity
– If the probabilities are all equal to one another (p_i = 1/r), the uncertainty A(r) = S(1/r, ..., 1/r) is a monotonically increasing function of r
– When the possible outcomes are equally likely, the uncertainty increases as the number of possibilities grows
Composition
– If a choice can be decomposed into two successive choices, the original uncertainty measure should equal the weighted sum of the component uncertainties

The Measure of Uncertainty
More generally,
– Place each of the r probabilities into one of s distinct groups, with p_1 to p_{m_1} placed in the first group, p_{m_1+1} to p_{m_1+m_2} in the second group, and so on
– The probability of observing one of the elements of the first group is q_1 = p_1 + p_2 + ... + p_{m_1}, of the second group q_2 = p_{m_1+1} + ... + p_{m_1+m_2}, and so on
– The composition rule then assumes the general form
  S(p_1, ..., p_r) = S(q_1, ..., q_s) + q_1 S(p_1/q_1, ..., p_{m_1}/q_1) + q_2 S(p_{m_1+1}/q_2, ..., p_{m_1+m_2}/q_2) + ...
– q_1, q_2, ...: weight factors, which take into account that the uncertainties due to the additional groupings are encountered with probability q_i
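The grouping rule is easy to check numerically. The sketch below (distribution and grouping chosen arbitrarily for the example) verifies that the entropy of the full distribution equals the entropy of the group probabilities plus the q_i-weighted entropies of the within-group conditional distributions.

```python
import math

def entropy(probs):
    """Shannon entropy S = -sum p_i ln p_i (natural log, K = 1)."""
    return -sum(p * math.log(p) for p in probs if p > 0)

# Arbitrary example: r = 5 outcomes split into s = 2 groups of sizes 3 and 2.
p = [0.1, 0.2, 0.3, 0.25, 0.15]
groups = [p[:3], p[3:]]
q = [sum(g) for g in groups]                      # group probabilities q_i

lhs = entropy(p)
rhs = entropy(q) + sum(qi * entropy([pj / qi for pj in g])
                       for qi, g in zip(q, groups))
print(lhs, rhs)          # both are approximately 1.5445
assert abs(lhs - rhs) < 1e-12
```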

The Measure of Uncertainty
Deduce the form of the uncertainty measure S
– Represent the probabilities as ratios of integers m_i: p_i = m_i / M, where M = m_1 + m_2 + ... + m_r
– Subdivide each p_i to the point where each grouping is generated by events with probability 1/M; within each group there are m_i elements
– By the composition rule, A(M) = S(p_1, ..., p_r) + Σ_i p_i A(m_i), where A(n) denotes the uncertainty of n equally likely outcomes
– Assume the m_i are each equal to an integer m, so that M = r·m; successively partitioning the elements into equally likely groups of equally likely elements then gives A(r·m) = A(r) + A(m)
– The only monotonically increasing solution of this relation (∵ monotonicity, composition) is A(r) = K ln r, with K an arbitrary positive constant
– Substituting back, S(p_1, ..., p_r) = A(M) − Σ_i p_i A(m_i) = −K Σ_i p_i ln(m_i / M) = −K Σ_i p_i ln p_i

The Measure of Uncertainty
Entropy: the measure of uncertainty as to the value of the variable X
S = −K Σ_i p_i ln p_i
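A small numerical illustration (the distributions are arbitrary examples): the entropy is largest for the uniform distribution and shrinks as the distribution becomes sharply peaked, i.e., as the uncertainty about X decreases.

```python
import math

def entropy(probs, k=1.0):
    """S = -K sum p_i ln p_i; terms with p_i = 0 contribute nothing."""
    return -k * sum(p * math.log(p) for p in probs if p > 0)

uniform = [0.25, 0.25, 0.25, 0.25]     # maximal uncertainty for r = 4
peaked  = [0.85, 0.05, 0.05, 0.05]     # nearly certain outcome
certain = [1.0, 0.0, 0.0, 0.0]         # no uncertainty at all

print(entropy(uniform))   # ln 4, about 1.386
print(entropy(peaked))    # about 0.588
print(entropy(certain))   # 0.0
```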

The Most Probable Distribution
Configurations and Weight Factors
– N elements; place each element into one of M bins
– n_i: the number of elements in the ith bin, with n_1 + n_2 + ... + n_M = N
– p_i = n_i / N: the probability of finding an element belonging to a particular bin
– Example situations
  – Systems of N molecules distributed among i = 1, ..., M quantum states
  – Messages containing N symbols chosen from an alphabet of M letters, the ith letter occurring n_i times
– A distribution of the N elements among the M cells constitutes a particular configuration of the system
– The number of ways a specific configuration can be realized is the multinomial coefficient
  W = N! / (n_1! n_2! ... n_M!)
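A direct way to compute the weight factor is sketched below; the occupation numbers are arbitrary examples.

```python
from math import factorial

def weight(occupations):
    """Multinomial coefficient W = N! / (n_1! n_2! ... n_M!):
    the number of ways to realize a configuration {n_i}."""
    n_total = sum(occupations)
    w = factorial(n_total)
    for n in occupations:
        w //= factorial(n)
    return w

print(weight([2, 0]))     # 1
print(weight([1, 1]))     # 2
print(weight([2, 4]))     # 15, the example worked out on the next slide
print(weight([3, 5, 2]))  # 2520
```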

The Most Probable Distribution
Configurations and Weight Factors
– Ex. 1) N = 2, M = 2
  (1) n_1 = 2, n_2 = 0; (2) n_1 = 1, n_2 = 1; (3) n_1 = 0, n_2 = 2
  Multinomial coefficients (weights): W = 1, 2, and 1, respectively
– Ex. 2) M = 2, N = 6, and n_1 = 2, n_2 = 4 → there are 15 configurations
  First of the two elements of cell one occurring first: 5
    122221, 122212, 122122, 121222, 112222
  First "1" occurring second: 4
    212221, 212212, 212122, 211222
  Initial "1" third: 3
    221221, 221212, 221122
  Initial "1" fourth: 2
    222121, 222112
  "1" fifth: 1
    222211
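The count of 15 can also be confirmed by brute force; the sketch below enumerates every assignment of the six elements to the two cells and keeps those with occupation numbers (2, 4).

```python
from itertools import product

# Each configuration assigns one of M = 2 cells to each of N = 6 elements.
configs = [c for c in product("12", repeat=6)
           if c.count("1") == 2 and c.count("2") == 4]

print(len(configs))                       # 15 = 6! / (2! 4!)
print(["".join(c) for c in configs][:5])  # e.g. ['112222', '121222', ...]
```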

The Most Probable Distribution
Configurations and Weight Factors
– Take the logarithm of W, apply Stirling's formula (ln n! ≈ n ln n − n), and assume that the n_i and N are large:
  ln W ≈ N ln N − Σ_i n_i ln n_i = −N Σ_i p_i ln p_i, so that (1/N) ln W ≈ S/K
– Maximizing the entropy corresponds to maximizing the multiplicity, or weight factor
– Maximizing the entropy is equivalent to determining the set {n_i} that can be realized in the greatest number of ways
– In those instances where the probabilities can be interpreted as a frequency distribution, the maximum entropy distribution is the most probable one
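A quick numerical check of this approximation (occupation numbers chosen arbitrarily): even for modest N, (ln W)/N is already close to the entropy −Σ_i p_i ln p_i, and the two converge as N grows.

```python
from math import lgamma, log

def ln_w(occupations):
    """ln of the multinomial weight factor W, using ln n! = lgamma(n + 1)."""
    n_total = sum(occupations)
    return lgamma(n_total + 1) - sum(lgamma(n + 1) for n in occupations)

def entropy(occupations):
    """-sum p_i ln p_i with p_i = n_i / N (the Stirling estimate of ln W / N)."""
    n_total = sum(occupations)
    return -sum((n / n_total) * log(n / n_total) for n in occupations if n > 0)

for occ in [[2, 4], [20, 40], [200, 400]]:
    print(occ, round(ln_w(occ) / sum(occ), 4), round(entropy(occ), 4))
    # ln W / N: 0.4513, 0.5995, 0.6309  vs  entropy 0.6365 in every row
```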

The Most Probable Distribution
Entropic Forms
– S = K ln W
  Essential in some formulations of statistical mechanics
  The relationship between S and ln W is represented as a proportionality by Boltzmann; the full equality identifies the constant K as Boltzmann's constant
– W = e^(S/K)
  Rapid increase in the number of states with increasing entropy
– Entropy is closely connected with probability
– The entropy of a physical system in a definite state depends solely on the probability of that state

The Method of Lagrange Multipliers
Limited data with constraints
How do we make reasonable inferences, or construct hypotheses, from limited information?
→ Identify the measure of uncertainty
→ Determine the form of the probabilities when the entropy is maximal subject to the constraints

The Method of Lagrange Multipliers
Lagrange's method of undetermined multipliers
– Find the extremum of a function f(x, y)
– For arbitrary independent variations in the x- and y-directions, the extremum occurs when
  df = (∂f/∂x) dx + (∂f/∂y) dy = 0, i.e., ∂f/∂x = 0 and ∂f/∂y = 0
– Find the extremum along a curve g(x, y) = C, where C is a constant; the variations dx and dy are then no longer independent, since
  dg = (∂g/∂x) dx + (∂g/∂y) dy = 0
– Combining the two conditions,
  (∂f/∂x) / (∂g/∂x) = (∂f/∂y) / (∂g/∂y)
– If λ denotes the common ratio (the Lagrange multiplier),
  ∂f/∂x − λ ∂g/∂x = 0 and ∂f/∂y − λ ∂g/∂y = 0
– Define a new generating function F(x, y) = f(x, y) − λ [g(x, y) − C], whose unconstrained extremum reproduces these conditions
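As a concrete illustration (the function, constraint, and use of sympy are my own example, not from the text), the sketch below finds the extremum of f(x, y) = x·y along the line x + y = 4 by solving the Lagrange conditions together with the constraint.

```python
import sympy as sp

x, y, lam = sp.symbols("x y lam", real=True)

f = x * y                 # function to be extremized
g = x + y                 # constraint: g(x, y) = C
C = 4

# Generating function F = f - lam * (g - C); extremum where all partials vanish
F = f - lam * (g - C)
solution = sp.solve([sp.diff(F, x), sp.diff(F, y), g - C], [x, y, lam], dict=True)
print(solution)           # [{x: 2, y: 2, lam: 2}] -> extremum of xy on x + y = 4
```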

The Method of Lagrange Multipliers
Lagrange's method of undetermined multipliers
– Generalize the results to n independent variables x_1, ..., x_n
– Further extend to instances with constraint equations g_k(x_1, ..., x_n) = C_k, k = 1, ..., m, with m < n (fewer constraint equations than independent variables)
– m Lagrange multipliers λ_k, one for each constraint equation
– The conditions for an extremum become
  ∂f/∂x_i − Σ_k λ_k ∂g_k/∂x_i = 0, i = 1, ..., n
– The generating function becomes
  F = f − Σ_k λ_k (g_k − C_k)
– Apply this procedure for finding the extremum of a function subject to constraints to derive the form of the maximally noncommittal probability distributions; a numerical preview is sketched below
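To preview where this leads (a sketch with my own choice of energy levels and mean-value constraint, not the book's worked example), the code below numerically maximizes −Σ_i p_i ln p_i subject to Σ_i p_i = 1 and a fixed mean Σ_i p_i E_i. The result takes the exponential, Gibbs-like form p_i ∝ exp(−β E_i) that Sections 2.6–2.8 obtain analytically.

```python
import numpy as np
from scipy.optimize import minimize

# Illustrative "energy" levels and target mean value (arbitrary choices)
E = np.array([0.0, 1.0, 2.0, 3.0])
target_mean = 1.2

def neg_entropy(p):
    """Negative of S = -sum p_i ln p_i (minimize the negative to maximize S)."""
    p = np.clip(p, 1e-12, None)
    return float(np.sum(p * np.log(p)))

constraints = [
    {"type": "eq", "fun": lambda p: np.sum(p) - 1.0},            # normalization
    {"type": "eq", "fun": lambda p: np.dot(p, E) - target_mean},  # fixed mean
]
p0 = np.full(len(E), 1.0 / len(E))
result = minimize(neg_entropy, p0, method="SLSQP",
                  constraints=constraints, bounds=[(0.0, 1.0)] * len(E))

p = result.x
print("max-entropy p:", np.round(p, 4))
# For the exponential form p_i = exp(-beta * E_i) / Z, ln p_i is linear in E_i,
# so consecutive differences of ln p_i are (nearly) constant, equal to -beta:
print("ln p differences:", np.round(np.diff(np.log(p)), 3))
```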