
THEORY

The previous argument was wrong; recall the halting theorem!

It is possible to construct codes for letters such that no codeword for one letter appears as a prefix of the codeword for any other letter. Then a sequence of symbols without any gaps can be unambiguously cut into pieces representing individual letters (the gaps can be reintroduced automatically).
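Such a prefix-free code can be decoded greedily, exactly as described. A minimal sketch in Python (the code table here is an illustrative assumption, not taken from the slides):

```python
# Decode a gapless bit string written with a prefix-free code:
# no codeword is a prefix of another, so greedy matching is unambiguous.

CODE = {"a": "0", "b": "10", "c": "110", "d": "111"}  # illustrative prefix-free code
DECODE = {v: k for k, v in CODE.items()}

def decode(bits: str) -> str:
    letters = []
    buf = ""
    for bit in bits:
        buf += bit
        if buf in DECODE:          # a complete codeword has been read
            letters.append(DECODE[buf])
            buf = ""               # the "gap" is reintroduced automatically
    if buf:
        raise ValueError("trailing bits do not form a codeword")
    return "".join(letters)
```

For example, "abc" is encoded as "0" + "10" + "110" = "010110", and `decode("010110")` recovers "abc" without any separators in the bit stream.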

How do we measure the quantity of information? The starting point is the requirement that a message saying an improbable event happened carries more information than a message reporting that a probable event happened. The quantity of information is therefore related to the probability of the message carrying that information: information about an event is related to the probability of that event before it happened. Next we require that information can be communicated in parts. For example, I can first announce that the unknown number is odd, and only then the number itself. It is natural to require that the information from the partial messages add up to the total information. Since the probabilities of independent events combine multiplicatively, while the corresponding independent pieces of information should combine additively, it is natural to take the amount of information to be the (negative) logarithm of the corresponding probability, I = −log p.
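The additivity requirement can be checked numerically. A small sketch (the eight-number guessing game is an illustrative assumption): for a number drawn uniformly from 1..8, first announcing its parity and then the number itself carries the same total information as announcing the number outright.

```python
import math

def info(p: float) -> float:
    """Information (in bits) carried by a message of prior probability p."""
    return -math.log2(p)

# Unknown number drawn uniformly from {1, ..., 8}.
direct = info(1 / 8)          # announce the number at once: 3 bits

part1 = info(1 / 2)           # "the number is odd": probability 1/2
part2 = info(1 / 4)           # the number itself, given its parity: 4 candidates left
print(direct, part1 + part2)  # the partial informations add up: 3.0 3.0
```

The probabilities multiply (1/2 · 1/4 = 1/8) while the informations add (1 + 2 = 3 bits), which is exactly the logarithmic relation.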

Information is, in fact, measured by the price of the telex carrying it, if the telex is priced at some amount of money per letter. Using logarithms of different bases just changes the units in which we measure the amount of information (changing the base multiplicatively renormalizes the logarithm). The unit corresponding to the binary logarithm is called the bit; the unit corresponding to the natural logarithm is called the nat.

By the way, we have thereby proved the optimal coding theorem again: one should use codewords of length l_i = −log q_i with q_i = p_i, that is, codewords of length l_i = −log p_i.
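The statement can be illustrated numerically: with codeword lengths l_i = −log2 p_i, the mean codeword length equals the Shannon entropy of the source. A short sketch (the dyadic distribution is an illustrative assumption, chosen so the optimal lengths are integers):

```python
import math

p = [0.5, 0.25, 0.125, 0.125]             # assumed source distribution (dyadic)

lengths = [-math.log2(pi) for pi in p]    # optimal codeword lengths l_i = -log2 p_i
mean_length = sum(pi * li for pi, li in zip(p, lengths))
entropy = -sum(pi * math.log2(pi) for pi in p)

print(mean_length, entropy)  # both equal 1.75 bits for this distribution
```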

So we see that the statistical entropy can be interpreted in the following way. Imagine that somebody gives us a sample of some macrostate. A macrostate is a virtual notion: what he actually hands us is some specific microstate; he just does not tell us which microstate from the corresponding ensemble he gave us. Our knowledge of the system thus corresponds only to its macrostate; we are completely unaware of the microstate actually delivered. Now imagine that someone tells us which particular microstate was delivered. Our original ignorance is then changed into complete knowledge. The amount of information contained in that message is −log(p_i), where p_i is the probability assigned by the statistical ensemble to the microstate actually delivered.

So the mean amount of information needed to complete our knowledge from the macrostate to the microstate level is ⟨I⟩ = −Σ_i p_i log p_i, which is just the statistical entropy. In thermodynamics the entropy carries the Boltzmann constant as a conversion factor, S = −k_B Σ_i p_i ln p_i; thermodynamic entropy is therefore measured in J/K.
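As a check of the units, one can evaluate S = −k_B Σ_i p_i ln p_i for a simple two-state system (the equal probabilities are an illustrative assumption):

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K

def gibbs_entropy(probs):
    """Thermodynamic entropy S = -k_B * sum_i p_i ln p_i, in J/K."""
    return -K_B * sum(p * math.log(p) for p in probs if p > 0)

# Two microstates with equal probability: S = k_B ln 2.
S = gibbs_entropy([0.5, 0.5])
print(S)  # ~9.57e-24 J/K, i.e. k_B * ln 2
```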

Entropy is maximal (for an isolated system in equilibrium)

Free energy is minimal (for a system in equilibrium at a given temperature)

How to calculate a non-equilibrium free energy at a given temperature

Problem: how to define the microcanonical ensemble in classical physics, where the state space is continuous. For a continuous system we generally cannot define a uniform probability, since this notion does not survive a change of continuous variables. Solution: only canonical transformations are allowed; these preserve the phase-space volume (Liouville's theorem), so uniformity over the energy shell remains well defined.
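The non-invariance of "uniform probability" under a change of variables is easy to see numerically. A sketch (the substitution y = x² is an illustrative assumption): samples uniform in x on (0, 1) are no longer uniform in y.

```python
import random

random.seed(0)
xs = [random.random() for _ in range(100_000)]   # uniform on (0, 1)
ys = [x * x for x in xs]                         # change of variable y = x**2

# Fraction of samples in the lower half of each variable's range.
low_x = sum(1 for x in xs if x < 0.5) / len(xs)
low_y = sum(1 for y in ys if y < 0.5) / len(ys)
print(low_x, low_y)  # ~0.5 for x, but ~0.71 (= 1/sqrt(2)) for y: uniformity is lost
```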

Entropy for a classical system

Blind Monte Carlo sampling is inefficient: importance sampling is needed (this is, in effect, what Nature does).
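The point is that sampling microstates "blindly" (uniformly) wastes effort on states of negligible Boltzmann weight; importance sampling, e.g. the Metropolis algorithm, visits states with probability proportional to exp(−E/k_BT). A minimal Metropolis sketch for a single particle in a harmonic well (the potential, step size, and temperature are all illustrative assumptions):

```python
import math
import random

random.seed(1)
BETA = 1.0                       # 1/(k_B T), in illustrative units

def energy(x: float) -> float:
    return 0.5 * x * x           # harmonic potential

# Metropolis importance sampling: propose a move, accept with min(1, e^{-beta*dE}).
x, samples = 0.0, []
for step in range(200_000):
    x_new = x + random.uniform(-1.0, 1.0)
    dE = energy(x_new) - energy(x)
    if dE <= 0 or random.random() < math.exp(-BETA * dE):
        x = x_new
    samples.append(x)

mean_E = sum(energy(s) for s in samples) / len(samples)
print(mean_E)  # should approach the equipartition value <E> = 1/(2*beta) = 0.5
```

The chain spends its time where the Boltzmann factor is large, so the thermal average converges far faster than uniform ("blind") sampling of the same interval would.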

Zeroth law of thermodynamics

The problem is that an arbitrary final state cannot be reached from the reference state via a quasi-static adiabatic process.

The energy conservation law holds for irreversible processes as well, so in principle we can measure the energy of any macrostate. We first go from the reference state by an adiabatic process to a state which has the same values of the external parameters as the desired final state, but is still a different state, differing for example in the values of the conjugate parameters (in practice, in temperature). Then we keep the external parameters constant and supply heat to reach the desired final state. We "produce" the heat by an irreversible (Joule) process: mechanical work is performed on some auxiliary system (macroscopic, but very small compared with the system of interest) which is in contact with our system and exchanges heat with it. Because the auxiliary system is small, even if its temperature gradually increases, its own energy increases by a negligible amount. Historically the procedure was different from the Joule (work-to-heat) procedure just described: the notion of heat is older than the realization that heat can be measured in the units of work. The notion of heat developed from calorimetric measurements.

Problem: the heat capacity may depend on temperature, so the more realistic calorimetric equation is Q = ∫ C(T) dT. Idea: parametrize the heat capacity by a function with a few free parameters, then perform a fit; in this way we can measure both work and heat. How to measure heat: keep the volume fixed, so that no work is done; bring the system into contact with hot water until the desired temperature is reached; the heat exchanged is obtained by measuring the final temperature of the hot water.
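The fitting idea can be sketched as follows (a minimal least-squares example in pure Python; the linear parametrization C(T) = a + b·T and the data points are illustrative assumptions, not measured values):

```python
# Parametrize the heat capacity as C(T) = a + b*T and fit a, b to
# measured (T, C) pairs by ordinary least squares (closed-form normal equations).

data = [(280.0, 4.18), (300.0, 4.20), (320.0, 4.23), (340.0, 4.26)]  # (T in K, C)

n = len(data)
sT = sum(T for T, _ in data)
sC = sum(C for _, C in data)
sTT = sum(T * T for T, _ in data)
sTC = sum(T * C for T, C in data)

b = (n * sTC - sT * sC) / (n * sTT - sT * sT)    # fitted slope
a = (sC - b * sT) / n                            # fitted intercept

# The heat needed to go from T1 to T2 is then the integral of C(T):
# Q = a*(T2 - T1) + (b/2)*(T2**2 - T1**2).
T1, T2 = 290.0, 330.0
Q = a * (T2 - T1) + 0.5 * b * (T2 * T2 - T1 * T1)
print(a, b, Q)
```

Once a and b are fitted, the calorimetric equation Q = ∫ C(T) dT can be evaluated in closed form for any temperature interval, instead of assuming a constant heat capacity.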

Historical problem: work and heat were measured in different units. First law of thermodynamics: there exists a universal unit-conversion constant J (the mechanical equivalent of heat) such that the sum of the work and the converted calories, W + JQ, needed to get from an initial state to a final state does not depend on the trajectory. Therefore there exists a state function E, which can be determined as ΔE = W + JQ; thus the energy conservation law was discovered: any energy change is balanced by the work performed (mechanical work + heat). To perform work there must be an agent. The work performed by the agent on the system is the same, but with opposite sign, as the work performed by the system on the agent. Therefore the energy change of the agent is just opposite to the energy change of the system, and the overall energy of system + agent is conserved.

Second law of thermodynamics: the integral ∫ δQ/T along a reversible path does not depend on the trajectory. So the entropy defined by dS = δQ_rev/T is a state function. For a cyclic reversible engine, ∮ δQ/T = 0.

What was essential for deriving the concavity property was that the natural parameters were extensive. The other thermodynamic potentials do not have purely extensive natural parameters; however, they are obtained by Legendre transformation, and the Legendre transformation preserves (or flips the sign of) concavity. So we obtain the corresponding concavity/convexity property for the other thermodynamic potentials as well.