Thermodynamics and the Gibbs Paradox Presented by: Chua Hui Ying Grace Goh Ying Ying Ng Gek Puey Yvonne.



Overview
The three laws of thermodynamics
The Gibbs Paradox
The Resolution of the Paradox: Gibbs / Jaynes; Von Neumann; Shu Kun Lin's revolutionary idea
Conclusion

The Three Laws of Thermodynamics
1st Law: Energy is always conserved.
2nd Law: The entropy of the Universe always increases.
3rd Law: The entropy of a perfect crystalline substance is taken as zero at the absolute temperature of 0 K.

Unravel the mystery of The Gibbs Paradox

The mixing of non-identical gases

Shows obvious increase in entropy (disorder)
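As a concrete check, the standard ideal-gas mixing formula (not stated on the slides, but implied by them) can be sketched in a few lines. Note that only the mole fractions enter, never the identities of the gases; this is the seed of the paradox.

```python
import math

R = 8.314  # molar gas constant, J/(mol K)

def entropy_of_mixing(moles):
    """Ideal-gas entropy of mixing: Delta_S = -R * sum(n_i * ln x_i).

    Only the mole fractions x_i appear; the nature of the
    (non-identical) gases being mixed never enters the formula.
    """
    total = sum(moles)
    return -R * sum(n * math.log(n / total) for n in moles)

# One mole each of two different gases: Delta_S = 2 R ln 2, about 11.5 J/K.
delta_S = entropy_of_mixing([1.0, 1.0])
```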

The mixing of identical gases

Shows zero increase in entropy, as the process is reversible

Compare the two scenarios of mixing and we realize that the entropy of mixing is the same however similar the gases are, yet drops abruptly to zero the moment they become identical: this is the Gibbs Paradox.

To Resolve the Contradiction
Look at how people have approached it:
1. Gibbs / Jaynes
2. Von Neumann
3. Lin Shu Kun

Gibbs' Opinion
When 2 non-identical gases mix and entropy increases, we imply that the gases can be separated and returned to their original state.
When 2 identical gases mix, it is impossible to separate the two gases into their original states, as there is no recognizable difference between the gases.

Gibbs' Opinion (2)
Thus, these two cases stand on different footings and should not be compared with each other.
The mixing of gases of different kinds that resulted in the entropy change was independent of the nature of the gases, and hence independent of the degree of similarity between them.

[Graph: entropy S against similarity Z. S stays at S_max for all Z < 1 and drops discontinuously to S = 0 at Z = 1.]

Jaynes' Explanation
The entropy of a macrostate is given as S(X) = k log W(C), where S(X) is the entropy associated with a chosen set of macroscopic quantities and W(C) is the phase volume occupied by all the microstates in a chosen reference class C.

Jaynes' Explanation (2)
This thermodynamic entropy S(X) is not a property of a microstate, but of a certain reference class C(X) of microstates.
For entropy to always increase, we need to specify the variables we want to control and those we want to change.
Any manipulation of variables outside this chosen set may cause us to see a violation of the second law.
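Jaynes' point that entropy belongs to the reference class, not the microstate, can be illustrated with a toy sketch of our own (4 coins stand in for a physical system; the constant k is set to 1; none of these specifics come from the slides):

```python
import math
from itertools import product

# All microstates of a toy "system": heads/tails strings of 4 coins.
microstates = list(product("HT", repeat=4))

def entropy(constraint):
    """S = log W(C), with W(C) = number of microstates in reference class C."""
    W = sum(1 for m in microstates if constraint(m))
    return math.log(W)

# Coarse macroscopic description: "2 heads in total"  ->  W = 6 microstates.
S_coarse = entropy(lambda m: m.count("H") == 2)

# Finer description: "1 head in each half"  ->  W = 4 microstates.
# Choosing to control an extra variable changes the entropy we assign.
S_fine = entropy(lambda m: m[:2].count("H") == 1 and m[2:].count("H") == 1)
```

The same microstate can fall in both classes, yet the two descriptions carry different entropies, which is exactly why the "same" mixing process can have different entropy changes under different descriptions.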

Von Neumann's Resolution
Makes use of the quantum mechanical approach to the problem.
He derives an entropy-of-mixing equation in which a parameter α measures the degree of orthogonality, which is the degree of similarity between the gases.

Von Neumann's Resolution (2)
Hence when α = 0 entropy is at its highest, and when α = 1 entropy is at its lowest.
Therefore entropy decreases continuously with increasing similarity.

[Graph: entropy S against similarity Z. S decreases continuously from S_max at Z = 0 to S = 0 at Z = 1.]

Resolving the Gibbs Paradox: Lin Shu Kun
Uses entropy and its revised relation with similarity, as proposed by Lin Shu Kun.
Draws a connection between information theory and entropy.
Proposed that entropy increases continuously with the similarity of the gases.

Analyse 3 concepts:
(1) high symmetry = high similarity, (2) entropy = information loss, and (3) similarity = information loss.
Why does "entropy increase with similarity"? Because of Lin's proposition that entropy is the degree of symmetry and information is the degree of non-symmetry.

(1) High symmetry = high similarity
Symmetry is a measure of indistinguishability.
High symmetry contributes to high indistinguishability.
Similarity can be described as a continuous measure of imperfect symmetry.
High symmetry → indistinguishability → high similarity

(2) Entropy = information loss
An increase in entropy means an increase in disorder.
A decrease in entropy reflects an increase in order.
A more ordered system is more highly organized, and thus possesses greater information content.
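The order/information link can be made quantitative with Shannon entropy. This is a sketch of our own; the four-state system and its two distributions are illustrative, not from the slides:

```python
import math

def shannon_entropy(probs):
    """Shannon entropy in bits: the missing information about
    which state the system is actually in."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Fully ordered: the state is known with certainty -> 0 bits missing.
ordered = shannon_entropy([1.0, 0.0, 0.0, 0.0])

# Maximally disordered over 4 equally likely states -> 2 bits missing,
# i.e. higher entropy means greater information loss.
disordered = shannon_entropy([0.25, 0.25, 0.25, 0.25])
```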

Do you have any idea what the picture is all about?

From the previous example, greater entropy results in less information being registered.
Higher entropy means higher information loss.
Thus, if the system is more ordered, it has lower entropy and thus less information loss.

(3) Similarity = information loss
For a system with distinguishable particles:
information on N particles = different information for each particle = N pieces of information.
For a system with indistinguishable particles:
information on N particles = information on 1 particle = 1 piece of information.
High similarity (high symmetry) therefore means greater information loss.
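The counting above can be sketched in a few lines. The "pieces of information" bookkeeping is our paraphrase of the slide, not Lin's notation, and the N! relabelling count is the standard combinatorial fact behind it:

```python
import math

def pieces_of_information(N, distinguishable):
    """Pieces of information needed to describe N particles."""
    # Distinguishable: each particle carries its own description (N pieces).
    # Indistinguishable: one description serves all N particles (1 piece).
    return N if distinguishable else 1

def equivalent_relabellings(N):
    """Labeled arrangements that collapse into a single arrangement
    when the N particles become indistinguishable: N! permutations."""
    return math.factorial(N)
```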

Concepts explained:
(1) high symmetry = high similarity, (2) entropy = information loss, and (3) similarity = information loss.
After establishing the links between the various concepts:
if a system is highly symmetrical → high similarity → greater information loss → higher entropy.

The mixing of identical gases (revisited)

Lin's Resolution of the Gibbs Paradox
Compared to the non-identical gases, we have less information about the identical gases.
According to his theory, less information = higher entropy.
Therefore, the mixing of identical gases should result in an increase in entropy.

Comparing the 3 graphs
[Three graphs of entropy S against similarity Z, from S = 0 to S_max and Z = 0 to Z = 1. Gibbs: S stays at S_max for all Z < 1 and drops discontinuously to 0 at Z = 1. Von Neumann: S decreases continuously from S_max to 0 as Z goes from 0 to 1. Lin: S increases continuously from 0 to S_max as Z goes from 0 to 1.]

Why are there different ways of resolving the paradox?
Different ways of considering entropy:
Lin — static entropy: consideration of configurations of fixed particles in a system.
Gibbs & von Neumann — dynamic entropy: dependent on the changes in the dispersal of energy among the microstates of atoms and molecules.

We cannot compare the two ways of resolving the paradox!
Since Lin's definition of entropy is essentially different from that of Gibbs and von Neumann, it is unjustified to compare the two ways of resolving the paradox.

Conclusion
The Gibbs Paradox poses a problem for the second law due to an inadequate understanding of the system involved.
Lin's novel idea sheds new light on entropy and information theory, but also leaves conflicting grey areas for further exploration.

Acknowledgements
We would like to thank:
Dr. Chin Wee Shong, for her support and guidance throughout the semester,
Dr. Kuldip Singh, for his kind support,
and all who have helped in one way or another.