
Introduction to Entropy

Entropy A formal attempt to quantify randomness by relating it to the amount of heat “wasted” by a thermodynamic process. The measure of microscopic uncertainty (or “mixed-up-ness” – Gibbs) that remains after macroscopic observables such as temperature, pressure, and volume have been measured.

“Wasted” heat “Wasted” heat refers to the amount of heat released by a thermodynamic process that the system cannot use to do work. This heat is present, but unavailable for work. Alternatively, it is the amount of heat needed to do the work required to keep the system in order. Example: heat loss in an internal combustion engine.

Order to disorder Nature tends to favor systems that move from an ordered state to a disordered state; randomness is always increasing. To create a stable (that is, ordered) thermodynamic system, I must “siphon off” some of the heat energy and use it to do the work required to keep the system in order. That energy/work is NOT available to help the system do work on its surroundings. Thus, entropy. Remember – work is a transfer of energy, not a quantity of energy! Work is not a state function.

Disorder in chemical reactions Chemical systems tend to move toward equilibrium – the state in which reactants and products are present in concentrations that have no further tendency to change with time. At equilibrium, all reactions are continuously reversible: forward and reverse reaction rates are equal. Think of an “ion soup” in which reactants and products bond and dissociate continuously.

Equilibrium = Disorder! The equilibrium state of a chemical reaction is its highest-entropy state. The rates of the forward and reverse reactions (bonding and dissociation) are equal, so on a molecular level the highest state of disorder exists at equilibrium (even though we usually think of equilibrium as “balance”).

Equilibrium = Disorder! The equilibrium state is the highest-entropy state for a system because equilibrium makes available the largest number of possible microstates. Entropy is also maximized at equilibrium because the equilibrium state retains no trace of the system’s initial conditions. Example – a glass of water at thermal equilibrium with the room: system, surroundings, and universe are all at thermal equilibrium. What was the initial temperature of the water in the glass? We have no way of knowing.

Formal definitions of entropy First formal definition, via statistics. Statistically, entropy is a numerical quantity that measures the randomness or uncertainty associated with the number of ways a system’s molecules can be arranged. Macrostate: readily observed characteristics such as T, P, V, and n. Microstate: the position, velocity, and energy of each individual molecule. This must be computed (or approximated) via probability and statistics.

Statistical definition of entropy Statistical thermodynamics is based on the fundamental assumption that all possible microconfigurations of a given system that satisfy the macroscopic boundary conditions – temperature, pressure, volume, and number of particles – are equally likely to occur in a system that is in thermal equilibrium. For our purposes, we will consider thermal and chemical equilibrium to be equivalent.

Statistical definition of entropy In other words, any particle in a system at thermal/chemical equilibrium has an equally probable chance of being in any permissible microstate. The total energy of a system in equilibrium remains constant over time, so the probabilities are not time-dependent. Mathematically, we say that the sum of the probabilities of all allowed microstates must equal 1 (1.00 = 100% probability). Likewise, the probability of a particle being in any one of the W permissible microstates is: P = 1 / W

Statistical definition of entropy One more important rule: the Equipartition Theorem, which says that for any system in thermal equilibrium, the total kinetic and potential energy is equally partitioned among all particles, and among all degrees of freedom available to each particle. In most situations, the “degrees of freedom” are vibrational, rotational, and translational motion.

Statistical definition of entropy Aside: the equipartition theorem fails miserably(!) for electromagnetic energy. It was the total failure of the ET to accurately model emitted radiation that forced scientists to rethink some things at the atomic level and to propose that some types of energy may exist in quantized, rather than continuous, states.

Statistical definition of entropy But for our purposes (kinetic and potential energy) the ET works very well! If we simplify things a little by noting that most atoms at STP exist at only their vibrational ground states, we can use the ET to derive two VERY important mathematical relationships for the translational motion of particles. (The derivation of these results is beyond the scope of this class.)

Statistical definition of entropy Result #1 – The “partition function,” which tells us the number of microstates of a system that are thermally accessible at a given temperature: q = Σ_i e^(−E_i / kT), where E_i = energy of the i-th microstate, k = Boltzmann’s constant, and T = temperature.
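As a small numerical sketch of this idea (not from the slides – the two-level system and its energies below are hypothetical values chosen purely for illustration), the partition function for a set of discrete energy levels can be computed directly:

```python
import math

K_B = 1.380649e-23  # Boltzmann's constant, J/K

def partition_function(energies, T):
    """q = sum over levels of exp(-E_i / kT): a measure of how many
    microstates are thermally accessible at temperature T."""
    return sum(math.exp(-E / (K_B * T)) for E in energies)

# Hypothetical two-level system: ground state plus one excited state
levels = [0.0, 2.0e-21]  # energies in joules (illustrative)
q = partition_function(levels, 298.0)
# q lies between 1 (only the ground state accessible)
# and 2 (both states equally accessible)
```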

Statistical definition of entropy Result #2 – The average translational kinetic energy of a molecule of ideal gas: E_avg = (3/2) kT, or ½ kT for each dimensional degree of freedom (x, y, z). By the way, Boltzmann’s constant k = 1.38 × 10⁻²³ J/K = R (ideal gas constant) / Avogadro’s number.
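To get a feel for the scale of these quantities, here is a quick numerical check (R and Avogadro’s number are standard constants, not taken from the slides):

```python
K_B = 1.380649e-23   # Boltzmann's constant, J/K
R = 8.314            # ideal gas constant, J/(mol K)
N_A = 6.022e23       # Avogadro's number, 1/mol

# k = R / N_A, as stated above
k_from_R = R / N_A   # agrees with K_B to within rounding

# Average translational kinetic energy of one gas molecule at 298 K:
avg_ke = 1.5 * K_B * 298.0   # (3/2) kT, roughly 6.2e-21 J
```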

Okay, this is all very interesting … but what happened to entropy? Good question! Fortunately we have an answer. By using our definition of the probability distribution for the i-th microstate of a system in an equilibrium state: P_i = e^(−E_i / kT) / q
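As an illustration (the energy levels below are hypothetical values), the Boltzmann probabilities are obtained by normalizing the exponential factors so they sum to 1, as the equilibrium assumption requires:

```python
import math

K_B = 1.380649e-23  # Boltzmann's constant, J/K

def boltzmann_probabilities(energies, T):
    """P_i = exp(-E_i / kT) / q for each energy level."""
    factors = [math.exp(-E / (K_B * T)) for E in energies]
    q = sum(factors)          # the partition function
    return [f / q for f in factors]

# Three illustrative energy levels (joules)
probs = boltzmann_probabilities([0.0, 1.0e-21, 2.0e-21], 298.0)
# lower-energy states are more probable, and the probabilities sum to 1
```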

What it all boils down to … … we can show that the function which most accurately quantifies total randomness is: S = k ln W. Because S is defined in terms of the natural logarithm of the total number of permissible microstates (W): → the entropies of sub-systems can be directly added, since microstate counts of independent subsystems multiply: ln(a·b) = ln(a) + ln(b) – compare Hess’s Law → S is maximized when W is maximized and minimized when W is minimized. Also, when W = 1, S = 0.
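A short numerical sketch of S = k ln W (the microstate counts below are arbitrary illustrative numbers): because microstate counts of independent subsystems multiply, their entropies add.

```python
import math

K_B = 1.380649e-23  # Boltzmann's constant, J/K

def entropy(W):
    """Statistical entropy S = k ln W for W permissible microstates."""
    return K_B * math.log(W)

# Arbitrary microstate counts for two independent subsystems
W_a, W_b = 1.0e6, 2.0e9

# ln(Wa * Wb) = ln(Wa) + ln(Wb), so entropies add directly
combined = entropy(W_a * W_b)
summed = entropy(W_a) + entropy(W_b)

# A single microstate (W = 1) means zero entropy: the ground state
s_ground = entropy(1)
```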

Finally The minimum condition for entropy (S = 0 when W = 1, i.e. the ground state) leads us directly to the Third Law of Thermodynamics: “The absolute entropy of a system at zero kelvin (a perfectly ordered ground-state crystal with zero kinetic energy) is zero.”

Absolute entropy The third law of thermodynamics allows us to empirically determine a value of absolute entropy for a chemical element or compound based on its molecular structure and kinetic energy. This is the value “standard absolute entropy” S° that you will find in thermo data tables.

Formal definitions of entropy Second formal definition, from observables. From observing phase changes, we note that as the total heat energy q_p of a system increases, the randomness of the system’s molecules increases proportionally: ΔS ∝ q_p. Phase changes are a good system to observe, since they occur at a constant T.

Since the entropy change produced by a given amount of heat depends on the absolute temperature at which the heat is transferred, a more accurate expression is: ΔS = q_p / T. We can also define: ΔS = S final − S initial
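For example, ΔS = q_p / T applied to ice melting at its normal melting point, using the commonly tabulated heat of fusion of roughly 6.01 kJ/mol (treat the numbers as illustrative):

```python
# Heat absorbed at constant pressure when one mole of ice melts,
# and the (constant) temperature of the phase change:
q_p = 6010.0   # J/mol, approximate tabulated heat of fusion of ice
T = 273.15     # K, normal melting point of ice

delta_S_fusion = q_p / T   # about 22 J/(mol K): disorder increases
```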

or more formally: ΔS system = S final system state − S initial system state. Also, for any thermodynamic system: ΔS universe = ΔS surroundings + ΔS system

But wait … there’s more! (+) ΔS system = increase in system entropy, and therefore an increase in system disorder. This is the most likely scenario to occur naturally. (−) ΔS system = decrease in system entropy, and therefore a decrease in system disorder. Not likely to occur naturally; usually an indication that work was done on the system.

You’ll also get … the ability to make basic spontaneity predictions (these are thermodynamic, not kinetic, statements): Recall: ΔS universe = ΔS surroundings + ΔS system. If ΔS univ > 0 → spontaneous forward reaction. If ΔS univ < 0 → non-spontaneous forward, but spontaneous reverse reaction. If ΔS univ = 0 → surroundings and system are in equilibrium.
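The sign rules above are easy to encode; this small helper (hypothetical, not from the slides) simply restates them:

```python
def classify(delta_S_univ):
    """Classify a process by the sign of its total entropy change."""
    if delta_S_univ > 0:
        return "forward reaction spontaneous"
    if delta_S_univ < 0:
        return "reverse reaction spontaneous"
    return "at equilibrium"
```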

Problem solving for entropy values Hess’s Law allows us to take tabulated values for the standard absolute entropy of products and reactants and combine them to find entropy changes for reactions: ΔS° (reaction) = ∑ n S° (products) − ∑ n S° (reactants). Recall that for our purposes, thermodynamic systems are generally chemical reactions (formation, combustion, etc.)
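Here is a sketch of that bookkeeping for the ammonia synthesis N2(g) + 3 H2(g) → 2 NH3(g). The S° values below are close to commonly tabulated ones, but treat them as illustrative:

```python
# Standard absolute entropies, J/(mol K) (illustrative table values)
S_STD = {"N2": 191.6, "H2": 130.7, "NH3": 192.5}

def delta_S_rxn(products, reactants, table=S_STD):
    """ΔS°(rxn) = Σ n S°(products) − Σ n S°(reactants)."""
    total = lambda side: sum(n * table[species] for n, species in side)
    return total(products) - total(reactants)

# N2(g) + 3 H2(g) → 2 NH3(g): four moles of gas become two,
# so we expect the system's entropy to decrease
dS = delta_S_rxn(products=[(2, "NH3")], reactants=[(1, "N2"), (3, "H2")])
```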

Problem solving for entropy values Reaction spontaneity can be determined from entropy values, but we must know the entropy changes for both the system (reaction) and the surroundings. This is not as difficult as it may seem: Universe = surroundings + system = surroundings + reaction, so ΔS univ = ΔS surroundings + ΔS° reaction.

Problem solving for entropy values Let’s think about this … we measure changes to the system via changes to the surroundings. Specifically: heat of surroundings = −(heat of system). In other words, q p(surr) = −q p(syst) = −q p(rxn) = −ΔH (rxn)

Problem solving for entropy values Or, using our heat/temperature definition of ΔS: ΔS surr = −(ΔH rxn / T). It is very common to see temperature conditions given for the reaction, either “STP” or “the reaction occurs at 298 K.”
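Putting the pieces together for an exothermic reaction at 298 K (the numbers are chosen to roughly match the ammonia synthesis, but are purely illustrative):

```python
# Illustrative values for an exothermic reaction at 298 K:
delta_H_rxn = -92200.0   # J/mol (heat released by the reaction)
delta_S_rxn = -198.7     # J/(mol K) (system entropy decreases)
T = 298.0                # K

delta_S_surr = -delta_H_rxn / T              # released heat disorders the surroundings
delta_S_univ = delta_S_surr + delta_S_rxn    # total entropy change

# delta_S_univ > 0 here, so the forward reaction is spontaneous at 298 K
# even though the system's own entropy decreases
```

Note the logic: the system becomes more ordered, but the heat it dumps into the surroundings creates more than enough disorder there to make ΔS of the universe positive.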