1 Introduction to Entropy

2 Entropy A formal attempt to quantify randomness by relating it to the amount of heat “wasted” by a thermodynamic process. The measure of microscopic uncertainty (or “mixed-up-ness” – Gibbs) that remains after macroscopic observables such as temperature, pressure, and volume have been measured.

3 “Wasted” heat “Wasted” heat refers to the heat released by a thermodynamic process that the system cannot use to do work. This heat is present, but unavailable for work. Equivalently, it is the heat needed to do the work required to keep the system in order. Example: heat loss in an internal combustion engine.

4 Order to disorder Nature tends to favor systems that move from an ordered state to a disordered state, so randomness is always increasing. To create a stable (that is, ordered) thermodynamic system, we must “siphon off” some of the heat energy and use it to do the work required to keep the system in order. This energy/work is NOT available to help the system itself do work on its surroundings. Thus, entropy. Remember – work is a transfer of energy, not a quantity of energy! Work is not a state function.

5 Disorder in chemical reactions Chemical systems tend to move toward equilibrium. Equilibrium – state where reactants and products are present in concentrations that have no further tendency to change with time. In equilibrium, all reactions are continuously reversible; forward and backward reaction rates are equal Think of an “ion soup” where reactants and products are bonding and dissociating continuously

6 Equilibrium = Disorder! The equilibrium state of a chemical reaction is its highest possible entropy state. The rates of the forward and reverse reactions (bonding and dissociation) are equal. Therefore, on a molecular level, the highest state of disorder exists at equilibrium (even though we think of equilibrium as “balance”).

7 Equilibrium = Disorder! The equilibrium state is the highest-entropy state for a system because it allows the largest number of possible microstates. Entropy is also maximized at equilibrium because the equilibrium state retains no trace of the initial conditions of the system. Example – a glass of water at thermal equilibrium with the room: system, surroundings, and universe are all at thermal equilibrium. What was the initial temperature of the water in the glass? We have no way of knowing.
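
To see this concretely, here is a minimal Python sketch (not from the original slides): take N particles that can each sit in the left or right half of a box. The number of microstates giving the macrostate “k particles on the left” is the binomial coefficient C(N, k), and it peaks at the even split – the equilibrium macrostate.

```python
from math import comb

N = 6  # number of particles (kept small for illustration)

# For each macrostate "k particles in the left half", count its microstates.
for k in range(N + 1):
    W = comb(N, k)
    print(f"{k} left / {N - k} right : W = {W:2d}  {'#' * W}")

# W peaks at the even split (k = 3): the equilibrium macrostate is the one
# with the most microstates, i.e. the highest-entropy, most probable state.
```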

8 Formal definitions of entropy First formal definition, via statistics Statistically, entropy involves defining a numerical quantity that measures randomness or uncertainty relating to the number of ways a system’s molecules can be arranged. Macro state: readily observed characteristics such as T, P, V, n Micro state: position, velocity, energy for each individual molecule. This must be computed (or approximated) via probability/statistics.

9 Statistical definition of entropy Statistical thermodynamics is based on the fundamental assumption that all possible micro configurations of a given system, which satisfy the macro boundary conditions such as temperature, pressure, volume and number of particles, are equally likely to occur in a system that is in thermal equilibrium. For our purposes, we will consider thermal and chemical equilibrium to be equivalent

10 Statistical definition of entropy In other words, any micro particle in a system in thermal/chemical equilibrium has an equally probable chance of being in any permissible microstate. The total energy of a system in equilibrium remains constant over time, so the probabilities are not time-dependent. Mathematically, we say that the sum of the probabilities of all allowed microstates must equal 1 (1.00 = 100% probability). Likewise, the probability of a particle being in any one of W equally likely microstates over a given fraction of time is: P_i = 1/W

11 Statistical definition of entropy One more important rule: The Equipartition Theorem, which says that for any system in thermal equilibrium, the total kinetic and potential energies are equally partitioned among all particles, and among all degrees of freedom available to each particle. In most situations, “degrees of freedom” involve vibrational, rotational, and translational motion.
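
A minimal Monte Carlo sketch in Python (an illustration, not from the slides) of equipartition for translational motion: assuming each velocity component of a gas molecule follows the Maxwell–Boltzmann (normal) distribution with variance kT/m, the average kinetic energy per component comes out to roughly ½ kT. The molecular mass below is an assumed value, roughly that of N2.

```python
import numpy as np

k = 1.381e-23   # Boltzmann's constant, J/K
T = 300.0       # K
m = 4.65e-26    # kg, roughly the mass of one N2 molecule (assumed for illustration)

rng = np.random.default_rng(0)
# Each velocity component is normally distributed with variance kT/m.
vx = rng.normal(0.0, np.sqrt(k * T / m), size=1_000_000)

avg_ke_per_dof = 0.5 * m * np.mean(vx**2)
print(avg_ke_per_dof)   # ~2.07e-21 J
print(0.5 * k * T)      # the equipartition prediction: (1/2) kT per degree of freedom
```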

12 Statistical definition of entropy Aside: the equipartition theorem fails miserably(!) for electromagnetic energy. It was the total failure of the ET to accurately model emitted radiation that forced scientists to rethink some things at the atomic level and to propose that some types of energy may exist in quantized, rather than continuous, states.

13 Statistical definition of entropy But for our purposes (kinetic and potential energy) the ET works very well! If we simplify things a little by noting that at STP most molecules occupy only their vibrational ground state, we can use the ET to derive two VERY important mathematical relationships for the translational motion of particles. (The derivation of these results is beyond the scope of this class.)

14 Statistical definition of entropy Result #1 – The “partition function,” which tells us the number of microstates of a system that are thermally accessible at a given temperature: q = Σ e^(−E_i / kT), summed over all allowed states i, where E_i = energy of the i-th state, k = Boltzmann’s constant, and T = temperature
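
A minimal Python sketch of evaluating the partition function for a hypothetical set of evenly spaced energy levels (the levels below are made up for illustration): q grows with temperature as more microstates become thermally accessible.

```python
import numpy as np

k = 1.381e-23  # Boltzmann's constant, J/K

def partition_function(energies, T):
    """q = sum over allowed states of exp(-E_i / (k T))."""
    energies = np.asarray(energies, dtype=float)
    return float(np.exp(-energies / (k * T)).sum())

# Hypothetical, evenly spaced energy levels: 0, 1, 2, 3, 4 (x 1e-21 J)
levels = np.arange(5) * 1.0e-21

for T in (100.0, 300.0, 1000.0):
    print(f"T = {T:6.0f} K  ->  q = {partition_function(levels, T):.2f}")
# q grows with temperature: more microstates become thermally accessible.
```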

15 Statistical definition of entropy Result #2 – The average translational kinetic energy of a molecule of ideal gas: E_avg = (3/2) kT, or ½ kT for each dimensional degree of freedom (x, y, z) By the way, Boltzmann’s constant k = 1.381 × 10⁻²³ J/K = R (ideal gas constant) / Avogadro’s number
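
A quick numerical check in Python of both claims: k really is R/N_A, and (3/2)kT gives the average translational kinetic energy per molecule (evaluated here at 298 K, an assumed temperature for illustration).

```python
R = 8.314        # ideal gas constant, J/(mol*K)
N_A = 6.022e23   # Avogadro's number, 1/mol

k = R / N_A
print(f"k = R / N_A = {k:.3e} J/K")      # ~1.381e-23 J/K

T = 298.0                                 # assumed temperature, K
E_avg = 1.5 * k * T                       # (3/2) kT per molecule
print(f"average translational KE per molecule: {E_avg:.2e} J")
print(f"per mole: {1.5 * R * T / 1000:.2f} kJ/mol")   # (3/2) RT
```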

16 Okay, this is all very interesting … but what happened to entropy? Good question! Fortunately we have an answer. By using our definition of the probability distribution for the i-th state of a system at equilibrium, P_i = e^(−E_i / kT) / q …

17 What it all boils down to … … we can show that the function which most accurately quantifies total randomness is: S = k ln W Because S is defined in terms of the natural logarithm of the total number of permissible microstates (W): → the entropies of sub-systems can be directly added, because microstate counts multiply and ln(W_a × W_b) = ln(W_a) + ln(W_b) – this is what lets us apply Hess’s Law to entropy → S is maximized when W is maximized and minimized when W is minimized. Also, when W = 1, S = 0.
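
A minimal Python sketch of S = k ln W and its additivity (the microstate counts below are made up for illustration): when two independent sub-systems are combined, their microstate counts multiply and their entropies add.

```python
import math

k = 1.381e-23  # Boltzmann's constant, J/K

def S(W):
    """Boltzmann entropy, S = k ln W."""
    return k * math.log(W)

# Two independent sub-systems: microstate counts multiply, entropies add.
W_a, W_b = 1_000, 50_000
print(S(W_a * W_b))        # entropy of the combined system
print(S(W_a) + S(W_b))     # same value (up to floating-point rounding)

print(S(1))                # W = 1  ->  S = 0 (perfectly ordered ground state)
```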

18 Finally The minimum condition for entropy (S=0 when W=1; i.e. the ground state) leads us directly to the Third Law of Thermodynamics: “The absolute entropy of a system at zero Kelvin (a perfectly ordered ground state crystal with zero kinetic energy) is zero.”

19 Absolute entropy The third law of thermodynamics allows us to empirically determine a value of absolute entropy for a chemical element or compound based on its molecular structure and kinetic energy. This is the value “standard absolute entropy” S° that you will find in thermo data tables.

20 Formal definitions of entropy Second formal definition, from observables From observing phase changes, we note that as the heat energy q_p (heat absorbed at constant pressure) of a system increases, the randomness of the system’s molecules increases proportionally: ΔS ∝ q_p Phase changes are a good system to observe, since they occur at a constant T.

21 Since the entropy change produced by a given amount of heat depends on the temperature at which the heat is transferred, a more accurate expression is: ΔS = q_p / T We can also define: ΔS = S_final – S_initial
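
A minimal worked example in Python: the entropy change for melting one mole of ice at its melting point, using ΔS = q_p / T with the textbook value ΔH_fus ≈ 6.01 kJ/mol (an assumed value – check your own data table).

```python
# Entropy change for melting one mole of ice at its melting point,
# using dS = q_p / T with the assumed textbook value dH_fus ~ 6.01 kJ/mol.
dH_fus = 6010.0   # J/mol, heat absorbed at constant pressure (q_p)
T_melt = 273.15   # K, constant during the phase change

dS_fus = dH_fus / T_melt
print(f"dS_fus ~ {dS_fus:.1f} J/(mol*K)")  # ~22 J/(mol*K); positive, solid -> liquid is disordering
```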

22 or more formally: ΔS_system = S_(final system state) – S_(initial system state) Also, for any thermodynamic system: ΔS_universe = ΔS_surroundings + ΔS_system

23 But wait … there’s more! (+) ΔS_system = increase in system entropy, therefore an increase in system disorder. This is the most likely scenario to occur naturally. (-) ΔS_system = decrease in system entropy, therefore a decrease in system disorder. Not likely to occur naturally; usually an indication that work was done on the system.

24 You’ll also get … the ability to make basic spontaneity predictions: Recall: ΔS_universe = ΔS_surroundings + ΔS_system If ΔS_univ > 0 → spontaneous forward reaction If ΔS_univ < 0 → non-spontaneous forward, but spontaneous reverse reaction If ΔS_univ = 0 → surroundings and system are in equilibrium
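
These sign rules translate directly into a tiny Python helper (an illustration; the test values are made up):

```python
def spontaneity(dS_univ):
    """Classify a reaction from the sign of dS_universe (in J/K)."""
    if dS_univ > 0:
        return "spontaneous forward reaction"
    if dS_univ < 0:
        return "non-spontaneous forward, spontaneous reverse reaction"
    return "system and surroundings are at equilibrium"

for dS in (110.0, -45.0, 0.0):   # illustrative values, J/K
    print(f"dS_univ = {dS:6.1f} J/K -> {spontaneity(dS)}")
```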

25 Problem solving for entropy values Hess’s Law allows us to take tabulated values for standard absolute entropy of products and reactants, and combine them to find entropy changes for reactions: ΔS°(reaction) = Σ n S°(products) – Σ n S°(reactants) Recall that for our purposes, thermodynamic systems are generally chemical reactions (formation, combustion, etc.)
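
A minimal Python sketch of the Hess’s Law sum for the ammonia synthesis N2 + 3 H2 → 2 NH3, using approximate tabulated S° values (assumed here from common textbook tables – verify against your own data table):

```python
# dS°(rxn) = sum(n * S°(products)) - sum(n * S°(reactants))
# Example: N2(g) + 3 H2(g) -> 2 NH3(g), with approximate tabulated S° values
# in J/(mol*K) -- check them against your own thermo data table.
S_std = {"N2": 191.6, "H2": 130.7, "NH3": 192.5}

reactants = [(1, "N2"), (3, "H2")]   # (stoichiometric coefficient, species)
products = [(2, "NH3")]

dS_rxn = (sum(n * S_std[sp] for n, sp in products)
          - sum(n * S_std[sp] for n, sp in reactants))
print(f"dS°(rxn) ~ {dS_rxn:.1f} J/K")  # negative: 4 mol of gas -> 2 mol of gas, less disorder
```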

26 Problem solving for entropy values Reaction spontaneity can be determined from entropy values, but we must know entropy for the system (reaction) and the surroundings. But this is not as difficult as it may seem. Universe = surroundings + system = surroundings + reaction ΔS_univ = ΔS_surroundings + ΔS°_reaction

27 Problem solving for entropy values Let’s think about this … we measure changes to the system via changes to the surroundings, specifically: heat of surroundings = -(heat of system) In other words, q_p(surr) = -q_p(syst) = -q_p(rxn) = -ΔH_rxn

28 Problem solving for entropy values Or, using our heat/temperature definition of ΔS … ΔS_surr = -(ΔH_rxn / T) It is very common to see the temperature conditions given for the reaction, either “STP” or “the reaction occurs at 298 K.”
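
Pulling the pieces together, a minimal Python sketch for the ammonia synthesis above at 298 K, using approximate textbook values (ΔH°_rxn ≈ -92.2 kJ is an assumed value – verify against your own data table):

```python
# Putting it together for N2 + 3 H2 -> 2 NH3 at 298 K, with approximate
# textbook values (assumed here -- verify against your own data table).
dH_rxn = -92.2e3   # J, exothermic
dS_rxn = -198.7    # J/K, from the Hess's Law sum above
T = 298.0          # K

dS_surr = -dH_rxn / T          # heat released by the reaction warms the surroundings
dS_univ = dS_surr + dS_rxn

print(f"dS_surr ~ {dS_surr:.1f} J/K")  # ~ +309 J/K
print(f"dS_univ ~ {dS_univ:.1f} J/K")  # > 0 -> forward reaction is spontaneous at 298 K
```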

