Introduction to Entropy
Entropy A formal attempt to quantify randomness by relating it to the amount of heat “wasted” by a thermodynamic process. It is the measure of microscopic uncertainty (or “mixed-up-ness” – Gibbs) that remains after macroscopic observables such as temperature, pressure, and volume have been measured.
“Wasted” heat “Wasted” heat is the heat released by a thermodynamic process that the system cannot use to do work. This heat is present, but unavailable for work. Equivalently, it is the heat that must go into the work required to keep the system in order. Example: heat loss in an internal combustion engine.
Order to disorder Nature tends to favor systems that move from an ordered state to a disordered state, so the total randomness of the universe is always increasing. To create a stable (that is, ordered) thermodynamic system, we must “siphon off” some of the heat energy and use it to do the work required to keep the system in order. That energy is NOT available to help the system do work on its surroundings. Thus, entropy. Remember – work is a transfer of energy, not a quantity of energy! Work is not a state function.
Disorder in chemical reactions Chemical systems tend to move toward equilibrium. Equilibrium: the state in which reactants and products are present in concentrations that have no further tendency to change with time. At equilibrium, reactions are continuously reversible; the forward and backward reaction rates are equal. Think of an “ion soup” in which reactants and products are bonding and dissociating continuously.
Equilibrium = Disorder! The equilibrium state of a chemical reaction is its highest possible entropy state. The forward and reverse reaction rates (bonding and dissociation) are equal, so on a molecular level the highest state of disorder exists at equilibrium (even though we usually think of equilibrium as “balance”).
Equilibrium = Disorder! The equilibrium state is the highest-entropy state for a system because equilibrium allows the largest number of possible microstates. Entropy is also maximized at equilibrium because the equilibrium state retains no trace of the initial conditions of the system. Example: a glass of water at thermal equilibrium with the room – system, surroundings, and universe are all at thermal equilibrium. What was the initial temperature of the water in the glass? We have no way of knowing.
Formal definitions of entropy First formal definition, via statistics. Statistically, entropy is a numerical quantity that measures the randomness or uncertainty in the number of ways a system’s molecules can be arranged. Macrostate: readily observed characteristics such as T, P, V, and n. Microstate: the position, velocity, and energy of each individual molecule; this must be computed (or approximated) via probability and statistics.
Statistical definition of entropy Statistical thermodynamics rests on the fundamental assumption that all possible microscopic configurations of a given system that satisfy the macroscopic boundary conditions (temperature, pressure, volume, and number of particles) are equally likely to occur in a system in thermal equilibrium. For our purposes, we will treat thermal and chemical equilibrium as equivalent.
Statistical definition of entropy In other words, any particle in a system in thermal/chemical equilibrium has an equal chance of being in any permissible microstate. The total energy of a system in equilibrium remains constant over time, so these probabilities are not time-dependent. Mathematically, the sum of the probabilities of all allowed microstates must equal 1 (1.00 = 100% probability). Likewise, the probability of a particle being in any one of the W permissible microstates (equivalently, the fraction of time it spends in that microstate) is: p_i = 1/W
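A quick numerical illustration of the equal-probability rule (a minimal sketch; the value of W here is made up purely for illustration):

```python
# Minimal sketch: equal-probability rule for a system in equilibrium.
# W (the number of permissible microstates) is a made-up illustrative value.
W = 10_000
p = 1 / W                # probability of any single microstate: p_i = 1/W
total = W * p            # sum of probabilities over all W microstates

print(f"p_i = {p}")      # 0.0001
print(f"sum = {total}")  # 1.0 (the probabilities of all microstates sum to 1)
```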
Statistical definition of entropy One more important rule: the Equipartition Theorem, which says that for any system in thermal equilibrium, the total kinetic and potential energy is, on average, shared equally among all particles and among all degrees of freedom available to each particle. In most situations, the “degrees of freedom” are vibrational, rotational, and translational motion.
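A small sketch of the equipartition idea in practice, using the usual textbook degree-of-freedom counts (3 translational for a monatomic gas; roughly 2 additional rotational for a diatomic gas near room temperature) and the average energy ⟨E⟩ = (f/2)·kT:

```python
# Sketch: average thermal energy per molecule from equipartition, <E> = (f/2) k T.
k = 1.380649e-23   # Boltzmann's constant, J/K
T = 298.0          # temperature, K

def avg_energy(dof: int, temperature: float) -> float:
    """Average energy per molecule with `dof` active degrees of freedom."""
    return 0.5 * dof * k * temperature

print(avg_energy(3, T))  # monatomic gas: 3 translational DOF -> (3/2) k T
print(avg_energy(5, T))  # diatomic gas near room T: plus 2 rotational DOF -> (5/2) k T
```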
Statistical definition of entropy Aside: the equipartition theorem fails miserably(!) for electromagnetic energy. It was the total failure of the equipartition theorem to accurately model blackbody (thermal) radiation that forced scientists to rethink some things at the atomic level and to propose that some types of energy may exist in quantized, rather than continuous, states.
Statistical definition of entropy But for our purposes (kinetic and potential energy) the equipartition theorem works very well! If we simplify things a little by noting that most molecules at STP occupy only their vibrational ground states, we can use it to derive two VERY important mathematical relationships for the translational motion of particles. (The derivation of these results is beyond the scope of this class.)
Statistical definition of entropy Result #1 – The “partition function” that tells us the number of microstates of a system that are thermally accessible at a given temperature: Z = Σ_i e^(−E_i / kT), where E_i = energy of the i-th permissible level, k = Boltzmann’s constant, and T = temperature.
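A minimal sketch of evaluating this partition function for a hypothetical set of energy levels (the levels and the temperature below are invented for illustration only):

```python
import math

k = 1.380649e-23              # Boltzmann's constant, J/K
T = 300.0                     # temperature, K
E = [0.0, 2.0e-21, 4.0e-21]   # hypothetical energy levels E_i, in joules

# Partition function: Z = sum_i exp(-E_i / kT)
Z = sum(math.exp(-Ei / (k * T)) for Ei in E)

# Fraction of particles expected in each level (Boltzmann populations)
populations = [math.exp(-Ei / (k * T)) / Z for Ei in E]

print(f"Z = {Z:.3f}")
print(populations)            # higher-energy levels are less populated
```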
Statistical definition of entropy Result #2 – The average translational kinetic energy of a molecule of ideal gas: ⟨KE⟩ = (3/2) kT, or ½ kT for each dimensional degree of freedom (x, y, z). By the way, Boltzmann’s constant k = 1.38 × 10⁻²³ J/K = R (ideal gas constant) / Avogadro’s number.
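A quick numerical check of Result #2 and of the relation k = R / N_A, computed with the standard constants (values are per molecule unless noted):

```python
# Sketch: average translational kinetic energy of an ideal-gas molecule, <KE> = (3/2) k T
R  = 8.314462       # ideal gas constant, J/(mol*K)
NA = 6.02214076e23  # Avogadro's number, 1/mol
k  = R / NA         # Boltzmann's constant ~ 1.38e-23 J/K

T = 298.0           # K
avg_ke = 1.5 * k * T                          # per molecule
print(f"k = {k:.3e} J/K")                     # ~1.381e-23
print(f"<KE> = {avg_ke:.3e} J per molecule")  # ~6.17e-21 J
print(f"per mole: {1.5 * R * T:.0f} J/mol")   # (3/2) R T ~ 3716 J/mol
```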
Okay, this is all very interesting … but what happened to entropy? Good question! Fortunately we have an answer. Using our definitions of the probability distribution for the i-th particle in an equilibrium-state system (each of the W permissible microstates is equally probable, and the probabilities sum to 1) …
What it all boils down to … … we can show that the function which most accurately quantifies total randomness is: S = k ln W. Because S is defined in terms of the natural logarithm of the total number of permissible microstates W: → the entropies of sub-systems can be directly added, since microstate counts multiply for independent sub-systems and ln(a·b) = ln(a) + ln(b) – the same additive bookkeeping as Hess’s Law; → S is maximized when W is maximized and minimized when W is minimized. Also, when W = 1, S = 0.
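A small sketch of why the logarithm makes entropy additive (the W values below are hypothetical; for independent sub-systems the microstate counts multiply, W_total = W_A × W_B):

```python
import math

k = 1.380649e-23   # Boltzmann's constant, J/K

def S(W: float) -> float:
    """Boltzmann entropy S = k ln W."""
    return k * math.log(W)

W_A, W_B = 1.0e10, 5.0e12            # made-up microstate counts for two sub-systems
S_sum   = S(W_A) + S(W_B)            # add the sub-system entropies
S_total = S(W_A * W_B)               # entropy of the combined system (counts multiply)

print(math.isclose(S_sum, S_total))  # True: ln(W_A * W_B) = ln(W_A) + ln(W_B)
print(S(1.0))                        # 0.0: a single microstate means zero entropy
```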
Finally The minimum condition for entropy (S=0 when W=1; i.e. the ground state) leads us directly to the Third Law of Thermodynamics: “The absolute entropy of a system at zero Kelvin (a perfectly ordered ground state crystal with zero kinetic energy) is zero.”
Absolute entropy The third law of thermodynamics allows us to determine a value of absolute entropy for a chemical element or compound based on its molecular structure and kinetic energy. This is the “standard absolute entropy” S° (standard molar entropy) that you will find in thermodynamic data tables.
Formal definitions of entropy Second formal definition, from observables. From observing phase changes, we note that as the heat q_p absorbed by a system increases, the randomness of the system’s molecules increases proportionally: ΔS ∝ q_p. Phase changes are a good system to observe, since they occur at constant T.
Since the amount of disorder produced by a given quantity of heat depends on the temperature at which the heat is transferred, a more accurate expression is: ΔS = q_p / T. We can also define: ΔS = S_final − S_initial
or more formally: ΔS_system = S(final system state) − S(initial system state). Also, for any thermodynamic system: ΔS_universe = ΔS_surroundings + ΔS_system
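A worked example of ΔS = q_p / T for a phase change: melting one mole of ice at its melting point, using the commonly tabulated heat of fusion ΔH_fus ≈ 6.01 kJ/mol (check your own data table for the value your course uses):

```python
# Sketch: entropy change for melting one mole of ice at constant T and P.
dH_fus = 6010.0     # J/mol, heat absorbed at constant pressure (q_p)
T      = 273.15     # K, melting point of ice

dS = dH_fus / T     # delta-S = q_p / T
print(f"dS = {dS:.1f} J/(mol*K)")  # ~22.0 J/(mol*K); positive, as expected for solid -> liquid
```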
But wait … there’s more! (+) ΔS_system = increase in system entropy, therefore an increase in system disorder. This is the most likely scenario to occur naturally. (−) ΔS_system = decrease in system entropy, therefore a decrease in system disorder. Not likely to occur naturally; usually an indication that work was done on the system.
You’ll also get … The ability to make basic spontaneity predictions (a thermodynamic statement, not a kinetic one – it says nothing about how fast the reaction goes). Recall: ΔS_universe = ΔS_surroundings + ΔS_system. If ΔS_univ > 0 → spontaneous forward reaction. If ΔS_univ < 0 → non-spontaneous forward, but spontaneous backward reaction. If ΔS_univ = 0 → surroundings and system are in equilibrium.
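A minimal sketch of this sign test (the function name and the example value are my own; the criterion itself is the one stated above):

```python
def spontaneity(dS_universe: float) -> str:
    """Classify a reaction from the sign of delta-S of the universe."""
    if dS_universe > 0:
        return "spontaneous in the forward direction"
    if dS_universe < 0:
        return "non-spontaneous forward (spontaneous in reverse)"
    return "system and surroundings are at equilibrium"

print(spontaneity(+111.0))  # a positive example value -> spontaneous forward
```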
Problem solving for entropy values Hess’s Law allows us to take tabulated values of standard absolute entropy for products and reactants and combine them to find entropy changes for reactions: ΔS°(reaction) = ∑ n S°(products) − ∑ n S°(reactants). Recall that for our purposes, thermodynamic systems are generally chemical reactions (formation, combustion, etc.)
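A sketch of the products-minus-reactants sum for a familiar reaction, N2(g) + 3 H2(g) → 2 NH3(g), using approximate textbook S° values (use the values from your own data table for graded work):

```python
# Sketch: delta-S(reaction) = sum(n * S(products)) - sum(n * S(reactants))
# Standard molar entropies in J/(mol*K), approximate textbook values.
S_standard = {"N2": 191.6, "H2": 130.7, "NH3": 192.8}

reactants = [(1, "N2"), (3, "H2")]   # (stoichiometric coefficient n, species)
products  = [(2, "NH3")]

dS_rxn = (sum(n * S_standard[sp] for n, sp in products)
          - sum(n * S_standard[sp] for n, sp in reactants))

print(f"dS_rxn = {dS_rxn:.1f} J/K")  # ~ -198 J/K: fewer gas molecules, so entropy drops
```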
Problem solving for entropy values Reaction spontaneity can be determined from entropy values, but we must know the entropy change for both the system (the reaction) and the surroundings. This is not as difficult as it may seem. Universe = surroundings + system = surroundings + reaction, so ΔS_univ = ΔS_surroundings + ΔS°_reaction
Problem solving for entropy values Let’s think about this … we measure changes to the system via changes to the surroundings. Specifically: heat gained by the surroundings = −(heat lost by the system). In other words, q_p(surr) = −q_p(syst) = −q_p(rxn) = −ΔH_rxn
Problem solving for entropy values Or, using our heat/temperature definition of ΔS: ΔS_surr = −ΔH_rxn / T. It is very common to see temperature conditions given for the reaction, either “STP” or “the reaction occurs at 298 K.”
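Putting the last few slides together for the same ammonia example: ΔH°_rxn ≈ −92.2 kJ for N2 + 3 H2 → 2 NH3 is the usual textbook value, and the ΔS°_rxn figure comes from the products-minus-reactants sketch above:

```python
# Sketch: is N2 + 3 H2 -> 2 NH3 spontaneous at 298 K?
T       = 298.0     # K
dH_rxn  = -92.2e3   # J, approximate textbook value (exothermic)
dS_rxn  = -198.1    # J/K, from the products-minus-reactants sum above

dS_surr = -dH_rxn / T        # heat released by the reaction warms the surroundings
dS_univ = dS_surr + dS_rxn   # delta-S(universe) = surroundings + system

print(f"dS_surr = {dS_surr:.0f} J/K")  # ~ +309 J/K
print(f"dS_univ = {dS_univ:.0f} J/K")  # ~ +111 J/K > 0 -> spontaneous forward at 298 K
```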