What is Entropy?

The second law of thermodynamics involves a thermodynamic quantity we call entropy (S). Entropy is often described as a measure of the disorder of a system; it is measured in joules per kelvin (J K–1). The second law of thermodynamics states that the entropy of the universe is always increasing.

One consequence of the second law of thermodynamics is that in any engine there will be some energy lost as heat that cannot be harnessed to do work. We observe this in our everyday lives. If you touch the hood of your car while the engine is running, the hood of the car will feel hot. This is because some of the energy from your car engine is lost as heat. Because of this, the second law of thermodynamics explains why a perpetual motion machine can never exist.

Read this text. The first section explains the difference between reversible and irreversible processes: a reversible process can be carried out as a series of infinitesimal steps, each of which can be undone, while an irreversible process cannot be broken down this way. The second section discusses the meaning of entropy and what disorder means on a microscopic level.

Entropy is a state function, which means we can apply Hess's law to it. Absolute entropies of most common substances are tabulated, allowing us to calculate the entropy change of a reaction in the same way we calculate the enthalpy of reaction from standard enthalpies of formation.
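In symbols (this summation form is the standard one; it is stated here for concreteness rather than taken from the text that follows):

\Delta S^{\circ}_{rxn}=\sum \nu \, S^{\circ}(\text{products})-\sum \nu \, S^{\circ}(\text{reactants})

where the ν are stoichiometric coefficients; a numerical example using tabulated values appears after Table 1 below.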

Entropy is one of the most fundamental concepts of physical science, with far-reaching consequences ranging from cosmology to chemistry. It is also widely misrepresented as a measure of disorder, as we discuss below. The German physicist Rudolf Clausius originated the concept as "energy gone to waste" in the early 1850s, and it was given a series of increasingly precise definitions over the next 15 years.

We have explored how the tendency of thermal energy to disperse as widely as possible is what drives all spontaneous processes, including chemical reactions. Now we need to understand how the direction and extent of the spreading and sharing of energy can be related to measurable thermodynamic properties of substances – that is, of reactants and products.

You will recall that when a quantity of heat q flows from a warmer body to a cooler one, permitting the available thermal energy to spread into and populate more microstates, the ratio q/T measures the extent of this energy spreading. It turns out that we can generalize this to other processes as well, but there is a difficulty with using q because it is not a state function; that is, its value depends on the pathway or manner in which a process is carried out. This means, of course, that the quotient q/T cannot be a state function either, so we are unable to use it to get differences between reactants and products as we do with the other state functions. The way around this is to restrict our consideration to a special class of pathways that are described as reversible.
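To see how the quotient works (this two-term bookkeeping follows directly from the definition of entropy given below; it is spelled out here as an added step): when heat q leaves a body at Thot and enters one at Tcold, the total entropy change is

\Delta S_{total}=\frac{q}{T_{cold}}-\frac{q}{T_{hot}}>0\qquad(T_{hot}>T_{cold})

so the spontaneous hot-to-cold direction of heat flow is exactly the direction that increases the total entropy.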


Reversible and Irreversible Changes

A change is said to occur reversibly when it can be carried out in a series of infinitesimal steps, each one of which can be undone by making a similarly minute change to the conditions that bring the change about.

For example, the reversible expansion of a gas can be achieved by reducing the external pressure in a series of infinitesimal steps; reversing any step will restore the system and the surroundings to their previous state. Similarly, heat can be transferred reversibly between two bodies by changing the temperature difference between them in infinitesimal steps each of which can be undone by reversing the temperature difference.

The most widely cited example of an irreversible change is the free expansion of a gas into a vacuum. Although the system can always be restored to its original state by recompressing the gas, this would require that the surroundings perform work on the gas. Since the gas does no work on the surroundings in a free expansion (the external pressure is zero, so PΔV = 0), there will be a permanent change in the surroundings. Another example of irreversible change is the conversion of mechanical work into frictional heat; there is no way, by reversing the motion of a weight along a surface, that the heat released due to friction can be restored to the system.


Reversible and Irreversible Gas Expansion and Compression

[Image: work in multistage expansions of a gas]


These diagrams show the same expansion and compression ±ΔV carried out in different numbers of steps ranging from a single step at the top to an infinite number of steps at the bottom. As the number of steps increases, the processes become less irreversible; that is, the difference between the work done in expansion and that required to re-compress the gas diminishes. In the limit of an infinite number of steps (bottom), these work terms are identical, and both the system and surroundings (the world) are unchanged by the expansion-compression cycle. In all other cases the system (the gas) is restored to its initial state, but the surroundings are forever changed.
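The convergence described above is easy to verify numerically. Here is a minimal sketch (an illustration added for this purpose, with an arbitrary choice of units in which nRT = 1; the function name is mine) that computes the work recovered when an ideal gas expands isothermally from V to 2V against a stepped external pressure:

import numpy as np

def expansion_work(V_i, V_f, n_steps, nRT=1.0):
    # Work recovered (in units of nRT) when an ideal gas expands
    # isothermally from V_i to V_f in n_steps equal-volume stages.
    # At each stage the external pressure is lowered to the gas's
    # pressure at the end of that stage, so w = sum of P_ext * dV.
    volumes = np.linspace(V_i, V_f, n_steps + 1)
    work = 0.0
    for v_start, v_end in zip(volumes[:-1], volumes[1:]):
        p_ext = nRT / v_end              # ideal gas: P = nRT/V
        work += p_ext * (v_end - v_start)
    return work

for n in (1, 2, 10, 100, 10000):
    print(f"{n:>6} steps: w = {expansion_work(1.0, 2.0, n):.4f} nRT")
print(f" limit: ln 2 = {np.log(2):.4f} nRT")

A single step recovers only 0.5 nRT; ten thousand steps recover essentially the full reversible limit of ln 2 ≈ 0.693 nRT, and the work required for the matching stepped recompression converges to the same value from above.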

A reversible change is one carried out in such a way that, when undone, both the system and surroundings (that is, the world) remain unchanged.


Reversible = Impossible: So Why Bother?

It should go without saying, of course, that any process that proceeds in infinitesimal steps would take infinitely long to occur, so thermodynamic reversibility is an idealization that is never achieved in real processes, except when the system is already at equilibrium, in which case no change will occur anyway! So why is the concept of a reversible process so important?

The answer can be seen by recalling that the change in the internal energy that characterizes any process can be distributed in an infinity of ways between heat flow across the boundaries of the system and work done on or by the system, as expressed by the First Law ΔU = q + w. Each combination of q and w represents a different pathway between the initial and final states. It can be shown that as a process such as the expansion of a gas is carried out in successively longer series of smaller steps, the absolute value of q approaches a minimum, and that of w approaches a maximum that is characteristic of the particular process.

Thus when a process is carried out reversibly, the w-term in the First Law expression has its greatest possible value, and the q-term is at its smallest. These special quantities wmax and qmin (the latter denoted qrev and pronounced "q-reversible") have unique values for any given process and are therefore state functions.


Work and Reversibility

[Image: heat and work in reversible and irreversible processes]

Note that the reversible condition implies wmax and qmin. The impossibility of extracting all of the internal energy as work is essentially a statement of the Second Law.


For a process that reversibly exchanges a quantity of heat qrev with the surroundings, the entropy change is defined as:

\Delta S=\frac{q_{rev}}{T}


This is the basic way of evaluating ΔS for constant-temperature processes such as phase changes, or the isothermal expansion of a gas. For processes in which the temperature is not constant, such as the heating or cooling of a substance, the definition must be integrated over the required temperature range; at constant pressure,

\Delta S=\int_{T_1}^{T_2}\frac{C_p}{T}\,dT

where Cp is the heat capacity of the substance.
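As a quick worked example of the constant-temperature case (the numbers for water are standard reference values, not given in this text): melting one mole of ice at 273 K absorbs qrev = ΔHfus ≈ 6010 J, so

\Delta S=\frac{6010\ \text{J}}{273\ \text{K}}\approx 22\ \text{J K}^{-1}\,\text{mol}^{-1}

a positive value, as expected for a solid becoming a more energy-dispersed liquid.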

...but if no real process can take place reversibly, what use is an expression involving qrev? This is a rather fine point that you should understand: although transfer of heat between the system and surroundings is impossible to achieve in a truly reversible manner, this idealized pathway is crucial only for the definition of ΔS; by virtue of its being a state function, the same value of ΔS will apply when the system undergoes the same net change via any pathway.

For example, the entropy change a gas undergoes when its volume is doubled at constant temperature will be the same regardless of whether the expansion is carried out in 1,000 tiny steps (as reversible as patience is likely to allow) or by a single-step (as irreversible a pathway as you can get!) expansion into a vacuum.
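The reversible-path calculation gives this value explicitly (a standard result; the ideal-gas assumption is mine, as the text does not specify the gas): for n moles doubling in volume at constant T, qrev = nRT ln 2, so

\Delta S=\frac{q_{rev}}{T}=nR\ln 2\approx 5.76\ \text{J K}^{-1}\ \text{mol}^{-1}

and the same ΔS applies to the single-step expansion into a vacuum, even though there q = 0.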


The Physical Meaning of Entropy

Entropy is a measure of the degree of spreading and sharing of thermal energy within a system.

This spreading and sharing can be spreading of the thermal energy into a larger volume of space or its sharing amongst previously inaccessible microstates of the system. The following table shows how this concept applies to a number of common processes.

 


System and Process → Source of Entropy Increase of System

  • A deck of cards is shuffled, or 100 coins, initially heads up, are randomly tossed. → This has nothing to do with entropy; macro objects are unable to exchange thermal energy with the surroundings within the time scale of the process.

  • Two identical blocks of copper, one at 20°C and the other at 40°C, are placed in contact. → The cooler block contains more unoccupied microstates, so heat flows from the warmer block until equal numbers of microstates are populated in the two blocks.

  • A gas expands isothermally to twice its initial volume. → A constant amount of thermal energy spreads over a larger volume of space.

  • 1 mole of water is heated by 1 C°. → The increased thermal energy makes additional microstates accessible. (The increase is by a factor of about 10^(20,000,000,000,000,000,000,000).)

  • Equal volumes of two gases are allowed to mix. → The effect is the same as allowing each gas to expand to twice its volume; the thermal energy in each is now spread over a larger volume.

  • One mole of dihydrogen, H2, is placed in a container and heated to 3000 K. → Some of the H2 dissociates to H because at this temperature there are more thermally accessible microstates in the 2 moles of H.

  • The above reaction mixture is cooled to 300 K. → The composition shifts back to virtually all H2 because this molecule contains more thermally accessible microstates at low temperatures.
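The astonishing factor quoted in the water-heating row can be checked against the Boltzmann relation S = k ln Ω (this back-of-the-envelope estimate is an added step, not part of the original text; the heat capacity of liquid water, about 75 J K–1 mol–1, is a standard reference value):

\Delta S\approx C_p\ln\frac{299}{298}\approx 0.25\ \text{J K}^{-1},\qquad \frac{\Omega_{299}}{\Omega_{298}}=e^{\Delta S/k}\approx e^{1.8\times10^{22}}\approx 10^{8\times10^{21}}

the same astronomical order of magnitude as the factor given in the table.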


Entropy is an extensive quantity; that is, it is proportional to the quantity of matter in a system; thus 100 g of metallic copper has twice the entropy of 50 g at the same temperature. This makes sense because the larger piece of copper contains twice as many quantized energy levels able to contain the thermal energy.


Entropy and Disorder

The late Frank Lambert contributed greatly to debunking the entropy = disorder myth in chemistry education.

Entropy is still described, particularly in older textbooks, as a measure of disorder. In a narrow technical sense this is correct, since the spreading and sharing of thermal energy does have the effect of randomizing the disposition of thermal energy within a system. But to simply equate entropy with disorder without further qualification is extremely misleading, because it is far too easy to forget that entropy (and thermodynamics in general) applies only to molecular-level systems capable of exchanging thermal energy with the surroundings. Carrying these concepts over to macro systems may yield compelling analogies, but it is no longer science. It is far better to avoid the term disorder altogether in discussing entropy.

[Photos of a tidy and a disorganized bedroom: this is NOT entropy!]

Fig. 2-1 (Source: Charles Mallery, University of Miami)


Standard Entropies of Substances

The standard entropy of a substance is its entropy at 1 atm pressure. The values found in tables are normally those for 298 K, and are expressed in units of J K–1 mol–1. The table below shows some typical values for gaseous substances.

He 126     H2 131     CH4 186
Ne 146     N2 192     H2O(g) 187
Ar 155     CO 197     CO2 213
Kr 164     F2 203     C2H6 229
Xe 170     O2 205     n-C3H8 270
           Cl2 223    n-C4H10 310


Table 1: Standard entropies of some gases at 298 K, J K–1 mol–1
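Because entropy is a state function, these tabulated values plug directly into the summation formula given near the top of this page. Here is a minimal sketch (the dictionary name S_STD and the function reaction_entropy are illustrative, not from any standard library; the S° values are read off Table 1 above) for the reaction 2 CO + O2 → 2 CO2:

# Standard entropies at 298 K (J K-1 mol-1), read off Table 1 above
S_STD = {"CO": 197, "O2": 205, "CO2": 213}

def reaction_entropy(products, reactants):
    # Each side maps species -> stoichiometric coefficient nu;
    # delta-S = sum(nu * S) over products minus the same sum over reactants.
    def side_sum(side):
        return sum(nu * S_STD[sp] for sp, nu in side.items())
    return side_sum(products) - side_sum(reactants)

dS = reaction_entropy(products={"CO2": 2}, reactants={"CO": 2, "O2": 1})
print(f"dS(298 K) = {dS} J/K")   # 2(213) - [2(197) + 205] = -173 J/K

The negative result makes physical sense: three moles of gas are consolidated into two, leaving the thermal energy fewer translational microstates to spread into.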


Note especially how the values given in this table illustrate these important points:

  • Although the standard internal energies and enthalpies of formation of the elements in this table are zero by convention, their entropies are not. This is because there is no absolute scale of energy, so we conventionally set the energies of formation of elements in their standard states to zero. Entropy, however, measures not energy itself, but its dispersal amongst the various quantum states available to accept it, and these exist even in pure elements.
  • It is apparent that entropies generally increase with molecular weight. For the noble gases, this is of course a direct reflection of the principle that translational quantum states are more closely packed in heavier molecules, allowing more of them to be occupied.
  • The entropies of the diatomic and polyatomic molecules show the additional effects of rotational quantum levels.

 

C (diamond) 2.5    C (graphite) 5.7    Fe 27.1    Na 51.0
Pb 64.9    S (rhombic) 32.0    Si 18.9    W 33.5


Table 2: Entropies of some solid elements at 298 K, J K–1 mol–1


The entropies of the solid elements are strongly influenced by the manner in which the atoms are bound to one another. The contrast between diamond and graphite is particularly striking; graphite, which is built up of loosely-bound stacks of hexagonal sheets, appears to be more than twice as good at soaking up thermal energy as diamond, in which the carbon atoms are tightly locked into a three-dimensional lattice, thus affording them less opportunity to vibrate around their equilibrium positions. Looking at all the examples in the above table, you will note a general inverse correlation between the hardness of a solid and its entropy. Thus sodium, which can be cut with a knife, has almost twice the entropy of iron; the much greater entropy of lead reflects both its high atomic weight and the relative softness of this metal. These trends are consistent with the oft-expressed principle that the more disordered a substance, the greater its entropy.

 

Solid Liquid Gas
41 70 186


Table 3: Entropy of water at 298 K, J K–1 mol–1


Gases, which serve as efficient vehicles for spreading thermal energy over a large volume of space, have much higher entropies than condensed phases. Similarly, liquids have higher entropies than solids owing to the multiplicity of ways in which the molecules can interact (that is, store energy).

 


Source: Stephen Lower, http://www.chem1.com/acad/webtext/thermeq/TE2.html
This work is licensed under a Creative Commons Attribution-ShareAlike 3.0 License.
