When the German scientist Rudolf Clausius introduced the word entropy into the lexicon of thermodynamics in 1865, he used a simple formula to construct the word based on his understanding of the etymology of energy. Intending for the prefix en- to mean contents, and the segment -trop- to mean transformation, he spliced together entropy to mean “contents that have been transformed.” Although his goal must have been to give an unambiguous meaning to a newly defined thermodynamic quantity, he could not have anticipated how the various meanings of this word over time could be a case study in disorder and uncertainty.

1

Originally, Clausius intended for the entropy of a system to be associated with the amount of thermal energy put into a system that could not be extracted as mechanical work. Quantitatively, if a system is held at constant temperature T, and an amount of heat Q is added to the system, then the change in entropy, ΔS, is given by the formula ΔS = Q/T, where T is the temperature given in absolute units, such as kelvins. Clausius was also the source of the thermodynamic principle “The entropy of an isolated, closed system is either numerically constant or increases with time.”
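Clausius's definition can be sketched directly in code. A minimal illustration, with made-up numbers for the heat and temperature (the passage gives none):

```python
# Clausius's definition: for heat Q added to a system held at
# constant absolute temperature T, the entropy change is Q / T.

def entropy_change(heat_joules: float, temperature_kelvin: float) -> float:
    """Return the entropy change Q/T in joules per kelvin."""
    if temperature_kelvin <= 0:
        raise ValueError("absolute temperature must be positive")
    return heat_joules / temperature_kelvin

# Illustrative example: 1000 J of heat added at 300 K.
print(entropy_change(1000.0, 300.0))  # about 3.33 J/K
```

The guard on nonpositive temperature reflects the requirement that T be in absolute units: on the kelvin scale, temperature is always positive.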

2

As the scientific understanding of entropy evolved, a subjective sense of entropy developed that associated entropy with energy that is irreversibly lost and with disorder. Here, disorder refers to the number of ways that you can rearrange a system so that it looks exactly the same. For example, when an ice sculpture melts, the water molecules go from a fixed arrangement to a relatively free arrangement. Because there are far more ways to rearrange water molecules in a puddle (so that it looks like a puddle) than there are ways to rearrange water molecules in an ice statue (so that it looks like the same ice statue), the entropy of the puddle is greater than the entropy of the ice statue. The disorder of the system has increased through melting. This particular understanding of entropy was first mathematically demonstrated in 1896 by another German scientist, Ludwig Boltzmann.
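Boltzmann's demonstration rests on his relation S = k ln W, where W is the number of microscopic arrangements; the passage does not state the formula, and the microstate counts below are toy numbers chosen only to illustrate the ice-versus-puddle comparison:

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, in joules per kelvin

def boltzmann_entropy(num_arrangements: float) -> float:
    """Entropy S = k * ln(W) for a system with W microscopic arrangements."""
    return K_B * math.log(num_arrangements)

# A puddle admits far more rearrangements that leave it looking the
# same than an ice statue does, so its entropy is higher.
ice_statue = boltzmann_entropy(1e20)   # toy microstate count
puddle = boltzmann_entropy(1e25)       # toy microstate count
print(puddle > ice_statue)  # True
```

Because the entropy grows only logarithmically in W, even an enormous gap in the number of arrangements produces a modest-looking numerical difference, which is why the constant k is so small.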

3

In the late 19th century, the concept of entropy suggested to philosophers a physical means by which the universe would ultimately wind down. The combination of entropy as disorder and the principle of Clausius evolved into a narrative that could be stated: the disorder and randomness of our world are only increasing. In more recent years, as people have tried to explore the implications of this principle, entropy has been defined as broadly as “the smashing down of our world by random forces that don’t reverse” (Stephen Leacock) and as “the assassin of the truths of the Modern Age” (Jeremy Rifkin). In these uses, entropy is no longer seen as a thermodynamic quantity; it has become a dark force in the universe.

4

Despite the narrative force that the concept of entropy appears to evoke in everyday writing, in scientific writing entropy remains a thermodynamic quantity and a mathematical formula that numerically quantifies disorder. When the American scientist Claude Shannon found that the mathematical formula of Boltzmann defined a useful quantity in information theory, he hesitated to name this newly discovered quantity entropy because of its philosophical baggage. The mathematician John von Neumann encouraged Shannon to go ahead with the name entropy, however, since “no one knows what entropy is, so in a debate you will always have the advantage.”
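Shannon's quantity shares the mathematical form of Boltzmann's: H = −Σ p log₂ p, summed over the probabilities of each possible symbol. The passage does not state the formula, so the sketch below, with illustrative distributions, is offered as context rather than as part of the passage:

```python
import math

def shannon_entropy(probabilities) -> float:
    """Shannon entropy, in bits, of a discrete probability distribution."""
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

# A fair coin is maximally uncertain: 1 bit per toss.
print(shannon_entropy([0.5, 0.5]))   # 1.0

# A heavily biased coin is more predictable, so its entropy is lower.
print(shannon_entropy([0.9, 0.1]))   # about 0.47
```

Here entropy measures uncertainty about a message rather than disorder of molecules, but the parallel with Boltzmann's counting of arrangements is exactly what Shannon noticed.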