by Rishidev Chaudhuri and Jason Merrill
C.P. Snow famously said that not knowing the second law of thermodynamics is like never having read Shakespeare. Whatever the particular merits of this comparison, it does speak to the centrality of the idea of entropy (and its increase) to the physical sciences. Entropy is one of the most important and fundamental physical concepts and, because of its generality, is frequently encountered outside physics. The pop conception of entropy is as a measure of the disorder in a system. This characterization is not so much false as misleading (especially if we think of order and information as being similar). What follows is a brief explanation of entropy, highlighting its origin in the particular ways we describe the world, and an explanation of why it tends to increase. We've made some simplifying assumptions, but they leave the spirit of things unchanged.
The fundamental distinction that gives rise to entropy is the separation between different levels of description. Small systems, those with only a few components, can be described by giving the state of each component. For a large system, say a gas with billions upon billions of molecules, describing the state of each molecule is impossible, both because it would be hopelessly tedious and because we don't know those states in the first place. And, as we'll point out again later, for many purposes the exact state of the system isn't useful to know anyway. In theory we could predict how a system evolves from its exact state, but in practice this is far too complicated unless the system is very small. So instead we make probabilistic predictions that take into account only a few parameters of the system. This gives us a coarser but more relevant level of description, and it is at this level that we seek to describe changes in the world.
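To make the two levels of description concrete, here is a small sketch, assuming only a toy "gas" of ten particles, each sitting in either the left or the right half of a box. The fine-grained description lists which half every particle occupies; the coarse-grained description records just how many particles are on the left. The Python below (an illustrative toy, not a simulation of a real gas) counts how many fine-grained states correspond to each coarse-grained one:

```python
from itertools import product
from collections import Counter

N = 10  # a toy "gas": 10 particles, each in the left (L) or right (R) half of a box

# Fine-grained description: which half every individual particle occupies.
microstates = list(product("LR", repeat=N))  # 2**10 = 1024 possible states

# Coarse-grained description: only how many particles are on the left.
def macrostate(state):
    return state.count("L")

# Count how many fine-grained states correspond to each coarse-grained one.
counts = Counter(macrostate(s) for s in microstates)

for n_left in sorted(counts):
    print(f"{n_left:2d} on the left: {counts[n_left]:4d} microstates")
```

Even with only ten particles, the balanced coarse descriptions (around five on each side) correspond to far more fine-grained states than the lopsided ones, and it is this kind of counting that the coarser level of description works with.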
