On the previous page, I gave a general description of entropy: it is a quantity that increases as a system evolves toward equilibrium. But it will be worthwhile to explain more carefully what this mysterious quantity really is. We can gain a better understanding of entropy by distinguishing between two ways of describing the state (physical condition) of a system at a particular time.
Imagine a room full of nothing but air. The physical state of the room at a given time could be described very precisely, at least in principle, by specifying the exact position and velocity of each air molecule in the room. (Of course it’s not possible to give such a description in real life.) The precise state of a system at a given time, in terms of the positions and velocities of its microscopic particles, is called the system’s microstate.
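To make this concrete: in a computer, a microstate of a gas would naturally be represented as one position vector and one velocity vector per molecule. The sketch below is a minimal illustration in Python, with made-up values; the 5-metre room and the ~300 m/s spread of velocity components are rough assumptions for air at room temperature, and the molecule count is vastly smaller than a real room’s.

```python
import numpy as np

rng = np.random.default_rng(seed=0)
n_molecules = 1_000_000  # a real room holds roughly 10^27 molecules, far too many to store

# One microstate: the exact position and velocity of every molecule.
positions = rng.uniform(low=0.0, high=5.0, size=(n_molecules, 3))     # metres, in a 5 m cube
velocities = rng.normal(loc=0.0, scale=300.0, size=(n_molecules, 3))  # metres per second
```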
On the other hand, we could describe the state of the room less precisely, in terms of more easily observable properties. For example, we could describe how the temperature and pressure of the air vary throughout the room, without specifying the exact positions and velocities of all the molecules. The state of a system described in terms of volume, temperature, and pressure is called the system’s thermodynamic condition.
Many different microstates have the same volume, temperature, and pressure. For example, each molecule could be located at a slightly different position or be travelling in a different direction, without changing the volume, temperature, or pressure of the air in the room. Thus, any given thermodynamic condition corresponds to many possible microstates. The system will be in exactly one microstate at a time, of course, but its microstate is constantly changing as molecules bounce around. Even so, the system can remain in the same thermodynamic condition for a while, because many different microstates correspond to that one condition.
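The many-to-one relationship can be seen numerically with the toy representation above (the specific constants here are illustrative assumptions). The sketch uses the standard ideal-gas relation between temperature and average kinetic energy, ⟨KE⟩ = (3/2) k_B T, to compute a thermodynamic property from a microstate. Reassigning the same velocities to different molecules gives a genuinely different microstate, yet the computed temperature is unchanged.

```python
import numpy as np

K_B = 1.380649e-23  # Boltzmann constant, J/K
MASS = 4.8e-26      # approximate mass of one air molecule, kg

rng = np.random.default_rng(seed=1)
velocities = rng.normal(scale=300.0, size=(1_000_000, 3))  # one microstate

def temperature(v):
    """Ideal-gas temperature from mean kinetic energy: <KE> = (3/2) k_B T."""
    mean_kinetic_energy = 0.5 * MASS * np.mean(np.sum(v**2, axis=1))
    return 2.0 * mean_kinetic_energy / (3.0 * K_B)

# A different microstate: the same set of velocities, assigned to
# different molecules.
shuffled = rng.permutation(velocities)

print(temperature(velocities))  # ~313 K, roughly room temperature
print(temperature(shuffled))    # exactly the same value: same thermodynamic condition
```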
We can think of entropy as a measure of how many possible microstates correspond to a system’s thermodynamic condition. The higher the entropy of a system, the more possible microstates correspond to its thermodynamic condition. That’s still not an exact definition of entropy. But it captures the main idea behind the definition of entropy used in statistical mechanics—a branch of physics that relates the laws of thermodynamics to the laws governing motion and energy at the molecular level. For a more detailed explanation of the statistical mechanical definition of entropy, see appendix C [link coming soon].
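For reference, the standard form of that definition is Boltzmann’s entropy formula. If Ω is the number of microstates corresponding to a thermodynamic condition, then the entropy of that condition is

$$ S = k_B \ln \Omega $$

where k_B is Boltzmann’s constant. The logarithm doesn’t change the basic picture: the more microstates, the higher the entropy.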
A system’s equilibrium condition is the condition of maximum entropy: it corresponds to more possible microstates than any other thermodynamic condition. The equilibrium condition of air in a room, in which the air has the same temperature and pressure throughout, corresponds to more microstates than a condition in which the pressure is much greater at the left side of the room, for instance. There are more possible ways for the air molecules to be distributed evenly throughout the whole room than for most of the molecules to be clustered near the left side. Technically, there are infinitely many ways for the molecules to be distributed throughout the whole room, and also infinitely many ways for them to be clustered near the left side. Nonetheless, there is a definite mathematical sense in which there are more ways for the molecules to be spread throughout the whole room. See appendix C for an explanation.
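Although a full treatment needs the mathematics of appendix C, a toy version with finitely many molecules shows the counting at work. Suppose a “room” contains N distinguishable molecules, each of which may sit in the left or right half, and take “how many molecules are on the left” as the thermodynamic condition. The number of microstates for each condition is then a binomial coefficient (a sketch in Python; the setup is illustrative, not taken from the appendix):

```python
from math import comb

N = 100  # toy molecule count; a real room holds roughly 10^27

# Microstate: which particular molecules sit in the left half.
# Thermodynamic condition: how many molecules sit in the left half.
for left in (0, 10, 25, 50, 75, 90, 100):
    print(f"{left:3d} on the left: {comb(N, left):.3e} microstates")

# 50 on the left: 1.009e+29 microstates  (the even split)
# 10 on the left: 1.731e+13 microstates
#  0 on the left: 1.000e+00 microstates  (exactly one way: all on the right)
```

With just 100 molecules, the even split already corresponds to about 10^16 times as many microstates as the 10–90 split; with the vastly larger number of molecules in a real room, the imbalance is so astronomically large that the evenly spread condition is, overwhelmingly, the equilibrium condition.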