Entropy

Entropy << EHN truh pee >> is a measure of the amount of disorder or randomness in a system. Because there are many more random ways of arranging a group of things than there are organized ways, disorder is much more probable. For example, shuffling a deck of cards virtually always leads to a jumbled arrangement of the cards and practically never to an ordered sequence.
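
To see why disorder is so much more probable, one can count the possible arrangements of a standard 52-card deck (a worked figure added here for illustration; the numbers come from ordinary combinatorics and are not part of the original article):

\[
52! \approx 8.1 \times 10^{67},
\qquad
P(\text{one particular ordered sequence}) = \frac{1}{52!} \approx 1.2 \times 10^{-68}.
\]

Because only a tiny handful of those roughly $10^{68}$ arrangements look "ordered," a shuffle is overwhelmingly likely to produce a jumbled one.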

The idea of entropy is the basis of the second law of thermodynamics. According to this law, the direction of spontaneous change in isolated systems is toward maximum disorder. Thus, heat flows of its own accord only from a hotter substance to a cooler one. As the cooler substance gains heat, the motion of its molecules becomes more disorderly and its entropy increases (see Heat (Changes in state)).
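
The link between heat flow and entropy can be made quantitative with the Clausius definition of an entropy change, $\Delta S = q/T$ (a sketch added for illustration; the symbols $q$, $T_{\text{hot}}$, and $T_{\text{cold}}$ do not appear in the original article). If a small quantity of heat $q$ leaves a hotter body at temperature $T_{\text{hot}}$ and enters a cooler body at $T_{\text{cold}}$, the combined entropy change is

\[
\Delta S_{\text{total}} = \frac{q}{T_{\text{cold}}} - \frac{q}{T_{\text{hot}}} > 0
\quad\text{because}\quad T_{\text{cold}} < T_{\text{hot}}.
\]

Heat flowing from hot to cold therefore always raises the total entropy, in agreement with the second law.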

Furthermore, a gas will always expand to fill its container but will never contract of its own accord to occupy only a fraction of the container. The entropy of the gas increases as the gas expands because more positions are available for the molecules to occupy. Every substance has greater entropy as a gas than as a liquid.
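
The connection between entropy and the number of available molecular positions is expressed by Boltzmann's formula (a standard result added here for illustration; the symbols $W$ and $k_B$ are not part of the original article):

\[
S = k_B \ln W,
\]

where $W$ is the number of microscopic arrangements available to the molecules and $k_B$ is Boltzmann's constant. When a gas expands into a larger volume, $W$ grows, so $S$ grows. For an ideal gas that doubles its volume at constant temperature, for example, the increase is $\Delta S = nR \ln 2$ for $n$ moles of gas.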

Some changes may decrease entropy in one system. But this decrease is more than offset by an entropy increase in connected systems. For example, the entropy of water decreases as the liquid freezes, but the heat released in the process increases the entropy of the surrounding air.
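
A rough worked version of the freezing example (illustrative numbers assumed here, not taken from the original article): water releases about 334 joules per gram when it freezes at 273 K. If that heat flows into surrounding air at 263 K, then per gram of water frozen,

\[
\Delta S_{\text{water}} \approx -\frac{334\ \text{J}}{273\ \text{K}} \approx -1.22\ \text{J/K},
\qquad
\Delta S_{\text{air}} \approx +\frac{334\ \text{J}}{263\ \text{K}} \approx +1.27\ \text{J/K},
\]

so the total entropy change is positive even though the water itself becomes more ordered.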

The entropy of a substance increases whenever the substance loses some of its ability to do work. For example, air forced into an empty balloon has low entropy because the air molecules are compressed in a small volume. The compressed air does work by expanding to inflate the balloon. In the inflated balloon, the molecules can occupy a larger number of positions and thus have greater entropy. But they have lost the ability to do more work by expanding the balloon further.
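
In standard thermodynamics, the work lost in such an irreversible change is tied directly to the entropy produced (a general relation added for illustration; $T_0$, the temperature of the surroundings, is not part of the original article):

\[
W_{\text{lost}} = T_0 \, \Delta S_{\text{total}},
\]

so once the expanding air has generated entropy, that amount of work can no longer be recovered from it.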

Taken together, all the processes occurring now move the universe toward greater disorder. Because the entropy of the universe is always increasing, a state of greater entropy must be one that occurs later in time. For this reason, entropy has been called “time’s arrow.”

See also Time (Arrows of time).