Entropy (from Greek entropía, "transformation"): a physical quantity expressing the number of possible states of a system.

The concept of entropy is at home in the field of thermodynamics, where it relates a system's changes in heat energy to its temperature. Around 1880, the physicist Ludwig Boltzmann discovered that a system's entropy depends on the number of possible states of its particles. The more microscopic ways there are to realize the current macroscopic state, the higher the system's entropy.* Since in a macroscopic system such as a liquid or gas we can measure only external quantities, like temperature or pressure, but cannot determine the individual states of the molecules, entropy can also be read as a measure of our ignorance about the system.

Entropy and Bloody Mary

Whenever we mix a cocktail, we increase its entropy. That is because our knowledge of the system is greater at the beginning of the mixing process, when we can still observe the drink's various ingredients (ice cubes, vodka, tomato juice, hot sauce) individually, than it is at the end of the process, when those elements have been blended together. The number of possible states of the molecules is also greater in the final state of the mixing process, because the alcohol, water, and other molecules in our drink may now be anywhere inside the glass. Thus, the mixing process increases the level of "disorder" among the molecules.
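
To make the counting argument concrete, here is a minimal sketch in Python (with purely illustrative numbers, not taken from the original text) that compares the number of arrangements of a toy "cocktail" on a small lattice before and after mixing, and converts the counts to entropy with Boltzmann's formula S = kB ln(Ω) from the footnote below:

    import math

    K_B = 1.3806505e-23  # Boltzmann constant in J/K, as given in the footnote

    def boltzmann_entropy(ln_omega):
        # S = k_B * ln(Omega); we pass ln(Omega) directly to avoid huge numbers
        return K_B * ln_omega

    sites, vodka = 100, 50  # toy model: 100 lattice sites, 50 "vodka" molecules

    # Unmixed: the vodka molecules are confined to the upper 50 sites -> one arrangement
    ln_omega_unmixed = math.log(1)

    # Mixed: the vodka molecules may occupy any 50 of the 100 sites
    ln_omega_mixed = math.log(math.comb(sites, vodka))

    print(boltzmann_entropy(ln_omega_unmixed))  # 0.0 J/K
    print(boltzmann_entropy(ln_omega_mixed))    # ~9.2e-22 J/K, about 1e29 arrangements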

The more order there is in a system, the more we usually can know about it, and the lower is its entropy value. A smooth sheet of paper has lower entropy than the same paper when it's crumpled up. A log in your fireplace has lower entropy before combustion than after. The collapse of a building increases its entropy. Tidying up our desk decreases its entropy, for there are now fewer possible positions for the things remaining on it than before. Even a moderately useful dictionary has a considerably lower entropy than the raw cellulose and printer's ink of which it consists.

The Entropy Misconception

The above examples might tempt us to equate lower entropy with a higher 'order' of the system. Indeed, you will find this idea in many schoolbooks and articles. Unfortunately, it is a popular misconception. There are systems in which increasing entropy, and increasing ignorance, leads to increasing order. In fact, the overwhelming majority of systems in our universe behave this way, including the cocktail described above.

Mix a Bloody Mary and let it stand for a couple of days. During the experiment you will notice a strange behavior of the tomato juice particles. They concentrate towards the bottom of the drink, with the lighter water-alcohol blend above and a clearly defined boundary in between. Gravity has increased the order of the drink. Nevertheless, entropy has increased as well, because under gravity there are far more microscopic configurations leading to a de-blended cocktail than to a uniformly blended one.

The objection that a drink exposed to gravity is not a closed system is easily countered. Just shoot the cocktail into outer space and let it float freely, far away from any gravitational fields and external influences, but let it rotate at high speed. Centrifugal force will likewise separate the tomato juice from the alcohol. As soon as a macroscopic force, such as gravity or centrifugal force, acts on the particles, it can increase the order of the system.**

Heat Death of the Universe

The Second Law of Thermodynamics tells us that the entropy of a closed system can increase but never decrease. This has two significant implications for the development of our universe.

The first implication is that entropy indicates a definite direction in time. If we compare a closed system at a low entropy level with the same system at a higher entropy level, we know that the latter state is later in time. The second implication is that the entropy of the universe also constantly increases.*** Stars will burn out, galaxies will dissolve, and temperature differences will even out. The final state of the universe is its heat death, a state of maximal entropy. You can read about this state in more detail under Universe.


* The exact relation is S = kB ln(Ω), where S = entropy, kB = Boltzmann constant (1.3806505 ∙ 10⁻²³ J/K), and Ω = number of possible states.
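
For a minimal numerical illustration of the formula (an added example, not part of the original footnote): if doubling the available volume doubles the number of possible states Ω of a single molecule, the entropy gain is kB ln(2):

    import math

    K_B = 1.3806505e-23          # Boltzmann constant in J/K, as given above
    delta_S = K_B * math.log(2)  # entropy gain when Omega doubles
    print(delta_S)               # ~9.6e-24 J/K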

** Theoretically, our cocktail would be able to return to its earlier state even without macroscopic forces. The likelihood of such an occurrence, however, decreases the more liquid molecules the drink contains. Nevertheless, any cocktail will return to its initial state after an enormous, but still finite, amount of time. Using computer models, the physicist Dieter Zeh has calculated the time that a two-dimensional "drink" consisting of only 50 molecules would need in order to randomly gather in an area one-sixth the size of the available space. The calculated time amounts to 10¹⁷ times the age of the universe.
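
As a rough back-of-the-envelope sketch (this is not Zeh's model, merely an added illustration of the orders of magnitude involved): if each of the 50 molecules independently had a 1-in-6 chance of being found in the target region at any given moment, the probability of finding all of them there simultaneously would be (1/6)^50:

    molecules = 50
    p_single = 1 / 6               # chance that one molecule sits in the chosen sixth of the space
    p_all = p_single ** molecules  # chance that all 50 sit there at the same time
    print(p_all)                   # ~1.2e-39
    print(1 / p_all)               # ~8.1e38 random "snapshots" expected before it happens once

The actual recurrence time also depends on how fast the molecules rearrange themselves, so this count of random snapshots only conveys the scale and is not meant to reproduce Zeh's result.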

*** At least within "mostly finite" time periods. It would be different if our universe had not only infinite size but also infinite age. In such a universe, the molecules would, over time, occupy every physically possible state infinitely often. Therefore, the entropy would also 'jump back' to a minimum value infinitely often, as in the state of maximum density at the time of the Big Bang.

© Johann Christian Lotter