Further Reading

Thermodynamic equilibrium and equipartition


Thermodynamic equilibrium

A crucial idea in the understanding of how energy tends to distribute itself in a system is the idea of thermodynamic equilibrium. When a system is in thermodynamic equilibrium, energy is continually flowing within the system, but as much goes one way as another so the overall result is that the system looks static from a macroscopic view.

As an example, consider mixing two containers of liquid at different temperatures. Energy will flow from the hot liquid to the cold until they are at the same temperature. Then they'll look as if they are no longer changing: they are at a uniform temperature, in thermodynamic equilibrium. Actually, energy is continually being exchanged among the molecules, so there are fluctuations, but for large enough systems the fluctuations are small. On the average, as much energy goes one way as the other, so on time scales long compared to the time of a molecular collision, things look pretty much static.
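The mixing example can be made quantitative with energy conservation: the heat lost by the hot liquid equals the heat gained by the cold one. A minimal sketch (the function name, masses, and heat capacities are illustrative assumptions, not from the article):

```python
# Equilibrium temperature when two liquids are mixed, assuming no energy
# is lost to the surroundings. Energy conservation gives a mass- and
# heat-capacity-weighted average of the starting temperatures.

def mixing_temperature(m1, c1, T1, m2, c2, T2):
    """Final temperature for masses (kg), specific heats (J/(kg K)),
    and starting temperatures (K) of the two liquids."""
    return (m1 * c1 * T1 + m2 * c2 * T2) / (m1 * c1 + m2 * c2)

# Equal masses of the same liquid: equilibrium is the simple average.
Tf = mixing_temperature(1.0, 4186.0, 350.0, 1.0, 4186.0, 300.0)
# Tf = 325.0 K
```

For equal amounts of the same liquid the answer is just the average of the two starting temperatures; unequal amounts weight the average toward the larger one.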

Of course in biology, thermodynamic equilibrium is not the state a living organism wants to be in. If you're in thermodynamic equilibrium with your environment, you're dead! In biological systems, energy still always tends towards equilibrium, so an organism (or part of an organism) has to be continually adjusting its exchanges with its environment so that it doesn't reach equilibrium. It's in the flow of energy towards equilibrium that organisms make their living.

Using energy

As energy makes its way towards equilibrium, it flows in a particular direction, not in equal amounts back and forth as it would at equilibrium. This directed flow might be evening out a distribution of energy in space, but it might not: it might be a set of chemical concentrations approaching chemical equilibrium. It is from this unbalanced flow of energy, on its way towards equilibrium, that living beings can drain off some of that energy and do (physical or chemical) work with it.

Let's return to our simplest example to see how this works: two blocks of matter at different temperatures put next to each other. When they are not at the same temperature, energy will flow in a particular direction: from the hot block to the cold. Since we know this, we can put a heat engine at the boundary of the two blocks and drain off a bit of the energy to do work as it flows in a predictable direction. If the two blocks were at the same temperature, energy would still flow from one block to the other, but in equal amounts in each direction. There would be no way to orient our heat engine to get work out.
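The article doesn't say how much work such an engine could extract; a standard thermodynamic result (the Carnot limit, not derived here) bounds the fraction of the heat flow that can be turned into work. A hedged sketch with illustrative temperatures:

```python
# Carnot limit: of the heat flowing from a hot block at T_hot to a cold
# block at T_cold (temperatures in kelvin), at most a fraction
# 1 - T_cold/T_hot can be converted to work by any engine. When the
# temperatures are equal the fraction is zero: no directed flow, no work.

def max_work_fraction(T_hot, T_cold):
    """Upper bound on work extracted per unit heat taken from the hot side."""
    return 1.0 - T_cold / T_hot

print(max_work_fraction(400.0, 300.0))  # 0.25: at most 25% becomes work
print(max_work_fraction(300.0, 300.0))  # 0.0: equal temperatures, no work
```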

Generalizing, we can propose the following idea:

Energy is useful to the extent that it is not shared uniformly in all of the possible ways it can be distributed.

To make any practical sense of this, we are going to have to think about the different ways that energy can be distributed in a system. 

Degrees of freedom

In any object, there are lots of different places where energy might be found. In a gas, a molecule might be moving; in a solid, energy might be stored in the (spring-like) potential energy between two molecules; a molecule in a gas might have gravitational potential energy as it rises; a molecule might be vibrating or rotating; an electron in a molecule might be kicked up to an excited state. Each place some energy can reside in a physical system is called a degree of freedom.

For one of the simplest macroscopic physical systems, the ideal gas, energy can only be in the motion of the molecules. We treat each molecule of the gas as a point particle that can have no energy exchanged with its internal structure and where the molecules interact so rarely that no energy is stored on the average in their potential energy. (Even an ideal gas has to have some interactions; otherwise the kinetic energy of the different molecules wouldn't be shared through collisions and the gas would not be in thermodynamic equilibrium.)

Each molecule of the gas can move in three independent directions in space, so that motion actually corresponds to three degrees of freedom. An ideal gas consisting of N molecules therefore has 3N degrees of freedom.  


As the chaos of collisions among the many molecules takes place, energy tends to be shared among all the degrees of freedom of the system being considered. The statement

When a system is in thermodynamic equilibrium, each degree of freedom will have (on the average) the same amount of energy;

is called the equipartition theorem, from "equi-partition," meaning "equally divided." Since we know from our study of the ideal gas that each of the 3 degrees of freedom of a molecule of ideal gas at equilibrium has an energy $½k_BT$, we might expect (and we would be correct) that this is how much energy will be stored in each degree of freedom of a more complex substance at thermodynamic equilibrium.
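Putting the two counts together gives the total thermal energy of an ideal gas. A minimal sketch (the function name is an illustrative assumption; the physics is the $½k_BT$ per degree of freedom stated above):

```python
# Equipartition: each degree of freedom holds (1/2) k_B T on average at
# equilibrium, so an ideal gas of N point molecules (3 translational
# degrees of freedom each) stores (3/2) N k_B T of thermal energy.
k_B = 1.380649e-23  # Boltzmann constant, J/K

def thermal_energy(N, T, dof_per_molecule=3):
    """Average thermal energy (J) of N molecules at temperature T (K)."""
    return dof_per_molecule * N * 0.5 * k_B * T

# One molecule at room temperature (~300 K):
# (3/2) k_B (300 K) is about 6.2e-21 J
print(thermal_energy(1, 300.0))
```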

Of course, we are interested in states away from equilibrium, so finding where energy can hide in a system — in particular, in molecular structures, in electric potential energy, and in concentration gradients — will be of great importance in understanding biological processes.

Energy distributions

One technical correction: it's not enough for there to be "the same amount of energy in each degree of freedom" on the average. The energy also has to be distributed in the way that is natural at equilibrium. For example, in the ideal gas, although at equilibrium each molecule has an average kinetic energy of $3(½k_BT)$, the speeds of the molecules are NOT all the same at equilibrium; they are distributed according to the Maxwell-Boltzmann distribution. The average KE is $3(½k_BT)$, but molecules that move both faster and slower are also found. For other degrees of freedom, such as the gravitational PE of molecules in the atmosphere, the equilibrium distribution is a simple exponential known as the Boltzmann distribution. We will study this in some follow-ons.
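This spread of speeds can be checked numerically. A sketch (not from the article) that uses the standard equilibrium result that each velocity component is Gaussian with variance $k_BT/m$, and verifies that the sampled kinetic energies average to $3(½k_BT)$ even though individual molecules are faster or slower:

```python
# Sample molecular velocities at equilibrium and check equipartition:
# each velocity component is drawn from a Gaussian with variance k_B T / m,
# which gives Maxwell-Boltzmann-distributed speeds. The mean kinetic
# energy should come out close to (3/2) k_B T.
import random

def sample_kinetic_energies(n, kT, m=1.0, seed=0):
    rng = random.Random(seed)
    sigma = (kT / m) ** 0.5           # std dev of each velocity component
    energies = []
    for _ in range(n):
        vx, vy, vz = (rng.gauss(0.0, sigma) for _ in range(3))
        energies.append(0.5 * m * (vx**2 + vy**2 + vz**2))
    return energies

kT = 1.0  # work in units where k_B T = 1
E = sample_kinetic_energies(100_000, kT)
mean_KE = sum(E) / len(E)
# mean_KE is close to 1.5, i.e. (3/2) k_B T, with a small statistical
# fluctuation; individual entries of E are spread widely around it.
```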

Joe Redish 1/29/12


Article 479
Last Modified: April 1, 2019