# Fluctuations -- How energy is distributed

#### Prerequisites

- Entropy -- implications of the 2nd law of thermodynamics
- Expected values
- Example: Degrees of freedom
- Sharing -- a way to think about entropy

Our emerging picture of energy flow in thermal systems is that in a system with many interacting molecules there are lots of places to put energy. (See Example: Degrees of Freedom.) As a result of the interactions of the molecules (collisions, forces), the energy is "tossed around" — exchanged back and forth from one degree of freedom to another. One result we found was the 2nd law of thermodynamics: energy tends to move spontaneously so that, on the average, it gets shared equally among all the degrees of freedom ("equipartition"). (See Thermodynamic Equilibrium and Equipartition.)

This picture, however, may lead us into an apparent contradiction between two heuristics we have for thinking about entropy:

1. *Increasing entropy means that energy tends to get "spread around and shared" more evenly.*
2. *Increasing entropy means that we are losing information about where the energy is.*

Heuristic 2 comes from the idea that more entropy corresponds to more microstates — more ways that the energy can be distributed. But if energy is shared uniformly, then there is an equal amount of energy in each degree of freedom. That would mean that at equilibrium we would know *exactly* where the energy was. If we had $M$ degrees of freedom and an energy $U$, we would know that each DoF had energy $U/M$. This would be a specific microstate and therefore would correspond to an entropy of 0 — perfect order.

The problem here is that a critical phrase that was included in the first paragraph was omitted from heuristic 1: "on the average".

Because the system is continually interacting and tossing energy around, sometimes you get more energy than the average in a degree of freedom, sometimes less. These rapid variations in the amount of energy in any degree of freedom are called **fluctuations**.

"Thermodynamic equilibrium" is a dynamic equilibrium, not a static one. Just as at chemical equilibrium, a reaction that can go both ways doesn't stop — it just goes in both directions at equal rates. At thermodynamic equilibrium, energy exchange doesn't stop, just "on the average" you get as much energy flowing into any degree of freedom as you get flowing out.
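This dynamic picture can be illustrated with a toy simulation (a sketch we add here for concreteness; the function name and all parameter values are made up): a fixed number of energy quanta are tossed at random among $M$ degrees of freedom. The total energy is conserved, and the time average in each degree of freedom settles near $U/M$, even though any single snapshot fluctuates away from perfect sharing.

```python
import random

def simulate(M=10, quanta_per_dof=5, steps=20000, seed=1):
    """Toy model of equipartition: U = M * quanta_per_dof energy quanta
    are tossed randomly among M degrees of freedom. Each step moves one
    quantum from a randomly chosen (nonempty) DoF to a random DoF.
    Returns the final snapshot and the time-averaged energy per DoF."""
    random.seed(seed)
    energy = [quanta_per_dof] * M      # start with energy shared exactly evenly
    totals = [0] * M                   # running sums for the time averages
    for _ in range(steps):
        giver = random.randrange(M)
        while energy[giver] == 0:      # only a DoF that has energy can give some up
            giver = random.randrange(M)
        taker = random.randrange(M)
        energy[giver] -= 1
        energy[taker] += 1
        for i in range(M):
            totals[i] += energy[i]
    averages = [t / steps for t in totals]
    return energy, averages
```

Running this, the final snapshot is generally *not* even — some DoFs hold more quanta, some fewer — while the total stays fixed and the time averages cluster around `quanta_per_dof`. That is the "dynamic, not static" equilibrium in miniature.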

This means that if you look at any particular instant of time, any degree of freedom might have more or less energy than the average, continually fluctuating. It is the statistics of these fluctuations, not a fixed value of the energy, that characterizes equilibrium. At equilibrium, it's not that the energy in each degree of freedom is the same, but rather

*At thermodynamic equilibrium, the probability of finding a certain amount of energy is the same in each degree of freedom.*

That is, there is a function, *P*(*E*), that tells you the probability that any given degree of freedom will, at any instant, have an energy *E*. This energy could be any value at all: 0 or a lot. The average energy in a single degree of freedom is found by weighting each *E* by the probability of finding that energy:

$$\langle E \rangle = \int^{\infty}_0 E P(E) dE$$
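As a numerical sketch of this average, assuming the decaying-exponential form $P(E) = (1/kT)\,e^{-E/kT}$ previewed in the next paragraph (the function name and parameter values here are illustrative, not from the original text):

```python
import math

def mean_energy(kT=1.0, n=200000, upper=50.0):
    """Numerically evaluate <E> = integral of E * P(E) dE (midpoint rule),
    assuming the exponential distribution P(E) = (1/kT) exp(-E/kT).
    For this P(E) the exact answer is kT; the upper cutoff stands in
    for infinity, since the integrand is negligible beyond it."""
    dE = upper / n
    total = 0.0
    for i in range(n):
        E = (i + 0.5) * dE                 # midpoint of the i-th slice
        P = math.exp(-E / kT) / kT         # assumed probability density
        total += E * P * dE
    return total

# mean_energy(kT=1.0) comes out close to 1.0, i.e. <E> ≈ kT.
```

Doubling `kT` doubles the result, which matches the claim below that higher temperature means more energy per degree of freedom on average.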

The probability function, $P(E)$, is critical to understanding thermodynamic equilibrium. It is called **the Boltzmann factor**, and in our follow-on readings we will see that it is a decaying exponential: the probability of finding a low energy is high, while the probability of finding a high energy is small and goes to zero as the energy gets very large. The probability depends on the temperature. The higher the temperature, the more likely you are to find higher energies in any degree of freedom. (That makes sense!)
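The temperature dependence can be made concrete. For a decaying exponential of this form, the probability of finding an energy above some threshold $E_0$ integrates to $e^{-E_0/kT}$, which grows toward 1 as the temperature rises (a small sketch; the function name is made up):

```python
import math

def prob_energy_exceeds(E0, kT):
    """Probability of finding an energy above E0, assuming the exponential
    form P(E) = (1/kT) exp(-E/kT): the integral from E0 to infinity
    evaluates to exp(-E0 / kT)."""
    return math.exp(-E0 / kT)

# Doubling the temperature raises the chance of finding energy above E0:
# prob_energy_exceeds(1.0, 2.0) is larger than prob_energy_exceeds(1.0, 1.0).
```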

In summary, the critical idea that resolves the apparent contradiction between heuristics 1 and 2 is this:

*At thermodynamic equilibrium, the energy in each degree of freedom fluctuates rapidly, sometimes having more, sometimes less. But at equilibrium the probability distribution of finding energy in any particular degree of freedom is the same. You don't know where energy is going to be at any particular instant, but you do know both the time average of the fluctuating energy and how that energy will fluctuate in each degree of freedom.*

To understand thermal equilibrium we have to see simultaneously that each degree of freedom has the same average energy and the same distribution of fluctuations. On the average we have perfect knowledge, but at any given instant we have no knowledge at all of how the energy will be distributed.

Joe Redish 2/7/16

#### Follow-ons

Last Modified: April 9, 2019