# Example: Arranging energy and entropy

#### Prerequisites

- Entropy -- implications of the 2nd law of thermodynamics
- Probability
- Sharing -- a way to think about entropy

## Understanding the situation

The 2nd law of thermodynamics says that energy will tend to spontaneously distribute itself so that it is, on the average (and this phrase is very important — see Fluctuations -- How energy is distributed), spread equally to all degrees of freedom. (See Sharing -- a way to think about entropy.) It is not easy to see what this means, so let's consider a problem and work out a simple example in detail.

The entropy of a particular macrostate is proportional to the logarithm of the number of microstates corresponding to that macrostate: $S = k_B \ln W$. To see what that means and why entropy tells us about how a system will spontaneously tend to redistribute its energy, let's consider a "toy model" — one that is sufficiently simplified that we can understand clearly the mechanism behind the mathematics.

One of the reasons that it is difficult to understand entropy as about energy distribution is that many of the degrees of freedom we deal with — kinetic energy in three directions, energy of rotation,... — are continuous. (See Example: Degrees of freedom.) The energy in them can take any value. This makes it hard to see that entropy is actually about *counting* — counting the number of ways energy can be distributed. The math showing this involves breaking the continuum up into bits, counting the arrangements of those bits, and then taking a limit as the size of the bits goes to zero. This involves more math than we would like to get into at this point.

Fortunately, some degrees of freedom are not continuous: they are discrete — their energies can only take on specific values. Here's an example that has relevance to how an MRI works.

## A real model: The alignment of the magnetic moment of protons in a magnetic field

The protons that make up the nucleus of hydrogen atoms are little bar magnets. Because of the laws of quantum mechanics, if they are placed in a magnetic field they can only line up either with or against the magnetic field. If a proton is lined up with the magnetic field, it has a lower energy, which (since we are only discussing magnetic energy) we can choose as our zero of energy. (This is the way the little bar magnet "wants" to be: its lowest energy state.) If it is lined up in the opposite direction from the magnetic field, it has a higher energy, which we will call $E_0$.*

If you start with a bunch of protons in a magnetic field and you have some energy, you can distribute it by flipping some number of protons against the field. In the figure at the left, we have 6 protons, all aligned with the magnetic field, so the (magnetic) energy of the system is 0. In the figure at the right, we have flipped 3 of the protons to be anti-aligned with the magnetic field, so the energy of the system is $3E_0$.

The orientation of the proton in a magnetic field is not only a discrete degree of freedom; it can also only hold one "packet" of energy. It either has the energy $E_0$ or it has none. It can't hold 2 or 3 packets. It's like an on-or-off switch.

## A toy model: An abstract sample problem

Now that we have (we hope!) convinced you that a model with discrete packets of energy is "not just a toy" but also useful in real physical situations, let's solve a typical problem. But we'll write it in a more general way, without specifying what the degree of freedom actually corresponds to.

1. Consider a system consisting of four discrete degrees of freedom, each of which can only hold 1 packet of energy. Suppose we have 2 packets of energy to distribute. This system of 4 bins with 2 packets is a *macrostate* — it's a system with a given amount of energy. How many *microstates* — states corresponding to specific ways of distributing those energy packets — correspond to that macrostate?

What is the entropy of the macrostate with 4 bins and 2 energy packets?

2. What if we have two adjacent identical systems, A and B, of 4 bins each? What is the entropy of the state with 4 packets of energy all residing in A? Compare that to the entropy of the state with 2 packets of energy in A and 2 packets in B.

## Solving this problem

1. How many ways can we put 2 packets of energy into 4 bins? Let's label the bins 1, 2, 3, and 4. We can only have one packet in a bin or none, so counting is pretty straightforward.

We can put the first packet into our bins in any of 4 ways. The second packet can't go in the bin that we have used, so we have only three places to put it. So we have 12 (= 4 × 3) ways in which we can place our 2 packets of energy into 4 places.

But energy is energy. If we have one packet in bin 1 and one packet in bin 3, it doesn't matter whether we put a packet into 1 first and 3 second or the other way round. Counting 4 × 3 counts both those orders separately, so we have counted each arrangement twice. The real result is half of 4 × 3, or 6.

We can easily enumerate them: We can have the bins occupied by an energy packet as: (12), (13), (14), (23), (24), and (34). A total of 6, just as we calculated.
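This enumeration is easy to check with a few lines of Python (a sketch added for illustration, not part of the original problem), using `itertools.combinations` to list every way of choosing 2 of the 4 labeled bins to hold a packet:

```python
from itertools import combinations

# Label the 4 bins 1..4 and list every way to choose 2 of them
# to hold one energy packet each (order doesn't matter).
microstates = list(combinations([1, 2, 3, 4], 2))
print(microstates)       # [(1, 2), (1, 3), (1, 4), (2, 3), (2, 4), (3, 4)]
print(len(microstates))  # 6, matching 4 x 3 / 2
```

Because `combinations` ignores order, the double-counting discussed above never arises; it directly produces the 6 distinct microstates.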

So there are 6 ways (microstates) of making a state with 2 energy packets in 4 bins (a macrostate). Since the entropy is

$$S = k_B \ln W$$

where $W$ is the number of microstates, we have $S = k_B \ln 6 \approx 1.79\,k_B$. Note that entropy has the units of $k_B$: energy per kelvin.

2. For the second situation, our macrostate is specified by how many energy packets are in A and how many in B. (This is like specifying the temperature of each object.) Our number of microstates is the number of ways to get that result.

For the first situation, all 4 packets in A, there is only one way we can do it. Since each bin can only hold one packet, we have to put one in each. Since there is only one way, the entropy of this macrostate is

$$S = k_B \ln W = k_B \ln 1 = 0$$

For the second situation, since the order doesn't matter (all energy packets being equivalent), we can put 2 packets into A first and then 2 packets into B. In part 1 we calculated that there were 6 ways to put 2 packets into 4 bins. So there are 6 ways to put 2 packets into A and 6 ways to put 2 packets into B; you can enumerate them just as we did in part 1. How many total ways are there? Just the product, 6 × 6 = 36.

We can easily see this by looking at what a particular microstate looks like. For A we have a list of 6 possibilities: (12), (13), (14), (23), (24), (34). For B we have a similar list. Any AB microstate is a specification like A(13)B(24). Clearly there are 6 × 6 possibilities. So the entropy of this macrostate is

$$S = k_B \ln W = k_B \ln 36 \approx 3.58\,k_B$$

It should be no surprise that this is twice the entropy of the state found in part 1, since $\ln 36 = \ln 6^2 = 2 \ln 6$.
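The multiplication of microstate counts (and hence the doubling of the entropy) can be confirmed numerically; this short Python check is an illustrative addition, not part of the original text:

```python
import math

W_A = 6              # microstates for 2 packets in A's 4 bins (from part 1)
W_B = 6              # the same count for B
W_total = W_A * W_B  # independent choices for A and B multiply

# Entropy in units of k_B: S / k_B = ln W
S_over_kB = math.log(W_total)
print(W_total, round(S_over_kB, 2))  # 36 3.58
```

The key point is the logarithm: microstate counts of independent systems *multiply*, so their entropies *add*, which is exactly why $\ln 36 = 2 \ln 6$.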

This tells us that the system can spontaneously go from the state with all the energy in A (A hot, B cold; entropy = 0) to a state where A and B have equal energies (same temperature; entropy $\approx 3.58\,k_B$).

This calculation tells us that the entropy of the macrostate where the energy is equally divided between the two systems is higher than the entropy of the macrostate where all the energy is in one system. Only 1 microstate corresponds to all the energy being in A, while 36 correspond to there being equal amounts of energy in each. This tells us that for these 8 bins, as energy continually fluctuates from bin to bin, it is 36 times as likely for us to find the energy equally divided as to find it all in A.

This toy model shows how entropy works. Even though it's purely probabilistic, and there is some chance of finding a higher energy density (temperature) in one part of the system than in another, it's not very likely. And the probability of a non-uniform temperature drops precipitously as the number of particles rises.
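To see how quickly that probability drops, we can scale the toy model up: two systems of $N$ one-packet bins sharing $N$ packets, comparing the equal split to all the energy on one side. (A hypothetical extension of the model above, sketched in Python.)

```python
from math import comb

# Two systems of N one-packet bins sharing N packets.
# W(all in A) = 1; W(equal split) = comb(N, N//2) ** 2.
for N in (4, 10, 20, 50):
    ratio = comb(N, N // 2) ** 2  # how much likelier the equal split is
    print(N, ratio)
```

For $N = 4$ the ratio is our familiar 36, but by $N = 10$ it is already 63,504, and it keeps growing explosively. With macroscopic numbers of particles ($N \sim 10^{23}$), finding all the energy on one side is, for all practical purposes, impossible.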

* This has the value $\mu B$, where $\mu$ is the magnetic moment of the proton and $B$ is the strength of the magnetic field.

Joe Redish 2/4/16

Last Modified: May 12, 2019