# Why entropy is logarithmic

#### Prerequisites

In Entropy -- implications of the 2nd law of thermodynamics, we defined the entropy $S$ of a particular macrostate of a system as equal to $k_B \ln W$, where $W$ is the number of possible arrangements of the system (microstates) corresponding to that macrostate. But why? Why not just say that entropy **is** the number of arrangements? Let's think through why it has to be defined this way.

We want to define entropy to be an *extensive property*. This means that if I have two systems A and B, the total entropy should be the entropy of A plus the entropy of B. This is like mass (2 kg + 2 kg = 4 kg), and **not** like an *intensive property* such as temperature. (If you combine two systems that are each at 300 K, you have a system at 300 K, **not** at 600 K!)

What happens to the number of possible arrangements when you combine two systems? If system A can be in 3 different arrangements and system B can be in 5 different arrangements, then there are $3 \times 5 = 15$ possible combinations. They multiply!
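A quick sketch can make the counting concrete. Here's a minimal Python example (the state labels are made up for illustration) that enumerates every pairing of an A-arrangement with a B-arrangement:

```python
from itertools import product

# Hypothetical labels for the arrangements of each system
states_A = ["a1", "a2", "a3"]                # 3 arrangements of system A
states_B = ["b1", "b2", "b3", "b4", "b5"]    # 5 arrangements of system B

# Each pairing (one A-state, one B-state) is one distinct
# arrangement of the combined system, so the counts multiply.
combined = list(product(states_A, states_B))
print(len(combined))  # 15 = 3 * 5
```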

So we can't just define entropy as the number of possible arrangements, because we need the entropy to **add**, not **multiply**, when we combine two systems.

How do you turn multiplication into addition? Just take the logarithm: $3 \times 5 = 15$, but $\ln 3 + \ln 5 = \ln 15$.

So that's why entropy is defined as a constant times $\ln W$. $W$ (the number of arrangements) is a dimensionless number, so $\ln W$ is too.

The constant out in front could be any constant, but we use Boltzmann's constant, $k_B = 1.38 \times 10^{-23} \mathrm{\ J/K}$. When we get to Gibbs free energy, we'll see that it's very convenient for entropy to have units of energy/temperature, which is exactly what multiplying by $k_B$ gives us.
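Putting it all together, we can check numerically that the definition $S = k_B \ln W$ really does add when microstate counts multiply. A minimal sketch, reusing the 3-arrangement and 5-arrangement systems from above:

```python
import math

k_B = 1.38e-23  # Boltzmann's constant, in J/K

def entropy(W):
    """S = k_B * ln(W), where W counts the microstates of a macrostate."""
    return k_B * math.log(W)

W_A, W_B = 3, 5
# Combining systems multiplies the counts, but the entropies add:
S_combined = entropy(W_A * W_B)
print(math.isclose(S_combined, entropy(W_A) + entropy(W_B)))  # True
```

If we had instead defined entropy as $W$ itself, the combined value would be $15$, not $3 + 5 = 8$, so the quantity would not be extensive.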

Workout: Why entropy is logarithmic

Last Modified: April 7, 2021