Further Reading
- Example: Arranging energy and entropy
- Example: Calculating entropy by counting microstates
- Example: Entropy and heat flow
Calculating entropy
Prerequisites
Entropy is a macroscopic thermodynamic quantity that tells us how randomly the energy in a system is distributed among its degrees of freedom. The more randomly it is distributed, the higher the entropy. Since random thermal motions tend to make things more random rather than less, entropy is a function that can tell us how macroscopic systems will spontaneously evolve (see The 2nd Law of Thermodynamics: A Probabilistic Law). Entropy bridges the microscopic and the macroscopic and can be calculated from both points of view. The way this connection works is described in our page Example: Arranging energy and entropy.
We have two ways to calculate entropy: a macroscopic way and a microscopic way.
A macroscopic equation for entropy
An important part of what entropy is about is thermal equilibrium: how energy moves spontaneously from one part of a system to another when the temperatures of the two parts are different. For this, we can take a macroscopic view of the system and calculate the entropy change of a part of the system that gains or loses thermal energy by the equation
$$ΔS = Q/T$$
where $Q$ is the heat absorbed by the system and $T$ is the temperature of the system (assumed to stay approximately constant while the heat is transferred). How to use this is illustrated in our page Example: Entropy and heat flow.
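As a sketch of how this works in practice, here is a short calculation (with made-up numbers) of the entropy changes when heat flows from a hot block to a cold one. Each block's temperature is assumed to stay approximately constant during the transfer:

```python
def entropy_change(Q, T):
    """Entropy change dS = Q/T for heat Q (in J) absorbed
    at an approximately constant temperature T (in K)."""
    return Q / T

# Hypothetical example: 100 J flows spontaneously from a
# hot block at 400 K to a cold block at 300 K.
Q = 100.0
dS_hot = entropy_change(-Q, 400.0)   # hot block LOSES heat: entropy decreases
dS_cold = entropy_change(+Q, 300.0)  # cold block GAINS heat: entropy increases

dS_total = dS_hot + dS_cold
print(dS_hot, dS_cold, dS_total)
```

The cold block gains more entropy ($+Q/300$ J/K) than the hot block loses ($-Q/400$ J/K), so the total entropy change is positive, consistent with the second law: heat flows spontaneously from hot to cold.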
While this is fine if the only issue is moving thermal energy, entropy is a lot more general. For this, we need a more microscopic view.
A microscopic equation for entropy
The microscopic way to calculate the entropy of a system that is in a macrostate A is given by the equation
$$S_A = k_B \ln W_A$$
where $k_B$ is Boltzmann's constant ($= 1.38 \times 10^{-23} \mathrm{J/K} = 0.86 \times 10^{-4} \mathrm{eV/K}$) and $W_A$ is the number of microstates corresponding to the macrostate A.
This version makes the connection between a microscopic picture — where the microscopic detail of the energy distribution in the system is specified — and the macroscopic description. (This implicitly emphasizes that entropy is a property of a macrostate. It makes no sense to ask what the entropy of a microstate is.) How to use this is illustrated in our page Example: Calculating entropy by counting microstates.
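To make the counting concrete, here is a small sketch (the specific system and numbers are our own illustration, not from the page above) using the standard result that $q$ energy quanta shared among $N$ oscillators can be arranged in $W = \binom{q+N-1}{q}$ ways:

```python
from math import comb, log

k_B = 1.38e-23  # Boltzmann's constant in J/K

def entropy_from_microstates(W):
    """S = k_B ln W for a macrostate with W microstates."""
    return k_B * log(W)

# Illustrative system: N = 3 oscillators sharing q = 4 energy quanta.
# Standard counting result: W = C(q + N - 1, q) microstates.
N, q = 3, 4
W = comb(q + N - 1, q)           # C(6, 4) = 15 distinct arrangements
S = entropy_from_microstates(W)  # entropy of this macrostate, in J/K
print(W, S)
```

Even for this tiny system the logic is the same as for macroscopic ones: specify the macrostate (total energy), count the microstates consistent with it, and take $k_B \ln W$.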
Microstate counting can be a bit tricky because of an issue that comes from quantum physics: indistinguishability. We learned in our page on Degrees of Freedom that rotating a single atom does not produce a distinguishable state, so we don't count the spinning of a single atom as a degree of freedom. Similarly, we cannot distinguish the rotation of a diatomic molecule about its symmetry axis.
From a classical point of view we might imagine atoms as little blocks of something that can be marked. But in quantum physics, it's clear that all atoms (or molecules) of the same kind are indistinguishable. Therefore, exchanging them does not lead to a new arrangement. The effect of this on counting is called the Gibbs' Paradox and is described in our page Example: Entropy of mixing and the Gibbs' Paradox.
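Here is a toy illustration of the overcounting (the model is our own assumption for illustration): place $N$ particles among $M$ cells, at most one per cell. If the particles were labeled, ordered placements would count; since identical particles cannot be told apart, every set of $N!$ exchanges is really one arrangement:

```python
from math import comb, factorial

# Toy model (illustrative assumption): N particles in M >> N cells,
# at most one particle per cell.
M, N = 10, 3

# Counting as if particles were labeled: M * (M-1) * ... ordered placements.
W_labeled = factorial(M) // factorial(M - N)   # 10 * 9 * 8 = 720

# Identical particles: exchanging them gives no new state,
# so divide out the N! permutations of the labels.
W_identical = W_labeled // factorial(N)        # 720 / 3! = 120

print(W_labeled, W_identical, comb(M, N))      # the corrected count is just C(M, N)
```

The division by $N!$ is exactly the correction that resolves the Gibbs' Paradox: without it, "mixing" two samples of the same gas would appear to increase the entropy, which it physically does not.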
Joe Redish 2/15/18
Follow-ons
Last Modified: April 7, 2021