# Permutations and combinations

#### Prerequisite

In our reading on probability, we found that counting the number of different ways that we could get a result (number of microstates) that had the same more general result (macrostate) led to our being able to find the probability of a given macrostate. (For example: the fraction of the number of distinct strings of coin flips that lead to half heads and half tails gives the probability of getting that result.)

When we apply this idea to the concept of energy and entropy, the calculations are a bit less obvious. We rely on the mathematical tool of Permutations and Combinations. While many of you have studied this in a high school math class, it's worth reviewing the basic ideas and notations.

## Permutations without repetition

Here's the basic question:

Suppose we have a set of $N$ distinct objects labeled $1, 2, 3, \dots, N$. How many different ways are there to arrange those $N$ objects?

Although some classes that teach permutations just give a result and ask you to memorize it, it's much better to understand how you get it. It's not that hard and it will be easy to figure out the result in case you forget it.

We just work our way through, counting the number of different ways we can select.

Since we have $N$ objects, we want to create strings in which each number appears once and only once — strings that are a rearrangement of the original set. Let's go through the process of selecting an arrangement and count how many ways there are to do it.

We have $N$ different ways of selecting the first object. Let's select one and set it aside.

Now we only have $N-1$ objects left to choose from. We have $N-1$ different ways of selecting the second object. The total number of ways we could have selected the first two objects is $N \times (N-1)$ since we could put our second object with any of the $N$ objects that we selected first.

Let's see how this works just for concreteness. Suppose we have 3 objects: 1, 2, and 3. Our first results are 1, 2, and 3. Now let's select a second object. If we had selected 1 as our first choice, we only have 2 or 3 to choose from: two possible results, (12) and (13). If we had selected 2 as our first choice, we only have 1 or 3 to choose from: two possible results, (21) and (23). Similarly, if we had selected 3 first we only have 1 or 2 to choose from, so we get the two possible results (31) and (32). So our only possible strings of 2 numbers selected from the three are (I've grouped them by the first number chosen):

(12)(13)     (21)(23)     (31)(32)

The "$N \times (N-1)$" structure ($3 \times 2$) is clear.

As we go on, we see we have one fewer object to select from each time, and each selection multiplies the number of strings by the number of ways it can be made. If we select 4 objects the result would be $N \times (N-1) \times (N-2) \times (N-3)$.

In general, in the case of counting the number of permutations without repetition — the number of ways of choosing $n$ objects out of a larger collection of $N$ objects (as before, where we chose 2 objects out of a collection of 3) — we get:

$$P(N,n) = N \times (N-1) \times ... \times (N-n+1)$$

(To see why we have the "+1" in the final parentheses, consider our example. In that case, $N = 3$ and $n = 2$, and we know the result is $3 \times 2$. So we get $P(3,2) = 3 \times (3-2+1) = 3 \times 2$.)
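We can check this counting argument numerically. Here's a minimal sketch in Python (the function name `perms_without_repetition` is ours, chosen for illustration), comparing the product $N \times (N-1) \times \dots \times (N-n+1)$ against a brute-force enumeration using the standard library's `itertools.permutations`:

```python
from itertools import permutations

def perms_without_repetition(N, n):
    """Count P(N, n) by multiplying N * (N-1) * ... * (N-n+1)."""
    count = 1
    for k in range(N, N - n, -1):
        count *= k
    return count

# Our worked example: choosing 2 objects out of 3.
print(perms_without_repetition(3, 2))  # 6

# Brute force: list every length-2 string with distinct entries
# drawn from {1, 2, 3} and count them.
print(len(list(permutations(range(1, 4), 2))))  # also 6
```

Enumerating the permutations directly and counting them is a good sanity check on the multiplication argument whenever $N$ is small.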

For both permutations and combinations, the factorial function is useful. For any positive integer $N$, the factorial of $N$ is defined as the product of all the integers from $N$ down to $1$:

$$N! = N \times (N-1) \times (N-2) \times (N-3) \times \dots \times 1$$

For consistency, it turns out that one needs to define

$$0! = 1.$$

With this notation, we can see that $P(N,n)$ is the full factorial of $N$ divided by the factorial of $N-n$ (dividing cancels the factors below $N-n+1$):

$$P(N,n) = \frac{N!}{(N-n)!}$$

Check it out for some specific case so you see why this works.
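Here's one such check, a quick Python sketch using the standard library's `math.factorial` (the function name `P` just mirrors the notation in the text):

```python
from math import factorial

def P(N, n):
    """P(N, n) = N! / (N - n)!, computed with integer division."""
    return factorial(N) // factorial(N - n)

# The worked example from the text: choosing 2 objects out of 3.
print(P(3, 2))  # 3 * 2 = 6
# The case n = N relies on the convention 0! = 1, giving P(N, N) = N!.
print(P(4, 4))  # 4! = 24
```

Note that the $n = N$ case shows why the convention $0! = 1$ is needed: arranging all $N$ objects should give $N!$ orderings, and the formula only delivers that if the denominator is $1$.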

## Permutations with repetition

If we allow repetition, the answer is a lot simpler. Suppose we have $N$ types of objects and we want to see how many ways we can select $n$ of them but we can have as many duplications as we want.

We can choose the first object in $N$ different ways, the second also in $N$ different ways, the third in $N$ different ways, and so on. So our result will be $N \times N \times \dots \times N$ ($n$ times), or $N^n$.
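This count can be sketched in Python as well, using the standard library's `itertools.product` to enumerate all selections with repetition allowed:

```python
from itertools import product

N, n = 3, 2
# All length-n strings over {1, ..., N}, repeats allowed.
selections = list(product(range(1, N + 1), repeat=n))
print(len(selections))  # N**n = 9
print(selections[:3])   # [(1, 1), (1, 2), (1, 3)]
```

The `repeat=n` argument is exactly the "with repetition" rule: each of the $n$ slots is filled independently from the full set of $N$ objects.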

Compare the result for the case of 3 objects of which 2 are selected. If we don't allow repetition, we get these possible pairs:

(1,2)     (1,3)

(2,1)     (2,3)

(3,1)     (3,2)

Six choices, just as given by the formula: $3 \times 2 = 6$.

If repeats are allowed, we get these possible pairs

(1,1)     (1,2)     (1,3)

(2,1)     (2,2)     (2,3)

(3,1)     (3,2)     (3,3)

Nine choices, just as given by the formula: $3^2 = 9$.

## Combinations

Now suppose that we care about which objects we select but don't care what order they are in. In our example, we got (1,2) and (2,1) and considered them to be different arrangements. What if we want to consider them the same?

Why might we do that? Well, suppose we were choosing up sides for a game from a group of possible players. What would matter would be the team we selected, not what order we chose them. Where we use it is in studying entropy: there we analyze how many different ways energy can be arranged in a system. Since "energy is energy" it doesn't matter in which order we choose packets of energy, just how many go in each bin.

This is pretty easy to construct. We know how many ways we can select $n$ objects out of $N$: it's just $P(N,n)$. We also know how many different ways we can rearrange those $n$ objects: $n!$. So our result will be:

$$C(N,n) = \frac{P(N,n)}{n!}$$

Putting in our expression for $P(N,n)$, this has a nice symmetry:

$$C(N,n) = \frac{N!}{n!(N-n)!}$$

Checking with our case of $N = 3$ and $n = 2$, our formula gives $3!/(2! \, 1!) = 3$. That makes sense: the six permutations in our table for $P(3,2)$ pair up — (1,2) with (2,1), (1,3) with (3,1), and (2,3) with (3,2) — so each combination is counted twice, leaving three distinct combinations. Check it for yourself with $C(4,2)$ to see that it works correctly.
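That check can also be run in code. A brief Python sketch comparing the formula with direct enumeration via `itertools.combinations` (the function name `C` again just mirrors the text's notation):

```python
from itertools import combinations
from math import factorial

def C(N, n):
    """C(N, n) = N! / (n! (N - n)!), computed with integer division."""
    return factorial(N) // (factorial(n) * factorial(N - n))

# Our worked example: unordered pairs from {1, 2, 3}.
print(C(3, 2))                           # 3
print(list(combinations([1, 2, 3], 2)))  # [(1, 2), (1, 3), (2, 3)]
# The case suggested in the text for you to check by hand.
print(C(4, 2))                           # 6
```

Notice that `itertools.combinations` emits each selection only once, in sorted order — the code-level counterpart of declaring (1,2) and (2,1) to be the same choice.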

Note that $C(N,n)$ is sometimes written as $\binom{N}{n}$.

Joe Redish 3/20/19

Article 565