“If a currency system could only have two coins, what should the values of those coins be?” – from Numberplay. Implicit in the question (the way it’s stated there) is that there are 100 cents in a dollar; I’m going to generalize to a dollar consisting of k cents.

Let’s say that we’re requiring that you be able to make change for any number of cents from 0 to k-1, and that we’d like to do this with the smallest possible number of coins, on average. In order to make change for one cent we will need a one-cent coin. So there’s really only one parameter to play with — we’ll have two coins, of values 1 and n — and we want to choose n to minimize the average number of coins needed. We’ll assume that every possible amount of change from 0 to k-1 is equally likely. (This is probably not true, but the way in which it’s not true should evolve in tandem with the currency system. For example, in the US we have a 25-cent coin, and so lots of prices are multiples of 25 cents. In the eurozone the comparable coin is a 20-cent coin; are prices that are multiples of 20 cents common there?)

So we can break this down into, on average, how many 1-cent coins we’ll need and how many n-cent coins we’ll need when making change. On average we’ll need (n-1)/2 1-cent coins, if n happens to be a factor of k. For example, if n is 5 (that is, the other coin is a nickel) then on average we need 2 pennies, since we’re equally likely to need 0, 1, 2, 3, or 4 pennies. Let’s ignore the -1 and call this n/2. And to make change for m cents we’ll need about m/n n-cent coins. On average we’re making change for k/2 cents, so on average we need k/(2n) n-cent coins.
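As a sanity check on those averages — a quick sketch in Python rather than R — take k = 100 and n = 5: the exact mean number of pennies is (n-1)/2 = 2, and the exact mean number of nickels is (k-n)/(2n) = 9.5, close to the k/(2n) = 10 approximation.

```python
k, n = 100, 5  # the nickel example from the text

# pennies needed for each amount 0..k-1 is the residue mod n
pennies = [m % n for m in range(k)]
print(sum(pennies) / k)  # exact mean: (n - 1) / 2 = 2.0

# n-cent coins needed is the quotient
nickels = [m // n for m in range(k)]
print(sum(nickels) / k)  # exact mean: (k - n) / (2 * n) = 9.5, vs. the k/(2n) = 10 approximation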

So we want to minimize $k/(2n) + n/2$ as a function of n. Differentiating with respect to n gives $-k/(2n^2) + 1/2$; this is zero when $n = \sqrt{k}$. So if you only have two coins, you want a one-cent coin and a $\sqrt{k}$-cent coin. Then on average you’d need $\sqrt{k}/2$ pennies and $\sqrt{k}/2$ n-cent coins, or a total of $\sqrt{k}$ coins, when making change.
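The calculus treats n as continuous, but restricting to whole-number n doesn’t change the answer for k = 100; a quick check in Python:

```python
import math

k = 100
cost = lambda n: k / (2 * n) + n / 2  # approximate average coin count
best = min(range(2, k), key=cost)
print(best, math.isclose(best, math.sqrt(k)))  # -> 10 True
```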

In the US case this means you want a penny and a dime. As it turns out, an 11-cent coin would work just as well. The R function

f <- function(k, n) sum((0:(k-1)) %% n + floor((0:(k-1)) / n))

will give you the total number of coins needed to make each of 0, 1, …, k-1 cents out of 1-cent and n-cent coins. Interestingly, f(100, 10) and f(100, 11) both return 900 — that is, we’d need 900/100 = 9 coins, on average, whether we had pennies and dimes or pennies and 11-cent coins. For practical purposes, of course, we also want coin values that are convenient for mental arithmetic.
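The same computation translates directly into Python, and scanning over every possible n confirms that 9 coins per transaction is the best a two-coin system can do when k = 100:

```python
def total_coins(k, n):
    """Total coins needed to make each of 0, 1, ..., k-1 cents from 1- and n-cent coins."""
    return sum(m % n + m // n for m in range(k))

print(total_coins(100, 10), total_coins(100, 11))       # -> 900 900
print(min(total_coins(100, n) for n in range(2, 100)))  # -> 900
```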

It seems reasonable to guess that if you have three coins you’d want coins worth roughly 1, $k^{1/3}$, and $k^{2/3}$ cents, and in general that denominations should be evenly spaced on a logarithmic scale; this seems to be the principle that, say, euro coins and notes are based on. These are valued at 1 cent, 2 cents, 5 cents, and powers of ten times these. It’s also the principle that US currency would be based on, except for the historical factors that have led to people not using half-dollar coins and \$2 bills. But it takes a bit more thought to figure out the optimal ways to make change in that situation, and it’s more than a one-liner to do the computation… any thoughts?
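Part of the extra thought is that with three coins the greedy rule — use as many of the biggest coin as you can — is no longer guaranteed to be optimal, so you need dynamic programming over amounts, plus a brute-force search over the two non-penny denominations. One possible sketch in Python (not claiming this is the fastest way):

```python
from itertools import combinations

def avg_coins(coins, k=100):
    """Average, over amounts 0..k-1, of the fewest coins needed, by dynamic programming."""
    INF = float("inf")
    dp = [0] + [INF] * (k - 1)  # dp[m] = fewest coins summing to m cents
    for m in range(1, k):
        dp[m] = min(dp[m - c] + 1 for c in coins if c <= m)
    return sum(dp) / k

# brute force over the two denominations besides the penny
best = min(combinations(range(2, 100), 2), key=lambda pair: avg_coins((1,) + pair))
print(best, avg_coins((1,) + best))
```

With pennies and dimes alone the average is 9, and any sensible third coin improves on that, so whatever pair the search returns beats the two-coin optimum.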

I’m looking for a job, in the SF Bay Area. See my LinkedIn profile.