An n-gram is a word of length n that we can create from a set of letters. With two letters, T and H, we can form these 3-grams: TTT, TTH, THT, THH, HTT, HTH, HHT, and HHH. We will let T stand for tails and H for heads, so that a 3-gram can represent an outcome if we flip a coin 3 times.
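As a quick sanity check, we can enumerate these 3-grams in Python (used here only for illustration):

```python
from itertools import product

# All 3-grams over the two letters T and H: there are 2^3 = 8 of them
three_grams = [''.join(g) for g in product('TH', repeat=3)]
print(three_grams)
# ['TTT', 'TTH', 'THT', 'THH', 'HTT', 'HTH', 'HHT', 'HHH']
```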

Penney’s game concerns two good friends and their n-grams. The game is simple and seemingly fair. Soon, however, Player 1 will call off their long friendship and leave. We will find that, whenever n ≥ 3, Player 2 is always more likely to win.

# 1. Problem description

Let n = 3. Player 1 selects an n-gram, then Player 2 selects another. They toss a fair coin repeatedly and record the n most current outcomes after each toss. The player whose n-gram matches the n most current outcomes first wins.

Show that, for any choice by Player 1, Player 2 can select an n-gram that is more than 50% likely to occur first.

Let’s make the game more concrete. Suppose Player 1 selects HHH, then Player 2 selects THH. We can use the digits of π as our fair coin. These digits are known to be nearly uniformly distributed. That is, for all intents and purposes, all ten digits are equally likely to occur. So, the toss is T if the digit is even, and H if the digit is odd.

What do we get? The digits 3, 1, 4, 1, 5, 9, 2, 6 become the tosses H, H, T, H, H, H, T, T.

Since THH appears first, after coin toss number five, Player 2 wins. Player 1 objects at this point. It should have been T if the digit is odd, and H if even.

Player 2 smirks. THH still occurs first, this time after coin toss number eight.
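We can verify this little exchange in Python, taking the first eight digits of π (including the leading 3) as the coin:

```python
# The first eight digits of pi serve as the coin tosses
digits = '31415926'

# Original rule: H if the digit is odd, T if it is even
tosses = ''.join('H' if int(d) % 2 == 1 else 'T' for d in digits)
print(tosses, '-> THH completed on toss', tosses.find('THH') + 3)

# Player 1's amended rule: T if odd, H if even
amended = ''.join('T' if int(d) % 2 == 1 else 'H' for d in digits)
print(amended, '-> THH completed on toss', amended.find('THH') + 3)
```

Either way, THH shows up before HHH: on toss five under the original rule and on toss eight under the amended one.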

# 2. A single game

Given Player 1’s and Player 2’s choices, we want to find the probability that Player 2 wins. If we play Penney’s game N times, the ratio

(number of games Player 2 wins) / N

will be close to the exact probability that Player 2 wins. So the output of a simulation is whether Player 2 won or lost.

And the input? As with craps, we consider the sequence of coin tosses. Any sequence that ends with Player 2’s choice results in the output of “win,” and any sequence that ends with Player 1’s choice results in the output of “loss.” Identifying the input and the output is easy now, right?

In fact, the code for a single Penney’s game is almost identical to that for craps:

```matlab
% Set Player 1 and Player 2's choices
player1_choice = 'HHH';
player2_choice = 'THH';

% Create a two-sided coin
coin = 'TH';

% Form a 3-gram (i.e. toss the coin 3 times)
three_gram = coin(randi(2, 1, 3));

% Check if either player has won
if (strcmp(player2_choice, three_gram))
    display('Player 2 won!');
elseif (strcmp(player1_choice, three_gram))
    display('Player 2 lost...');
else
    while (true)

        % Keep the last two tosses and
        % add in the new toss
        three_gram(1) = three_gram(2);
        three_gram(2) = three_gram(3);
        three_gram(3) = coin(randi(2));

        if (strcmp(player2_choice, three_gram))
            display('Player 2 won!');
            break;
        elseif (strcmp(player1_choice, three_gram))
            display('Player 2 lost...');
            break;
        end
    end
end
```

I highlighted the main differences between the two codes. Line 9 shows that we store the three most current outcomes as a **string**. We can think of a string as an **array** (a sequence) of characters. In craps, we had a **scalar** variable, sum_new, to keep track of such information. We can remember the three most current outcomes by shifting the characters by one. Lines 21-23 say, “Goodbye, first character (previously third most current outcome). Second character, you are the first now, and third, you are the second. Why hello, new coin toss!”

On line 12, I used the Matlab command **strcmp** (“string compare”) to check if two strings are identical. In Python and R, the comparison operator **==** (“equals”) works fine, i.e. we can write **player2_choice == three_gram** and this returns a value of true or false. But when Matlab compares two arrays (like strings) using ==, it returns an array of true’s and false’s. Each value indicates whether a character from one string matched the corresponding character from the other. Since two strings are the same if all their characters match, we could write **all(player2_choice == three_gram)**. Calling strcmp is simpler.
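For comparison, here is a sketch of the same single game in Python, where == compares whole strings directly and slicing replaces the three-line character shift:

```python
import random

player1_choice = 'HHH'
player2_choice = 'THH'
coin = 'TH'

# Toss the coin 3 times to form the initial 3-gram
three_gram = ''.join(random.choice(coin) for _ in range(3))

# Keep tossing until one player's 3-gram appears
while three_gram != player1_choice and three_gram != player2_choice:
    # Drop the oldest toss and append the newest one
    three_gram = three_gram[1:] + random.choice(coin)

print('Player 2 won!' if three_gram == player2_choice else 'Player 2 lost...')
```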

# 3. Monte Carlo simulation

We can play a single Penney’s game. Let’s now use a loop to run the game many times and estimate the probability that Player 2 wins.

```matlab
function p = calculate_probability(player1_choice, player2_choice)
    % Reset the number of wins
    numWins = 0;

    % Set the number of simulations
    N = 10^4;

    % Run the simulations
    for i = 1 : N
        % Run a single Penney's game and increase the number
        % of wins by 1 if Player 2 wins the game
        [...]
    end

    % Return the probability that Player 2 wins
    p = numWins / N;
end
```

Notice that I declared a (user-defined) function on line 1. We want the winning probability not only when Player 1 selects HHH and Player 2 selects THH, but for every other possible combination of choices as well. Our code becomes more structured and reusable if we allow sending the players’ choices as inputs to a function. The function calculates the probability that Player 2 wins under those circumstances, then returns the value to us as an output.
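A self-contained Python version of the estimator might look like this (the inner loop repeats the single-game logic from the previous section; the fixed seed is my addition, for reproducibility):

```python
import random

def calculate_probability(player1_choice, player2_choice, N=10**4, seed=0):
    """Estimate the probability that Player 2's 3-gram occurs first."""
    rng = random.Random(seed)  # fixed seed so repeated runs agree
    num_wins = 0
    for _ in range(N):
        # Play a single Penney's game
        gram = ''.join(rng.choice('TH') for _ in range(3))
        while gram != player1_choice and gram != player2_choice:
            gram = gram[1:] + rng.choice('TH')
        num_wins += (gram == player2_choice)
    return num_wins / N

print(calculate_probability('HHH', 'THH'))  # close to 7/8 = 0.875
```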

The table below shows probabilities that I obtained from running 10,000 simulations for each possible combination of choices. The row labels correspond to Player 1’s choices, and the column labels to Player 2’s choices.

Do you notice something striking? No matter what Player 1 selects, Player 2 can select another 3-gram so that he has a greater than 50% chance of winning. Depending on Player 1’s choice, Player 2 can win 66.7%, 75%, or even 87.5% of the time!

# 4. Mathematical proof

Yutaka Nishiyama provides an elegant proof for how Player 2 wins with the exact probabilities that I listed above. Given Player 1’s choice, showing that Player 2’s optimal choice leads to a winning probability greater than 50% follows one of these three scenarios (the remaining choices follow by symmetry, i.e. by swapping the roles of H and T):

● Player 1 selects HHH, and Player 2 selects THH.

Player 1 wins only when the first three coin tosses are heads. There is a 1/8 chance that this occurs. In all other scenarios (they occur with a probability of 7/8), a tails precedes the first run of two heads, and Player 2 becomes the victor.

● Player 1 selects HHT, and Player 2 selects THH.

This time, Player 1 wins only from sequences such as HHT, HHHT, HHHHT, and so on. Add the probabilities that these occur (recall the sum of a geometric series):

1/8 + 1/16 + 1/32 + ⋯ = 1/4.

Hence, the probability that Player 2 wins is 1 − 1/4 = 3/4.

● Player 1 selects HTH, and Player 2 selects HHT.

We can ignore all initial tosses that are tails. So take the first toss to be heads, and consider what must happen in the subsequent tosses in order for Player 2 to win. Let p be the probability that Player 2 wins. Can you see why the following equation is true? (Again, a lot like craps!)

p = 1/2 + (1/4)p

If the second toss results in heads (occurs with a probability of 1/2), Player 2 is guaranteed to win: once HH appears, the next tails completes HHT before HTH can occur. If the second toss is tails, then as long as the third toss is also tails (the two events occur with a probability of 1/4), the sequence starts over and Player 2 gets another try at a win. Solve for p to get p = 2/3.
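Both calculations can be checked numerically. In this Python sketch I take the second scenario’s pairing to be HHT versus THH and the third’s to be HTH versus HHT, as above:

```python
from fractions import Fraction

# Scenario 2 (HHT vs. THH): Player 1 wins only via HHT, HHHT,
# HHHHT, ..., with probabilities 1/8, 1/16, 1/32, ...
partial = sum(Fraction(1, 2**k) for k in range(3, 50))
print(float(partial))  # approaches 1/4, so Player 2 wins 3/4 of the time

# Scenario 3 (HTH vs. HHT): iterate the recursion p = 1/2 + (1/4)p,
# whose fixed point is the probability that Player 2 wins
p = Fraction(0)
for _ in range(60):
    p = Fraction(1, 2) + p / 4
print(float(p))  # approaches 2/3
```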

# Notes

We can easily modify our code to analyze 4-grams. The table below lists the optimal play by Player 2 and the corresponding winning probability, which I got from Monte Carlo simulation (with 10,000 simulations) and from Martin Gardner’s book “Time Travel and Other Mathematical Bewilderments.”

John Conway (the Game of Life man) developed an algorithm that finds the exact probability that Player 2 wins. It works on any n-gram, even when the two players choose grams of different lengths! Unequal lengths provide additional insight into our game of waiting. For example, there is a 4-gram that is more likely to occur first than a certain 3-gram. Strange, no?

The key to Conway’s algorithm is how he measures what I’d call “self-similarity” (how similar an n-gram is to itself) and “contra-similarity” (how similar the n-gram is to the other), and the difference between these two numbers. We measure this difference from Player 1’s perspective and from Player 2’s, obtaining two numbers, say d1 and d2. The ratio d1/(d1 + d2) is the probability that Player 2 wins. Interestingly, Conway did not prove why his algorithm works. Even Gardner admitted to the following in his book:
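Based on the standard statement of Conway’s leading-number algorithm (the function names below are mine), the computation fits in a few lines of Python:

```python
from fractions import Fraction

def leading_number(x, y):
    """Conway's leading number: add 2**(k-1) for every k such that
    the last k characters of x equal the first k characters of y."""
    return sum(2 ** (k - 1)
               for k in range(1, min(len(x), len(y)) + 1)
               if x[-k:] == y[:k])

def p2_win_probability(p1, p2):
    # d1: Player 1's self-similarity minus contra-similarity;
    # d2: the same difference from Player 2's perspective
    d1 = leading_number(p1, p1) - leading_number(p1, p2)
    d2 = leading_number(p2, p2) - leading_number(p2, p1)
    return Fraction(d1, d1 + d2)

print(p2_win_probability('HHH', 'THH'))  # 7/8
```

The sketch also handles unequal lengths; it reports, for instance, that the 4-gram THHH beats the 3-gram HHH seven times out of eight.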

“I have no idea why it works. It just cranks out the answer as if by magic, like so many of Conway’s other algorithms.”

Recently, Nishiyama gave a short proof using the idea of average wait time. His proof makes use of three theorems from a 1982 paper by Stanley Collings. At the time of writing this post, I could not find a copy of Collings’s paper to understand Nishiyama’s proof better.

You can find Yutaka Nishiyama’s paper on Penney’s game here:

Pattern Matching Probabilities and Paradoxes

You can find the code in its entirety and in other languages here:

Download from GitHub