In a game, two players, $A$ and $B$, perform Bernoulli trials alternately, with success probabilities $p$ and $q$ respectively. Player $A$ takes his turn first. The game ends whenever a player succeeds in his trial, and continues otherwise.

Consider a round to be both players having completed one trial each. Then define:

  • $a_n$ to be the probability that player $A$ wins before the end of the $n$-th round
  • $b_n$ to be the probability that player $B$ wins before the end of the $n$-th round
  • $d_n$ to be the probability that the game is still undecided (drawn) at the end of the $n$-th round

Then the following recurrences are established, with $a_0 = b_0 = 0$ and $d_0 = 1$: in round $n$, player $A$ wins if the game was previously undecided and he now succeeds, while player $B$ wins if the game was undecided, $A$ fails, and $B$ succeeds:

$$a_n = a_{n-1} + d_{n-1}\,p, \qquad b_n = b_{n-1} + d_{n-1}\,(1-p)\,q, \qquad d_n = d_{n-1}\,(1-p)(1-q).$$

From them, we can derive the closed forms

$$d_n = \bigl[(1-p)(1-q)\bigr]^n, \qquad a_n = p\,\frac{1-\bigl[(1-p)(1-q)\bigr]^n}{1-(1-p)(1-q)}, \qquad b_n = (1-p)\,q\,\frac{1-\bigl[(1-p)(1-q)\bigr]^n}{1-(1-p)(1-q)}.$$

We know that these probabilities are consistent because they sum to one: since $1-(1-p)(1-q) = p + (1-p)q$,

$$a_n + b_n + d_n = \bigl(p + (1-p)\,q\bigr)\,\frac{1-\bigl[(1-p)(1-q)\bigr]^n}{1-(1-p)(1-q)} + \bigl[(1-p)(1-q)\bigr]^n = 1.$$
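As a sanity check, here is a minimal numerical sketch that iterates the recurrences and compares them against the closed forms; the parameter values $p = 0.3$ and $q = 0.5$ are arbitrary illustrations, not part of the original argument.

```python
# Iterate the recurrences a_n = a_{n-1} + d_{n-1} p,
# b_n = b_{n-1} + d_{n-1} (1-p) q, d_n = d_{n-1} (1-p)(1-q),
# and check them against the closed forms. p, q are illustrative values.
p, q = 0.3, 0.5
r = (1 - p) * (1 - q)            # probability that a round has no winner

a, b, d = 0.0, 0.0, 1.0          # a_0 = b_0 = 0, d_0 = 1
for n in range(1, 21):
    a, b, d = a + d * p, b + d * (1 - p) * q, d * r
    a_closed = p * (1 - r**n) / (1 - r)
    b_closed = (1 - p) * q * (1 - r**n) / (1 - r)
    assert abs(a - a_closed) < 1e-12
    assert abs(b - b_closed) < 1e-12
    assert abs(a + b + d - 1.0) < 1e-12   # probabilities always sum to 1
```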

Now consider the closed forms again. As $n\to\infty$, $\bigl[(1-p)(1-q)\bigr]^n \to 0$ (provided $p$ and $q$ are not both zero), so

$$a_n \to \frac{p}{p+(1-p)\,q}, \qquad b_n \to \frac{(1-p)\,q}{p+(1-p)\,q}, \qquad d_n \to 0.$$

Therefore, these limits are the probabilities of winning the game by the respective players.
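These limiting win probabilities can also be checked by direct simulation. Below is a minimal Monte Carlo sketch; the values $p = 0.3$, $q = 0.5$, the seed, and the sample size are arbitrary illustrative choices.

```python
import random

# Monte Carlo sketch of the game: A tries first with success probability p,
# then B with probability q, repeating rounds until someone succeeds.
# p = 0.3 and q = 0.5 are arbitrary illustrative values.
def play(p, q, rng):
    while True:
        if rng.random() < p:
            return "A"
        if rng.random() < q:
            return "B"

p, q = 0.3, 0.5
rng = random.Random(0)
n_games = 100_000
wins_a = sum(play(p, q, rng) == "A" for _ in range(n_games))
estimate = wins_a / n_games
exact = p / (p + (1 - p) * q)    # limiting win probability for A
print(round(estimate, 3), round(exact, 3))
```

With $100{,}000$ games the empirical frequency should agree with the limit to roughly two decimal places.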

In fact, there is an easier way to solve for $a$ and $b$. In one round, the two players win with probabilities $p$ and $(1-p)\,q$ respectively. If neither of them wins, the game is reset and played again. So consider a Markov chain with three states, “A wins”, “B wins”, and “draw”, where the former two are terminating states. We can see that, since the chain eventually reaches a terminating state, $A$ wins with probability

$$\frac{p}{p+(1-p)\,q},$$

and $B$ wins with probability

$$\frac{(1-p)\,q}{p+(1-p)\,q}.$$

This is easier as no infinite series or recurrence relations are involved.

Further, to make the game fair, i.e. $a = b$, we need

$$p = (1-p)\,q, \quad\text{that is,}\quad q = \frac{p}{1-p}.$$

In other words, the second player needs the larger success probability $q = p/(1-p) > p$ to make the game fair: the first player has an advantage just because he goes first.
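The fairness condition can be sketched numerically; $p = 0.25$ below is an arbitrary illustrative value (it must satisfy $p < 1/2$, otherwise $q = p/(1-p)$ would exceed $1$).

```python
# Fairness sketch: choosing q = p / (1 - p) equalises the one-round win
# probabilities p and (1 - p) q, so both players win with probability 1/2.
# p = 0.25 is an arbitrary illustrative value (it needs p < 1/2 so q <= 1).
p = 0.25
q = p / (1 - p)
prob_a = p / (p + (1 - p) * q)
prob_b = (1 - p) * q / (p + (1 - p) * q)
print(q, prob_a, prob_b)   # q = 1/3; both win probabilities come out to ~0.5
```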