Consider a game, first proposed by Nicolaus Bernoulli, in which a player bets on how many tosses of a coin
 will be needed before it first turns up heads. The player pays a fixed amount initially,
and then receives $2^n$ dollars if the coin comes up heads on the $n$th toss. The expectation value of the gain is then

\[
\sum_{n=1}^{\infty} \frac{1}{2^n}\cdot 2^n = \sum_{n=1}^{\infty} 1 = \infty
\tag{1}
\]
dollars, so any finite amount of money can be wagered and the player will still come out ahead on average.
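The divergence shows up clearly in simulation: rare long runs of tails produce enormous payoffs, so the sample mean never settles toward any finite value. A minimal sketch (function name and seed are illustrative):

```python
import random

def play_once(rng):
    """Toss a fair coin until heads first appears; heads on toss n
    (probability 1/2**n) pays 2**n dollars."""
    n = 1
    while rng.random() < 0.5:  # tails: keep tossing
        n += 1
    return 2**n

rng = random.Random(0)
trials = 100_000
sample_mean = sum(play_once(rng) for _ in range(trials)) / trials
# No matter how many trials are averaged, occasional huge payoffs
# keep the running mean from converging.
```

Every payoff is a power of two of at least $2, but the heavy tail of the distribution is what makes the expectation infinite.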
Feller (1968) discusses a modified version of the game in which the player receives nothing if a trial takes more than a fixed number $N$ of tosses. Since the truncated expectation is then $\sum_{n=1}^{N} 2^{-n}\cdot 2^n = N$, the classical theory of this modified game concluded that $N$ dollars is a fair entrance fee, but Feller notes that "the modern student will hardly understand the mysterious discussions of this 'paradox.'"
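The finite value of the truncated game can be checked directly: each of the $N$ surviving terms of the sum contributes exactly one dollar. A quick sketch using exact rational arithmetic (function name is illustrative):

```python
from fractions import Fraction

def truncated_expectation(N):
    """Expected payoff when the player receives nothing beyond N tosses:
    heads first on toss n (probability 1/2**n) pays 2**n dollars."""
    return sum(Fraction(1, 2**n) * 2**n for n in range(1, N + 1))

# Every term contributes exactly one dollar, so the value is N:
assert truncated_expectation(10) == 10
assert truncated_expectation(100) == 100
```

Using `Fraction` avoids floating-point rounding, so the identity holds exactly for any $N$.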
In another modified version of the game, the player bets $2 that heads will turn up on the first throw, $4 that heads will turn up on the second throw (if it did not turn up on the first), $8 that heads will turn up on the third throw, etc. Then the expected payoff is
\[
\sum_{n=1}^{\infty} \frac{1}{2^n}\cdot 2^n = \sum_{n=1}^{\infty} 1 = \infty,
\tag{2}
\]
so the player can apparently be in the hole by any amount of money and still come out ahead in the end. This paradox can clearly be resolved by making the distinction
 between the amount of the final payoff and the net amount won in the game. It is
 misleading to consider the payoff without taking into account the amount lost on
previous bets, as can be shown as follows. At the time the player first wins (say, on the $n$th toss), he will have lost

\[
\sum_{i=1}^{n-1} 2^i = 2^n - 2
\tag{3}
\]

dollars. On this toss, however, he wins $2^n$ dollars. This means that the net gain for the player is
 a whopping $2, no matter how many tosses it takes to finally win. As expected, the
 large payoff after a long run of tails is exactly balanced by the large amount that
 the player has to invest. In fact, by noting that the probability of winning on the $n$th toss is $1/2^n$, it can be seen that the probability distribution for the number of tosses needed to win is simply a geometric distribution with $p = 1/2$.
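The bookkeeping above can be verified by simulating the doubling strategy: whatever toss the first head arrives on, the winning bet of $2^n$ exceeds the accumulated losses of $2^n - 2$ by exactly $2. A minimal sketch (function name and seed are illustrative):

```python
import random

def play_doubling(rng):
    """Bet 2**n dollars that heads turns up on toss n, doubling after each
    loss; return (tosses_needed, net_gain) once heads finally appears."""
    staked = 0  # total lost on earlier tosses
    n = 0
    while True:
        n += 1
        bet = 2**n
        if rng.random() < 0.5:       # heads: win an amount equal to the bet
            return n, bet - staked   # winnings minus accumulated losses
        staked += bet                # tails: this bet is lost

rng = random.Random(42)
results = [play_doubling(rng) for _ in range(10_000)]
# The net gain is $2 every time, no matter how long the game runs:
assert all(gain == 2 for _, gain in results)
```

The number of tosses in `results` also follows the geometric distribution described above: about half the games end on the first toss, a quarter on the second, and so on.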