The St. Petersburg Paradox

On the topic of numeric paradoxes, here’s another one that drove a lot of work in economic and decision theory: the St. Petersburg paradox.

Here’s the deal. You’re offered a chance to play a game wherein you repeatedly flip a coin until it comes up heads, at which point the game is over. If the coin comes up heads the first time, you win a dollar. If it takes two flips to come up heads, you win two dollars. The third time, four dollars. The fourth time, eight dollars. And so on; the rule is, if you see heads for the first time on the nth flip, you win 2^(n-1) dollars.
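If the rules read more easily as code, here’s a minimal sketch of a single round (Python; the `play_once` name is mine, not anything standard):

```python
import random

def play_once():
    """Play one round: flip until heads; heads first showing on flip n pays 2^(n-1) dollars."""
    n = 1
    while random.random() < 0.5:  # call this outcome tails; flip again
        n += 1
    return 2 ** (n - 1)
```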

How much would you pay to play this game?

The paradox is: the expected value of this game is infinity, so according to all your pretty formulas, you should immediately pay all your life savings for a single chance at this game. (Each possible outcome contributes 50 cents to the expected value — probability 1/2^n times a payoff of 2^(n-1) dollars — and there are an infinite number of them, and expectation distributes over summation, so the expected value is an infinite sum of 50 cents, which works out to be a little thing I like to call infinity dollars.)
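If you’d rather watch the sum diverge than take my word for it, here’s a quick sketch: each term is probability 1/2^n times payoff 2^(n-1), which is exactly 50 cents, so the partial sums just climb linearly forever.

```python
def partial_ev(terms):
    """Expected value truncated to the first `terms` outcomes."""
    return sum((0.5 ** n) * (2 ** (n - 1)) for n in range(1, terms + 1))

print(partial_ev(10))   # 5.0
print(partial_ev(100))  # 50.0 -- another $0.50 per term, all the way to infinity
```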

Of course that’s a paradox because it’s crazy talk to bet more than a few bucks on such a game. The paradox highlights at least two problems with blithely treating positive EV as the reward you’ll actually get if you play the game:

  1. It assumes that the host of the game actually has infinite funds. The Wikipedia article has a very striking breakdown of what happens to the St. Petersburg paradox when the backer’s funds are finite. It turns out that even if your backer has access to the entire GDP of the world in 2007, the expected value is only $23.77, which is quite a bit short of infinity dollars.
  2. It assumes you play the game an infinite number of times. That’s the only way the expected value actually ends up in your pocket. And the St. Petersburg paradox is a great example of just how quickly your actual take-home degenerates under real-world constraints like finite repetitions. It turns out that if you want your average winnings to reach $10 a game, you’ll have to play about a million times; if you’re satisfied with $5 a game, you’ll still have to play about a thousand times. (A quick sketch of both checks follows this list.)
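Both checks are easy to run yourself. Here’s a sketch of the finite-funds computation; I’m assuming a 2007 world GDP in the neighborhood of $54 trillion, which is the scale behind the Wikipedia figure:

```python
def capped_ev(bankroll):
    """Expected value when the backer can pay out at most `bankroll` dollars."""
    ev, n = 0.0, 1
    while 2 ** (n - 1) < bankroll:         # outcomes the backer can cover in full
        ev += (0.5 ** n) * (2 ** (n - 1))  # the usual 50-cent term
        n += 1
    ev += bankroll * 0.5 ** (n - 1)        # every rarer outcome just pays the cap
    return ev

print(capped_ev(54e12))  # ~23.77 -- quite a bit short of infinity dollars
```

The finite-repetitions claim is the same arithmetic run sideways: over n plays your average take-home grows like (log2 n)/2, so hitting a $10 average takes around 2^20, about a million games, and a $5 average around 2^10, about a thousand.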

The classical answer to the paradox has been to talk about utility, marginal utility, and things like that; i.e., an extra dollar is worth less to people who already have lots of money than to people who don’t have much. And more recent answers to the paradox, e.g. cumulative prospect theory, are along the lines of modeling how humans perceive risk, which (unsurprisingly) is not really in line with the actual probabilities.

But it seems to me that these solutions all involve modeling human behavior and explaining why a human wouldn’t pay a lot of money to play the game, either because money means less the more of it you have or because people mis-value risks. The actual paradox, though, is not about human behavior or psychology. It’s the fact that the expected value of a game is not a good estimate of its real-world value, because the expected-value calculation quietly assumes infinite funds and infinite plays, and we have neither.

So my solution to the St. Petersburg paradox is this: drop all events that have a probability less than some small epsilon, or a value more than some large, um, inverse epsilon. That neatly solves both of the infinity assumptions. (In this particular case one bound would do, because the probabilities drop exponentially as the values rise exponentially, but not in general.) I’ll call this the REV: the realistically expected value.

In this case, if you set the lower probability bound to .01 and the upper value bound to one million, then the REV of the St. Petersburg game is exactly three bucks: only the first six outcomes have probability at least .01, and each contributes 50 cents. (The upper value bound doesn’t even come into play; the biggest surviving payoff is $32.) And that’s about what I’d pay to play it.
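Here’s the REV as a sketch (again, the name and the function are mine, not anything standard); run it with the bounds above and out come the three bucks:

```python
def rev(epsilon, value_cap, max_terms=1000):
    """Realistically expected value: ignore outcomes too rare or too big to believe in."""
    total = 0.0
    for n in range(1, max_terms + 1):
        prob, payoff = 0.5 ** n, 2 ** (n - 1)
        if prob < epsilon or payoff > value_cap:
            continue  # drop events outside the realistic window
        total += prob * payoff
    return total

print(rev(0.01, 1_000_000))  # 3.0 -- six surviving outcomes at 50 cents apiece
```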

So there you go. Fixed economics for ya.
