# St. Petersburg Paradox

The St. Petersburg Paradox is a problem in economics first proposed by Nicolas Bernoulli in a 1713 letter. It involves a lottery in which you buy a ticket to play a game where a coin is flipped until heads comes up. If heads comes up on the $n$th toss you get $2^{n-1}$ dollars, so if heads comes up on the first toss you get one dollar, and if it comes up on the fourth you get 8 dollars. The question is: how much would you pay for a ticket to play this game? In economic theory, the idea is that you would play if the expectation value of the payout minus the ticket price is positive. The paradox is that the expectation value of the payout is infinite, yet most people would pay no more than ten dollars. The resolution of the paradox has been debated for the past three centuries. Now, physicist Ole Peters argues that everyone before him has missed a crucial point and provides a new resolution. Peters also shows that a famous 1934 paper by Karl Menger on this problem contains two critical errors that nullify Menger's results. I'll give a summary of the mathematical analysis below, including my even simpler resolution.

The reason the expectation value of the payout is infinite is that the distribution is not normalizable. This can be seen easily because while the probability of getting $n$ heads in a row decreases exponentially as $p(n)=(1/2)^n$, the payout increases exponentially as $S(n)=2^{n-1}$. The product is always 1/2 and never decays.  The expectation value is thus

$E[S]=\sum_{n=1}^\infty p(n)S(n) = 1/2+1/2 + \cdots$
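A quick numerical sketch makes the divergence concrete: every term $p(n)S(n)$ in the sum is exactly $1/2$, so the partial sums grow linearly without bound.

```python
# Sketch: each term p(n) * S(n) in the expectation sum equals 1/2,
# so the partial sum after N terms is N/2 and never converges.

def term(n):
    """Probability of first heads on toss n, times the payout 2^(n-1)."""
    return (0.5 ** n) * (2 ** (n - 1))

partial = 0.0
for n in range(1, 11):
    partial += term(n)
    print(f"n = {n:2d}  term = {term(n):.2f}  partial sum = {partial:.2f}")
```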

and diverges. The first proposed resolution of the paradox was by Daniel Bernoulli in a 1738 paper submitted to the Commentaries of the Imperial Academy of Science of St. Petersburg, from which the paradox takes its name. Bernoulli suggested that people don't really value money linearly and proposed a utility function $U(S) = \log S$, so the marginal utility of money decreases with wealth. Because $\log S(n)$ grows only linearly in $n$ while the probability decays exponentially, the expectation value of $U(S)$ is finite, which resolves the paradox. People have always puzzled over this solution because it seems ad hoc. Why should my utility function be the same as someone else's? Menger pointed out that for any unbounded utility function one can construct an even faster-growing payout that makes the expectation value divergent again, and concluded that all utility functions must be bounded. According to Peters, this conclusion shaped economics throughout the twentieth century and may have led to more risk taking than is warranted mathematically.
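Bernoulli's claim is easy to check numerically. The sum below converges to $\ln 2$, since $E[\ln S] = \sum_n (1/2)^n \ln 2^{n-1} = \ln 2 \sum_n (n-1)/2^n = \ln 2$:

```python
import math

# Sketch of Bernoulli's fix: with U(S) = log S, the expected utility
# E[U(S)] = sum_n (1/2)^n * ln(2^(n-1)) converges, because each term
# decays geometrically, unlike the divergent expected payout.
expected_log_payout = sum((0.5 ** n) * math.log(2 ** (n - 1))
                          for n in range(1, 100))
print(expected_log_payout)  # converges to ln(2) ~ 0.693
```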

Peters's resolution is that the expectation value of the Bernoulli utility function is actually the time average of the growth rate of the wealth of a person who plays repeatedly. Hence, anyone who pays more than a certain price will almost surely go bankrupt. The proof is quite simple. The factor by which a person's wealth changes at round $i$ is given by the expression

$r_i = \frac{W_i-C+S_i}{W_i}$

where $W_i$ is the wealth, $C$ is the cost to play, and $S_i$ is the payout at round $i$. The average growth factor per round over $T$ rounds is thus the geometric mean $\bar{r}_T=\left(\prod_{i=1}^T r_i\right)^{1/T}$. Now group the rounds by $n$, the number of tosses until the first heads, and let $k_n$ be the number of rounds in which the first heads occurred on toss $n$. This yields

$\bar{r}_T= \prod_{n=1}^{n_{\max}} r_n^{k_n/T} \to \prod_{n=1}^{\infty} r_n^{p_n}$,

where $p_n = (1/2)^n$ is the probability that the first heads occurs on toss $n$; by the law of large numbers $k_n/T \to p_n$ as $T\to\infty$. The average growth rate is given by taking the log, which gives the expression

$\sum_{n=1}^\infty \left(\frac{1}{2}\right)^n \left(\ln(W-C+2^{n-1})-\ln W\right)$

which is equivalent to the Bernoulli solution without the need for a utility function.
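As an illustration of this result (my own sketch, not code from Peters's paper): the time-average growth rate from the sum above is positive for small ticket prices and negative for large ones, so a player with fixed wealth $W$ has a finite breakeven price, which can be found by bisection. The function names and the choice of bisection are mine.

```python
import math

def growth_rate(wealth, cost, n_max=200):
    """Time-average log growth per round:
    sum_n (1/2)^n * (ln(W - C + 2^(n-1)) - ln W)."""
    return sum((0.5 ** n) * (math.log(wealth - cost + 2 ** (n - 1))
                             - math.log(wealth))
               for n in range(1, n_max + 1))

def breakeven_price(wealth, lo=1.0, iters=60):
    """Largest ticket price with non-negative time-average growth,
    located by bisection between lo and the player's wealth."""
    hi = wealth
    for _ in range(iters):
        mid = 0.5 * (lo + hi)
        if growth_rate(wealth, mid) > 0:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

for w in (10, 100, 1000):
    print(f"wealth = {w:5d}  breakeven price ~ {breakeven_price(w):.2f}")
```

Note that the breakeven price depends on the player's wealth: a richer player can rationally pay more for the same ticket, which is the wealth dependence that a fixed expected payout cannot capture.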

Now my solution, which has probably been proposed before, is that we don't actually evaluate the expectation value of the payout; instead we take the payout at the expected number of tosses, which is a finite amount. That is, we replace $E(S(n))$ with $S(E(n))$, where

$E(n) =\sum_{n=1}^\infty n \left(\frac{1}{2}\right)^n=2$,

which means we wouldn't really want to pay more than $S(2) = 2$ dollars to play. This might be a little conservative, but it's what I would do.
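The arithmetic behind this rule can be checked in a couple of lines (variable names are mine):

```python
# Sketch of the S(E(n)) rule from the text: the expected number of
# tosses is E[n] = sum_n n * (1/2)^n = 2, and the payout at n = 2 is
# S(2) = 2^(2-1) = 2 dollars.
expected_tosses = sum(n * (0.5 ** n) for n in range(1, 200))
payout_at_expected = 2 ** (round(expected_tosses) - 1)
print(expected_tosses)     # converges to 2
print(payout_at_expected)  # 2 dollars
```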