I’ve been thinking about the Sleeping Beauty problem a lot and I have come up with a few variations that help with intuition.  So far I don’t see any clear normative argument for why your belief should be anything in particular (although some beliefs are obviously wrong.)  My original argument was circular because I wanted to prove that your willingness to bet reveals that you assign equal probability, but I essentially assumed you assigned equal probability in calculating the payoffs to those bets.

Nevertheless the argument does show that the belief of 1/2 is a consistent belief in that it leads to betting behavior with a resulting expected payoff which is correct.  On the other hand, a belief of 1/3 is not consistent.  If, upon waking, you assign probability 1/3 to Heads you will bet on Tails and you will expect your payoff to be (2/3)($2) – (1/3)($1) = $1.  But your true expected payoff from betting Tails is 50 cents.  This means that you are vulnerable to the following scheme.  At the end of their speech, the researchers add “In order to participate in this bet you must agree to pay us 75 cents.  You will pay us at the end of the experiment, and only once.  But you must decide now, and if you reject the deal at any of the times we wake you up, the bet is off and you pay nothing and receive nothing.”
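Here is a minimal sketch of that arithmetic in Python, assuming (as the numbers above do) an even-money $1 bet at each awakening, one awakening on Heads and two on Tails:

```python
# Sketch of the payoff arithmetic above.  Assumptions: a fair coin, an
# even-money $1 bet at each awakening, one awakening on Heads, two on Tails.
from fractions import Fraction

P_HEADS = Fraction(1, 2)
P_TAILS = Fraction(1, 2)

# True expected payoff of the strategy "always bet Tails":
# Heads: one awakening, lose $1; Tails: two awakenings, win $1 at each.
true_ev = P_HEADS * (-1) + P_TAILS * 2
print(true_ev)        # 1/2, i.e. the 50 cents claimed above

# The thirder's calculation at a single awakening, which weights the
# outcomes by the credence 1/3 on Heads rather than by the coin:
thirder_ev = Fraction(2, 3) * 2 - Fraction(1, 3) * 1
print(thirder_ev)     # 1, i.e. the $1 the thirder expects
```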

If your belief is 1/3 you will agree to pay the 75 cents because you expect your net payoff to be $1 – 75 cents = 25 cents.  But by agreeing to the deal you are actually giving yourself an expected loss of 25 cents (50 cents – 75 cents.)  If your belief is 1/2 you are not vulnerable to these Dutch books.
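The same arithmetic applied to the 75-cent offer, again just as a sketch using the figures above:

```python
from fractions import Fraction

FEE = Fraction(3, 4)                   # the 75-cent participation fee, paid once

perceived_net = Fraction(1) - FEE      # the thirder's expected $1 minus the fee
actual_net = Fraction(1, 2) - FEE      # the true 50-cent value minus the fee

print(perceived_net)   # 1/4: the thirder expects to come out 25 cents ahead
print(actual_net)      # -1/4: in fact the deal costs 25 cents in expectation
```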

Here are the variations.

  1. (Clones in their jammies) The speech given by the researchers is changed to the following.  “We tossed a fair coin to decide whether we would clone you, and then wake up both instances of you.  The clone would share all of your memories, and indeed you may be that clone.  Tails: clone, Heads: no clone (but still we would wake you and give you this speech and offer.)  You (and your clone if Tails) can bet on the coin.  In the event of Tails, your payoff will be the sum of the payoffs from your bet and your clone’s bet (and the same for you if you are the clone.)”
  2. (Changing the odds)  Suppose that the stake in the event of Heads is $1.10.  Now those with belief 1/2 strictly prefer to bet Heads (in the original example they were indifferent.)  And this gives them an expected loss, whereas the strategy of betting Tails every time would still give an expected gain (see the sketch after this list.)  This exaggerates the weirdness but it is not a proof that 1/2 is the wrong belief.  The same argument could be applied to the clones, where we would have something akin to a Prisoner’s dilemma.  It is not an unfamiliar situation to have an individual incentive to do something that is bad for the pair.
  3. Suppose that the coin is not fair, and the probability of Tails is 1/n.  But in the event of Tails you will be awakened n times.  The simple counting exercise that leads to the 1/3 belief seemed to rely on the fair coin in order to treat each awakening equally.  Now how do you do it?
  4. The experimenters give you the same speech as before but add this:  “each time we wake you, you will place your bet BUT in the event of Tails, at your second awakening, we will ignore your choice and substitute a bet on Tails on your behalf.”  Now your bet only matters in the first awakening.  How would you bet now?  (“Thirders” who are doing simple counting would probably say that, conditional on the first awakening, the probability of Heads is 1/2.  Is it?)
  5. Same as 4 but the bet is substituted on the first awakening in the event of Tails.  Now your bet only matters if the coin came up Heads or it came up Tails and this is the second awakening.  Does it make any difference?
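For variation 2, here is a sketch of the payoff arithmetic, assuming the stake is $1.10 when the coin lands Heads and $1 at each awakening when it lands Tails:

```python
from fractions import Fraction

P_HEADS = P_TAILS = Fraction(1, 2)
HEADS_STAKE = Fraction(11, 10)   # a bet settled on the Heads branch is for $1.10
TAILS_STAKE = Fraction(1)        # a bet at each Tails awakening is for $1

# Always bet Heads: win $1.10 if Heads, lose $1 at each of the two Tails awakenings.
ev_bet_heads = P_HEADS * HEADS_STAKE + P_TAILS * (-2 * TAILS_STAKE)

# Always bet Tails: lose $1.10 if Heads, win $1 at each of the two Tails awakenings.
ev_bet_tails = P_HEADS * (-HEADS_STAKE) + P_TAILS * (2 * TAILS_STAKE)

print(ev_bet_heads)   # -9/20: an expected loss of 45 cents
print(ev_bet_tails)   # 9/20: an expected gain of 45 cents
```

The halfer at a single awakening presumably compares (1/2)($1.10) against (1/2)($1) and so strictly prefers Heads, even though the strategy of always betting Heads loses money overall.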