I’ve been thinking about the Sleeping Beauty problem a lot, and I have come up with a few variations that help with intuition. So far I don’t see any clear normative argument why your belief *should* be anything in particular (although some beliefs are obviously wrong). My original argument was circular: I wanted to prove that your willingness to bet reveals that you assign equal probability, but I essentially assumed you assigned equal probability when calculating the payoffs to those bets.

Nevertheless, the argument does show that the belief of 1/2 is a *consistent* belief, in that it leads to betting behavior whose resulting expected payoff is correct. On the other hand, a belief of 1/3 is not consistent. If, upon waking, you assign probability 1/3 to Heads, you will bet on Tails and you will expect your payoff to be (2/3)($2) – (1/3)($1) = $1. But your true expected payoff from betting Tails is 50 cents. This means that you are vulnerable to the following scheme. At the end of their speech, the researchers add: “In order to participate in this bet you must agree to pay us 75 cents. You will pay us at the end of the experiment, and only once. But you must decide now, and if you reject the deal at any of the times we wake you up, the bet is off and you pay and receive nothing.”

If your belief is 1/3 you will agree to pay 75 cents, because you expect your net payoff to be $1 – 75 cents = 25 cents. But by agreeing to the deal you are actually giving yourself an expected loss of 25 cents (50 cents – 75 cents). If your belief is 1/2 you are not vulnerable to these Dutch books.
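To make the vulnerability concrete, here is a quick Monte Carlo sketch (my own illustration, not from the post) of the always-bet-Tails strategy with the 75-cent fee, assuming even $1 stakes per awakening:

```python
import random

def avg_net_payoff(n_trials=200_000, fee=0.75, seed=0):
    """Average net payoff of always betting Tails with a one-time fee.

    Assumed stakes (as in the post): each awakening's bet wins $1 if the
    coin was Tails and loses $1 if it was Heads; Tails means two
    awakenings, so two bets settle.
    """
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n_trials):
        tails = rng.random() < 0.5
        payoff = 2 if tails else -1  # two winning bets vs. one losing bet
        total += payoff - fee
    return total / n_trials
```

The average comes out near 0.5 – 0.75 = –0.25, the expected loss described above, even though the thirder expects to net 25 cents.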

Here are the variations.

- (Clones in their jammies) The speech given by the researchers is changed to the following. “We tossed a fair coin to decide whether we would clone you, and then wake up both instances of you. The clone would share all of your memories, and indeed you may be that clone. Tails: clone, Heads: no clone (but still we would wake you and give you this speech and offer.) You (and your clone if Tails) can bet on the coin. In the event of tails, your payoff will be the sum of the payoffs from you and your clone’s bet (and the same for you if you are the clone.)”
- (Changing the odds) Suppose that the stakes in the event of Heads are $1.10. Now those with belief 1/2 *strictly* prefer to bet Heads (in the original example they were indifferent). And this gives them an expected loss, whereas the strategy of betting Tails every time would still give an expected gain. This exaggerates the weirdness, but it is not a proof that 1/2 is the wrong belief. The same argument could be applied to the clones, where we would have something akin to a Prisoner’s Dilemma: it is not an unfamiliar situation to have an individual incentive to do something that is bad for the pair.
- Suppose that the coin is not fair, and the probability of Tails is 1/n. But in the event of Tails you will be awakened n times. The simple counting exercise that leads to the 1/3 belief seemed to rely on the fair coin in order to treat each awakening equally. Now how do you do it?
- The experimenters give you the same speech as before but add this: “each time we wake you, you will place your bet BUT in the event of Tails, at your second awakening, we will ignore your choice and substitute a bet on Tails on your behalf.” Now your bet only matters in the first awakening. How would you bet now? (“Thirders” who are doing simple counting would probably say that, conditional on the first awakening, the probability of Heads is 1/2. Is it?)
- Same as 4 but the bet is substituted on the first awakening in the event of Tails. Now your bet only matters if the coin came up Heads or it came up Tails and this is the second awakening. Does it make any difference?
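For the changing-the-odds variation, the exact expected payoffs are easy to tabulate. A small sketch, under an assumed reading of the stakes (amounts changing hands on Heads are $1.10; on Tails, $1 per awakening):

```python
def expected_payoff(bet_heads, heads_stake=1.10, tails_stake=1.00):
    """Exact expected payoff of always betting the same side.

    A bet settles at every awakening: one awakening on Heads, two on
    Tails. Amounts changing hands on Heads are heads_stake; on Tails,
    tails_stake per awakening (an assumed reading of the variation).
    """
    if bet_heads:
        return 0.5 * heads_stake + 0.5 * (-2 * tails_stake)
    return 0.5 * (-heads_stake) + 0.5 * (2 * tails_stake)

# Always Heads: 0.55 - 1.00 = -0.45 (the halfer's strict preference loses)
# Always Tails: -0.55 + 1.00 = 0.45 (still an expected gain)
```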

## 9 comments


March 28, 2010 at 12:52 pm

Kevin Dick

I think your framing of the Dutch Book is wrong because it uses conditionals. Assume my assertion that if your memory is erased, you should evaluate this problem prospectively, with no updating. There are two possibilities:

Coin is Tails: Your payoff is 1 + 1 – .75 = 1.25

Coin is Heads: Your payoff is -1 – .75 = -1.75

So you would see that the expected value is -.25. However, you would pay any amount less than .50 and this bet would be profitable.

It is actually the conditional 1/2 believer that is inconsistent, because he would pass up these profitable bets.
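Kevin’s prospective calculation can be written out directly; a one-function check of his numbers:

```python
def prospective_ev(fee):
    """Kevin's prospective expected value: Tails gives two winning $1
    bets, Heads gives one losing $1 bet, and the fee is paid once
    either way."""
    return 0.5 * (1 + 1 - fee) + 0.5 * (-1 - fee)

# prospective_ev(0.75) -> -0.25, and it is positive exactly when fee < 0.50
```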

March 28, 2010 at 4:56 pm

Emil Temnyalov

Here’s the variation that I proposed in one of the comments to the original post: the problem is stated just as before, except you will be awoken once on Heads and n (instead of 2) times on Tails, and each time you are awoken you will be offered the bet and have your memory erased.

I think it gives by far the most intuition for why 1/2 should not be your belief. If you are indifferent between betting Heads and Tails you’ll be losing a lot of money. I find this Dutch book much more convincing.
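Emil’s point scales cleanly. Assuming even $1 stakes per awakening (my sketch, not Emil’s), the expected payoff of always betting Heads in the n-awakening version is:

```python
def ev_always_heads(n):
    """Expected payoff of always betting Heads in the n-awakening variant,
    assuming even $1 stakes per awakening: Heads gives one winning bet,
    Tails gives n losing bets."""
    return 0.5 * 1 + 0.5 * (-n)

# ev_always_heads(2)  -> -0.5 (the original problem)
# ev_always_heads(10) -> -4.5 (indifference gets expensive as n grows)
```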

March 28, 2010 at 7:55 pm

jeff

Emil, in your n-awakening version, suppose we boost the stakes slightly for Heads as in my example 2, and the experimenters ignore you on all but the first awakening, as in my example 4. In the other n-1 Tails-awakenings they ignore what you say and place a bet on Tails on your behalf. Would this change how you bet?

March 29, 2010 at 5:10 pm

Emil Temnyalov

So just to be clear: the first time I’m awoken they’ll offer me a bet that slightly favours Heads, and if there are any subsequent awakenings (implying that the coin landed Tails), they’ll not offer me the bet (which is equivalent to them automatically placing a bet on Tails on my behalf).

In this setup my belief is clearly 1/2, so if the bet slightly favours Heads, I’ll prefer to bet Heads. I think this is consistent with the 1/3 answer in the original question.

March 29, 2010 at 5:23 pm

Emil Temnyalov

My bad, I realized that your setup is actually different. I think that what I wrote above is consistent; it just answers a different question.

You’re saying that they’ll always offer me the bet, but they’ll ignore my answer in all but the 1st awakening. In that case of course I’ll take the favourable Heads odds and bet Heads.

However, this bet/setup does not reflect my belief that the coin landed Heads, conditional on being woken up and offered a bet. It also reflects my information about how the experimenters are going to ignore some of my answers, and I don’t see how this setup helps with respect to the original belief question. I think betting Heads here is consistent with betting Tails in the original question, or betting Tails in the n-variation.

March 29, 2010 at 7:41 pm

jeff

Thank you, Emil. Now suppose that, in the event of Tails, they will ignore your answer in all but the *last* awakening (just as before, they will bet Tails on your behalf in all the other Tails-awakenings; on the Heads-awakening, your chosen bet will be effective).

So now your bet matters only if the coin came up Heads or if the coin came up Tails and this is the last awakening.

How would you bet then?

March 28, 2010 at 6:45 pm

To SIA or not to SIA? Link round-up. « Robert Wiblin

[...] Clones In Their Jammies, and Other Variations – Cheap Talk [...]

March 29, 2010 at 11:53 pm

Anonymous

You are standing in a long line of people outside a door; every minute, two people go through the door. It is finally your turn and you go through the door, where a man in a lab coat explains the experiment that has been going on (the Sleeping Beauty problem stated above). He also tells you that they bring two people into the room for every awakening, an odds maker and a gambler, both of whom are told the same story. He then tells you that you are the odds maker and asks what odds you would lay for the gambler that you were brought into the room on a Tails coin flip.

If after you make your decision he then tells you that he is going to erase your memory of the experiment and send you back to the end of the line, do you have more or less information about the coin than sleeping beauty?

March 28, 2010 at 6:48 pm

Robert Wiblin

Here is a roundup of a bunch of blog posts and articles about this: http://robertwiblin.wordpress.com/2010/03/26/to-sia-or-not-to-sia/ . In particular I’d like to hear your view of the God’s Coin Toss: http://robertwiblin.wordpress.com/2010/03/27/sia-or-not-the-gods-coin-toss-thought-experiment/.