Via Robert Wiblin here is a fun probability puzzle:
The Sleeping Beauty problem: Some researchers are going to put you to sleep. During the two days that your sleep will last, they will briefly wake you up either once or twice, depending on the toss of a fair coin (Heads: once; Tails: twice). After each waking, they will put you back to sleep with a drug that makes you forget that waking.
The puzzle: when you are awakened, what probability do you assign to the coin coming up heads? Robert discusses two possible answers:
First answer: 1/2, of course! Initially you were certain that the coin was fair, and so initially your credence in the coin’s landing Heads was 1/2. Upon being awakened, you receive no new information (you knew all along that you would be awakened). So your credence in the coin’s landing Heads ought to remain 1/2.
Second answer: 1/3, of course! Imagine the experiment repeated many times. Then in the long run, about 1/3 of the wakings would be Heads-wakings — wakings that happen on trials in which the coin lands Heads. So on any particular waking, you should have credence 1/3 that that waking is a Heads-waking, and hence have credence 1/3 in the coin’s landing Heads on that trial. This consideration remains in force in the present circumstance, in which the experiment is performed just once.
Let’s approach the problem from the decision-theoretic point of view: the probability is revealed by your willingness to bet. (Indeed, when talking about subjective probability, as we are here, this is pretty much the only way to define it.) So let me describe the problem in slightly more detail. The researchers, upon waking you up, give you the following speech.
The moment you fell asleep I tossed a fair coin to determine how many times I would wake you up: if it came up heads I would wake you up once, and if it came up tails I would wake you up twice. In either case, every time I wake you up I will tell you exactly what I am telling you right now, including offering you the bet which I will describe next. Finally, I have given you a special sleeping potion that will erase your memory of this and any previous time I have awakened you. Here is the bet: I am offering even odds on the coin that I tossed. The stakes are $1 and you can take either side of the bet. Which would you like? Your choice, as well as the outcome of the coin toss, is being recorded by a trustworthy third party, so you can trust that the bet will be faithfully executed.
Which bet do you prefer? In other words, conditional on having been awakened, which is more likely, heads or tails? You might want to think about this for a bit first, so I will put the rest below the fold.
There is a logical trap to watch out for.
1. Since I will have no memory of any previous awakening, if I am awakened twice I will make the same choice both times (unless I am indifferent).
2. So if I bet tails, and the coin has actually come up tails, I will be awakened twice, both times betting tails and winning. I would make $2. If it comes up heads, I will be awakened only once and lose. I would lose $1.
3. But if I bet heads, I would win $1 if the coin comes up heads and lose $2 if tails.
If you accept 1-3, and you trust that the researchers tossed a fair coin, you strictly prefer to bet on tails. In fact, you would bet on heads only if you were given better than 3 to 1 odds.
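As a sanity check on points 1-3, the two committed strategies are easy to simulate. This sketch is mine, not part of the original post; the payoff convention ($1 won or lost per waking, at even odds) follows the setup above.

```python
import random

def average_payoff(bet, n=100_000, seed=0):
    """Average payoff per experiment of always betting `bet` ('H' or 'T')
    at $1 even-odds stakes, with one waking on heads and two on tails."""
    rng = random.Random(seed)
    total = 0
    for _ in range(n):
        coin = rng.choice("HT")
        wakings = 1 if coin == "H" else 2
        # With no memory, the same bet is placed at every waking (point 1).
        total += wakings * (1 if bet == coin else -1)
    return total / n

print(average_payoff("T"))  # ≈ +0.5: win $2 on tails, lose $1 on heads
print(average_payoff("H"))  # ≈ -0.5: win $1 on heads, lose $2 on tails
```

At 3 to 1 odds on heads the two strategies would tie (0.5 × $3 − 0.5 × $2 = $0.50 either way), which is where the "better than 3 to 1" figure comes from.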
But you should not accept 2 or 3. Point 1, on the other hand, is absolutely true: I will make the same choice twice. Points 2 and 3, however, make the fallacious jump from the observation that I will make the same choice twice to the conclusion that I must make the same choice twice.
Instead, when I am awakened, I am free to make either bet. And whatever choice I make now cannot have a causal effect on my choice in a (possible) second awakening. So I should think like this.
- In that other awakening I will make some bet, and my choice now has no effect on it.
- I should therefore measure the incremental gain or loss from my bet at this moment.
And we can solve the problem this way because it turns out that the incremental gain does not depend on how I will bet in the other awakening.
- Suppose that I would bet on tails in a second awakening. If I currently bet tails, then if the coin came up tails I will win twice; if heads I will lose once. Since the coin is fair, my expected payoff is 50 cents (.5 probability of winning twice, .5 probability of losing once). On the other hand, if I currently bet heads, then if the coin came up tails I will win once and lose once; if heads I will win once. The expected payoff is again 50 cents. I am indifferent.
- Suppose that I would bet on heads in a second awakening. If I currently bet tails, then if the coin came up tails I will win once and lose once; if heads I will lose once. If I currently bet heads, then if the coin came up tails I will lose twice; if heads I will win once. With either bet I have an expected loss of 50 cents. I am again indifferent.
- Likewise I would be indifferent currently between the two bets if I had some probabilistic belief about how I would bet in the other awakening.
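The cases above can be verified by exhaustive enumeration. Here is a sketch of mine (not part of the original post), using exact rational arithmetic, computing the expected payoff of each current bet while holding fixed the bet that would be made at the other awakening:

```python
from fractions import Fraction

def expected_payoff(my_bet, other_bet):
    """Expected total winnings when I bet `my_bet` now and would bet
    `other_bet` at the (possible) other awakening; fair coin, $1 stakes."""
    # Heads (prob 1/2): one waking, so only my current bet is placed.
    heads = 1 if my_bet == "H" else -1
    # Tails (prob 1/2): two wakings, so both bets are placed.
    tails = (1 if my_bet == "T" else -1) + (1 if other_bet == "T" else -1)
    return Fraction(1, 2) * heads + Fraction(1, 2) * tails

for other in "TH":
    print(other, expected_payoff("T", other), expected_payoff("H", other))
# T 1/2 1/2    <- other bet tails: indifferent, 50 cents either way
# H -1/2 -1/2  <- other bet heads: indifferent, -50 cents either way
```

In both rows the current bet makes no difference, which is the indifference claimed in the bullets.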
So no matter what I believe my other bet will be, I am indifferent between betting on heads or tails currently. This reveals that I believe it is equally likely that the coin has fallen on heads or tails. This is despite the fact that, indeed, my payoff is strictly higher if my strategy is to bet on tails at every opportunity. It is a feasible strategy; it is consistent in the sense that I would have no reason to deviate from it when I actually have to place my bet (point 1 confirms this); and it gives me an expected payoff of 50 cents!
(The truly bizarre thing is that I am indifferent as to how I bet now, but I strictly prefer that the other time I bet tails. The resolution of course is that if there is another time, then indeed tails is the better bet. 🙂 )
(Follow the Wiblin link to see how physicists are (apparently) willing to bet large sums of money on the existence of infinitely many parallel universes, based on a similar example.)

22 comments
March 25, 2010 at 11:46 pm
jeff
Here I am commenting on my own article, because the more I think about it, the less convinced I am of my own reasoning 🙂 When I hold fixed my other bet and consider the incremental gain from changing my current bet, I appeal to the fair coin to attach equal probabilities to heads and tails. But arguably at this stage of decision-making I could update that probability. I could imagine someone saying that since there are more awakenings with tails, I should attribute higher conditional probability to tails at this stage. But I am not sure how to make this formal. Do the rules of conditional probability say anything?
March 28, 2010 at 1:47 am
Warren Dew
Yes, the rules of conditional probability say something. Basically waking up is new information allowing you to take into account the answer to the question, “what are the chances that this wakeup is the result of a toss of heads, as opposed to the result of a toss of tails?” The answer is obviously 1/3. Thus, the answer to the original question is also 1/3: the 1/3 chance of the wakeup being the result of a heads times the certainty that the result was heads in that case, plus the 2/3 chance of the wakeup being the result of a tails times the 0 chance that the toss was heads in that case.
I do not understand how anyone can think the correct answer is 1/2, or think that the question is ill posed. Then again, I don’t understand how anyone can reject the doomsday argument, either.
March 26, 2010 at 12:50 am
Emil Temnyalov
I can’t quite follow the reasoning of the argument, but I think that’s because the question is a little vague. If you ask me: what is the probability that the fair coin came up heads, a priori, I would say 1/2. If you ask me: what is the probability that I have just awoken you because the fair coin landed Heads, I will say 1/3. Those seem like two distinct questions, and so the fact that I would answer them differently doesn’t bother me.
So if the precise question is: conditional on being awoken will you bet heads or tails, then I will strictly prefer to bet tails.
To make the conclusion a bit more obvious, we can change the setting a little bit: the experimenter will wake me up and offer me the bet if and only if the coin lands Tails. Of course, I still think that the probability of a fair coin landing Tails is 1/2, but I now also think that the probability that the fair coin landed Tails given that you woke me up and offered me the bet is 1. Hence I’ll always want to bet Tails when you offer me the bet.
March 26, 2010 at 1:02 am
Robert Wiblin
Jeff: the paper I link to tries to use conditional probabilities to show that the probability is 1/3 on any individual wakening. It is short and easy to read:
Click to access sleeping.pdf
I am not sure of your analysis – I am still trying to get my head around this, but I get different answers when I change small parts of the framing.
March 26, 2010 at 1:11 am
Kim
Emil, while you’re right that additional information can change the a priori probabilities ascribed to an event/non-event (indeed, this is what Bayesian theory and conditional probability are all about), your analogy for the solution to this problem is false, since it invokes a zero (or an infinity, depending on how you look at it), which changes the maths.
The answer given in the main post is correct, although I would adopt slightly different reasoning to illustrate this.
The only reason you might end up with the 1/3 answer is that you’re counting the *number of times you are correct* in some notional iterative test run, instead of the *proportion of iterations in which you’re correct*.
Suppose we run the whole test on someone 100 times, and suppose, conveniently, the 100 random coin tosses produce 50 heads and 50 tails.
Strategy 1: Guess tails every time.
On 50 iterations you’re wrong, and each of these times you get woken up once, so you make 50 wrong guesses and you’re wrong in respect of 50 tosses of the coin.
On 50 iterations you’re right, and each of these times you get woken up TWICE, so you make 100 correct guesses but you’re correct in respect of only FIFTY tosses of the coin still.
So you get 50 0/1’s, and 50 2/2’s.
This gives you 100/150 by guesses, but 50/100 by tosses. Because you essentially get asked the *same question* twice for the tails ones, being right about that is double counted if one does a simple arithmetic tally of guesses.
Strategy 2: Guess heads every time
On 50 iterations you’re right, and each of these times you get woken up once, so you make 50 correct guesses and you’re correct in respect of 50 tosses of the coin.
On 50 iterations you’re wrong, and each of these times you get woken up TWICE, so you make 100 wrong guesses but you’re wrong in respect of only FIFTY tosses of the coin still.
This time, your being wrong is double counted, while your being right is only single counted.
Now, a lot of people will uncritically apply an arithmetic tallying to the guesses, because arithmetic aggregates are those we’re most familiar with from our very early mathematical learning. In fact it’s instinctive for most people, even people with some real mathematical talent, to apply arithmetic aggregation without even being fully conscious of it.
But here, it’s illegitimate. We’re not interested in the pay-off if you get $1 per correct guess – or if we are, then I entirely accept the guess tails strategy. But we’re really asking what the probability is of the coin toss having gone a given way. And for that, you don’t want to double count instances where you are asked that question twice. You want to count once for each toss of the coin. And when you do that, counting the 2/2 the same as the 1/1 and so forth, both strategies give equal expected value. The probability is therefore 1/2 each way.
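Kim's two tallies can be reproduced in a quick simulation (a sketch I've added, not part of the comment): score the always-tails guesser once per guess and once per coin toss.

```python
import random

def tally(n=100_000, seed=1):
    """Score the always-tails guesser per guess and per coin toss."""
    rng = random.Random(seed)
    correct_guesses = total_guesses = correct_tosses = 0
    for _ in range(n):
        coin = rng.choice("HT")
        wakings = 1 if coin == "H" else 2
        total_guesses += wakings
        if coin == "T":
            correct_guesses += wakings  # right at both wakings...
            correct_tosses += 1         # ...but only one toss was called right
    return correct_guesses / total_guesses, correct_tosses / n

per_guess, per_toss = tally()
print(per_guess)  # ≈ 2/3, the 100/150 count by guesses
print(per_toss)   # ≈ 1/2, the 50/100 count by tosses
```

Both numbers are correct answers to different questions, which is exactly the distinction the comment is drawing.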
March 26, 2010 at 2:55 pm
Emil Temnyalov
Regarding Kim’s initial comment: it is true that I am offering a limiting case as an example when I rephrase the problem by assuming that the experimenter will only wake me up and offer me the bet on Tails.
However, we can approximate that case by rephrasing the problem: the experimenter will wake me up and offer me the bet once if Heads, or n times if Tails. Since we’re sticking to the decision theoretic argument, it seems quite clear to me that for any n>1 I would strictly prefer to bet on Tails every time I am offered a bet.
March 26, 2010 at 1:22 am
Kim
Another way to look at this, if people are wedded to the conditional on awakening subjective analysis rather than an iterative test run, is as follows:
To say ‘conditional on awakening’ etc is actually to obscure an inherent ambiguity. Simply saying ‘conditional on awakening’ is meaningless in one sense: P(awakening | heads) = P(awakening | tails) = 1, since you’re guaranteed to wake up at least once. But P(awakening second time) is obviously dependent on the result of the coin toss.
Looking at a probability tree, we can split first into heads and tails (0.5 each), but then, going down the tails path, we split again into halves on the basis of whether the current awakening is the first or second.
Thus it’s NOT true that each sort of awakening (Heads, Tails 1st, Tails 2nd) is equally likely a priori! So we can’t just say ‘oh well, in 2 out of 3 of them I should guess tails, so there we are’. A better representation of the awakenings, given no other knowledge, is
P(This awakening is from heads and is the only one) = 0.5
P(This awakening is from tails and is the first) = 0.25
P(This awakening is from tails and is the second) = 0.25
This of course reduces back to the 1/2 vs 1/2 position.
March 26, 2010 at 1:24 am
Robert Wiblin
Kim: What did you make of the conditional probability reasoning in the paper?
Even if you’re not right a greater proportion of the time guessing tails, the fact remains that on any single wakeup, you are twice as likely to be in a tails scenario. The paper seems to show that well.
March 26, 2010 at 1:26 am
Kim
See above, my 2nd comment.
March 26, 2010 at 1:27 am
Kim
(haven’t read that paper and don’t have the time now but don’t really feel I need to)
March 26, 2010 at 1:39 am
Robert Wiblin
Re your second post Kim: You really think on waking the probability of being in H1 is double that of being in T1 or T2?
This makes more sense to me:
“If (upon first awakening) you were to learn that the toss outcome is Tails, that would amount to your learning that you are in either T1 or T2. Since being in T1 is subjectively just like being in T2, and since exactly the same propositions are true whether you are in T1 or T2, even a highly restricted principle of indifference yields that you ought then to have equal credence in each. But your credence that you are in T1, after learning that the toss outcome is Tails, ought to be the same as the conditional credence P(T1|T1 or T2), and likewise for T2. So P(T1|T1 or T2) = P(T2|T1 or T2), and hence P(T1) = P(T2).
…
Now: if (upon awakening) you were to learn that it is Monday, that would amount to your learning that you are in either H1 or T1. Your credence that you are in H1 would then be your credence that a fair coin, soon to be tossed, will land Heads. It is irrelevant that you will be awakened on the following day if and only if the coin lands Tails — in this circumstance, your credence that the coin will land Heads ought to be 1/2. But your credence that the coin will land Heads (after learning that it is Monday) ought to be the same as the conditional credence P(H1|H1 or T1). So P(H1|H1 or T1) = 1/2, and hence P(H1) = P(T1).”
March 26, 2010 at 1:41 am
Kim
That is a sneaky trick (evidently sneaky enough to elude the author), which relies upon injecting additional information, and perhaps also on the fine point that the relationship between T1 and H1 (mutually exclusive events) is different from the relationship between T1 and T2 (events which may both occur).
March 26, 2010 at 2:28 am
Chris
I think you make one mistake:
You show that, whatever my belief about my bet at the second awakening, I am indifferent between the two bets at my first awakening. You forget that I have to have beliefs about which awakening I am in.
Now everything is simple: in the second awakening I prefer to bet tails, and in the first I am indifferent (given beliefs about my second bet). Hence for beliefs assigning prob > 0 to being in the second awakening, it is optimal to bet tails.
March 26, 2010 at 9:06 am
jeff
Not so fast 🙂 Conditional on knowing this is the first awakening, I am not indifferent, because conditional on knowing this is the first awakening I assign probability greater than 1/2 to heads. This is standard conditional probability; memory doesn’t play any tricks in this calculation.
To see this, suppose we have two urns. One contains a single ball with the number 1 written on it. The other contains two balls. The first has the number 1 on it, the second has the number 2 on it. Now the researcher tosses a coin to pick an urn. Then he draws a ball from that urn. Suppose you are told that the ball has the number 1 on it (i.e. the first awakening). Your conditional probability that it was drawn from the first urn is 2/3.
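The urn example is easy to check by counting; here is a simulation sketch (mine, not part of the comment) estimating the conditional probability.

```python
import random

def p_first_urn_given_one(n=100_000, seed=2):
    """Urn 1 holds a single ball numbered 1; urn 2 holds balls numbered 1 and 2.
    A fair coin picks the urn, then a ball is drawn uniformly from it.
    Estimate P(urn 1 | drawn ball is numbered 1)."""
    rng = random.Random(seed)
    drew_one = from_first = 0
    for _ in range(n):
        first_urn = rng.random() < 0.5
        ball = 1 if first_urn else rng.choice([1, 2])
        if ball == 1:
            drew_one += 1
            from_first += first_urn
    return from_first / drew_one

print(p_first_urn_given_one())  # ≈ 2/3
```

The estimate converges to 2/3, matching Bayes: (1/2 · 1) / (1/2 · 1 + 1/2 · 1/2).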
March 26, 2010 at 2:31 am
To SIA or not to SIA? « Robert Wiblin
[…] Betting and Probability: A Curious Puzzle (especially see the comments) […]
March 26, 2010 at 12:46 pm
Kevin Dick
You can’t update your priors if your memory is erased. So this problem is exactly equivalent to deciding what your _prospective_ strategy should be.
If you draw the prospective decision tree, you see that guessing tails will be right 2/3 of the time and (assuming some symmetry in your utility of being right/wrong in heads/tails) you should therefore choose tails. So you always choose tails.
March 26, 2010 at 7:29 pm
PLW
I’m going with 5/8.
Here’s the idea. When you wake up, it is either the first time or the second time. Conditional on it being the first time you’ve woken, there is a 1/2 probability it was heads and a 1/2 probability that it was tails. Formally, P(H|First)=1/2. If it is the second time you’ve awakened, the flip was heads for sure (formally, P(H|Second)=1). Finally, the unconditional probability that a given wake-up is the first one is 3/4: P(First)=P(First|Heads)*P(Heads)+P(First|Tails)*P(Tails). So the total probability that the flip was heads is P(H|First)*P(First)+P(H|Second)*P(Second) = 1/2*(3/4)+1*(1/4) = 5/8.
March 27, 2010 at 8:56 am
Lones Smith
Am I saying the obvious here? This is the Absent Minded Driver Problem
Click to access absntdrvr.pdf
with an exit chance (behavior strategy) of 1/2 specified. So Wiblin’s second answer of 1/3 is the Rubinstein-Piccione consistent belief?
March 28, 2010 at 12:32 pm
Clones In Their Jammies, and Other Variations « Cheap Talk
[…] | Tags: game theory, maths, vapor mill | by jeff I’ve been thinking about the Sleeping Beauty problem a lot and I have come up with a few variations that help with intuition. So far I […]
April 1, 2010 at 3:40 am
WillJ
To see why the 1/3 argument is crap, consider an analogous problem: I flip a fair coin. If it lands heads, I set one red card on a table. If it lands tails, I set two red cards on the table. In either case, I ask you to draw a card.
You draw a card, and it’s red. What’s the probability, conditional on that information, that the coin I flipped was heads?
That’s exactly the same problem, and in this case it’s not so hard to see why saying 1/3 is stupid. The answer’s clearly 1/2.
Now, if we stick with that analogous problem and tweak it so that there are two spots on the table to put a card, and you’re given the information that there’s a red card in Spot 1 (or 2), THEN the probability of heads is 1/3 and tails 2/3. Just like if the original problem were that you were awakened and told it’s the first day (or the second day), THEN the probabilities would be 1/3 vs. 2/3. But in the original problem, all you know is that you were awakened (in the analogous problem, you drew a red card), period. The unconditional probabilities remain unaltered.
More straightforward way of looking at it (the original problem):
The fact that you are awakened doesn’t tell you any new information. When you’re awakened, all you know is that you were awakened, period. And this would have happened regardless.
P(heads, unconditional) = 1/2
P(tails, unconditional) = 1/2
P(you’re awakened | heads) = 1
P(you’re awakened | tails) = 1
let H = heads, A = awakened, then:
P(H|A)= P(H and A)/P(A) = P(H)P(A|H)/[P(H)P(A|H) + P(T)P(A|T)]
= [1/2 * 1]/([1/2 * 1] + [1/2 * 1])
=1/2
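The Bayes calculation above can be reproduced with exact fractions, alongside the per-waking count that produces the rival 1/3 (a sketch I've added, not part of the comment):

```python
from fractions import Fraction as F

p_h = p_t = F(1, 2)
p_a_given_h = p_a_given_t = F(1)  # awakened (red card drawn) either way

# Per trial, as in the comment: being awakened carries no information.
p_h_given_a = p_h * p_a_given_h / (p_h * p_a_given_h + p_t * p_a_given_t)
print(p_h_given_a)  # 1/2

# Per waking (per card), as the thirders count: tails produces two wakings.
h_wakings = p_h * 1
t_wakings = p_t * 2
print(h_wakings / (h_wakings + t_wakings))  # 1/3
```

The disagreement in the thread comes down to which of these two ratios the question is asking for.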
May 31, 2010 at 2:33 pm
Mark Stegeman
The answer is clearly 1/3 (e.g. the Dew, Wiblin, and Dick posts). To look at this another way, suppose that the subject knew whether he was being awakened for the first time. If so, then his posterior probability of H would be 1/2. If he knew instead that he was being awakened for the second time, then his posterior probability of H would be zero. If he is awakened and doesn’t know which case is true, then the posterior blends the two cases, implying a posterior probability less than 1/2.
The problem (as I see it) in Jeff’s 3-step argument concerning incremental gain, is that the premise is inconsistent with the decision problem. The premise that I would bet on tails in a second awakening (regardless of whether it is earlier or later) implies that I would bet on tails when awakened in the event of tails, because I would be unable to distinguish the two situations. But the premise that I would bet on tails in the event of tails implies that I must bet on tails in the current situation, because I also cannot distinguish those two situations. Therefore, the premise precludes the decision.
The problem (as I see it) in WillJ’s reasoning: Suppose that your mother will call you either once or twice during the month, twice if and only if someone dies (event T). The ex ante probability of someone dying is 1/2. If A denotes the event that your mother will call sometime during the month, then P(T|A)=1/2. But if A denotes the event that your mother just called, without conditioning on whether she has called earlier in the month, then, as already argued, the posterior probability of T exceeds 1/2. WillJ is, in effect, using the first definition of A, but the Sleeping Beauty problem calls for the second definition. This is similar to the flaw in Kim’s argument.
Thanks for posting this nice puzzle, to which someone just directed me (hence the late post). I think common intuition incorporates perfect recall, and that is why intuition seems to reject the 1/3 answer to this problem.
May 31, 2010 at 3:43 pm
jeff
Mark, thank you for your comment. Yes, in the time since I posted this I have thought a lot more about the problem, and while I still don’t think there is any right answer, I do think that there are strong arguments against 1/2. And you spotted a good one.
One remaining puzzle is that if your interim beliefs are not 1/2 then upon placing your bet in favor of tails you have an expected earnings of something other than the true expected earnings. For example if your belief is 1/3 on H then your expectation of your earnings is 2/3*2 + 1/3*(-1) = $1. And if you were to repeat this experiment many times and you were shown the data at the time you were going to bet you would see that on average your earnings were (converging to) 1/2*2 + 1/2*(-1) = 50 cents. This would give you pause.
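The mismatch in this last paragraph is easy to tabulate with exact fractions (a sketch of mine): the interim expectation under the thirder belief versus the ex ante expectation under the fair coin.

```python
from fractions import Fraction as F

# Interim expectation at a waking, betting tails, under the thirder belief
# P(H) = 1/3: with prob 2/3 the coin was tails and I win at both wakings (+$2);
# with prob 1/3 it was heads and I lose once (-$1).
interim = F(2, 3) * 2 + F(1, 3) * (-1)
print(interim)  # 1

# Ex ante expectation per experiment under the fair coin: the true long-run
# average earnings per trial of the always-tails strategy.
ex_ante = F(1, 2) * 2 + F(1, 2) * (-1)
print(ex_ante)  # 1/2
```

The $1 interim figure double counts the tails trials, since each one is evaluated at two wakings; averaged per trial, the data would converge to 50 cents.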