You firmly believe that the sun will rise every morning. Then one day you awake and the sun does not rise. What are you to believe now? You have basically two alternatives. One is to go on believing that the sun will rise every morning by rationalizing today’s exception. There could have been a total eclipse this morning. Perhaps you are dreaming. The other choice is to conclude that you were wrong and the sun does not rise every morning.
The “rational” (i.e. Bayesian) reaction is to weigh your prior beliefs in the alternatives. Yes, to believe that you are dreaming despite many pinches, or that a solar eclipse lasted all day, would be to believe something close to absurd. But given your near-certainty that the sun would always rise, we are already squarely in the exceptional territory of events with very low subjective probability. What matters is the relative proportion of that low total probability made up by the competing hypotheses: some crazy exception happened, or the sun in fact doesn’t always rise. It would be perfectly understandable, indeed rational, to find the first much more likely than the second. That is, even in the face of this contradictory evidence you hold firm to your belief that the sun rises and infer that something else truly unexpected happened.
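To make the weighing concrete, here is a minimal sketch of the posterior-odds arithmetic in Python. Every number in it is an invented prior, chosen purely for illustration; the point is the ratio, not the particular values.

```python
# All probabilities below are invented for illustration only; the point
# is how Bayes' rule weighs the two competing hypotheses against each other.

p_no_rise = 1e-9    # prior: the sun does not in fact rise every morning
p_exception = 1e-6  # prior: some wild exception occurs (day-long eclipse, dreaming)

# Both hypotheses fully explain the observation "the sun did not rise
# this morning," so the likelihood of the evidence is ~1 under each.
likelihood = 1.0

# Posterior odds of "exception happened" vs. "the sun doesn't always rise":
posterior_odds = (likelihood * p_exception) / (likelihood * p_no_rise)
print(f"Odds favoring the exception: {posterior_odds:,.0f} to 1")  # 1,000 to 1
```

With these made-up priors, the exception hypothesis wins by three orders of magnitude, which is exactly the sense in which clinging to the old belief can be rational.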
Cognitive dissonance is a family of theories in psychology explaining how we grapple with contradictory thoughts. It has many branches, but a prominent one, perhaps the earliest, suggests that we irrationally discard information that conflicts with our preconceived ideas. It began with a study by Leon Festinger, who observed a cult that believed the Earth was going to be destroyed on a certain date. When that date passed and the Earth was not destroyed, some members of the cult interpreted this as proof that they had been right all along: it was their faith that saved humanity. This became the canonical example of cognitive dissonance.
My preamble about Bayesian inference shows that when we see people who are rigid in their beliefs and conclude that they are irrationally ignoring information, it is in fact we who are jumping to a conclusion. All we can really say is that we disagree with their prior beliefs, and in particular with the strength of those beliefs. Somehow, though, it is much less satisfying to simply disagree with someone than to say that they are acting irrationally in the face of clear evidence.
Now, watch this video, especially the part that starts at the 3:00 mark. When this guy experiences his moment of cognitive dissonance, what is the rational resolution?

3 comments
July 12, 2010 at 3:03 pm
Matt Warren
Great, quick piece. I love it! I’m always looking for more reason to suspect that lump of gray matter stuck in my head has its own agenda.
July 12, 2010 at 9:17 pm
michael webster
Leon Festinger’s work was on a) actions, b) beliefs, and c) what happens when the beliefs in b) fail to justify the actions in a).
One of the great experiments he did was with two groups of people who were engaged in boring tasks. One group was given a small amount of money to complete the task, the other significantly more. Then they were both asked to rate the task while recommending it to others: the low money group rated the task as more pleasant than the high money group.
Festinger concluded that the low money group had to change their opinion of the task because they didn’t think that they would become shills for no money.
It is this social element of cognitive dissonance that your example fails to capture.
July 15, 2010 at 11:00 am
Idea scratchpad 1 « Daily Expositions
[…] Why are we easily embarrassed by slips in logical consistency when expressing ideas? Isn’t it safe to assume that every person can have conflicting thoughts or ideas about how the world works? What does this phenomenon imply about the way we study human behavior? Is there a mathematical framework that can allow for this explicitly? How about this cognitive dissonance phenomenon? (Hat tip to Cheap Talk.) […]