One reason people over-react to information is that they fail to recognize that the new information is redundant. If two friends tell you they’ve heard great things about a new restaurant in town, it matters whether those are two independent sources of information or really just one. It may be that they both heard it from the same source, say a recent restaurant review in the newspaper. When you neglect to account for redundancies in your information, you become more confident in your beliefs than is justified.
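To see the double counting concretely, here is a toy calculation (the numbers are purely illustrative). Suppose you think the restaurant is equally likely to be good or bad, and a favorable report reaches you with probability 0.8 if it is good and 0.4 if it is bad. One genuinely independent report should move you to 0.8×0.5 / (0.8×0.5 + 0.4×0.5) = 2/3. If you mistakenly treat your two friends’ reports as independent when both trace back to the same review, you apply the update twice and arrive at 0.8²×0.5 / (0.8²×0.5 + 0.4²×0.5) = 0.8, more confidence than the single underlying source justifies.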
This kind of problem only gets worse as the social network becomes more connected, because it’s ever more likely that your two friends have mutual friends, and hence common sources of information.
And it can explain an anomaly of psychology: polarization. Sandeep, in his paper with Peter Klibanoff and Eran Hanany, gives a good example of polarization.
A number of voters are in a television studio before a U.S. Presidential debate. They are asked the likelihood that the Democratic candidate will cut the budget deficit, as he claims. Some think it is likely and others unlikely. The voters are asked the same question again after the debate. They become even more convinced that their initial inclination is correct.
It’s inconsistent with Bayesian information processing for groups who observe the same information to systematically move their beliefs in opposite directions: as long as they interpret the evidence the same way, both groups multiply their prior odds by the same likelihood ratio and so move the same way. But polarization is not just the observation that beliefs move in opposite directions. It’s that the information accentuates the original disagreement rather than reducing it: each group moves further in the direction that produced the disagreement in the first place.
Here’s a simple explanation which, as far as I know, is a new one: the voters fail to recognize that the debate generates no new information relative to what they already knew.
Prior to the debate the voters had seen the candidate speaking and heard his views on the issue. Even if these voters had no bias ex ante, their differential reactions to this pre-debate information separate them into two groups according to whether or not they believe the candidate will cut the deficit.
Now when they watch the debate they are seeing the same information again. If they recognized that it was redundant they would not move at all. But if they don’t, then they will all react to the debate in the same way they reacted to the original pre-debate information. Each will become more confident in his beliefs. As a result they will polarize even further.
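A stylized calculation (again with made-up numbers) shows how the gap widens. Suppose every voter starts at probability 1/2 that the candidate will cut the deficit, and the pre-debate exposure is a signal with likelihood ratio 2: voters who read it favorably move to 2/3, voters who read it unfavorably move to 1/3. The debate repeats the same content, so a voter who recognizes the redundancy stays put. A voter who treats it as a fresh, independent signal applies the same likelihood ratio a second time: the favorable group moves to 4/5 and the unfavorable group to 1/5, so the gap between the two groups grows from 1/3 to 3/5.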
Note that an implication of this theory is that whenever a common piece of information causes two people to revise their beliefs in opposite directions, it must increase polarization, not reduce it.
kerokan (January 29, 2013 at 2:13 am):
This seems related: Ortoleva%20Snowberg%20Overconfidence.pdf
weknowthis (January 29, 2013 at 2:18 am):
And this: Druckman%20APSR%202004.pdf
MB (January 29, 2013 at 7:20 am):
Or it can just be explained using the notion of informational cascades.
Enrique (January 29, 2013 at 9:46 am):
Another possible explanation is that people are still being selective and biased when they evaluate new information during the “updating” phase, i.e., they are filtering out information inconsistent with their Bayesian priors when they are updating those priors.