Jeff discussed a seminal game-theoretic analysis of Cheap Talk in an earlier post: “Strategic Information Transmission” by Crawford and Sobel studies a decision-maker, the Receiver, who listens to a message from an informed advisor, the Sender, before making a decision. The optimal decision for each player depends on the information held by the Sender. If the Sender and Receiver have the same preferences over the best decision, there is an equilibrium in which the Sender reports his information truthfully and the Receiver makes the best possible decision.

What if the Sender is biased and wants a different decision, say one a bit to the left of the Receiver’s optimum? Then the Sender has an incentive to lie to pull the Receiver’s decision to the left, and always telling the truth is no longer an equilibrium. Crawford and Sobel show that, in equilibrium, information can then only be conveyed in lumpy chunks: the Sender reveals which interval the state lies in, and the Receiver takes the best expected decision given that interval. The bigger the bias, the less information can be transmitted in equilibrium and the larger each lump must be.
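The lumpy chunks have a closed form in the standard uniform-quadratic specification of the Crawford-Sobel model (state uniform on [0,1], quadratic losses, Sender bias b): at most N intervals can be sustained, where N is the largest integer with 2N(N-1)b < 1, and the boundaries are a_i = i·a_1 + 2i(i-1)b with a_N = 1. A minimal sketch of that arithmetic (my illustration, not from the post):

```python
import math

def max_partition_size(b):
    # Largest N with 2*N*(N-1)*b < 1: the most informative
    # Crawford-Sobel equilibrium has N intervals.
    return math.ceil(-0.5 + 0.5 * math.sqrt(1 + 2.0 / b))

def boundaries(b):
    # Interval endpoints a_0 = 0 < a_1 < ... < a_N = 1.
    N = max_partition_size(b)
    a1 = 1.0 / N - 2 * (N - 1) * b
    return [i * a1 + 2 * i * (i - 1) * b for i in range(N + 1)]

print([round(a, 3) for a in boundaries(0.05)])  # → [0.0, 0.133, 0.467, 1.0]
print(max_partition_size(0.25))                 # → 1 (bias so big only babbling survives)
```

Note that successive interval lengths grow by exactly 4b, and that a larger bias b forces fewer, coarser intervals, which is the “bigger bias, bigger lumps” comparative static.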

The Crawford-Sobel model generates differences of opinion from differences in preferences.  But individuals who have the same preferences yet different beliefs also have differences of opinion.  The Sender and Receiver may agree that if global warming is occurring, drastic action should be taken to slow it down.  But the Sender may believe it is occurring while the Receiver believes it is not.  Differences in beliefs seem to create a bias much like a difference in preferences, so one might conjecture there is little difference between the two.  A lovely paper by Che and Kartik shows this is not the case.  If the Sender with a belief-based bias acquires information, his belief changes.  If signals are informative, his beliefs must move closer to the truth and his bias must go down.  If a Sender with a preference-based bias acquires information, his bias by definition does not change.  So, when differences of opinion are belief-based, information acquisition changes the bias, and indeed reduces it.  This allows the Sender to transmit more information in equilibrium and improves the Receiver’s decision (the Crawford-Sobel intuition, but in a different model).  The Sender values this influence and so has good incentives to acquire information.  Hiring an advisor with a different belief is therefore valuable for the decision-maker, better than having a Yes-Man. Some pretty cool and fun insights.  And it is always nice when the intuition is well explained and related to classical work.
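The key mechanism, that informative signals pull belief-based disagreement toward zero, can be illustrated with a toy Bayesian simulation (my sketch; the binary warming state, priors, and signal precision are made-up numbers, not from the paper):

```python
import random

def update(belief, signal, precision=0.8):
    # One Bayesian update on P(warming) given a binary signal with
    # P(signal=1 | warming) = precision and
    # P(signal=1 | no warming) = 1 - precision.
    like_w = precision if signal == 1 else 1 - precision
    like_n = (1 - precision) if signal == 1 else precision
    return belief * like_w / (belief * like_w + (1 - belief) * like_n)

random.seed(0)
# Suppose warming really is occurring: signals are 1 with probability 0.8.
signals = [1 if random.random() < 0.8 else 0 for _ in range(200)]

sender, receiver = 0.9, 0.3   # different priors: a belief-based "bias" of 0.6
for s in signals:
    sender, receiver = update(sender, s), update(receiver, s)

# Both posteriors approach the truth, so the disagreement shrinks.
print(abs(sender - receiver))  # tiny compared with the initial gap of 0.6
```

A preference-based bias has no analogue of this: acquiring more signals would leave the gap between the players' ideal decisions untouched.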

There is lots of other subtle stuff and I am not doing justice to the paper.  You can find the paper Opinions as Incentives on Navin’s webpage.