Mindhacks discusses a surprising asymmetry: journalists discussing sampling error almost always emphasize the possibility that the variable in question has been underestimated.
For any individual study you can validly say that you think the estimate is too low, or indeed, too high, and give reasons for that… But when we look at reporting as a whole, it almost always says the condition is likely to be much more common than the estimate.
For example, have a look at the results of this Google search:
“the true number may be higher” 20,300 hits
“the true number may be lower” 3 hits
There are two parts to this. First, the reporter is trying to sell her story. So she is going to emphasize the direction of error that makes for the most interesting story. But that by itself doesn’t explain the asymmetry.
Let’s say we are talking about stories that report “condition X occurs Y% of the time.” There is always an equivalent way to say the same thing: “condition Z occurs (100−Y)% of the time,” where Z is the negation of X. If the selling point of the story is that X is more common than you might have thought, then the author could just as well say that the true frequency of Z may be lower than the estimate.
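The equivalence is simple arithmetic, and can be sketched in a few lines of Python. The function name, condition, and percentage below are made up purely for illustration:

```python
# Sketch of the equivalence above: the claim "X occurs y% of the time"
# is the same claim as "not-X occurs (100 - y)% of the time".
# The condition name and figure are hypothetical examples.

def reframe(condition: str, percent: float) -> tuple[str, float]:
    """Restate a prevalence claim as the equivalent claim about its negation."""
    return (f"not {condition}", 100 - percent)

# "X occurs 8% of the time" ...
claim = ("X", 8.0)
# ... is equivalent to "not-X occurs 92% of the time".
print(reframe(*claim))  # ('not X', 92.0)

# So "the true frequency of X may be HIGHER than 8%" is exactly the
# same assertion as "the true frequency of not-X may be LOWER than 92%";
# the reporter's choice between them is pure framing.
```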
So the big puzzle is why stories are almost always framed in just one of two completely equivalent ways. I assume that a large part of the explanation is:
- News is usually about rare things/events.
- If you are writing about X and X is rare, then you make the story more interesting by pointing out that X might be less rare than the reader thought.
- It is more natural to frame a story about the rareness of X by saying “X is rare, but less rare than you think” rather than “the lack of X is common, but less common than you think.”
But the more I think about this symmetry, the less convinced I am by this argument. Anyway, I am still amazed at the numbers from the Google searches.