Last updated on September 16, 2012
A reader writes:
This might be interesting for you: people are annoyed/unhappy with NPR’s coverage about a report out of Stanford that said organic food isn’t really any healthier than conventional food. Clear case in point showing when people are faced with data and facts that counter their beliefs, they will more often than not completely shut it out and continue with their original belief. It’s much easier to continue believing what you thought before than stretch your mind and possibly acknowledge that you’re wrong or at least don’t have the full story. And we wonder why things don’t get accomplished?
What the reader has in mind is confirmation bias, the cognitive bias that makes people give more weight to empirical evidence that supports their beliefs and less weight to empirical evidence that contradicts their beliefs, and about which I have written before in the context of development policy.
Confirmation Bias and Ambiguity Aversion
Suppose, however, that in response to the news that a study has found that organic food isn’t necessarily more healthful than regular food, someone tells you that they will keep buying organic food. Can you conclude that they suffer from confirmation bias?
Not necessarily. Indeed, it could be that the person is ambiguity-averse. That is, she knows there is a positive probability that regular food will make her sick, but there is considerable uncertainty as to what exactly that probability is.
Most readers of this blog will be familiar with the notion of risk aversion, i.e., aversion to adverse events (such as financial losses) whose probability of occurring is known. Fewer readers, however, will be familiar with the notion of ambiguity aversion, i.e., aversion to adverse events whose probability of occurring is itself uncertain. Economists often use "risk" to denote the former and "uncertainty" (or "Knightian uncertainty") to denote the latter.
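To make the distinction concrete, here is a minimal formalization, using my own notation rather than anything drawn from the studies discussed above. Under risk, the probability p of the adverse event is known, and an expected-utility maximizer evaluates

\[
U_{\text{risk}} = p \, u(\text{sick}) + (1 - p) \, u(\text{healthy}).
\]

Under ambiguity, p itself is unknown. One standard way of modeling ambiguity aversion is the maxmin expected utility of Gilboa and Schmeidler (1989), in which the decision maker entertains a whole set P of plausible probabilities and acts as if the worst one were true:

\[
U_{\text{ambiguity}} = \min_{p \in P} \Big[ p \, u(\text{sick}) + (1 - p) \, u(\text{healthy}) \Big].
\]

An ambiguity-averse organic buyer, in other words, behaves as if the probability that regular food will make her sick sits at the high end of whatever range she finds plausible.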
So when someone says they will not change their behavior in response to new empirical evidence that is inconsistent with that behavior, we cannot tell, without knowing anything more about their decision process, whether they suffer from confirmation bias or are simply ambiguity-averse.
That person would suffer from confirmation bias if she simply refused to update her beliefs and change her behavior in response to new, contradictory evidence. But if she discounts the new findings because there is some likelihood that they are wrong (after all, scientific findings get overturned all the time), then she is simply ambiguity-averse.
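A stylized numerical example, with numbers I am making up purely for illustration, shows how discounting a new study can be consistent with rational updating rather than with confirmation bias. Suppose the buyer's prior belief that organic food is more healthful (call that hypothesis H) is 0.8, that she thinks a null finding would be reported with probability 0.3 even if H were true, and that a null finding would be reported with probability 0.9 if H were false. Bayes' rule then gives

\[
\Pr(H \mid \text{null finding}) = \frac{0.3 \times 0.8}{0.3 \times 0.8 + 0.9 \times 0.2} = \frac{0.24}{0.42} \approx 0.57.
\]

Her belief drops from 0.8 to roughly 0.57, so she has updated. But if she is sufficiently averse to the remaining uncertainty, continuing to pay the organic premium can still be her best response.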
My point is that it can be very difficult to definitively disentangle one from the other, which makes me wonder about studies that try to test for either empirically, and about whether any studies actually test one against the other.
A final note: It can be tempting to see a certain form of hypocrisy in confirmation bias, but it would be a mistake to do so. Confirmation bias is a cognitive bias. Much like it can be difficult not to fall for an optical illusion, it can be difficult to overcome a cognitive bias. This is where self-awareness comes into the picture.