A reader writes:
This might be interesting for you: people are annoyed/unhappy with NPR’s coverage of a report out of Stanford that said organic food isn’t really any healthier than conventional food. Clear case in point showing that when people are faced with data and facts that counter their beliefs, they will more often than not completely shut them out and continue with their original belief. It’s much easier to continue believing what you thought before than to stretch your mind and possibly acknowledge that you’re wrong, or at least that you don’t have the full story. And we wonder why things don’t get accomplished?
What the reader has in mind is confirmation bias, the cognitive bias that makes people give more weight to empirical evidence that supports their beliefs and less weight to empirical evidence that contradicts their beliefs, and about which I have written before in the context of development policy.