It’s amazing what people can convince themselves is true. A recent National Science Foundation study showed that 25% of Americans actually think the sun revolves around the Earth. There are a lot of good sources, online and off, that labor to dispel misinformation (and kill inaccurate Facebook memes, one hopes)… but the job is far harder than you’d think, because the human brain plays weird little tricks in the “belief” department.
Of course, the brain doesn’t work like a computer hard drive, collecting and filing away knowledge. Information is first collected in the hippocampus (a structure deep in the brain that works something like a computer’s RAM). Then, each time we recall that information, our brain reprocesses it and re-records it, until it is gradually inscribed in the cerebral cortex, completely separate from the original context in which it was learned. For example, you know that Paris is located in France, but you probably don’t remember how you learned it, because, like a sentence conveyed in a game of “telephone,” that information was altered in transmission.
This phenomenon is called source amnesia, and it can cause people to forget whether a statement is true. Even when a lie is presented with a disclaimer marking it as fraudulent or fictional, people often later remember it as “true.”
Over time the disconnect gets worse. A disproved statement from an unreliable source can gain credibility during the time it takes to reprocess recent memories into long-term cortical storage. Knowing this, disingenuous individuals (advertisers, pundits) can exploit source amnesia to spread misinformation as fact. Or, as the line often attributed to Adolf Hitler goes: “if you tell a big enough lie, and tell it often enough, it will be believed.”
If a message or statement is initially very memorable (like a short sound bite, bumper-sticker slogan, or cute commercial jingle), it will remain in memory long after it’s been debunked. If repeated, it grows stronger yet. When its victim repeats it to others, he or she may preface it with “I think I read somewhere…” In a famous study, a group of Stanford students was repeatedly exposed to an unsubstantiated claim: “Coca-Cola is an effective paint thinner.” The more times the students were made to read that statement, the more likely they were to attribute it to a reputable source like Consumer Reports (instead of The National Enquirer, their other choice), as if repetition of the message somehow lent it more credibility.
Meanwhile, confirmation bias causes our brains to fit new facts into our accepted mental worldview: we tend to remember facts that agree with what we already believe and forget facts that contradict our established beliefs.
It’s been suggested that legends and folklore survive and spread because they affect us emotionally; ideas that appeal to our emotions propagate regardless of whether they are true. Worse still, teachers may believe they are combating untruths by discussing and debunking them at length, but the tactic can backfire: by repeating false statements, even to argue against them, they may be reinforcing them through repetition.
American consumers (and voters) are equally susceptible to selectively accepting statements that reinforce what they already think. In another psychological study of such factors, researchers found that even when subjects were explicitly instructed to be objective, they still rejected evidence that disagreed with their pre-established beliefs. True “objectivity” proved almost impossible to achieve.
Oliver Wendell Holmes once wrote that “the best test of truth is the power of the thought to get itself accepted in the competition of the market.” Holmes mistakenly assumed that ideas are more likely to spread if they are true. But our brains just don’t work that way.
Of course we all want to believe what’s right. Unfortunately, we all suffer from the same nasty tendency to decide what’s right BEFORE we study the actual facts.