Politicians get frustrated when the ‘lies’ of their opponents are believed. Here’s an explanation.
What this means is that if the subject isn’t very important to you or you have other things on your mind, misinformation is more likely to take hold.
It is assumed that rejecting false information requires more cognitive effort – because you have to establish that it is false – than simply taking it in, which “just” requires accepting the reliability of the source.
So if you are distracted or not that interested, the misinformation is allowed easy uncontested entry to your brain.
This strikes me as bollocks on two grounds:
1) It assumes cognitive laziness is the natural state. But actually, each piece of information is being assessed subconsciously for its validity. The brain checks if the information fits existing patterns or beliefs. If it does, it’s in. That’s not lazy, that’s an assessment.
2) My understanding of brain science is that the relevance of information is crucial. So I find it hard to see how a person who is not interested in a subject is going to retain much of any information at all, or retain it accurately.
Which is exactly what the research found when people did attempt to evaluate the information: they paid attention to a limited number of features, and those features were the parts that fitted with existing knowledge.
This finding is itself consistent with existing brain research.
The research confirmed an aspect of knowledge which Eli Pariser calls the “filter bubble”. This means that you only really know those things that you’re exposed to. As this knowledge builds up, it becomes increasingly resistant to new information that might confound it. Researchers wonder whether this explains why lies and misinformation become deeply rooted on political and religious matters.
I was fascinated by findings that attempts to correct misinformation can increase the effect of the false belief. The suggestion is that when people cling to existing beliefs, they find new ways to explain away evidence to the contrary.
Lewandowsky said this “has fairly alarming implications in a democracy because people may base decisions on information that, at some level, they know to be false.”
The research confirms PR101 for messages: providing people with a narrative that fills the gap left by false information; emphasizing a few facts; keeping information simple and brief; and repetition.
In a Psychology Today article, Douglas LaBier, Ph.D., argues that these don’t go far enough, because the hidden factors are “emotional and attitudinal”.
It’s the person’s internal drivers: the fears, needs and prejudices that are largely unconscious, and very resistant to information that challenges or conflicts with them in ways that are threatening.
So it’s not just a matter of cognitive factors that make one receptive to lies or resistant to acknowledging the truth. It’s one’s entire psychology. That is, many emotional needs or conflicts may fuel one’s cognitive, conscious beliefs and attitudes. And the latter may only tighten and become more deeply entrenched when challenged. It’s much harder to address those. What may penetrate is empathic, supportive communication that recognizes deeply held positions. This may be a bridge to messages that enable one to examine one’s own beliefs and become more receptive to the truth.
This is pretty cool thinking – being less combative and more understanding of people’s fears and hopes is a better way of delivering the facts.
All of this still disturbingly assumes a lack of rationality in the human brain. I think it’s better to think of the brain as having a kind of hyper-rationality: it’s so good at finding patterns and drawing conclusions that it makes leaps too far.
I think our brains are looking for truths – for accuracy. That’s why I insist that political messages are coded not just for narrative, but with an indisputable, relevant and meaningful fact. On matters where there can be accuracy, rather than judgement, people are hard-wired to understand and adopt the information.
Note: In my view, the scientists referred to in this article make a fundamental error in assessing what a ‘lie’ or ‘untruth’ actually is. Many of their examples seem to me to be contestable issues, the veracity of which depends on your views about life. Which is why LaBier’s suggestion about the emotional driver is so valid.