Everyone is vulnerable to misinformation. We’ve all, at some point, believed something untrue. And it can be surprisingly hard to shake that belief.
Elitsa Dermendzhiyska explored why for Aeon. Part of the problem is “the continued influence effect”, which helps explain why people keep believing a falsehood even after it’s been debunked. The misinformation lingers and shapes our decisions long into the future.
The issue, Dermendzhiyska explains, is that we tend to take other people at face value. We act in good faith and assume others will do the same. Then there’s the fact that life is hard: there’s too much going on to interrogate everything.
That helps explain why we can believe something that isn’t true in the first place. But the difficulty in debunking that mistruth goes beyond that:
One of the most common explanations for the continued influence effect puts it down to a gap in our mental model, or the story we tell ourselves about what happened. If the myth fits the ‘logic’ of events, its retraction leaves a hole, and the cogs of the story no longer click into place… If we aren’t to lose coherence, it makes sense to hold on to both the actual fact and the fitting falsehood – but keep them separate, compartmentalised, so that they don’t clash. This might be why, as studies show, we could be well aware of the truth, yet still allow the myth to creep in elsewhere and corrupt tangential judgments.
Our minds are impressive things, but they often latch onto simplicity. We love stories; stories are how we understand the world. So it can be hard to give up a story that, on the surface, makes sense to you. (It’s also why a list of facts can’t compete with a well-told mistruth.)
This idea – that people will reject a correction or a retraction of a falsehood if it leaves a hole in their understanding – becomes even more complex when you look at the way information is shared online.
Context collapse strikes again
A post on Facebook or a message in a WhatsApp group can totally decontextualise a story, emphasising the “hole” in a story’s logic left by a correction. A headline announcing that something was wrong just tells you that it’s wrong – it doesn’t explain why and, really, a whole lot of people won’t bother finding out.
Information is shared in dribs and drabs and highlights and grabs. That doesn’t fill the context gap, so someone who believes a mistruth is left trying to reconcile a story that no longer makes sense to them. And that’s if they’re prepared to change their mind at all. If not, they can just say “pfft” and move on.
Combine this with the other explanations for misinformation explored by Dermendzhiyska, chief among them repetition (wherein a mistruth is repeated so often it feels true because it’s familiar), and you have a pernicious problem. And that’s without people who actively want you to believe something that isn’t true.
Get a bell and start yelling “shame”
It’s a lot. But trying to understand the psychology behind the problem can help. And there are alternatives.
Here’s Dermendzhiyska again:
Perhaps we ought to worry less about fixing people’s false beliefs and focus more on shifting those social norms that make it OK to create, spread, share and tolerate misinformation.
Social norms are a powerful thing. They shape our lives on a fundamental level. That framing also highlights our role, as individuals, in fixing this problem. Dermendzhiyska quotes biologist D’Arcy Wentworth Thompson in her piece and it’s apt:
Everything is what it is because it got that way.
That means we can change it too.
Sign up to the Kites can't fly newsletter to get a weekly summary of everything on the site (plus some other cool stuff) in your inbox.
I mean, it’s not like you're going to remember to come back here on your own. URLs are hard.