Did you hear about the couple who decided to name their daughter Brexit? Or the fact that the regions voting leave also happened to be the areas afflicted by mad cow disease? How about the statement that smartphone radiation is causing brain damage and widespread insanity?
All these claims are false. You might think that you’d have to be stupid to believe this kind of stuff, but this is a serious misunderstanding of the way the brain works. Psychological research shows that misinformation is cleverly designed to bypass careful analytical reasoning, meaning that it can easily slip under the radar of even the most intelligent and educated people. No one is completely immune. Indeed, there is now evidence that smarter people may sometimes be even more vulnerable to certain ideas, since their greater brainpower simply allows them to rationalise their (incorrect) beliefs. Fortunately, the research also offers us some strategies to overcome those biases.
Let’s begin by examining why some false claims stick. Various studies have demonstrated that many of us rarely give our full attention when reading new statements. Consider the following question, for instance: “How many animals did Moses take on to the Ark?” Norbert Schwarz at the University of Southern California has found that only around 12% of students answer correctly (none). (It was, of course, Noah’s Ark – not Moses’s.)
Particularly when a statement feels “fluent” (easy to process) and familiar, we tend not to focus on the details and instead go with the gist. Unfortunately, there are many simple ways that purveyors of misinformation can tweak the presentation of their claims to increase a statement’s fluency and familiarity.
One example is the use of imagery – photographs help us to visualise statements, which means they can be processed fluently – and therefore seem truer. We can see this with medical stories: people are more likely to believe a pseudoscientific claim if it has a brain scan alongside it.
Perhaps the most potent way of spreading misinformation is simple repetition; the more you hear an idea, the more likely you are to believe it to be true. That’s a serious problem when members of a small but vocal community – climate change deniers, say – are presented as talking heads on TV and radio.
In these ways, we can begin to see how misinformation can be engineered to bypass logical thinking and critical questioning. But do intelligence and education protect us against false claims? The latest research shows it partly depends on your thinking style. Some people are “cognitive misers”, for instance: they may have a lot of brainpower that allows them to perform well in exams, but they don’t always apply it, using intuition and gut instinct rather than reflective, analytical thinking. This thinking style is commonly measured with a tool known as the “cognitive reflection test”, using questions such as: “If it takes five machines five minutes to make five widgets, how long would it take 100 machines to make 100 widgets?” The correct answer is five minutes, but many otherwise intelligent people say 100 – the more intuitive response.
Studies from the US have revealed that people who score badly on these kinds of questions tend to be more susceptible to fake news, conspiracy theories and paranormal thinking. Those who score better, in contrast, tend to be less gullible, because they use their intelligence to analyse claims rather than relying on their gut feelings.
Not all fake news is created equal, though. Some stories may be faintly ludicrous, such as the family who named their daughter Brexit – whatever your politics, your worldview doesn’t hinge on believing it. But other stories may fit with your political identity far more tightly. And for these particularly emotive claims, intelligence and education may actually make you more susceptible to fake news, through a process called “motivated reasoning”. Consider the “birther” theory that Barack Obama was not born in the US. This has been debunked time and time again, but it became highly ingrained in many people’s political ideology. And greater brainpower did not prevent them from believing the story; indeed, it actually increased their credulity. A study by Ashley Jardina at Duke University in North Carolina, for instance, surveyed the views of the more conservative white Republicans – the kind of people who might have found the former president most alienating. It found that beliefs in the birther theory were strongest among the participants with the greatest political knowledge.
A similar pattern could be seen with the beliefs that Obama was a Muslim, and the claims that his healthcare reforms would lead to “death panels” that decided who lived or died. To make matters worse, more educated participants also seemed less likely to update their beliefs after the claims had been debunked; instead, they became even more certain they were right. Somehow, their greater knowledge simply allowed them to dismiss the new information and harden their attitudes.
So far, the studies of motivated reasoning have centred on the US. But given these findings, I would be willing to bet that the same processes are shaping British responses to Brexit. For any issue that strikes at the core of who we are, greater brainpower may simply serve to preserve that identity at the expense of the truth.
This new understanding of misinformation should change the way we go about debunking falsehoods. In the past, the assumption was that you could present people with the facts and they would eventually sink in. Instead, some experts studying misinformation now favour a form of “inoculation”. One of the most compelling demonstrations comes from John Cook at George Mason University and Stephan Lewandowsky at the University of Bristol. Their aim was to find a way to protect people from common misinformation about climate change – including the fake petitions purporting to show widespread disagreement among scientists about the true causes of global warming.
Rather than tackling the claims head on, Cook and Lewandowsky first showed the participants a report on the tobacco industry’s previous attempts to spread misinformation, which also included the use of fake experts to cast doubt on the scientific research that linked smoking to lung cancer. The strategy worked a treat. Having read about the tobacco industry’s tactics, the participants were more sceptical of the climate change petitions. Crucially, this was true even of the more rightwing participants, who would have been naturally more inclined towards climate denialism.
Given the sheer prevalence of misinformation around us, I believe that ways of identifying misinformation, combined with critical thinking, should now be taught in every school. After all, it’s not just the fake political news that we need to avoid, but health scams and financial fraud. A firmer grounding in sceptical reasoning could help everyone – whatever their IQ – to use their intelligence to make wiser judgments.
• David Robson’s The Intelligence Trap: Why Smart People Do Stupid Things and How to Make Wiser Decisions is published by Hodder & Stoughton.
guardian.co.uk © Guardian News & Media Limited 2010