FEATURE

Inoculating against fake news?

Benjamin Franklin is said to have coined the phrase that an ounce of prevention is worth a pound of cure. This applies to many things, even combating ‘fake news’ and other forms of misinformation. By Stephan Lewandowsky, Sander van der Linden and John Cook

MISINFORMATION STICKS. Erasing ‘fake news’ from your memory is as difficult as getting jam off your fingers after a Devonshire tea. Once you hammer into people that there are Weapons of Mass Destruction (WMDs) in Iraq, it doesn’t matter that none were found after the country was thoroughly scoured by the invading forces. The constant drumbeat of ‘WMD, WMD, WMD’ in the lead-up to the invasion, followed by innumerable media reports of ‘preliminary tests’ that tested positive for chemical weapons during the early stages of the conflict – but were ultimately never confirmed by more thorough follow-up tests – created a powerful impression that those weapons had been discovered. An impression so powerful that four years after the absence of WMDs became the official US position, 60% of Republicans and 20% of Democrats believed either that the US had found WMDs or that Iraq had them but had hidden the weapons so well that they escaped detection.

Misinformation can stick even when people acknowledge a correction and know that a piece of information is false. In a study conducted during the initial stages of the invasion of Iraq, we and our colleagues presented participants with specific war-related items from the news media, some of which had subsequently been corrected, and asked for ratings of belief as well as memory for the original information and its correction. We found that US participants who were certain that the information had been retracted continued to believe it to be true. This ‘I know it’s false but I think it’s true’ behaviour is the signature of the stickiness of misinformation.

Misinformation sticks even in situations in which people have no ideological or motivational incentive to cling to their erroneous beliefs. In the laboratory, the original misinformation shines through in people’s responses to inference questions when they are presented with entirely fictional but plausible scripts about various events. For example, people will act as though a fictitious warehouse fire was due to negligence even if, later in the script, they are told that the evidence pointing to negligence turned out to be false.

Is there any way to unstick information? There is broad agreement in the literature that combating misinformation requires the correction to be accompanied by a causal alternative. Telling people that negligence was not a factor in a warehouse fire is insufficient – but telling them that arson was to blame instead will successfully prevent any future reliance on the negligence idea.

Another way to combat misinformation is to prevent it from sticking in the first place. An ounce of inoculation turns out to be worth a pound of corrections and causal alternatives. If people are made aware that they might be misled before the misinformation is presented, there is evidence that they become resilient to it. This process is variously known as ‘inoculation’ or ‘prebunking’, and it comes in a number of different forms. At the most general level, an up-front warning may be sufficient to reduce – but not eliminate – subsequent reliance on misinformation. In one of our studies, led by Ullrich Ecker, we found that telling participants at the outset that ‘the media sometimes does not check facts before publishing information that turns out to be inaccurate’ reduced reliance modestly (but significantly) in comparison to a retraction-only condition. A more specific warning explaining that ‘research has shown that people continue to rely on outdated information even when it has been retracted or corrected’, by contrast, reduced subsequent reliance on misinformation to the same level as was observed with a causal alternative.

A more involved variant of inoculation not only provides an explicit warning of the impending threat of misinformation, but additionally refutes an anticipated argument and exposes the imminent fallacy. In the same way that a vaccination stimulates the body into generating antibodies by imitating an infection, which can then fight the real disease when an actual infection
SOCIETY NOW WINTER 2018