Short animations giving viewers a taste of the tactics behind misinformation can help to “inoculate” people against harmful content on social media when deployed in YouTube’s advert slot, according to a major online experiment led by the University of Cambridge.
Working with Jigsaw, a unit within Google dedicated to tackling threats to open societies, a team of psychologists from the universities of Cambridge and Bristol created 90-second clips designed to familiarise users with manipulation techniques such as scapegoating and deliberate incoherence.

This “prebunking” strategy pre-emptively exposes people to tropes at the root of malicious propaganda, so they can better identify online falsehoods regardless of subject matter.

Researchers behind the Inoculation Science project compare it to a vaccine: by giving people a “micro-dose” of misinformation in advance, it helps prevent them from falling for it in future – an idea based on what social psychologists call “inoculation theory”.

The findings, published in Science Advances, come from seven experiments involving a total of almost 30,000 participants – including the first “real world field study” of inoculation theory on a social media platform – and show that a single viewing of a film clip increases awareness of misinformation.

The videos introduce concepts from the “misinformation playbook”, illustrated with relatable examples from film and TV such as Family Guy or, in the case of false dichotomies, Star Wars (“Only a Sith deals in absolutes”).

Lead author Dr Jon Roozenbeek from Cambridge’s Social Decision-Making Lab describes the team’s videos as “source agnostic”, avoiding biases people have about where information is from, and how it chimes – or not – with what they already believe.

“Our interventions make no claims about what is true or a fact, which is often disputed. They are effective for anyone who does not appreciate being manipulated,” he said.

“The inoculation effect was consistent across liberals and conservatives. It worked for people with different levels of education, and different personality types.”

Google – YouTube’s parent company – is already harnessing the findings. At the end of August, Jigsaw will roll out a prebunking campaign across several platforms in Poland, Slovakia, and the Czech Republic to get ahead of emerging disinformation relating to Ukrainian refugees.

The campaign is designed to build resilience to harmful anti-refugee narratives, in partnership with local NGOs, fact checkers, academics, and disinformation experts.

“Harmful misinformation takes many forms, but the manipulative tactics and narratives are often repeated and can therefore be predicted,” said Beth Goldberg, co-author and Head of Research and Development for Google’s Jigsaw unit.

“Teaching people about techniques like ad-hominem attacks that set out to manipulate them can help build resilience to believing and spreading misinformation in the future.

“We’ve shown that video ads as a delivery method of prebunking messages can be used to reach millions of people, potentially before harmful narratives take hold,” Goldberg said.

The team argue that prebunking may be more effective at fighting the misinformation deluge than fact-checking each untruth after it spreads – the classic ‘debunk’ – which is impossible to do at scale, and can entrench conspiracy theories by feeling like personal attacks to those who believe them.

“Propaganda, lies and misdirections are nearly always created from the same playbook,” said co-author Prof Stephan Lewandowsky from the University of Bristol. “We developed the videos by analysing the rhetoric of demagogues, who deal in scapegoating and false dichotomies.”

“Fact-checkers can only rebut a fraction of the falsehoods circulating online. We need to teach people to recognise the misinformation playbook, so they understand when they are being misled.”

Six initial controlled experiments featured 6,464 participants, with the sixth experiment conducted a year after the first five to ensure earlier findings could be replicated.

Data collection for each participant was comprehensive, ranging from basic information – gender, age, education, political leanings – to levels of numeracy, conspiratorial thinking, news and social media consumption, “bullshit receptivity”, and a personality inventory, among other variables.

Factoring all this in, the team found that inoculation videos improved people’s ability to spot misinformation, and boosted their confidence in being able to do so again. The clips also improved the quality of “sharing decisions”: whether or not to spread damaging content.
