Emotionally charged language, personal attacks, false comparisons, and deliberate incoherence are just some of the misinformation techniques we encounter worldwide, including in Malta’s very own divisive political discourse.
Yet a new study reveals that one of the most promising responses to misinformation may also be one of the simplest. In a paper published last week in the journal Science Advances, a team of psychologists from the universities of Cambridge and Bristol found that briefly exposing social media users to the tricks behind misinformation boosts their awareness of dangerous online falsehoods.
Using an approach called “pre-bunking”, the researchers created a series of short videos that focused on specific misinformation techniques such as false dichotomies or scapegoating.
For the Maltese context, the Labour Party’s “you’re either with us, or you are traitors” rhetoric is a good example of a false dichotomy. Blaming activists and the media for the Nationalist Party’s electoral defeat or blaming foreigners for a perceived rise in crime are good examples of scapegoating.
The video below is one of the five animations created by the researchers to explain the manipulation technique of using emotionally charged language.
Promising results
The research team ran a field study on YouTube, the Google-owned platform that is the world's second-most visited website, to test the effectiveness of the videos on US YouTube users over 18 years old who had previously watched political content on the platform.
Two of the videos were placed in YouTube's advertisement slots and shown to around one million users. The researchers then asked those who saw a pre-bunking video to answer a multiple-choice question.
The question assessed their ability to identify a manipulation technique in a news headline. The study also included a control group, which answered the same survey question but hadn’t watched the pre-bunking video.
The researchers found that the pre-bunking group was 5-10% better than the control group at correctly identifying misinformation, showing that this approach builds resilience even in an environment as busy and distracting as YouTube.
Enormous potential
The pre-bunking approach used in the videos builds on years of research based on the idea known as ‘inoculation’ theory.
Broadly speaking, inoculation theory treats the spread of misinformation the way we would treat a virus. Just as exposure to a weakened pathogen triggers the production of antibodies, exposing people to a weakened persuasive argument builds their resistance against future manipulation, potentially curbing the spread.
The pre-bunking method has several advantages compared to other techniques used to counter falsehoods online, such as fact-checking or content moderation.
Current research suggests that while journalistic fact-checking can effectively counter mis- or disinformation, it is time- and labour-intensive, and certain types of misinformation tend to stick even after being repeatedly fact-checked, as The Shift has found.
Content moderation by social media companies is often inconsistent, and platforms like Facebook and Twitter have also been criticised for not doing enough and, in some cases, for actively promoting content that favours engagement over the safety of their users.
The pre-bunking videos used by the university researchers, on the other hand, don't target any specific claims and make no assertions about what is true or false. Instead, they simply show the viewer how false claims work in general by focusing on the characteristics of misinformation, helping viewers spot false claims across a wide variety of topics.
The study found that despite the intense “noise” and distractions on YouTube, the ability to recognise manipulation techniques at the heart of misinformation increased by 5% on average.
The researchers argue that such an increase in recognition could have an enormous impact if dramatically scaled up across social platforms and could help address issues such as the disinformation surrounding the war in Ukraine.