
Modern vaccines have saved over 150 million lives. Yet misinformation about them can still have deadly consequences. A gunman recently opened fire at the US Centers for Disease Control and Prevention headquarters, wrongly believing that the coronavirus vaccine had caused his depression.

Public health is increasingly being threatened by the spread of dangerous misinformation. In fact, there have been several recent cases of healthy unvaccinated children who died after contracting the highly contagious measles virus, including a child in Liverpool in July. Childhood vaccination rates in the UK are now at their lowest point in over a decade, well below the World Health Organization recommended threshold of 95% for herd immunity.

A key question for scientists and public health practitioners alike is how to design interventions that help reduce people’s susceptibility to health misinformation. To help accomplish this, we designed a free browser game, Bad Vaxx, that simulates social media and lets players step into the shoes of online grifters who peddle vaccine misinformation, using four common manipulation tactics.

In three experimental trials, we found that the game helps people discern significantly better between credible and misleading information about vaccinations, boosts players’ confidence in their judgments, and reduces their willingness to share vaccine-related misinformation with others.

Much research shows that once people have been exposed to misinformation, they often continue to rely on falsehoods even after seeing a debunk or fact-check. Fact-checks matter, but it’s difficult to get people to engage with science and to spread corrective information across an increasingly fractured media landscape.

Inoculating minds

Our new game draws inspiration from a more preemptive approach known as “prebunking”. Prebunking aims to prevent people from encoding misinformation into their brains in the first place.

The most common way to prebunk misinformation is through psychological inoculation – an approach that befittingly parallels the immunisation analogy. Just as the body gains immunity to infection through exposure to severely weakened doses of a viral pathogen (that is, the vaccine), so too can the mind acquire cognitive resistance to misinformation. This happens through exposure to weakened doses of the tricks used to manipulate people online, along with clear examples of how to identify and neutralise them.

One way to build immunity is to immerse people in a social media simulation. This is exactly what happens in our game, Bad Vaxx.

In a controlled setting, players are exposed to weakened doses of the main techniques used to deceive people about vaccination, through humorous and entertaining scenarios in which they interact with four shady characters: Ann McDoctal, who loves to float scary anecdotes about vaccines; Dr Forge, who fakes his expertise and gains traction by pumping out pseudoscience; Ali Natural, who promotes the naturalistic fallacy (“if it’s natural, it must be good”); and the conspiracy theorist Mystic Mac, who doubts all official narratives.

The player can choose between two competing perspectives: one, take on the role of an online manipulator to see how the sausage is made, or two, try to defeat the characters by reducing their influence.

Cognitive inoculation is thought to work, in part, by introducing a sense of threat to elicit motivation to resist propaganda, which both perspectives aim to achieve, albeit through different means. People were randomly assigned to either the “good” or “evil” version of our 15-minute game or a placebo group (who played Tetris).

We “pre-registered” our study, meaning we wrote down our hypothesis and analysis plan before collecting any data, so we couldn’t move the goalposts.

We measured effectiveness by asking people how manipulative they found vaccine misinformation embedded in social media posts, how confident they were in their judgments, and whether they intended to share the posts with their networks.

We based the test on real-world misinformation corresponding to each of the techniques featured in the game, alongside non-manipulative (neutral) counterparts. This allowed us to see whether the game improves people’s ability to discern between misleading and credible content. For example, a conspiratorial post read: “Vaccine database wiped by government to hide uptick in vaccine injuries.”

In all our tests, we found that both versions of the game helped people get much better at spotting fake vaccine information. Players also became more confident in their ability to tell real from fake, and they made better decisions about what to share online. The version where you play as the “good guy” worked slightly better than the version where you play as the “bad guy”.

Boosting discernment without breeding cynicism

We also found that people became significantly better at spotting false and manipulative content without becoming sceptical of credible content that doesn’t use manipulation. In other words, players became more discerning.

Of course, the immunisation analogy should not be over-interpreted: the effects of psychological interventions are generally modest and do wear off. But epidemiological simulations show that, when applied across millions of people, prebunking can help contain the spread of misinformation.

Although vaccination decisions are complex, much research has shown a robust link between exposure to misinformation and reduced vaccination coverage. Needless to say, misinformation about vaccines is not new. In the 1800s, anti-vaxxers falsely claimed that taking the cowpox vaccine against smallpox would turn you into a human-cow hybrid.

What’s different today is that the most influential misinformation is coming from the top, including prominent politicians and influencers, who spread thoroughly debunked claims, such as the myth that the MMR vaccine causes autism.

Empowering the public to identify pseudoscience, misdirection, and manipulation in matters of life and death is therefore crucially important. We hope that our easy-to-play, short prebunking game can be integrated into educational curriculums, used by public health officials, doctors and patients in medical settings, and featured as part of international public health campaigns on social media and beyond. After all, viruses need a susceptible host. If enough people are immunised, misinformation will no longer have a chance to spread.

This article is republished from The Conversation, a nonprofit, independent news organization bringing you facts and trustworthy analysis to help you make sense of our complex world. It was written by: Sander van der Linden, University of Cambridge; Jon Roozenbeek, University of Cambridge, and Ruth Elisabeth Appel, Stanford University

Sander van der Linden has received funding from the UK Cabinet Office, Google, the American Psychological Association, the US Centers for Disease Control, EU Horizon 2020, the Templeton World Charity Foundation, and the Alfred Landecker Foundation. He has lectured and/or consulted for the WHO, UN, Meta, Google, the Global Engagement Center (US State Dept), and UK Defense and national intelligence.

Jon Roozenbeek has received funding from the UK Cabinet Office, the US State Department, the ESRC, Google, the American Psychological Association, the US Centers for Disease Control, EU Horizon 2020, the Templeton World Charity Foundation, and the Alfred Landecker Foundation.

During her time at Stanford University, Ruth Elisabeth Appel has been supported by an SAP Stanford Graduate Fellowship in Science and Engineering, a Stanford Center on Philanthropy and Civil Society PhD Research Fellowship, a Stanford Impact Labs Summer Collaborative Research Fellowship, and a Stanford Impact Labs Postdoctoral Fellowship. She interned at Google in 2020 and attended an event where food was paid for by Meta. After completing her research at Stanford University, which forms the basis for this article, she joined Anthropic to research the economic and societal impacts of AI.