Beyond bad information: exploring the causes and solutions to countering vaccine misinformation


The consequences of vaccine misinformation have been painfully felt in the last two years. Alongside the tragic and avoidable deaths of those who did not get the vaccine, relationships have suffered and trust across society has been damaged.

Through a series of debates, discussions and talks this conference explored the relationship between social media, misinformation and vaccine hesitancy in the context of the Covid-19 pandemic, and what can be done to support a healthier information environment online.

Over the two afternoons, Demos and the University of Warwick convened policy and technical experts with academics working in the areas of epistemology and public health for conversations informed by both applied and philosophical perspectives on vaccine hesitancy. Having moved past the most acute phase of the pandemic, we aimed for the conference to offer space to reflect on and share research into drivers of vaccine misinformation. With an eye on what interventions could be made now and in the future as Covid becomes endemic and a winter spike looms, the conference also focused on exploring solutions aiming to counter misinformation about vaccines.

Sessions and speakers included:

  • Myths of Misinformation: Is ‘false information’ the right target in the fight for better health information?  Speakers: Ellen Judson (CASM), Luke Tryl (More in Common)
  • The Trouble with Testimony: The role of lived experience in vaccine hesitancy  Speakers: Alex Wakefield (Royal Society), Oliver Marsh (Demos Fellow)
  • Information Systems: How does the online environment support vaccine misinformation?  Speakers: Carl Miller (CASM), Dr. Nahema Marchal (University of Zurich), Henry Tuck (Institute for Strategic Dialogue), Josh Smith (Demos)
  • Content Moderation: What can platforms actually do about vaccine misinformation?  Speakers: Iain Bundred (YouTube), Niamh McDade (Twitter), John Samples (Facebook Oversight Board)
  • Beyond Content Moderation: Alternative methods for tackling vaccine hesitancy  Speakers: Grace Rahman (FullFact), Jon Roozenbeek (University of Cambridge), Natalie-Anne Hall (Loughborough University)
  • After the Pandemic: How do we rebuild trust?  Speakers: Amil Khan (Valent Projects), Polly Curtis (Demos)

The conference was held in partnership between CASM and the AHRC-DFG project ‘Moral obligation, epistemology and public health: the case of vaccine hesitancy’.


  • Vaccine misinformation is often framed as the problem of too much false information and not enough true information. But this overlooks how frequently it is the ambiguity of information that is exploited.

Ellen Judson and Luke Tryl emphasised how actors can use true statements in malicious ways to create fear or anger – Jon Roozenbeek later used the example of an article about a doctor dying after having the Covid vaccine – or use ideas that can’t be fact-checked, such as calling people ‘sheeple’. Carl Miller and Henry Tuck added a further layer of nuance: conceptual distinctions between misinformation (often understood as false or deceptive information shared without harmful intent) and disinformation (often understood as false or deceptive information shared with harmful intent) rest on whether false information is deliberately spread. However, motivation is frequently impossible to establish. These conversations made it clear that definitions of vaccine misinformation need to resist simple binaries that over-simplify how and why it spreads.

  • The demand for misinformation goes beyond people seeking out information or sensational content. Online spaces for the vaccine-hesitant and sceptical provide a sense of belonging that they might not find elsewhere; and vilifying them reaffirms the need for that community.

The appetite for misinformation is as much a matter of identity-making as of fact-seeking. People with concerns about the vaccine feel unheard and seek like-minded communities; vilifying those who haven’t yet had the vaccine only reinforces this. Interventions need to prioritise public engagement that leaves people empowered and feeling part of a community. Luke Tryl admitted this is a difficult task, as people are time-poor and efforts can be easily undermined by events like ‘partygate’, where officials were seen to be breaking rules that the public went to great efforts to follow. But taking people’s concerns seriously reveals that vaccine hesitancy is driven by more complex worries than stereotypes about conspiracy theories – such as vaccines being vehicles to insert microchips into bodies – would suggest: concerns about the time it takes to recover or about ‘unnatural’ ingredients are far more prominent.

  • The design and scale of online information systems shape how misinformation is able to spread. 

The scale at which major social media platforms operate exposes people to information so frequently that fringe ideas can converge at a rate not previously seen. Algorithms that recommend online groups or content can quickly build networks and amplify ideas with a potentially unlimited reach. Languages and cultures that platforms consider more marginal and less profitable are moderated less effectively, because automatic detection systems are not trained on less widely spoken languages and human reviewers lack key contextual information. Even languages as widely spoken as Spanish are not moderated as effectively as English, meaning misinformation in Spanish is less likely to be acted on.

  • Vaccine misinformation is not isolated from other forms of misinformation and conspiracy theories.

Vaccine hesitancy has links with other conspiracy theories, especially through far-right posts linking back to niche websites with their own languages, mythologies and journals. This creates what Carl Miller described as “competing epistemic worlds”, which limit the impact of interventions based on increasing the supply of ‘better’ information, because the framework for truth differs across communities. Fact-checking organisations have noted that influencers who started spreading Covid misinformation during the pandemic now use their platforms to spread other conspiracy theories as the most acute phase of the pandemic fades.

  • Eroded trust in institutions and traditional forms of media create the conditions in which the spread of vaccine misinformation can thrive.

The digital age has shifted legitimacy away from top-down processes and towards bottom-up information flows, with Amil Khan describing how individuals increasingly trust YouTubers who may discuss politics alongside content not typically found in traditional media outlets. Polly Curtis described inequalities in information experiences that produce groups known as the ‘un-newsed’, who are unable to access good news sources and as a result become disenfranchised from democracy. These trends of disenfranchisement and waning trust in institutions were already under way before the pandemic, and so were easy to exploit when it arrived.

  • Platforms are working to counter misinformation, but the efficacy of content moderation is limited by scale and business models. 

Representatives of YouTube, Twitter and the Facebook Oversight Board presented the work their organisations had done to counter vaccine misinformation. Common challenges remained around whether it is appropriate for platforms to act as ‘arbiters of truth’, how to increase the likelihood of users encountering good information and public health messages, and how to strike the right balance between human and automated review for the hundreds of millions of posts made each day. John Samples offered his reflections on the particular challenges of the American context, where free speech culture and increasing polarisation mean that fact-checking organisations suffer from limited trust amongst some sections of the public. He suggested the pandemic is best viewed as an emergency in which normal rules were overturned. Now that the emergency phase is ending, it is vital that platforms and civil society look back at the lessons learned and how responses could be improved in the future.

  • Other methods for countering misinformation can help build people’s resilience to it as well as address its spread in aspects of digital lives that cannot be addressed by content moderation, such as on messaging platforms.

Platforms are not the only groups capable of countering misinformation, and their work was complemented by three presentations. Grace Rahman presented the work of FullFact, Meta’s official fact-checking partner. She discussed how ratings are used to communicate to users the likelihood that a post is misleading without it having to be removed, and how repeat offenders can be penalised by, for example, losing the ability to monetise content. Jon Roozenbeek presented his work developing video games and short videos to ‘inoculate’ players against misinformation, letting them see how misinformation is constructed so that they are more resistant when they encounter it in the wild. Natalie-Anne Hall discussed the ‘Everyday Misinformation’ project, which investigated the spread of misinformation on the most popular form of social media – private messaging applications – and how citizens resist it. The research demonstrated the power of conflict avoidance: many people prioritise maintaining good relationships with family and friends over challenging misinformation.

  • Grappling with vaccine misinformation means focusing on the systems that amplify it, maintaining empathy and building relationships with those who believe it, and using a combination of methods to strengthen citizens’ resistance to it. 

Throughout the conference there was a tension between the effectiveness of small-scale interventions – media literacy projects, conversations between friends and communities, individual influencers drawing on their relationship with their audience – and the necessity of operating at the huge scale of interactions on platforms. A major lesson of the last few years is the value of navigating this problem through a multiplicity of approaches to vaccine misinformation. An equally vital message from across the panels was that vaccine misinformation should not be treated as a purely technical problem: misinformation spreads easily when trust and relationships between individuals and wider society are damaged.