The public conversation around the source of Covid-19 has been laced with worrying undertones. Almost as soon as the global scale of the pandemic became apparent, conspiracy theories linking the virus to 5G lit up the internet. Phone masts were burned, and just this weekend a protest of around 100 people was held in Hyde Park, resulting in numerous arrests. While the pandemic lends them a new and heightened context, conspiracy theories themselves are nothing new, and have long posed a challenge to society. Yet social media allows them to spread faster and further than ever before, creating a tidal wave of misinformation that is increasingly hard to mitigate. Last year, we analysed a group of conspiracy-oriented Twitter accounts in Spain, Germany and Poland, highlighting the alignment between conspiracy circles and populism.
Read Suspicious Minds: Conspiracy Theories in the Age of Populism here, and the introduction below.
This short policy brief provides a data-driven introduction to some of the ways in which conspiracy theories are spread through online networks, and to the places and channels through which this is done. By comparing the networks and links shared in conspiracy-minded public conversations in three countries—Germany, Poland and Spain—it aims to stimulate a debate: how does the web contribute to the rise and spread of conspiracy theories, and how worried should we be?
The last decade has seen a fundamental shift in how we learn about the world around us and come to hold opinions. The digital revolution has democratised content production. Recent research by the British think tank Demos has demonstrated that social media platforms are now the primary source of news for young British adults, a pattern likely to be replicated across Europe. (1) Among online media sources, the proliferation of competing voices is accelerating. In a digital marketplace that prioritises engagement above education, traditional standards of evidence and accuracy are being eroded.
Digital platforms have enabled conspiracy theorists to reach hundreds of thousands of users. Terrorist and extremist organisations, who weave conspiracy thinking into their propaganda, have been highly successful in using digital channels to spread their messages. Quick to adapt to these channels, political groups not traditionally afforded a platform have found their voice online, and they have frequently used it to attack mainstream media sources, with which they compete for both audiences and revenue. Critics have accused governing and opposition parties in Europe and abroad of peddling conspiracy theories or encouraging conspiracy thinking. The resulting decline in trust is generally believed to have contributed to the rise of ‘misinformation’ or ‘fake news’, which is helping to shape the current political climate. (2)
In this environment, a range of groups have flourished whose politics is built on conspiracy thinking. Conspiracy theories are central to a wide variety of political forces that include neo-Nazis in Poland and Germany, anti-capitalist ‘black blocs’ and Islamic fundamentalists. Alongside these extreme examples, conspiracy themes are increasingly visible in mainstream political life: Donald Trump’s ‘deep state’; Bernie Sanders’ ‘one percent’; and in the UK a gamut of conspiracy theories around multiculturalism, immigration and the EU.
The role of the web in spreading conspiracy thinking is poorly understood. On the one hand, academics have pointed out that, even in the digital age, a broadly stable proportion of the population believe in specific conspiracy theories. On the other hand, the web has had a profound effect on the way these theories are transmitted. News that fails to conform to long-standing journalistic standards is proliferating. The news cycle has accelerated. The Internet has provided a stable and far-reaching public platform for conspiracy theories. Moreover, it has facilitated the international networking of conspiracy thinkers and the building of vast ‘ecosystems of falsehood’, with conspiracy theories now promulgated through thousands of videos, photos, essays, wikis and more.
Shedding light on all this is crucial, and doing so requires new data sources and methods of analysis. This policy brief attempts to do just that. Using Twitter as a case study, it provides a narrow window into the ways in which conspiracy theories are propagated in three countries.
(1) C. Miller, The Rise of Digital Politics, Demos (London, 2016); R. Fletcher et al., Reuters Institute Digital News Report 2018, Reuters Institute for the Study of Journalism (Oxford, 2018).
(2) The definitions of ‘misinformation’ and ‘fake news’ have become blurred over the past two years. ‘Fake news’ originally referred to the for-profit circulation of fictitious stories optimised for click-through to generate advertising revenue. However, its repeated invocation by President Donald Trump as a term of abuse for the wider mainstream media has corrupted this narrow definition. ‘Misinformation’ refers to the broader pollution of the information environment by actors and states. The incentives for producing misinformation are not restricted to financial gain through advertising. Rather, the objectives are often social or political in nature. For an overview of the problem and policy recommendations, see K. Niklewicz, Weeding Out Fake News, Wilfried Martens Centre for European Studies (Brussels, 2018).