By Maeve Walsh, Associate at Carnegie UK Trust
As we wait for the Government’s final response to the consultation on its Online Harms White Paper – a wait that has lasted 18 months since the publication of the White Paper itself in April 2019, and over three years since the Green Paper that preceded it – it’s clear from the recent Demos/Polis research that a better-quality public debate on these issues is needed, particularly one that engages marginalised groups such as the digitally excluded.
Wouldn’t it have been a wonderful thing for the Government to have used some of this time to ask the public what they really think about what regulation of online spaces might mean for them and how it might work? Not just the traditional “send in your views by email” consultations on dry policy documents, but a national programme of open consultation events with wide cross-sections of the population (and opportunities for those groups to hear from each other)? Targeted listening exercises with marginalised groups who frequently experience the worst of the harm online? Deliberative forums or citizens’ juries? All of this would have helped prepare the ground for the proposals that follow. It may – whisper it quietly – have made the proposals better.
The Polis research helps fill in some of the blanks, however, and raises questions for policymakers and Ministers to consider as they put the finishing touches to the final White Paper response and, in the New Year, to the Online Harms Bill itself. Parliamentarians on all sides of the House are increasingly raising similar questions, and civil society groups – like Carnegie UK Trust – have sought to help answer them in recent months and years. For example, policymakers will need to understand the balance between different rights and protections online; Professor Lorna Woods has written comprehensively on how a systemic duty of care for online harm reduction (as proposed by Carnegie UK) would operate to protect users online while also protecting fundamental freedoms, including freedom of speech. In our view, the tension between different rights can be resolved by looking at the impact of design features across the entire communication cycle, which broadens the options in the toolbox: it is not just a binary choice about whether content should be taken down or stay up.
How this thinking then gets translated into a policy document, a Ministerial briefing pack and – most importantly – public-facing government comms will be critical. It will affect the next stage of the debate in Parliament, the noise that campaign groups make outside it and, ultimately, the design and effectiveness of the regulations that come out the other side. A national conversation over the last three years that was open and deliberative, and that asked questions about, for example, what the right to freedom of speech online means to individuals and society, would have provided rich insight. It would have helped prepare the ground for informed Parliamentary debate with an understanding of public acceptability for regulation that no amount of behind-closed-doors policy advice can match. Civil society groups that work closely with minoritised or vulnerable groups to understand the impact of online harms – such as Glitch, Antisemitism Policy Trust, Hope Not Hate, Catch-22, 5 Rights – have written powerfully on their findings. But that is no substitute for policymakers and politicians listening to such concerns – and the ideas for solutions – directly.
With every passing week, the scale of harms online – from child abuse, to hate crime, to scams – increases, and the impact on individual members of the public, particularly those from vulnerable and minoritised groups, becomes more widespread. The rising levels of anti-vaxx misinformation and disinformation are a looming threat to the Covid-19 vaccination programme – and society at large. Regulation is long overdue. Let’s hope the last three years of waiting don’t prove to have been a missed opportunity to let the public help get it right.