Last week, the world looked on in horror as a mob of Trump supporters stormed the Capitol, forcing the people inside to barricade themselves in their offices for safety as the chambers were overrun by far-right extremists. In response, Trump continued to post on social media, reiterating the lie that the election was stolen from him and voicing support for the mob. And, after four years, Trump’s Twitter account was temporarily, and then permanently, suspended.
This caused an uproar: some people said it didn’t go far enough, and that he should have been suspended years ago; others called it unacceptable censorship; still others said that, regardless of the outcome, these were not decisions private companies should be making. Nor was that the end of the suspensions: violent content proliferating on the ‘no censorship’ alternative app Parler meant it was removed from the Google and Apple app stores and dropped by Amazon Web Services, which hosted it.
This is indicative of platforms’ unprecedented private power to be the ultimate arbiter of when something is in the public interest, what it means to be in the public interest, and when something goes against the public interest. It is a troubling state of affairs when a peaceful transition of power could be under threat depending on what is available in the App Store. This power is exercised inconsistently, opaquely, and reactively. Angela Merkel’s spokesperson outlined this tension, saying that despite the clear need for action on harmful content, the chancellor saw Trump’s Twitter ban as ‘problematic’. While freedom of expression can legitimately be interfered with, this should happen ‘according to the law and within the framework defined by legislators — not according to a decision by the management of social media platforms’.
We’re in a paradox. Without regulation, platforms make these calls themselves: but the most powerful people, those best able to weaponise their online presence, are also able to threaten retribution against platforms for action taken. In such a situation, these people are in a position to seriously influence, if not determine entirely, the shape of any domestic regulation that could require consistent platform action. Indeed, when platforms acted to remove far-right extremist content in the US, they were accused of censorship by the leader who counts those users as his base, and threatened with the removal of their legal protections. Power hovers uneasily between frequently unaccountable corporations and a frequently unaccountable state leader.
When threats to democratic integrity are external – such as those coordinated by inauthentic foreign actors – platforms can be straightforwardly called on to act against them. But relying on national regulation regimes to clamp down on the often much more serious threats posed by a blur of domestic state and non-state actors is a riskier strategy. The mob at the Capitol was incited by the state leader and coordinated by non-state actors acting in his support. This is a pattern repeated across the world, with state-directed or state-coordinated action blending with supportive citizens joining the fray.
We need regulation. But we can’t assume that any and all regulation is good regulation. When we talk about regulation, we need to think beyond expectations set by a patchwork of domestic actors with varying commitment to the rule of law. We need to think about how to promote and work towards international standards, with multilateral buy-in and oversight. And when building national regulations, we need standards that are grounded in rights and based on evidence, independently enforced and overseen, and not easily susceptible to the whim of a new leader.