Everything in Moderation: Platforms, communities and users in a healthy online environment


It is barely an oversimplification to characterise current debate on Internet regulation as a fight over the things people see, and the things they don’t. The systems of curation and moderation that dictate what is and isn’t permitted are the machinery most responsible for the health of online spaces. It follows that the ways they work are the subject of intense government scrutiny.

This paper argues that the principles and practices underpinning most major platforms have failed to create healthy online spaces, and that current attempts by states to regulate these spaces will in all likelihood fall short of addressing the root causes of this failure.

We identify three failures in current approaches to content moderation:

  • There is a democratic deficit, in both principle and practice, in the way the majority of online platforms are moderated.
  • The architecture of the majority of online platforms undermines, in both principle and practice, the ability of communities to moderate themselves.
  • The majority of online platforms lack the cultures and norms that in the offline world act as a bulwark against supposed harms.

Improving these spaces therefore requires platforms to answer three challenges:

The first challenge is moving from authoritarianism towards democracy. Insofar as platforms play a quasi-public role, their processes should be subject to public scrutiny.

The second challenge is to turn digital subjects into digital citizens. Limiting the powers of platforms through the fundamental rights of their users – digital constitutionalism – lays a foundation; on that foundation, platforms must provide the tools, structures and incentives for users to participate actively in shaping society, digital or otherwise.

Finally, we require the development of cultures conducive to minimising online harm. This is challenging, as the Internet hosts communities with an immeasurable range of perspectives, values and norms. The paper proposes two answers. First, certain values are better than others: respect, understanding and equality, along with the fundamental human rights put forward by the United Nations, should be promoted. Second, the infrastructure on which online communities and cultures are built should help empower and inform those communities.

In the paper, we present solutions, in principle and in practice, to the challenges posed by content moderation systems, and put forward a number of recommendations.
