DIGITAL CONSTITUTIONALISM

Social Media Councils

A decentralized solution fit for Digital Constitutionalism.

By Sonia Sangiovanni

December 19, 2020

With the 2016 US presidential election, it became clear that social media platforms allowed fake news to spread at unprecedented speed. Facebook, in particular, proved to be the main source of exposure to misleading information about the election. Four years later, however, tech companies have taken a new stance on the proliferation of fake news. From redirecting users to official guidelines on COVID-19 to flagging and blocking unverified stories, Facebook, Instagram and Twitter have made efforts to tackle this problem.


Case in point: on October 14, 2020, an unverified New York Post story incriminating Hunter Biden through allegedly leaked emails was moderated by Facebook and Twitter in an attempt to limit its traffic. While restricting the article was only one of many efforts to counter fake news, both companies were accused by conservative voices of “liberal censorship”.


However, as Kyle Langvardt, a law professor at the University of Detroit Mercy, explains in his academic article, this tension lies at the core of the moderators’ dilemma: “whether it is condemned as ‘censorship’ or accepted as ‘content moderation,’ [this system] sits in tension with an American free speech tradition that was founded on hostility toward ex ante administrative licensing schemes”.


The debate on whether online content moderation constitutes censorship of free speech is not limited to the ‘editorial powers’ of tech companies. It equally fits within the broader discussion about digital governance and the need for online rules to address the social threats that emanate from the online world. As Edoardo Celeste, a researcher on Digital Constitutionalism, explains in his publications, the new legal challenges created by digital technology have led to the “emergence of normative counteractions”. Through the lens of Digital Constitutionalism, the author further explains that the need for new norms in the digital environment stems from the necessity to re-establish a “constitutional equilibrium”. While the role of guaranteeing such equilibrium used to belong to state actors in the physical world, in the digital one, “private actors emerge beside nation-states as new dominant actors, thus potential guarantors and infringers of fundamental rights”.


While the participation of private companies in defining the new rules of the digital realm seems inevitable, the question of how social media companies should act as moderators of free speech still stands.


As Michal Lavi, a researcher at the University of Haifa, points out in his research, the role of social media companies as moderators is often a consequence of the liability regime in which they operate. Digital technology can both “empower individuals and promote important social objectives” and “create a setting for speech-related torts, harm, and abuse”. Because of this, tech companies are often held liable for moderating content that could constitute a threat to their users.


The argument over the liability of online platforms is, indeed, a strong one. Yet the way platforms operate as moderators is not always as transparent as it should be. In stark contrast with their role as “democratic fora”, social media companies’ decision-making remains opaque and erratic (Langvardt, 2018). As Spandana Singh explains, “in part, this lack of transparency is due to the black box nature of such tools, which prevents comprehensive insight into algorithmic decision-making processes, even by their creators”. In the context of elections, given this lack of transparency and accountability for potentially harmful decisions, “the moderation of elections by private platform companies can erode, rather than protect, our democracies” (Bradshaw et al., 2020).


Given the importance and critical nature of online discourse, it seems necessary to include other stakeholders within this framework. Drawing on Edoardo Celeste’s theory of Digital Constitutionalism, other entities could participate alongside state actors and platform companies.


An alternative model, presented by Article 19, a human rights organisation that defends freedom of speech, foresees a decentralized network of moderators within the digital realm. As the organisation explains on its website, “due to the high asymmetry of information currently in the market, regulators alone do not possess the necessary information to properly shape the unbundling requirement at the contractual and at the technical layer. Nonetheless, regulators should be the ones to lead this process, and to closely monitor the compliance of market operators”. Under this model, the decentralized network of moderators would include so-called “Social Media Councils”, a set of “open, participatory and accountable bodies made up of various actors working at the national level to safeguard freedom of expression online”.


Besides bringing more actors into the role of moderator, the decentralized model proposed by Article 19 would also constitute a more ethical solution by “unbundling” the different services of social media companies, which currently control both “the hosting of the social media platform and the moderation of content on the platform”. Under this arrangement, tech companies would not entirely lose their ability to moderate content, but they would be obliged “to allow competitors to provide competing content moderation services on their platforms”.
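
To make the idea of “unbundling” hosting from moderation more concrete, here is a minimal, purely illustrative sketch of how a platform might delegate filtering decisions to competing, user-selected moderation services. This is not part of Article 19’s actual proposal; every name and interface below is hypothetical.

```python
from dataclasses import dataclass
from typing import Protocol


@dataclass
class Post:
    author: str
    text: str


class ModerationService(Protocol):
    """Interface a competing moderation provider would implement (hypothetical)."""
    name: str

    def review(self, post: Post) -> bool:
        """Return True if the post may be shown, False to hide it."""
        ...


class StrictModerator:
    """Hypothetical third-party provider with aggressive filtering."""
    name = "strict-civility-co"

    def review(self, post: Post) -> bool:
        banned = {"spam", "scam"}
        return not any(word in post.text.lower() for word in banned)


class PermissiveModerator:
    """Hypothetical provider that blocks almost nothing."""
    name = "open-speech-org"

    def review(self, post: Post) -> bool:
        return "banned-keyword" not in post.text.lower()


class Platform:
    """The platform only hosts content; moderation is 'unbundled' to
    whichever service each user has chosen."""

    def __init__(self) -> None:
        self.posts: list[Post] = []

    def publish(self, post: Post) -> None:
        # Hosting: every post is stored once, unmoderated.
        self.posts.append(post)

    def timeline_for(self, moderator: ModerationService) -> list[Post]:
        # Moderation: the same hosted posts are filtered through the
        # user's chosen competing service.
        return [p for p in self.posts if moderator.review(p)]


if __name__ == "__main__":
    platform = Platform()
    platform.publish(Post("alice", "Check out this spam offer!"))
    platform.publish(Post("bob", "Thoughts on tonight's debate?"))

    for service in (StrictModerator(), PermissiveModerator()):
        shown = [p.text for p in platform.timeline_for(service)]
        print(f"{service.name}: {shown}")
```

The design point is the separation of concerns: the platform stores every post exactly once, while each user’s feed is produced by the moderation provider they picked, which is roughly what a competitive market of content moderators would require at the technical level.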


By potentially creating a competitive market of content moderators through a decentralized network, Article 19 may have found a way to respond to the need for digital constitutionalism around the issues of freedom of speech and censorship. This option, however, will require both state and private actors to rethink how to build a better digital world.


Sonia Sangiovanni is a young professional working at the intersection of technology and public affairs. She holds a master’s degree in International Security from Sciences Po, and her master’s thesis on the privatization of intelligence services in the UK covered the Cambridge Analytica scandal. Throughout her career, Sonia has worked as a consultant for public administrations and the European Commission.
