Over the past year, the European Union has held public consultations on the Digital Services Act package, a new set of regulations that aims to reform the governance of the internet and its digital services. With the rapid growth of Big Tech companies such as Google, Amazon, Facebook, Apple, and Microsoft, the EU felt the need to close the regulatory gap over the liability of digital services; the previous framework dates back to the early 2000s. This change in legislation was long overdue, not only because of the inadequacy of the existing legal framework, but also because of the public pressure that followed recent scandals around digital platforms and the rise in extremist and harmful online behaviour.
Earlier this year, on February 16th, the case for regulating Big Tech was also made by Facebook’s founder and CEO, Mark Zuckerberg. In an op-ed for the Financial Times, he argued that private companies should not be left alone to make decisions over “fundamental democratic values”. Alongside the op-ed, Facebook submitted a white paper on online content regulation to the EU, suggesting a way forward in which online platforms would be regulated on a global scale and would not be liable for their content. The EU firmly rejected Facebook’s proposal and now intends to “arm itself with new powers to take on big technology companies”.
To understand the European Union’s role on this matter, it is worth examining the Digital Services Act itself, as well as the question of whether self-regulation by Big Tech companies is possible or advisable.
The Digital Services Act and its consultation cover two main topics. On the one hand, they focus on the review of the e-Commerce Directive of 2000. This pillar of the regulation aims at ensuring “a modern system of cooperation for the supervision of platforms and guaranteeing effective enforcement”. To do so, it will set rules that clearly frame the responsibilities of platforms in addressing the risks faced by users. On the other hand, the DSA also tackles “ex ante rules covering large online platforms acting as gatekeepers”. This aspect of the new regulations aims in particular at ensuring that Big Tech companies do not operate as monopolies on the European Single Market, and at allowing smaller competitors to challenge them.
As an article by the European Digital SME Alliance explains, these two pillars of the DSA are closely related for two reasons: “(1) social media platforms strongly rely on advertisement revenues for their business models. Thus, it is essential for them to generate and drive traffic; (2) due to network effects and a closed proprietary environment, digital markets seem to concentrate more easily, leading to the dominance of certain platforms which can sometimes act as gatekeepers.”
This concern was also raised by the Committee to Protect Journalists. In an article by Tom Gibson, the Committee stated the following:
“For many media organisations, the DSA is an opportunity to rebalance an online landscape dominated by Big Tech, the companies behind major social media platforms and search engines that enable access to news but absorb advertising revenue and audience data that would otherwise help sustain independent journalism.”
Yet, could Big Tech companies regulate themselves?
In their view, the regulation of their digital services should not be in the hands of national regulators. As Facebook suggested through Zuckerberg’s op-ed and its white paper, the role of regulators would be to set requirements for private companies, without making them legally liable.
To monitor and moderate its content, Facebook has also created an oversight board composed of 20 experts. While this solution is a step forward in supervising and regulating Facebook’s posts, the board’s limited powers do not address most of the issues linked to the platform’s core functions.
As the Cambridge Analytica revelations showed, the spread of fake news on Facebook is inherent to its targeted advertising services. For this reason, the journalist Maria Ressa and the researcher Taylor Owen argued that “it's not just that these platforms enable the free speech of autocrats, but that there's actually a liberal and autocratic tendencies embedded in the design of these companies themselves.” Because of their business model, Big Tech companies have become behavioural modification systems and, as such, cannot guarantee impartial moderation of their content without radical change.
The prevailing feeling about Big Tech companies, however, seems to be that they are “too big to care” about these issues and that regulators and users will have no choice but to adapt to their models. As Matt Stoller argues in the podcast “Taking on Tech Goliaths”, this is not necessarily true. According to Stoller, Big Tech companies are merely the latest iteration of the kind of monopolies that have long characterised the American economy; since such monopolies have been regulated before, they can be regulated again.
Consequently, the EU finally has the opportunity to regulate digital platforms and set a new international standard, as it did with the GDPR. Its challenge, as one EU official warned, will be to “strike the right balance”.