The legislative package contains two linked proposals: a Digital Services Act (DSA), which proposes new content moderation rules and could give media companies new tools in their fight against illegal content online; and a Digital Markets Act (DMA), which proposes ex-ante rules for very large “gatekeeper” platforms and is designed to complement competition law.
This article focuses on the key measures proposed in the DSA, which would introduce a horizontal framework for all categories of content, products, services and activities on intermediary services.
According to the Commission, the proposed Digital Services Act builds on the still “valid” core principles of protection from liability for hosting service providers and the prohibition of general monitoring set out in the e-Commerce Directive (2000/31/EC) some twenty years ago. The proposal aims to define certain responsibilities and accountability measures for providers of digital services, in particular online platforms, such as social media and marketplaces.
Included in the proposal are due-diligence obligations for certain intermediary services, including strengthened notice-and-action procedures for illegal content and the possibility for users to challenge the platforms’ content moderation decisions.
Also included is an obligation for certain online platforms to receive, store and partially verify and publish information on traders using their services. This “know-your-business-customer” (KYBC) approach, which is designed to guard against rogue traders, is familiar from the financial services sector.
Furthermore, online platforms would be required to give users the individualised information they need to understand when and on whose behalf an advertisement is displayed, including whether or not the advertisement is based on profiling.
Very large platforms have been singled out for particular attention. The proposal sets a higher standard of transparency and accountability for the way in which such platforms moderate content, use targeted advertising and employ algorithms. It also obliges them to assess the risks their systems might pose and to develop risk-management tools to protect their services against the use of manipulative technologies.
In addition, the Commission aims to remove disincentives towards voluntary online safety measures undertaken by digital service providers and to clarify their responsibilities in relation to consumers.
Concretely, the envisaged measures include:
- Measures to counter illegal content online (goods and services) e.g. a mechanism for users to flag such content, and for platforms to cooperate with “trusted flaggers”;
- New rules on traceability of business users in online marketplaces, to help identify sellers of illegal goods (the “know-your-business-customer” approach);
- Content moderation measures to enable users to challenge platforms’ decisions;
- Transparency requirements for online platforms as regards the targeting of users with advertisements and algorithms used for recommendations;
- Obligations for very large online platforms to prevent abuse of their systems by taking risk-management measures, including oversight through independent audits;
- Researchers to have access to data of key platforms, in order to scrutinise how platforms work and how online risks evolve;
- An oversight structure in which Member States have the primary role, supported by a new European Board for Digital Services, with enhanced supervision and enforcement by the Commission for very large online platforms.
Platforms that reach more than 10% of the EU’s population (45 million users) are considered systemic in nature, and are subject not only to specific obligations to control their own risks, but also to a new oversight structure. This comprises a board of national Digital Services Coordinators, with special powers to supervise very large platforms, including the ability to sanction them directly. Online platforms deemed to have flouted the new rules risk fines of up to 6 per cent of total global turnover in the preceding year, which could amount to billions of euros for the largest players.
Explaining its reasons for regulatory intervention, the Commission said that existing sector-specific regulations covering, for example, terrorist content, hate speech, child sexual abuse material or copyright infringements have limited scope, leaving regulatory gaps. In particular, the current framework does not provide fully-fledged rules on the procedural obligations related to illegal content, includes only basic rules on transparency and accountability, and provides limited oversight mechanisms.
Nevertheless, the Commission said that the proposal also takes into account voluntary initiatives signed by a number of online platforms such as the Memorandum of Understanding against counterfeit goods, the Code of Conduct against illegal hate speech and the EU Internet Forum with regard to terrorist content.
The Commission also noted that in its view the DSA does not affect the application of existing laws such as the recently updated Audiovisual Media Services Directive (AVMSD – Directive (EU) 2018/1808) which already ushered in new rules regarding video-sharing platform providers.
The proposed Digital Services Act takes the form of a Regulation, which will be directly applicable in Member States. Its publication marks the beginning of a process that will involve, at a minimum, 18 months of negotiations between the European Commission, the European Parliament and Member States in the Council, followed by six months for the final Regulation to take effect.
However, we can expect intense discussions over definitions and the scope of the new rules, and possibly even legal challenges before the European Court of Justice. Some Members of the European Parliament already wish to go further than the Commission proposal and introduce a complete ban on behavioural advertising online. Fault lines are also emerging between more interventionist countries, such as France, and those already signalling a cautious approach, such as the Nordic countries and Ireland. In reality, the decision-making process could stretch into 2023.
Bird & Bird’s Regulatory & Public Affairs team and legal experts are available to support clients in following these regulatory developments and understanding the potential impacts on their business.