Online platforms face far-reaching new duty of care obligations as agreement reached on EU Digital Services Act

Online marketplaces, social media and search engines will soon be subject to ground-breaking new rules in the European Union, following a political agreement reached on the Digital Services Act (DSA) on 23 April. The DSA, which aims to create a safer and more transparent online environment for citizens, will introduce due diligence requirements for all online intermediaries providing services in the EU. With the DSA, Brussels aims to set a global precedent for regulating digital services. This development comes just one month after an agreement was reached on a parallel piece of legislation, the EU Digital Markets Act, which will complement competition law by introducing specific ex-ante regulation for the largest “gatekeeper” platforms.

Requirements under the DSA are also dependent on the size of the relevant online intermediary. Very large online platforms (VLOPs) and very large online search engines (VLOSEs), with more than 45 million monthly active users in the EU, will be subject to stricter obligations since they are viewed as posing increased risks regarding the circulation of illegal content, goods or services. The European Commission will have exclusive power to supervise such large players, in cooperation with Member States. At the other end of the scale, start-ups, small and micro enterprises will be exempted from certain new requirements.

While the official DSA text has not yet been made available, press releases from the EU institutions and comments from the negotiators have shed some light on the shape of the final agreement. The Parliament’s rapporteur, the Danish social-democratic MEP Christel Schaldemose, pointed to additional obligations on algorithmic transparency and disinformation as important achievements. The latter was inspired by mounting concern over online misinformation in the context of the war in Ukraine. To this end, a new provision was added to the DSA introducing a crisis response mechanism that is designed to enable analysis of the impact of the activities of very large online platforms and search engines on a particular crisis and decide on measures to be put in place to safeguard fundamental rights. This mechanism will be activated by the European Commission on the recommendation of the incoming board of national Digital Services Coordinators.

Under the DSA, online platforms that use recommender systems to suggest content to users will be obliged to meet transparency requirements regarding the parameters used and to offer users at least one recommendation option that is not based on profiling.

Furthermore, platforms accessible to minors will also have to put in place special protection measures to ensure their safety online. Platforms will be prohibited from presenting targeted advertising based on the use of minors’ personal data.

For all online intermediaries covered by the DSA, the negotiators agreed to prohibit misleading interfaces known as ‘dark patterns’ and practices aimed at misleading users.

According to the European Parliament’s press release, the final DSA agreement includes the following new rules:

Online platforms

  • Algorithmic accountability: the European Commission as well as Member States will have access to the algorithms of very large online platforms;
  • Swift removal of illegal content online, including products, services: a clearer “notice and action” procedure where users will be empowered to report illegal content online and online platforms will have to act quickly;
  • Fundamental rights to be protected also online: stronger safeguards to ensure notices are processed in a non-arbitrary and non-discriminatory manner and with respect for fundamental rights, including the freedom of expression and data protection;
  • More responsible online marketplaces: they have to ensure that consumers can purchase safe products or services online, by strengthening checks to prove that the information provided by traders is reliable (“Know Your Business Customer” principle) and make efforts to prevent illegal content from appearing on their platforms, including through random checks;
  • Victims of cyber violence will be better protected, especially against the non-consensual sharing of intimate images (“revenge porn”), with immediate takedowns;
  • Penalties: online platforms and search engines can be fined up to 6% of their worldwide turnover. In the case of very large online platforms (with more than 45 million users), the EU Commission will have exclusive power to demand compliance;
  • Fewer burdens and more time to adapt for SMEs: longer period to apply the new rules will support innovation in the digital economy. The Commission will follow closely the potential economic effects of the new obligations on small businesses.

Online space for users

  • New transparency obligations for platforms will allow users to be better informed about how content is recommended to them (recommender systems) and to choose at least one option not based on profiling;
  • Online advertising: users will have better control over how their personal data are used. Targeted advertising is banned when it comes to sensitive data (e.g. based on sexual orientation, religion, ethnicity);
  • Protection of minors: platforms accessible to minors will have to take specific measures to protect them, including by fully banning targeted advertising;
  • Manipulating users’ choices through ‘dark patterns’ will be prohibited: online platforms and marketplaces should not nudge people into using their services, for example by giving more prominence to a particular choice or urging the recipient to change their choice via interfering pop-ups. Moreover, cancelling a subscription to a service should become as easy as subscribing to it;
  • Compensation: recipients of digital services will have a right to seek redress for any damages or loss suffered due to infringements by platforms.

Harmful content and disinformation

  • Very large online platforms will have to assess and mitigate systemic risks and be subject to independent audits each year. In addition, those large platforms that use so-called “recommender systems” (algorithms that determine what users see) must provide at least one option that is not based on profiling;
  • Special measures in times of crisis: when a crisis occurs, such as a public security or health threat, the Commission may require very large platforms to take action to limit any urgent threats on their platforms. These specific actions are limited to three months.

Next steps

Once the DSA text is finalised at a technical level, which should happen in the coming weeks, both the European Parliament and the Council will have to give their formal approval. This process is likely to be completed by the end of July 2022.

Once adopted, the DSA will be directly applicable across the EU and will apply fifteen months after its entry into force or from 1 January 2024, whichever is later. However, the DSA will apply to very large online platforms and very large online search engines at an earlier date, namely four months after their designation as such.

Bird & Bird is available to help companies and organisations with their impact assessment of the applicability of the new rules to their services.

For further information contact:

Francine Cunningham
Regulatory & Public Affairs Director (Brussels)
francine.cunningham@twobirds.com

Francine is the Regulatory & Public Affairs Director, based in Bird & Bird's Brussels office. With 20 years' experience of working in the EU and U.S. regulatory environment, in both the private and public sectors, Francine has extensive knowledge of the European regulatory environment for the media, technology and communications sectors. At a time of enormous regulatory challenges that will define the future of the digital economy, she advises clients how to navigate complex EU decision-making processes to achieve specific industry goals.
