In May 2019, the Republic of Poland brought an action under Article 263 TFEU (review of the legality of acts intended to produce legal effects vis-à-vis third parties) for the annulment of Article 17 of the then-new Directive (EU) 2019/790.
The action sought the annulment of Article 17 in part, namely Article 17(4)(b) (the obligation to make best efforts, in accordance with high industry standards, to ensure the unavailability of flagged content) and Article 17(4)(c), in fine, in so far as it contains the words “best efforts to prevent their future uploads in accordance with point (b)”. In the alternative, Poland asked the Court to annul Article 17 in its entirety.
The Directive on copyright in the Digital Single Market
Adopted on 17 April 2019, the Directive sets out new rules regarding copyright in a digital environment.
Article 17 of the Directive amends the liability rules applicable to certain service providers (online content-sharing service providers). It aims to remedy the so-called “Value Gap” between the revenues earned by those providers and the revenues distributed to rightsholders.
By means of Article 17, the Directive greatly enhances the rights of rightsholders vis-à-vis online content-sharing service providers (defined as providers of information society services of which the main or one of the main purposes is to store and give the public access to a large amount of copyright-protected works or other protected subject matter uploaded by their users, which they organise and promote for profit-making purposes).
It makes clear that such providers communicate works uploaded by their users to the public when they give access to those works. This was a much-disputed issue under Directive 2001/29 (see, for example, YouTube and Cyando, C-682/18 and C-683/18).
To avoid liability, providers must obtain an authorisation from the rightsholders (such as a licensing agreement). In the absence of such authorisation, they are in principle primarily liable for the content unless they can prove that they: (a) made best efforts to obtain an authorisation; (b) made best efforts to ensure the unavailability of content notified to them by the rightsholders (the obligation disputed by the Republic of Poland in its entirety); and (c) reacted expeditiously to takedown notices and made best efforts to ensure that such content stayed down (the latter obligation being the one disputed by Poland).
To determine whether a provider has met this standard in a given situation, the principle of proportionality applies (Article 17(5)).
The Directive’s implementation deadline expired on 7 June 2021, but only 16 Member States have transposed it so far.
The claimant’s arguments
In its challenge, the Republic of Poland in essence claimed that the disputed provisions create a preventive monitoring measure that is incompatible with Article 11 of the Charter of Fundamental Rights of the European Union (freedom of expression and information).
The claimant submitted that, in order to fulfil the best-efforts obligations to make content unavailable and keep it down, providers must use software tools that automatically filter the content users wish to upload. Since such tools are not error-free and are generally unable to recognise lawful uses of content, their deployment would lead to over-blocking of user content, which could only be restored ex post (and only if the user decided to file a complaint and seek redress).
On that basis, Poland argued that the measure constitutes a limitation on the exercise of the right to freedom of expression which either undermines the essence of that right or, at the very least, fails to comply with the principle of proportionality.
The AG’s Opinion
The Advocate General (AG) advised the CJEU to dismiss the action.
The AG stated, among other things, that on a correct interpretation of Article 17 service providers must protect users’ rights ex ante and may not preventively and systematically block content falling within, inter alia, the scope of the exceptions and limitations to copyright.
The Court’s judgment
The Court addressed the admissibility of the action at the outset. Like the AG, it held that Article 17 introduces a new liability regime whose various parts form an indivisible whole. As such, Article 17 cannot be annulled in part: it must be annulled either as a whole or not at all. Any annulment of Article 17 would, however, leave the remainder of the Directive unaffected.
That said, the Court turned to the regime introduced by Article 17, which applies only to online content-sharing service providers and in principle requires them to seek authorisation from the relevant rightsholders before making user-uploaded content available.
Turning to Article 17(4), the CJEU characterised it as a specific liability regime applicable where no authorisation has been obtained from the rightsholders concerned despite the providers’ best efforts.
The Court then noted how such a regime is clarified and supplemented by Article 17(5) and (10). Article 17(7) further specifies that the cooperation between the providers and rightsholders should not result in the prevention of the availability of lawful uploads, including those covered by exceptions and limitations.
Article 11 of the EU Charter of Fundamental Rights corresponds to Article 10 of the European Convention on Human Rights. The Court stressed the importance of the internet to freedom of expression and information, something also stressed in last year’s ruling in YouTube and Cyando, C-682/18 and C-683/18.
Turning to the core of the claim, the Court accepted that Article 17(4)(b) and (c) imposes on online content-sharing service providers monitoring obligations that in practice require them to employ automated content recognition (ACR) and filtering technologies. Such technologies restrict freedom of expression and information. That freedom is, however, not absolute: its exercise may be restricted under certain conditions, provided that the principle of proportionality (itself a general principle of EU law) is respected.
In order to determine whether the limitation on freedom of expression and information entailed by Article 17(4)(b) and (c) is permissible, the Court considered it necessary not to confine the analysis to Article 17(4) but also to: (i) take account of Article 17(7)-(10), which specify and complete that liability regime; and (ii) consider the legitimate objective pursued by Article 17 as a whole.
The Court then added that, when more than one interpretation is possible, precedence should be given to that which does not call into question the validity of the provision at hand and is in accordance with primary law as a whole and, in particular, with the provisions of the Charter.
Ultimately, the Court held that:
- The limitation on the exercise of freedom of expression effected by Article 17 is provided for by law.
- Even though Article 17(4)(c) does not specify what measures the providers must adopt to ensure the unavailability of specific content, this is not in itself contrary to freedom of expression and information and may even be necessary to respect their freedom to conduct a business under Article 16 of the Charter.
- The limitation on freedom of expression effected by Article 17 does not impair the essence of that freedom, in view of (i) the obligation of result contained in Article 17(7) and (ii) the wording of the third subparagraph of Article 17(9) concerning “legitimate uses”.
Agreeing with the AG, and in line with the principles laid down in its earlier case law, the Court concluded that compliance with Article 17(4)(b) and (c) cannot be achieved through measures that block lawful uploads. It follows from Article 17(7) and (9), as well as recitals 66 and 70, that ACR technologies which do not adequately distinguish between lawful and unlawful content may not be used to block user uploads ex ante, including where the relevant exceptions and limitations – which the Court referred to as ‘rights’ – are at issue.
Comment
The ruling adds complexity to the already complex legal landscape of platform liability.
All Member States will need to ensure that their laws do not mandate the ex ante blocking of content, especially content whose lawfulness the providers would have to assess independently. Algorithms that rely on the quantity of a work used in an upload may not offer sufficient safeguards for users’ rights.
The Member States that have already implemented the Directive will need to reassess whether their national laws strike a proper balance between the rights of users and those of rightsholders.
In all this, the Court’s mandate is clear: lawful content must not be blocked, and the technological solutions adopted to comply with Article 17(4) of the Directive must adequately distinguish between lawful and unlawful uploads.