The Commission Guidance on Article 17 of Directive (EU) 2019/790 on Copyright in the Digital Single Market provides much-sought guidance on who is caught by the provision, what licensing obligations exist, and how to tell potentially infringing content from legitimate copyright use. The Commission Guidance has been confirmed by the Advocate-General’s conclusions in the pending Polish annulment case – with the CJEU yet to rule.
The EU Commission has published its guidance document on what has probably been the most controversial issue in copyright across Europe for years, if not decades: the special regime for “online content sharing service providers” – services such as YouTube, TikTok and the like that “store and give the public access to a large amount of copyright-protected works … uploaded by its users”.
The legislator faced a delicate task when drafting Article 17 of Directive 2019/790 on Copyright in the Digital Single Market (“DSM-Directive”), striving to balance the diverging rights and interests of three different types of actors: rightsholders, end users, and platforms. Rightsholders such as music composers or major labels rely on copyright as a property right under the EU Charter of Fundamental Rights, while end users draw on their freedom of speech, and platforms invoke their entitlement “to conduct a business” under the same Charter.
Even though the drafters attempted to lay down detailed rules, the requisite balance between countervailing fundamental rights still remains to be struck: by domestic courts ruling on individual cases, by the European Court of Justice, and by domestic legislators when enacting transposition legislation – not to mention by market actors seeking to ensure compliance and escape liability.
The transposition deadline expired in June 2021, but a few member states are yet to implement the Directive and expect to do so in the coming months. The Commission guidance document is thus very likely to inform ongoing transposition steps, the market behaviour of actors, and future case law. The guidance is not legally binding, but it has been formally adopted as a communication by the Commission and therefore fulfils the mandate under Article 17(10).
The document focuses on four key areas:
- the limitation of its scope as a lex specialis to the general rules under the E-Commerce Directive,
- limitations on member state powers to implement more specific concepts for general notions under the Directive such as “best efforts” or “large amounts of protected works”,
- guidance on best efforts to obtain licenses, and
- how to strike the subtle balance between precautionary blocking of potentially unlawful content to ensure copyright protection and avoiding overblocking that would interfere with the freedom of speech.
In summary, the Commission highlights:
- First, the special regime under Article 17 of the DSM-Directive will only be applicable to a limited set of market actors. The notion of “online content sharing service providers” ought to be construed narrowly as limited to platforms that compete actively with other online content services. This legal description of the “value gap” reasoning is enshrined in Recital 62, but the Commission has added some further narrowing language.
- Second, member states are not permitted to develop their own concepts of “large amount of copyright-protected works” or “best efforts” to obtain licenses, since these are autonomous concepts of Union law to be assessed on a case-by-case basis. The relevant provisions of the Directive are therefore likely to trigger references for a preliminary ruling to the CJEU, as has been the case with the “communication to the public” right under the Infosoc Directive.
- Third, best efforts to obtain licenses typically include reaching out to collecting societies as representative rights owners, but platforms may also limit the scope of their obligations by restricting the type of content typically available on their systems.
- Fourth, owing to the limitations of technology, automated blocking should in principle be limited to manifestly infringing content. The Commission includes details and examples to flesh out the novel concept of “manifestly infringing content”, which appears to be inspired by German transposition legislation.
- And finally, there is no general obligation to deploy all publicly available filter technologies, let alone to develop in-house filters, should suitable solutions not be readily available.
Online content sharing service providers: A special regime with a narrow scope
Recital 62 is fairly explicit as to what Article 17 is intended to regulate: the Directive “should target only online services that play an important role on the online content market”, namely “by competing with other online content services, such as online audio and video streaming services, for the same audiences.” This is a conceptual reference to the value gap policy argument that now turns into a scope-limiting requirement, as is apparent from Advocate-General Saugmandsgaard Øe’s recent opinion in the Polish annulment action pending before the CJEU (C-401/19). That opinion is not reflected in the Commission document, which was published beforehand; the guidance does note, however, that it may need to be reviewed following the CJEU decision in that case.
The criteria for determining the scope of Article 17 as listed in the Directive and further explained by the Commission (see below) must be interpreted in the light of this policy argument: the value gap occurs where copyright-protected content is available for free on commercial (mostly ad-funded) platforms, driving consumers away from licensed (mostly subscription-based) content providers, without what is perceived to be a fair proportion of the platforms’ revenue being shared with the rightsholders. Against this background, requirements such as “important role”, “competing with other online content services … for the same audiences”, “large amount of works” or “promoting … to attract a larger audience” should therefore be understood as being met only by services that consumers looking for specific content are likely to choose instead of content providers such as Spotify, Netflix, etc.
The Commission specifically relies on the indicative value of the explicit exclusion of services such as not-for-profit online encyclopaedias, open source software development platforms or online marketplaces. As the Commission suggests, the list under Art. 2(6) of the Directive (“such as”) is “non-exhaustive”, meaning that each service is to be “assessed on a case by case basis, taking into account a combination of elements such as the audience of the service provider and the number of files uploaded by users overall”.
The French implementation of Article 2(6) does not reproduce “such as”, meaning that under French law the list of excluded online services appears to be exhaustive.
Such a case-by-case assessment will indeed include a review of the “main purpose” of each service, as per Recital 62:
“The services covered by this Directive are services, the main or one of the main purposes of which is to store and enable users to upload and share a large amount of copyright-protected content with the purpose of obtaining profit therefrom, either directly or indirectly, by organising it and promoting it in order to attract a larger audience, including by categorising it and using targeted promotion within it.”
Further elements to be considered are the categorisation of content and the use of targeted advertising. Related profits have to be made “from the organisation and promotion of the content uploaded by the users in a manner to attract a wider audience”. In turn, the “simple fact of receiving a fee from the users to cover the operating costs of hosting the content … should not be considered as an indication of profit-making purpose.”
All these elements, taken together, will inform the distinction to be made as to whether a service competes “with other online content services, such as online audio and video streaming services, for the same audiences”, as Recital 62 phrases it. The “overall assessment” language employed by the Commission, in turn, is reminiscent of the similar phrase developed by the CJEU in “communication to the public” cases which to date has led to an impressive number of over 20 decisions. The key definition of online content sharing service providers thus bears considerable potential for CJEU referrals.
Guidance on “best efforts” to enter into license agreements
As the guidance document highlights elsewhere, “cooperation is key” which is reflected in the Directive’s recurrent recourse to “best efforts” to be deployed by platforms to enter into licensing agreements, or to set up state-of-the-art filter technologies – all those requirements to be qualified and fine-tuned by the proportionality principle in each individual case. This is part of the overall approach to reconcile countervailing rights of copyright owners, platforms and end users in a “procedural” fashion: by cooperation and mutual checks-and-balances.
The Commission now broadly distinguishes two types of obligations: reaching out to rights holders proactively versus engaging (passively) with rights holders approaching platforms. To demonstrate best efforts, the Commission seems to require providers to “engage proactively with rights holders that can be easily identified and located, notably those representing a broad catalogue of works …” – which includes collective management organisations as a “minimum requirement for all online content sharing service providers”. Due efforts will also be assessed on a case-by-case basis, taking into account:
“the size and audience of the service and the different types of content …, including the specific situations where some types of content may appear only rarely on the service.”
Again, this draws on the general purpose of Art. 17 to prevent the value gap from occurring: even services that – taken as a whole – fall within the scope of the regime do not have to proactively seek licenses for types of content in relation to which they do not compete with licensed content providers.
Furthermore, this gives platforms some power to influence the scope of their obligations, for example by taking precautions to limit the type of content that will typically be available to users. If a service offers no, or very few, options to upload images, it will not be necessary to contact the relevant CMO. On the other hand, obligations go well beyond this level where rights holders reach out to platforms proactively. In that case, online content sharing service providers “would be required to engage with all the rights holders approaching them to offer a licence.” This obligation of good conduct goes even further:
“Online content-sharing service providers should engage with those rightholders that wish to offer an authorisation for their content, irrespective of whether their type of content (e.g. music, audio-visual content, images, text) is prevalent or is less common on the website of the service provider (e.g. images or text for a video-sharing platform).”
Automated blocking: ex-post fine-tuning efforts informed by Member State transposition concepts
Automated blocking has probably been the most controversial topic in the debate over Article 17 of the DSM-Directive. The boundaries are enshrined in Article 17(7), which provides:
“The cooperation between online content-sharing service providers and rightholders shall not result in the prevention of the availability of works or other subject matter uploaded by users, which do not infringe copyright and related rights, including where such works or other subject matter are covered by an exception or limitation.”
In other words: legitimate uses should remain online from the outset. As the Commission Guidance specifies, this is an obligation of result – a somewhat idle obligation given the current state of the art, since “no technology can assess the standard required in law whether content … is infringing or a legitimate use”. In particular, the use of third-party copyright works may be lawful under exceptions that are either mandatory or optional under the Infosoc Directive (2001/29/EC), or under exceptions specified in the DSM-Directive such as quotation, parody or pastiche – uses which are particularly difficult for automated filters to identify.
The conflict is apparent: on the one hand, Article 17(4) of the DSM-Directive imposes a number of obligations on platforms, namely best efforts to obtain licenses, filtering, and notice-and-take-down as well as notice-and-stay-down mechanisms following a sufficiently substantiated request. On the other hand, Article 17(7) seeks to ensure that non-infringing content stays online. The Commission Guidance leaves no doubt, however, as to which obligation shall prevail in case of conflict:
“Article 17(7), (8) and (9) are formulated as obligations of result. The Member States should … ensure in their implementing laws that these obligations prevail in case of conflict with the provisions set out elsewhere in Article 17 and in particular in Article 17(4).”
The proportionality principle and the fundamental rights at issue would therefore seem to require a risk-based approach: identifying situations where copyright infringement is the most likely scenario as opposed to others where defences are likely to apply. The Commission concludes that “automated blocking… should in principle be limited to manifestly infringing uploads.” Any other content should “in principle go online and may be subject to an ex post human review when rightholders oppose by sending a notice.”
Concurring with this key piece of guidance, Advocate-General Saugmandsgaard Øe has concluded in the Polish annulment case (C-401/19, para. 198):
“to minimise the risk of ‘over-blocking’ and, therefore, ensure compliance with the right to freedom of expression, an intermediary provider may, in my view, only be required to filter and block information which has first been established by a court as being illegal or, otherwise, information the unlawfulness of which is obvious from the outset, that is to say, it is manifest, without, inter alia, the need for contextualisation.”
For the Commission, decisive factors for establishing a manifest infringement are the length/size of the identified content, the proportion of the matching content in relation to the entire upload and the level of modification of the work. In particular:
“exact matches of entire works or of significant proportions of work should normally be considered manifestly infringing (e.g. when the recording of the whole song is used as background in the user created video).”
Counter-examples of non-manifestly infringing content include “adding elements to a picture to create a ‘meme’” (which could be read as implying that moral rights are not covered by Article 17, facilitating the task of the platforms) or short extracts representing a small proportion of the entire work (likely to be covered by the quotation exception), such as extracts of a feature film or of a song.
The “manifestly infringing” standard does not represent a legal assessment of the “legitimacy of the upload“. It is a purely procedural standard aimed at striking a fair balance between countervailing fundamental rights. A more detailed review, including the actual standing of the rightholder, will be carried out in a subsequent notice and takedown/staydown procedure under Article 17(4)(c) of the DSM-Directive following a sufficiently substantiated notice by rightholders.
This distinction based on manifestly infringing content was introduced by the German transposition legislation in an attempt to give platforms clear guidance as to the proper set-up of automated filter technologies. The Dutch transposition legislation, while closer to the wording of the Directive, includes a provision that allows the Ministry of Justice to provide further rules on the application of the Article 17 implementation (by way of “governmental decree”) – thus paving the way for more detailed guidance for market actors in the future. In its “Promusicae” judgment, the CJEU had already established that “Member States must … take care to rely on an interpretation of the directives which allows a fair balance to be struck between the various fundamental rights protected by the Community legal order” (C-275/06, para. 68). The Commission document now sheds some light on this delicate balancing exercise to be performed by the member states.
Italy, by contrast, has opted in its transposition for ex ante blocking of content pending a complaint. The Italian Communications Authority (AGCOM) is tasked both with issuing guidelines on the complaint and redress mechanism and with deciding appeals against online content sharing service providers’ decisions rendered further to a complaint. The right to bring judicial proceedings is unaffected.
Concerning the scope of filter technologies, online content sharing service providers are not expected to apply the most costly or sophisticated solutions, in particular where applicable technologies are not readily available on the market. Depending on the type of service and its exposure to third-party copyright infringement, standard filter software would not necessarily be considered the state-of-the-art market standard to be complied with, as the Guidance document highlights.
Poland is one of the member states that has not even started implementing the DSM-Directive, as it decided to challenge Article 17 before the EU’s highest court, seeking to annul the filtering obligation on the grounds that it would lead to censorship and limit the freedom of expression and information guaranteed in Article 11 of the EU Charter of Fundamental Rights (C-401/19). Even though Advocate-General Saugmandsgaard Øe, in his postponed opinion of 15 July 2021, found Article 17 of the DSM-Directive compatible with Article 11 of the Charter, the judgment of the CJEU in this case is still pending. It is not yet clear whether the Commission’s recently published guidance on Article 17 will change Poland’s position on the matter, but (as of the date of publication) Poland does not appear to be planning to start the implementation process before its challenge to Article 17 has been finally decided.
Implications of Brexit
After publication of the DSM Directive, the UK government announced that it would not be transposing it into UK law. This means that there will be a material divergence in law in all areas governed by the Directive, including the requirements under Art. 17. This decision will have led to frustration for UK-based businesses that lobbied for the creation of Art. 17 in an attempt to fill the perceived “value gap”, but it will also cause legal uncertainty for platforms operating across both the UK and the EU. By way of example, although the DSM Directive is stated to be lex specialis, it will not take precedence in the UK over more generally stated EU law, such as the safe harbours set out in Directive 2000/31/EC. This means that online content sharing service providers may be able to continue to operate under a more permissive notice-and-takedown style regime in the UK, without the need to enter into any arrangements with rightsholders. (Although this may now also be subject to closer scrutiny as to whether the platform itself communicates the work to the public – see the CJEU decision in YouTube (C-682/18).) In light of the significance of the UK market, this could result in a continuance of that value gap. It may be that, to facilitate business, the UK decides at some point in the future to create new law in this area; however, it could well wait to see how successful the national implementation of Art. 17 is (or how many CJEU referrals it generates…) before considering the best approach from a UK perspective.
Rebecca O’Kelly Gillard and Niels Lutzhöft on behalf of the Copyright Group, with contributions from Edouard Treppoz (France), Clemens Molle (Netherlands), Benoit van Asbroeck (Belgium), Piotr Dynowski (Poland), and Eleonora Rosati (Italy).