The EU is currently working on, or passing, major legislative changes to address issues in the digital marketplace and in relation to "Big Tech". These include the Digital Single Market Directive and the forthcoming Digital Services Act, which will make sweeping changes to how companies operate in the digital space. One hotly contested point within this package is Article 17 of the DSM Directive, which was challenged by Poland in a much-anticipated case before the CJEU. Following an opinion from Advocate General Øe as far back as July last year, the Court recently handed down its decision, which sets the scene for the application of the DSM Directive in the near future.
The case of Republic of Poland v European Parliament concerned an action for annulment brought by Poland in relation to Article 17. The provision makes platforms that allow users to upload digital content, such as YouTube, liable for copyright infringement if that user-uploaded material infringes copyright. However, platforms can avoid liability by making "best efforts" to either acquire a license or block the infringing content. They are also obliged to act expeditiously following any notification by rightsholders to remove or disable the content, and to use "best efforts" to prevent any future uploads of it, for example through the use of content filtering.
As is clear, Article 17 is set to change the landscape for platforms like YouTube, introducing strict measures and steps that those platforms must take in relation to infringing content. Poland challenged the provision on the ground that it infringed Article 11 of the Charter of Fundamental Rights of the European Union, which enshrines the freedom of expression in EU law.
Poland specifically argued that, in order to be exempted from all liability for giving the public access to copyright-protected works or other protected subject matter uploaded by their users in breach of copyright, online content-sharing service providers are required to carry out preventive monitoring of all the content their users upload. The imposition of these mandatory measures, without appropriate safeguards in place, would breach the right to freedom of expression.
The Court discussed its decision in YouTube and Cyando, which sets out that "the operator of a video-sharing platform or a file-hosting and ‑sharing platform, on which users can illegally make protected content available to the public, does not make a ‘communication to the public’ of that content, within the meaning of that provision, unless it contributes, beyond merely making that platform available, to giving access to such content to the public in breach of copyright". In addition, those service providers are specifically exempt from liability for infringement provided that they do not play an active role of such a kind as to give them knowledge of or control over the content uploaded to their platform.
Article 17, as discussed above, is set to change this position and introduce new liability provisions and obligations on platforms.
As set out by the Court, Article 11 of the Charter and Article 10 of the European Convention on Human Rights guarantee the freedom of expression and information for everyone, including in relation to the dissemination of information, with the internet being a key means of that dissemination. Therefore, courts have to take due account of the particular importance of the internet to freedom of expression and information and ensure that those rights are respected in any applicable legislation.
In considering whether Article 17 has an impact on people's freedom of expression, the Court initially noted that the provision assumes that some platforms will not be able to obtain licenses from rightsholders for the content uploaded by users, as rightsholders are free to determine whether and under what conditions their works and other protected subject matter are used. If no authorisation is received, the platforms only have to "...demonstrate that they have made their best efforts... to obtain such an authorisation and that they fulfil all the other conditions for exemption" in order to avoid liability.
The conditions include acting expeditiously, when notified by rightsholders, to disable access to or remove the potentially infringing content, and putting in place appropriate measures to prevent the uploading of infringing content (e.g. through the use of automatic recognition and filtering tools) and to prevent future uploads. The Court accepted that these measures can restrict an important means of disseminating online content and thus constitute a limitation on the right guaranteed by Article 11 of the Charter. It therefore concluded that "the specific liability regime [under] Article 17(4)... in respect of online content-sharing service providers, entails a limitation on the exercise of the right to freedom of expression and information of users of those content-sharing services".
However, the next piece of the puzzle the Court addressed was whether the limitation could be justified.
The Charter does allow legislation to limit the freedoms it contains; however, any limitations of those rights must be proportionate and made "... only if they are necessary and genuinely meet objectives of general interest recognised by the [EU] or the need to protect the rights and freedoms of others". If there is a choice between several different measures, the least onerous one has to be chosen, and the disadvantages caused must not be disproportionate to the aims pursued. Finally, adequate safeguards need to be put in place within the legislation to protect those rights from abuse.
While the Directive and Article 17(4) do specify the limitations placed on the freedom of expression, they do not specify the means through which those limitations should be implemented, only that platforms need to use their "best efforts" in doing so. Although, as set out above, the Charter requires that limitations be specified, the Court noted that they can be formulated in terms that are sufficiently open to keep pace with changing circumstances.
The Court also noted that Article 17(7) (and, similarly, Article 17(9)) requires that expression which does not infringe copyright must not be limited, a specific obligation going well beyond mere "best efforts". The Court accepted that Articles 17(7) and (9) adequately protect the right to freedom of expression and information of users of online content-sharing services, and confirmed that this struck an adequate balance between the interests of users and rightsholders. Looking at the liability mechanism itself, the Court agreed that it was not only appropriate but also appears necessary to meet the need to protect intellectual property rights.
The platforms are also somewhat protected by the provision's notification requirements: without notification, broadly speaking, they will not be held liable for any infringing content, and any notification has to contain sufficient information to enable the content to be removed easily. Article 17(8) also specifically excludes any general monitoring obligation on the platforms. Finally, Article 17(9) contains several other procedural safeguards to protect the right to freedom of expression and information of users where service providers erroneously or unjustifiably block lawful content.
In summary, the Court set out that "...the obligation on online content-sharing service providers to review, prior to its dissemination to the public, the content that users wish to upload to their platforms, resulting from the specific liability regime established in Article 17(4)... and in particular from the conditions for exemption from liability... has been accompanied by appropriate safeguards by the EU legislature in order to ensure... respect for the right to freedom of expression and information of the users of those services... and a fair balance between that right, on the one hand, and the right to intellectual property, protected by Article 17(2) of the Charter".
The decision is undoubtedly correct and, although a bitter pill to swallow for many platforms that will be impacted by it, the aim of the legislation seems not to be to unduly burden them with monitoring obligations or to stifle freedom of expression, but to balance that freedom with the protection of legitimate copyrights held by other parties. It remains to be seen what filtering measures the EU courts will consider adequate if they are challenged in the future, but one can imagine sufficient measures already exist, such as YouTube's Content ID. The DSM Directive is the first big step in the new IP regime in Europe, with more changes set to come in the near future.