Guarantees of citizens' (users') rights saved the system of filtering and blocking content on the Internet

In the opinion of the CJEU, the preventive measures enabling the monitoring and blocking of user content, as provided for in Article 17(4), restrict users' exercise of the right to freedom of expression and information, but comply with the conditions set out in the Charter.

On April 26, 2022, the Court of Justice of the European Union (CJEU) issued a judgment in case C-401/19 – Poland v. Parliament and Council. The case concerns the most controversial mechanism of the Directive on copyright and related rights in the digital single market: Article 17, which provides for a new liability regime for the largest online content-sharing service providers (hereinafter, service providers). In practice – as the Court itself emphasized in the judgment – this regime obliges service providers to use automatic upload filters that preventively evaluate uploaded content.

Essentially, the case concerned the permissibility of such a solution in the light of the freedom of expression and information, i.e. the fundamental rights of the European Union (Article 11 of the Charter of Fundamental Rights of the European Union). The Court acknowledged that although the new liability regime interferes with the freedom of expression and information, such interference is permissible provided that certain additional conditions are met. In the opinion of the CJEU, the preventive measures enabling the monitoring and blocking of user content, as provided for in Article 17(4), restrict users' exercise of the right to freedom of expression and information, but comply with the conditions set out in the Charter. It is worth emphasizing that these are the guarantees that were introduced into the text of the Directive thanks, in part, to the involvement of organizations such as the Communia Association, of which Centrum Cyfrowe is a member.

New liability rules and new copyright - what Article 17 is all about

The rules that have applied so far (including Article 14(1) of the E-Commerce Directive) assume that the responsibility for content made available rests with the person who uploaded it, and that the platforms themselves bear, at most, secondary liability. Article 17 of the new Directive on Copyright in the Digital Single Market changes these rules, once again extending the copyright monopoly. In this way, a new element of copyright is being created. It relates to the activities of specific online content-sharing service providers: providers whose main purpose is to store and give public access to a large number of copyright-protected works uploaded by users of their services, which the platforms organize and promote for profit-making purposes (Article 2(6) of the Directive).

Although the intention of the authors of the Directive was to apply these provisions to the largest platforms, ultimately the decision on who is subject to these provisions is each time left to the courts. As indicated in recital 63 of the Directive, it should be assessed on a case-by-case basis whether a service provider stores and makes available to the public a large number of works protected by copyright or other subject-matter uploaded by users. 

As the Advocate General noted in his opinion, the EU legislator made a political decision in favor of copyright owners (paragraph 136). In consequence, works are made available to the public not only by those who publish a given work on the Internet, but also by the platform itself (service providers). As a result, these providers become directly responsible for the content uploaded by their users. At the same time, Article 17 is based on the assumption that ‘these providers are not necessarily able to obtain authorization for all protected content that may be uploaded on their platforms by users of the platforms’ (Judgment, paragraph 48). In those circumstances, in order to avoid liability, where users upload unlawful content to the platforms of online content-sharing service providers for which the latter have no authorisation from the rights-holders, those providers must demonstrate that they have made their best efforts to obtain such an authorisation and that they fulfill all the other conditions for exemption, laid down in Article 17 (Judgment, paragraph 49). 

Additionally, the new regime introduces certain mitigating mechanisms which also protect platform operators from the potential negative effects of preventive measures:

  1. the need to perform assessments in accordance with the principle of proportionality (Art. 17(5));
  2. a special scheme for small and new service providers (Art. 17(6));
  3. the need to comply with certain mandatory exceptions and limitations to copyright (Art. 17(7));
  4. the prohibition of a general obligation to monitor uploaded content (Art. 17(8));
  5. a set of procedural guarantees, including a complaint and redress mechanism on the platform and rules on out-of-court redress mechanisms (Art. 17(9)).

Moreover, Art. 17(10) obliges the European Commission (EC) to organize a dialogue between service providers, rights-holders, user organizations and other interested parties. One of its outcomes is the publication of the Commission's Guidelines on the interpretation of Art. 17.

The judgment of the Court, or how copyright law may infringe freedom of speech

The Court confirmed that the new liability regime, provided for in Art. 17 of the Directive, interferes with the freedom of expression and information of users of content-sharing services, guaranteed in Art. 11 of the Charter of Fundamental Rights and Art. 10 of the European Convention on Human Rights (Judgment, paragraphs 55, 58, 82). In particular, the judgment acknowledged that online content-sharing service providers are required to use automatic recognition and upload-filtering tools. The Court emphasized that neither the defendant institutions nor the interveners had been able to indicate at the hearing before the Court any alternatives to such tools. As a consequence, the Court found that providers of such services ‘are therefore forced to preventively filter and block the content in question’ (Judgment, paragraph 53). For this reason – in the opinion of the Advocate General – the EU legislator, ‘rather than directly providing for an obligation to put those tools into place, it has indirectly imposed their use by means of the conditions for exemption from liability laid down in those provisions’ (Opinion, point 62).

At the same time, the Court recognized that the freedom of expression and information is not absolute and may be subject to limitations (Judgment, paragraph 68). In accordance with the Charter of Fundamental Rights (Art. 52(1)), such limitations are permissible as long as they are provided for by law and respect the essence of the rights and freedoms with which they interfere. Moreover, any such restrictions must comply with the principle of proportionality, i.e. be necessary and genuinely meet objectives of general interest recognized by the Union or the need to protect the rights and freedoms of others.

In the opinion of the Court, all these conditions are met by Art. 17 in its present form. Moreover, according to the Court, in the context of ‘online content-sharing services, copyright protection must necessarily be accompanied, to a certain extent, by a limitation on the exercise of the right of users to freedom of expression and information’ (Judgment, paragraph 82). Consequently, the Court found that such restrictions do not infringe the essence of the freedom of expression and information and are admissible in the light of the principle of proportionality (Judgment, paragraph 62). However, in order to keep Art. 17 and the mechanism of limiting the freedom of expression provided for therein in force, the Court, following the Advocate General, set out six conditions under which such a limitation is permissible. These concern:

  1. content filtering and blocking systems that are permitted by law;
  2. respect for citizens' (users') rights, i.e. the right to use works for the purposes of quotation, criticism and review, and for the purposes of caricature, parody and pastiche;
  3. the need for rights-holders to provide, in advance, relevant and necessary information on the content to be filtered and blocked;
  4. the prohibition of a general monitoring obligation;
  5. the introduction of certain procedural guarantees, e.g. mechanisms for citizens (users) to submit complaints and seek redress;
  6. the conduct of a dialogue between interested parties in order to develop practices ensuring the proportionate application of Art. 17.

What are the practical consequences of the above?

Procedural (ex post) guarantees are only a supplement to the ex ante system of protection of citizens' (users') rights

One of the basic issues raised in the proceedings before the Court was whether the protection of citizens' – users' – rights is to be taken into account beforehand (ex ante), i.e. before content is filtered and blocked (removed), or only afterwards (ex post), i.e. only after specific content has been blocked (removed), through the procedural guarantees of submitting complaints and pursuing claims provided for in Art. 17(9) of the Directive. This issue is of fundamental importance.

The adoption of the second interpretation would mean that uploaded content could be massively blocked (deleted) by the platform's service provider, with the full burden of restoring it passed on to the people who upload the content. In such a case, on the one hand, the new liability regime would induce service providers to engage in excessive preventive blocking of content. On the other hand, many users, for various reasons, would refrain from using the mechanisms for restoring blocked content, which would require them to submit separate complaints. As the Advocate General emphasized, such ‘a preventive “over-blocking” of all of those legitimate uses and the systematic reversal of the burden of demonstrating that legitimacy on users could therefore lead, in the short or long term, to a “chilling effect” on the freedom of expression and creation, resulting in a decrease in the activity of those users’ (Opinion, point 187).

The Court agreed with the first interpretation and found that Art. 17(9) of the Directive ‘introduces several procedural safeguards, which are additional to those provided for in Article 17(7) and (8) of that directive’ (Judgment, paragraph 93). This means that both the existence and application of (ex post) procedural guarantees and the inclusion of users' rights at an earlier stage (ex ante) are necessary to maintain a balance between copyright and the rights of citizens – users. As a consequence, the obligation to protect users' rights after uploaded content has been blocked (Art. 17(9)) does not replace the obligation to protect them, in accordance with the law, already at the stage of uploading the content (Art. 17(7)).

Moreover, according to the Court, while providers attempting to exempt themselves from liability under the new regime must make ‘best efforts’, Art. 17(7) – concerning the protection of users' rights – imposes an obligation to achieve a specific result (Judgment, paragraph 78). In particular, this impacts the rules under which content filtering and blocking may occur. The question of how this mechanism will work in practice, however, remains open. Although it is clearly indicated that users have certain rights, the Directive, apart from the complaint mechanism, does not create specific regulations that could enable their enforcement.

When content filtering and blocking systems are allowed

Although the Directive does not specify what means may be used by providers of such services, the Court ruled that those means ‘must be strictly targeted in order to enable effective protection of copyright but without thereby affecting users who are lawfully using those providers’ services’ (Judgment, paragraph 81). This means that service providers have an obligation to prevent systematic and massive content blocking. As a consequence, the platforms covered by the Directive:

  • cannot preventively block all content uploaded by citizens; in particular, they are obliged to use solutions that do not block lawfully uploaded content, mainly content that is not protected by copyright, as well as content uploaded under the exceptions and limitations provided for by copyright (Judgment, paragraph 86);
  • are required to detect and block content only on the basis of the relevant and necessary information provided by the rights-holders (Judgment, paragraph 88);
  • are not obliged to block content whose unlawfulness can only be established through an independent assessment in the light of the exceptions and limitations provided for by law (Judgment, paragraph 90).

For these reasons, the filters used by such service providers may recognize uploaded content, but may not automatically block access to it in all circumstances. It is therefore not permissible to use measures that filter and block lawful content as it is uploaded online (Judgment, paragraph 95). As the Court points out, the use of a filtering system which does not sufficiently distinguish between legitimate and illegal content may end up blocking lawful communications. For this reason, the Court decided that such a system ‘is incompatible with the right to freedom of expression and information, guaranteed in Article 11 of the Charter, and does not respect a fair balance between that right and the right to intellectual property’ (Judgment, paragraph 86).

In this context, it should be emphasized that the Court, contrary to the opinion of the Advocate General, did not formulate precise criteria as to when given content may be blocked automatically – and when an independent assessment in the light of the exceptions and limitations to copyright is required (Judgment, paragraph 90). However, it is clear that fully automated upload filters should only be able to block content that has previously been found to be unlawful by a court, or that is manifestly unlawful, i.e. unlawful regardless of context. In all other cases, a request to block or remove specific content requires separate verification by the rights-holders and due justification (Judgment, paragraph 94).
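The decision logic that the judgment sketches for upload filters can be summarized in a short, purely illustrative snippet. This is a minimal sketch of my reading of paragraphs 88, 90 and 94, not anything prescribed by the Directive or the Court; all function and category names are hypothetical assumptions introduced for illustration.

```python
# Hypothetical sketch of the blocking rule implied by the judgment.
# All names and categories are illustrative assumptions, not legal terms.
from enum import Enum, auto

class Decision(Enum):
    BLOCK = auto()         # may be blocked automatically
    HUMAN_REVIEW = auto()  # requires independent, context-aware assessment
    ALLOW = auto()         # no basis for blocking

def filter_decision(matches_rightsholder_notice: bool,
                    previously_found_unlawful: bool,
                    manifestly_infringing: bool) -> Decision:
    """Decide what an upload filter may do with a recognized upload.

    Blocking is permitted only on the basis of relevant and necessary
    information supplied in advance by rights-holders (para. 88); fully
    automatic blocking only where unlawfulness is clear without any
    contextual assessment (paras. 90, 94).
    """
    if not matches_rightsholder_notice:
        # No prior rights-holder information: no basis for blocking at all.
        return Decision.ALLOW
    if previously_found_unlawful or manifestly_infringing:
        # E.g. a verbatim full copy already held unlawful by a court.
        return Decision.BLOCK
    # Possible quotation, parody, pastiche, etc.: context matters, so the
    # filter may flag the upload but must not block it automatically.
    return Decision.HUMAN_REVIEW
```

The point of the sketch is the default: whenever lawfulness depends on context (exceptions and limitations), the automated path must fall through to human assessment rather than to blocking.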

Polish implementation now!

We are currently waiting for a draft of the Polish regulations implementing the Directive. The Polish government, despite being the losing party in this judgment, contributed to strengthening the guarantees of the rights of citizens (users) enjoying the freedom of expression and information on the Internet. It was as a result of this complaint that the Court was able to indicate that only a creative implementation of the Directive, i.e. one meeting a number of additional conditions and guarantees, would satisfy the requirements of European law.

The judgment undoubtedly provides a binding interpretation of the Directive. First of all, it imposes on the member states a number of obligations related to its implementation into their national legal systems. The directives of the European Union are binding on the Member States as to the achievement of a specific goal, without indicating by what means this goal should be achieved. It is up to the national legislators to select the appropriate measures. Often, however, individual states, when implementing directives, limit themselves to simply rewriting their text into national laws. In the light of the judgment, it is already clear that, in the case of this Directive, such a mechanical transposition of the text of Art. 17 will not be sufficient.

According to the judgment, it will be necessary to propose specific legal solutions and apply such an interpretation of the provision ‘which allows a fair balance to be struck between the various fundamental rights protected by the Charter. Further, when implementing the measures transposing that same provision, the authorities and courts of the Member States must not only interpret their national law in a manner consistent with that provision but also make sure that they do not act on the basis of an interpretation of the provision which would be in conflict with those fundamental rights or with the other general principles of EU law, such as the principle of proportionality’ (Judgment, paragraph 99).

This means that countries which mechanically transferred the content of Art. 17 into their legal systems, such as France or the Netherlands, will be forced to introduce significant changes. In particular, such changes will include providing for specific legal mechanisms that protect, ex ante, the rights of citizens – users posting lawful content (e.g. directly indicating in which cases the use of automatic blocking systems is or is not allowed), or allowing user organizations to access information, supplied by providers of online content-sharing services, on their practices under Art. 17(4) of the Directive (e.g. by granting them the right to request such information).

However, the key solution seems to be the creation of an appropriate mechanism for pursuing claims by citizens (users) whose content has been unlawfully blocked (removed). The necessity of providing such a mechanism is stipulated in Art. 17(9) of the Directive, but its final shape is left to the legislator's decision. The Court's guarantee that service providers bear an obligation of result in taking users' rights into account (Judgment, paragraph 78), and the obligation to conduct a dialogue aimed at developing appropriate rules, referred to in Art. 17(10), are not sufficient in and of themselves. One can guess how service providers will deal, on one side, with an individual user whose rights are not backed by explicit sanctions, and, on the other, with a collective management organization or the owner of a large copyright portfolio, supported by a regime of strict liability for copyright infringement and the possibility of claiming lump-sum compensation.

In practice, it is therefore not possible to secure real freedom of expression and information on the Internet based solely on the involvement of individual citizens (users). For this reason, it seems appropriate to consider mechanisms enabling user organizations to bring class actions. Just as creators needed collective management organizations to fight the massive illegal exploitation of their works, citizens (users) need appropriate organizations, endowed with specific powers, to prevent the mass blocking and removal of content lawfully placed on the Internet. Such a mechanism should provide for a specific catalog of claims, such as the possibility of requesting a prohibition on the use of a particular type of filter that does not meet the requirements indicated by the Court, or lump-sum compensation claims. Without equipping citizens (users) with specific legal instruments, the right to freedom of expression and information will remain an obligation of result only on the pages of the Court's judgment – not in real life.

What will these provisions look like in the Polish implementation of the Directive? We expect that the Polish government, as the author of the complaint in which it strongly sided with citizens, will also, at the national level, give exemplary effect to all the requirements that the Court imposed on national laws, and will present a draft that enables real protection of citizens' (users') rights.

The paradox of Art. 17 - a quick summary

As I pointed out at the beginning of this analysis, in the opinion of the Court, although Art. 17 interferes with the right of citizens – users to freedom of expression and information, it does so in an acceptable manner. This is because it contains a number of guarantees of rights and safeguards that were introduced into the text of the Directive thanks, in part, to the involvement of organizations such as the Communia Association, of which Centrum Cyfrowe is a member. The question, however, remains open: what would have happened if the current Art. 17 had been adopted in the original form proposed by the European Commission?

The famous Article 13 (the current Article 17) was the subject of mass criticism from civil society, academia and defenders of freedom of expression. These organizations, including Centrum Cyfrowe, have clearly indicated that the use of automatic filters to block and remove uploaded content by content-sharing service providers leads to the violation of the essence of freedom of expression and information. Were it not for that determination, Article 13 would have perhaps been adopted in its original form, and consequently the judgment would have been different. 

The judgment in its current form undoubtedly contributes to the empowerment of citizens – users who enjoy freedom of expression and information in the digital environment. Regardless, however, both the judgment and the Directive reflect a tendency to extend the copyright monopoly. After all, it is possible to imagine a situation in which the Court finds that the creation of a new aspect of copyright by the Directive does not meet the requirements of the Charter (including the principle of proportionality), e.g. because its very existence – which violates the right to freedom of expression and information – is not necessary.

In this context, it should be emphasized that, in general, creating exclusive rights to intangible goods, such as works, and extending the scope of these rights, is not an optimal solution. Many of the problems that the Court tried to protect citizens from in its judgment would not have arisen if the copyright system (and more broadly, intellectual property rights) had been properly designed from the beginning. In line with the approach presented, among others, by the Communia Association, these exclusive rights constitute an exception to the principle of the public domain – not the opposite, as is often emphasized by the rights-holders. 

[Note: this analysis was written on April 29]

Konrad Gliściński LL.M PhD, Expert at Centrum Cyfrowe, Intellectual Property Law Chair, Jagiellonian University
