
Today, on 3 October 2019, the Court of Justice of the European Union (CJEU) gave its ruling in the case C-18/18 Glawischnig-Piesczek v Facebook.

The case concerns injunctions obliging a service provider to stop the dissemination of a defamatory comment. Some aspects of the decision could pose a threat to freedom of expression, in particular that of political dissidents who may be accused of defamatory practices.

“This ruling could open the door for exploitative upload filters for all online content,” said Diego Naranjo, Head of Policy at EDRi. “Despite the positive intention to protect an individual from defamatory content, this decision could lead to severe restrictions on freedom of expression for all internet users, with particular risks for political critics and human rights defenders, by paving the way for automated content recognition technologies.”

The ruling confirms that a hosting provider such as Facebook can be ordered, in the context of an injunction, to seek and identify, among all the content shared by its users, content that is identical to content a court has characterised as illegal. Because an obligation to block future content applies to all users of a large platform like Facebook, the Court has in effect accepted as compatible with the E-Commerce Directive that courts demand automated upload filters, blurring the distinction between general and specific monitoring drawn in its previous case law. EDRi is concerned that automated upload filters for identical content will not be able to distinguish between legal and illegal content, in particular when applied to individual words whose meaning depends heavily on the context and the intent of the user.
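To illustrate the concern, consider a purely hypothetical sketch (not a description of any platform's actual system) of a naive filter that matches posts identical to a phrase ruled illegal. Such a filter has no way to tell whether the same words appear in the original unlawful post, a news report quoting the ruling, or a rebuttal of the claim.

```python
# Hypothetical illustration only: a naive "identical content" filter.
# It flags any post containing the exact phrase ruled illegal,
# with no regard for context or the intent of the user.

BLOCKED_PHRASE = "example defamatory statement about the plaintiff"

def naive_identical_filter(post: str) -> bool:
    """Return True if the post repeats the blocked phrase verbatim."""
    return BLOCKED_PHRASE in post.lower()

posts = [
    # The original unlawful post: flagged, as intended.
    "Example defamatory statement about the plaintiff",
    # A news report quoting the ruling: flagged as well, despite being legal speech.
    'The court held that the "example defamatory statement about the plaintiff" was unlawful.',
    # A rebuttal denying the claim: also flagged.
    "It is false to claim that the example defamatory statement about the plaintiff is true.",
]

for post in posts:
    print(naive_identical_filter(post), "-", post[:60])
```

In this sketch all three posts are flagged identically, which is precisely the over-blocking risk described above.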

EDRi welcomes the Court’s attempt to balance the rights at stake (notably freedom of expression and the freedom to conduct a business) and to limit the impact on freedom of expression by distinguishing between searches for identical and for equivalent content. However, the ruling appears to depart from previous case law on the ban on general monitoring obligations (for example Scarlet v. SABAM). Imposing filtering of all communications in order to find one specific piece of content, using non-transparent algorithms, is likely to unduly restrict legal speech, regardless of whether the filters search for content that is identical or equivalent to the illegal content.

The upcoming review of the E-Commerce Directive should clarify, among other things, how online content moderation is to be handled. In the context of this review, it is crucial to address the problem of disinformation without unduly interfering with users’ fundamental right to freedom of expression. Specifically, the business model based on amplifying certain types of content to the detriment of others in order to attract users’ attention requires urgent scrutiny.

European Digital Rights (EDRi)
