— last modified 12 May 2022

Today, 11 May, is a worrying day for every person in the EU who wants to send a message privately without exposing their personal information, like chats and photos, to private companies and governments.

The European Commission has adopted its proposed “Regulation laying down rules to prevent and combat child sexual abuse” online, including measures which put the integrity of secure communications at risk. Commissioner Ylva Johansson has spearheaded a proposal which could still force companies to turn our digital devices into potential pieces of spyware, opening the door to a vast range of authoritarian surveillance tactics. Despite several attempts at safeguards, the core dangers of the proposal remain, putting at risk journalists, whistleblowers, civil rights defenders, lawyers, doctors and others who need to keep their communications confidential.

Coercive scanning obligations

The proposed law contains measures which, at a very minimum, allow the widespread scanning of people’s private communications (which EDRi has warned may constitute illegal generalised monitoring) and, in some cases, go further by forcing scanning and removals.

It will do this to search not only for verified illegal child sexual abuse material (CSAM), but also for new photos and videos, as well as for evidence of text-based “grooming”, both of which inevitably require notoriously inaccurate AI-based tools to scan our most intimate conversations.
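
To see why accuracy matters at this scale, consider a rough, hypothetical back-of-the-envelope calculation. The figures below are illustrative assumptions, not numbers from the proposal or from EDRi:

```python
# Hypothetical illustration: even a seemingly small error rate produces
# enormous numbers of false alarms when applied to everyone's messages.
# Both figures below are assumptions chosen only for illustration.

daily_messages = 10_000_000_000     # assumed EU-wide daily message volume
false_positive_rate = 0.001         # assumed 0.1% of innocent messages misflagged

false_alarms_per_day = daily_messages * false_positive_rate
print(f"{false_alarms_per_day:,.0f} innocent messages flagged per day")
# -> 10,000,000 innocent messages flagged per day under these assumptions
```

Under these assumptions, millions of innocent messages would be flagged for human review every single day.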

Furthermore, the proposal will discourage the use of end-to-end encryption and heavily incentivise providers to take the most intrusive measures possible in order to avoid legal consequences.

    “By threatening companies with legal actions, the Commission is likely trying to wash their hands of the responsibility for dangerous and privacy-invasive measures, whilst de-facto incentivising these measures with the law.” – warns Ella Jakubowska, Policy Advisor, EDRi

The proposal may superficially appear to take a balanced and proportionate approach. In particular, providers can only be forced to scan their platform or service if required to do so by a judicial authority, and are subject to a series of safeguards. According to Contexte, many of these safeguards have only been introduced in the last few days, which shows that pressure from the EDRi network and our supporters has had a positive effect.

However, several provisions indicate that these protections are mainly cosmetic, and that we may in fact be facing the worst-case scenario for private digital communications. For example, providers of services and platforms have to take action to mitigate the risk of abuse being facilitated by their platform. But they will still be liable to be issued with a detection order forcing them to introduce additional measures unless their risk assessment demonstrates that there is no remaining risk of abuse at all.

This is an impossible standard to achieve, meaning that the law will likely have a coercive effect, pushing providers to introduce on their own initiative as many ‘mitigation measures’ as possible. Furthermore, the technical methods that the new EU Centre will make available to providers forced to scan private communications are likely to pose drastic security risks.

The end of end-to-end encryption?

As EDRi has repeatedly argued, such detection measures will inevitably lead to dangerous and unreliable Client-Side Scanning practices, undermining the essence of end-to-end encryption. Cybersecurity experts and technologists all around the world have warned that you cannot safely or effectively circumvent encryption in this way. Such interference with the encryption process will also make everyone’s phones vulnerable to attacks from malicious actors. Alternatively, the proposal may simply incentivise providers to abandon encryption entirely.
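
As a rough illustration of why client-side scanning and end-to-end encryption are at odds, the sketch below shows, in simplified and hypothetical form (not any provider’s real implementation; the hash list, function names and reporting step are placeholders), that the scan necessarily runs on the plaintext before it is ever encrypted:

```python
# Minimal sketch of client-side scanning: the content is inspected in
# plaintext on the device, so the "end-to-end" guarantee no longer covers it.
# Everything here is a hypothetical placeholder.

import hashlib

# Hypothetical list of hashes the client is ordered to detect.
BLOCKLIST_HASHES = {
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",  # sha256(b"test")
}

def client_side_scan(plaintext: bytes) -> bool:
    """Return True if the plaintext matches the blocklist.

    Simplified exact hash match; real proposals rely on error-prone
    perceptual hashing and AI classifiers, which makes things worse.
    """
    return hashlib.sha256(plaintext).hexdigest() in BLOCKLIST_HASHES

def encrypt_for_recipient(plaintext: bytes) -> bytes:
    # Placeholder standing in for a real E2EE step (e.g. the Signal protocol).
    return plaintext[::-1]

def transmit(ciphertext: bytes) -> None:
    print("sending", len(ciphertext), "encrypted bytes")

def send_message(plaintext: bytes) -> None:
    if client_side_scan(plaintext):
        # The scan happens before encryption, so a match can be reported
        # even though the transport is "end-to-end encrypted".
        print("match found: message flagged before encryption")
        return
    transmit(encrypt_for_recipient(plaintext))

if __name__ == "__main__":
    send_message(b"test")           # flagged before it is ever encrypted
    send_message(b"hello, world")   # encrypted and sent
```

Because the inspection happens before encryption, the confidentiality that end-to-end encryption is supposed to guarantee no longer applies to the scanned content, and the scanning component itself becomes a new attack surface on every device.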

    “The European Commission is opening the door for a vast range of authoritarian surveillance tactics. Today, companies will scan our private messages for CSAM content. But once these methods are out there, what’s stopping governments forcing companies to scan for evidence of dissidence or political opposition tomorrow?” – Ella Jakubowska, Policy Advisor, EDRi

The limitations of a technological approach

This proposal has failed to respect EDRi’s ten principles which advise policymakers on how to defend children’s rights in the digital age in a targeted and proportionate way. At its core, the proposal attempts instead to find a quick technological fix to complex systemic and societal problems.

As the European Parliament reports, countries have long failed to implement existing measures to tackle child sexual abuse and exploitation. Rushing to force the use of unsafe technologies against these systemic harms, in ways that could break the most important foundations of our society – the privacy and security of communications – is therefore a serious risk of the proposal. Given that Commissioner Johansson has repeatedly refused to meet digital rights groups, we are disappointed, but not surprised, that the legislation lacks awareness of the inherent limitations of technology and of the serious implications for society at large.

    “The proposal includes a requirement for internet service providers to block access to specific pieces of content on websites under orders from national authorities. However, this type of blocking will be technically impossible with HTTPS, which is now used on almost every website.” – Jesper Lund, Chairman of EDRi member IT-Pol Denmark
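
The technical reason is that an on-path blocker, such as an internet service provider, sees only connection metadata for an HTTPS request. The sketch below (hypothetical data structures, not a real filtering system) illustrates why such a blocker can at best block an entire domain, never a specific page:

```python
# Illustration of the limitation described above: with HTTPS, the URL path,
# query string and page content travel inside the encrypted TLS stream, so an
# on-path observer cannot match an order against a specific page.

from dataclasses import dataclass

@dataclass
class ObservedConnection:
    dst_ip: str          # destination IP address
    sni_hostname: str    # server name from the TLS ClientHello (unless encrypted via ECH)

# A hypothetical order to block one specific page:
ORDERED_URL = "https://example.org/forum/thread/12345"

def can_block_exact_url(conn: ObservedConnection) -> bool:
    # The observer never learns "/forum/thread/12345"; it only sees the
    # metadata in ObservedConnection, so exact-URL blocking is not possible.
    return False

def domain_level_block(conn: ObservedConnection, blocked_hosts: set[str]) -> bool:
    # The only available lever: block the whole hostname, over-blocking
    # every other page hosted on it.
    return conn.sni_hostname in blocked_hosts

conn = ObservedConnection(dst_ip="203.0.113.7", sni_hostname="example.org")
print(can_block_exact_url(conn))                     # False
print(domain_level_block(conn, {"example.org"}))     # True, but blocks all of example.org
```

In practice, this means blocking orders either fail or over-block entire websites.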

We urgently need the European Parliament to step in and explicitly reject generalised scanning, fully protect encrypted services, and remove all obligations to search for anything other than known images or photos when requested by independent judicial authorities. EDRi calls for legislation that keeps our communications secure – and this proposal is not it.

European Digital Rights is a dynamic collective of 45+ NGOs, experts, advocates and academics working to defend and advance digital rights across Europe. It advocates for robust and enforced laws, informs and mobilises people, promotes a healthy and accountable technology market, and builds a movement committed to digital rights in a connected world.

European Digital Rights (EDRi)
