There are currently 53 signatures.
We need to collect as many as possible, so add yours now!
Dear European Parliament members,
Two years ago, a law was presented to the European Commission. It would allow messaging service providers to automatically screen messages sent through them in order to detect the distribution of child pornography. If a message is flagged as suspicious, its content is shared with employees and police for further investigation.
On July 5, 2021, despite much pushback, this law, dubbed ChatControl, was temporarily adopted for the next 3 years. An absolute majority of parliamentarians voted in favour.
After nearly a year of closed-off discussion, a new, permanent law was proposed on May 11, 2022. It would make the screening compulsory for all messages, videos, images, e-mails, and voice and video calls, and even for cloud storage such as iCloud or Google Drive.
First off, protecting children is incredibly important. However, despite its good intentions, the proposal could have massive adverse effects - much like forcing the post office to open and read everyone's letters without a court order:
The privacy of digital communication would effectively be abolished. Nearly all citizens' communication - including that of journalists, activists and whistleblowers - would be constantly watched and scrutinized. The recent leaks about the Pegasus spyware have shown just how much interest government and police bodies have in that kind of information - for example, the German police, the Hungarian government and the largest Polish party. Employees of the United States' NSA have abused their position since at least 2013, sharing intercepted nude pictures of others. This likely included both strangers and people they knew.
Unlike the already approved temporary law, this proposal also affects securely encrypted services like Telegram and Signal. These services would be forced to either scan messages on the user's device and send reports from it, or break their otherwise secure encryption. The first option would turn devices into informants spying on their own users; the second would render the encryption entirely useless. Encryption, in principle, is incompatible with this kind of surveillance.
A legal opinion has also shown that mass surveillance of communication, without any prior suspicion, would go against EU citizens' fundamental rights.
Online anonymity would become effectively impossible. And yet, many people's lives heavily depend on it, and would soon end if their identities became known. Despite that, the proposal requires messaging services which can be used by children (i.e. all popular ones) to verify the age - and hence also the identity - of their users, because age verification without identity verification is impossible.
Private files would be searched without suspicion. Cloud services such as Google Drive, iCloud and OneDrive would be obliged to check and possibly also report the files of their customers, even if they haven't shared them with anyone.
Many reports are irrelevant, but can still lead to consequences. AI and other systems used for detecting illegal content produce huge numbers of false reports, which, at best, take up much of investigators' precious time. According to the Swiss police, up to 86% of automated reports are irrelevant. The police may receive pictures of babies and families on vacation, or reports of young people sexting. These young people can also be wrongly investigated - according to German statistics, over 33% of criminal investigations of this type were directed at children.
Criminals are already using secure communication services outside the EU's influence. Whether they run them themselves or use others', criminals and technologically inclined citizens will always have ways to chat away from Big Brother's prying eyes. Ordinary people, whom this law will affect the most, won't.
The majority of child pornography is never shared at all and stays with the abuser. When it is shared, it doesn't originate on mainstream services like Messenger, but on, for example, closed-off dark web forums, where the files are encrypted with a passphrase known only to a few select people. It can take years before such material first appears on a more common messaging service and can be identified. At that point, information about its origin is long gone.
There are other ways of fighting the problem. Above all, they must be proportionate - which intercepting millions of messages to find a few illegal ones isn't:
Education, awareness-raising and empowerment of survivors. Increasing young people's awareness of, and access to, hotlines, institutional reporting (police, social services and other authorities), and support mechanisms, as emphasised by ECPAT International and the WeProtect Global Alliance.
Societal changes. Many organisations recommend increasing investments in social services - especially child protection departments, schools, anti-poverty measures and other victim support services, as well as trauma-informed approaches by police.
Reform of police and other institutions. In Germany, one of the largest caches of abuse imagery ever discovered stayed online for years because the police reported lacking the human resources to take it down. Yet it took journalists investigating the issue just a couple of days to get the content fully removed. Other Member States face similar issues, from the problem of closed institutions in France to the overburdened police and public prosecutors in the Netherlands and Belgium. Structural solutions would ensure that the right authorities have the right resources to tackle the vast numbers of child abuse cases that they are already unable to deal with.
Increasing both EU and national funding to hotlines, ensuring a proper legal basis for their work, and committing funding further in advance. This would boost the capacity and reduce the precariousness of these vital organisations, which already remove vast amounts of CSAM from the Internet quickly and effectively, and also provide support to survivors.
Enforcing existing laws. The 2011 EU Child Sexual Abuse Directive contains many provisions requiring Member States to do more to tackle child sexual abuse on a national level, and worryingly, it has not been implemented fully despite being in force for over 11 years. The newly approved DSA also offers many opportunities to tackle such content.
Prevention. "Abuse will continue if the root causes that allow it to exist in the first place are not challenged". The Centers for Disease Control and Prevention (CDC) advises that "Effective evidence-based strategies are available to proactively protect children from child sexual abuse, but few have been widely disseminated".
Therefore, we, the signatories, ask you to uphold your citizens' basic rights even in the online space and do everything within your power to stop the proposal.