It's exactly how it seems. If you use WhatsApp, iMessage, Messenger, Gmail, Discord, Skype or Teams, your messages and the images within them are constantly being scanned and analyzed. If an automated, often error-prone system flags them, for any reason, as containing child pornography or grooming, their contents and the identities of the people involved are shared with the authorities.
How did we get here?
Two years ago, a law was proposed by the European Commission. It would allow messaging service providers to automatically screen messages sent through them, for the sake of detecting the distribution of child pornography. If a message is flagged as suspicious, its content is shared with employees and the police for further investigation.
On July 5, 2021, despite much pushback, this law, dubbed ChatControl, was temporarily adopted for the following 3 years. An absolute majority of parliamentarians voted in favour.
After nearly a year of closed-off discussion, on May 11, 2022, a new, permanent law was proposed. This law would make the screening compulsory for all messages, videos, images, e-mails, voice and video calls, or even cloud storage such as iCloud or Google Drive.
Why is this bad? It's about protecting children, right?
First off, protecting children is incredibly important. However, despite its good intentions, the proposal could have massive adverse effects - just like the post office being forced to read everyone's letters without a court order;
The privacy of digital communication would effectively be abolished. Nearly all citizens' communication - including that of journalists, activists and whistleblowers - would constantly be watched and scrutinized. The recent leaks about the Pegasus spyware have also shown just how much interest government and police bodies have in that kind of information - for example, the German police, the Hungarian government and the largest Polish party. Employees of the United States' NSA have abused their position since at least 2013, sharing nude pictures they intercepted - likely including both those of strangers and of people they knew.
Unlike the already approved temporary law, this proposal also covers end-to-end encrypted services such as Signal (and Telegram's secret chats). These services would be forced to either scan messages on the user's device and send reports from it, or break their otherwise secure encryption. The first option would lead to devices spying and telling on their own users; the second would render the encryption entirely useless. Encryption, in principle, is incompatible with this kind of surveillance.
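The incompatibility can be sketched in a few lines. Known-content detection typically compares a fingerprint of each file against a database of fingerprints of known illegal material; once a message is end-to-end encrypted, the server only ever sees ciphertext, so no fingerprint can match. This is a toy illustration only - real systems use perceptual hashes such as PhotoDNA rather than SHA-256, and real encryption is nothing like the stand-in XOR used here:

```python
import hashlib

# Hypothetical database of fingerprints of known illegal files.
# (SHA-256 stands in for a perceptual hash like PhotoDNA.)
KNOWN_FINGERPRINTS = {hashlib.sha256(b"known-bad-bytes").hexdigest()}

def matches_known_content(data: bytes) -> bool:
    """Server-side check: does this data match a known fingerprint?"""
    return hashlib.sha256(data).hexdigest() in KNOWN_FINGERPRINTS

def toy_encrypt(plaintext: bytes, key: bytes) -> bytes:
    """Stand-in 'encryption' (XOR), purely to show that every byte changes."""
    return bytes(p ^ k for p, k in zip(plaintext, key))

message = b"known-bad-bytes"
key = bytes([0x5A]) * len(message)  # fixed toy key for the illustration
ciphertext = toy_encrypt(message, key)

print(matches_known_content(message))     # plaintext: match found
print(matches_known_content(ciphertext))  # ciphertext: the server sees nothing
```

This is why the proposal pushes scanning onto the user's device, before encryption happens - the only place where the plaintext is still visible.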
Online anonymity would become effectively impossible. And yet, many people's lives heavily depend on it, and would soon end if information about their identities was known. Despite that, the proposal suggests messaging services which can be used by children (i.e. all popular ones) must verify the age - hence also the identity - of their users. Age verification without identity verification is impossible.
Private files would be searched without suspicion. Cloud services such as Google Drive, iCloud and OneDrive would be obliged to check and possibly also report the files of their customers, even if they haven't shared them with anyone.
Many reports are irrelevant, but can still lead to consequences. AI and other systems used for detecting illegal content create huge numbers of false reports, which, at best, take up much of investigators' precious time. According to the Swiss police, up to 86% of automated reports are irrelevant. The police may receive pictures of babies and families on vacation, or reports of young people sexting. These young people can themselves end up under criminal investigation - according to German statistics, over 33% of criminal investigations of this type were directed at children.
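The flood of false reports follows from simple base-rate arithmetic: when illegal content is extremely rare, even a seemingly accurate classifier produces mostly false alarms. All the numbers below are illustrative assumptions, not figures from the proposal or from any police statistics:

```python
# Illustrative base-rate arithmetic (all numbers are assumptions).
messages_scanned = 100_000_000   # hypothetical daily message volume
prevalence = 1 / 1_000_000       # assumed share of actually illegal messages
true_positive_rate = 0.99        # assumed detection rate for illegal content
false_positive_rate = 0.001      # assumed: 0.1% of innocent messages flagged

illegal = messages_scanned * prevalence          # 100 illegal messages
innocent = messages_scanned - illegal

true_reports = illegal * true_positive_rate      # ~99 genuine reports
false_reports = innocent * false_positive_rate   # ~100,000 false reports

share_false = false_reports / (true_reports + false_reports)
print(f"{false_reports:,.0f} false reports vs {true_reports:,.0f} true ones")
print(f"{share_false:.1%} of all reports are irrelevant")
```

Under these assumptions, well over 99% of reports concern innocent people - the rarer the target content, the worse the ratio, no matter how good the classifier.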
Criminals are already using safe communication services outside the EU's influence. Whether they run them themselves or use others', criminals and regular technologically inclined citizens will always have ways to chat away from Big Brother's prying eyes. Ordinary people, whom this law will affect the most, won't.
The majority of child pornography is not shared at all and stays with the abuser. When it is shared, it doesn't originate on Messenger - but on, for example, closed-off dark web forums, where the files are encrypted with a passphrase known only to a select few. It can take years before such material first appears on a more common messaging service and can be identified. At that point, information about its origin is long gone.
Well, yeah, alright. But if you're so smart, what do we actually do about it?
First, the proposed solutions must be proportional to the problem. Intercepting millions of messages to find a few illegal ones is not one of those solutions - but there are many ways to improve:
Education, awareness-raising and empowerment of survivors. Increasing young people's awareness of, and access to, hotlines, institutional reporting (police, social services and other authorities) and support mechanisms, as emphasised by ECPAT International and the WeProtect Global Alliance.
Societal changes. Many organisations recommend increasing investments in social services - especially child protection departments, schools, anti-poverty measures and other victim support services, as well as trauma-informed approaches by police.
Reform of police and other institutions. In Germany, one of the largest vaults of abuse imagery ever discovered stayed online for years because police reported not having enough human resources to take it down. Yet it took journalists investigating the issue just a couple of days to fully remove the content. Other Member States face similar issues, from the problem of closed institutions in France, to the overburdened police and public prosecutors in the Netherlands and Belgium. Structural solutions would ensure that the right authorities have the right resources to tackle the vast numbers of child abuse cases that they are already unable to deal with.
Increasing both EU and national funding to hotlines, ensuring a proper legal basis for their work, and committing funding further in advance. This would boost the capacity and reduce the precariousness of these vital organisations who already remove vast amounts of CSAM from the Internet quickly and effectively, and also provide support to survivors.
Enforcing existing laws. The 2011 EU Child Sexual Abuse Directive contains many provisions requiring Member States to do more to tackle child sexual abuse on a national level, and worryingly, it has not been implemented fully despite being in force for over 11 years. The newly approved DSA also offers many opportunities to tackle such content.
Prevention. "Abuse will continue if the root causes that allow it to exist in the first place are not challenged." The Centers for Disease Control and Prevention (CDC) likewise advises that "Effective evidence-based strategies are available to proactively protect children from child sexual abuse, but few have been widely disseminated".
Can I read some more about this?
But of course!
- Prostasia: How the war against Child Abuse Material was lost
- EDRi: Is surveilling children really protecting them? Our concerns on the interim CSAM regulation
- EDRi and other civil society groups: Civil society views on defending privacy while preventing criminal acts
- European data protection supervisor: Opinion on the proposal for temporary derogations from Directive 2002/58/EC for the purpose of combatting child sexual abuse online
- Alexander Hanff, a privacy expert and victim of sexual abuse: Why I don’t support privacy invasive measures to tackle child abuse.
- AccessNow: The fundamental rights concerns at the heart of new EU online content rules
- Tutanota: Strategic autonomy in danger: European Tech companies warn of lowering data protection levels in the EU.
- CEPIS: Europe has a right to secure communication and effective encryption
- Patrick Breyer: Chat Control, a huge inspiration for this article.
Is there any way I can help?
Message or call your representatives. Only 20% voted against the original proposal, mainly members of the Greens/EFA and GUE/NGL groups. Contact representatives in your country, politely talk to them about why you disagree with the proposal and explain its possible consequences. If you have the time, letters, calls and especially personal meetings are much more effective than e-mails, particularly pre-written ones. The official name of the proposal is "Proposal for a Regulation laying down rules to prevent and combat child sexual abuse".
Tell the media and people around you. Right now, the proposal is largely unknown outside a few expert circles. Show a friend the website, write a letter to the local newspaper. If you want to help out further, promotional materials are available here and here.
Sign the petition against the proposal.
How's it going so far?
Update no. 1 - Sep 3, 2022
We collected signatures for the Czech petition against the proposal, protested and lobbied. On August 31, the European affairs committee met and discussed the proposal. The result could have been better, but it could definitely have also been worse. There was a clearly visible effect on the final statement - it highlights privacy issues and explicitly asks for encryption to be protected and respected. It doesn't dismiss the proposal outright though, and the fight is just moving to the whole of the EU. This is also why you're reading this website in English.
You can read the whole untranslated statement here.