Chat Control is back on the agenda
The EU’s ominous plan to mass-scan citizens’ private messages is now being pushed through the backdoor: it would amount to mass surveillance on an unprecedented scale
In theory, Chat Control should have been buried last month. The EU’s ominous plan to mass-scan citizens’ private messages was met with overwhelming public resistance in Germany, with the country’s government refusing to approve it. But Brussels rarely retreats merely because the public demands it. And so, true to form, a reworked version of the text is already being pushed forward — this time out of sight, behind closed doors.
Chat Control, formally known as the Child Sexual Abuse Regulation, was first proposed by the European Commission in 2022. The original plan would have made it mandatory for email and messenger providers to scan private, even encrypted, communications — with the purported aim of detecting child sexual abuse material.
The proposal was sold as a noble crusade against some of the world’s most horrific crimes. But critics argued that it risked becoming a blueprint for generalised surveillance, since it would essentially give states and EU institutions the ability to scan every private message.
An open letter signed by 18 of Europe’s leading cybersecurity and privacy academics warned that the latest proposal poses “high risks to society without clear benefits for children”. The first of these risks, in their view, is the expansion of “voluntary” scanning, including automated text analysis using AI to identify ambiguous “grooming” behaviours. This approach, they argue, is deeply flawed: current AI systems are incapable of properly distinguishing between innocent conversation and abusive behaviour. As the experts explain, AI-driven grooming detection risks sweeping vast numbers of normal, private conversations into a dragnet, overwhelming investigators with false positives and exposing intimate communications to third parties.
Digital rights campaigner and former MEP Patrick Breyer further emphasised this danger by noting that no AI can reliably distinguish between innocent flirtation, humorous sarcasm — and criminal grooming. He warned that this amounts to a form of digital witch-hunt, whereby the mere appearance of words like “love” or “meet” in a conversation between family members, partners or friends could trigger intrusive scrutiny. This is not child protection, Breyer has argued, but mass suspicion directed at the entire population.
Even under the existing voluntary regime, German federal police warn that roughly half of all reports received are criminally irrelevant, representing tens of thousands of leaked legal chats annually. According to the Swiss Federal Police, meanwhile, 80% of machine-reported content is not illegal: it might, for example, encompass harmless holiday photos of nude children playing on a beach. The new text would expand these risks dramatically.
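To see why automated scanning at this scale buries investigators in false alarms, a rough back-of-the-envelope calculation helps. The figures in the sketch below (message volume, prevalence of abusive content, detector error rates) are purely illustrative assumptions of mine, not numbers taken from the proposal or from the police statistics cited above:

```python
# Back-of-the-envelope illustration of the base-rate problem behind
# automated scanning. All numbers are assumptions chosen for illustration
# only; none come from the article or from any official report.

messages_scanned = 1_000_000_000   # hypothetical daily message volume
abuse_prevalence = 1 / 1_000_000   # hypothetical share of messages that are actually abusive
true_positive_rate = 0.90          # assumed detector sensitivity
false_positive_rate = 0.001        # assumed 0.1% of innocent messages wrongly flagged

actually_abusive = messages_scanned * abuse_prevalence
innocent = messages_scanned - actually_abusive

true_flags = actually_abusive * true_positive_rate
false_flags = innocent * false_positive_rate
total_flags = true_flags + false_flags

print(f"Flags per day: {total_flags:,.0f}")
print(f"Share of flags that are false alarms: {false_flags / total_flags:.1%}")
# Even with a 0.1% error rate, innocent messages so vastly outnumber
# abusive ones that almost every flag is a false positive.
```

Under these assumed numbers, well over 99% of flagged conversations are innocent, which is the structural reason critics say the figures reported by German and Swiss police are not a teething problem but a mathematical inevitability.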
Further concerns arise from Article 4 of the new compromise proposal, which requires providers to implement “all appropriate risk mitigation measures”. This clause could allow authorities to pressure encrypted messaging services to enable scanning, even if this undermines their core security model. In practice, this could mean requiring providers such as WhatsApp, Signal or Telegram to scan messages on users’ devices before encryption is applied.
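To make concrete what “scanning messages on users’ devices before encryption is applied” means, here is a deliberately simplified sketch. Everything in it (the function names, the exact-hash check, the reporting step) is a hypothetical placeholder, not a description of any provider’s actual system or of the technical measures the proposal would mandate:

```python
# Deliberately simplified sketch of "client-side scanning": content is
# checked on the sender's device *before* end-to-end encryption, so the
# encryption itself is never technically touched. Every name below is a
# hypothetical placeholder.

import hashlib

# Placeholder database of hashes of known illegal material.
KNOWN_HASHES = {hashlib.sha256(b"known-bad-example").hexdigest()}

def matches_known_material(attachment: bytes) -> bool:
    """Exact hash match; real systems use perceptual hashes and AI
    classifiers, which is where the false positives discussed above arise."""
    return hashlib.sha256(attachment).hexdigest() in KNOWN_HASHES

def fake_encrypt(data: bytes) -> bytes:
    """Stand-in for real end-to-end encryption."""
    return data[::-1]  # obviously not real cryptography

def send_message(attachment: bytes) -> bytes:
    if matches_known_material(attachment):
        # The contested step: content is reported off-device in the clear,
        # regardless of whether the transport is end-to-end encrypted.
        print("REPORTED: attachment flagged before encryption")
    return fake_encrypt(attachment)  # only then is the message encrypted and sent

# Example: an innocent photo passes, the flagged bytes get reported first.
send_message(b"holiday-photo")
send_message(b"known-bad-example")
```

The point of the sketch is that the encryption itself is left formally intact; the inspection simply happens before it, which is why critics describe client-side scanning as circumventing rather than breaking end-to-end encryption.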
The Electronic Frontier Foundation has noted that this approach risks creating a permanent security infrastructure, one which could gradually become universal. Meta, Google and Microsoft already scan unencrypted content voluntarily; extending this practice to encrypted content would merely require technical changes. Moreover, what begins as a voluntary option can easily become compulsory in practice, as platforms face reputational, legal and market pressure to “cooperate” with the authorities. Nor does this affect just people in the EU: it affects everyone around the world, including people in the United States. If platforms decide to stay in the EU, they would be forced to scan the conversations of everyone in the bloc. And if you’re not in the EU but you chat with someone who is, then your privacy is compromised too.
Read the full article here. If you’re a paid subscriber and you can’t access the article, write to me at thomasfazi82@gmail.com.
Thanks for reading. Putting out high-quality journalism requires constant research, most of which goes unpaid, so if you appreciate my writing, please consider upgrading to a paid subscription if you haven’t already. Aside from a fuzzy feeling inside of you, you’ll get access to exclusive articles and commentary.
Thomas Fazi
Website: thomasfazi.net
Twitter: @battleforeurope
Latest book: The Covid Consensus: The Global Assault on Democracy and the Poor—A Critique from the Left (co-authored with Toby Green)



Why is it that euro policy ideas dovetail so nicely with the intel agencies? Is this how the power structure works? Are the intel agencies viewed as god by politicians and those involved in directing affairs in the EU, to the point that they have to bend over and give them everything they wish?
Of course, the concept of privacy never meant much to a lot of these same people... that is fairly clear.
And it's not a coincidence the same EU is busy pushing to wage a war that few citizens of European countries wish to wage.
So with this mass surveillance tool, they can target opponents and a) subject them to blackmail, or b) retroactively manufacture some kind of offense or crime to take them down.
The best way they can model the behavior they claim they want is to immediately:
- begin recording and archiving the electronic messages of all EU officials
- begin recording and publishing, on a daily basis, full transcripts and recordings of all contacts, meetings and discussions between EU officials and lobbyists or representatives of business and interest groups.
There we will find crimes already - if they actually want to find crime.