Despite three years of debate, the EU is bringing back Chat Control. Researchers denounce intrusive monitoring of communications and the weakening of end-to-end encryption.
The Council of the European Union is once again debating the Chat Control project, a law aimed at scanning communications and images to fight child sexual abuse. Presented as a compromise, the latest Danish draft still preserves client-side scanning and the option to extend surveillance to text and audio messages. Researchers like Bart Preneel (KU Leuven) denounce this as a serious attack on end-to-end encryption and fundamental rights. After four open letters and several compromise attempts, the project remains contested. The October 15 vote will decide whether the EU chooses privacy protection or preventive surveillance.
A contested return despite compromises
Chat Control resurfaced in 2024 after a brief retreat. The proposal would force messaging apps to enable scanning on the sender’s device, blocking the transmission of known CSAM (Child Sexual Abuse Material). But this also means systematic analysis of private images, including ordinary family photos.
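To make the mechanism concrete, here is a minimal sketch of what hash-based scanning on the sender's device amounts to. Everything in it is a hypothetical stand-in: the blocklist, the image bytes, and the use of a cryptographic hash (deployed systems such as PhotoDNA use perceptual hashes, so that near-duplicates of a known image also match).

```python
import hashlib

# Hypothetical blocklist of known-bad image fingerprints. In a real
# deployment this would be a database of perceptual hashes of known CSAM,
# distributed to every device; SHA-256 is used here only for illustration.
KNOWN_BAD_HASHES = {
    hashlib.sha256(b"known-bad-image-bytes").hexdigest(),
}

def client_side_scan(image_bytes: bytes) -> bool:
    """Return True if the image may be sent, False if it is blocked.

    The check runs on the sender's device, before end-to-end encryption
    is applied. That ordering is precisely what critics object to: the
    content is inspected while it is still in the clear.
    """
    digest = hashlib.sha256(image_bytes).hexdigest()
    return digest not in KNOWN_BAD_HASHES

# An ordinary photo passes; a byte-identical copy of a blocklisted image
# is stopped before it is ever encrypted or transmitted.
assert client_side_scan(b"ordinary-family-photo") is True
assert client_side_scan(b"known-bad-image-bytes") is False
```

The sketch also shows why the "sealed envelope" metaphor breaks down: the inspection happens before the envelope is sealed, regardless of how strong the seal is.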
Under the Belgian and then Hungarian presidencies, several compromises circulated. Each time, researchers responded with an open letter. In 2021, more than 150 signatories raised concerns. In May 2024, a second letter brought together 250 experts. In Fall 2024, a new version claimed to soften the project. But Preneel denounced these as “fictional measures” that left the surveillance logic untouched.
For him, end-to-end encryption, comparable to a sealed envelope, loses all meaning if an algorithm inspects the message before sending. “It’s like a postal company installing a camera in your home to check what you put in the envelope,” he told the Belgian weekly Le Vif.

Technological and legal flaws
The latest Danish proposal of July 2025 temporarily excludes text and audio messages, focusing on images and URLs. But the text explicitly allows adding these formats later. For researchers, this is a maneuver to defuse criticism without changing the substance.
Another new element is the category of “high-risk services,” which are subject to detection obligations. It covers the main encrypted messaging apps used daily by hundreds of millions of Europeans. The system therefore targets not marginal services but mainstream communication channels.
The return of AI for detecting new images also draws criticism. Previously dropped in a 2024 compromise, the mechanism has reappeared. Preneel calls it technologically uncertain and likely to produce a flood of false positives, misclassifying medical or family photos as suspicious. Law enforcement would be overwhelmed, while thousands of innocent users would be flagged.
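The "flood of false positives" argument is a matter of base rates, and simple arithmetic makes the point. The figures below are assumptions chosen for illustration, not numbers from the proposal or from Preneel:

```python
# Illustrative base-rate arithmetic. Both numbers are assumptions:
# a plausible daily image volume across EU messaging services, and a
# generously low false-positive rate for an AI classifier.
images_scanned_per_day = 1_000_000_000   # assumed: 1 billion images/day
false_positive_rate = 0.001              # assumed: 0.1% of benign images misflagged

false_alarms_per_day = images_scanned_per_day * false_positive_rate
print(f"{false_alarms_per_day:,.0f} wrongly flagged images per day")
```

Even with a 99.9% accurate classifier, the assumed volume yields on the order of a million wrongly flagged images every day, which is the overwhelmed-law-enforcement scenario the researchers describe.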
Finally, mandatory age verification is highlighted as a source of complexity and vulnerability. It would compromise anonymity, open the door to censorship, and be easily bypassed with VPNs or third-party accounts, while weakening security systems.
A democratic risk
Beyond technical aspects, Preneel warns of a slippery slope. Chat Control starts with child abuse, but once the technology is in place, nothing prevents its expansion. “Today it’s CSAM. Tomorrow it will be terrorism. The day after, political dissidents,” he warns.
He recalls that in 2021 Apple attempted to introduce a similar client-side scanning system, then abandoned it after judging the implementation too complex and risky. “If this intent is maintained, the legislation will miss its goal,” he concludes.
For the signatories of the open letters, the real answer lies elsewhere: education, prevention, victim support, and stronger funding for specialized centers. Generalized scanning, they argue, will not stop abuse at its source.
Despite three years of negotiations, the project still preserves client-side scanning, which renders encryption meaningless. The central question remains: can Europe reconcile child protection with unbreakable encryption, without tipping into mass surveillance? [ZATAZ News English version]