Signal Foundation Warns Against EU’s Plan to Scan Private Messages for CSAM
A controversial proposal by the European Union to scan users’ private messages for child sexual abuse material (CSAM) poses serious risks to end-to-end encryption (E2EE), warned Meredith Whittaker, president of the Signal Foundation, which maintains the privacy-focused messaging app of the same name.
“Mandating mass scanning of private communications fundamentally undermines encryption. Full stop,” Whittaker said in a statement on Monday.
“Whether this happens via tampering with, for instance, an encryption algorithm’s random number generation, or by implementing a key escrow system, or by forcing communications to pass through a surveillance system before they’re encrypted.”
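To make the first failure mode in that list concrete, below is a minimal, purely illustrative Python sketch of how tampered random number generation quietly breaks encryption; the weak_key helper and its timestamp seeding are hypothetical assumptions made for this example, not a description of any real system or proposal.

```python
# Illustrative only: key material derived from a predictable RNG. If an
# encryption algorithm's random number generation is tampered with, an
# observer who knows the seeding scheme can regenerate the key and read
# the traffic, even though the protocol looks unchanged on the wire.
import base64
import random
import time

from cryptography.fernet import Fernet, InvalidToken

def weak_key(timestamp: int) -> bytes:
    # Tampered generator: seeded from a coarse timestamp instead of OS entropy.
    rng = random.Random(timestamp)
    return base64.urlsafe_b64encode(bytes(rng.getrandbits(8) for _ in range(32)))

now = int(time.time())
ciphertext = Fernet(weak_key(now)).encrypt(b"private message")

# An eavesdropper brute-forces the tiny seed space (recent timestamps here)
# and recovers the plaintext without ever "breaking" the cipher itself.
for guess in range(now - 60, now + 1):
    try:
        print(Fernet(weak_key(guess)).decrypt(ciphertext))  # b'private message'
        break
    except InvalidToken:
        continue
```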
The response comes as lawmakers in Europe are putting forth regulation to fight CSAM with a new provision called “upload moderation” that allows messages to be scanned before they are encrypted.
A recent report from Euractiv revealed that audio communications are excluded from the scope of the law and that users must consent to this detection under the service provider’s terms and conditions.
“Users who do not consent can still use parts of the service that do not involve sending visual content and URLs,” it further noted.
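To illustrate what “upload moderation” means architecturally, here is a short, hypothetical Python sketch of pre-encryption scanning. The BLOCKLIST set and scan_then_encrypt function are invented names for this example, and real proposals would rely on perceptual rather than cryptographic hashes so that matches survive re-encoding, but the structural point is the same: the client inspects plaintext before any encryption takes place.

```python
# Hypothetical sketch of client-side "upload moderation": content is checked
# against a blocklist of known-bad digests *before* it is end-to-end
# encrypted, which is precisely where the E2EE guarantee breaks.
import hashlib

from cryptography.fernet import Fernet  # stand-in for the E2EE layer

# Illustrative blocklist (here, the SHA-256 digest of the string "foo").
BLOCKLIST = {"2c26b46b68ffc68ff99b453c1d30413413422d706483bfa0f98a5e886266e7ae"}

def scan_then_encrypt(plaintext: bytes, key: bytes) -> bytes:
    """Scan in the clear, then encrypt. The scan step sees, and could
    report, the plaintext before the cipher is ever applied."""
    digest = hashlib.sha256(plaintext).hexdigest()
    if digest in BLOCKLIST:
        # In a deployed scheme this would trigger a report to the provider,
        # i.e. the pre-encryption surveillance hook at issue.
        raise PermissionError("content matched blocklist; upload refused")
    return Fernet(key).encrypt(plaintext)

key = Fernet.generate_key()
print(scan_then_encrypt(b"hello", key))   # encrypts normally
# scan_then_encrypt(b"foo", key)          # would raise PermissionError
```

However the scan step is labeled, the client must handle the plaintext before encryption, which is why critics treat the design as functionally equivalent to a backdoor.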
Europol, in late April 2024, called on the tech industry and governments to prioritize public safety, warning that security measures like E2EE could prevent law enforcement agencies from accessing problematic content, reigniting an ongoing debate about balancing privacy with the fight against serious crimes.
It also called on platforms to design security systems in such a way that they can still identify and report harmful and illegal activity to law enforcement, without delving into how that should be implemented.
Apple, the iPhone maker, famously announced plans to implement client-side scanning for CSAM, but shelved the effort in late 2022 following sustained pushback from privacy and security advocates.
“Scanning for one type of content, for instance, opens the door for bulk surveillance and could create a desire to search other encrypted messaging systems across content types,” the company said at the time, explaining its decision. It also described the mechanism as a “slippery slope of unintended consequences.”
Signal’s Whittaker further said that calling the approach “upload moderation” is a word game, one that is tantamount to inserting a backdoor (or a front door), effectively creating a security vulnerability ripe for exploitation by malicious actors and nation-state hackers.
“Either end-to-end encryption protects everyone, and enshrines security and privacy, or it’s broken for everyone,” she said. “And breaking end-to-end encryption, particularly at such a geopolitically volatile time, is a disastrous proposition.”