This article contains references to the sexual abuse and exploitation of children.
Eleven-year-old Iris was lonely and looking for friends when she began chatting to a man online. She thought this was finally someone she could rely on. So when he threatened to cut off contact unless she sent him a picture of herself naked, Iris eventually agreed — afraid of feeling alone again.
The man said he would send the photos to Iris’ family if she refused to keep sending more.
“Three years of anything he wanted, whenever he wanted,” Iris, whose name was changed to protect her identity, said when she eventually shared her story of sexual extortion and online abuse.
Cracking down on child abuse
Her case and others are documented on sheets of paper laid out on school desks outside the European Union’s headquarters in Brussels.
It’s part of a rally by children’s rights campaigners, who are furious that the bloc’s member states have once again delayed decisions on controversial online protection laws which would force tech companies to scan images, videos and links for evidence of child sexual abuse.
“In these three years of delay — that’s over 1,200 days of negotiations — a lot of children have fallen into the hands of perpetrators,” Fabiola Bas Palomares, who leads policy work at the campaign group Eurochild, told DW.
Around two-thirds of all child sexual abuse webpages detected by the Internet Watch Foundation last year were traced to an EU country, and globally more than 60 million pictures and videos linked to the sexual exploitation of minors were flagged online.
Privacy at risk?
If you scroll through TikTok or do a quick Google search, you’ll quickly find claims that the EU is about to start reading your texts. That’s not the case — but Brussels’ proposal does break with the bloc’s current privacy norms.
Germany is among the EU states refusing to back the planned laws.
“Private communication must never be subject to general suspicion. Nor may the state force messenger services to scan messages massively for suspicious content before they are sent,” German Justice Minister Stefanie Hubig said in a statement on October 8.
Under the latest proposal penned by current EU chair Denmark, tech firms deemed high risk could be ordered to scan all links, images and videos — though not texts — shared on their platforms, and to report instances of suspected child sexual abuse material (CSAM) to law enforcement.
Encrypted content in the spotlight
Most controversially, the rules would also apply to content shared on messengers such as WhatsApp, which use encryption — a technical promise that your message won’t be seen by anyone other than the person it’s intended for.
The EU says that the measures are essential to catch predators who exploit digital environments. When Facebook parent company Meta began encrypting some messages in 2023, it flagged 6.9 million fewer cases of suspected online child exploitation to US watchdogs than the previous year — though it remained the top incident reporter.
Now, the encrypted messaging platform Signal has threatened to quit the EU market, claiming that the latest EU proposals would amount to “mass surveillance.”
‘Not a trade-off we can make’
While privacy campaigner Ella Jakubowska usually spends her days trying to rein in big tech’s power in Europe, this time she finds herself arguing alongside the companies she normally scrutinizes.
“This is an unprecedentedly undemocratic proposal,” Jakubowska, who heads up policy work at the nonprofit organization European Digital Rights, told DW.
“It is going to undermine vital digital security protections that all of us rely on, day in, day out. And that is just not a trade-off that we can make, even if it’s in service of something that is a really important aim,” she added.
Jakubowska says she’s tired of privacy and child protection being seen as an “either-or” choice.
And that’s a view echoed by Dorothée Hahne, a 59-year-old who coordinates a group of German childhood abuse survivors actively campaigning against the EU proposals. Hahne fears that people would feel unsafe sharing their stories or seeking help online if the so-called chat control laws were ever approved.
“It would be like having a police officer standing next to you during a therapy session,” she told DW over the phone.
Tech troubles
And that’s a concern that keeps coming up: that legitimate or harmless communication, such as parents sharing pictures of their children at the beach, could end up being flagged to law enforcement.
Experts are split on whether the EU’s plan is technically feasible. Officials working on the latest draft insist that detection mechanisms similar to those envisioned for spotting child sexual abuse material are already in widespread use against malware.
“The technology, the answers, are there,” Swedish center-left parliamentarian Evin Incir told reporters at Monday’s children’s rights rally.
But Bart Preneel, a cryptographer based at the Catholic University of Leuven, is not convinced. He’s among some 800 scientists and researchers who penned an open letter speaking out against the laws earlier this month.
“It’s already very hard for humans to distinguish between CSAM and legitimate content,” Preneel told DW over a video call. “We’re very skeptical that AI can learn this,” he said, adding: “We believe that there is no technology — and there will be no technology in the next 10 years — that can do this.”
Preneel also warns that the technology that platforms would use to scan files could leave users and authorities more vulnerable to hackers.
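The kind of detection mechanism the officials allude to is, in its classical form, a hash-matching check: a file is fingerprinted and compared against a list of fingerprints of known illegal material. A minimal sketch follows; the function name, file contents and hash list are entirely hypothetical, and real deployments (such as Microsoft's PhotoDNA) use perceptual hashes that survive resizing and re-encoding, unlike the exact cryptographic match shown here for simplicity.

```python
import hashlib

# Hypothetical block list of SHA-256 fingerprints of known prohibited files.
# In practice such lists are maintained by hotlines and law enforcement.
KNOWN_HASHES = {
    hashlib.sha256(b"example-known-file").hexdigest(),
}

def matches_known_content(file_bytes: bytes) -> bool:
    """Return True if the file's fingerprint appears on the block list.

    An exact cryptographic hash, used here for illustration, is defeated
    by changing a single byte; production systems therefore rely on
    perceptual hashing, which is where the accuracy debate cited by
    critics like Preneel comes in.
    """
    return hashlib.sha256(file_bytes).hexdigest() in KNOWN_HASHES

# A byte-identical copy matches; a slightly altered copy does not.
assert matches_known_content(b"example-known-file")
assert not matches_known_content(b"example-known-file!")
```

The sketch also illustrates the encryption dispute: on an end-to-end encrypted service, the platform never sees `file_bytes` in the clear, so a check like this would have to run on the user's own device before encryption — the "client-side scanning" that critics describe as a form of mass surveillance.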
Legal vacuum looms
Children’s rights campaigner Fabiola Palomares is also fed up with the framing of “privacy vs. child protection” as a binary choice.
“If we don’t have technology that is safe enough, privacy-preserving enough, let’s focus on developing that technology instead of questioning the legal framework behind it,” she told DW.
“Without a legal incentive to do it, [tech companies] will not invest in developing the technology and developing the capacity that they need to actually proactively look for this,” she said.
“If we don’t pass a regulation that covers the end-to-end encrypted environment, the burden will be once again put on the victim – will be put on the child – to be able to report something, instead of being on the platform that is facilitating, one way or another, that crime,” she said.
And Palomares also warns that a legal vacuum may be on the horizon for Europe.
For now, tech firms can voluntarily search for and flag suspicious content thanks to a derogation in broader EU privacy rules. With that exception set to expire next April, pressure on member states to reach a decision is mounting.
Now what for Europe?
EU member Denmark, which holds the bloc’s rotating presidency until 2026, is in favor of stricter laws and now faces the challenge of gathering enough support among more skeptical states such as Germany.
Its attempts to cater to privacy concerns by narrowing the categories of content platforms would have to scan have so far failed to bring Berlin on board, but Copenhagen has vowed to keep trying.
“We will continue, of course, the ongoing and constructive negotiations towards a sustainable compromise,” Danish Justice Minister Peter Hummelgaard told reporters on Tuesday.
“At the end of the day, this is also a discussion on how are we able to regulate and oblige private platforms, private companies, to ensure that they also take responsibility of larger societal concerns,” he added.
Even if the laws eventually garner enough backing, they would likely take years to kick in — with parliamentary negotiations set as a next step in the process.
Edited by: Rob Mudge