2 Visions Clash Over How to Fight Online Child Abuse in Europe
Except on Tuesdays, when she’s in the Dutch Senate, Arda Gerkens spends her time helping tech companies delete child sexual abuse material buried in their platforms. For seven years, the senator has run the nonprofit foundation Online Child Abuse Expert Agency, known by its Dutch acronym EOKM. With her 20-person team, Gerkens offers judgment-free advice and free automated detection tools to businesses, from image-hosting to file-hosting sites. Most companies want to keep their networks clean, she says. “They just need help.”
Lawmakers in the European Union, however, have lost patience with this coaxing approach and say platforms have failed to tackle this problem voluntarily. This week, the European Commission’s home affairs department put forward a new proposal that would enable courts to force tech companies to scan their users’ images, videos, and texts in search of child abuse or grooming. But the proposed law has no exemptions, meaning encrypted services like WhatsApp and Telegram could be forced to scan their users' private messages.
Companies in the Netherlands host more child sexual abuse content than any other EU country. But Gerkens thinks the Commission’s proposal goes too far. She likes the idea of a central European Center to coordinate the crackdown. But she’s worried that scanning any platform for text would risk too many posts being flagged by mistake and that forcing encrypted services to scan private messages would compromise the security of some of the most secure spaces on the internet.
Encrypted messengers protect children as well as adults, she says. Every year, EOKM’s help line receives several pleas from minors who have been blackmailed into creating and sending explicit images after their non-encrypted social media accounts were hacked. Gerkens is worried that breaking encryption would make these cases more common. “[If] you have a backdoor into encryption, it works both ways,” she says.
The debate over encrypted spaces exposes a deep rift in Europe about how to crack down on a problem that is only getting worse. Every year investigators find more child sexual abuse material online than the year before. From 2020 to 2021, the British nonprofit Internet Watch Foundation recorded a more than 60 percent increase in this type of content. The urgent need to address this growing problem has created extra tension in what is already a bitter debate that hinges on one question: Is it disproportionate to scan everybody’s private messages to root out child sexual abuse?
“If you want to search somebody's house as a police officer, you can't just go and do that willy-nilly; you need good grounds to suspect [them], and in the online environment it should be exactly the same,” says Ella Jakubowska, a policy adviser at the Brussels-based digital rights group European Digital Rights.
Others see scanning tools differently. This technology operates more like a police dog in an airport, argues Yiota Souras, senior vice president and general counsel at the US National Center for Missing and Exploited Children. “That dog is not learning about what I have in my suitcase or communicating that in any way. It is alerting if it smells a bomb or drugs.”
Encrypted messenger services have been quick to condemn the Commission’s proposal. Julia Weiss, a spokesperson for the Swiss messenger app Threema, says the company is not willing to undermine its users’ privacy in any way. “Building a surveillance system to proactively scan all private content was a terrible idea when Apple proposed it, and it's a terrible idea now,” Will Cathcart, head of WhatsApp, said in a Twitter post. In August 2021, Apple announced a plan to scan its users’ photos for child sexual abuse material but, after intense criticism, indefinitely delayed those plans a month later.
But Europe’s home affairs commissioner Ylva Johansson has been dogged in her pursuit of this law. “I'm prepared to hear criticism from companies, because detecting child sex abuse material and protecting children is maybe not profitable, but it's necessary,” she said in a press conference Wednesday. Tools used to carry out any scanning have to be the least privacy-intrusive technology and they have to be chosen in consultation with data protection authorities, she added.
Johansson's proposal does not define what type of technology these companies should use to scan messages. The reason for this, the commissioner says, is so the legislation does not go out of date as new privacy-friendly solutions are invented. Her supporters say the law will also incentivize companies to dedicate more resources to creating the tools they will later be mandated to use. “I am more and more confident that if the environment is correct and if there is a normative legal framework that will protect children and adolescents, then companies and solutions can be created and generated that can eliminate this crisis,” says Paul Zeitz, executive coordinator of Brave Movement, a group that represents survivors of childhood sexual violence.
But privacy groups say this approach means basing legislation on impossible technology. “It doesn't matter how many times Commissioner Johansson says in public that you can scan encrypted messages safely and with full respect for privacy,” says Jakubowska. “That doesn't make it true.”
The regulation still needs sign-off from the European Parliament and EU member states, which could take years. But critics, including Germany’s federal commissioner for data protection, Ulrich Kelber, have called for the current proposal to be stopped. “Since some points will result in solutions that deeply interfere with fundamental rights, the regulation should under no circumstances endure in this form,” he said on Thursday.
Yet Johansson remains unperturbed. In an interview with WIRED, she describes the fight against child sexual abuse as a cause that feels very personal. “As a mother, I feel obliged to protect my children,” she says. “As an adult, I'm obliged to protect all children. And as a politician, when I have the power to propose legislation to protect children, I think I'm morally obliged to propose that legislation.”
Other members of the European Parliament have accused Johansson of bringing an emotional intensity to the debate that has made it difficult to question details in the law without being made to feel they don’t care about children suffering abuse.
However, the commissioner can claim supporters among survivors of child sexual abuse, who say they are impressed by her strong rhetoric and plain language around subjects that still feel taboo.
“It feels very good when you're a survivor to have a political leader, who is very powerful, talk about shame, talk about trauma, talk about the impact of child sexual abuse,” says Mié Kohiyama, a French survivor of child sexual abuse who is also part of Brave Movement, which was set up earlier this year. “It's so important for us.”
Source: Wired