EU privacy reform could impede efforts to combat child abuse

A directive set to take effect in the European Union on Dec. 20 will curb image and text-scanning tools used by big tech platforms. Critics of the tools say they infringe on privacy rights, but others worry the ban will impede efforts to tackle online child abuse.

Yves Herman/Reuters
European Union flags flutter in front of the European Commission headquarters in Brussels, Belgium, Oct. 2, 2019. Automatic detection tools that have proven highly effective in tackling online child abuse will be banned under a new directive set to take effect Dec. 20.

Online child sexual abuse could become harder to detect due to privacy protections set to take effect in the European Union next month – putting millions of children at increased risk worldwide, critics of the proposals have warned.

Under the changes, big tech firms like Facebook and Microsoft would be banned from using automatic detection tools that are routinely employed to identify material containing images of child abuse, or to detect online grooming.

Opponents of the tools say such automatic scanning infringes the privacy of people using chat and messaging apps, but the looming ban has drawn strong criticism around the world – from children’s rights advocates to American actor and tech investor Ashton Kutcher.

“Time is running out to ensure proactive and voluntary online child abuse detection methods are preserved in the EU,” Mr. Kutcher wrote on Twitter earlier this month as European lawmakers (MEPs) prepared to vote on the new directive.

Critics say the reform, set to come into force on Dec. 20, would prevent law enforcement and child protection agencies from identifying millions of child sexual abuse cases each year – not just in the 27-member EU, but globally.

“Online child sexual abuse is a borderless crime,” Chloe Setter, head of policy at WePROTECT Global Alliance – a nonprofit that fights child exploitation – told the Thomson Reuters Foundation.

“Europe is already host to the vast majority of known child sexual abuse material on the internet, but victims and perpetrators can be anywhere. The restriction of automated detection tools in Europe would have major implications for children globally,” she said.

Sex offenders in European countries use social media platforms to contact children around the world with the aim of grooming them, said Dorothea Czarnecki, vice chair of ECPAT Germany, an alliance of 28 children’s rights institutions.

Some use translation apps to communicate with victims in countries as far afield as Vietnam, she said.

Opponents of the new directive, called the European Electronic Communications Code, also fear that banning detection tools in Europe could prompt tech firms to stop using them elsewhere, because they have global teams to moderate content.

“If a company in the EU stops using this technology overnight, they would stop using it all over the world,” said Emilio Puccio, coordinator of the European Parliament Intergroup on Children’s Rights.

Highly effective

The tools have proven highly effective at tackling online abuse, and tech companies supply about two-thirds of the child sexual abuse reports that law enforcement authorities receive, children’s rights campaigners say.

Facebook, Microsoft, and Google did not respond to requests for comment.

In 2019, the United States-based nonprofit National Center for Missing & Exploited Children received 16.9 million reports from technology companies related to suspected online child sexual exploitation.

If the directive is approved, the ban would cover anti-grooming tools used to detect suspicious activity and “classifier” tools, which help identify pictures and videos that are not already in a database of illegal content.

Left-leaning members of the European Parliament, headed by German Socialist Birgit Sippel, led the push to ban the use of automatic scanning, arguing that the way the tools are currently used violates privacy and data protection rights.

They were particularly concerned that users of chat and other communication platforms could have the content of private conversations analyzed.

“Even voluntary measures by private companies constitute an interference with those rights when the measures involve monitoring and analysis of content of communications and processing of personal data,” their draft proposals said.

Ms. Sippel could not be reached for comment, but those in favor of keeping automatic scanning say privacy fears are unfounded.

“Their sole purpose is to identify and flag abuse of children, not to read or spy into private communications,” Ms. Setter said.

Anti-grooming technology uses the same mechanisms as spam or malware filters, so it poses no greater threat to privacy, said MEP Hilde Vautmans.

“We use these technologies to protect our computers, and we should be able to continue to use the same technologies to protect our own children from sexual abuse,” she said.

This story was reported by The Thomson Reuters Foundation.
