Deutsche Welle (English edition)

US: Apple delays rollout of iPhone child abuse scanning tool

The feature had been intended to scan for images of child sexual abuse. But it quickly drew concern over potential misuse as a "backdoor" for hacking and surveillance.


Apple on Friday announced an indefinite delay of plans to scan iPhones in the US for images of child sex abuse, following an outcry over potential exploitation of the tool for unlawful surveillance and hacking.

What was Apple's photo scanning plan?

The tool, introduced last month, would have scanned files to identify images of child sex abuse before they were uploaded to the company's iCloud storage service.

Apple had also planned to introduce a separate function, which would have scanned users' encrypted messages for sexually explicit content.

Dubbed "NeuralHash," the system was designed to catch images of child sex abuse that have either been edited or are similar to ones known to law enforcement.

Apple said it would have limited access to the flagged images to the National Center for Missing and Exploited Children.

Why did Apple change track?

Apple announced the postponement Friday in an update posted above its original photo scanning plans.

"Based on feedback from customers, advocacy groups, researchers, and others, we have decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features," the update said.

Matthew Green, a top cryptography researcher at Johns Hopkins University who had criticized the plan, told the AP news agency that he supported the delay.

"You need to build support before you launch something like this," Green said. "This was a big escalation from scanning almost nothing to scanning private files."

Green had been among the experts last month who warned that the NeuralHash scanning system could be used for nefarious purposes.

For example, innocent people could be framed by being sent seemingly innocuous images designed to trigger matches for child pornography. Green said such images would be enough to fool the system and alert law enforcement.

A 'misunderstanding'?

Apple has based its brand on ensuring personal privacy, and has traditionally rejected demands for access to user data from governments.

"We can see that it's been widely misunderstood," said Craig Federighi, Apple's senior vice president of software engineering, after the plan was unveiled last month and met almost instant criticism.

"We wanted to be able to spot such photos in the cloud without looking at people's photos," he said.

wmr/fb (AP, AFP)

Photo caption: Apple has built its brand on ensuring user privacy
