Apple to scan U.S. iPhones for images of child sexual abuse
Apple unveiled plans to scan U.S. iPhones for images of child sexual abuse, drawing applause from child protection groups but raising concern among security researchers that the system could be misused by governments looking to surveil citizens.
Apple said its messaging app will use on-device machine learning to warn about sensitive content without making private communications readable by the company.
Apple says “neuralMatch” will detect known images of child sexual abuse without decrypting people’s messages. If it finds a match, the image will be reviewed by a human who can notify police.
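Apple has not published how neuralMatch works internally, but the broad shape of on-device matching can be sketched. In the toy Python sketch below, every name is hypothetical, and an exact cryptographic hash (SHA-256) stands in for whatever image fingerprinting the real system uses; the point is only that the comparison can run on the phone itself, so unmatched photos never become readable to the company.

    import hashlib

    # Hypothetical hash list of known abuse imagery, shipped to the device
    # as opaque digests so the underlying images are never distributed.
    KNOWN_IMAGE_HASHES = {
        hashlib.sha256(b"bytes-of-a-known-image").hexdigest(),
    }

    def scan_before_upload(image_bytes: bytes) -> bool:
        """Return True if the image should be escalated for human review.

        The check runs on the device against a local copy of the list;
        only flagged matches would ever leave the phone.
        """
        return hashlib.sha256(image_bytes).hexdigest() in KNOWN_IMAGE_HASHES

    print(scan_before_upload(b"bytes-of-a-known-image"))     # True: flag for review
    print(scan_before_upload(b"an-ordinary-holiday-photo"))  # False: stays private

An exact hash like this catches only byte-identical copies; deployed systems instead use perceptual hashes, discussed below, so that edited copies still match.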
But researchers say the tool could be put to other purposes such as government surveillance of dissidents or protesters.
Matthew Green of Johns Hopkins University, a top cryptography researcher, is concerned the tool could be used to frame innocent people by sending them harmless but maliciously crafted images designed to register as matches for child pornography, fooling Apple’s algorithm and alerting law enforcement.
“Researchers have been able to do this pretty easily,” he said.
Tech companies including Microsoft, Google and Facebook have for years shared “hash lists” of known images of child sexual abuse. Apple has also been scanning user files stored in its iCloud service, which is not as securely encrypted as its messages, for such images.
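Those shared lists generally hold perceptual hashes rather than cryptographic ones, so that recompressed or lightly edited copies of a known image still match. A minimal sketch of one simple scheme, the “average hash” (real systems such as Microsoft’s PhotoDNA, and presumably neuralMatch, are far more sophisticated and their details are not public), shows both why this works and why the collision attack Green describes is plausible.

    def average_hash(pixels):
        """Hash an 8x8 grayscale image (64 ints, 0-255) into a 64-bit integer.

        Each bit records whether a pixel is brighter than the image's mean,
        so the hash survives small edits (recompression, mild noise) that
        would completely change a cryptographic hash.
        """
        mean = sum(pixels) / len(pixels)
        bits = 0
        for p in pixels:
            bits = (bits << 1) | (p > mean)
        return bits

    def hamming(a, b):
        """Number of bits on which two hashes differ."""
        return bin(a ^ b).count("1")

    original = [4 * i for i in range(64)]      # a smooth gradient "image"
    recompressed = [p + 2 for p in original]   # same image with slight noise
    crafted = [0] * 32 + [255] * 32            # visually unrelated image

    known = average_hash(original)
    print(hamming(known, average_hash(recompressed)))  # 0: edited copy still matches
    print(hamming(known, average_hash(crafted)))       # 0: harmless image collides

The crafted collision is trivial against this toy hash, and production perceptual hashes resist such attacks far better. But, as Green notes, researchers have found collisions against those too, which is what makes the framing scenario possible.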
Security researchers also say the technology could leave the company vulnerable to political pressure in authoritarian states such as China.
“What happens when the Chinese government says, ‘Here is a list of files that we want you to scan for’?” Green said. “Does Apple say no? I hope they say no.”
The company has been under pressure from governments and law enforcement to allow for surveillance of encrypted data.
“These new safety measures have lifesaving potential for children who are being enticed online and whose horrific images are being circulated in child sexual abuse material,” said John Clark, CEO of the National Center for Missing and Exploited Children.