Pittsburgh Post-Gazette

DIGGING INTO DEVICES


Apple will start scanning phones for images of child abuse.

Apple has unveiled plans to scan U.S. iPhones for images of child sexual abuse, drawing applause from child protection groups but raising concern among some security researchers that the system could be misused, including by governments looking to surveil their citizens.

The tool designed to detect known images of child sexual abuse, called “NeuralHash,” will scan images before they are uploaded to iCloud. If it finds a match, the image will be reviewed by a human. If child pornography is confirmed, the user’s account will be disabled and the National Center for Missing and Exploited Children notified.

Separately, Apple plans to scan users’ encrypted messages for sexually explicit content as a child safety measure, which also alarmed privacy advocates.

The detection system will only flag images that are already in the center’s database of known child pornography. Parents snapping innocent photos of a child in the bath presumably need not worry. But researchers say the matching tool — which doesn’t “see” such images, just mathematical “fingerprints” that represent them — could be put to more nefarious purposes.
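The fingerprint matching described above can be illustrated with a minimal sketch. Apple’s actual NeuralHash is a perceptual hash produced by a neural network, designed so that visually similar images yield similar fingerprints; the cryptographic hash below is only a stand-in to show the match-a-fingerprint-against-a-database idea, and every name and value in it is hypothetical.

```python
import hashlib

# Hypothetical database of fingerprints of known images.
# Real systems store perceptual hashes, not cryptographic digests.
KNOWN_FINGERPRINTS = {
    hashlib.sha256(b"example-known-image-bytes").hexdigest(),
}

def fingerprint(image_bytes: bytes) -> str:
    """Reduce an image to a fixed-size fingerprint. Only this
    digest is compared; the raw image is never inspected."""
    return hashlib.sha256(image_bytes).hexdigest()

def matches_known(image_bytes: bytes) -> bool:
    """Flag an image only if its fingerprint is already in the database."""
    return fingerprint(image_bytes) in KNOWN_FINGERPRINTS

print(matches_known(b"example-known-image-bytes"))  # True: fingerprint in database
print(matches_known(b"an-unrelated-family-photo"))  # False: no match
```

The gap between this sketch and the real system is also where the researchers’ concern lives: a perceptual hash must tolerate small image changes, which is what makes it possible in principle to craft an innocuous-looking image whose fingerprint collides with a database entry.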

Matthew Green, a top cryptography researcher at Johns Hopkins University, warned that the system could be used to frame innocent people by sending them seemingly innocuous images designed to trigger matches for child pornography. That could fool Apple’s algorithm and alert law enforcement. “Researchers have been able to do this pretty easily,” he said of the ability to trick such systems.

Tech companies including Microsoft, Google, Facebook and others have for years been sharing digital fingerprints of known child sexual abuse images. Apple has used those to scan user files stored in its iCloud service for child pornography. Apple has been under government pressure for years to allow for increased surveillance of encrypted data.

The computer scientist who more than a decade ago invented PhotoDNA, the technology used by law enforcement to identify child pornography online, acknowledged the potential for abuse of Apple’s system but said it was far outweighed by the imperative of battling child sexual abuse.

“Is it possible? Of course. But is it something that I’m concerned about? No,” said Hany Farid, a researcher at the University of California at Berkeley.

Apple was one of the first major companies to embrace “end-to-end” encryption, in which messages are scrambled so that only their senders and recipients can read them. Law enforcement, however, has long pressured the company for access to that information to investigate crimes such as terrorism or child sexual exploitation. Apple said the latest changes will roll out this year as part of updates to its operating software.

“Apple’s expanded protection for children is a game changer,” John Clark, the president and CEO of the National Center for Missing and Exploited Children, said in a statement. “With so many people using Apple products, these new safety measures have lifesaving potential for children.”

Julia Cordua, the CEO of Thorn, said that Apple’s technology balances “the need for privacy with digital safety for children.” Thorn, a nonprofit founded by Demi Moore and Ashton Kutcher, uses technology to help protect children from sexual abuse by identifying victims and working with tech platforms.

Photo: Chip Somodevilla/Getty Images. In July 2020, ParentsTogether set up a “teddy bear sit-in” in front of the Rayburn House Office Building in Washington to demand Amazon founder Jeff Bezos stop hosting child pornography on Amazon Web Services and report it to the proper authorities.
