Hartford Courant

Apple tool will flag sex abuse

Scan of U.S. iPhones for images of exploited kids draws praise, concerns about surveillance

By Barbara Ortutay and Frank Bajak

Apple has unveiled plans to scan U.S. iPhones for images of child sexual abuse, drawing applause from child protection groups but raising concern among some security researchers that the system could be misused by governments looking to surveil their citizens.

Apple said its messaging app will use on-device machine learning to warn about sensitive content without making private communications readable by the company. The tool Apple calls “neuralMatch” will detect known images of child sexual abuse without decrypting people’s messages. If it finds a match, the image will be reviewed by a human who can notify law enforcement if necessary.

But researchers say the tool could be put to other purposes such as government surveillance of dissidents or protesters.

Matthew Green of Johns Hopkins, a top cryptography researcher, was concerned that the system could be used to frame innocent people by sending them harmless but malicious images designed to trigger matches for child pornography, fooling Apple’s algorithm and alerting law enforcement. “Researchers have been able to do this pretty easily,” he said.

Tech companies including Microsoft, Google, Facebook and others have for years been sharing “hash lists” of known images of child sexual abuse. Apple has also been scanning user files stored in its iCloud service, which is not as securely encrypted as its messages, for such images.

Some say this technology could leave the company vulnerable to political pressure in authoritarian states such as China. “What happens when the Chinese government says, ‘Here is a list of files that we want you to scan for,’ ” Green said. “Does Apple say no? I hope they say no, but their technology won’t say no.”

The company has been under pressure from governments and law enforcement to allow for surveillance of encrypted data. Coming up with the security measures required Apple to perform a delicate balancing act between cracking down on the exploitation of children and keeping its high-profile commitment to protecting the privacy of its users.

The computer scientist who more than a decade ago invented PhotoDNA, the technology used by law enforcement to identify child pornography online, acknowledged the potential for abuse of Apple’s system but said it was far outweighed by the imperative of battling child sexual abuse.

“Is it possible? Of course. But is it something that I’m concerned about? No,” said Hany Farid, a researcher at the University of California at Berkeley, who argues that plenty of other programs designed to secure devices from various threats haven’t seen “this type of mission creep.” For example, WhatsApp provides users with end-to-end encryption to protect their privacy, but employs a system for detecting malware and warning users not to click on harmful links.

Separately, Apple said its messaging app will use on-device machine learning to identify and blur sexually explicit photos on children’s phones and can also warn the parents of younger children via text message. It also said that its software would “intervene” when users try to search for topics related to child sexual abuse.

In order to receive the warnings about sexually explicit images on their children’s devices, parents will have to enroll their child’s phone. Kids over 13 can unenroll, meaning parents of teenagers won’t get notifications.

Apple said neither feature would compromise the security of private communications or notify police.

“Apple’s expanded protection for children is a game changer,” John Clark, the president and CEO of the National Center for Missing and Exploited Children, said in a statement. These “new safety measures have lifesaving potential for children who are being enticed online and whose horrific images are being circulated in child sexual abuse material.”

A feature planned for Apple iPhones will warn parents if their children send or receive sexually explicit images in text messages. (Apple)
