Houston Chronicle Sunday

A CHIP IN ARMOR

Apple’s plan against child porn called Pandora’s box for privacy.

By Jack Nicas

Apple recently unveiled a plan founded in good intentions: Root out images of child sexual abuse from iPhones.

But as is often the case when changes are made to digital privacy and security, technology experts quickly identified the downside: Apple’s approach to scanning people’s private photos could give law enforcement authorities and governments a new way to surveil citizens and persecute dissidents. Once one chip in privacy armor is identified, anyone can attack it, they argued.

The conflicting concerns laid bare an intractable issue that the tech industry seems no closer to solving today than when Apple first fought with the FBI over a dead terrorist’s iPhone five years ago.

The technology that protects the ordinary person’s privacy can also hamstring criminal investigations. But the alternative, according to privacy groups and many security experts, would be worse.

“Once you create that back door, it will be used by people whom you don’t want to use it,” said Eva Galperin, online security director at the Electronic Frontier Foundation, a digital rights group. “That is not a theoretical harm. That is a harm we’ve seen happen time and time again.”

Apple was not expecting such backlash. When the company announced the changes, it sent reporters complex technical explainers and laudatory statements from child safety groups, computer scientists and former U.S. Attorney General Eric Holder. After the news went public, an Apple spokesperson emailed a reporter a tweet from actor Ashton Kutcher, who helped found a group that fights child sexual abuse, cheering the moves.

But his voice was largely drowned out. Online security experts, the head of messaging app WhatsApp and Edward Snowden, the former intelligence contractor who leaked classified documents about government surveillance, all denounced the move as setting a dangerous precedent that could enable governments to look into people’s private phones. Apple scheduled four more news briefings to combat what it said were misunderstandings, admitted it had bungled its messaging and announced new safeguards meant to address some concerns. More than 8,000 people responded with an open letter calling on Apple to halt its moves.

As of now, Apple has said it is going forward with the plans. But the company is in a precarious position. It has for years worked to make iPhones more secure, and in turn has made privacy central to its marketing pitch. But what has been good for business also turned out to be bad for abused children.

A few years ago, the National Center for Missing and Exploited Children began disclosing how often tech companies reported cases of child pornography on their products. Apple was near the bottom of the pack. The company reported 265 cases to authorities last year, compared with Facebook’s 20.3 million. That gap was largely a result of Apple’s electing not to look for such images, in order to protect the privacy of its users.

In late 2019, after reports in the New York Times about the proliferation of child sexual abuse images online, members of Congress told Apple that it had better do more to help law enforcement officials, or they would force the company to do so. Eighteen months later, Apple announced that it had figured out a way to tackle the problem on iPhones while, in its view, protecting the privacy of its users.

The plan included modifying its virtual assistant, Siri, to direct people who ask about child sexual abuse to appropriate resources. Apple said it would also soon enable parents to turn on technology that scans images in their children’s text messages for nudity. Children 13 and older would be warned before sending or viewing a nude photo, while parents could ask to be notified if children younger than 13 did so.

Those changes were met with little controversy when compared with Apple’s third new tool: software that scans users’ iPhone photos and compares them against a database of known child sexual abuse images.

To prevent false positives and hide the images of abuse, Apple took a complex approach. Its software reduces each photo to a unique set of numbers — a sort of image fingerprint called a hash — and then runs them against hashes of known images of child abuse provided by groups such as the National Center for Missing and Exploited Children.

If 30 or more of a user’s photos appear to match the abuse images, an Apple employee reviews the matches. If any of the photos show child sexual abuse, Apple sends them to authorities and locks the user’s account. Apple said it would turn on the feature in the U.S. over the next several months.
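Apple has not published that pipeline as code, but the basic idea described above (fingerprint each photo, compare the fingerprints against a database of known hashes, and act only once a threshold is crossed) can be sketched roughly as follows. This is a simplified illustration under stated assumptions, not Apple’s implementation: the function names are hypothetical, and a standard cryptographic hash stands in for Apple’s perceptual hashing, which is built to match near-duplicate images rather than byte-identical files.

```python
import hashlib
from pathlib import Path

# Illustrative sketch only. Apple's system uses a perceptual hash computed
# on-device plus cryptographic matching protocols; here an exact SHA-256
# digest stands in for the image fingerprint to show the threshold idea.

MATCH_THRESHOLD = 30  # per the article, review is triggered at 30 or more matches


def image_fingerprint(path: Path) -> str:
    """Reduce a photo file to a fixed-size fingerprint (a hash)."""
    return hashlib.sha256(path.read_bytes()).hexdigest()


def count_matches(photo_paths: list[Path], known_abuse_hashes: set[str]) -> int:
    """Count how many of a user's photos match the known-image hash database."""
    return sum(1 for p in photo_paths if image_fingerprint(p) in known_abuse_hashes)


def needs_human_review(photo_paths: list[Path], known_abuse_hashes: set[str]) -> bool:
    """Only at or above the threshold would matches be surfaced for review."""
    return count_matches(photo_paths, known_abuse_hashes) >= MATCH_THRESHOLD
```

In the real system the comparison is perceptual, so a resized or slightly edited copy of a known image would still match, which an exact hash like the one above would miss.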

Law enforcement officials, child safety groups, abuse survivors and some computer scientists praised the moves. In statements provided by Apple, the president of the National Center for Missing and Exploited Children called it a “game changer,” while David Forsyth, chair of computer science at the University of Illinois at Urbana-Champaign, said the technology would catch child abusers and that “harmless users should experience minimal to no loss of privacy.”

To many technologists, Apple has opened a Pandora’s box. The tool would be the first technology built into a phone’s operating system that can look at a person’s private data and report it to law enforcement. Privacy groups and security experts are worried that governments looking for criminals, opponents or other targets could find plenty of ways to use such a system.

“As we now understand it, I’m not so worried about Apple’s specific implementation being abused,” said Alex Stamos, a Stanford University researcher who previously led Facebook’s online security efforts. “The problem is, they’ve now opened the door to a class of surveillance that was never open before.”

If governments had asked Apple to analyze people’s photos, the company could have responded that it could not. Now that it has built a system that can, Apple must argue that it will not.

“I think Apple has clearly tried to do this as responsibly as possible, but the fact they’re doing it at all is the problem,” said Galperin of the Electronic Frontier Foundation. “Once you build a system that can be aimed at any database, you will be asked to aim the system at a database.”

Shira Inbar / New York Times: Apple’s approach to scanning people’s private photos could give law enforcement a new way to surveil citizens.
