iPad & iPhone User

There’s a small crack in the iPhone’s foundation. How big will it get?

Apple’s CSAM photo scanning intentions are good, but that might not be enough.

- Jason Snell reports

Apple’s out-of-the-blue announcement that it was adding a bunch of features to iOS involving Child Sexual Abuse Material (CSAM) generated an entirely predictable reaction. Or, more accurately, reactions. Those on the law-enforcement side of the spectrum praised Apple for its work, and those on the civil-liberties side accused Apple of turning iPhones into surveillance devices.

It’s not surprising at all that Apple’s announcement would be met with scrutiny. If anything is surprising about this whole story, it’s that Apple doesn’t seem to have anticipated all the pushback its announcement received. It had to post a Frequently Asked Questions file in response. If Q’s are being FA’d in the wake of your announcement, you probably botched your announcement.

Such an announcement deserves scrutiny. The problem for those seeking to drop their hot takes about this issue is that it’s extremely complicated and there are no easy answers. That doesn’t mean that Apple’s approach is fundamentally right or wrong, but it does mean that Apple has made some choices that are worth exploring and debating.

APPLE’S COMPROMISE

I’m not sure quite why Apple chose this moment to roll out this technology. Apple’s Head of Privacy implies that it’s because it was ready, but that’s a bit of a dodge – Apple has to choose what technologies to prioritize, and it prioritized this one. Apple may be anticipating legal requirements for it to scan for CSAM. It’s possible that Apple is working on increased iCloud security features that necessitate this approach. It’s also possible that Apple just decided it needed to do more to stop the distribution of CSAM.

The biggest clue about Apple’s motivations is the very specific way this feature has been implemented. I’ll spare you the long explanation, but in short: Apple compares images against hashes of known illegal images compiled by the National Center for Missing & Exploited Children. It scans only new images that are about to be synced with iCloud Photos. It’s not scanning all the photos on your device, and Apple isn’t scanning all the photos it’s storing on its iCloud servers.

In short, Apple has built a CSAM detector that sits at the doorway between your device and iCloud. If you don’t sync photos with iCloud, the detector never runs.
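To make that boundary concrete, here’s a minimal sketch of the idea, assuming invented names throughout – Apple’s actual system uses NeuralHash and private set intersection rather than a plain hash lookup, so treat this as an illustration of where the check runs, not of how it’s built:

```swift
import Foundation

// Hypothetical sketch only. Apple's real pipeline uses NeuralHash and
// private set intersection; the names and types here are invented.
struct PhotoAsset {
    let id: UUID
    let perceptualHash: String   // stand-in for a NeuralHash-style fingerprint
}

struct UploadGatekeeper {
    // Hashes of known CSAM, shipped to the device in blinded form.
    let knownHashes: Set<String>

    /// The check runs only for photos about to sync to iCloud Photos.
    /// If iCloud Photos syncing is off, the detector never runs at all.
    func flagsAsKnownCSAM(_ asset: PhotoAsset, willSyncToiCloud: Bool) -> Bool {
        guard willSyncToiCloud else { return false }
        return knownHashes.contains(asset.perceptualHash)
    }
}
```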

This all leads me to believe that there’s another shoe to drop here, one that will allow Apple to make its cloud services more secure and private. If this scanning system is essentially the trade-off that allows Apple to provide more privacy for its users while not abdicating its moral duty to prevent the spread of CSAM, great. But there’s no way to know until Apple makes such an announcement. In the meantime, all those potential privacy gains are theoretical.

WHERE IS THE SPY?

In recent years, Apple has made it clear that it considers the analysis of user data that occurs on our devices to be fundamentally more private than the analysis that runs in the cloud. In the cloud, your data must be decrypted to be analysed, opening it up to pretty much any form of analysis. Any employee with the right level of access could also just flip through your data. But if all that analysis happens on your device – this is why Apple’s modern chips have a powerful Neural Engine component to do the job – that data never leaves home.

Apple’s approach here calls all of that into question, and I suspect that’s the source of some of the greatest criticism of this announcement. Apple is making decisions that it thinks will enhance privacy. Nobody at Apple is scanning your photos, and nobody at Apple can even look at the potential CSAM images until a threshold of matches has been passed, which reduces the chance of false positives. Only your device sees your data. Which is great, because our devices are sacred and they belong to us.
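The threshold mechanism can be sketched in the same hedged way. In Apple’s actual design the safety vouchers are protected with threshold secret sharing, so they are cryptographically unreadable below the threshold; the simple counter below (with a placeholder threshold value) only illustrates the logic, not the cryptography:

```swift
// Illustrative only. Apple described a review threshold on the order of 30
// matches; the value here is a placeholder, and the real system enforces the
// threshold cryptographically rather than with a counter.
struct MatchLedger {
    let reviewThreshold = 30
    private(set) var matchCount = 0

    /// Records one scanned photo and reports whether human review becomes possible.
    mutating func record(matched: Bool) -> Bool {
        if matched { matchCount += 1 }
        // Below the threshold, isolated matches (including false positives)
        // remain invisible to Apple.
        return matchCount >= reviewThreshold
    }
}
```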

Except… that there’s now going to be an algorithm running on our devices that’s designed to observe our data, and if it finds something that it doesn’t like, it will then connect to the Internet and report that data back to Apple. While today it has been purpose-built for CSAM, and it can be deactivated simply by shutting off iCloud Photo Library syncing, it still feels like a line has been crossed. Our devices won’t just be working for us, but will also be watching us for signs of illegal activity and alerting the authorities.

The risk for Apple here is huge. It has invested an awful lot of time in equating on-device actions with privacy, and it risks poisoning all of that work with the perception that our phones are no longer our castles.

IT’S NOT THE TOOL, BUT HOW IT’S USED

In many ways, this is yet another facet of the greatest challenge the technology industry faces in this era. Technology has become so important and powerful that every new development has enormous, society-wide implications.

With its on-device CSAM scanner, Apple has built a tool carefully calibrated to protect user privacy. If building this tool enabled Apple to finally offer broader encryption of iCloud data, it might even be a net increase in user privacy.

But tools are neither good nor evil. Apple has built this tool for a good purpose, but every time a new tool is built, all of us need to imagine how it might be misused. Apple seems to have very carefully designed this feature to make it more difficult to subvert, but that’s not always enough.

Imagine a case where a law enforcement agency in a foreign country comes to Apple and says that it has compiled a database of illegal images and wants it added to Apple’s scanner. Apple has said, bluntly, that it will refuse all such requests. That’s encouraging, and I have little doubt that Apple would abandon most countries if they tried to pull that manoeuvre.

But would it be able to say no to China? Would it be able to say no to the UK government if the images in question would implicate members of terrorist organizations? And in a decade or two, will policies like this be so commonplace that when the moment comes and a government asks Apple or its equivalents to begin scanning for illegal or subversive material, will anyone even notice? The first implementation of this technology is to stop CSAM, and nobody will argue against trying to stop the exploitation of children. But will there be a second implementation? A third?

Apple has tried its best to find a compromise between protecting user privacy and stopping the distribution of CSAM. The very specific way this feature is implemented proves that. (Anyone who tries to sell you a simplified story about how Apple just wants to spy on you is, quite frankly, someone who is not worth listening to.)

But just because Apple has done its due diligence and made some careful choices in order to implement a tool to stop the spread of heinous material doesn’t mean that it’s off the hook. By making our phones run an algorithm that isn’t meant to serve us, but watches us, it has crossed a line. Perhaps it was inevitable that the line would be crossed. Perhaps it’s inevitable that technology is leading us to a world where everything we say, do and see is being scanned by a machine-learning algorithm that will be as benevolent or malevolent as the society that implemented it.

Even if Apple’s heart is in the right place, my confidence that its philosophy will be able to withstand the future desires of law enforcement agencies and authoritarian governments is not as high as I want it to be. We can all be against CSAM and admire the clever way Apple has tried to balance these two conflicting needs, while still being worried about what it means for the future.

CSAM checks occur with images uploaded to iCloud, not on your iPhone’s photo library.

Would Apple be able to say no to China if it made demands to add images to Apple’s scanner?
