iPad&iPhone user

Apple will soon scan all iCloud Photos images for child abuse

Using a new on-device technology, Apple will detect illegal images while protecting user privacy.

- Jason Cross reports

TechCrunch has confirmed that Apple will soon roll out a new technology to scan photos uploaded to iCloud for child sexual abuse material (CSAM). The roll-out will happen later this year, initially in the US, as part of a collection of technologies meant to make its products and services safer for children to use.

Most cloud services already scan images for material that violates their terms of service or the law, including CSAM. They can do this because, while the images may be stored encrypted, the companies hold the encryption keys. Apple encrypts photos in transit and stores them encrypted, but retains the decryption keys so it can decrypt them if necessary – to serve data stored in iCloud under subpoena, or to make your iCloud photos available in a web browser.

To help preserve user privacy, the company is relying on a new technology called NeuralHash that will check images as they are uploaded to iCloud Photos, looking for matches to a known database of child abuse imagery. It works entirely on your iPhone, iPad or Mac by converting photos into a unique string of letters and numbers (a ‘hash’). Normally, any slight change to a photo would result in a different hash, but Apple’s technology is said to be such that small alterations (like a crop) will still result in the same hash.
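To see why a perceptual hash behaves differently from an ordinary cryptographic hash, here is a toy sketch of the general idea – a simple ‘average hash’, not Apple’s actual NeuralHash algorithm, which has not been published in this form. Each bit records whether a pixel is brighter than the image’s average, so small uniform changes leave the hash untouched:

```python
# Toy 'average hash' - an illustration of perceptual hashing only,
# NOT Apple's NeuralHash. Similar images yield the same hash.

def average_hash(pixels):
    """Hash an image given as a 2D grid of grayscale values (0-255).
    Each bit records whether a pixel is brighter than the mean."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    bits = ''.join('1' if p > mean else '0' for p in flat)
    return hex(int(bits, 2))

# Two hypothetical 4x4 'images': the second is slightly brightened,
# yet the pattern of above/below-average pixels - and so the hash -
# is unchanged.
img = [[200, 200, 10, 10],
       [200, 200, 10, 10],
       [10, 10, 200, 200],
       [10, 10, 200, 200]]
brighter = [[p + 5 for p in row] for row in img]

print(average_hash(img) == average_hash(brighter))  # True
```

A cryptographic hash such as SHA-256 would change completely after the same brightening; that robustness to minor edits is what makes perceptual hashing suitable for matching known imagery.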

These hashes are matched on-device to a database of hashes for images of child sexual abuse. The hashes can be matched invisibly, without knowing what the underlying image is or alerting the user in any way. The results of the matches are uploaded to Apple if a certain threshold is met. Only then can Apple decrypt the matching images, manually verify the contents, and disable a user’s account. Apple will then report the imagery to the US National Center for Missing & Exploited Children, which then passes it to law enforcement.
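The threshold step described above can be sketched in a few lines. This is a hypothetical illustration only – the hash values, database and threshold are made up, and Apple’s real system uses cryptographic techniques so that matches stay hidden even from Apple until the threshold is crossed:

```python
# Toy sketch of threshold-based matching - hypothetical values only,
# not Apple's actual protocol or database.

KNOWN_HASHES = {"0xcc33", "0xab12", "0x7f01"}  # hypothetical database
THRESHOLD = 2  # hypothetical; the real threshold is not public

def check_uploads(photo_hashes, known, threshold):
    """Count matches against the known-hash set.
    Nothing is escalated until matches reach the threshold."""
    matches = [h for h in photo_hashes if h in known]
    return len(matches) >= threshold, matches

flagged, hits = check_uploads(["0x1111", "0xcc33", "0xab12"],
                              KNOWN_HASHES, THRESHOLD)
print(flagged)  # True: two matches meet the threshold of 2
```

The point of the threshold is that a single accidental match reveals nothing; only a pattern of matches triggers human review.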

In other words, this is not a mechanism that lets Apple browse whatever images it wants on your iPhone. According to TechCrunch, Apple says there is a one-in-one-trillion chance of a false positive, and there will be an appeals process in place for anyone who thinks their account was flagged by mistake. The technology is only partly optional: you don’t have to use iCloud Photos, but if you do, you will not be able to disable the feature.

Apple has published a technical paper detailing this NeuralHash technology. This new technology will roll out as part of iOS 15, iPadOS 15, and macOS Monterey this autumn.

For further details see our feature on page 16.

