
Apple’s child abuse photo scanning: What it is and why people are worried

Apple has announced new technologies to detect child abuse images. Here’s what you need to know.

- Michael Simon reports

Apple has announced that it will begin scanning all photos uploaded to iCloud for potential child sexual abuse material (CSAM). It’s come under a great deal of scrutiny and generated some outrage, so here’s what you need to know about the new technology before it rolls out, initially in the US, later this year.

WHAT ARE THE TECHNOLOGIES APPLE IS ROLLING OUT?

Apple will be rolling out new anti-CSAM features in three areas: Messages, iCloud Photos, and Siri and Search. Here’s how each of them will be implemented, according to Apple.

Messages: The Messages app will use on-device machine learning to warn children and parents about sensitive content.

iCloud Photos: Before an image is stored in iCloud Photos, an on-device matching process is performed for that image against the known CSAM hashes.

Siri and Search: Siri and Search will provide additional resources to help children and parents stay safe online and get help with unsafe situations.

WHEN WILL THE NEW TECHNOLOGIES ARRIVE?

Apple says the Messages and Siri and Search features will arrive in iOS 15, iPadOS 15 and macOS Monterey. To be clear, Apple doesn’t say the technologies will arrive when iOS 15 lands, so they could come in a follow-up update. The iCloud Photos scanning doesn’t have a specific release date, but will presumably arrive later this year as well.

DOES THIS MEAN APPLE WILL BE ABLE TO SEE MY PHOTOS?

Not exactly. Here’s how Apple explains the technology: instead of scanning images in the cloud, the system performs on-device matching using a database of known CSAM image hashes provided by the National Center for Missing and Exploited Children (NCMEC) and other child safety organizations. Apple further transforms this database into an unreadable set of hashes that is securely stored on users’ devices.
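To make the idea of hash matching concrete, here is a deliberately simplified sketch in Swift. It is not Apple’s implementation (Apple’s system uses a perceptual hash it calls NeuralHash, plus cryptographic blinding); this toy version uses an ordinary SHA-256 digest and a plain set, purely to illustrate that matching compares fingerprints rather than looking at picture contents.

```swift
import Foundation
import CryptoKit

// Simplified illustration only, not Apple's implementation. Apple uses a
// perceptual hash ("NeuralHash") and cryptographic blinding; an ordinary
// SHA-256 digest and a plain Set stand in here just to show that matching
// compares fingerprints, never the pictures themselves.

// Hypothetical on-device database of known-image fingerprints (hex digests).
let knownImageHashes: Set<String> = [
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08"
]

/// Fingerprint of the raw image bytes (a stand-in for a perceptual hash).
func fingerprint(of imageData: Data) -> String {
    SHA256.hash(data: imageData)
        .map { String(format: "%02x", $0) }
        .joined()
}

/// True only when the image's fingerprint matches a known entry.
/// The check never interprets or transmits the image itself.
func matchesKnownImage(_ imageData: Data) -> Bool {
    knownImageHashes.contains(fingerprint(of: imageData))
}
```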

BUT APPLE IS SCANNING PHOTOS ON MY DEVICE, RIGHT?

It is. However, Apple says the system does not work for users who have iCloud Photos disabled.

WHAT HAPPENS IF THE SYSTEM DETECTS CSAM IMAGES?

For one, it shouldn’t. Since the system only works with CSAM image hashes provided by NCMEC, it will only report photos in iCloud Photos that are already known CSAM. If it does detect a match, Apple will conduct a human review before deciding whether to make a report to NCMEC. Apple says there is no automated reporting to the police, though it will report any confirmed instances to the appropriate authorities.

COULD THE SYSTEM MISTAKE AN ACTUAL PHOTO OF MY CHILD AS CSAM?

It’s not likely. Apple says the likelihood that the system would incorrectly flag any given account is less than one in one trillion per year. And if it does happen, a human review would catch the mistake before it was escalated to the authorities.

CAN I OPT OUT OF THE iCLOUD PHOTOS CSAM SCANNING?

No, but you can disable iCloud Photos to prevent Apple from scanning images.

IS APPLE SCANNING ALL OF MY PHOTOS IN MESSAGES TOO?

Not exactly. Apple’s safety measures in Messages are designed to protect children and are only available for child accounts set up as part of a family in iCloud.

SO HOW DOES IT WORK?

Communication safety in Messages is different from CSAM scanning. Rather than using image hashes to compare against known images, it analyses images on the device for sexually explicit content. Images are not shared with Apple or any other agency, including NCMEC.

CAN PARENTS OPT OUT?

Parents need to specifically opt in to use the new Messages image scanning.

WILL iMESSAGES STILL BE END-TO-END ENCRYPTED?

Apple says communication safety in Messages doesn’t change the privacy features baked into Messages, and Apple never gains access to communications. Furthermore, none of the communications, image evaluation, interventions, or notifications are available to Apple.

WHAT HAPPENS IF A SEXUALLY EXPLICIT IMAGE IS DISCOVERED?

When a child aged 13 to 17 sends or receives sexually explicit images, the photo will be blurred and the child will be warned, presented with resources, and reassured that it is okay if they do not want to view or send the photo. For accounts of children aged 12 and under, parents can set up parental notifications that will be sent if the child confirms the warning and goes on to send or view an image that has been determined to be sexually explicit.
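For readers who want a picture of how that decision flow fits together, here is a hypothetical Swift sketch. The type names and the classifySensitiveOnDevice stand-in are invented for illustration and are not Apple’s code; the point is simply that classification, blurring and any parental notification are all decided on the device.

```swift
import Foundation

// Hypothetical sketch of the flow described above, not Apple's code.
// `classifySensitiveOnDevice` stands in for the on-device machine-learning
// model; nothing in this flow sends the image off the device.

enum ChildAccount {
    case twelveAndUnder(parentNotificationsEnabled: Bool)
    case thirteenToSeventeen
}

/// Assumed stand-in for the local ML classifier.
func classifySensitiveOnDevice(_ imageData: Data) -> Bool {
    // Model inference would run here, entirely on the device.
    return false
}

func handleImage(_ imageData: Data, account: ChildAccount, childConfirmed: Bool) {
    guard classifySensitiveOnDevice(imageData) else {
        return // Not flagged: the photo is shown normally.
    }

    // Flagged: blur the photo, warn the child and offer resources,
    // making clear it is okay not to view or send it.
    presentBlurredWarning()

    // Parents are notified only for accounts aged 12 and under, only when
    // notifications are enabled, and only if the child confirms anyway.
    if case .twelveAndUnder(let notifyParents) = account,
       notifyParents, childConfirmed {
        notifyParentsOfChoice()
    }
}

func presentBlurredWarning() {
    // UI: blurred preview plus guidance; the image stays on the device.
}

func notifyParentsOfChoice() {
    // Sends the parental notification configured for the family account.
}
```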

WHAT’S NEW IN SIRI & SEARCH?

Apple is expanding guidance in Siri and Search, providing additional resources to help children and parents stay safe online, get help with unsafe situations, and find out how to report CSAM. Apple is also updating Siri and Search to intervene when users perform searches for queries related to CSAM. Apple says these interventions will explain to users that interest in this topic is harmful and problematic, and will point them to resources from partners for getting help with the issue.

CAN THE CSAM SYSTEM BE USED TO SCAN FOR OTHER IMAGE TYPES?

Not presently. Apple says the system is only designed to scan for CSAM images. However, Apple could theoretically tweak the parameters to look for images related to other things, such as LGBTQ+ content.

WHAT IF A GOVERNMENT FORCES APPLE TO SCAN FOR OTHER IMAGES?

Apple says it will refuse such demands.

DO OTHER COMPANIES SCAN FOR CSAM IMAGES?

Yes. Most cloud services, including Dropbox, Google and Microsoft, as well as Facebook, have systems in place to detect CSAM images.

SO WHY ARE PEOPLE UPSET?

While most people agree that Apple’s system is appropriately limited in scope, experts, watchdogs, and privacy advocates are concerned about the potential for abuse. For example, Edward Snowden, who exposed global surveillance programs by the NSA and is living in exile, tweeted: “No matter how well-intentioned, Apple is rolling out mass surveillance to the entire world with this. Make no mistake: if they can scan for kiddie porn today, they can scan for anything tomorrow.” Additionally, the Electronic Frontier Foundation has criticized the system, and Matthew Green, a cryptography professor at Johns Hopkins, has explained how the system Apple is using could be misused.

People are also concerned that Apple is sacrificing the privacy built into the iPhone by using the device itself to scan for CSAM images. While many other services scan for CSAM images, Apple’s system is unique in that it performs the matching on the device rather than on images that have already been uploaded to the cloud.

COULD APPLE BE BLOCKED FROM IMPLEMENTING ITS CSAM DETECTION SYSTEM?

It’s hard to say, but it’s likely that there will be legal battles both before and after the new technologies are implemented.

The new CSAM detection tools will arrive with the new operating systems later this year.
