Apple to track child abuse content
NEW DELHI: Tech giant Apple Inc. is deploying new technology on its iOS, macOS, watchOS and iMessage platforms that will detect and block child sexual abuse material (CSAM).
Devices running iOS 15, iPadOS 15, watchOS 8 and macOS Monterey will have new cryptographic applications that limit the spread of such imagery while protecting user privacy, the company said.
The new child safety features have been developed in collaboration with child safety experts, Apple said. They will include new communication tools that allow parents to “play a more informed role” in how children navigate communication online.
“The Messages app will use on-device machine learning (ML) to warn about sensitive content, while keeping private communication unreadable by Apple,” the company said on its child safety page.
“CSAM detection will help Apple provide valuable information to law enforcement agencies on collection of CSAM in iCloud Photos,” Apple said. “CSAM detection enables Apple to accurately identify and report iCloud users who store known CSAM in their iCloud Photos accounts. Apple servers flag accounts exceeding a threshold number of images that match a known database of CSAM image hashes so that Apple can provide relevant information to the National Center for Missing and Exploited Children (NCMEC),” the company said in a technical document.
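The threshold-based flagging described in the technical document can be sketched roughly as follows. This is a simplified illustration only: Apple's actual system uses a perceptual hash (NeuralHash) together with cryptographic techniques such as private set intersection, none of which appear here, and every name and value below is hypothetical.

```python
import hashlib

# Hypothetical stand-ins: a real deployment would match perceptual
# hashes against NCMEC's database, not plain SHA-256 digests.
KNOWN_CSAM_HASHES = {"hash_a", "hash_b", "hash_c"}  # illustrative database
MATCH_THRESHOLD = 3  # illustrative threshold; Apple has not published its value

def image_hash(image_bytes: bytes) -> str:
    """Hash an image. A real system would use a perceptual hash that
    tolerates resizing and re-encoding; SHA-256 is only a placeholder."""
    return hashlib.sha256(image_bytes).hexdigest()

def account_exceeds_threshold(uploaded_hashes: list[str]) -> bool:
    """Flag an account only once its matches reach the threshold,
    mirroring the 'threshold number of images' idea in Apple's document."""
    matches = sum(1 for h in uploaded_hashes if h in KNOWN_CSAM_HASHES)
    return matches >= MATCH_THRESHOLD
```

The point of the threshold is that isolated matches reveal nothing; only an account accumulating enough matches is surfaced for review.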
Apple is also updating Siri and Search to provide “expanded information” and help when users encounter “unsafe situations”. “Siri and Search will also intervene when users try to search for CSAM-related topics,” the company said.
The company said it won’t learn “anything about images that do not match the known CSAM database”, and that it cannot access metadata or visual derivatives for matched CSAM images until an account exceeds the match threshold.
“The risk of the system incorrectly flagging an account is extremely low,” the company claimed.