Hindustan Times (Ranchi)

Apple to track child abuse content

- Staff Writer feedback@livemint.com

NEW DELHI: Tech giant Apple Inc. is deploying new technology across its iOS, iPadOS, macOS and watchOS platforms, including the Messages app, that will detect and block child sexual abuse material (CSAM).

The upcoming iOS 15, iPadOS 15, watchOS 8 and macOS Monterey releases will include new cryptographic applications that limit the spread of such imagery, while protecting user privacy, the company said.

The new child safety features have been developed in collaboration with child safety experts, Apple said. They will include new communication tools that allow parents to “play a more informed role” in how children navigate communication online.

“The Messages app will use on-device machine learning (ML) to warn about sensitive content, while keeping private communications unreadable by Apple,” the company said on its child safety page.
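
Apple has not published details of the Messages classifier, but the on-device design it describes can be sketched roughly as follows: a local model scores each incoming image and the warning decision is made entirely on the phone, so message content never leaves the device. Every name and the threshold in this sketch are illustrative assumptions, not Apple's implementation.

```python
# Minimal sketch of an on-device "sensitive content" gate. All names and
# values are hypothetical; Apple has not published its Messages classifier.

from dataclasses import dataclass

SENSITIVITY_THRESHOLD = 0.9  # assumed cutoff, not an Apple-published value


@dataclass
class IncomingImage:
    data: bytes


def score_image(image: IncomingImage) -> float:
    """Placeholder for an on-device ML model returning P(sensitive)."""
    return 0.0  # stub: a real model would run locally, never on a server


def should_blur_and_warn(image: IncomingImage) -> bool:
    # The decision is made entirely on-device, which is what keeps the
    # private communication unreadable by the platform operator.
    return score_image(image) >= SENSITIVITY_THRESHOLD
```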

“CSAM detection will help Apple provide valuable information to law enforcement agencies on collection of CSAM in iCloud Photos,” Apple said. “CSAM detection enables Apple to accurately identify and report iCloud users who store known CSAM in their iCloud Photos accounts. Apple servers flag accounts exceeding a threshold number of images that match a known database of CSAM image hashes so that Apple can provide relevant information to the National Center for Missing and Exploited Children (NCMEC),” the company said in a technical document.
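
The threshold logic Apple describes can be illustrated with a minimal sketch. For simplicity it substitutes a toy exact hash for the perceptual NeuralHash described in Apple's technical document, and it omits the private set intersection and threshold secret sharing that keep individual matches hidden from the server; the threshold value is assumed.

```python
# Illustrative sketch of threshold-based matching against known image
# hashes. A cryptographic hash is used as a toy stand-in: it only matches
# exact duplicates, whereas a perceptual hash also catches resized or
# re-encoded copies of the same image.

import hashlib

KNOWN_HASHES: set[str] = set()  # in practice, hashes of known CSAM imagery
MATCH_THRESHOLD = 30            # assumed value; Apple has not published it


def image_hash(image_bytes: bytes) -> str:
    return hashlib.sha256(image_bytes).hexdigest()


def count_matches(library: list[bytes]) -> int:
    return sum(1 for img in library if image_hash(img) in KNOWN_HASHES)


def account_flagged(library: list[bytes]) -> bool:
    # Only accounts whose match count exceeds the threshold are flagged
    # for review; below it, the server learns nothing about the images.
    return count_matches(library) > MATCH_THRESHOLD
```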

Apple is also updating Siri and Search to provide “expanded information” and help when users encounter “unsafe situations”. “Siri and Search will also intervene when users try to search for CSAM-related topics,” the company said.

The company said it won’t learn “anything about images that do not match the known CSAM database”, and it won’t be able to access metadata or visual derivatives for matched CSAM images until an account crosses the match threshold described above.

“The risk of the system incorrectly flagging an account is extremely low,” the company claimed.
