Apple to track child abuse content
Tech giant Apple is deploying new technology across its iOS, macOS, watchOS and iMessage platforms that will detect and block child sexual abuse material (CSAM). Apple devices running iOS 15, iPadOS 15, watchOS 8 and macOS Monterey will have new cryptographic applications that limit the spread of such imagery while protecting user privacy, the company said. The new child safety features have been developed in collaboration with child safety experts, Apple said. They will include communication tools that allow parents to “play a more informed role” in how children navigate communication online.
“The Messages app will use on-device machine learning (ML) to warn about sensitive content, while keeping private communications unreadable by Apple,” the company said on its child safety page.
“CSAM detection will help Apple provide valuable information to law enforcement agencies on collections of CSAM in iCloud Photos,” Apple said. “CSAM detection enables Apple to accurately identify and report iCloud users who store known CSAM in their iCloud Photos accounts. Apple servers flag accounts exceeding a threshold number of images that match a known database of CSAM image hashes so that Apple can provide relevant information to the National Center for Missing and Exploited Children (NCMEC),” the company said in a technical document.
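The threshold mechanism Apple describes can be sketched in simplified form: each image is reduced to a hash, hashes are compared against a database of known CSAM hashes, and an account is flagged only once the number of matches exceeds a threshold. The sketch below is illustrative only; Apple's actual system uses a perceptual hash (NeuralHash) with private set intersection and threshold secret sharing, not the plain SHA-256 comparison and hypothetical threshold value used here.

```python
import hashlib

def image_hash(image_bytes: bytes) -> str:
    """Stand-in for a perceptual image hash (Apple uses NeuralHash,
    which tolerates resizing and re-encoding; SHA-256 does not)."""
    return hashlib.sha256(image_bytes).hexdigest()

def count_matches(images: list[bytes], known_hashes: set[str]) -> int:
    """Count how many of a user's images match the known-hash database."""
    return sum(1 for img in images if image_hash(img) in known_hashes)

def account_flagged(images: list[bytes], known_hashes: set[str],
                    threshold: int = 3) -> bool:
    """Flag an account only once matches exceed the threshold,
    mirroring the threshold behaviour described in Apple's document.
    The threshold value here is a placeholder, not Apple's."""
    return count_matches(images, known_hashes) > threshold
```

The threshold is what keeps the false-flag risk low: a single accidental hash collision cannot trigger a report, since flagging requires several independent matches on the same account.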
Apple is also updating Siri and Search to provide “expanded information” and help when users encounter “unsafe situations”. “Siri and Search will also intervene when users try to search for CSAM-related topics,” the company said.
The company said it won’t learn “anything about images that do not match the known CSAM database”, and that it cannot access metadata or visual derivatives for matched CSAM images until an account exceeds the match threshold.
“The risk of the system incorrectly flagging an account is extremely low,” the company claimed.