iCreate

Trending: News

Open letter demands Apple drop plans to “build surveillance capabilities” into devices


Apple is facing strong opposition to its plans to build surveillance capabilities into its devices. Read all about it!

A group of more than 90 international policy groups has banded together to deliver an open letter to Apple CEO Tim Cook, demanding that Apple ditch its plans to check iPhones and iPads for known CSAM content in iCloud Photos and for inappropriate photos sent to and from kids.

The letter is in response to Apple’s CSAM efforts, which involve checking on-device iCloud Photos images against known CSAM content.

CSAM Detection enables Apple to accurately identify and report iCloud users who store known Child Sexual Abuse Material (CSAM) in their iCloud Photos accounts. Apple servers flag accounts exceeding a threshold number of images that match a known database of CSAM image hashes, so that Apple can provide relevant information to the National Center for Missing and Exploited Children (NCMEC). Apple claims that this process is secure and is expressly designed to preserve user privacy, but 90+ international policy groups have expressed their doubts.
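To picture how the threshold mechanism works in principle, here is a minimal, hypothetical Swift sketch. It is not Apple’s actual implementation – the real system relies on on-device perceptual hashing and cryptographic matching rather than plain string comparison – and every type, name, and value below is illustrative only.

```swift
import Foundation

// Hypothetical sketch of threshold-based hash matching (not Apple's real system).
struct HashMatcher {
    let knownHashes: Set<String>   // stand-in for the known-CSAM hash database
    let threshold: Int             // matches required before an account is flagged

    // True only once the number of matching image hashes exceeds the threshold.
    func shouldFlagAccount(imageHashes: [String]) -> Bool {
        let matchCount = imageHashes.filter { knownHashes.contains($0) }.count
        return matchCount > threshold
    }
}

// Example with made-up hash strings and a threshold of 2
let matcher = HashMatcher(knownHashes: ["a1f3", "9c2e", "77b0"], threshold: 2)
print(matcher.shouldFlagAccount(imageHashes: ["a1f3", "0000", "9c2e"]))          // false – only 2 matches
print(matcher.shouldFlagAccount(imageHashes: ["a1f3", "9c2e", "77b0", "ffff"]))  // true – 3 matches
```

The point of the threshold is that a single matching image is never enough on its own; an account is only surfaced for review once the match count passes the limit.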

In an announcement headed “International Coalition Calls on Apple to Abandon Plan to Build Surveillance Capabilities into iPhones, iPads, and Other Products,” the groups appear to have two main issues with Apple’s new child safety plans. In particular:

“1. The scan and alert feature in the Messages app could result in alerts that threaten the safety and wellbeing of some young people, and LGBTQ+ youths with unsympathetic parents would be particularly at risk.

2. Once the CSAM hash scanning for photos is built into Apple products, the company will face enormous pressure, and possibly legal requirements, from governments around the world to scan for all sorts of images that the governments find objectionable.”

It’s important to note that Apple has already addressed both of the issues raised here. On the first issue, alerts will only be triggered by images sent and received via the Messages app; no text messages will trigger any kind of alert whatsoever. Further, Apple confirmed during press briefings that kids will be warned before any sort of parental notification is triggered. They’ll need to expressly click through that warning, having been told that their parents will be notified, before they can see the photo in question. So, parents won’t be notified of anything without the child’s knowledge.

On the second issue, Apple has repeatedly said that it will not be swayed by governments and law enforcement if and when demands are made to use the CSAM detection system to detect other types of material. Apple also points to the fact that the hashes against which iCloud Photos are matched are only provided by known child protection agencies. What’s more, all of this is auditable, says Apple.

Despite this, the coalition believes that Apple will be “installing surveillance software” on iPhones – something Apple will no doubt strongly refute.

The new child protection features are scheduled to arrive as part of iOS 15, iPadOS 15, watchOS 8, and macOS Monterey later this fall, but only in the United States for now. Apple has said it will consider adding more countries and regions on a case-by-case basis in the future, in keeping with local laws and as it deems appropriate.

The features also include new Siri and Search tools. If a user asks for help in reporting instances of child abuse and exploitation or of CSAM, they’ll be pointed to resources explaining where and how to file those reports. If a user tries to query Siri or Search for CSAM, the system will intervene, explain how the topic is harmful and problematic, and provide helpful resources from partners.

These queries are not reported to law enforcement authorities. They’re built into the existing, secure, private Siri and Search system, where no identifying information is provided to Apple about the accounts making the queries, so there’s nothing that can be forwarded to any law enforcement.

There’s also a new Communications Safety feature. If a device is set up for a child, meaning it’s using Apple’s existing Family Sharing and Parental Control system, a parent can choose to enable this opt-in feature and get notified if the child’s device tries to send or view explicit images.
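As a rough illustration of how such an opt-in notification flow fits together, here is a hypothetical Swift sketch; none of these types or functions are Apple’s actual Messages or Family Sharing APIs, and the logic simply mirrors the conditions described above.

```swift
import Foundation

// Hypothetical, illustrative model of the opt-in Communications Safety flow.
struct ChildDeviceSettings {
    var communicationSafetyEnabled: Bool   // parent opted in via Family Sharing
}

// A parent is notified only if the opt-in feature is on, the image is flagged,
// and the child clicks through the warning that their parents will be told.
func shouldNotifyParent(settings: ChildDeviceSettings,
                        imageFlaggedAsExplicit: Bool,
                        childAcknowledgedWarning: Bool) -> Bool {
    return settings.communicationSafetyEnabled
        && imageFlaggedAsExplicit
        && childAcknowledgedWarning
}

let settings = ChildDeviceSettings(communicationSafetyEnabled: true)
// The child sees the warning but backs out, so no notification is sent.
print(shouldNotifyParent(settings: settings,
                         imageFlaggedAsExplicit: true,
                         childAcknowledgedWarning: false))   // false
```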

The new child protection features are scheduled to arrive as part of iOS 15 and macOS Monterey
