Mac Format

Siri security tightened

Apple revises how it uses your Siri data


Apple will resume Siri grading but with important changes to the process

Following a review of how it handles users’ interactions with Siri, Apple is making a number of changes to enhance privacy and data security.

The review was in response to a report that contractors at Apple were listening to Siri audio recordings as part of the process of evaluating Siri’s performance, a process Apple calls “grading”. A whistleblower claimed that the contractors weren’t thoroughly vetted, and that the recordings included “extremely sensitive personal information” – a doctor and patient talking about the patient’s medical history, a possible drug deal and sexual encounters, for example.

Apple suspended human grading and investigated. It explained that the process involved reviewing a small sample of audio Siri requests – less than 0.2% – and their computer-generated transcripts, to evaluate performance. Was there an intentional Siri request? Did Siri hear it accurately? Did Siri respond appropriately?

Solely about Siri

All graders were subject to Apple’s strict confidentiality rules, it said, and all recordings were anonymised. “Siri has been engineered to protect user privacy from the beginning,” Apple insisted, pointedly adding: “Siri uses a random identifier to keep track of data while it’s being processed, rather than tying it to your identity through your Apple ID or phone number – a process that we believe is unique among the digital assistants in use today.”

Google and Amazon carry out similar reviews of their digital assistant technologies and have been the subject of similar privacy-related stories, but in their cases, according to AppleInsider, recordings were associated with user accounts or customer data logs.

By contrast, Apple said: “We focus on doing as much on-device as possible, minimising the amount of data we collect with Siri. When we store Siri data on our servers, we don’t use it to build a marketing profile and we never sell it to anyone. We use Siri data only to improve Siri, and we are constantly developing technologies to make Siri even more private.”

Apple said it will resume Siri grading when software updates are released this autumn, but with three changes. First, you will be invited to opt in explicitly to allow Apple to use anonymised audio samples of your Siri requests in the grading process. Apple hopes lots of users will do so, “knowing that Apple respects their data and has strong privacy controls in place.” You’ll be able to opt out again at any time. Second, if you do opt in, only Apple employees will be allowed to listen to the samples. Third, Apple will no longer retain audio recordings of Siri interactions by default, although it will continue to use computer-generated transcripts to improve Siri.

Apple insists that its grading process is in place to evaluate Siri, not to build profiles.
