Siri security tightened
Apple revises how it uses your Siri data
Apple will resume Siri grading but with important changes to the process
Following a review of how it handles users’ interactions with Siri, Apple is making a number of changes to enhance privacy and data security.
The review was in response to a report that contractors at Apple were listening to Siri audio recordings as part of the process of evaluating Siri’s performance, a process Apple calls “grading”. A whistleblower claimed that the contractors weren’t thoroughly vetted, and that the recordings included “extremely sensitive personal information” – a doctor and patient talking about the patient’s medical history, a possible drug deal and sexual encounters, for example.
Apple suspended human grading and investigated. It explained that the process involved reviewing a small sample of Siri audio requests – less than 0.2% – and their computer-generated transcripts, to evaluate performance. Was there an intentional Siri request? Did Siri hear it accurately? Did Siri respond appropriately?
Solely about Siri
All graders were subject to Apple’s strict confidentiality rules, it said, and all recordings were anonymised. “Siri has been engineered to protect user privacy from the beginning,” Apple insisted, pointedly adding: “Siri uses a random identifier to keep track of data while it’s being processed, rather than tying it to your identity through your Apple ID or phone number – a process that we believe is unique among the digital assistants in use today.”
Google and Amazon carry out similar reviews of their digital assistant technologies and have been the subject of similar privacy-related stories, but in their cases, according to AppleInsider, recordings were associated with user accounts or customer data logs.
By contrast, Apple said: “We focus on doing as much on-device as possible, minimising the amount of data we collect with Siri. When we store Siri data on our servers, we don’t use it to build a marketing profile and we never sell it to anyone. We use Siri data only to improve Siri, and we are constantly developing technologies to make Siri even more private.”
Apple said it will resume Siri grading when software updates are released this autumn, but with three changes. First, you will be invited to opt in explicitly to allow Apple to use anonymised audio samples of your Siri requests in the grading process. Apple hopes lots of users will do so, “knowing that Apple respects their data and has strong privacy controls in place.” You’ll be able to opt out again at any time. Second, if you do opt in, only Apple employees will be allowed to listen to those samples. Third, Apple will no longer retain audio recordings of Siri interactions by default, although it will continue to use computer-generated transcripts to improve Siri.