Apple’s Siri contractors ‘regularly’ hear drug deals and couples having sex
Apple is facing a rare privacy scandal after a whistleblower revealed that contractors working for the company regularly hear Siri users disclose confidential information.
The whistleblower, speaking to the Guardian, claims contractors - who are employed to "grade" the quality of the digital assistant's responses - often hear sensitive information including medical details, drug deals and recordings of couples having sex.
"There have been countless instances of recordings featuring private discussions between doctors and patients, business deals, seemingly criminal dealings, sexual encounters and so on," said the whistleblower. "These recordings are accompanied by user data showing location, contact details, and app data."
The anonymous contractor, who spoke out over concerns about the company's lack of disclosure, is particularly uneasy given the frequency with which accidental activations pick up extremely sensitive personal information.
"The regularity of accidental triggers on the watch is incredibly high," they said. "The watch can record some snippets that will be 30 seconds - not that long but you can gather a good idea of what's going on."
In a statement given to the Guardian, Apple said less than one per cent of all Siri activations are used by the grading contractors.
"A small portion of Siri requests are analysed to improve Siri and dictation. User requests are not associated with the user's Apple ID," a spokesperson said. "Siri responses are analysed in secure facilities and all reviewers are under the obligation to adhere to Apple's strict confidentiality requirements."
Apple is not alone in having staff listen in on voice assistant recordings: in April, it was revealed that Amazon employed staff to listen to some Alexa recordings, and earlier this month, Google workers were found to be doing the same with Google Assistant.
However, as the Guardian notes, while Amazon and Google allow customers to opt out of some uses of their recordings, Apple doesn't offer a similar option, short of disabling Siri entirely.
That's a bad look for Apple, which has long touted the fact that it's miles ahead of the likes of Amazon and Google when it comes to protecting user privacy. In January, for example, it bought a billboard at CES boasting that "what happens on your iPhone stays on your iPhone." Unless you accidentally trigger Siri, that is.