The Guardian (USA)

Apple contractors 'regularly hear confidential details' on Siri recordings

- Alex Hern

Apple contractors regularly hear confidential medical information, drug deals, and recordings of couples having sex as part of their job providing quality control, or “grading”, for the company’s Siri voice assistant, the Guardian has learned.

Although Apple does not explicitly disclose it in its consumer-facing privacy documentation, a small proportion of Siri recordings are passed on to contractors working for the company around the world. They are tasked with grading the responses on a variety of factors, including whether the activation of the voice assistant was deliberate or accidental, whether the query was something Siri could be expected to help with, and whether Siri’s response was appropriate.

Apple says the data “is used to help Siri and dictation … understand you better and recognise what you say”.

But the company does not explicitly state that this work is undertaken by humans who listen to the pseudonymised recordings.

Apple told the Guardian: “A small portion of Siri requests are analysed to improve Siri and dictation. User requests are not associated with the user’s Apple ID. Siri responses are analysed in secure facilities and all reviewers are under the obligation to adhere to Apple’s strict confidentiality requirements.” The company added that a very small random subset, less than 1% of daily Siri activations, are used for grading, and those used are typically only a few seconds long.

A whistleblower working for the firm, who asked to remain anonymous due to fears over their job, expressed concerns about this lack of disclosure, particularly given the frequency with which accidental activations pick up extremely sensitive personal information.

Siri can be accidentally activated when it mistakenly hears its “wake word”, the phrase “hey Siri”. Those mistakes can be understandable – a BBC interview about Syria was interrupted by the assistant last year – or less so. “The sound of a zip, Siri often hears as a trigger,” the contractor said. The service can also be activated in other ways. For instance, if an Apple Watch detects it has been raised and then hears speech, Siri is automatically activated.

The whistleblower said: “There have been countless instances of recordings featuring private discussions between doctors and patients, business deals, seemingly criminal dealings, sexual encounters and so on. These recordings are accompanied by user data showing location, contact details, and app data.”

That accompanying information may be used to verify whether a request was successfully dealt with. In its privacy documents, Apple says the Siri data “is not linked to other data that Apple may have from your use of other Apple services”. There is no specific name or identifier attached to a record, and no individual recording can be easily linked to other recordings.

Accidental activations led to the receipt of the most sensitive data that was sent to Apple. Although Siri is included on most Apple devices, the contractor highlighted the Apple Watch and the company’s HomePod smart speaker as the most frequent sources of mistaken recordings. “The regularity of accidental triggers on the watch is incredibly high,” they said. “The watch can record some snippets that will be 30 seconds – not that long but you can gather a good idea of what’s going on.”

Sometimes, “you can definitely hear a doctor and patient, talking about the medical history of the patient. Or you’d hear someone, maybe with car engine background noise – you can’t say definitely, but it’s a drug deal … you can definitely hear it happening. And you’d hear, like, people engaging in sexual acts that are accidentally recorded on the pod or the watch.”

The contractor said staff were encouraged to report accidental activations “but only as a technical problem”, with no specific procedures to deal with sensitive recordings. “We’re encouraged to hit targets, and get through work as fast as possible. The only function for reporting what you’re listening to seems to be for technical problems. There’s nothing about reporting the content.”

As well as the discomfort they felt listening to such private information, the contractor said they were motivated to go public about their job because of their fears that such information could be misused. “There’s not much vetting of who works there, and the amount of data that we’re free to look through seems quite broad. It wouldn’t be difficult to identify the person that you’re listening to, especially with accidental triggers – addresses, names and so on.

“Apple is subcontracting out, there’s a high turnover. It’s not like people are being encouraged to have consideration for people’s privacy, or even consider it. If there were someone with nefarious intentions, it wouldn’t be hard to identify [people on the recordings].” The contractor argued Apple should reveal to users that this human oversight exists – and, specifically, stop publishing some of its jokier responses to Siri queries. Ask the personal assistant “are you always listening”, for instance, and it will respond with: “I only listen when you’re talking to me.”

That is patently false, the contractor said. They argued that accidental triggers are too frequent for such a lighthearted response.

Apple is not alone in employing human oversight of its automatic voice assistants. In April, Amazon was revealed to employ staff to listen to some Alexa recordings, and earlier this month, Google workers were found to be doing the same with Google Assistant.

Apple differs from those companies in some ways, however. For one, Amazon and Google allow users to opt out of some uses of their recordings; Apple offers no similar choice short of disabling Siri entirely. According to Counterpoint Research, Apple has 35% of the smartwatch market, more than three times its nearest competitor Samsung, and more than its next six biggest competitors combined.

The company values its reputation for user privacy highly, regularly wielding it as a competitive advantage against Google and Amazon. In January, it bought a billboard at the Consumer Electronics Show in Las Vegas announcing that “what happens on your iPhone stays on your iPhone”.

Workers heard the information while providing quality control for Apple’s Siri voice assistant. Photograph: Oli Scarff/Getty Images
