Houston Chronicle

Amazon workers listening when you talk to Alexa

- By Matt Day, Giles Turner and Natalia Drozdiak

Tens of millions of people use smart speakers and their voice software to play games, find music or trawl for trivia. Millions more are reluctant to invite the devices and their powerful microphones into their homes out of concern that someone might be listening.

Sometimes, someone is. Amazon.com employs thousands of people around the world to help improve the Alexa digital assistant powering its line of Echo speakers. The team listens to voice recordings captured in Echo owners’ homes and offices. The recordings are transcribed, annotated and fed back into the software as part of an effort to eliminate gaps in Alexa’s understanding of human speech and help it better respond to commands.

The Alexa voice review process, described by seven people who have worked on the program, highlights the often-overlooked human role in training software algorithms. In marketing materials Amazon says Alexa “lives in the cloud and is always getting smarter.” But like many software tools built to learn from experience, humans are doing some of the teaching.

The team comprises a mix of contractors and full-time Amazon employees who work in outposts from Boston to Costa Rica, India and Romania, according to the people, who signed nondisclosure agreements barring them from speaking publicly about the program. They work nine hours a day, with each reviewer parsing as many as 1,000 audio clips per shift, according to two workers based at Amazon’s Bucharest office, which takes up the top three floors of the Globalworth building in the Romanian capital’s up-and-coming Pipera district. The facility bears no exterior sign advertising Amazon’s presence.

The work is mostly mundane. One worker in Boston said he mined accumulated voice data for specific utterances such as “Taylor Swift” and annotated them to indicate that the searcher meant the musical artist. Occasionally, the listeners pick up things Echo owners likely would rather stay private: a woman singing badly off key in the shower, say, or a child screaming for help. The teams use internal chat rooms to share files when they need help parsing a muddled word or come across an amusing recording.

Sometimes they hear recordings they find upsetting or possibly criminal. Two of the workers said they picked up what they believe was a sexual assault. When something like that happens, they may share the experience in the internal chat room as a way of relieving stress. Amazon says it has procedures in place for workers to follow when they hear something distressing, but two Romania-based employees said that, after requesting guidance for such cases, they were told it wasn’t Amazon’s job to interfere.

“We take the security and privacy of our customers’ personal information seriously,” an Amazon spokesman said in an emailed statement. “We only annotate an extremely small sample of Alexa voice recordings in order (to) improve the customer experience. For example, this information helps us train our speech recognition and natural language understanding systems, so Alexa can better understand your requests and ensure the service works well for everyone.

“We have strict technical and operational safeguards, and have a zero tolerance policy for the abuse of our system. Employees do not have direct access to information that can identify the person or account as part of this workflow. All information is treated with high confidentiality, and we use multifactor authentication to restrict access, service encryption and audits of our control environment to protect it.”

Amazon, in its marketing and privacy policy materials, doesn’t explicitly say humans are listening to recordings of some conversations picked up by Alexa. “We use your requests to Alexa to train our speech recognition and natural language understanding systems,” the company says in a list of frequently asked questions.

In Alexa’s privacy settings, Amazon gives users the option of disabling the use of their voice recordings for the development of new features. The company says people who opt out might still have their recordings analyzed by hand over the regular course of the review process. A screenshot reviewed by Bloomberg shows that the recordings sent to the Alexa reviewers don’t provide a user’s full name and address but are associated with an account number, as well as the user’s first name and the device’s serial number.

“You don’t necessarily think of another human listening to what you’re telling your smart speaker in the intimacy of your home,” said Florian Schaub, a professor at the University of Michigan who has researched privacy issues related to smart speakers. “I think we’ve been conditioned to the (assumption) that these machines are just doing magic machine learning. But the fact is there is still manual processing involved.”

“Whether that’s a privacy concern or not depends on how cautious Amazon and other companies are in what type of information they have manually annotated and how they present that information to someone,” he added.
