Macworld (USA)

Apple apologizes for Siri grading program

“We realize we haven’t been fully living up to our high ideals, and for that we apologize.”

BY JASON CROSS

About a month ago, a report in The Guardian (go.macworld.com/hear) exposed the fact that third-party contractors have been listening in on a small percentage of Siri requests as part of a “Siri grading” program. Apple promised to halt the Siri grading program (go.macworld.com/halt) while it conducted a “thorough review,” which left us wondering how the company would move forward (go.macworld.com/mvfw), as human grading is an essential part of training machine-learning algorithms to improve.

Apple now appears to have finished its review and has issued a statement (go.macworld.com/stat) apologizing for the way this program had been carried out so far. The company plans to reinstate the program this fall after making some important changes.

The apology begins with a familiar statement: “At Apple, we believe privacy is a fundamental human right.” It then describes how Apple designed Siri to protect your privacy: collecting as little data as possible, using random identifiers instead of personally identifiable information, and never using data to build marketing profiles or selling it to others.

The statement then goes on to make sure you understand that using your data helps make Siri better, that “training” on real data is necessary, and that only 0.2 percent of Siri requests were graded by humans.

After all of this, Apple does get around to the actual apology that should have been in the first paragraph: “As a result of our review, we realize we haven’t been fully living up to our high ideals, and for that we apologize.” Apple will resume the Siri grading program this fall, but only after making the following changes:

> First, by default, we will no longer retain audio recordings of Siri interactions. We will continue to use computer-generated transcripts to help Siri improve.

> Second, users will be able to opt in to help Siri improve by learning from the audio samples of their requests. We hope that many people will choose to help Siri get better, knowing that Apple respects their data and has strong privacy controls in place. Those who choose to participate will be able to opt out at any time.

> Third, when customers opt in, only Apple employees will be allowed to listen to audio samples of the Siri interactions. Our team will work to delete any recording which is determined to be an inadvertent trigger of Siri.

This is the right move, and it once again puts Apple ahead of other tech giants in protecting your privacy and security. Apple is making the program opt-in rather than opt-out, an important distinction, as the vast majority of users never stray from the default settings. It is also going to make sure these audio samples stay in-house rather than going into the hands of third-party contractors.

Hopefully, this spotlight on Siri’s training, evaluation, and grading will have a positive effect not only on user privacy but also on how quickly Siri improves. ■
