
Siri’s privacy concerns – Apple suspends quality control program

Apple has put a temporary halt to a quality control program in which employees listen to Siri recordings from users in order to “grade” its responses.

These grading teams effectively teach Siri whether or not it’s doing a great job, and a human touch like this is essential to improving such a service. That’s why teams of humans also check over audio queries at Google and Amazon to improve their AI assistants.

But, of course, this kind of thing does raise ethical concerns – which is why the practice was recently examined in a feature by the Guardian newspaper.

The report reveals that as part of the program, employees have overheard all kinds of confidential information. Medical details, illegal activity, sexual encounters, and more. How is that possible from a company that talks so confidently about user privacy?

Well, it’s not quite as bad as it seems. Humans only get to hear small snippets of audio captured when Siri is triggered – less than 1% of queries, picked at random – and the recordings are anonymized, with no link to your Apple ID, so there’s no straightforward way to trace one back to a specific user.

The idea is that checking the occasional out-of-context snippet – perhaps a single question to Siri – and grading the accuracy of the response will improve things for everyone without impacting too heavily on any individual.

“A small portion of Siri requests are analysed to improve Siri and dictation,” Apple told the Guardian. “User requests are not associated with the user’s Apple ID. Siri responses are analysed in secure facilities and all reviewers are under the obligation to adhere to Apple’s strict confidentiality requirements.”

However, Amazon has gotten into hot water before for engaging in this kind of audio grading without properly anonymizing the recordings first. (Here’s how to turn that off if you have an Amazon Echo.)

Though this kind of practice is common, and mentioned in Apple’s terms of service, we think it’s fair to say that most users probably aren’t aware that there’s a chance their recordings could be listened to by another human halfway around the world. And currently, there’s no way to opt out if the process creeps you out. That’s really not good, Apple.

It seems Apple knows this, and doesn’t want a privacy scandal on its hands that could undo the good work it has done in other areas of user privacy. Here’s the media statement fresh from Apple itself:

“We are committed to delivering a great Siri experience while protecting user privacy. While we conduct a thorough review, we are suspending Siri grading globally. Additionally, as part of a future software update, users will have the ability to choose to participate in grading.”

It’s surprising that Apple didn’t already have an opt-out toggle – perhaps it didn’t want to draw extra attention to the “feature” by having it present in Settings.

Thankfully, it’s doing the right thing now, and we’ll be able to opt out in a future update while still using Siri. We’ll keep you up to date when that change goes live!