Apple ends Siri ‘grading,’ says contractors won’t listen to voice recordings anymore

Almost a week after news broke that contractors working for Apple had access to intimate and personal Siri recordings, a quality-control practice known as “grading,” the company said late Thursday that it would halt the practice.

“We are committed to delivering a great Siri experience while protecting user privacy,” Apple said in a statement. “While we conduct a thorough review, we are suspending Siri grading globally. Additionally, as part of a future software update, users will have the ability to choose to participate in grading.”

News of the Thursday statement was first reported by TechCrunch.

As The Guardian initially reported July 26, such Siri audio clips included sounds of people having sex, discussing medical information and other intimate details.

Apple has maintained that only a tiny fraction — less than 1 percent — of Siri voice recordings are actually heard by human workers, and each recording is usually only a few seconds long.

The “grading” procedure is common among other tech firms, including Google and Amazon, that have similar voice assistants.

Last month, VRT, a Belgian public broadcaster, reported that lengthy Flemish-language clips recorded by Google Assistant, sometimes containing personal details, were shared with contractors.

In response to the concerns raised by the Guardian story, Apple is suspending the program worldwide while it reviews the grading process, which it uses to determine whether Siri is hearing queries correctly or being invoked by mistake.

In addition, it will issue a software update in the future that will let Siri users choose whether to participate in the grading process.

The Guardian story from Alex Hern quoted extensively from a contractor at a firm hired by Apple to perform part of a Siri quality control process it calls grading. This takes snippets of audio, which are not connected to names or IDs of individuals, and has contractors listen to them to judge whether Siri is accurately hearing them — and whether Siri may have been invoked by mistake.
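To make the shape of that pipeline concrete, here is a minimal sketch in Swift of how snippet sampling and anonymization might look; the types, field names and sampling rate are illustrative assumptions, not Apple’s actual implementation.

```swift
import Foundation

// A hypothetical anonymized snippet queued for human grading.
// The fields are assumptions for illustration, not Apple's real schema.
struct GradingSnippet {
    let audio: Data          // a few seconds of raw request audio
    let snippetID: UUID      // random per-snippet ID, not tied to a user account
    let deviceClass: String  // e.g. "iPhone" or "HomePod"; no serial number
}

// Select a small fraction of requests for review, attaching no user identity.
// Apple's cited figure puts the graded share under 1 percent of daily requests.
func sampleForGrading(requestAudio: Data, deviceClass: String) -> GradingSnippet? {
    guard Int.random(in: 0..<200) == 0 else { return nil } // roughly 0.5%
    return GradingSnippet(audio: requestAudio,
                          snippetID: UUID(), // fresh random ID per snippet
                          deviceClass: deviceClass)
}
```

The point of the random ID in a scheme like this is that a grader can judge whether Siri heard a snippet correctly without any way to connect it back to an account, which is the kind of anonymization Apple describes.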

The contractor claimed that the audio snippets could contain personal information, audio of people having sex and other details like finances that could be identifiable, regardless of the process Apple uses to anonymize the records.

They also questioned how clear it was to users that their raw audio snippets may be sent to contractors to evaluate in order to help make Siri work better.

When this story broke, I dipped into Apple’s terms of service myself and, though there are mentions of quality control for Siri and of data being shared, I found that they fell short of explicitly and plainly making clear that live recordings, even short ones, are used in the process and may be transmitted and listened to.

The figures Apple has cited put the share of queries that may be selected for grading at under 1% of daily requests.

The process of taking a snippet of audio a few seconds long and sending it to either internal personnel or contractors to evaluate is, essentially, industry standard. Audio recordings of requests made to Amazon and Google assistants are also reviewed by humans.

An explicit way for users to agree to the audio being used this way is table stakes in this kind of business. I’m glad Apple says it will be adding one.

It also aligns better with the way that Apple handles other data, like app performance data that can be used by developers to identify and fix bugs in their software. Currently, when you set up your iPhone, you must give Apple permission to transmit that data.
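As a rough sketch of that opt-in pattern, a quality-control upload could be gated on a stored consent flag, along these lines; the key name and function here are hypothetical, not a real Apple API.

```swift
import Foundation

// Hypothetical consent key; a real implementation would set this during
// device setup, the way the app-performance-data permission works today.
let gradingConsentKey = "didConsentToSiriGrading"

func uploadSnippetIfPermitted(_ snippet: Data) {
    // UserDefaults.bool(forKey:) returns false when the key was never set,
    // so the default is no upload until the user explicitly opts in.
    guard UserDefaults.standard.bool(forKey: gradingConsentKey) else {
        return // no consent recorded: keep the audio on the device
    }
    // ... transmit the anonymized snippet for grading ...
}
```

The design choice worth noting is the default: absent an explicit opt-in, nothing leaves the device, which is exactly what an opt-in model implies.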

Apple has embarked on a long campaign of positioning itself as the most privacy-conscious of the major mobile firms, and it therefore carries a heavier burden when it comes to standards. Doing only as much as the other major companies do when it comes to using user data for quality control and service improvements cannot be enough if Apple wants to maintain that stance and the market edge it brings.

The Tim Cook-led company told The Guardian, which broke the initial story, that it would not restart the program until it had thoroughly reviewed the practice, and it committed to letting users opt out of the quality-assurance mechanism altogether.

The Guardian report also revealed that third-party contractors working on Siri regularly heard confidential user information and could pinpoint users’ locations.

As part of the Siri grading program, third-party contractors listen to user recordings. In many cases, due to false triggers, especially on the Apple Watch and HomePod, these contractors ended up listening to users’ private conversations, which revealed medical information, drug deals and more.

While Google and Amazon also employ third-party contractors to work on their voice assistants, which involves listening to user recordings, both offer users an option to opt out of their recordings being used for grading purposes. Despite its focus on user privacy, Apple has so far offered no such option for Siri.

Our Take
It seems to have become a theme with Apple: the company is first caught doing something it should not, then it apologises and rolls out a software update to fix the issue.
