
Apple suspends listening to Siri queries amid privacy outcry

[Photo: Apple Chief Executive Tim Cook talks about the company's Siri feature during an event in San Francisco to unveil a new iPad. (Jeff Chiu / Associated Press)]

Apple Inc. said Thursday that it is suspending its global internal program for “grading” a portion of user Siri commands after some consumers raised concerns about the program.

The Cupertino, Calif.-based technology giant employs people who listen to less than 1% of Siri commands in order to improve the voice-based digital assistant. Concerns about technology companies listening to and analyzing what users say to voice assistants intensified after Bloomberg News reported earlier this year that Amazon.com Inc. and Apple had teams reviewing recordings. Last week, the Guardian reported that Apple contractors said they often hear recordings of sex, drug deals and confidential medical information.

“We are committed to delivering a great Siri experience while protecting user privacy,” Apple said in a statement. “While we conduct a thorough review, we are suspending Siri grading globally. Additionally, as part of a future software update, users will have the ability to choose to participate in grading.”


The company’s move comes the same week that a German regulator temporarily stopped Google employees and contractors from transcribing home assistant voice recordings in the European Union after whistle-blowers said some recordings contained sensitive information. A Hamburg agency said Aug. 1 that Google agreed to a three-month stoppage while it investigates whether the practice complies with the EU’s General Data Protection Regulation.

Bloomberg News reported in April that Amazon employs thousands of people around the world who listen to voice recordings captured by its line of Echo speakers in owners’ homes and offices, with the goal of improving its Alexa digital assistant. The recordings are transcribed, annotated and then fed back into the software as part of an effort to eliminate gaps in Alexa’s understanding of human speech and help it better respond to commands.

Apple’s Siri also has human helpers, who work to gauge whether the digital assistant’s interpretation of requests lines up with what the person said. The recordings they review lack personally identifiable information and are stored for six months tied to a random identifier, according to an Apple security white paper. The company doesn’t disclose directly in iOS that it uses a portion of recordings for its grading process, but the practice is described in the white paper.
