Apple contractors reportedly listen to conversations that people have with the company’s digital assistant Siri on a regular basis in order to improve it. These recordings include confidential medical information, drug deals and couples having sex, according to a report that has surfaced online.
The report by The Guardian says that the company does not openly disclose this in its consumer-facing privacy documentation, yet it sends a portion of Siri recordings to contractors working for Apple around the world. The report adds, “They are tasked with grading the responses on a variety of factors, including whether the activation of the voice assistant was deliberate or accidental, whether the query was something Siri could be expected to help with and whether Siri’s response was appropriate.”
For its part, the company said that it uses the recordings to better understand and recognise what users say. However, Apple did not openly admit that the recordings are heard by its contractors.
Speaking to The Guardian, Apple explained, “A small portion of Siri requests are analysed to improve Siri and dictation. User requests are not associated with the user’s Apple ID. Siri responses are analysed in secure facilities and all reviewers are under the obligation to adhere to Apple’s strict confidentiality requirements.”
In addition, Apple said that less than 1 percent of daily Siri activations are used for grading, and that these recordings are typically only a few seconds long.
It should be noted that Siri can also be triggered by accident when it mishears its ‘wake word’, “Hey Siri”. A whistleblower working for the firm expressed concern about this, saying, “The sound of a zip, Siri often hears as a trigger.” The voice assistant can be accidentally activated in other ways as well. For example, if an Apple Watch detects that it has been raised and then hears speech, Siri activates without any deliberate human input.
The whistleblower further added, “There have been countless instances of recordings featuring private discussions between doctors and patients, business deals, seemingly criminal dealings, sexual encounters and so on. These recordings are accompanied by user data showing location, contact details, and app data.”
Apple says in its privacy documents that the Siri data “is not linked to other data that Apple may have from your use of other Apple services”.
“The regularity of accidental triggers on the watch is incredibly high,” the whistleblower said. “The watch can record some snippets that will be 30 seconds – not that long but you can gather a good idea of what’s going on.”
Taken together, this raises serious privacy concerns: the digital assistant may be activated accidentally, capture data that users consider confidential, and pass those recordings on to contractors.
Apple is not the only company to use human reviewers for its voice assistant. A few months ago, Amazon was reported to employ staff who listen to some Alexa recordings, and Google has also reportedly employed staff to provide human oversight of Google Assistant.