Apple contractors are listening to your Siri recordings

Apple hires contractors and pays them to listen to your Siri recordings, according to a new report from The Guardian. A former contractor revealed that workers have heard accidental recordings of users’ personal lives, including sexual encounters, doctor’s appointments, addresses, and even possible drug deals. He also said that Siri interactions are sent to workers, who listen to the recordings and are asked to grade them on a variety of factors, such as whether the request was intentional or a false positive that accidentally triggered Siri, and whether the response was helpful.

Although Apple does not explicitly disclose this in its consumer-facing privacy documentation, a small proportion of Siri recordings are passed on to contractors working for the company around the world, who are tasked with grading the responses. Apple said:

“A small portion of Siri requests are analysed to improve Siri and dictation. User requests are not associated with the user’s Apple ID. Siri responses are analysed in secure facilities and all reviewers are under the obligation to adhere to Apple’s strict confidentiality requirements.”

The graders assess, among other things, whether the activation of the voice assistant was deliberate or accidental, whether the query was something Siri could be expected to help with, and whether Siri’s response was appropriate. That humans listen to voice assistant recordings isn’t exactly surprising news in general: both Amazon (for Alexa) and Google (for Assistant) have been revealed to run similar systems in which contractors listen to recorded conversations to improve the service.

This is understandable, since smart assistants can’t yet tell the difference between false positives and actual queries on their own, and anyone who has used one can tell you that false positives are still very common at this stage of their evolution. Apple’s listening system, however, might be more concerning because of the pervasiveness of Apple products. While Alexa is largely limited to smart speakers and Google Assistant to speakers and phones, Siri is also on Apple’s hugely popular Apple Watch, which is on millions of people’s wrists every waking moment.


Also, Siri on the Apple Watch activates any time a user raises their wrist, not just when it thinks it has heard the “Hey Siri” wake phrase. Per The Guardian, Google and Amazon both allow customers to opt out of some uses of their recordings, but Apple offers no similar privacy-protecting option outside of disabling Siri entirely. That is ethically questionable, given that Apple has built so much of its reputation on selling itself as the privacy-focused company that defends your data in ways Google and Amazon don’t.

Short of giving up smart assistants entirely, there isn’t much you can do beyond being careful about what you say around iPhones and HomePods. That doesn’t mean you are at serious risk of privacy violations just by using Siri; it’s just that Apple has room for improvement in how it handles inadvertent audio captures.