Apple regularly sends contractors recordings captured by Siri, its voice assistant, of all kinds – including explicit encounters and confidential medical information – for grading, internal quality control, and performance-improvement purposes, according to a recent report in The Guardian attributed to a whistleblower.
Siri, like Google Assistant, has been easing the lives of millions who talk, shout, or chat with ‘her’ as a personal assistant at their beck and call – a sort of ‘Man Friday.’ Because of its sound sensitivity, it can be triggered by any noise within hearing distance, even a rustle, and then record whatever sounds or conversations follow. This happens without the iPhone owner knowing about it, which is why the whistleblower raised the red flag and blew the lid off the practice.
Siri’s Recordings Analyzed by Contractors to Improve its Performance
The Guardian initially contacted Apple over the issue and was told that it is routine for Siri to record voice data and have it analyzed. Over time, this exercise is expected to improve Siri’s performance – and with it the user experience, with the voice assistant offering better and more prompt answers.
Apple says the data:
“is used to help Siri and dictation … understand you better and recognize what you say.” But, as The Guardian noted, “the company does not explicitly state that that work is undertaken by humans who listen to the pseudonymized recordings.”
Apple further told the newspaper:
“A small portion of Siri requests are analyzed to improve Siri and dictation. User requests are not associated with the user’s Apple ID. Siri responses are analyzed in secure facilities, and all reviewers are under the obligation to adhere to Apple’s strict confidentiality requirements.”
The company added that only a very small random subset – less than 1% of daily Siri activations – is used for grading, and the recordings used are typically only a few seconds long.
Siri’s Recordings by ‘Oversight’ May Compromise User Identity & More
Even as the whistleblower fears adverse action on account of the ‘revelation’ he made to The Guardian, he stated:
“There have been countless instances of recordings featuring private discussions between doctors and patients, business deals, seemingly criminal dealings, sexual encounters and so on. These recordings are accompanied by user data showing location, contact details, and app data.”
The source also made very clear that:
“It wouldn’t be difficult to identify the person that you’re listening to, especially with accidental triggers – addresses, names and so on.” Further, “If there were someone with nefarious intentions, it wouldn’t be hard to identify [people on the recordings].”
Now, this rings alarm bells. The whistleblower says that Apple should inform users about this ‘oversight’ – the iPhone switching itself into recording mode on its own.
‘Liaison’ with ‘Technology’ Makes Us Victims for Someone’s Monetary Gains
In today’s technology landscape, there are people and companies that mine data and grade it to serve their own purposes and ends. Apple, Amazon, Google, and countless unnamed others take technology seriously in order to improve and enhance ordinary lives. But the blunt question is: should these technologies be allowed to make anyone a ‘scapegoat’ or a ‘guinea pig’ in situations like the ones above?
Eavesdropping is a dangerous activity, whoever indulges in it. Invading someone’s personal preferences, tastes, or behaviors to serve the monetary interests of a few, under the pretext of helping the masses, is outrageous.
So, what says Siri? ‘Siri does more than ever. Even before you ask,’ sayeth Apple!
Will you still agree that Siri is ‘smart,’ with an act or two up her sleeve to surprise you after a few ‘liaisons’ with ‘her’?