Apple is resuming human reviews of Siri audio commands and dictation with the latest iPhone software update, after suspending the practice for several weeks. If you recall, Apple halted the reviews in August and apologised for the way it had used human contractors to listen to the audio. The practice is common across the tech industry: nearly every major player, including Google, Amazon, and Microsoft, has been found using human contractors to review audio from its digital voice assistant. Even so, the revelations undermined Apple’s attempts to position itself as a trusted steward of privacy. Apple CEO Tim Cook has repeatedly asserted the company’s belief that “privacy is a fundamental human right,” a phrase that cropped up again in Apple’s apology.
When installing the iOS 13.2 software update, users are prompted to opt in to audio storage and review, and can choose “Not Now” to decline. Those who opt in can turn the feature off later in Settings. Apple also specifies that Siri data is not associated with a user’s Apple ID. The major tech companies engaged in this practice, including Amazon, Google, and Microsoft, say it helps them improve their artificial intelligence services. However, the use of humans to listen to audio recordings is particularly troubling to privacy experts, because it increases the chances that a rogue employee or contractor could leak details of what was said, including parts of sensitive conversations.
Previously, Apple said it planned to resume human reviews this fall, without specifying an exact timeline. Back then, Apple also said it would stop using contractors for the reviews; whether contractors are now fully out of the loop remains to be seen. Meanwhile, other tech companies have also been resuming the practice after giving users more notice. Google restarted its reviews in September, after taking similar steps to ensure people know what they are agreeing to. That same month, Amazon said Alexa users could request that recordings of their voice commands be deleted automatically.
(With agency inputs)