
Apple Says It's Working To Improve Siri's Privacy Protections

Written By Tech Desk | Mumbai

Highlights:

  • Apple will make changes to how its virtual assistant Siri collects private user data, in a bid to ensure customers' safety and privacy

Apple said it would make specific changes to how its virtual assistant Siri collects private user data, in a move aimed at protecting customers' safety and privacy. Apple also reiterated that it has suspended the Siri grading program. The company plans to resume the program sometime this fall, after releasing software updates to users, but it said it would make changes to the program before rolling out those updates.

Apple to ensure Siri's privacy

Although Apple will no longer retain audio recordings of Siri interactions, it will continue to use computer-generated transcripts to improve the quality of its voice assistant. Apple also said that users can give their consent to help Siri improve by letting it learn from audio samples of their requests, and that users will be able to opt out of the Siri grading program.


Apple puts an end to the Siri grading program

Apple has ensured that only its own employees will be allowed to listen to audio samples of Siri interactions from users who opt in, and it will delete any recording that appears to be an unintentional trigger of Siri.

"At Apple, we believe privacy is a fundamental human right. We design our products to protect users’ personal data, and we are constantly working to strengthen those protections. This is true for our services as well. Our goal with Siri, the pioneering intelligent assistant, is to provide the best experience for our customers while vigilantly protecting their privacy," Apple said.


Last month, there were reports that human contractors listen to private Siri recordings to improve the quality of Siri. The reports highlighted that Apple uses certain audio recordings of its users in an attempt to help Siri better understand what users have to say. These recordings reportedly included confidential medical information, drug deals and, in some cases, recordings of people engaging in sexual acts.

Apple ended its grading program days after a report in The Guardian revealed that Apple's human contractors listened to at least 1 per cent of users' conversations with the Siri voice assistant, without the users' consent.
