Updated August 6th, 2021 at 14:51 IST

Apple to scan iPhones, iPads to detect images of child sex abuse; privacy concerns raised

On devices like iPhones, iPads and Macs, the feature against CSAM will check iMessage, iCloud, search and Siri to prevent child abuse and warn users if such content is found.

Reported by: Shikhar Mehrotra
IMAGE: UNSPLASH

Apple is reportedly set to debut a new cryptographic system that will scan images saved in iCloud to check them against child sexual abuse material, or CSAM. However, the new feature is being criticised on privacy grounds, as the capability will allow Apple to review user files both in iCloud and on user devices. Keep reading to know more about the new feature and why it is concerning experts.

Apple to search for CSAM content in iCloud and user devices 

A new set of measures against child sexual abuse material is to be implemented by Apple. The cryptographic system will be integrated within iPhones, Macs and iPads, and will check images uploaded to iCloud against known CSAM in the US. The process will take place in two steps: first on the device, and second on the servers that store the data. Suspicious images will be reported to the National Center for Missing and Exploited Children and may ultimately fall into the hands of US law enforcement.
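The matching step can be pictured as comparing a fingerprint of each uploaded image against a database of known fingerprints. The toy Python sketch below uses an ordinary SHA-256 digest purely for illustration; Apple's actual system uses a perceptual "NeuralHash", which matches visually similar images rather than byte-identical files, and the function and database names here are hypothetical:

```python
import hashlib

# Hypothetical database of fingerprints of known CSAM images.
# The entry below is just the SHA-256 digest of b"abc", standing in
# for a real database entry for demonstration purposes.
KNOWN_HASHES = {
    "ba7816bf8f01cfea414140de5dae2223b00361a396177a9cb410ff61f20015ad",
}

def fingerprint(image_bytes: bytes) -> str:
    """Compute a fingerprint of an image before it is uploaded."""
    return hashlib.sha256(image_bytes).hexdigest()

def check_upload(image_bytes: bytes) -> bool:
    """Return True if the image matches a known fingerprint and should be flagged."""
    return fingerprint(image_bytes) in KNOWN_HASHES

# An unrelated image produces no match.
print(check_upload(b"holiday photo"))  # False
# A byte-for-byte copy of a database image matches.
print(check_upload(b"abc"))  # True
```

In the real system, the on-device match result is itself encrypted, so Apple's servers only learn about matches once a threshold number of them accumulates in an account.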

On devices, the feature against CSAM will check iMessage, iCloud, search and Siri to prevent child abuse. The feature will be available as an opt-in setting in iCloud family accounts, using machine learning to detect nudity in images sent through iMessage. If any such image is detected, the system can block it from being shared. Additionally, the system will display warnings on the device and may alert parents when the Apple account associated with their child interacts with such images. Similarly, Siri and Apple's search will display warnings if a user searches for anything related to child sexual abuse.

Apple's new security system for CSAM

Experts are concerned about the system being used to scan user devices

While the feature raises concerns about users' privacy, Apple says it will use technical safeguards to avoid flagging any image that is not CSAM. The security system was designed in collaboration with Dan Boneh, a Stanford University cryptographer. However, users and other experts argue that in order to detect CSAM, the system must scan all of a user's images, thereby intruding on their privacy. Some critics also say that Apple's new feature is a form of surveillance and might be exploited by authorities to gain access to a user's personal data. Read what different experts have to say about the new feature.

IMAGE: APPLE


Published August 6th, 2021 at 14:51 IST