Updated August 7th, 2021 at 17:53 IST

Apple to scan US iPhones for child abuse images

Apple plans to scan U.S. iPhones and other devices for images of child sexual abuse, drawing applause from child protection groups but raising concern among some security researchers that the system could be misused.

The tool Apple calls "neuralMatch" will detect known images of child sexual abuse by scanning images on iPhones, iPads and Mac computers before they are uploaded to iCloud.

If it finds a match, the image will be reviewed by a human.

If child pornography is confirmed, the user's account will be disabled and the National Center for Missing and Exploited Children notified.

Separately, Apple plans to scan users' encrypted messages for sexually explicit content as a child safety measure, a move that has also alarmed privacy advocates.

The detection system will only flag images that are already in the center's database of known child pornography.

Parents snapping innocent photos of a child in the bath presumably need not worry.

Apple also said that its software would "intervene" when users try to search for topics related to child sexual abuse.

In order to receive the warnings about sexually explicit images on their children's devices, parents will have to enroll their child's phone.

Kids over 13 can unenroll, meaning parents of teenagers won't get notifications.

Apple said neither feature would compromise the security of private communications.

But researchers say the matching tool — which doesn't "see" such images, just mathematical "fingerprints" that represent them — could be put to more nefarious purposes.

That includes government surveillance of dissidents or protesters.
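In general terms, fingerprint matching of this kind reduces each image to a fixed-length value and checks that value against a database of fingerprints derived from known abuse images; only a database hit is ever flagged. The short Python sketch below illustrates that basic idea. It is purely illustrative and not Apple's implementation: the database contents, function names and the use of a cryptographic hash are assumptions, and a production system would instead rely on a perceptual hash computed on-device so that resized or re-encoded copies of the same picture still match.

```python
# Illustrative sketch only: NOT Apple's algorithm. The names, the example
# fingerprint and the use of SHA-256 are hypothetical, chosen to show the
# general idea of matching images against a database of known fingerprints.
import hashlib

# Hypothetical database of fingerprints of known abuse images,
# supplied by a child-protection organization.
KNOWN_FINGERPRINTS = {
    "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b",
}

def fingerprint(image_bytes: bytes) -> str:
    """Return a fixed-length fingerprint of the image's raw bytes.

    A real system would use a perceptual hash so that visually similar
    copies still match; SHA-256 here only matches byte-identical files.
    """
    return hashlib.sha256(image_bytes).hexdigest()

def should_flag_for_review(image_bytes: bytes) -> bool:
    """Flag an image only if its fingerprint is already in the database."""
    return fingerprint(image_bytes) in KNOWN_FINGERPRINTS

if __name__ == "__main__":
    # An arbitrary photo does not match the database, so nothing is flagged.
    print(should_flag_for_review(b"family vacation photo bytes"))  # False
```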

Tech companies including Microsoft, Google, Facebook and others have for years been sharing digital fingerprints of known child sexual abuse images.

Apple has used those fingerprints to check for child pornography among user files stored in its iCloud service, which is not as securely encrypted as its on-device data.

Apple has been under government pressure for years to allow for increased surveillance of encrypted data.

Coming up with the new security measures required Apple to perform a delicate balancing act between cracking down on the exploitation of children and keeping its high-profile commitment to protecting the privacy of its users.

 


Published August 7th, 2021 at 17:53 IST