Will Cathcart, the head of Facebook’s WhatsApp messaging app, called out Apple over its latest decision to scan iPhones for child abuse images in a Twitter thread on August 6. In one tweet on Friday, he said, “I read the information Apple put out yesterday and I'm concerned. I think this is the wrong approach and a setback for people's privacy all over the world. People have asked if we'll adopt this system for WhatsApp. The answer is no.”
Cathcart’s criticism came after Apple announced a plan to release software that could search for and detect child sexual abuse material (CSAM) on the phones of United States users, enabling human reviewers to alert the authorities to potential illegal activity. The WhatsApp chief began the thread by denouncing child abuse itself, saying, “Child sexual abuse material and the abusers who traffic in it are repugnant."
Cathcart said that WhatsApp has worked to report and ban those who traffic in CSAM without breaking encryption or compromising the privacy of its users. He also said that Apple’s software would be able to “scan all of a user's private photos on your phone - even photos you haven't shared with anyone."
We've worked hard to ban and report people who traffic in it based on appropriate measures, like making it easy for people to report when it's shared. We reported more than 400,000 cases to NCMEC last year from @WhatsApp, all without breaking encryption. https://t.co/2KrtIvD2yn — Will Cathcart (@wcathcart) August 6, 2021
Apple has long needed to do more to fight CSAM, but the approach they are taking introduces something very concerning into the world. — Will Cathcart (@wcathcart) August 6, 2021
Instead of focusing on making it easy for people to report content that's shared with them, Apple has built software that can scan all the private photos on your phone -- even photos you haven't shared with anyone. That's not privacy. — Will Cathcart (@wcathcart) August 6, 2021
In a statement to Business Insider, an Apple spokesperson disputed several of Cathcart’s claims, saying that the new software would only detect child sexual abuse material in iCloud Photos, which users can disable at any time. The spokesperson also said that the image hashes of CSAM, digital markers that algorithms use to identify matching images, were provided exclusively by the National Center for Missing and Exploited Children (NCMEC).
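The hash-matching idea at the centre of the dispute can be illustrated with a much-simplified sketch. The snippet below uses a basic "average hash" on a tiny grayscale image; Apple's actual NeuralHash algorithm and its private matching protocol are far more sophisticated, and every name and value here is purely illustrative:

```python
# Illustrative sketch of perceptual-hash matching (NOT Apple's NeuralHash).
# A perceptual hash maps visually similar images to similar bit strings,
# so near-duplicates of a known image can be flagged without storing the
# image itself.

def average_hash(pixels):
    """Hash a flat grayscale image (list of ints 0-255) into an integer.
    Each bit records whether a pixel is brighter than the image's mean."""
    mean = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits

def hamming_distance(h1, h2):
    """Count differing bits; a small distance suggests similar images."""
    return bin(h1 ^ h2).count("1")

# Hypothetical database of hashes of known images (a stand-in for the
# NCMEC-provided list mentioned in the article).
known_hashes = {average_hash([10] * 32 + [200] * 32)}

# A slightly altered copy of the same image (one pixel brightened)
# still matches within a small distance threshold.
candidate = average_hash([10] * 31 + [150] + [200] * 32)
match = any(hamming_distance(candidate, h) <= 5 for h in known_hashes)
# match is True: the altered image differs from the known hash by 1 bit.
```

Cryptographic hashes such as SHA-256 would not work here, since changing a single pixel produces a completely different hash; perceptual hashing is what makes near-duplicate detection possible, and is also what privacy researchers scrutinise for false matches.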
Earlier, The Information reported that Facebook had recently hired a team of researchers to study ways to analyse data without decrypting it. The research would reportedly allow the social media giant to collect user data for targeted advertising without reading encrypted messages shared between users or handing them over to advertisers. Apple, meanwhile, has made privacy a selling point for its products and services. Its latest announcement nonetheless drew criticism from data privacy experts, who are reportedly concerned about the long-term implications of such intrusive technology, including the possibility of government exploitation.