Updated November 30th, 2022 at 07:13 IST

Twitter has only 1 person left on APAC's child safety team, issue was Musk's #1 priority

Four employees on the child safety team were based in Twitter’s Asian headquarters in Singapore, but only one remains as a full-time member of staff handling CSAM content.

Reported by: Zaini Majeed

Only one employee remains on the Twitter team that reviews child sexual abuse material (CSAM) across Japan and the Asia-Pacific region. This has brought into focus how CEO Elon Musk plans to tackle the removal of such content, which he has called his “number one priority” since acquiring the company. Wired found on LinkedIn that Twitter previously had at least four people focused on child safety in the APAC region; by November only three remained, and just one is now reported to be working on the team.

The employees were based in Singapore, Twitter’s Asian headquarters. Sources reportedly say that leaving the job to a single full-time member of staff in the Asia-Pacific region looks like a daunting response to the massive volume of CSAM that circulates on Twitter.

The Asia-Pacific region is home to an estimated 4.3 billion people, roughly 60% of the world’s population, many of whom use the microblogging site. Japan, for instance, is second only to the United States in its number of Twitter users, put at a whopping 59 million, according to Statista’s data.

Singapore team coordinated with agencies in UK, US to 'identify CSAM content'

As CEO Elon Musk laid off 50% of Twitter's 8,000 employees, the Singapore office also saw heavy attrition. It remains unclear at this time how a single employee can review child sexual abuse content on a platform as vast as Twitter. It is being reported, however, that the three hashtags most used to sell child sexual exploitation (CSE) content were eliminated from the platform shortly after Musk became CEO of Twitter Inc. A direct option to instantly report tweets containing images or videos pertaining to the “sexual exploitation of children” was also added.

Critics, however, question Twitter's inadequate focus on maintaining an in-house team working toward child safety. The Singapore team, which has now shrunk to just one person, also reviewed other major markets, including Japan. It coordinated with foreign agencies such as the Internet Watch Foundation (IWF) in the UK and the National Center for Missing and Exploited Children (NCMEC) in the US to identify potential CSAM on the platform. The agencies now reportedly claim that content matching their data could be automatically wiped from the company’s systems without human intervention. “This ensures that the shielding process is as efficient as possible,” Emma Hardy, IWF communications director, was quoted as saying.


Published November 30th, 2022 at 07:13 IST