Facebook Tightens Policies To Prevent Self-harm, Suicide Images



Written By Tech Desk | Mumbai

Facebook is preventing self-harm and suicide content from spreading on the platform. The company said it has made several changes to the way it handles content related to self-harm and suicide, and that it has been working with experts around the world to understand how such content affects the users who interact with it. Beyond suicide and self-harm material, Facebook is also restricting content related to eating disorders. Meanwhile, Facebook continues to place a sensitivity screen over potentially harmful content to avoid unintentionally promoting self-harm.


Facebook statement

"We tightened our policy around self-harm to no longer allow graphic cutting images to avoid unintentionally promoting or triggering self-harm, even when someone is seeking support or expressing themselves to aid their recovery. On Instagram, we’ve also made it harder to search for this type of content and kept it from being recommended in Explore," said Antigone Davis, Facebook's Global Head of Safety.


What does it mean?

As a result of the changes to its policies on suicide content, Facebook will no longer allow graphic images of self-harm on its platform. The announcement comes amid criticism of how social media companies moderate violent and potentially dangerous content online. Facebook is also making self-harm-related content harder to search for on Instagram and will ensure it does not appear as a recommendation in Instagram's Explore section. Facebook's statement comes on World Suicide Prevention Day.


Facebook has a team of moderators who monitor content such as live-streamed violent acts and suicides. Facebook has been working with at least five third-party companies in at least eight countries on content review, according to Reuters. Governments around the world have been working to improve control over dangerous content on social media websites, as well as the spread of online pornography and election propaganda.

