YouTube To Remove All Anti-vaccine Misinformation From Video-sharing Platform

YouTube had placed a restriction on Covid vaccination disinformation videos last year, resulting in the removal of 130,000 pieces of content


Image: Unsplash

The video-streaming company YouTube has announced that it will remove videos and content conveying misinformation about any authorised vaccine, extending its complete ban on misleading claims about the Covid-19 vaccines. The company further stated that videos claiming authorised vaccines are harmful, cause autism, cancer or infertility, or do not prevent disease transmission will be removed from the site. 

The Guardian reported that from Wednesday, September 29, YouTube will remove content that makes false claims about any approved vaccine, and that the accounts of anti-vaccine influencers will be terminated as part of the policy. According to CNBC, a YouTube spokesman verified that pages affiliated with high-profile disinformation spreaders, including Joseph Mercola, Erin Elizabeth, Sherri Tenpenny and the Children's Health Defense Fund, which is linked to Robert F. Kennedy Jr., were deleted as part of the takedown. 

Under its earlier guidelines, videos that conveyed disinformation about non-Covid vaccines or encouraged vaccine hesitancy were effectively hidden from view. YouTube had placed a restriction on Covid vaccination disinformation videos last year, resulting in the removal of 130,000 pieces of content. Since the start of the Covid outbreak, YouTube, which is owned by Google, has deleted a total of 1 million videos for spreading general Covid misinformation. 

Regulations against anti-vaccine misinformation content

The company declared in a blog post that misleading statements about Covid vaccinations have spread misconceptions about vaccines in general. It added that it is expanding its medical misinformation policies with new guidelines covering currently administered vaccines that have been confirmed as safe and effective by local health authorities and the WHO.  

Personal vaccination testimonials, vaccine policy information, new vaccine studies and historical footage of vaccine triumphs and failures will be permitted to stay on the site, according to the company. YouTube's global head of trust and safety, Matt Halprin, stated that disinformation and misinformation about the measles, mumps and rubella (MMR) vaccine, which has been falsely linked to autism, is an example of the content YouTube will tackle.  

Since the Covid-19 outbreak, social media platforms have been criticised for failing to do more to combat misleading health information. US President Joe Biden said in July that social networking sites were largely to blame for people's scepticism about vaccinations because they disseminate disinformation, and he urged them to address the problem. 

The decision follows a similar restriction Facebook implemented in February, targeting misleading claims that vaccinations are ineffective or cause autism; that firm has had difficulty enforcing it. Similarly, Twitter declared in March that individuals who repeatedly spread vaccination disinformation would be removed from the network. 

Tags: YouTube, Google, WHO