YouTube Issues Guidelines On Removing Misleading Videos On US, Germany Elections

Google-owned video platform YouTube issued new guidelines prohibiting the distribution of fraudulent content that casts doubt on election results.


Image: Unsplash/AP

Google-owned American video sharing and social media platform YouTube has issued new guidelines prohibiting the distribution of fraudulent content that casts doubt on election results, including the recent German federal elections and the US presidential election. Under the new rules, content advancing false claims of widespread fraud, errors, or malfunctions that could have changed the outcome of an election is prohibited. The platform will also remove fraudulent content that seeks to delegitimize the outcome of the German parliamentary elections or the formation of a new government. However, YouTube may allow a video to remain if the controversial claims are refuted in the clip or presented with additional context. Content that violates the rules will be removed, and a channel will be terminated if its user breaches the rules three times.

On Monday, September 27, YouTube CEO Susan Wojcicki voiced hope that the platform would continue to operate in Russia after the service allegedly ignored requests from Russian media watchdog Roskomnadzor to remove videos containing misleading information about COVID-19, Sputnik reported. Last month, YouTube halted payments to content providers on at least 14 channels in Brazil over allegedly bogus, unauthentic news and disinformation about the country's forthcoming general elections. In July, it also took down videos posted by Brazilian President Jair Bolsonaro that spread misinformation about the coronavirus outbreak. According to The Guardian, the tech giant's decisions were based on its content regulations, not on ideology or politics.

YouTube removed over 1 million videos spreading misinformation about COVID-19

Since February 2021, YouTube has taken down over 1 million COVID-19 videos in a strong push to combat misinformation. According to YouTube Chief Product Officer Neal Mohan, the removed videos spread misinformation such as fraudulent remedies or claims that the pandemic is a hoax. Last month, Mohan published a blog post announcing the platform's "holistic approach" to combating the spread of "hazardous misinformation." The company also stated that it relies on expert consensus from health organisations such as the US Centers for Disease Control and Prevention (CDC) and the World Health Organization (WHO) for COVID-related information.
