The spread of fake news and misinformation among social media users has become a major cause of concern for people worldwide. Facebook recently faced heavy scrutiny for failing to curb the spread of misinformation during the COVID-19 pandemic and other major world events. In response, the company has taken a significant step toward controlling the spread of fake messages and misinformation through its app.
Facebook is one of the most popular social media platforms in the world and, just like WhatsApp, it is used by countless people to stay in touch. With so many people on the platform, it often becomes difficult to regulate what information is circulated through the app. Facebook has now taken inspiration from its subsidiary WhatsApp and limited the number of people a message can be forwarded to.
Facebook revealed in a blog post that it will limit the number of people and groups a message can be forwarded to. The company said it wants to control the spread of misinformation during the pandemic and as countries such as the US and New Zealand head toward major elections. Here is what it wrote in the blog:
“As a part of our ongoing efforts to provide people with a safer, more private messaging experience, today we’re introducing a forwarding limit on Messenger, so messages can only be forwarded to five people or groups at a time. Limiting forwarding is an effective way to slow the spread of viral misinformation and harmful content that has the potential to cause real world harm.
We believe controlling the spread of misinformation is critical as the global COVID-19 pandemic continues and we head toward major elections in the US, New Zealand and other countries. We’ve taken steps to provide people with greater transparency and accurate information.”
The company also noted in the blog that it wants Messenger to be a “safe and trustworthy platform to connect with friends and family”. The app had earlier introduced features like “safety notifications, two-factor authentication, and easier ways to block and report unwanted messages”. Facebook believes the new forwarding limit will add another layer of protection “by limiting the spread of viral misinformation or harmful content”.