Published 17:51 IST, September 10th 2024
YouTube Changes Algorithm To Protect Teens From Harmful Health Content
YouTube decides to change algorithm to prevent youth from accessing harmful health content.
In a decisive move to shield adolescents from potentially damaging content, YouTube has announced it will no longer recommend videos to users aged 13 to 17 that promote specific physical ideals or fitness regimens. This policy update addresses growing concerns from experts who have highlighted the risks of repeated exposure to such content. Although teens can still access these videos, YouTube’s algorithms will now avoid suggesting related material to them.
What is YouTube's rationale behind this decision?
Dr. Garth Graham, YouTube’s global head of health, articulated the rationale behind this policy change: “As teens navigate their self-identity and personal standards, continual exposure to idealized physical norms can cultivate unrealistic self-expectations, which may lead to negative self-perceptions.”
The revised guidelines specifically target content that glorifies certain physical attributes, promotes specific exercise routines, or encourages social aggression. While a single video may not pose significant harm, repeated exposure to such content has been deemed potentially detrimental. This policy is now in effect globally, including in the UK and the US, and reflects the recent Online Safety Act in the UK, which requires tech companies to protect young users from harmful content and scrutinize the potential impacts of their algorithms.
The need for this swift action is further justified by the risk that frequent exposure to content idealizing unhealthy standards or behaviors can reinforce problematic messages, influencing how teens perceive themselves.
Updated 17:51 IST, September 10th 2024