YouTube's new policy will protect teens from body image content
YouTube has announced a significant change in its content recommendation system for teenagers. The platform will no longer suggest videos that glorify certain fitness levels, body weights, or physical attributes. This decision comes after experts warned about the potential harm of such content when viewed repeatedly by young users. Although these videos do not violate YouTube's guidelines, their frequent viewing could negatively impact some users' well-being.
New guidelines aim to protect under-18s
The new guidelines, which are now in effect globally, apply to content that idealizes certain physical features or promotes social aggression. For example, beauty routines aimed at making one's nose appear slimmer or exercise routines encouraging a specific look will fall under these guidelines. While YouTube will continue to allow teenagers aged 13-17 to access such videos, it will no longer recommend similar content to them afterward.
Global health head underscores potential harm
Dr. Garth Graham, YouTube's Global Head of Health, emphasized the potential harm of content related to fitness or physical appearance. He stated that "As a teen is developing thoughts about who they are and their own standards for themselves, repeated consumption of content featuring idealised standards that starts to shape an unrealistic internal standard could lead some to form negative beliefs about themselves." This change is part of YouTube's ongoing efforts to create a safer online environment for young users.
YouTube adviser highlights importance of 'guardrails' for teens
Allison Briscoe-Smith, a clinician and adviser to YouTube, highlighted the importance of these "guardrails." She stated that repeated exposure to content idealizing unhealthy standards or behaviors could reinforce potentially problematic messages, affecting how some teens view themselves. The new guidelines are being introduced globally. However, the changes are particularly relevant in the UK, where the newly introduced Online Safety Act requires tech firms to shield children from harmful content and to consider how their algorithms might expose under-18s to damaging material.