Shocking! Suicide content served in YouTube Kids videos
YouTube Kids, the video-streaming platform dedicated to children, is facing flak for hosting disturbing content, including videos showing how to commit suicide. The content was discovered over the summer by Free Hess, a Florida-based pediatrician and mother, who has flagged the clips on her blog in a bid to get them removed. Here's more on the matter.
Several suicide videos found on YouTube Kids
Speaking to the Washington Post, Hess confirmed that several self-harm videos had been spliced into clips related to games - like Minecraft and Nintendo's Splatoon - and cartoons. In one clip, a man can be seen holding an imaginary blade to his arm and saying, "sideways for attention. Longways for results." Other clips depicted incidents like a school shooting, human trafficking, and suicide by stabbing and hanging.
Even YouTube had self-harm videos
Notably, along with YouTube Kids, the standard version of YouTube also hosted some self-harm clips, which were pulled after Hess flagged them on her blog.
This puts young, impressionable minds at risk
Content like this on a platform that is supposed to be 'kid-friendly' can easily put young minds at risk, Hess said while expressing concern over the issue. Nadine Kaslow, a former president of the American Psychological Association, added that videos like these can deeply affect vulnerable children: they could have nightmares about people killing themselves or may even be encouraged to attempt suicide themselves.
What YouTube says on the issue
After the issue was flagged, a spokesperson for YouTube said the company works to ensure its platform is "not used to encourage dangerous behavior". "We have strict policies that prohibit videos which promote self-harm", she said, noting that "every quarter we remove millions of videos and channels that violate our policies and we remove the majority of these videos before they have any views".
Still, this isn't the first case of YouTube's failed content moderation
The news of self-harm videos on YouTube Kids comes just days after it was reported that the platform had been recommending clips of teenagers that hosted exploitative comments from pedophiles. The platform, which hosts billions of videos, has long struggled to moderate dangerous content. Last year, it drew criticism over ElsaGate videos, in which people dressed as cartoon characters performed violent or sexual acts.