Meta neglected children's safety on its platforms, claims whistleblower
Arturo Béjar, a former Meta senior engineer, asserts that the company hasn't done enough to shield young users from harmful content following Molly Russell's tragic death in the UK. Béjar believes that if Meta had learned from this incident, they would have made Instagram safer for teens. He refers to research indicating that 8.4% of 13-15-year-olds on Instagram witnessed self-harm or threats of self-harm in the past week.
Molly Russell's death and inquest
Russell, a 14-year-old Londoner, took her life in 2017 after being exposed to content related to suicide, depression, self-harm, and anxiety on Instagram and Pinterest. A 2022 inquest concluded that Molly's death resulted from self-harm while suffering from depression and the negative impact of online content. Béjar argues that Meta possesses the tools to make Instagram safer for teenagers but has opted not to use them.
A brief look at Béjar's background
At Meta, Béjar served as an engineering director. He worked on child-safety tools designed to help young users cope with harmful content such as bullying. He left the company in 2015 but returned as a consultant in 2019 for two years. In 2023, he testified before the US Congress, describing his experience at the company. He also revealed the "awful experiences" of his teenage daughter and her friends on Instagram, which included harassment as well as unwanted sexual advances.
Béjar's research and recommendations
During his tenure at Meta, Béjar's research revealed that one in eight children aged 13-15 on Instagram encountered unwanted sexual advances, one in five experienced bullying, and 8% saw self-harm content. He urged the company to establish goals for reducing harmful content. Béjar also recommended making it simpler for users to flag inappropriate content, routinely surveying users about their experiences, and streamlining the process for submitting reports on Meta platforms.
Crackdown possible in just 3 months
Béjar claimed that it would take Meta just three months to carry out an effective crackdown on self-harm content. Urging Meta to take the necessary steps, he said, "They have all the machinery necessary to do that. What it requires is the will and the policy decision to say, for teenagers, we're going to create a truly safe environment that we're going to measure and report on publicly."
Links to a US lawsuit
Béjar's research and his attempts to get Meta to act on his recommendations also feature in a lawsuit against the company. Brought by New Mexico Attorney General Raúl Torrez, it alleges that the firm failed to protect children from sexual abuse by predators. Documents show that employees warned Meta it was "defending the status quo" following Russell's death when "the status quo is unacceptable to media, many families and...will be unacceptable to the wider public."
Meta's response and safety initiatives
Meanwhile, a Meta spokesperson emphasized that numerous people within and outside the company are dedicated to keeping young people safe online. They pointed to more than 30 tools and resources introduced to help teens and their families enjoy safe, positive online experiences. These measures include automatically setting the accounts of users under 16 to private on Instagram, preventing adults from messaging teens who don't follow them, and enabling users to report harassment, bullying, and sexual activity.