Meta platforms failing to stop child abuse content, says report
Meta has reportedly spent months addressing child safety issues on its platforms, Instagram and Facebook, yet it is still struggling to stop its own systems from promoting an extensive network of accounts associated with pedophilia. The social media giant set up a child-safety task force in June, but in its latest report, The Wall Street Journal noted that Meta is still unable to stop large groups from using Facebook and Instagram to create and share child abuse content.
Previous examples and inadequate response
Earlier this year, an investigation conducted by WSJ in partnership with the Canadian Centre for Child Protection (C3P) showed how Meta's recommendations led users to Facebook Groups, Instagram hashtags, and accounts that promote child exploitation material. For instance, the C3P found a network of Instagram accounts, each with up to 10 million followers, that continued to livestream child sex abuse videos even after being reported. Meta also allegedly declined at first to act on a user report about a public-facing Facebook Group.
Meta's latest efforts to improve child safety
Meta is facing growing scrutiny over child safety on its platforms, including from European Union (EU) regulators. To address these issues, Meta says it has expanded the list of child safety-related terms, phrases, and emojis its systems can detect and is using machine learning to discover new search terms that child predators could exploit. The company added that it is improving internal systems to identify "potentially suspicious adults" and prevent them from connecting with one another or seeing each other's content in recommendations.
New pressure from the European Commission
Following WSJ's report, Meta is facing fresh regulatory scrutiny from the EU, whose European Commission has formally opened an inquiry into Meta's handling of child abuse material. The commission has asked Meta to provide information on the measures it has taken to meet its obligations to assess risks and implement effective mitigations to protect minors. Meta must respond by December 22, and based on its evaluation of the information provided, the commission will decide on its next steps.