100K children being sexually harassed daily on Facebook, Instagram
Meta, the parent company of Facebook and Instagram, is facing allegations that around 100,000 children using its platforms encounter online sexual harassment daily. The New Mexico Attorney General's office obtained this information from Meta employee presentations and staff communications. One incident in 2020 involved the 12-year-old daughter of an Apple executive receiving inappropriate messages on Instagram, raising concerns that Apple might remove the app from its store.
Allegations part of a lawsuit filed in December
This information is part of a lawsuit filed on December 5 last year by the New Mexico AG's office, accusing Meta's social networks of being havens for child predators. Attorney General Raúl Torrez claims that Meta allows adults to find, message, and groom minors on its platforms. Meta denies these allegations, stating that the lawsuit "mischaracterizes our work using selective quotes and cherry-picked documents." The company insists it is dedicated to ensuring safe experiences for teens and their parents.
Meta's response to the allegations
In response to the latest allegations of sexual harassment of minors, Meta issued a statement. It said, "We want teens to have safe, age-appropriate experiences online, and we have over 30 tools to support them and their parents," adding, "We've spent a decade working on these issues and hiring people who have dedicated their careers to keeping young people safe and supported online."
Meta is placing corporate ads beside content sexualizing minors
Earlier this month, Instagram and Facebook faced fresh accusations of profiting from corporate ads positioned next to content that promotes the sexual exploitation of children. AG Torrez claims there is evidence that "Meta officials are duping corporate advertisers and permitting sponsored content to appear alongside deeply disturbing images and videos that clearly violate Meta's promised standards." Match Group (owner of dating apps Hinge and Tinder) and Walmart raised similar complaints with Meta but found its response unsatisfactory.
Messenger is being used for 'human exploitation'
The lawsuit followed an April 2023 Guardian investigation that revealed Meta's shortcomings in addressing child trafficking on its platforms. The investigation found that traffickers were using Facebook Messenger to communicate with children and trade them. Internal documents show Meta employees discussing how Messenger is used for "every human exploitation stage." However, a 2017 email suggests that executives were hesitant to scan Messenger for "harmful content" due to potential competition with privacy-focused apps.