Instagram promotes child-sexualizing reels to followers of teen influencers: Report
Instagram's Reels feature has come under fire for allegedly showing explicit content, including "risqué footage of children" and adult videos, to accounts that followed young influencers, according to an investigation by The Wall Street Journal (WSJ). The Canadian Centre for Child Protection found similar results in its own tests. The controversy arises as Meta, Instagram's parent company, faces legal battles over its alleged failure to protect underage users from harmful content on the platform.
Ads from major brands displayed alongside explicit content
The WSJ's tests found that ads from well-known US brands, including Disney, Walmart, Pizza Hut, Bumble, Match Group, and even the WSJ itself, were displayed alongside the explicit content. Meta informed its advertising clients that it is investigating and will cover the cost of brand-safety auditing services, which determine how often a company's ads appear next to unacceptable content.
Companies express outrage and disappointment
Match Group, Tinder's parent company, has reportedly pulled all of its ads from Meta-owned apps. Match spokeswoman Justine Sacco stated, "We have no desire to pay Meta to market our brands to predators or place our ads anywhere near this content." Bumble spokesman Robbie McKay said the dating app "would never intentionally advertise adjacent to inappropriate content" and has suspended advertising on Meta platforms. Disney and Hinge have also pushed Meta to take more action on the issue.
Meta responds to allegations
Meta claims that the WSJ's tests were "a manufactured experience" that does not represent what most users see. A spokesperson for Meta said, "We don't want this kind of content on our platforms and brands don't want their ads to appear next to it." The company asserts that for every 10,000 views on Instagram, only three to four are of posts that violate its policies.
Instagram's tendency to aggregate child sexualization content was known internally
Current and former Meta employees told the WSJ that Instagram's tendency to aggregate child sexualization content was known internally even before Reels launched. They suggested that fixing the problem would require a revamp of the recommendation algorithms that push related content to users. However, internal documents seen by the WSJ indicated that Meta made it difficult for its safety team to implement such changes, as traffic performance was apparently a higher priority for the social media giant.