Meta fixes glitch that showed violent content to Instagram users
What's the story
Meta has fixed a technical error that exposed Instagram users to inappropriate content.
The glitch led to users' feeds being flooded with violent and sexual videos.
Speaking to CNBC, a Meta spokesperson said, "We have fixed an error that caused some users to see content in their Instagram Reels feed that should not have been recommended. We apologize for the mistake."
User reports
Users report disturbing content amid glitch
Instagram users took to social media platforms like X to voice concerns about the disturbing content they encountered because of the glitch.
A Reddit user reported seeing Reels filled with school shootings and murder scenes. Others described coming across consecutive gore videos, including stabbings, beheadings, and castration.
Some also reported viewing nudity, uncensored porn, and rape videos despite having Sensitive Content Control enabled on their accounts.
Algorithm discrepancy
Error defied Instagram's engagement-based recommendations
Under normal circumstances, social media algorithms recommend content that matches what users usually engage with.
However, the latest Instagram error resulted in graphic videos showing up in the feeds of users who had never interacted with such content before.
Users reported seeing these videos even after marking them as "Not Interested," a major departure from the platform's usual algorithmic behavior.
Policy violation
Meta's policies vs. current content issues
While the nature of the error remains unspecified, a Meta spokesperson admitted that some of the videos appearing on Instagram violated company policies.
Meta's guidelines say, "To protect users... we remove the most graphic content and add warning labels to other graphic content so that people are aware it may be sensitive or disturbing before they click through."
Additionally, Meta's rules require the removal of "real photographs and videos of nudity and sexual activity," precisely the kind of content users reported seeing during the glitch.