
Instagram's 'Teen Accounts' are not as safe as you think
What's the story
Instagram's Teen Accounts, aimed at providing better protection and control to young users, may not be as safe as previously thought.
A recent study by the 5Rights Foundation, an online child safety charity, found that these accounts can be easily created with fake birth dates, with no additional checks by the platform.
This loophole exposes children to inappropriate content and to contact from adult accounts on the platform.
Account creation
Fake accounts expose flaws in Instagram's safety measures
Researchers from the 5Rights Foundation were able to create multiple fake Teen Accounts using false birth dates, without any verification checks by Instagram.
Immediately upon registration, these accounts were recommended adult accounts to follow and message.
The study also found that Instagram's algorithms continue to promote sexualized content, harmful beauty standards, and other negative stereotypes.
Content issues
Researchers highlight concerns over Instagram's content moderation
The researchers also noted that their fake Teen Accounts were recommended posts carrying large numbers of hateful comments.
The 5Rights Foundation raised further concerns about Instagram's addictive design and teens' exposure to commercialized content.
Baroness Beeban Kidron, the organization's founder, criticized Instagram for failing to properly verify user ages and for exposing teens to adult content without their knowledge.
Meta's response
Meta defends Teen Accounts
In response to the study, Meta, Instagram's parent company, defended its Teen Accounts.
The company said that these accounts "provide built-in protections for teens limiting who's contacting them, the content they can see, and the time spent on our apps."
Meta added that teens in the UK have automatically been transitioned into these enhanced protections, and those under 16 need parental consent to modify them.
Other platforms
X's self-harm communities raise concerns over child safety
In a related development, BBC News reported the existence of self-harm communities on X.
These groups have tens of thousands of members who share graphic images and videos related to self-harm; some users even appear to be children.
American researcher Becca Spinks, who discovered these groups, said she was shocked to find such a large community involved in this behavior.