
Meta deploys AI to spot underage users on Instagram
What's the story
Meta Platforms is launching new artificial intelligence (AI) tools to identify and verify the ages of users on its Instagram platform.
The move comes amid growing concerns over the safety of minors online and the risk of underage users accessing adult content.
The company will use AI to "proactively" search for accounts that may belong to children misrepresenting their ages to access adult features on Instagram.
User analysis
AI tools will analyze user behavior
According to Meta, its new AI is trained to estimate a user's age by analyzing signals such as the kinds of posts they engage with, details in their profile, and when the account was created.
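Meta has not published how its model works, but the signals described above suggest a feature-based scorer. The following is a purely hypothetical sketch: the feature names, weights, and threshold are all invented for illustration, not Meta's actual system.

```python
from dataclasses import dataclass
from datetime import datetime, timezone

# Hypothetical illustration only -- Meta's real model is not public.
# This toy scorer combines the three signal types the article mentions:
# content engagement, profile details, and account creation timing.

@dataclass
class AccountSignals:
    teen_topic_engagement: float   # 0..1, share of interactions with teen-oriented content
    profile_mentions_school: bool  # e.g. a graduation year in the bio
    account_created: datetime      # when the account was created

def underage_score(sig: AccountSignals, now: datetime) -> float:
    """Return a score in [0, 1]; higher means more likely underage.

    Weights are arbitrary placeholders, chosen only to show how
    heterogeneous signals could be combined into one score.
    """
    score = 0.5 * sig.teen_topic_engagement
    if sig.profile_mentions_school:
        score += 0.3
    if (now - sig.account_created).days < 365:  # newer accounts weighted as riskier
        score += 0.2
    return min(score, 1.0)
```

In practice a production system would use a trained classifier over many more signals, but the shape of the problem (behavioral features in, an age estimate out) is the same.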
Meta's initiative comes in response to ongoing concerns from parents, educators, and regulators about the impact of social media on young people.
The platform has faced criticism for not doing enough to protect minors from harmful content and interactions.
Safety measures
Additional measures for user safety
If a user is found to be lying about their age, Instagram will convert their profile to a teen account with stricter privacy and content controls.
These accounts are private by default, restrict direct messages to known contacts, and limit exposure to sensitive content.
Teens also receive alerts after 60 minutes of usage and enter a "sleep mode" from 10pm to 7am, which mutes notifications and sends auto-replies to messages.
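The teen-account defaults described above can be summarized as a settings profile. The sketch below is hypothetical (the field names are invented), with the values taken from the article; the sleep-mode check has to handle a window that wraps past midnight.

```python
from dataclasses import dataclass
from datetime import time

# Hypothetical sketch of the teen-account defaults the article describes.
# Field names are invented; the values (60-minute alert, 10pm-7am sleep
# mode, private by default) come from the article.

@dataclass
class TeenAccountSettings:
    private_by_default: bool = True
    dms_restricted_to_known_contacts: bool = True
    sensitive_content_limited: bool = True
    usage_alert_after_minutes: int = 60
    sleep_mode_start: time = time(22, 0)  # 10pm: notifications muted
    sleep_mode_end: time = time(7, 0)     # 7am: notifications resume

def in_sleep_mode(t: time, s: TeenAccountSettings) -> bool:
    """True if t falls in the sleep window, which wraps past midnight."""
    return t >= s.sleep_mode_start or t < s.sleep_mode_end
```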