
Instagram tightens safety for Indian teens: check the new features
What's the story
Meta, Instagram's parent company, has announced a set of improved safety measures to protect its teenage user base in India.
The new measures include requiring parental approval before users under 16 can go live or turn off filters that block unwanted images in direct messages.
The changes are part of Meta's larger "Teen Accounts" initiative, launched in September 2024, which has already reached over 54 million young users globally.
Global expansion
Instagram's global initiative reaches Indian shores
The "Teen Accounts" program is now coming to India.
Meta intends to bring similar safety features to Facebook and Messenger later this year.
Tara Hopkins, Instagram's Global Director of Public Policy, stressed the need for such measures, saying, "Young people deserve safe, age-appropriate online experiences."
She also noted that 97% of teens aged 13-15 worldwide have kept these protective settings on their accounts.
Safety measures
Instagram teen accounts in India: Default safety features
Instagram's Teen Accounts in India come with a number of default safety features to protect young users.
These include private account settings, restrictions on who can interact with teens, and content filters limiting exposure to potentially harmful material.
The platform also offers real-time alerts for suspicious contact attempts and enhanced message controls that prevent strangers from contacting teens unless the teen explicitly allows it.
Parental control
Empowering parents with enhanced supervision tools
Along with the new features, Instagram is also introducing expanded parental supervision tools.
These tools are designed to give guardians better insight into their teen's activity on the platform.
The announcement was made during the Teen Safety Forum, where author and Tweak India founder Twinkle Khanna emphasized the challenge parents face in balancing their teen's independence with online safety concerns.