Last Updated: February 26, 2026, 21:44 IST
Instagram has introduced child supervision tools that alert parents if teens repeatedly search for suicide or self-harm terms, along with expert resources, amid criticism.

Amid global concern over children's access to social media, Instagram has introduced child supervision tools that will alert parents if their teens repeatedly search for suicide or self-harm related terms on the platform.
According to the BBC, parents and teens enrolled in Instagram’s Teen Accounts experience in the UK, US, Australia and Canada will begin receiving the alerts next week, with the rest of the world to follow later.
This is the first time Instagram’s parent company Meta will proactively alert parents to searches by their child on the social media platform for harmful material, rather than block searches and direct users to external help.
“The alerts will be sent to parents via email, text, or WhatsApp, depending on the contact information available, as well as through an in-app notification. Tapping on the notification will open a full-screen message explaining that their teen has repeatedly tried to search Instagram for terms associated with suicide or self-harm within a short period of time. Parents will also have the option to view expert resources designed to help them approach potentially sensitive conversations with their teen,” Meta said in a statement.
However, the move has been criticised by the suicide prevention charity the Molly Rose Foundation, which claimed that the measures “could do more harm than good”.
“This clumsy announcement is fraught with risk and we are concerned that forced disclosures could do more harm than good,” said its chief executive Andy Burrows.
The BBC quoted Meta as saying that alerts to parents about their child repeatedly searching for suicide and self-harm material within a short space of time on Instagram will be accompanied by expert resources to help them navigate difficult conversations.
“We understand how sensitive these issues are, and how distressing it could be for a parent to receive an alert like this. The vast majority of teens do not try to search for suicide and self-harm content on Instagram, and when they do, our policy is to block these searches, instead directing them to resources and helplines that can offer support. These alerts are designed to make sure parents are aware if their teen is repeatedly trying to search for this content, and to give them the resources they need to support their teen,” Meta said in its statement.
It further stated that experts from its Suicide and Self-Harm Advisory Group were consulted. “We chose a threshold that requires a few searches within a short period of time, while still erring on the side of caution. While that means we may sometimes notify parents when there may not be real cause for concern, we feel — and experts agree — that this is the right starting point, and we’ll continue to monitor and listen to feedback to make sure we’re in the right place.”