Instagram will notify parents who use its supervision tools if their teen repeatedly searches for suicide or self-harm terms, according to Reuters. The move comes as Britain considers whether further restrictions on children’s social media use are needed.
TL;DR
- Instagram will alert parents using supervision tools if a teen repeatedly searches suicide or self-harm terms.
- Meta said the feature will begin rolling out next week in the U.S., U.K., Australia and Canada.
- Instagram says it blocks such searches and directs users to expert support resources.
- Britain is considering additional limits on children’s social media access.
Meta said Instagram will notify parents enrolled in its supervision feature when their teen repeatedly searches for suicide or self-harm related terms, Reuters reported. The company did not specify a numeric threshold for what qualifies as “repeated.”
The alerts apply only to families who have opted into Instagram’s supervision tools, which require both the parent and teen to agree to participation. Meta said the change is meant to help parents identify potential warning signs and offer support, and that the notification includes expert resources to help families approach sensitive conversations.
Instagram said that when users attempt to search for suicide or self-harm related content, its policy is to block those searches and direct people to resources and helplines that can offer support. The new alert adds parental visibility when search behavior appears repeated within supervised accounts.
As for how the alerts are delivered, Reuters reported that parents would receive them through Instagram’s supervision system. The Associated Press separately reported that, depending on the contact information provided, notifications may also be sent by email, text message or WhatsApp, alongside the in-app notification.
The product update comes as governments increase pressure on social media platforms over child safety and mental health harms. Reuters reported that Britain has been considering whether additional restrictions on children’s social media access may be needed, following Australia’s move to restrict social media use for those under 16. Reuters also noted that other European countries have discussed limits, reflecting a broader policy push aimed at reducing youth exposure to harmful content online.
In parallel, Britain’s existing regulatory regime is advancing. Ofcom, the UK communications regulator, has published guidance on the protection of children duties under the Online Safety Act, including expectations around children’s risk assessments and the protections services should put in place where children are likely to access them.
Meta has positioned the Instagram changes as part of a wider set of teen safety updates. The company has said it supports age-appropriate online experiences and is continuing to expand tools that give parents more oversight and help steer teens toward support resources where necessary.