Instagram now alerts parents if their teen searches for suicide or self-harm content
#Instagram, #Meta, #Parental Supervision, #Suicide Prevention, #Self-Harm, #Teen Safety, #Social Media, #Mental Health
📌 Key Takeaways
Instagram will alert parents when teens search for suicide or self-harm content
Alerts will be sent via email, text, or WhatsApp with resources for parents
The feature comes amid lawsuits against Meta over teen safety
Instagram is implementing the feature cautiously to avoid unnecessary notifications
📖 Full Retelling
Meta-owned Instagram announced on Thursday that it will start alerting parents via email, text, or WhatsApp if their teen repeatedly searches for terms related to suicide or self-harm. The feature launches in the coming weeks for parents enrolled in Instagram's parental supervision tools. Alerts will be triggered when teens repeatedly search for phrases encouraging suicide or self-harm, terms indicating they might be at risk of harming themselves, or specific words like 'suicide' or 'self-harm.' While Instagram already blocks users from accessing this type of content, the new alerts are designed to notify parents when their teens are actively seeking it out. Parents will receive the notification through their preferred contact method, along with an in-app alert that includes resources to help them approach conversations with their teens about these sensitive topics.

The move comes amid increasing scrutiny of Meta and other tech companies, with several lawsuits seeking to hold social media platforms accountable for harming teens. Instagram head Adam Mosseri recently faced questioning in a California court over the app's delayed rollout of basic safety features, and an internal Meta study revealed that parental supervision has limited impact on teens' compulsive social media use, particularly for teens experiencing stressful life events. Given these legal and public relations challenges, the timing of the alerts suggests Meta is taking proactive steps to demonstrate its commitment to teen safety.

Instagram emphasizes that it will aim to avoid sending unnecessary notifications, since alert fatigue could reduce the feature's effectiveness. The company analyzed search behavior and consulted experts from its Suicide and Self-Harm Advisory Group to establish appropriate thresholds, and it acknowledges that some notifications may occur without actual cause for concern. The alerts will initially roll out in the U.S., U.K., Australia, and Canada, with plans to expand to other regions later this year. Future updates will also include notifications when teens attempt to engage Instagram's AI in conversations about suicide or self-harm.
🏷️ Themes
Teen Safety, Parental Controls, Social Media Responsibility
📚 Context
Suicide prevention is a collection of efforts to reduce the risk of suicide. Suicide is often preventable, and the efforts to prevent it may occur at the individual, relationship, community, and society level. Suicide is a serious public health problem that can have long-lasting effects on individuals, families, and communities.
Instagram is an American photo and short-form video sharing social networking service owned by Meta Platforms. It allows users to upload media that can be edited with filters, be organized by hashtags, and be associated with a location via geographical tagging. Posts can be shared publicly or with preapproved followers.
📰 Source Article
Instagram will start alerting parents if their teen repeatedly tries to search for terms related to suicide or self-harm within a short period of time, the company announced on Thursday. The alerts are launching in the coming weeks to parents who are enrolled in parental supervision on Instagram.

The Meta-owned social platform says that while it already blocks users from searching for suicide and self-harm content, these new alerts are designed to make sure parents are aware if their teen is repeatedly trying to search for this content so that they can support their teen. Searches that may trigger an alert include phrases encouraging suicide or self-harm, phrases indicating a teen might be at risk of harming themselves, and terms such as “suicide” or “self-harm.” Instagram says parents will receive the alert via email, text, or WhatsApp, depending on the contact information they’ve provided, along with an in-app notification. The notification will include resources designed to help parents approach conversations with their teen.

The move comes as Meta and other big tech companies face several lawsuits seeking to hold social media giants accountable for harming teens. During testimony this week in a lawsuit before the U.S. District Court for the Northern District of California, Instagram head Adam Mosseri was grilled by plaintiffs’ attorneys in an ongoing social media addiction case over the app’s delayed rollout of basic safety features, including a nudity filter for private messages to teens. Additionally, during testimony in a separate lawsuit before the Los Angeles County Superior Court, it was revealed that an internal Meta research study found that parental supervision and controls had little impact on kids’ compulsive use of social media. The study also found that children who faced stressful life events were more likely to struggle with regulating their social media use appropriately. Given the ongoing lawsuits accusing the company of failin...