Instagram to start parent alerts for teen suicide, self-harm searches as Meta trials continue


#Instagram parental alerts #teen suicide prevention #Meta mental health #social media safety #AI chatbot regulation #teen mental health

📌 Key Takeaways

  • Instagram will send parent alerts when teens search for suicide or self-harm content
  • Alerts will be delivered via email, text, WhatsApp or Instagram in the U.S., U.K., Australia, and Canada
  • The feature requires both parents and teens to enroll in Instagram's parental supervision tools
  • Meta plans to expand similar alerts to include AI interactions with teens

📖 Full Retelling

Looking ahead, Meta plans to expand similar parental alerts to certain AI experiences, notifying guardians if teens attempt to engage in conversations about suicide or self-harm with the company's AI chatbots. These future AI-related alerts follow growing concerns about potentially harmful mental health-related conversations between users and AI systems from various tech companies. The company is also developing a powerful new AI model, codenamed Avocado, set to debut later this year. Meanwhile, Meta faces significant legal challenges, including a New Mexico case involving internal messages about encryption potentially hindering child safety reporting. The National Parent Teacher Association also recently declined to renew its funding relationship with Meta amid the ongoing legal disputes over digital safety for children.

🏷️ Themes

Social media safety, Teen mental health, Parental controls, AI regulation


Original Source
"These alerts are designed to make sure parents are aware if their teen is repeatedly trying to search for this content, and to give them the resources they need to support their teen," the company said in a release.

The parental supervision feature comes as the social media company faces allegations that the design and functionality of apps like Instagram foster detrimental effects on the mental health of young users. Experts have described the trials and related legal cases involving companies like Google's YouTube, TikTok and Snap as the social media industry's "big tobacco" moment, as the courts weigh the alleged harm of their products and their supposed efforts to mislead the public about those adverse effects.

The Instagram alerts will begin rolling out next week in the U.S., U.K., Australia and Canada. Parents will receive the alerts if their teenagers are repeatedly searching during a "short period of time" for "phrases promoting suicide or self-harm, phrases that suggest a teen wants to harm themselves, and terms like 'suicide' or 'self-harm,'" the company said in a blog post.

The company called it "the right starting point" as it tries to find the right threshold for what should trigger an alert. Meta said parents may receive alerts that do not indicate a real cause for concern, but that it would continue to listen to feedback on the feature. The alerts will be delivered to parents via email, text, WhatsApp or within Instagram.

The alerts feature requires that both parents and teenagers enroll in Instagram's parental supervision tools.
Parents who receive the alerts will see a message explaining their teen's concerni...

Source

cnbc.com
