The Fight to Hold AI Companies Accountable for Children’s Deaths
#AI companies #child deaths #legal accountability #technology risks #safety regulations
📌 Key Takeaways
- AI companies face legal action over child deaths linked to their technologies
- Families and advocates are pushing for accountability in AI-related incidents
- The cases highlight potential risks of AI systems in sensitive applications
- Legal frameworks are being tested to address AI's role in tragic outcomes
📖 Full Retelling
After a series of suicides allegedly linked to AI chatbots, one lawyer is trying to hold companies like OpenAI accountable.
🏷️ Themes
AI Accountability, Child Safety
Original Source
Varsha Bansal | Business | Mar 19, 2026, 6:00 AM

[Photo: Laura Marquez-Garrett at their office in northwest Washington. Photograph: Vince Perry Jr.]

Content warning: This story contains descriptions of self-harm.

Cedric Lacey relied on a camera to check on his kids while working as a commercial van driver, making runs to and from Alabama. Each morning, he would tune into the feed from his living room to make sure his teenage son, Amaurie, and his 14-year-old daughter were packing their bags and getting ready for school. But one morning last June, Lacey didn't see Amaurie up and about. Concerned, he called home, only to learn that his 17-year-old had hanged himself.

It was Amaurie's younger sister who discovered the body. She was also the one who looked through her brother's smartphone and found his final conversation before he took his own life. It was with ChatGPT, the popular chatbot developed by OpenAI. "In the messages, he was talking about killing himself—it told him how to tie the noose, how long it would take the air to come out of his body, how to clean his body," Lacey tells WIRED in a video call from his home in Calhoun, Georgia. Lacey, a single dad, says he thought his son was using the chatbot to get help with schoolwork. "Why is it telling him how to kill himself?"

In the weeks after his son's death, Lacey began searching online for a lawyer who could help his family hold OpenAI accountable, and hopefully ensure other families wouldn't have to experience the same tragedy he did. That's how he found Laura Marquez-Garrett, an attorney who helps run the Social Media Victims Law Center alongside Matthew Bergman.
Over the past five years, the pair have been involved in at least 1,500 of the more than 3,000 cases against socia...
Read full article at source