Megan Garcia, whose son Sewell Setzer III died by suicide, has filed a lawsuit against Character.AI, alleging that its chatbot contributed to her son’s death.
Megan Garcia is mourning the loss of her 14-year-old son, Sewell Setzer III, who died by suicide in February. Garcia has filed a 93-page lawsuit against the artificial intelligence company Character.AI, alleging that its chatbot played a significant role in her son’s death.
Sewell had been chatting on his phone with a Character.AI bot that emulates characters from popular media, in his case Daenerys Targaryen from “Game of Thrones.” Through these interactions, Sewell reportedly formed a deep emotional bond, at one point declaring his love for the bot, and his journal reflected a belief that the chatbot’s virtual realm was more real to him than his own life.
Garcia recounted the day of her son’s death: a loud noise prompted her to investigate, and she found him unresponsive in their home bathroom. She says she remembers the moment vividly, as it was when the depth of her son’s distress became clear.
The circumstances surrounding Sewell’s death led Garcia to take legal action against Character.AI. Her attorney, Matthew Bergman, says the company’s decisions prioritised engagement and profit over the safety of young users, and claims Sewell’s death was not a mere accident but a foreseeable consequence of the company’s deliberate design strategies.
In response to the lawsuit, Character.AI has expressed its condolences to the family. While declining to comment on ongoing litigation, a company spokesperson said the company is heartbroken by the loss and has since implemented safety measures aimed at preventing similar tragedies. These include directing users to the National Suicide Prevention Lifeline when conversations indicate self-harm.
The company has also introduced features intended to safeguard users, particularly those under 18, including improved content filters and notifications designed to reduce exposure to sensitive or emotionally intense interactions with the chatbot.
Amid her grief, Garcia hopes to raise awareness of the influence and risks of AI chatbots. She urges parents to stay vigilant about their children’s online activities, particularly with AI systems capable of simulating human interaction.
For those struggling with suicidal thoughts, support is available through the Suicide & Crisis Lifeline by calling or texting 988. Chat support is also available online.
Source: Noah Wire Services