Florida teen dies by suicide after AI chatbot convinced him Game of Thrones' Daenerys Targaryen loved him

A 14-year-old Florida boy, Sewell Setzer III, died by suicide in February this year after months of communicating with an AI chatbot on the app Character.AI. His final message to the bot, which was named after Daenerys Targaryen, was, “What if I told you I could come home right now?” Shortly after sending it, he took his own life with his stepfather’s handgun.

AI chatbot and suicidal thoughts

Sewell, a ninth-grader from Orlando, had been using the Character.AI app, which allows users to chat with AI characters. He had developed a close connection with an AI character modeled on the Game of Thrones figure Daenerys Targaryen, whom he affectionately referred to as “Dany.” According to the family, Sewell shared suicidal thoughts with the bot during their exchanges. In one conversation, he expressed a desire to be “free” from the world and from himself.

The boy’s mother, Megan L. Garcia, has since filed a lawsuit against Character.AI, alleging that the app is responsible for her son’s death. The suit claims that the AI bot repeatedly mentioned the topic of suicide and played a role in influencing Sewell’s tragic decision. The lawsuit described the company’s technology as “dangerous and untested,” saying it misled Sewell into believing the bot’s emotional responses were real.

Emotional connection with the chatbot

Sewell’s family stated that the teenager became increasingly isolated, spending more time alone in his room and withdrawing from activities, including quitting his school basketball team. The lawsuit notes that Sewell’s emotional state was already fragile; he had been diagnosed with anxiety and disruptive mood dysregulation disorder in 2023. Despite this, his conversations with the chatbot led him to believe that “Dany” cared for him and wanted him to be with her, “no matter the cost.”

Company’s response and safety updates

Character.AI has expressed its sorrow over the loss of Sewell and extended its condolences to the family. In response to the incident, the company announced new safety features, including prompts that direct users to the National Suicide Prevention Lifeline if they mention self-harm. Additionally, the company is working on updates to limit the exposure of sensitive content to users under 18.



If you or someone you know is struggling with mental health, it’s important to seek help. Reach out to the nearest mental health professional or contact a helpline.


