A Florida teen has taken his own life after becoming obsessed with an artificial intelligence (AI) chatbot modeled on a character from a popular television series. Sewell Setzer III, just 14 years old, died by suicide after forming a strong emotional bond with the chatbot, which was designed to mimic Daenerys Targaryen from HBO’s Game of Thrones.
The chatbot, created without HBO’s consent, deepened Setzer’s isolation from friends and family, who watched him withdraw from activities he had once enjoyed, such as Formula 1 racing and playing Fortnite. Although Setzer knew the chatbot was artificial, he developed a significant emotional attachment to it. Their conversations ranged widely and touched on sensitive topics, including Setzer’s suicidal thoughts.
His final messages to the chatbot suggested a deep connection; shortly afterward, he took his life with his father’s firearm. The family plans to sue Character.AI, calling the chatbot service “dangerous and untested.”
Character.AI has partnered with Google to license its AI models. The company’s founders have pitched Character.AI’s personas as a form of entertainment and as potential companions for lonely users.
In response to Setzer’s death, Character.AI expressed condolences to the family and emphasized user safety as a priority. The company has shared plans to implement additional safety measures, including restrictions for users under 18 and resources for individuals discussing self-harm.
The case, reminiscent of the 2013 film Her, underscores the risks that come with AI technology. Earlier this year, Microsoft announced that it had developed voice-mimicking AI so convincing that it deemed the technology too dangerous to release to the public.
In another case, two Harvard students combined AI with smart glasses to identify strangers and pull up their personal information simply by looking at them, raising major privacy concerns.